Mar 09 09:07:13 crc systemd[1]: Starting Kubernetes Kubelet...
Mar 09 09:07:13 crc restorecon[4651]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Mar 09 09:07:13 crc restorecon[4651]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 09 09:07:13 crc restorecon[4651]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 09 09:07:13 crc restorecon[4651]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 09 09:07:13 crc restorecon[4651]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 09 09:07:13 crc restorecon[4651]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 09 09:07:13 crc restorecon[4651]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 09 09:07:13 crc restorecon[4651]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 09 09:07:13 crc restorecon[4651]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 09 09:07:13 crc restorecon[4651]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 09 09:07:13 crc restorecon[4651]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Mar 09 09:07:13 crc restorecon[4651]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 09 09:07:13 crc restorecon[4651]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 09 09:07:13 crc restorecon[4651]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 09 09:07:13 crc restorecon[4651]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 09 09:07:13 crc restorecon[4651]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 09 09:07:13 crc restorecon[4651]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Mar 09 09:07:13 crc restorecon[4651]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 09 09:07:13 crc restorecon[4651]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 09 09:07:13 crc restorecon[4651]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 09 09:07:13 crc restorecon[4651]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 09 09:07:13 crc restorecon[4651]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 09 09:07:13 crc restorecon[4651]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 09 09:07:13 crc restorecon[4651]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 09 09:07:13 crc restorecon[4651]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 09 09:07:13 crc restorecon[4651]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 09 09:07:13 crc restorecon[4651]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 09 09:07:13 crc restorecon[4651]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 09 09:07:13 crc restorecon[4651]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 09 09:07:13 crc restorecon[4651]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 09 09:07:13 crc restorecon[4651]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 09 09:07:13 crc restorecon[4651]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 09 09:07:13 crc restorecon[4651]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Mar 09 09:07:13 crc restorecon[4651]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 09 09:07:13 crc restorecon[4651]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 09 09:07:13 crc restorecon[4651]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 09 09:07:13 crc restorecon[4651]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Mar 09 09:07:13 crc restorecon[4651]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Mar 09 09:07:13 crc restorecon[4651]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 09 09:07:13 crc restorecon[4651]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 09 09:07:13 crc restorecon[4651]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 09 09:07:13 crc restorecon[4651]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 09 09:07:13 crc restorecon[4651]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 09 09:07:13 crc restorecon[4651]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 09 09:07:13 crc restorecon[4651]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 09 09:07:13 crc restorecon[4651]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 09 09:07:13 crc restorecon[4651]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 09 09:07:13 crc restorecon[4651]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 09 09:07:13 crc restorecon[4651]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 09 09:07:13 crc restorecon[4651]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 09 09:07:13 crc restorecon[4651]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 09 09:07:13 crc restorecon[4651]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 09 09:07:13 crc restorecon[4651]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 09 09:07:13 crc restorecon[4651]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 09 09:07:13 crc restorecon[4651]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 09 09:07:13 crc restorecon[4651]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 09 09:07:13 crc restorecon[4651]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 09 09:07:13 crc restorecon[4651]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 09 09:07:13 crc restorecon[4651]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 09 09:07:13 crc restorecon[4651]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 09 09:07:13 crc restorecon[4651]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 09 09:07:13 crc restorecon[4651]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 09 09:07:13 crc restorecon[4651]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 09 09:07:13 crc restorecon[4651]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 09 09:07:13 crc restorecon[4651]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 09 09:07:13 crc restorecon[4651]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 09 09:07:13 crc restorecon[4651]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 09 09:07:13 crc restorecon[4651]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 09 09:07:13 crc restorecon[4651]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 09 09:07:13 crc restorecon[4651]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 09 09:07:13 crc restorecon[4651]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 09 09:07:13 crc restorecon[4651]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Mar 09 09:07:13 crc restorecon[4651]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Mar 09 09:07:13 crc restorecon[4651]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 09 09:07:13 crc restorecon[4651]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Mar 09 09:07:13 crc restorecon[4651]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Mar 09 09:07:13 crc restorecon[4651]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 09 09:07:13 crc restorecon[4651]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 09 09:07:13 crc restorecon[4651]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 09 09:07:13 crc restorecon[4651]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 09 09:07:13 crc restorecon[4651]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 09 09:07:13 crc restorecon[4651]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 09 09:07:13 crc restorecon[4651]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 09 09:07:13 crc restorecon[4651]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 09 09:07:13 crc restorecon[4651]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 09 09:07:13 crc restorecon[4651]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 09 09:07:13 crc restorecon[4651]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 09 09:07:13 crc restorecon[4651]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Mar 09 09:07:13 crc restorecon[4651]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 09 09:07:13 crc restorecon[4651]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 09 09:07:13 crc restorecon[4651]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 09 09:07:13 crc restorecon[4651]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 09 09:07:13 crc restorecon[4651]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Mar 09 09:07:13 crc restorecon[4651]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 09 09:07:13 crc restorecon[4651]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 09 09:07:13 crc restorecon[4651]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 09 09:07:13 crc restorecon[4651]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 09 09:07:13 crc restorecon[4651]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 09 09:07:13 crc restorecon[4651]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 09 09:07:13 crc restorecon[4651]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 09 09:07:13 crc restorecon[4651]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 09 09:07:13 crc restorecon[4651]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 09 09:07:13 crc restorecon[4651]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 09 09:07:13 crc restorecon[4651]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 09 09:07:13 crc restorecon[4651]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 09 09:07:13 crc restorecon[4651]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 09 09:07:13 crc restorecon[4651]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 09 09:07:13 crc restorecon[4651]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 09 09:07:13 crc restorecon[4651]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 09 09:07:13 crc restorecon[4651]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 09 09:07:14 crc restorecon[4651]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Mar 09 09:07:14 crc restorecon[4651]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc 
restorecon[4651]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 09 09:07:14 crc restorecon[4651]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 09 09:07:14 crc restorecon[4651]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 09 09:07:14 crc restorecon[4651]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 09 09:07:14 crc 
restorecon[4651]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 09 
09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 09 09:07:14 crc restorecon[4651]: 
/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c11 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 09 09:07:14 crc 
restorecon[4651]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 09 09:07:14 crc restorecon[4651]: 
/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 09 09:07:14 crc restorecon[4651]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 09 09:07:14 crc restorecon[4651]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c268,c620 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 09 09:07:14 crc 
restorecon[4651]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 09 09:07:14 crc restorecon[4651]: 
/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14
crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c764,c897 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Mar 09 09:07:14 crc restorecon[4651]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 09 09:07:14 crc restorecon[4651]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 09 09:07:14 crc restorecon[4651]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 09 09:07:14 crc restorecon[4651]: 
/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 
09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 09:07:14 crc restorecon[4651]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 09:07:14 crc restorecon[4651]: 
/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 09 09:07:14 crc 
restorecon[4651]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc 
restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:07:14 crc restorecon[4651]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 09 09:07:14 crc restorecon[4651]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 09 09:07:14 crc restorecon[4651]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 09 09:07:14 crc restorecon[4651]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c219,c404 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 09 09:07:14 crc restorecon[4651]: 
/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 
crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc 
restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc 
restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc 
restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 09 09:07:14 crc restorecon[4651]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 09 09:07:14 crc 
restorecon[4651]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 09 09:07:14 crc restorecon[4651]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 09 09:07:14 crc restorecon[4651]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 09 09:07:14 crc restorecon[4651]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Mar 09 09:07:15 crc kubenswrapper[4792]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 09 09:07:15 crc kubenswrapper[4792]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Mar 09 09:07:15 crc kubenswrapper[4792]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 09 09:07:15 crc kubenswrapper[4792]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Mar 09 09:07:15 crc kubenswrapper[4792]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Mar 09 09:07:15 crc kubenswrapper[4792]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.419440 4792 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.422859 4792 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.422875 4792 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.422879 4792 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.422882 4792 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.422886 4792 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.422889 4792 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.422893 4792 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.422897 4792 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.422901 4792 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 
09:07:15.422904 4792 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.422907 4792 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.422911 4792 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.422914 4792 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.422918 4792 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.422928 4792 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.422934 4792 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.422938 4792 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.422942 4792 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.422946 4792 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.422950 4792 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.422953 4792 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.422958 4792 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.422963 4792 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.422967 4792 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.422971 4792 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.422975 4792 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.422979 4792 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.422982 4792 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.422986 4792 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.422989 4792 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.422993 4792 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.422996 4792 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.423000 4792 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.423003 4792 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.423006 4792 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.423010 4792 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 09 09:07:15 crc kubenswrapper[4792]: 
W0309 09:07:15.423013 4792 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.423016 4792 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.423020 4792 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.423024 4792 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.423027 4792 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.423033 4792 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.423037 4792 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.423043 4792 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.423047 4792 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.423052 4792 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.423056 4792 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.423060 4792 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.423079 4792 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.423083 4792 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.423088 4792 feature_gate.go:353] Setting GA feature gate 
ValidatingAdmissionPolicy=true. It will be removed in a future release. Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.423093 4792 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.423098 4792 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.423102 4792 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.423106 4792 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.423111 4792 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.423115 4792 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.423120 4792 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.423124 4792 feature_gate.go:330] unrecognized feature gate: Example Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.423128 4792 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.423132 4792 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.423135 4792 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.423139 4792 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.423142 4792 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.423146 4792 feature_gate.go:330] unrecognized feature gate: 
AlibabaPlatform Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.423149 4792 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.423152 4792 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.423156 4792 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.423159 4792 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.423163 4792 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.423166 4792 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.424088 4792 flags.go:64] FLAG: --address="0.0.0.0" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.424100 4792 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.424107 4792 flags.go:64] FLAG: --anonymous-auth="true" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.424112 4792 flags.go:64] FLAG: --application-metrics-count-limit="100" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.424117 4792 flags.go:64] FLAG: --authentication-token-webhook="false" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.424121 4792 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.424127 4792 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.424132 4792 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.424137 4792 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Mar 09 09:07:15 crc kubenswrapper[4792]: 
I0309 09:07:15.424142 4792 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.424146 4792 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.424150 4792 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.424154 4792 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.424158 4792 flags.go:64] FLAG: --cgroup-root="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.424162 4792 flags.go:64] FLAG: --cgroups-per-qos="true" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.424166 4792 flags.go:64] FLAG: --client-ca-file="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.424170 4792 flags.go:64] FLAG: --cloud-config="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.424174 4792 flags.go:64] FLAG: --cloud-provider="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.424177 4792 flags.go:64] FLAG: --cluster-dns="[]" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.424182 4792 flags.go:64] FLAG: --cluster-domain="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.424186 4792 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.424190 4792 flags.go:64] FLAG: --config-dir="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.424194 4792 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.424198 4792 flags.go:64] FLAG: --container-log-max-files="5" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.424204 4792 flags.go:64] FLAG: --container-log-max-size="10Mi" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.424208 4792 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Mar 09 09:07:15 crc 
kubenswrapper[4792]: I0309 09:07:15.424212 4792 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.424216 4792 flags.go:64] FLAG: --containerd-namespace="k8s.io" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.424220 4792 flags.go:64] FLAG: --contention-profiling="false" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.424225 4792 flags.go:64] FLAG: --cpu-cfs-quota="true" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.424230 4792 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.424235 4792 flags.go:64] FLAG: --cpu-manager-policy="none" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.424239 4792 flags.go:64] FLAG: --cpu-manager-policy-options="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.424243 4792 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.424247 4792 flags.go:64] FLAG: --enable-controller-attach-detach="true" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.424252 4792 flags.go:64] FLAG: --enable-debugging-handlers="true" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.424255 4792 flags.go:64] FLAG: --enable-load-reader="false" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.424260 4792 flags.go:64] FLAG: --enable-server="true" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.424263 4792 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.424268 4792 flags.go:64] FLAG: --event-burst="100" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.424272 4792 flags.go:64] FLAG: --event-qps="50" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.424276 4792 flags.go:64] FLAG: --event-storage-age-limit="default=0" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.424280 4792 flags.go:64] FLAG: --event-storage-event-limit="default=0" 
Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.424284 4792 flags.go:64] FLAG: --eviction-hard="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.424290 4792 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.424294 4792 flags.go:64] FLAG: --eviction-minimum-reclaim="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.424298 4792 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.424302 4792 flags.go:64] FLAG: --eviction-soft="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.424306 4792 flags.go:64] FLAG: --eviction-soft-grace-period="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.424310 4792 flags.go:64] FLAG: --exit-on-lock-contention="false" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.424314 4792 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.424317 4792 flags.go:64] FLAG: --experimental-mounter-path="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.424321 4792 flags.go:64] FLAG: --fail-cgroupv1="false" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.424325 4792 flags.go:64] FLAG: --fail-swap-on="true" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.424329 4792 flags.go:64] FLAG: --feature-gates="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.424334 4792 flags.go:64] FLAG: --file-check-frequency="20s" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.424338 4792 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.424342 4792 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.424347 4792 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.424351 4792 flags.go:64] FLAG: 
--healthz-port="10248" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.424355 4792 flags.go:64] FLAG: --help="false" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.424359 4792 flags.go:64] FLAG: --hostname-override="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.424363 4792 flags.go:64] FLAG: --housekeeping-interval="10s" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.424368 4792 flags.go:64] FLAG: --http-check-frequency="20s" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.424372 4792 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.424375 4792 flags.go:64] FLAG: --image-credential-provider-config="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.424379 4792 flags.go:64] FLAG: --image-gc-high-threshold="85" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.424383 4792 flags.go:64] FLAG: --image-gc-low-threshold="80" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.424387 4792 flags.go:64] FLAG: --image-service-endpoint="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.424391 4792 flags.go:64] FLAG: --kernel-memcg-notification="false" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.424395 4792 flags.go:64] FLAG: --kube-api-burst="100" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.424399 4792 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.424403 4792 flags.go:64] FLAG: --kube-api-qps="50" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.424407 4792 flags.go:64] FLAG: --kube-reserved="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.424411 4792 flags.go:64] FLAG: --kube-reserved-cgroup="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.424414 4792 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.424419 4792 flags.go:64] FLAG: 
--kubelet-cgroups="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.424422 4792 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.424426 4792 flags.go:64] FLAG: --lock-file="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.424430 4792 flags.go:64] FLAG: --log-cadvisor-usage="false" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.424435 4792 flags.go:64] FLAG: --log-flush-frequency="5s" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.424439 4792 flags.go:64] FLAG: --log-json-info-buffer-size="0" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.424445 4792 flags.go:64] FLAG: --log-json-split-stream="false" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.424449 4792 flags.go:64] FLAG: --log-text-info-buffer-size="0" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.424453 4792 flags.go:64] FLAG: --log-text-split-stream="false" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.424457 4792 flags.go:64] FLAG: --logging-format="text" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.424462 4792 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.424466 4792 flags.go:64] FLAG: --make-iptables-util-chains="true" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.424470 4792 flags.go:64] FLAG: --manifest-url="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.424474 4792 flags.go:64] FLAG: --manifest-url-header="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.424480 4792 flags.go:64] FLAG: --max-housekeeping-interval="15s" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.424483 4792 flags.go:64] FLAG: --max-open-files="1000000" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.424489 4792 flags.go:64] FLAG: --max-pods="110" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.424493 4792 flags.go:64] FLAG: 
--maximum-dead-containers="-1" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.424497 4792 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.424501 4792 flags.go:64] FLAG: --memory-manager-policy="None" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.424504 4792 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.424508 4792 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.424512 4792 flags.go:64] FLAG: --node-ip="192.168.126.11" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.424517 4792 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.424526 4792 flags.go:64] FLAG: --node-status-max-images="50" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.424530 4792 flags.go:64] FLAG: --node-status-update-frequency="10s" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.424534 4792 flags.go:64] FLAG: --oom-score-adj="-999" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.424538 4792 flags.go:64] FLAG: --pod-cidr="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.424542 4792 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.424549 4792 flags.go:64] FLAG: --pod-manifest-path="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.424552 4792 flags.go:64] FLAG: --pod-max-pids="-1" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.424556 4792 flags.go:64] FLAG: --pods-per-core="0" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.424560 4792 flags.go:64] FLAG: --port="10250" Mar 09 09:07:15 crc 
kubenswrapper[4792]: I0309 09:07:15.424564 4792 flags.go:64] FLAG: --protect-kernel-defaults="false" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.424568 4792 flags.go:64] FLAG: --provider-id="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.424573 4792 flags.go:64] FLAG: --qos-reserved="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.424577 4792 flags.go:64] FLAG: --read-only-port="10255" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.424581 4792 flags.go:64] FLAG: --register-node="true" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.424584 4792 flags.go:64] FLAG: --register-schedulable="true" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.424589 4792 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.424596 4792 flags.go:64] FLAG: --registry-burst="10" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.424600 4792 flags.go:64] FLAG: --registry-qps="5" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.424604 4792 flags.go:64] FLAG: --reserved-cpus="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.424608 4792 flags.go:64] FLAG: --reserved-memory="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.424613 4792 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.424617 4792 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.424620 4792 flags.go:64] FLAG: --rotate-certificates="false" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.424625 4792 flags.go:64] FLAG: --rotate-server-certificates="false" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.424628 4792 flags.go:64] FLAG: --runonce="false" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.424632 4792 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 
09:07:15.424636 4792 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.424651 4792 flags.go:64] FLAG: --seccomp-default="false" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.424655 4792 flags.go:64] FLAG: --serialize-image-pulls="true" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.424659 4792 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.424663 4792 flags.go:64] FLAG: --storage-driver-db="cadvisor" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.424667 4792 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.424671 4792 flags.go:64] FLAG: --storage-driver-password="root" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.424675 4792 flags.go:64] FLAG: --storage-driver-secure="false" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.424679 4792 flags.go:64] FLAG: --storage-driver-table="stats" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.424683 4792 flags.go:64] FLAG: --storage-driver-user="root" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.424687 4792 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.424691 4792 flags.go:64] FLAG: --sync-frequency="1m0s" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.424695 4792 flags.go:64] FLAG: --system-cgroups="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.424699 4792 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.424705 4792 flags.go:64] FLAG: --system-reserved-cgroup="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.424709 4792 flags.go:64] FLAG: --tls-cert-file="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.424713 4792 flags.go:64] FLAG: --tls-cipher-suites="[]" Mar 09 09:07:15 
crc kubenswrapper[4792]: I0309 09:07:15.424718 4792 flags.go:64] FLAG: --tls-min-version="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.424722 4792 flags.go:64] FLAG: --tls-private-key-file="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.424726 4792 flags.go:64] FLAG: --topology-manager-policy="none" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.424730 4792 flags.go:64] FLAG: --topology-manager-policy-options="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.424734 4792 flags.go:64] FLAG: --topology-manager-scope="container" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.424738 4792 flags.go:64] FLAG: --v="2" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.424744 4792 flags.go:64] FLAG: --version="false" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.424749 4792 flags.go:64] FLAG: --vmodule="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.424754 4792 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.424759 4792 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.425010 4792 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.425018 4792 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.425022 4792 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.425026 4792 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.425030 4792 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.425033 4792 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 09 09:07:15 crc 
kubenswrapper[4792]: W0309 09:07:15.425037 4792 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.425045 4792 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.425049 4792 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.425052 4792 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.425056 4792 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.425086 4792 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.425090 4792 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.425094 4792 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.425097 4792 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.425101 4792 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.425104 4792 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.425111 4792 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.425115 4792 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.425121 4792 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.425157 4792 feature_gate.go:330] unrecognized 
feature gate: BareMetalLoadBalancer Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.425161 4792 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.425165 4792 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.425168 4792 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.425172 4792 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.425176 4792 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.425179 4792 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.425183 4792 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.425188 4792 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.425193 4792 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.425198 4792 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.425204 4792 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.425208 4792 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.425212 4792 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.425216 4792 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.425221 4792 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.425226 4792 feature_gate.go:330] unrecognized feature gate: Example Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.425230 4792 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.425234 4792 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.425238 4792 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.425241 4792 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.425365 4792 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.425370 4792 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.425373 4792 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.425377 4792 
feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.425381 4792 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.425384 4792 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.425388 4792 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.425392 4792 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.425397 4792 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.425400 4792 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.425404 4792 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.425407 4792 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.425411 4792 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.425417 4792 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.425421 4792 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.425424 4792 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.425428 4792 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.425431 4792 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 09 09:07:15 crc 
kubenswrapper[4792]: W0309 09:07:15.425436 4792 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.425440 4792 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.425444 4792 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.425448 4792 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.425452 4792 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.425456 4792 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.425460 4792 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.425466 4792 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.425469 4792 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.425473 4792 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.425476 4792 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.425480 4792 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.425492 4792 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false 
NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.435660 4792 server.go:491] "Kubelet version" kubeletVersion="v1.31.5" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.435692 4792 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.435766 4792 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.435775 4792 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.435780 4792 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.435785 4792 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.435790 4792 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.435794 4792 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.435798 4792 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.435802 4792 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.435806 4792 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.435810 4792 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 
09:07:15.435814 4792 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.435818 4792 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.435821 4792 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.435825 4792 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.435829 4792 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.435833 4792 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.435837 4792 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.435841 4792 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.435845 4792 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.435849 4792 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.435852 4792 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.435857 4792 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.435862 4792 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.435866 4792 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.435871 4792 feature_gate.go:330] unrecognized feature gate: 
NetworkDiagnosticsConfig Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.435875 4792 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.435879 4792 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.435883 4792 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.435887 4792 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.435891 4792 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.435895 4792 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.435898 4792 feature_gate.go:330] unrecognized feature gate: Example Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.435902 4792 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.435906 4792 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.435909 4792 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.435913 4792 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.435917 4792 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.435922 4792 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.435928 4792 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.435932 4792 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.435936 4792 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.435940 4792 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.435944 4792 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.435948 4792 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.435951 4792 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.435955 4792 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.435959 4792 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.435963 4792 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.435966 4792 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.435970 4792 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.435974 4792 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.435979 4792 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.435984 4792 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.435988 4792 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.435992 4792 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.435996 4792 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.436002 4792 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.436006 4792 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.436010 4792 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.436015 4792 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.436033 4792 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.436037 4792 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.436042 4792 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.436046 4792 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.436051 4792 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.436055 4792 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.436059 4792 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.436063 4792 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.436086 4792 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.436092 4792 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.436096 4792 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.436102 4792 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false 
ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.436237 4792 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.436244 4792 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.436248 4792 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.436253 4792 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.436257 4792 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.436261 4792 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.436267 4792 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.436271 4792 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.436275 4792 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.436279 4792 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.436283 4792 feature_gate.go:330] unrecognized feature gate: Example Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.436287 4792 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.436291 4792 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.436294 4792 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration 
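The "unrecognized feature gate" warnings above repeat once per kubelet configuration pass. When triaging a journal like this, a small pipeline can deduplicate and count them; this is a sketch assuming a POSIX shell, with a few abbreviated sample lines standing in for `journalctl -u kubelet` output on a live node:

```shell
# Count distinct "unrecognized feature gate" warnings.
# On a node, feed `journalctl -u kubelet --no-pager` into the same pipe;
# the printf below substitutes shortened copies of the entries above.
printf '%s\n' \
  'W0309 09:07:15.425407 4792 feature_gate.go:330] unrecognized feature gate: NewOLM' \
  'W0309 09:07:15.435866 4792 feature_gate.go:330] unrecognized feature gate: GatewayAPI' \
  'W0309 09:07:15.436535 4792 feature_gate.go:330] unrecognized feature gate: NewOLM' |
  grep -o 'unrecognized feature gate: [A-Za-z0-9]*' |
  sed 's/unrecognized feature gate: //' |
  sort | uniq -c | sort -rn
```

The counts make it easy to confirm the warning list is identical across passes rather than growing.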
Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.436298 4792 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.436302 4792 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.436306 4792 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.436310 4792 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.436314 4792 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.436318 4792 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.436322 4792 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.436326 4792 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.436330 4792 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.436334 4792 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.436338 4792 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.436342 4792 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.436346 4792 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.436350 4792 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.436354 
4792 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.436358 4792 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.436361 4792 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.436365 4792 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.436369 4792 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.436372 4792 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.436388 4792 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.436392 4792 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.436396 4792 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.436401 4792 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.436405 4792 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.436409 4792 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.436413 4792 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.436417 4792 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.436422 4792 feature_gate.go:351] Setting deprecated 
feature gate KMSv1=true. It will be removed in a future release. Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.436427 4792 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.436432 4792 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.436437 4792 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.436441 4792 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.436445 4792 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.436449 4792 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.436453 4792 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.436457 4792 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.436462 4792 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
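Alongside the warnings, each pass ends with an `I…` line summarizing the effective gates as `feature gates: {map[…]}`. To read such a line as one gate per row, a sed/tr split works; the sample `line` below is a shortened copy of the map printed in the log (on a node you would grep the real line from the journal):

```shell
# Split a kubelet "feature gates: {map[...]}" summary into one Gate:value per line.
# `line` is a shortened sample taken from the log above.
line='I0309 09:07:15.436546 4792 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true KMSv1:true NodeSwap:false ValidatingAdmissionPolicy:true]}'
echo "$line" | sed 's/.*{map\[//; s/\]}.*//' | tr ' ' '\n'
```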
Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.436466 4792 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.436470 4792 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.436475 4792 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.436478 4792 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.436483 4792 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.436486 4792 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.436491 4792 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.436495 4792 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.436498 4792 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.436502 4792 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.436506 4792 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.436510 4792 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.436514 4792 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.436518 4792 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.436522 4792 
feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.436526 4792 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.436531 4792 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.436535 4792 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.436540 4792 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.436546 4792 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.437405 4792 server.go:940] "Client rotation is on, will bootstrap in background" Mar 09 09:07:15 crc kubenswrapper[4792]: E0309 09:07:15.440259 4792 bootstrap.go:266] "Unhandled Error" err="part of the existing bootstrap client certificate in /var/lib/kubelet/kubeconfig is expired: 2026-02-24 05:52:08 +0000 UTC" logger="UnhandledError" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.445422 4792 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.445526 4792 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". 
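The `bootstrap.go:266` error above reports that part of the bootstrap client certificate referenced by `/var/lib/kubelet/kubeconfig` expired on 2026-02-24, which is why the kubelet falls back to requesting a fresh certificate with the bootstrap credentials. A certificate's validity window can be read with `openssl x509 -noout -dates`; the sketch below generates a throwaway self-signed certificate to demonstrate, since on a real node you would point the final command at `/var/lib/kubelet/pki/kubelet-client-current.pem` instead:

```shell
# Demo: print a PEM certificate's validity window with openssl.
# The throwaway cert stands in for /var/lib/kubelet/pki/kubelet-client-current.pem.
tmp="$(mktemp -d)"
openssl req -x509 -newkey rsa:2048 -nodes -days 1 \
  -keyout "$tmp/key.pem" -out "$tmp/cert.pem" -subj '/CN=demo' 2>/dev/null
openssl x509 -in "$tmp/cert.pem" -noout -dates   # prints notBefore= / notAfter=
rm -rf "$tmp"
```

Comparing `notAfter` with the journal timestamp (Mar 09 here, after the Feb 24 expiry) confirms the rotation path is expected rather than a misconfiguration.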
Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.447401 4792 server.go:997] "Starting client certificate rotation" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.447435 4792 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.447680 4792 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.474315 4792 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.476310 4792 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 09 09:07:15 crc kubenswrapper[4792]: E0309 09:07:15.477502 4792 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.200:6443: connect: connection refused" logger="UnhandledError" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.492029 4792 log.go:25] "Validated CRI v1 runtime API" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.537192 4792 log.go:25] "Validated CRI v1 image API" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.541694 4792 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.547839 4792 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2026-03-09-09-00-27-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.547886 4792 fs.go:134] 
Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}] Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.567729 4792 manager.go:217] Machine: {Timestamp:2026-03-09 09:07:15.564803728 +0000 UTC m=+0.595004520 CPUVendorID:AuthenticAMD NumCores:8 NumPhysicalCores:1 NumSockets:8 CpuFrequency:2799998 MemoryCapacity:25199476736 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:838abbcf-5467-42bb-9eb7-be30fe4962bb BootID:e3b5ac96-f3df-45c5-a4ac-24aa5703690c Filesystems:[{Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:3076108 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:12599738368 Type:vfs Inodes:3076108 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:5039898624 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:12599738368 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:2519945216 Type:vfs Inodes:615221 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:429496729600 Scheduler:none}] NetworkDevices:[{Name:br-ex 
MacAddress:fa:16:3e:ab:91:37 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:ab:91:37 Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:67:22:06 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:59:54:2a Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:8a:60:3b Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:29:8b:e2 Speed:-1 Mtu:1496} {Name:ens7.23 MacAddress:52:54:00:50:24:a1 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:4e:6b:bb:6e:69:45 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:2a:16:41:fe:fc:f0 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:25199476736 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} 
{Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.568029 4792 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.568310 4792 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.569662 4792 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.569891 4792 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.569934 4792 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" 
nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.570208 4792 topology_manager.go:138] "Creating topology manager with none policy" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.570220 4792 container_manager_linux.go:303] "Creating device plugin manager" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.570737 4792 manager.go:142] "Creating Device Plugin manager" 
path="/var/lib/kubelet/device-plugins/kubelet.sock" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.570774 4792 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.571721 4792 state_mem.go:36] "Initialized new in-memory state store" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.571825 4792 server.go:1245] "Using root directory" path="/var/lib/kubelet" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.579723 4792 kubelet.go:418] "Attempting to sync node with API server" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.579760 4792 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.579847 4792 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.579869 4792 kubelet.go:324] "Adding apiserver pod source" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.579887 4792 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.585158 4792 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.200:6443: connect: connection refused Mar 09 09:07:15 crc kubenswrapper[4792]: E0309 09:07:15.585221 4792 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.200:6443: connect: connection refused" logger="UnhandledError" Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.585653 4792 reflector.go:561] 
k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.200:6443: connect: connection refused Mar 09 09:07:15 crc kubenswrapper[4792]: E0309 09:07:15.585978 4792 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.200:6443: connect: connection refused" logger="UnhandledError" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.586227 4792 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.587111 4792 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". 
Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.589477 4792 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.590859 4792 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.590880 4792 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.590887 4792 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.590895 4792 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.590908 4792 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.590915 4792 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.590923 4792 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.590935 4792 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.590945 4792 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.590953 4792 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.590977 4792 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.590984 4792 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.591959 4792 plugins.go:603] "Loaded volume plugin" 
pluginName="kubernetes.io/csi" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.592362 4792 server.go:1280] "Started kubelet" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.593453 4792 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.593960 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.200:6443: connect: connection refused Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.593449 4792 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Mar 09 09:07:15 crc systemd[1]: Started Kubernetes Kubelet. Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.594600 4792 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.595121 4792 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.595197 4792 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Mar 09 09:07:15 crc kubenswrapper[4792]: E0309 09:07:15.595719 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.596345 4792 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.200:6443: connect: connection refused Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.595778 4792 volume_manager.go:287] "The desired_state_of_world populator starts" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.596504 4792 volume_manager.go:289] "Starting 
Kubelet Volume Manager" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.595790 4792 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Mar 09 09:07:15 crc kubenswrapper[4792]: E0309 09:07:15.596469 4792 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.200:6443: connect: connection refused" logger="UnhandledError" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.597004 4792 server.go:460] "Adding debug handlers to kubelet server" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.605825 4792 factory.go:55] Registering systemd factory Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.605856 4792 factory.go:221] Registration of the systemd container factory successfully Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.607013 4792 factory.go:153] Registering CRI-O factory Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.607057 4792 factory.go:221] Registration of the crio container factory successfully Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.607146 4792 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.607175 4792 factory.go:103] Registering Raw factory Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.607249 4792 manager.go:1196] Started watching for new ooms in manager Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.608057 4792 manager.go:319] Starting recovery of all containers Mar 09 09:07:15 crc kubenswrapper[4792]: E0309 09:07:15.606258 4792 event.go:368] "Unable to write event (may retry after sleeping)" 
err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.200:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.189b211117781d84 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:07:15.592330628 +0000 UTC m=+0.622531380,LastTimestamp:2026-03-09 09:07:15.592330628 +0000 UTC m=+0.622531380,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:07:15 crc kubenswrapper[4792]: E0309 09:07:15.599963 4792 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.200:6443: connect: connection refused" interval="200ms" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.620473 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.620533 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.620591 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" 
seLinuxMountContext="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.620612 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.620630 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.620647 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.620662 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.620683 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.620701 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.620716 
4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.620731 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.620747 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.620766 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.620794 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.620811 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.620829 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.620845 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.620858 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.620873 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.620888 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.620906 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.620920 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" 
volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.620935 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.620947 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.620987 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.621002 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.621023 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.621040 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" 
volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.621058 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.621090 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.621104 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.621118 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.621138 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.621153 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" 
volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.621169 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.621183 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.621197 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.621209 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.621221 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.621235 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" 
volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.623887 4792 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.623926 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.623946 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.623963 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.623980 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.623999 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" 
pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.624017 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.624032 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.624047 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.624061 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.624116 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.624133 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" 
volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.624150 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.624169 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.624184 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.624199 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.624214 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.624229 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" 
seLinuxMountContext="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.624244 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.624259 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.624276 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.624292 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.624307 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.624323 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Mar 09 09:07:15 crc 
kubenswrapper[4792]: I0309 09:07:15.624337 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.624351 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.624367 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.624382 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.624398 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.624413 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.624431 4792 reconstruct.go:130] "Volume is marked as 
uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.624448 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.624463 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.624492 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.624509 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.624521 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.624535 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.624548 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.624562 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.624577 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.624594 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.624609 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.624624 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" 
volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.624638 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.624653 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.624668 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.624683 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.624699 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.624715 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" 
seLinuxMountContext="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.624729 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.624743 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.624757 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.624772 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.624786 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.624801 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 
09:07:15.624817 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.624833 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.624848 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.624861 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.624876 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.624889 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.624903 4792 reconstruct.go:130] "Volume is marked as uncertain and added 
into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.624918 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.624932 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.624949 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.624969 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.624985 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.625001 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" 
volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.625016 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.625034 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.625051 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.625118 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.625137 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.625154 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.625169 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.625183 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.625201 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.625218 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.625236 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.625255 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" 
volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.625269 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.625283 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.625296 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.625310 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.625322 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.625335 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" 
volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.625348 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.625364 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.625387 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.625406 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.625419 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.625435 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" 
volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.625449 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.625463 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.625479 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.625494 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.625508 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.625524 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" 
Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.625539 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.625555 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.625569 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.625617 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.625631 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.625645 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.625661 4792 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.625675 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.625689 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.625703 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.625742 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.625758 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.625773 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.625787 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.625803 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.625818 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.625834 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.625847 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.625861 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" 
volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.625873 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.625887 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.625902 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.625918 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.625932 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.625947 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" 
seLinuxMountContext="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.625962 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.625974 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.625992 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.626009 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.626025 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.626040 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Mar 09 09:07:15 crc 
kubenswrapper[4792]: I0309 09:07:15.626054 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.626090 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.626106 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.626121 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.626135 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.626149 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.626163 4792 reconstruct.go:130] "Volume is marked as uncertain 
and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.626179 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.626197 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.626213 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.626228 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.626242 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.626259 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" 
volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.626273 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.626288 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.626302 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.626316 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.626328 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.626341 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" 
seLinuxMountContext="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.626360 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.626376 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.626393 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.626411 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.626426 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.626441 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Mar 09 09:07:15 crc 
kubenswrapper[4792]: I0309 09:07:15.626454 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.626468 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.626483 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.626497 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.626514 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.626531 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.626550 4792 reconstruct.go:130] "Volume 
is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.626564 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.626579 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.626593 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.626606 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.626621 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.626635 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the 
actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext=""
Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.626651 4792 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext=""
Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.626666 4792 reconstruct.go:97] "Volume reconstruction finished"
Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.626676 4792 reconciler.go:26] "Reconciler: start to sync state"
Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.632510 4792 manager.go:324] Recovery completed
Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.645032 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.651403 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.651445 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.651457 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.652127 4792 cpu_manager.go:225] "Starting CPU manager" policy="none"
Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.652145 4792 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s"
Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.652166 4792 state_mem.go:36] "Initialized new in-memory state store"
Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.658796 4792 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4"
Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.661033 4792 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6"
Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.661087 4792 status_manager.go:217] "Starting to sync pod status with apiserver"
Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.661121 4792 kubelet.go:2335] "Starting kubelet main sync loop"
Mar 09 09:07:15 crc kubenswrapper[4792]: E0309 09:07:15.661239 4792 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
Mar 09 09:07:15 crc kubenswrapper[4792]: W0309 09:07:15.664385 4792 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.200:6443: connect: connection refused
Mar 09 09:07:15 crc kubenswrapper[4792]: E0309 09:07:15.664453 4792 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.200:6443: connect: connection refused" logger="UnhandledError"
Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.686791 4792 policy_none.go:49] "None policy: Start"
Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.687805 4792 memory_manager.go:170] "Starting memorymanager" policy="None"
Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.687830 4792 state_mem.go:35] "Initializing new in-memory state store"
Mar 09 09:07:15 crc kubenswrapper[4792]: E0309 09:07:15.697236 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.748210 4792 manager.go:334] "Starting Device Plugin manager"
Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.748256 4792 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found"
Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.748269 4792 server.go:79] "Starting device plugin registration server"
Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.748683 4792 eviction_manager.go:189] "Eviction manager: starting control loop"
Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.748701 4792 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.748924 4792 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.748997 4792 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.749006 4792 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Mar 09 09:07:15 crc kubenswrapper[4792]: E0309 09:07:15.756147 4792 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.761703 4792 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc","openshift-machine-config-operator/kube-rbac-proxy-crio-crc"]
Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.761841 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.762941 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.763010 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.763021 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.763318 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.763356 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.763869 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.767359 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.767378 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.767398 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.767415 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.767417 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.767446 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.767815 4792 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.767887 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.767890 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.771029 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.771060 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.771105 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.771151 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.771185 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.771204 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.771399 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.771495 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.771533 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.773361 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.773407 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.773431 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.773626 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.773724 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.773752 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.774471 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.774500 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.774515 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.774524 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.774545 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.774558 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.775226 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.775257 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.775288 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.775457 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.775487 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.777619 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.777669 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.777690 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:07:15 crc kubenswrapper[4792]: E0309 09:07:15.809463 4792 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.200:6443: connect: connection refused" interval="400ms" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.829216 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.829375 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.829456 4792 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.829536 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.829603 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.829728 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.829806 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.829876 4792 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.829945 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.830035 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.830141 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.830217 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.830293 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: 
\"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.830361 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.830434 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.849085 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.850565 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.850625 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.850642 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.850673 4792 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 09 09:07:15 crc kubenswrapper[4792]: E0309 09:07:15.851127 4792 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.200:6443: connect: connection refused" node="crc" Mar 
09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.932013 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.932956 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.933039 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.933195 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.933305 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.933437 4792 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.933547 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.933663 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.933237 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.933504 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.933384 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: 
\"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.932203 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.933874 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.933681 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.934118 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.934260 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " 
pod="openshift-etcd/etcd-crc" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.934376 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.934445 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.934189 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.934325 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.934651 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.934751 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.934851 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.934931 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.934752 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.934783 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.935095 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 09 09:07:15 crc kubenswrapper[4792]: 
I0309 09:07:15.935175 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.935278 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 09 09:07:15 crc kubenswrapper[4792]: I0309 09:07:15.935374 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 09 09:07:16 crc kubenswrapper[4792]: I0309 09:07:16.051811 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 09:07:16 crc kubenswrapper[4792]: I0309 09:07:16.054401 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:07:16 crc kubenswrapper[4792]: I0309 09:07:16.054463 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:07:16 crc kubenswrapper[4792]: I0309 09:07:16.054484 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:07:16 crc kubenswrapper[4792]: I0309 09:07:16.054524 4792 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 09 09:07:16 crc kubenswrapper[4792]: E0309 09:07:16.055042 4792 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 
38.102.83.200:6443: connect: connection refused" node="crc" Mar 09 09:07:16 crc kubenswrapper[4792]: I0309 09:07:16.098990 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Mar 09 09:07:16 crc kubenswrapper[4792]: I0309 09:07:16.104301 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 09:07:16 crc kubenswrapper[4792]: I0309 09:07:16.123448 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 09 09:07:16 crc kubenswrapper[4792]: I0309 09:07:16.136911 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 09 09:07:16 crc kubenswrapper[4792]: I0309 09:07:16.142469 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 09 09:07:16 crc kubenswrapper[4792]: W0309 09:07:16.152546 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-0990a7ba9c90b43708e40abcd9c8a27b58d150b64bd5889fb0a84ae9ac40daf8 WatchSource:0}: Error finding container 0990a7ba9c90b43708e40abcd9c8a27b58d150b64bd5889fb0a84ae9ac40daf8: Status 404 returned error can't find the container with id 0990a7ba9c90b43708e40abcd9c8a27b58d150b64bd5889fb0a84ae9ac40daf8 Mar 09 09:07:16 crc kubenswrapper[4792]: W0309 09:07:16.154876 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-e05a47745682c9d9941c78db389c7a87e3dcf9119eb589f8c2b9cc90508b0115 WatchSource:0}: Error finding container e05a47745682c9d9941c78db389c7a87e3dcf9119eb589f8c2b9cc90508b0115: Status 404 returned error 
can't find the container with id e05a47745682c9d9941c78db389c7a87e3dcf9119eb589f8c2b9cc90508b0115 Mar 09 09:07:16 crc kubenswrapper[4792]: W0309 09:07:16.166274 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-1b7fd0fd13657af5c26d50a03d21d15f026655084104ac797a0bf8617c11a408 WatchSource:0}: Error finding container 1b7fd0fd13657af5c26d50a03d21d15f026655084104ac797a0bf8617c11a408: Status 404 returned error can't find the container with id 1b7fd0fd13657af5c26d50a03d21d15f026655084104ac797a0bf8617c11a408 Mar 09 09:07:16 crc kubenswrapper[4792]: W0309 09:07:16.167626 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-62d345a9d20cc01e5e1b28d3b80c8286988b24a90131932e273e1bb64a66206e WatchSource:0}: Error finding container 62d345a9d20cc01e5e1b28d3b80c8286988b24a90131932e273e1bb64a66206e: Status 404 returned error can't find the container with id 62d345a9d20cc01e5e1b28d3b80c8286988b24a90131932e273e1bb64a66206e Mar 09 09:07:16 crc kubenswrapper[4792]: W0309 09:07:16.170380 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-7059712e162aa8ef8ead4ad77c0af05905c99e531c073e3722b25539c218a116 WatchSource:0}: Error finding container 7059712e162aa8ef8ead4ad77c0af05905c99e531c073e3722b25539c218a116: Status 404 returned error can't find the container with id 7059712e162aa8ef8ead4ad77c0af05905c99e531c073e3722b25539c218a116 Mar 09 09:07:16 crc kubenswrapper[4792]: E0309 09:07:16.211387 4792 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.200:6443: connect: 
connection refused" interval="800ms" Mar 09 09:07:16 crc kubenswrapper[4792]: W0309 09:07:16.442752 4792 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.200:6443: connect: connection refused Mar 09 09:07:16 crc kubenswrapper[4792]: E0309 09:07:16.442853 4792 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.200:6443: connect: connection refused" logger="UnhandledError" Mar 09 09:07:16 crc kubenswrapper[4792]: I0309 09:07:16.455525 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 09:07:16 crc kubenswrapper[4792]: I0309 09:07:16.456947 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:07:16 crc kubenswrapper[4792]: I0309 09:07:16.457026 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:07:16 crc kubenswrapper[4792]: I0309 09:07:16.457047 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:07:16 crc kubenswrapper[4792]: I0309 09:07:16.457114 4792 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 09 09:07:16 crc kubenswrapper[4792]: E0309 09:07:16.457625 4792 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.200:6443: connect: connection refused" node="crc" Mar 09 09:07:16 crc kubenswrapper[4792]: I0309 09:07:16.595043 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get 
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.200:6443: connect: connection refused Mar 09 09:07:16 crc kubenswrapper[4792]: I0309 09:07:16.667053 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"1b7fd0fd13657af5c26d50a03d21d15f026655084104ac797a0bf8617c11a408"} Mar 09 09:07:16 crc kubenswrapper[4792]: I0309 09:07:16.668592 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"7059712e162aa8ef8ead4ad77c0af05905c99e531c073e3722b25539c218a116"} Mar 09 09:07:16 crc kubenswrapper[4792]: I0309 09:07:16.669965 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"62d345a9d20cc01e5e1b28d3b80c8286988b24a90131932e273e1bb64a66206e"} Mar 09 09:07:16 crc kubenswrapper[4792]: I0309 09:07:16.671416 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"e05a47745682c9d9941c78db389c7a87e3dcf9119eb589f8c2b9cc90508b0115"} Mar 09 09:07:16 crc kubenswrapper[4792]: I0309 09:07:16.672385 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"0990a7ba9c90b43708e40abcd9c8a27b58d150b64bd5889fb0a84ae9ac40daf8"} Mar 09 09:07:16 crc kubenswrapper[4792]: W0309 09:07:16.682406 4792 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get 
"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.200:6443: connect: connection refused Mar 09 09:07:16 crc kubenswrapper[4792]: E0309 09:07:16.682506 4792 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.200:6443: connect: connection refused" logger="UnhandledError" Mar 09 09:07:16 crc kubenswrapper[4792]: W0309 09:07:16.716611 4792 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.200:6443: connect: connection refused Mar 09 09:07:16 crc kubenswrapper[4792]: E0309 09:07:16.716688 4792 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.200:6443: connect: connection refused" logger="UnhandledError" Mar 09 09:07:16 crc kubenswrapper[4792]: W0309 09:07:16.980109 4792 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.200:6443: connect: connection refused Mar 09 09:07:16 crc kubenswrapper[4792]: E0309 09:07:16.980182 4792 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.200:6443: 
connect: connection refused" logger="UnhandledError" Mar 09 09:07:17 crc kubenswrapper[4792]: E0309 09:07:17.012689 4792 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.200:6443: connect: connection refused" interval="1.6s" Mar 09 09:07:17 crc kubenswrapper[4792]: I0309 09:07:17.258196 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 09:07:17 crc kubenswrapper[4792]: I0309 09:07:17.259658 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:07:17 crc kubenswrapper[4792]: I0309 09:07:17.259713 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:07:17 crc kubenswrapper[4792]: I0309 09:07:17.259722 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:07:17 crc kubenswrapper[4792]: I0309 09:07:17.259749 4792 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 09 09:07:17 crc kubenswrapper[4792]: E0309 09:07:17.260432 4792 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.200:6443: connect: connection refused" node="crc" Mar 09 09:07:17 crc kubenswrapper[4792]: I0309 09:07:17.594726 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.200:6443: connect: connection refused Mar 09 09:07:17 crc kubenswrapper[4792]: I0309 09:07:17.663240 4792 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 09 09:07:17 crc kubenswrapper[4792]: E0309 09:07:17.663963 
4792 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.200:6443: connect: connection refused" logger="UnhandledError" Mar 09 09:07:17 crc kubenswrapper[4792]: I0309 09:07:17.680303 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"d3d9cf24a9d5a60bcaea9c6889c23037b8b47d7eb60c2458147579cd9ec75176"} Mar 09 09:07:17 crc kubenswrapper[4792]: I0309 09:07:17.680343 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"544366c65452de29fd10c69bba980e991f5e2a3a09e98a9e66a050b1a06d4280"} Mar 09 09:07:17 crc kubenswrapper[4792]: I0309 09:07:17.680353 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"0f14066d111e6fa1a8c98be79fb37f7a32d143d503ec51e49a36d039b7d464b4"} Mar 09 09:07:17 crc kubenswrapper[4792]: I0309 09:07:17.680362 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"82feb47e68b8db8323ed2c02d83e92016fa30e024581d7e361ba07a08919e2ae"} Mar 09 09:07:17 crc kubenswrapper[4792]: I0309 09:07:17.680693 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 09:07:17 crc kubenswrapper[4792]: I0309 09:07:17.681672 4792 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:07:17 crc kubenswrapper[4792]: I0309 09:07:17.681694 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:07:17 crc kubenswrapper[4792]: I0309 09:07:17.681702 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:07:17 crc kubenswrapper[4792]: I0309 09:07:17.683308 4792 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="1e250b106997f151ae9c435aca2ab3d3d821f40e826afa9ff744443a6b808571" exitCode=0 Mar 09 09:07:17 crc kubenswrapper[4792]: I0309 09:07:17.683354 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"1e250b106997f151ae9c435aca2ab3d3d821f40e826afa9ff744443a6b808571"} Mar 09 09:07:17 crc kubenswrapper[4792]: I0309 09:07:17.683541 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 09:07:17 crc kubenswrapper[4792]: I0309 09:07:17.684350 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:07:17 crc kubenswrapper[4792]: I0309 09:07:17.684394 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:07:17 crc kubenswrapper[4792]: I0309 09:07:17.684413 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:07:17 crc kubenswrapper[4792]: I0309 09:07:17.685319 4792 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="ea228ca60beeed2978d4efcc092828a381aaf0e00e0baf2a733c75cb15642aac" exitCode=0 Mar 09 09:07:17 crc kubenswrapper[4792]: I0309 09:07:17.685505 4792 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"ea228ca60beeed2978d4efcc092828a381aaf0e00e0baf2a733c75cb15642aac"} Mar 09 09:07:17 crc kubenswrapper[4792]: I0309 09:07:17.685574 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 09:07:17 crc kubenswrapper[4792]: I0309 09:07:17.686976 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 09:07:17 crc kubenswrapper[4792]: I0309 09:07:17.689140 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:07:17 crc kubenswrapper[4792]: I0309 09:07:17.689162 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:07:17 crc kubenswrapper[4792]: I0309 09:07:17.689176 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:07:17 crc kubenswrapper[4792]: I0309 09:07:17.690596 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:07:17 crc kubenswrapper[4792]: I0309 09:07:17.690622 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:07:17 crc kubenswrapper[4792]: I0309 09:07:17.690634 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:07:17 crc kubenswrapper[4792]: I0309 09:07:17.694046 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"76e023bbfb6d4a1c42830654b83b26c24cddefd808ee765ebef670e8b10910b9"} Mar 09 09:07:17 crc kubenswrapper[4792]: I0309 09:07:17.694343 4792 kubelet_node_status.go:401] 
"Setting node annotation to enable volume controller attach/detach" Mar 09 09:07:17 crc kubenswrapper[4792]: I0309 09:07:17.694336 4792 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="76e023bbfb6d4a1c42830654b83b26c24cddefd808ee765ebef670e8b10910b9" exitCode=0 Mar 09 09:07:17 crc kubenswrapper[4792]: I0309 09:07:17.695812 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:07:17 crc kubenswrapper[4792]: I0309 09:07:17.696103 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:07:17 crc kubenswrapper[4792]: I0309 09:07:17.696291 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:07:17 crc kubenswrapper[4792]: I0309 09:07:17.697806 4792 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="2775cf70a937d3b2d439abcc43d3f09389f296ff4e0b9339b30f9b2f4d5a28bb" exitCode=0 Mar 09 09:07:17 crc kubenswrapper[4792]: I0309 09:07:17.697861 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"2775cf70a937d3b2d439abcc43d3f09389f296ff4e0b9339b30f9b2f4d5a28bb"} Mar 09 09:07:17 crc kubenswrapper[4792]: I0309 09:07:17.698049 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 09:07:17 crc kubenswrapper[4792]: I0309 09:07:17.702975 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:07:17 crc kubenswrapper[4792]: I0309 09:07:17.703098 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:07:17 crc kubenswrapper[4792]: I0309 09:07:17.703118 4792 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:07:18 crc kubenswrapper[4792]: I0309 09:07:18.595306 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.200:6443: connect: connection refused Mar 09 09:07:18 crc kubenswrapper[4792]: E0309 09:07:18.613425 4792 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.200:6443: connect: connection refused" interval="3.2s" Mar 09 09:07:18 crc kubenswrapper[4792]: I0309 09:07:18.707631 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"08e5e8d9946c1c7ec4aa7ad59c1c8630fb44d2e7312e4a3b9c4e11d821fd76c7"} Mar 09 09:07:18 crc kubenswrapper[4792]: I0309 09:07:18.707673 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"fed911be84528f8d1ff84a7935ed4cec34862d88edfc4fe0315e0c4146c2fceb"} Mar 09 09:07:18 crc kubenswrapper[4792]: I0309 09:07:18.707683 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"009a3753ff77a9835a1148b320e30a1456103acfa142693bc6835d86911c3f96"} Mar 09 09:07:18 crc kubenswrapper[4792]: I0309 09:07:18.707801 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 09:07:18 crc kubenswrapper[4792]: I0309 09:07:18.708593 4792 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 09:07:18 crc kubenswrapper[4792]: I0309 09:07:18.708615 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 09:07:18 crc kubenswrapper[4792]: I0309 09:07:18.708624 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 09:07:18 crc kubenswrapper[4792]: I0309 09:07:18.711272 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"9cf484f9a832b0e147a17ec53e64cfcda5e37f8bf1f764ddc35215a079994b71"}
Mar 09 09:07:18 crc kubenswrapper[4792]: I0309 09:07:18.711309 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"314838f53bc19a9f3eb7fd9d3f5473b23a177f2a3068d91f1b0420c27910d409"}
Mar 09 09:07:18 crc kubenswrapper[4792]: I0309 09:07:18.711318 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"226eecaea6fec5a3ae93063702c719edf3908636a2862b9f50a874f494a19ccf"}
Mar 09 09:07:18 crc kubenswrapper[4792]: I0309 09:07:18.711326 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"c1070465f72d99ed22e913112259837db7d789c0a072b40956088f4a70162c41"}
Mar 09 09:07:18 crc kubenswrapper[4792]: I0309 09:07:18.715810 4792 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="e86823681f24457a83404aa4739988af2b8abf63506d49c1db09bee7501a7548" exitCode=0
Mar 09 09:07:18 crc kubenswrapper[4792]: I0309 09:07:18.715860 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"e86823681f24457a83404aa4739988af2b8abf63506d49c1db09bee7501a7548"}
Mar 09 09:07:18 crc kubenswrapper[4792]: I0309 09:07:18.715959 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 09 09:07:18 crc kubenswrapper[4792]: I0309 09:07:18.716552 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 09:07:18 crc kubenswrapper[4792]: I0309 09:07:18.716578 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 09:07:18 crc kubenswrapper[4792]: I0309 09:07:18.716588 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 09:07:18 crc kubenswrapper[4792]: I0309 09:07:18.721743 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"8f6d16099e4ca2921b039c7fac87c3a9b3ee4780783ad52207c77ea2891942d6"}
Mar 09 09:07:18 crc kubenswrapper[4792]: I0309 09:07:18.721818 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 09 09:07:18 crc kubenswrapper[4792]: I0309 09:07:18.721881 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 09 09:07:18 crc kubenswrapper[4792]: I0309 09:07:18.723226 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 09:07:18 crc kubenswrapper[4792]: I0309 09:07:18.723254 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 09:07:18 crc kubenswrapper[4792]: I0309 09:07:18.723264 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 09:07:18 crc kubenswrapper[4792]: I0309 09:07:18.723288 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 09:07:18 crc kubenswrapper[4792]: I0309 09:07:18.723311 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 09:07:18 crc kubenswrapper[4792]: I0309 09:07:18.723320 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 09:07:18 crc kubenswrapper[4792]: I0309 09:07:18.861296 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 09 09:07:18 crc kubenswrapper[4792]: I0309 09:07:18.863260 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 09:07:18 crc kubenswrapper[4792]: I0309 09:07:18.863321 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 09:07:18 crc kubenswrapper[4792]: I0309 09:07:18.863337 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 09:07:18 crc kubenswrapper[4792]: I0309 09:07:18.863393 4792 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 09 09:07:18 crc kubenswrapper[4792]: E0309 09:07:18.864220 4792 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.200:6443: connect: connection refused" node="crc"
Mar 09 09:07:19 crc kubenswrapper[4792]: W0309 09:07:19.146654 4792 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.200:6443: connect: connection refused
Mar 09 09:07:19 crc kubenswrapper[4792]: E0309 09:07:19.146730 4792 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.200:6443: connect: connection refused" logger="UnhandledError"
Mar 09 09:07:19 crc kubenswrapper[4792]: I0309 09:07:19.728204 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"1131753af6a772f1d6ebb1e35075d8df2efb1450609361284e8a2b29e3d9e933"}
Mar 09 09:07:19 crc kubenswrapper[4792]: I0309 09:07:19.728305 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 09 09:07:19 crc kubenswrapper[4792]: I0309 09:07:19.729458 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 09:07:19 crc kubenswrapper[4792]: I0309 09:07:19.729485 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 09:07:19 crc kubenswrapper[4792]: I0309 09:07:19.729496 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 09:07:19 crc kubenswrapper[4792]: I0309 09:07:19.731233 4792 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="12c05ec878f2865338912d3d41f970b88835d7e7d9719a119eb4d06955850617" exitCode=0
Mar 09 09:07:19 crc kubenswrapper[4792]: I0309 09:07:19.731298 4792 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 09 09:07:19 crc kubenswrapper[4792]: I0309 09:07:19.731325 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 09 09:07:19 crc kubenswrapper[4792]: I0309 09:07:19.731375 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"12c05ec878f2865338912d3d41f970b88835d7e7d9719a119eb4d06955850617"}
Mar 09 09:07:19 crc kubenswrapper[4792]: I0309 09:07:19.731438 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 09 09:07:19 crc kubenswrapper[4792]: I0309 09:07:19.731477 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 09 09:07:19 crc kubenswrapper[4792]: I0309 09:07:19.732891 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 09:07:19 crc kubenswrapper[4792]: I0309 09:07:19.732900 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 09:07:19 crc kubenswrapper[4792]: I0309 09:07:19.732937 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 09:07:19 crc kubenswrapper[4792]: I0309 09:07:19.732959 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 09:07:19 crc kubenswrapper[4792]: I0309 09:07:19.732909 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 09:07:19 crc kubenswrapper[4792]: I0309 09:07:19.733009 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 09:07:19 crc kubenswrapper[4792]: I0309 09:07:19.733020 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 09:07:19 crc kubenswrapper[4792]: I0309 09:07:19.733058 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 09:07:19 crc kubenswrapper[4792]: I0309 09:07:19.733147 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 09:07:20 crc kubenswrapper[4792]: I0309 09:07:20.224891 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 09 09:07:20 crc kubenswrapper[4792]: I0309 09:07:20.736834 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 09 09:07:20 crc kubenswrapper[4792]: I0309 09:07:20.737262 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"a1cc1ddaa2a5291284d4281d64b3d0aebf06cee6ef23da3a608ca06b240c8e4e"}
Mar 09 09:07:20 crc kubenswrapper[4792]: I0309 09:07:20.737287 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"58926f05ec9e42e1dc7aa5c3ab9950537c031f3838fb18036ce2a84b2b2ce147"}
Mar 09 09:07:20 crc kubenswrapper[4792]: I0309 09:07:20.737298 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"48162a23c55321a8b6855318c9d71661a5cca0af913d359711cf6685b332994e"}
Mar 09 09:07:20 crc kubenswrapper[4792]: I0309 09:07:20.737306 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"6d9d3cba4dfbe99ce715a9ee40af9171a41ccb193306f9b106588cc4b5f620e3"}
Mar 09 09:07:20 crc kubenswrapper[4792]: I0309 09:07:20.737664 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 09:07:20 crc kubenswrapper[4792]: I0309 09:07:20.737683 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 09:07:20 crc kubenswrapper[4792]: I0309 09:07:20.737691 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 09:07:20 crc kubenswrapper[4792]: I0309 09:07:20.880773 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 09 09:07:20 crc kubenswrapper[4792]: I0309 09:07:20.880996 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 09 09:07:20 crc kubenswrapper[4792]: I0309 09:07:20.882476 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 09:07:20 crc kubenswrapper[4792]: I0309 09:07:20.882519 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 09:07:20 crc kubenswrapper[4792]: I0309 09:07:20.882528 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 09:07:21 crc kubenswrapper[4792]: I0309 09:07:21.078859 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 09 09:07:21 crc kubenswrapper[4792]: I0309 09:07:21.079207 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 09 09:07:21 crc kubenswrapper[4792]: I0309 09:07:21.081266 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 09:07:21 crc kubenswrapper[4792]: I0309 09:07:21.081349 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 09:07:21 crc kubenswrapper[4792]: I0309 09:07:21.081380 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 09:07:21 crc kubenswrapper[4792]: I0309 09:07:21.744712 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 09 09:07:21 crc kubenswrapper[4792]: I0309 09:07:21.745189 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 09 09:07:21 crc kubenswrapper[4792]: I0309 09:07:21.745462 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"3a8bf3c896aae857de56db687c87ea164520667dc71fa495e3c396b478ab472e"}
Mar 09 09:07:21 crc kubenswrapper[4792]: I0309 09:07:21.746122 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 09:07:21 crc kubenswrapper[4792]: I0309 09:07:21.746146 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 09:07:21 crc kubenswrapper[4792]: I0309 09:07:21.746155 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 09:07:21 crc kubenswrapper[4792]: I0309 09:07:21.746630 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 09:07:21 crc kubenswrapper[4792]: I0309 09:07:21.746649 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 09:07:21 crc kubenswrapper[4792]: I0309 09:07:21.746657 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 09:07:21 crc kubenswrapper[4792]: I0309 09:07:21.901532 4792 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Mar 09 09:07:22 crc kubenswrapper[4792]: I0309 09:07:22.065180 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 09 09:07:22 crc kubenswrapper[4792]: I0309 09:07:22.066503 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 09:07:22 crc kubenswrapper[4792]: I0309 09:07:22.066583 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 09:07:22 crc kubenswrapper[4792]: I0309 09:07:22.066639 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 09:07:22 crc kubenswrapper[4792]: I0309 09:07:22.066712 4792 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 09 09:07:22 crc kubenswrapper[4792]: I0309 09:07:22.747222 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 09 09:07:22 crc kubenswrapper[4792]: I0309 09:07:22.748290 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 09:07:22 crc kubenswrapper[4792]: I0309 09:07:22.748316 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 09:07:22 crc kubenswrapper[4792]: I0309 09:07:22.748327 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 09:07:22 crc kubenswrapper[4792]: I0309 09:07:22.863152 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 09 09:07:22 crc kubenswrapper[4792]: I0309 09:07:22.863265 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 09 09:07:22 crc kubenswrapper[4792]: I0309 09:07:22.863968 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 09:07:22 crc kubenswrapper[4792]: I0309 09:07:22.863989 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 09:07:22 crc kubenswrapper[4792]: I0309 09:07:22.863996 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 09:07:23 crc kubenswrapper[4792]: I0309 09:07:23.881296 4792 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 09 09:07:23 crc kubenswrapper[4792]: I0309 09:07:23.882251 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 09 09:07:24 crc kubenswrapper[4792]: I0309 09:07:24.556594 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 09 09:07:24 crc kubenswrapper[4792]: I0309 09:07:24.556771 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 09 09:07:24 crc kubenswrapper[4792]: I0309 09:07:24.557727 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 09:07:24 crc kubenswrapper[4792]: I0309 09:07:24.557752 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 09:07:24 crc kubenswrapper[4792]: I0309 09:07:24.557762 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 09:07:25 crc kubenswrapper[4792]: I0309 09:07:25.368838 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc"
Mar 09 09:07:25 crc kubenswrapper[4792]: I0309 09:07:25.369001 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 09 09:07:25 crc kubenswrapper[4792]: I0309 09:07:25.369948 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 09:07:25 crc kubenswrapper[4792]: I0309 09:07:25.370036 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 09:07:25 crc kubenswrapper[4792]: I0309 09:07:25.370138 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 09:07:25 crc kubenswrapper[4792]: I0309 09:07:25.374347 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 09 09:07:25 crc kubenswrapper[4792]: I0309 09:07:25.374626 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 09 09:07:25 crc kubenswrapper[4792]: I0309 09:07:25.375905 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 09:07:25 crc kubenswrapper[4792]: I0309 09:07:25.375943 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 09:07:25 crc kubenswrapper[4792]: I0309 09:07:25.375952 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 09:07:25 crc kubenswrapper[4792]: I0309 09:07:25.380029 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 09 09:07:25 crc kubenswrapper[4792]: I0309 09:07:25.755308 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 09 09:07:25 crc kubenswrapper[4792]: E0309 09:07:25.756240 4792 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 09 09:07:25 crc kubenswrapper[4792]: I0309 09:07:25.757117 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 09:07:25 crc kubenswrapper[4792]: I0309 09:07:25.757156 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 09:07:25 crc kubenswrapper[4792]: I0309 09:07:25.757171 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 09:07:25 crc kubenswrapper[4792]: I0309 09:07:25.967847 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 09 09:07:26 crc kubenswrapper[4792]: I0309 09:07:26.394223 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc"
Mar 09 09:07:26 crc kubenswrapper[4792]: I0309 09:07:26.394462 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 09 09:07:26 crc kubenswrapper[4792]: I0309 09:07:26.395780 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 09:07:26 crc kubenswrapper[4792]: I0309 09:07:26.395816 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 09:07:26 crc kubenswrapper[4792]: I0309 09:07:26.395829 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 09:07:26 crc kubenswrapper[4792]: I0309 09:07:26.757786 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 09 09:07:26 crc kubenswrapper[4792]: I0309 09:07:26.758742 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 09:07:26 crc kubenswrapper[4792]: I0309 09:07:26.758792 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 09:07:26 crc kubenswrapper[4792]: I0309 09:07:26.758806 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 09:07:26 crc kubenswrapper[4792]: I0309 09:07:26.762241 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 09 09:07:27 crc kubenswrapper[4792]: I0309 09:07:27.504291 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 09 09:07:27 crc kubenswrapper[4792]: I0309 09:07:27.760035 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 09 09:07:27 crc kubenswrapper[4792]: I0309 09:07:27.761204 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 09:07:27 crc kubenswrapper[4792]: I0309 09:07:27.761256 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 09:07:27 crc kubenswrapper[4792]: I0309 09:07:27.761274 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 09:07:28 crc kubenswrapper[4792]: I0309 09:07:28.762668 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 09 09:07:28 crc kubenswrapper[4792]: I0309 09:07:28.763651 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 09:07:28 crc kubenswrapper[4792]: I0309 09:07:28.763688 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 09:07:28 crc kubenswrapper[4792]: I0309 09:07:28.763698 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 09:07:29 crc kubenswrapper[4792]: W0309 09:07:29.591931 4792 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout
Mar 09 09:07:29 crc kubenswrapper[4792]: I0309 09:07:29.592051 4792 trace.go:236] Trace[2038438274]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (09-Mar-2026 09:07:19.589) (total time: 10002ms):
Mar 09 09:07:29 crc kubenswrapper[4792]: Trace[2038438274]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10002ms (09:07:29.591)
Mar 09 09:07:29 crc kubenswrapper[4792]: Trace[2038438274]: [10.002117586s] [10.002117586s] END
Mar 09 09:07:29 crc kubenswrapper[4792]: E0309 09:07:29.592094 4792 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError"
Mar 09 09:07:29 crc kubenswrapper[4792]: I0309 09:07:29.596622 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout
Mar 09 09:07:29 crc kubenswrapper[4792]: W0309 09:07:29.638455 4792 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout
Mar 09 09:07:29 crc kubenswrapper[4792]: I0309 09:07:29.638587 4792 trace.go:236] Trace[233215405]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (09-Mar-2026 09:07:19.637) (total time: 10001ms):
Mar 09 09:07:29 crc kubenswrapper[4792]: Trace[233215405]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (09:07:29.638)
Mar 09 09:07:29 crc kubenswrapper[4792]: Trace[233215405]: [10.001551922s] [10.001551922s] END
Mar 09 09:07:29 crc kubenswrapper[4792]: E0309 09:07:29.638618 4792 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError"
Mar 09 09:07:29 crc kubenswrapper[4792]: W0309 09:07:29.850404 4792 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout
Mar 09 09:07:29 crc kubenswrapper[4792]: I0309 09:07:29.850484 4792 trace.go:236] Trace[1514132355]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (09-Mar-2026 09:07:19.849) (total time: 10001ms):
Mar 09 09:07:29 crc kubenswrapper[4792]: Trace[1514132355]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (09:07:29.850)
Mar 09 09:07:29 crc kubenswrapper[4792]: Trace[1514132355]: [10.001247664s] [10.001247664s] END
Mar 09 09:07:29 crc kubenswrapper[4792]: E0309 09:07:29.850506 4792 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError"
Mar 09 09:07:30 crc kubenswrapper[4792]: E0309 09:07:30.051307 4792 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:07:30Z is after 2026-02-23T05:33:13Z" interval="6.4s"
Mar 09 09:07:30 crc kubenswrapper[4792]: E0309 09:07:30.056704 4792 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:07:30Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 09 09:07:30 crc kubenswrapper[4792]: E0309 09:07:30.057759 4792 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:07:30Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189b211117781d84 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:07:15.592330628 +0000 UTC m=+0.622531380,LastTimestamp:2026-03-09 09:07:15.592330628 +0000 UTC m=+0.622531380,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 09 09:07:30 crc kubenswrapper[4792]: I0309 09:07:30.058413 4792 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\": RBAC: [clusterrole.rbac.authorization.k8s.io \"system:public-info-viewer\" not found, clusterrole.rbac.authorization.k8s.io \"system:openshift:public-info-viewer\" not found]","reason":"Forbidden","details":{},"code":403}
Mar 09 09:07:30 crc kubenswrapper[4792]: I0309 09:07:30.058461 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403"
Mar 09 09:07:30 crc kubenswrapper[4792]: W0309 09:07:30.062432 4792 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:07:30Z is after 2026-02-23T05:33:13Z
Mar 09 09:07:30 crc kubenswrapper[4792]: E0309 09:07:30.062490 4792 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:07:30Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 09 09:07:30 crc kubenswrapper[4792]: I0309 09:07:30.065006 4792 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\": RBAC: [clusterrole.rbac.authorization.k8s.io \"system:openshift:public-info-viewer\" not found, clusterrole.rbac.authorization.k8s.io \"system:public-info-viewer\" not found]","reason":"Forbidden","details":{},"code":403}
Mar 09 09:07:30 crc kubenswrapper[4792]: I0309 09:07:30.065044 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403"
Mar 09 09:07:30 crc kubenswrapper[4792]: E0309 09:07:30.071122 4792 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:07:30Z is after 2026-02-23T05:33:13Z" node="crc"
Mar 09 09:07:30 crc kubenswrapper[4792]: I0309 09:07:30.331214 4792 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:39698->192.168.126.11:17697: read: connection reset by peer" start-of-body=
Mar 09 09:07:30 crc kubenswrapper[4792]: I0309 09:07:30.331289 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:39698->192.168.126.11:17697: read: connection reset by peer"
Mar 09 09:07:30 crc kubenswrapper[4792]: I0309 09:07:30.597916 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:07:30Z is after 2026-02-23T05:33:13Z
Mar 09 09:07:30 crc kubenswrapper[4792]: I0309 09:07:30.766990 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log"
Mar 09 09:07:30 crc kubenswrapper[4792]: I0309 09:07:30.768322 4792 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="1131753af6a772f1d6ebb1e35075d8df2efb1450609361284e8a2b29e3d9e933" exitCode=255
Mar 09 09:07:30 crc kubenswrapper[4792]: I0309 09:07:30.768360 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"1131753af6a772f1d6ebb1e35075d8df2efb1450609361284e8a2b29e3d9e933"}
Mar 09 09:07:30 crc kubenswrapper[4792]: I0309 09:07:30.768504 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 09 09:07:30 crc kubenswrapper[4792]: I0309 09:07:30.769189 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 09:07:30 crc kubenswrapper[4792]: I0309 09:07:30.769316 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 09:07:30 crc kubenswrapper[4792]: I0309 09:07:30.769426 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 09:07:30 crc kubenswrapper[4792]: I0309 09:07:30.769994 4792 scope.go:117] "RemoveContainer" containerID="1131753af6a772f1d6ebb1e35075d8df2efb1450609361284e8a2b29e3d9e933"
Mar 09 09:07:31 crc kubenswrapper[4792]: I0309 09:07:31.598724 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:07:31Z is after 2026-02-23T05:33:13Z
Mar 09 09:07:31 crc kubenswrapper[4792]: I0309 09:07:31.772362 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log"
Mar 09 09:07:31 crc kubenswrapper[4792]: I0309 09:07:31.772972 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log"
Mar 09 09:07:31 crc kubenswrapper[4792]: I0309 09:07:31.776463 4792 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="c29c41cc1a6c9016971f8edc2d8cb9d0636af81983ae689343a5b1df158b8860" exitCode=255
Mar 09 09:07:31 crc kubenswrapper[4792]: I0309 09:07:31.776533 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"c29c41cc1a6c9016971f8edc2d8cb9d0636af81983ae689343a5b1df158b8860"}
Mar 09 09:07:31 crc kubenswrapper[4792]: I0309 09:07:31.776619 4792 scope.go:117] "RemoveContainer" containerID="1131753af6a772f1d6ebb1e35075d8df2efb1450609361284e8a2b29e3d9e933"
Mar 09 09:07:31 crc kubenswrapper[4792]: I0309 09:07:31.776849 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 09 09:07:31 crc kubenswrapper[4792]: I0309 09:07:31.780027 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 09:07:31 crc kubenswrapper[4792]: I0309 09:07:31.780111 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 09:07:31 crc kubenswrapper[4792]: I0309 09:07:31.780136 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 09:07:31 crc kubenswrapper[4792]: I0309 09:07:31.782573 4792 scope.go:117] "RemoveContainer" containerID="c29c41cc1a6c9016971f8edc2d8cb9d0636af81983ae689343a5b1df158b8860"
Mar 09 09:07:31 crc kubenswrapper[4792]: E0309 09:07:31.782929 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 09 09:07:32 crc kubenswrapper[4792]: I0309 09:07:32.601043 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:07:32Z is after 2026-02-23T05:33:13Z
Mar 09 09:07:32 crc kubenswrapper[4792]: I0309 09:07:32.780591 4792 log.go:25] "Finished parsing log file"
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 09 09:07:32 crc kubenswrapper[4792]: I0309 09:07:32.869455 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 09:07:32 crc kubenswrapper[4792]: I0309 09:07:32.869785 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 09:07:32 crc kubenswrapper[4792]: I0309 09:07:32.871949 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:07:32 crc kubenswrapper[4792]: I0309 09:07:32.872021 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:07:32 crc kubenswrapper[4792]: I0309 09:07:32.872040 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:07:32 crc kubenswrapper[4792]: I0309 09:07:32.873134 4792 scope.go:117] "RemoveContainer" containerID="c29c41cc1a6c9016971f8edc2d8cb9d0636af81983ae689343a5b1df158b8860" Mar 09 09:07:32 crc kubenswrapper[4792]: E0309 09:07:32.873425 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 09 09:07:32 crc kubenswrapper[4792]: I0309 09:07:32.875666 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 09:07:33 crc kubenswrapper[4792]: W0309 09:07:33.321693 4792 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get 
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:07:33Z is after 2026-02-23T05:33:13Z Mar 09 09:07:33 crc kubenswrapper[4792]: E0309 09:07:33.321789 4792 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:07:33Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 09 09:07:33 crc kubenswrapper[4792]: W0309 09:07:33.381976 4792 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:07:33Z is after 2026-02-23T05:33:13Z Mar 09 09:07:33 crc kubenswrapper[4792]: E0309 09:07:33.382098 4792 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:07:33Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 09 09:07:33 crc kubenswrapper[4792]: I0309 09:07:33.598060 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-03-09T09:07:33Z is after 2026-02-23T05:33:13Z Mar 09 09:07:33 crc kubenswrapper[4792]: I0309 09:07:33.717951 4792 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 09:07:33 crc kubenswrapper[4792]: I0309 09:07:33.787311 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 09:07:33 crc kubenswrapper[4792]: I0309 09:07:33.788342 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:07:33 crc kubenswrapper[4792]: I0309 09:07:33.788424 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:07:33 crc kubenswrapper[4792]: I0309 09:07:33.788437 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:07:33 crc kubenswrapper[4792]: I0309 09:07:33.789276 4792 scope.go:117] "RemoveContainer" containerID="c29c41cc1a6c9016971f8edc2d8cb9d0636af81983ae689343a5b1df158b8860" Mar 09 09:07:33 crc kubenswrapper[4792]: E0309 09:07:33.789538 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 09 09:07:33 crc kubenswrapper[4792]: I0309 09:07:33.882047 4792 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 09:07:33 
crc kubenswrapper[4792]: I0309 09:07:33.882205 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 09 09:07:34 crc kubenswrapper[4792]: I0309 09:07:34.600349 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:07:34Z is after 2026-02-23T05:33:13Z Mar 09 09:07:34 crc kubenswrapper[4792]: I0309 09:07:34.790190 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 09:07:34 crc kubenswrapper[4792]: I0309 09:07:34.791749 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:07:34 crc kubenswrapper[4792]: I0309 09:07:34.791835 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:07:34 crc kubenswrapper[4792]: I0309 09:07:34.791860 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:07:34 crc kubenswrapper[4792]: I0309 09:07:34.793138 4792 scope.go:117] "RemoveContainer" containerID="c29c41cc1a6c9016971f8edc2d8cb9d0636af81983ae689343a5b1df158b8860" Mar 09 09:07:34 crc kubenswrapper[4792]: E0309 09:07:34.793531 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 09 09:07:35 crc kubenswrapper[4792]: W0309 09:07:35.431105 4792 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:07:35Z is after 2026-02-23T05:33:13Z Mar 09 09:07:35 crc kubenswrapper[4792]: E0309 09:07:35.431182 4792 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:07:35Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 09 09:07:35 crc kubenswrapper[4792]: I0309 09:07:35.598232 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:07:35Z is after 2026-02-23T05:33:13Z Mar 09 09:07:35 crc kubenswrapper[4792]: E0309 09:07:35.756567 4792 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 09 09:07:36 crc kubenswrapper[4792]: I0309 09:07:36.429998 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Mar 09 09:07:36 crc kubenswrapper[4792]: I0309 09:07:36.430355 4792 kubelet_node_status.go:401] "Setting node annotation to 
enable volume controller attach/detach" Mar 09 09:07:36 crc kubenswrapper[4792]: I0309 09:07:36.432091 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:07:36 crc kubenswrapper[4792]: I0309 09:07:36.432135 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:07:36 crc kubenswrapper[4792]: I0309 09:07:36.432148 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:07:36 crc kubenswrapper[4792]: I0309 09:07:36.444796 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Mar 09 09:07:36 crc kubenswrapper[4792]: E0309 09:07:36.455033 4792 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:07:36Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 09 09:07:36 crc kubenswrapper[4792]: I0309 09:07:36.472303 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 09:07:36 crc kubenswrapper[4792]: I0309 09:07:36.474040 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:07:36 crc kubenswrapper[4792]: I0309 09:07:36.474133 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:07:36 crc kubenswrapper[4792]: I0309 09:07:36.474154 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:07:36 crc kubenswrapper[4792]: I0309 09:07:36.474196 4792 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 09 09:07:36 crc 
kubenswrapper[4792]: E0309 09:07:36.478513 4792 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:07:36Z is after 2026-02-23T05:33:13Z" node="crc" Mar 09 09:07:36 crc kubenswrapper[4792]: I0309 09:07:36.602190 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:07:36Z is after 2026-02-23T05:33:13Z Mar 09 09:07:36 crc kubenswrapper[4792]: I0309 09:07:36.798246 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 09:07:36 crc kubenswrapper[4792]: I0309 09:07:36.801181 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:07:36 crc kubenswrapper[4792]: I0309 09:07:36.801305 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:07:36 crc kubenswrapper[4792]: I0309 09:07:36.801336 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:07:37 crc kubenswrapper[4792]: I0309 09:07:37.600021 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 09 09:07:38 crc kubenswrapper[4792]: W0309 09:07:38.257170 4792 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: nodes "crc" is forbidden: User "system:anonymous" cannot list resource "nodes" in API group "" at the cluster scope Mar 09 
09:07:38 crc kubenswrapper[4792]: E0309 09:07:38.257234 4792 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: nodes \"crc\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" Mar 09 09:07:38 crc kubenswrapper[4792]: I0309 09:07:38.536942 4792 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 09 09:07:38 crc kubenswrapper[4792]: I0309 09:07:38.556484 4792 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 09 09:07:38 crc kubenswrapper[4792]: I0309 09:07:38.602883 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 09 09:07:39 crc kubenswrapper[4792]: I0309 09:07:39.602036 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 09 09:07:40 crc kubenswrapper[4792]: E0309 09:07:40.062806 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b211117781d84 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:07:15.592330628 +0000 UTC m=+0.622531380,LastTimestamp:2026-03-09 09:07:15.592330628 +0000 UTC 
m=+0.622531380,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:07:40 crc kubenswrapper[4792]: E0309 09:07:40.067028 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b21111afdff30 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:07:15.651436336 +0000 UTC m=+0.681637098,LastTimestamp:2026-03-09 09:07:15.651436336 +0000 UTC m=+0.681637098,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:07:40 crc kubenswrapper[4792]: E0309 09:07:40.069212 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b21111afe3fcc default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:07:15.651452876 +0000 UTC m=+0.681653638,LastTimestamp:2026-03-09 09:07:15.651452876 +0000 UTC m=+0.681653638,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:07:40 crc kubenswrapper[4792]: E0309 
09:07:40.071788 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b21111afe6a93 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:07:15.651463827 +0000 UTC m=+0.681664599,LastTimestamp:2026-03-09 09:07:15.651463827 +0000 UTC m=+0.681664599,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:07:40 crc kubenswrapper[4792]: E0309 09:07:40.076195 4792 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b21111afdff30\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b21111afdff30 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:07:15.651436336 +0000 UTC m=+0.681637098,LastTimestamp:2026-03-09 09:07:15.762979313 +0000 UTC m=+0.793180065,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:07:40 crc kubenswrapper[4792]: E0309 09:07:40.078409 4792 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b21111afe3fcc\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in 
API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b21111afe3fcc default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:07:15.651452876 +0000 UTC m=+0.681653638,LastTimestamp:2026-03-09 09:07:15.763018255 +0000 UTC m=+0.793218997,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:07:40 crc kubenswrapper[4792]: E0309 09:07:40.084779 4792 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b21111afe6a93\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b21111afe6a93 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:07:15.651463827 +0000 UTC m=+0.681664599,LastTimestamp:2026-03-09 09:07:15.763026365 +0000 UTC m=+0.793227117,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:07:40 crc kubenswrapper[4792]: E0309 09:07:40.093369 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b211121e3d5c2 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeAllocatableEnforced,Message:Updated Node Allocatable limit across pods,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:07:15.767162306 +0000 UTC m=+0.797363068,LastTimestamp:2026-03-09 09:07:15.767162306 +0000 UTC m=+0.797363068,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:07:40 crc kubenswrapper[4792]: E0309 09:07:40.102728 4792 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b21111afdff30\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b21111afdff30 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:07:15.651436336 +0000 UTC m=+0.681637098,LastTimestamp:2026-03-09 09:07:15.767382082 +0000 UTC m=+0.797582844,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:07:40 crc kubenswrapper[4792]: E0309 09:07:40.107613 4792 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b21111afdff30\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b21111afdff30 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc 
status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:07:15.651436336 +0000 UTC m=+0.681637098,LastTimestamp:2026-03-09 09:07:15.767403952 +0000 UTC m=+0.797604724,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:07:40 crc kubenswrapper[4792]: E0309 09:07:40.112394 4792 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b21111afe3fcc\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b21111afe3fcc default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:07:15.651452876 +0000 UTC m=+0.681653638,LastTimestamp:2026-03-09 09:07:15.767409062 +0000 UTC m=+0.797609834,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:07:40 crc kubenswrapper[4792]: E0309 09:07:40.117967 4792 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b21111afe3fcc\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b21111afe3fcc default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:07:15.651452876 +0000 UTC 
m=+0.681653638,LastTimestamp:2026-03-09 09:07:15.767428073 +0000 UTC m=+0.797628855,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:07:40 crc kubenswrapper[4792]: E0309 09:07:40.124719 4792 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b21111afe6a93\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b21111afe6a93 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:07:15.651463827 +0000 UTC m=+0.681664599,LastTimestamp:2026-03-09 09:07:15.767444923 +0000 UTC m=+0.797645695,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:07:40 crc kubenswrapper[4792]: E0309 09:07:40.130980 4792 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b21111afe6a93\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b21111afe6a93 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:07:15.651463827 +0000 UTC m=+0.681664599,LastTimestamp:2026-03-09 09:07:15.767457114 +0000 UTC m=+0.797657886,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:07:40 crc kubenswrapper[4792]: E0309 09:07:40.137240 4792 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b21111afdff30\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b21111afdff30 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:07:15.651436336 +0000 UTC m=+0.681637098,LastTimestamp:2026-03-09 09:07:15.77105124 +0000 UTC m=+0.801251992,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:07:40 crc kubenswrapper[4792]: E0309 09:07:40.142446 4792 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b21111afe3fcc\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b21111afe3fcc default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:07:15.651452876 +0000 UTC m=+0.681653638,LastTimestamp:2026-03-09 09:07:15.771098012 +0000 UTC m=+0.801298764,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:07:40 crc kubenswrapper[4792]: E0309 09:07:40.149525 4792 
event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b21111afe6a93\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b21111afe6a93 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:07:15.651463827 +0000 UTC m=+0.681664599,LastTimestamp:2026-03-09 09:07:15.771114862 +0000 UTC m=+0.801315614,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:07:40 crc kubenswrapper[4792]: E0309 09:07:40.155933 4792 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b21111afdff30\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b21111afdff30 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:07:15.651436336 +0000 UTC m=+0.681637098,LastTimestamp:2026-03-09 09:07:15.771176934 +0000 UTC m=+0.801377696,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:07:40 crc kubenswrapper[4792]: E0309 09:07:40.160845 4792 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b21111afe3fcc\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" 
in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b21111afe3fcc default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:07:15.651452876 +0000 UTC m=+0.681653638,LastTimestamp:2026-03-09 09:07:15.771194724 +0000 UTC m=+0.801395496,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:07:40 crc kubenswrapper[4792]: E0309 09:07:40.167548 4792 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b21111afe6a93\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b21111afe6a93 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:07:15.651463827 +0000 UTC m=+0.681664599,LastTimestamp:2026-03-09 09:07:15.771215325 +0000 UTC m=+0.801416097,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:07:40 crc kubenswrapper[4792]: E0309 09:07:40.171792 4792 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b21111afdff30\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b21111afdff30 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:07:15.651436336 +0000 UTC m=+0.681637098,LastTimestamp:2026-03-09 09:07:15.773393383 +0000 UTC m=+0.803594155,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:07:40 crc kubenswrapper[4792]: E0309 09:07:40.176043 4792 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b21111afe3fcc\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b21111afe3fcc default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:07:15.651452876 +0000 UTC m=+0.681653638,LastTimestamp:2026-03-09 09:07:15.773420814 +0000 UTC m=+0.803621586,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:07:40 crc kubenswrapper[4792]: E0309 09:07:40.180429 4792 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b21111afe6a93\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b21111afe6a93 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc 
status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:07:15.651463827 +0000 UTC m=+0.681664599,LastTimestamp:2026-03-09 09:07:15.773440704 +0000 UTC m=+0.803641466,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:07:40 crc kubenswrapper[4792]: E0309 09:07:40.185088 4792 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b21111afdff30\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b21111afdff30 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:07:15.651436336 +0000 UTC m=+0.681637098,LastTimestamp:2026-03-09 09:07:15.774492893 +0000 UTC m=+0.804693645,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:07:40 crc kubenswrapper[4792]: E0309 09:07:40.189497 4792 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b21111afe3fcc\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b21111afe3fcc default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:07:15.651452876 +0000 UTC 
m=+0.681653638,LastTimestamp:2026-03-09 09:07:15.774508133 +0000 UTC m=+0.804708885,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:07:40 crc kubenswrapper[4792]: E0309 09:07:40.196218 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b21113966e049 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:07:16.161626185 +0000 UTC m=+1.191826947,LastTimestamp:2026-03-09 09:07:16.161626185 +0000 UTC m=+1.191826947,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:07:40 crc kubenswrapper[4792]: E0309 09:07:40.200727 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b2111396a9b64 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image 
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:07:16.161870692 +0000 UTC m=+1.192071444,LastTimestamp:2026-03-09 09:07:16.161870692 +0000 UTC m=+1.192071444,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:07:40 crc kubenswrapper[4792]: E0309 09:07:40.204629 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b21113a4712d6 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:07:16.17631919 +0000 UTC m=+1.206519942,LastTimestamp:2026-03-09 09:07:16.17631919 +0000 UTC m=+1.206519942,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:07:40 crc kubenswrapper[4792]: E0309 09:07:40.209300 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" 
event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189b21113a4715ba openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:07:16.17631993 +0000 UTC m=+1.206520682,LastTimestamp:2026-03-09 09:07:16.17631993 +0000 UTC m=+1.206520682,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:07:40 crc kubenswrapper[4792]: E0309 09:07:40.213803 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b21113a50cac1 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:07:16.176956097 +0000 UTC m=+1.207156849,LastTimestamp:2026-03-09 09:07:16.176956097 +0000 UTC m=+1.207156849,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:07:40 crc kubenswrapper[4792]: E0309 09:07:40.218530 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b21115f57e51e openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:07:16.79817859 +0000 UTC m=+1.828379342,LastTimestamp:2026-03-09 09:07:16.79817859 +0000 UTC m=+1.828379342,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:07:40 crc kubenswrapper[4792]: E0309 09:07:40.222046 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b21115f647efc openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Created,Message:Created container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:07:16.799004412 +0000 UTC m=+1.829205164,LastTimestamp:2026-03-09 09:07:16.799004412 +0000 
UTC m=+1.829205164,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:07:40 crc kubenswrapper[4792]: I0309 09:07:40.225702 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 09:07:40 crc kubenswrapper[4792]: I0309 09:07:40.225845 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 09:07:40 crc kubenswrapper[4792]: I0309 09:07:40.226797 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:07:40 crc kubenswrapper[4792]: I0309 09:07:40.226823 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:07:40 crc kubenswrapper[4792]: I0309 09:07:40.226831 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:07:40 crc kubenswrapper[4792]: I0309 09:07:40.227274 4792 scope.go:117] "RemoveContainer" containerID="c29c41cc1a6c9016971f8edc2d8cb9d0636af81983ae689343a5b1df158b8860" Mar 09 09:07:40 crc kubenswrapper[4792]: E0309 09:07:40.227204 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b21115f68b451 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:07:16.799280209 +0000 UTC m=+1.829480961,LastTimestamp:2026-03-09 09:07:16.799280209 +0000 UTC 
m=+1.829480961,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:07:40 crc kubenswrapper[4792]: E0309 09:07:40.227424 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 09 09:07:40 crc kubenswrapper[4792]: E0309 09:07:40.232195 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b21115f706033 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Created,Message:Created container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:07:16.799782963 +0000 UTC m=+1.829983715,LastTimestamp:2026-03-09 09:07:16.799782963 +0000 UTC m=+1.829983715,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:07:40 crc kubenswrapper[4792]: E0309 09:07:40.236713 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" 
event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189b21115f712faa openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:07:16.799836074 +0000 UTC m=+1.830036826,LastTimestamp:2026-03-09 09:07:16.799836074 +0000 UTC m=+1.830036826,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:07:40 crc kubenswrapper[4792]: E0309 09:07:40.240905 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b21116016bde6 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Started,Message:Started container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:07:16.810685926 +0000 UTC m=+1.840886678,LastTimestamp:2026-03-09 09:07:16.810685926 +0000 UTC m=+1.840886678,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:07:40 crc kubenswrapper[4792]: E0309 09:07:40.244877 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource 
\"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b2111602366f7 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:07:16.811515639 +0000 UTC m=+1.841716391,LastTimestamp:2026-03-09 09:07:16.811515639 +0000 UTC m=+1.841716391,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:07:40 crc kubenswrapper[4792]: E0309 09:07:40.248943 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b2111602d4bc3 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Started,Message:Started container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:07:16.812164035 +0000 UTC m=+1.842364787,LastTimestamp:2026-03-09 09:07:16.812164035 +0000 UTC m=+1.842364787,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:07:40 crc kubenswrapper[4792]: E0309 09:07:40.253288 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User 
\"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b21116038e6d2 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:07:16.812924626 +0000 UTC m=+1.843125378,LastTimestamp:2026-03-09 09:07:16.812924626 +0000 UTC m=+1.843125378,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:07:40 crc kubenswrapper[4792]: E0309 09:07:40.257853 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189b21116044d3df openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:07:16.813706207 +0000 UTC m=+1.843906959,LastTimestamp:2026-03-09 09:07:16.813706207 +0000 UTC m=+1.843906959,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:07:40 crc kubenswrapper[4792]: E0309 09:07:40.262309 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b2111609e0c5e openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:07:16.819553374 +0000 UTC m=+1.849754146,LastTimestamp:2026-03-09 09:07:16.819553374 +0000 UTC m=+1.849754146,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:07:40 crc kubenswrapper[4792]: E0309 09:07:40.266826 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b21117263fb6c openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:07:17.117737836 +0000 UTC m=+2.147938618,LastTimestamp:2026-03-09 09:07:17.117737836 
+0000 UTC m=+2.147938618,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:07:40 crc kubenswrapper[4792]: E0309 09:07:40.270864 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b2111738c8010 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:07:17.137170448 +0000 UTC m=+2.167371210,LastTimestamp:2026-03-09 09:07:17.137170448 +0000 UTC m=+2.167371210,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:07:40 crc kubenswrapper[4792]: E0309 09:07:40.272795 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b211173a799b6 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Pulled,Message:Container image 
\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:07:17.138946486 +0000 UTC m=+2.169147248,LastTimestamp:2026-03-09 09:07:17.138946486 +0000 UTC m=+2.169147248,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:07:40 crc kubenswrapper[4792]: E0309 09:07:40.275552 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b2111814bf307 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Created,Message:Created container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:07:17.367821063 +0000 UTC m=+2.398021815,LastTimestamp:2026-03-09 09:07:17.367821063 +0000 UTC m=+2.398021815,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:07:40 crc kubenswrapper[4792]: E0309 09:07:40.277011 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b2111827fd7f6 openshift-kube-controller-manager 0 
0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Started,Message:Started container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:07:17.387999222 +0000 UTC m=+2.418199974,LastTimestamp:2026-03-09 09:07:17.387999222 +0000 UTC m=+2.418199974,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:07:40 crc kubenswrapper[4792]: E0309 09:07:40.280193 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b2111828ecda1 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:07:17.388979617 +0000 UTC m=+2.419180369,LastTimestamp:2026-03-09 09:07:17.388979617 +0000 UTC m=+2.419180369,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:07:40 crc kubenswrapper[4792]: E0309 
09:07:40.284117 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b211190095206 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Created,Message:Created container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:07:17.61511271 +0000 UTC m=+2.645313462,LastTimestamp:2026-03-09 09:07:17.61511271 +0000 UTC m=+2.645313462,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:07:40 crc kubenswrapper[4792]: E0309 09:07:40.288129 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b211190addf3e openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Started,Message:Started container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:07:17.625896766 +0000 UTC m=+2.656097518,LastTimestamp:2026-03-09 
09:07:17.625896766 +0000 UTC m=+2.656097518,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:07:40 crc kubenswrapper[4792]: E0309 09:07:40.291669 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b2111944f6944 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:07:17.686815044 +0000 UTC m=+2.717015796,LastTimestamp:2026-03-09 09:07:17.686815044 +0000 UTC m=+2.717015796,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:07:40 crc kubenswrapper[4792]: E0309 09:07:40.295837 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b21119488cf10 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Pulled,Message:Container image 
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:07:17.690576656 +0000 UTC m=+2.720777408,LastTimestamp:2026-03-09 09:07:17.690576656 +0000 UTC m=+2.720777408,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:07:40 crc kubenswrapper[4792]: E0309 09:07:40.300893 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189b21119507fe4e openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:07:17.698911822 +0000 UTC m=+2.729112574,LastTimestamp:2026-03-09 09:07:17.698911822 +0000 UTC m=+2.729112574,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:07:40 crc kubenswrapper[4792]: E0309 09:07:40.304452 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" 
event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b211195ae55b4 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:07:17.709813172 +0000 UTC m=+2.740013924,LastTimestamp:2026-03-09 09:07:17.709813172 +0000 UTC m=+2.740013924,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:07:40 crc kubenswrapper[4792]: E0309 09:07:40.307758 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b2111a1ab3571 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Created,Message:Created container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:07:17.910934897 +0000 UTC m=+2.941135649,LastTimestamp:2026-03-09 09:07:17.910934897 +0000 UTC m=+2.941135649,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:07:40 crc kubenswrapper[4792]: E0309 09:07:40.312139 4792 event.go:359] "Server rejected event (will not retry!)" 
err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189b2111a1c9243f openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Created,Message:Created container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:07:17.912896575 +0000 UTC m=+2.943097327,LastTimestamp:2026-03-09 09:07:17.912896575 +0000 UTC m=+2.943097327,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:07:40 crc kubenswrapper[4792]: E0309 09:07:40.315911 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b2111a1cda930 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Created,Message:Created container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:07:17.913192752 +0000 UTC m=+2.943393504,LastTimestamp:2026-03-09 09:07:17.913192752 +0000 UTC m=+2.943393504,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:07:40 crc 
kubenswrapper[4792]: E0309 09:07:40.319411 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b2111a1dadad0 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Created,Message:Created container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:07:17.914057424 +0000 UTC m=+2.944258176,LastTimestamp:2026-03-09 09:07:17.914057424 +0000 UTC m=+2.944258176,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:07:40 crc kubenswrapper[4792]: E0309 09:07:40.323784 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b2111a2c85ec7 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Started,Message:Started container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:07:17.929623239 +0000 UTC m=+2.959823991,LastTimestamp:2026-03-09 09:07:17.929623239 +0000 UTC m=+2.959823991,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:07:40 crc kubenswrapper[4792]: E0309 09:07:40.327369 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b2111a2dc4461 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:07:17.930927201 +0000 UTC m=+2.961127953,LastTimestamp:2026-03-09 09:07:17.930927201 +0000 UTC m=+2.961127953,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:07:40 crc kubenswrapper[4792]: E0309 09:07:40.331889 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b2111a30f10eb openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Started,Message:Started container 
kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:07:17.934256363 +0000 UTC m=+2.964457115,LastTimestamp:2026-03-09 09:07:17.934256363 +0000 UTC m=+2.964457115,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:07:40 crc kubenswrapper[4792]: E0309 09:07:40.335625 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189b2111a3418146 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Started,Message:Started container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:07:17.937561926 +0000 UTC m=+2.967762678,LastTimestamp:2026-03-09 09:07:17.937561926 +0000 UTC m=+2.967762678,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:07:40 crc kubenswrapper[4792]: E0309 09:07:40.339145 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b2111a34e45e8 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:07:17.938398696 +0000 UTC m=+2.968599448,LastTimestamp:2026-03-09 09:07:17.938398696 +0000 UTC m=+2.968599448,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:07:40 crc kubenswrapper[4792]: E0309 09:07:40.343964 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b2111a399fbb7 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Started,Message:Started container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:07:17.943360439 +0000 UTC m=+2.973561191,LastTimestamp:2026-03-09 09:07:17.943360439 +0000 UTC m=+2.973561191,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:07:40 crc kubenswrapper[4792]: E0309 09:07:40.347607 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace 
\"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b2111b0425c72 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Created,Message:Created container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:07:18.155721842 +0000 UTC m=+3.185922594,LastTimestamp:2026-03-09 09:07:18.155721842 +0000 UTC m=+3.185922594,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:07:40 crc kubenswrapper[4792]: E0309 09:07:40.351602 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b2111b0576cec openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Created,Message:Created container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:07:18.157102316 +0000 UTC m=+3.187303068,LastTimestamp:2026-03-09 09:07:18.157102316 +0000 UTC m=+3.187303068,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:07:40 crc kubenswrapper[4792]: E0309 09:07:40.355443 4792 event.go:359] "Server rejected event (will not retry!)" err="events is 
forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b2111b1167010 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Started,Message:Started container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:07:18.169620496 +0000 UTC m=+3.199821248,LastTimestamp:2026-03-09 09:07:18.169620496 +0000 UTC m=+3.199821248,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:07:40 crc kubenswrapper[4792]: E0309 09:07:40.359149 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b2111b127bbd3 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:07:18.170754003 +0000 UTC m=+3.200954755,LastTimestamp:2026-03-09 09:07:18.170754003 +0000 UTC 
m=+3.200954755,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:07:40 crc kubenswrapper[4792]: E0309 09:07:40.363423 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b2111b185c528 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Started,Message:Started container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:07:18.176916776 +0000 UTC m=+3.207117528,LastTimestamp:2026-03-09 09:07:18.176916776 +0000 UTC m=+3.207117528,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:07:40 crc kubenswrapper[4792]: E0309 09:07:40.366654 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b2111b1b6b655 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Pulled,Message:Container image 
\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:07:18.180124245 +0000 UTC m=+3.210324987,LastTimestamp:2026-03-09 09:07:18.180124245 +0000 UTC m=+3.210324987,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:07:40 crc kubenswrapper[4792]: E0309 09:07:40.370318 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b2111be93f90d openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Created,Message:Created container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:07:18.395951373 +0000 UTC m=+3.426152125,LastTimestamp:2026-03-09 09:07:18.395951373 +0000 UTC m=+3.426152125,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:07:40 crc kubenswrapper[4792]: E0309 09:07:40.374285 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b2111bebd9af9 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Created,Message:Created container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:07:18.398679801 +0000 UTC m=+3.428880553,LastTimestamp:2026-03-09 09:07:18.398679801 +0000 UTC m=+3.428880553,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:07:40 crc kubenswrapper[4792]: E0309 09:07:40.377182 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b2111bfd51294 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Started,Message:Started container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:07:18.416994964 +0000 UTC m=+3.447195716,LastTimestamp:2026-03-09 09:07:18.416994964 +0000 UTC m=+3.447195716,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:07:40 crc kubenswrapper[4792]: E0309 09:07:40.378403 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" 
event="&Event{ObjectMeta:{kube-apiserver-crc.189b2111bfe46005 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:07:18.417997829 +0000 UTC m=+3.448198581,LastTimestamp:2026-03-09 09:07:18.417997829 +0000 UTC m=+3.448198581,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:07:40 crc kubenswrapper[4792]: E0309 09:07:40.380784 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b2111c016ac00 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Started,Message:Started container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:07:18.42129408 +0000 UTC m=+3.451494832,LastTimestamp:2026-03-09 09:07:18.42129408 +0000 UTC m=+3.451494832,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:07:40 crc 
kubenswrapper[4792]: E0309 09:07:40.383906 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b2111cad65637 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Created,Message:Created container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:07:18.601627191 +0000 UTC m=+3.631827933,LastTimestamp:2026-03-09 09:07:18.601627191 +0000 UTC m=+3.631827933,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:07:40 crc kubenswrapper[4792]: E0309 09:07:40.387143 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b2111cbc43352 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Started,Message:Started container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:07:18.617215826 +0000 UTC m=+3.647416568,LastTimestamp:2026-03-09 09:07:18.617215826 +0000 UTC m=+3.647416568,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:07:40 crc kubenswrapper[4792]: E0309 09:07:40.391892 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b2111cbdf9e5d openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:07:18.619012701 +0000 UTC m=+3.649213453,LastTimestamp:2026-03-09 09:07:18.619012701 +0000 UTC m=+3.649213453,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:07:40 crc kubenswrapper[4792]: E0309 09:07:40.396035 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b2111d1c14173 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Pulled,Message:Container image 
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:07:18.717686131 +0000 UTC m=+3.747886883,LastTimestamp:2026-03-09 09:07:18.717686131 +0000 UTC m=+3.747886883,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:07:40 crc kubenswrapper[4792]: E0309 09:07:40.401793 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b2111d952a838 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:07:18.844655672 +0000 UTC m=+3.874856424,LastTimestamp:2026-03-09 09:07:18.844655672 +0000 UTC m=+3.874856424,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:07:40 crc kubenswrapper[4792]: E0309 09:07:40.405831 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b2111dad8eb43 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:07:18.870231875 +0000 UTC m=+3.900432627,LastTimestamp:2026-03-09 09:07:18.870231875 +0000 UTC m=+3.900432627,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:07:40 crc kubenswrapper[4792]: E0309 09:07:40.409180 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b2111dd72d9b2 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Created,Message:Created container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:07:18.913874354 +0000 UTC m=+3.944075096,LastTimestamp:2026-03-09 09:07:18.913874354 +0000 UTC m=+3.944075096,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:07:40 crc kubenswrapper[4792]: E0309 09:07:40.411905 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b2111de1f921a openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Started,Message:Started container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:07:18.925193754 +0000 UTC m=+3.955394506,LastTimestamp:2026-03-09 09:07:18.925193754 +0000 UTC m=+3.955394506,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:07:40 crc kubenswrapper[4792]: E0309 09:07:40.416841 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b21120e5f339c openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:07:19.734670236 +0000 UTC m=+4.764870988,LastTimestamp:2026-03-09 09:07:19.734670236 +0000 UTC m=+4.764870988,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:07:40 crc kubenswrapper[4792]: E0309 09:07:40.420257 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" 
event="&Event{ObjectMeta:{etcd-crc.189b21121a0febfb openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Created,Message:Created container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:07:19.930801147 +0000 UTC m=+4.961001889,LastTimestamp:2026-03-09 09:07:19.930801147 +0000 UTC m=+4.961001889,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:07:40 crc kubenswrapper[4792]: E0309 09:07:40.423961 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b21121ae355ab openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Started,Message:Started container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:07:19.944656299 +0000 UTC m=+4.974857051,LastTimestamp:2026-03-09 09:07:19.944656299 +0000 UTC m=+4.974857051,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:07:40 crc kubenswrapper[4792]: E0309 09:07:40.427332 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b21121af612e6 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC 
map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:07:19.94588439 +0000 UTC m=+4.976085142,LastTimestamp:2026-03-09 09:07:19.94588439 +0000 UTC m=+4.976085142,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:07:40 crc kubenswrapper[4792]: E0309 09:07:40.431381 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b2112266a4be5 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Created,Message:Created container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:07:20.138050533 +0000 UTC m=+5.168251285,LastTimestamp:2026-03-09 09:07:20.138050533 +0000 UTC m=+5.168251285,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:07:40 crc kubenswrapper[4792]: E0309 09:07:40.434642 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b211226f87f1a openshift-etcd 0 
0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Started,Message:Started container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:07:20.147369754 +0000 UTC m=+5.177570506,LastTimestamp:2026-03-09 09:07:20.147369754 +0000 UTC m=+5.177570506,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:07:40 crc kubenswrapper[4792]: E0309 09:07:40.437796 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b21122703763e openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:07:20.148088382 +0000 UTC m=+5.178289134,LastTimestamp:2026-03-09 09:07:20.148088382 +0000 UTC m=+5.178289134,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:07:40 crc kubenswrapper[4792]: E0309 09:07:40.440710 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" 
event="&Event{ObjectMeta:{etcd-crc.189b2112300cc668 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Created,Message:Created container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:07:20.299693672 +0000 UTC m=+5.329894424,LastTimestamp:2026-03-09 09:07:20.299693672 +0000 UTC m=+5.329894424,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:07:40 crc kubenswrapper[4792]: E0309 09:07:40.444596 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b211230e44fd3 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Started,Message:Started container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:07:20.313819091 +0000 UTC m=+5.344019853,LastTimestamp:2026-03-09 09:07:20.313819091 +0000 UTC m=+5.344019853,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:07:40 crc kubenswrapper[4792]: E0309 09:07:40.447794 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b21123103f5be openshift-etcd 0 0001-01-01 
00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:07:20.315893182 +0000 UTC m=+5.346093954,LastTimestamp:2026-03-09 09:07:20.315893182 +0000 UTC m=+5.346093954,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:07:40 crc kubenswrapper[4792]: E0309 09:07:40.451797 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b21124064948e openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Created,Message:Created container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:07:20.573883534 +0000 UTC m=+5.604084286,LastTimestamp:2026-03-09 09:07:20.573883534 +0000 UTC m=+5.604084286,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:07:40 crc kubenswrapper[4792]: E0309 09:07:40.454943 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" 
event="&Event{ObjectMeta:{etcd-crc.189b21124144914e openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Started,Message:Started container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:07:20.588562766 +0000 UTC m=+5.618763518,LastTimestamp:2026-03-09 09:07:20.588562766 +0000 UTC m=+5.618763518,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:07:40 crc kubenswrapper[4792]: E0309 09:07:40.458063 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b211241546854 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:07:20.589600852 +0000 UTC m=+5.619801604,LastTimestamp:2026-03-09 09:07:20.589600852 +0000 UTC m=+5.619801604,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:07:40 crc kubenswrapper[4792]: E0309 09:07:40.461248 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API 
group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b21124d02f5b0 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Created,Message:Created container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:07:20.78558968 +0000 UTC m=+5.815790462,LastTimestamp:2026-03-09 09:07:20.78558968 +0000 UTC m=+5.815790462,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:07:40 crc kubenswrapper[4792]: E0309 09:07:40.464160 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b21124e10749d openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Started,Message:Started container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:07:20.803251357 +0000 UTC m=+5.833452109,LastTimestamp:2026-03-09 09:07:20.803251357 +0000 UTC m=+5.833452109,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:07:40 crc kubenswrapper[4792]: E0309 09:07:40.468180 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 09 09:07:40 crc 
kubenswrapper[4792]: &Event{ObjectMeta:{kube-controller-manager-crc.189b2113059549b8 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": context deadline exceeded (Client.Timeout exceeded while awaiting headers) Mar 09 09:07:40 crc kubenswrapper[4792]: body: Mar 09 09:07:40 crc kubenswrapper[4792]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:07:23.882187192 +0000 UTC m=+8.912387994,LastTimestamp:2026-03-09 09:07:23.882187192 +0000 UTC m=+8.912387994,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 09 09:07:40 crc kubenswrapper[4792]: > Mar 09 09:07:40 crc kubenswrapper[4792]: E0309 09:07:40.471248 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b21130598795d openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:07:23.882395997 +0000 UTC 
m=+8.912596789,LastTimestamp:2026-03-09 09:07:23.882395997 +0000 UTC m=+8.912596789,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:07:40 crc kubenswrapper[4792]: E0309 09:07:40.475394 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 09 09:07:40 crc kubenswrapper[4792]: &Event{ObjectMeta:{kube-apiserver-crc.189b211475b788fa openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Mar 09 09:07:40 crc kubenswrapper[4792]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\": RBAC: [clusterrole.rbac.authorization.k8s.io \"system:public-info-viewer\" not found, clusterrole.rbac.authorization.k8s.io \"system:openshift:public-info-viewer\" not found]","reason":"Forbidden","details":{},"code":403} Mar 09 09:07:40 crc kubenswrapper[4792]: Mar 09 09:07:40 crc kubenswrapper[4792]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:07:30.058447098 +0000 UTC m=+15.088647850,LastTimestamp:2026-03-09 09:07:30.058447098 +0000 UTC m=+15.088647850,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 09 09:07:40 crc kubenswrapper[4792]: > Mar 09 09:07:40 crc kubenswrapper[4792]: E0309 09:07:40.478818 4792 event.go:359] "Server rejected event (will not retry!)" err="events is 
forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b211475b80e06 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:07:30.058481158 +0000 UTC m=+15.088681910,LastTimestamp:2026-03-09 09:07:30.058481158 +0000 UTC m=+15.088681910,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:07:40 crc kubenswrapper[4792]: E0309 09:07:40.481826 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 09 09:07:40 crc kubenswrapper[4792]: &Event{ObjectMeta:{kube-apiserver-crc.189b2114761c082c openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Mar 09 09:07:40 crc kubenswrapper[4792]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\": RBAC: [clusterrole.rbac.authorization.k8s.io \"system:openshift:public-info-viewer\" not found, clusterrole.rbac.authorization.k8s.io 
\"system:public-info-viewer\" not found]","reason":"Forbidden","details":{},"code":403} Mar 09 09:07:40 crc kubenswrapper[4792]: Mar 09 09:07:40 crc kubenswrapper[4792]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:07:30.06503326 +0000 UTC m=+15.095234012,LastTimestamp:2026-03-09 09:07:30.06503326 +0000 UTC m=+15.095234012,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 09 09:07:40 crc kubenswrapper[4792]: > Mar 09 09:07:40 crc kubenswrapper[4792]: E0309 09:07:40.484977 4792 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189b211475b80e06\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b211475b80e06 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:07:30.058481158 +0000 UTC m=+15.088681910,LastTimestamp:2026-03-09 09:07:30.065081841 +0000 UTC m=+15.095282593,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:07:40 crc kubenswrapper[4792]: E0309 09:07:40.488215 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 09 09:07:40 crc kubenswrapper[4792]: 
&Event{ObjectMeta:{kube-apiserver-crc.189b211485fa7da6 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:ProbeError,Message:Readiness probe error: Get "https://192.168.126.11:17697/healthz": read tcp 192.168.126.11:39698->192.168.126.11:17697: read: connection reset by peer Mar 09 09:07:40 crc kubenswrapper[4792]: body: Mar 09 09:07:40 crc kubenswrapper[4792]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:07:30.331270566 +0000 UTC m=+15.361471318,LastTimestamp:2026-03-09 09:07:30.331270566 +0000 UTC m=+15.361471318,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 09 09:07:40 crc kubenswrapper[4792]: > Mar 09 09:07:40 crc kubenswrapper[4792]: E0309 09:07:40.492364 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b211485fb3ab7 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Unhealthy,Message:Readiness probe failed: Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:39698->192.168.126.11:17697: read: connection reset by peer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:07:30.331318967 +0000 UTC m=+15.361519719,LastTimestamp:2026-03-09 09:07:30.331318967 +0000 UTC 
m=+15.361519719,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:07:40 crc kubenswrapper[4792]: E0309 09:07:40.496365 4792 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189b2111cbdf9e5d\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b2111cbdf9e5d openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:07:18.619012701 +0000 UTC m=+3.649213453,LastTimestamp:2026-03-09 09:07:30.77090305 +0000 UTC m=+15.801103802,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:07:40 crc kubenswrapper[4792]: E0309 09:07:40.500415 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 09 09:07:40 crc kubenswrapper[4792]: &Event{ObjectMeta:{kube-controller-manager-crc.189b211559a0dbf0 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 09 09:07:40 crc kubenswrapper[4792]: body: Mar 09 09:07:40 crc kubenswrapper[4792]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:07:33.882166256 +0000 UTC m=+18.912367048,LastTimestamp:2026-03-09 09:07:33.882166256 +0000 UTC m=+18.912367048,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 09 09:07:40 crc kubenswrapper[4792]: > Mar 09 09:07:40 crc kubenswrapper[4792]: E0309 09:07:40.503770 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b211559a230ce openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:07:33.882253518 +0000 UTC m=+18.912454310,LastTimestamp:2026-03-09 09:07:33.882253518 +0000 UTC 
m=+18.912454310,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:07:40 crc kubenswrapper[4792]: I0309 09:07:40.600938 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 09 09:07:41 crc kubenswrapper[4792]: I0309 09:07:41.599278 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 09 09:07:42 crc kubenswrapper[4792]: I0309 09:07:42.599610 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 09 09:07:42 crc kubenswrapper[4792]: W0309 09:07:42.777026 4792 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:anonymous" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope Mar 09 09:07:42 crc kubenswrapper[4792]: E0309 09:07:42.777104 4792 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" Mar 09 09:07:43 crc kubenswrapper[4792]: E0309 09:07:43.460375 4792 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get 
resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 09 09:07:43 crc kubenswrapper[4792]: I0309 09:07:43.478648 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 09:07:43 crc kubenswrapper[4792]: I0309 09:07:43.480674 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:07:43 crc kubenswrapper[4792]: I0309 09:07:43.480721 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:07:43 crc kubenswrapper[4792]: I0309 09:07:43.480738 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:07:43 crc kubenswrapper[4792]: I0309 09:07:43.480778 4792 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 09 09:07:43 crc kubenswrapper[4792]: E0309 09:07:43.486430 4792 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 09 09:07:43 crc kubenswrapper[4792]: I0309 09:07:43.600241 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 09 09:07:43 crc kubenswrapper[4792]: I0309 09:07:43.881770 4792 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 09:07:43 crc kubenswrapper[4792]: I0309 09:07:43.881896 4792 
prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 09 09:07:43 crc kubenswrapper[4792]: I0309 09:07:43.881982 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 09 09:07:43 crc kubenswrapper[4792]: I0309 09:07:43.882236 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 09:07:43 crc kubenswrapper[4792]: I0309 09:07:43.883649 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:07:43 crc kubenswrapper[4792]: I0309 09:07:43.883673 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:07:43 crc kubenswrapper[4792]: I0309 09:07:43.883683 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:07:43 crc kubenswrapper[4792]: I0309 09:07:43.884208 4792 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"0f14066d111e6fa1a8c98be79fb37f7a32d143d503ec51e49a36d039b7d464b4"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container cluster-policy-controller failed startup probe, will be restarted" Mar 09 09:07:43 crc kubenswrapper[4792]: I0309 09:07:43.884375 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" 
containerName="cluster-policy-controller" containerID="cri-o://0f14066d111e6fa1a8c98be79fb37f7a32d143d503ec51e49a36d039b7d464b4" gracePeriod=30 Mar 09 09:07:43 crc kubenswrapper[4792]: E0309 09:07:43.889691 4792 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189b211559a0dbf0\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 09 09:07:43 crc kubenswrapper[4792]: &Event{ObjectMeta:{kube-controller-manager-crc.189b211559a0dbf0 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 09 09:07:43 crc kubenswrapper[4792]: body: Mar 09 09:07:43 crc kubenswrapper[4792]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:07:33.882166256 +0000 UTC m=+18.912367048,LastTimestamp:2026-03-09 09:07:43.881854438 +0000 UTC m=+28.912055230,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 09 09:07:43 crc kubenswrapper[4792]: > Mar 09 09:07:43 crc kubenswrapper[4792]: E0309 09:07:43.898423 4792 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189b211559a230ce\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b211559a230ce 
openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:07:33.882253518 +0000 UTC m=+18.912454310,LastTimestamp:2026-03-09 09:07:43.88193821 +0000 UTC m=+28.912138992,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:07:43 crc kubenswrapper[4792]: E0309 09:07:43.902886 4792 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b2117adce34d1 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Killing,Message:Container cluster-policy-controller failed startup probe, will be restarted,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:07:43.884358865 +0000 UTC m=+28.914559617,LastTimestamp:2026-03-09 09:07:43.884358865 +0000 UTC m=+28.914559617,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:07:44 crc 
kubenswrapper[4792]: E0309 09:07:44.013238 4792 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189b21116038e6d2\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b21116038e6d2 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:07:16.812924626 +0000 UTC m=+1.843125378,LastTimestamp:2026-03-09 09:07:44.007602321 +0000 UTC m=+29.037803073,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:07:44 crc kubenswrapper[4792]: E0309 09:07:44.173036 4792 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189b21117263fb6c\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b21117263fb6c openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created 
container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:07:17.117737836 +0000 UTC m=+2.147938618,LastTimestamp:2026-03-09 09:07:44.168780949 +0000 UTC m=+29.198981701,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:07:44 crc kubenswrapper[4792]: E0309 09:07:44.180059 4792 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189b2111738c8010\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b2111738c8010 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:07:17.137170448 +0000 UTC m=+2.167371210,LastTimestamp:2026-03-09 09:07:44.178694906 +0000 UTC m=+29.208895658,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:07:44 crc kubenswrapper[4792]: W0309 09:07:44.592902 4792 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: services is forbidden: User "system:anonymous" cannot list resource "services" in API group "" at the cluster scope Mar 09 09:07:44 crc kubenswrapper[4792]: E0309 09:07:44.592961 4792 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: services is 
forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" Mar 09 09:07:44 crc kubenswrapper[4792]: I0309 09:07:44.597939 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 09 09:07:44 crc kubenswrapper[4792]: I0309 09:07:44.819125 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 09 09:07:44 crc kubenswrapper[4792]: I0309 09:07:44.819728 4792 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="0f14066d111e6fa1a8c98be79fb37f7a32d143d503ec51e49a36d039b7d464b4" exitCode=255 Mar 09 09:07:44 crc kubenswrapper[4792]: I0309 09:07:44.819768 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"0f14066d111e6fa1a8c98be79fb37f7a32d143d503ec51e49a36d039b7d464b4"} Mar 09 09:07:44 crc kubenswrapper[4792]: I0309 09:07:44.819846 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"652ed63f63bc7b81328792679f59e5d748feb5114a97f57df7ba90f3d272feff"} Mar 09 09:07:44 crc kubenswrapper[4792]: I0309 09:07:44.819933 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 09:07:44 crc kubenswrapper[4792]: I0309 09:07:44.820681 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:07:44 crc kubenswrapper[4792]: I0309 
09:07:44.820712 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:07:44 crc kubenswrapper[4792]: I0309 09:07:44.820721 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:07:45 crc kubenswrapper[4792]: W0309 09:07:45.142779 4792 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User "system:anonymous" cannot list resource "runtimeclasses" in API group "node.k8s.io" at the cluster scope Mar 09 09:07:45 crc kubenswrapper[4792]: E0309 09:07:45.143216 4792 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"runtimeclasses\" in API group \"node.k8s.io\" at the cluster scope" logger="UnhandledError" Mar 09 09:07:45 crc kubenswrapper[4792]: I0309 09:07:45.598054 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 09 09:07:45 crc kubenswrapper[4792]: E0309 09:07:45.756772 4792 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 09 09:07:46 crc kubenswrapper[4792]: I0309 09:07:46.599795 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 09 09:07:47 crc kubenswrapper[4792]: I0309 09:07:47.505191 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 09 09:07:47 crc kubenswrapper[4792]: I0309 09:07:47.505564 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 09:07:47 crc kubenswrapper[4792]: I0309 09:07:47.514622 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:07:47 crc kubenswrapper[4792]: I0309 09:07:47.514691 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:07:47 crc kubenswrapper[4792]: I0309 09:07:47.514705 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:07:47 crc kubenswrapper[4792]: I0309 09:07:47.600344 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 09 09:07:48 crc kubenswrapper[4792]: I0309 09:07:48.602576 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 09 09:07:49 crc kubenswrapper[4792]: I0309 09:07:49.603980 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 09 09:07:50 crc kubenswrapper[4792]: E0309 09:07:50.466050 4792 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 09 
09:07:50 crc kubenswrapper[4792]: I0309 09:07:50.487490 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 09:07:50 crc kubenswrapper[4792]: I0309 09:07:50.489031 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:07:50 crc kubenswrapper[4792]: I0309 09:07:50.489131 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:07:50 crc kubenswrapper[4792]: I0309 09:07:50.489157 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:07:50 crc kubenswrapper[4792]: I0309 09:07:50.489203 4792 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 09 09:07:50 crc kubenswrapper[4792]: E0309 09:07:50.494894 4792 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 09 09:07:50 crc kubenswrapper[4792]: I0309 09:07:50.599834 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 09 09:07:50 crc kubenswrapper[4792]: I0309 09:07:50.881188 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 09 09:07:50 crc kubenswrapper[4792]: I0309 09:07:50.881510 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 09:07:50 crc kubenswrapper[4792]: I0309 09:07:50.883786 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:07:50 crc kubenswrapper[4792]: I0309 09:07:50.883891 4792 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:07:50 crc kubenswrapper[4792]: I0309 09:07:50.883917 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:07:51 crc kubenswrapper[4792]: I0309 09:07:51.599198 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 09 09:07:52 crc kubenswrapper[4792]: I0309 09:07:52.599299 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 09 09:07:53 crc kubenswrapper[4792]: I0309 09:07:53.603675 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 09 09:07:53 crc kubenswrapper[4792]: I0309 09:07:53.882186 4792 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 09:07:53 crc kubenswrapper[4792]: I0309 09:07:53.882265 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for 
connection (Client.Timeout exceeded while awaiting headers)" Mar 09 09:07:53 crc kubenswrapper[4792]: E0309 09:07:53.886864 4792 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189b211559a0dbf0\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 09 09:07:53 crc kubenswrapper[4792]: &Event{ObjectMeta:{kube-controller-manager-crc.189b211559a0dbf0 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 09 09:07:53 crc kubenswrapper[4792]: body: Mar 09 09:07:53 crc kubenswrapper[4792]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:07:33.882166256 +0000 UTC m=+18.912367048,LastTimestamp:2026-03-09 09:07:53.882242445 +0000 UTC m=+38.912443197,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 09 09:07:53 crc kubenswrapper[4792]: > Mar 09 09:07:53 crc kubenswrapper[4792]: E0309 09:07:53.891710 4792 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189b211559a230ce\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b211559a230ce openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:07:33.882253518 +0000 UTC m=+18.912454310,LastTimestamp:2026-03-09 09:07:53.882290997 +0000 UTC m=+38.912491749,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:07:54 crc kubenswrapper[4792]: I0309 09:07:54.599320 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 09 09:07:55 crc kubenswrapper[4792]: I0309 09:07:55.602299 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 09 09:07:55 crc kubenswrapper[4792]: I0309 09:07:55.661884 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 09:07:55 crc kubenswrapper[4792]: I0309 09:07:55.663857 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:07:55 crc kubenswrapper[4792]: I0309 09:07:55.663923 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:07:55 crc kubenswrapper[4792]: I0309 09:07:55.663941 4792 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:07:55 crc kubenswrapper[4792]: I0309 09:07:55.664837 4792 scope.go:117] "RemoveContainer" containerID="c29c41cc1a6c9016971f8edc2d8cb9d0636af81983ae689343a5b1df158b8860" Mar 09 09:07:55 crc kubenswrapper[4792]: E0309 09:07:55.757872 4792 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 09 09:07:56 crc kubenswrapper[4792]: I0309 09:07:56.599567 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 09 09:07:56 crc kubenswrapper[4792]: I0309 09:07:56.863502 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 09 09:07:56 crc kubenswrapper[4792]: I0309 09:07:56.865644 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"62ff4c7382c6f35a7bcfd4126e382b650475955a51ae685f336bbc7620276ce8"} Mar 09 09:07:56 crc kubenswrapper[4792]: I0309 09:07:56.865801 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 09:07:56 crc kubenswrapper[4792]: I0309 09:07:56.866549 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:07:56 crc kubenswrapper[4792]: I0309 09:07:56.866580 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:07:56 crc kubenswrapper[4792]: I0309 09:07:56.866590 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:07:57 crc 
kubenswrapper[4792]: E0309 09:07:57.470455 4792 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 09 09:07:57 crc kubenswrapper[4792]: I0309 09:07:57.495920 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 09:07:57 crc kubenswrapper[4792]: I0309 09:07:57.497624 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:07:57 crc kubenswrapper[4792]: I0309 09:07:57.497682 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:07:57 crc kubenswrapper[4792]: I0309 09:07:57.497694 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:07:57 crc kubenswrapper[4792]: I0309 09:07:57.497720 4792 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 09 09:07:57 crc kubenswrapper[4792]: E0309 09:07:57.502576 4792 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 09 09:07:57 crc kubenswrapper[4792]: I0309 09:07:57.599433 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 09 09:07:57 crc kubenswrapper[4792]: I0309 09:07:57.870211 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 09 09:07:57 crc kubenswrapper[4792]: I0309 
09:07:57.871447 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 09 09:07:57 crc kubenswrapper[4792]: I0309 09:07:57.873683 4792 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="62ff4c7382c6f35a7bcfd4126e382b650475955a51ae685f336bbc7620276ce8" exitCode=255 Mar 09 09:07:57 crc kubenswrapper[4792]: I0309 09:07:57.873721 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"62ff4c7382c6f35a7bcfd4126e382b650475955a51ae685f336bbc7620276ce8"} Mar 09 09:07:57 crc kubenswrapper[4792]: I0309 09:07:57.873751 4792 scope.go:117] "RemoveContainer" containerID="c29c41cc1a6c9016971f8edc2d8cb9d0636af81983ae689343a5b1df158b8860" Mar 09 09:07:57 crc kubenswrapper[4792]: I0309 09:07:57.873884 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 09:07:57 crc kubenswrapper[4792]: I0309 09:07:57.874607 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:07:57 crc kubenswrapper[4792]: I0309 09:07:57.874636 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:07:57 crc kubenswrapper[4792]: I0309 09:07:57.874644 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:07:57 crc kubenswrapper[4792]: I0309 09:07:57.875316 4792 scope.go:117] "RemoveContainer" containerID="62ff4c7382c6f35a7bcfd4126e382b650475955a51ae685f336bbc7620276ce8" Mar 09 09:07:57 crc kubenswrapper[4792]: E0309 09:07:57.875500 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with 
CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 09 09:07:58 crc kubenswrapper[4792]: I0309 09:07:58.600088 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 09 09:07:58 crc kubenswrapper[4792]: W0309 09:07:58.811164 4792 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User "system:anonymous" cannot list resource "runtimeclasses" in API group "node.k8s.io" at the cluster scope Mar 09 09:07:58 crc kubenswrapper[4792]: E0309 09:07:58.811463 4792 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"runtimeclasses\" in API group \"node.k8s.io\" at the cluster scope" logger="UnhandledError" Mar 09 09:07:58 crc kubenswrapper[4792]: I0309 09:07:58.877048 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 09 09:07:59 crc kubenswrapper[4792]: I0309 09:07:59.599113 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 09 09:08:00 crc kubenswrapper[4792]: I0309 09:08:00.225042 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 09:08:00 crc kubenswrapper[4792]: I0309 09:08:00.225679 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 09:08:00 crc kubenswrapper[4792]: I0309 09:08:00.226887 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:08:00 crc kubenswrapper[4792]: I0309 09:08:00.226924 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:08:00 crc kubenswrapper[4792]: I0309 09:08:00.226936 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:08:00 crc kubenswrapper[4792]: I0309 09:08:00.227432 4792 scope.go:117] "RemoveContainer" containerID="62ff4c7382c6f35a7bcfd4126e382b650475955a51ae685f336bbc7620276ce8" Mar 09 09:08:00 crc kubenswrapper[4792]: E0309 09:08:00.227622 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 09 09:08:00 crc kubenswrapper[4792]: I0309 09:08:00.600096 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 09 09:08:00 crc kubenswrapper[4792]: I0309 09:08:00.888326 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 09 09:08:00 crc kubenswrapper[4792]: I0309 09:08:00.888506 4792 kubelet_node_status.go:401] "Setting node 
annotation to enable volume controller attach/detach" Mar 09 09:08:00 crc kubenswrapper[4792]: I0309 09:08:00.889509 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:08:00 crc kubenswrapper[4792]: I0309 09:08:00.889533 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:08:00 crc kubenswrapper[4792]: I0309 09:08:00.889543 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:08:00 crc kubenswrapper[4792]: I0309 09:08:00.893243 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 09 09:08:01 crc kubenswrapper[4792]: I0309 09:08:01.598295 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 09 09:08:01 crc kubenswrapper[4792]: W0309 09:08:01.755717 4792 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: services is forbidden: User "system:anonymous" cannot list resource "services" in API group "" at the cluster scope Mar 09 09:08:01 crc kubenswrapper[4792]: E0309 09:08:01.755832 4792 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" Mar 09 09:08:01 crc kubenswrapper[4792]: I0309 09:08:01.889162 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 09:08:01 crc kubenswrapper[4792]: I0309 09:08:01.890112 4792 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:08:01 crc kubenswrapper[4792]: I0309 09:08:01.890159 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:08:01 crc kubenswrapper[4792]: I0309 09:08:01.890171 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:08:02 crc kubenswrapper[4792]: W0309 09:08:02.153106 4792 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: nodes "crc" is forbidden: User "system:anonymous" cannot list resource "nodes" in API group "" at the cluster scope Mar 09 09:08:02 crc kubenswrapper[4792]: E0309 09:08:02.153169 4792 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: nodes \"crc\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" Mar 09 09:08:02 crc kubenswrapper[4792]: I0309 09:08:02.822553 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 09 09:08:03 crc kubenswrapper[4792]: I0309 09:08:03.600919 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 09 09:08:03 crc kubenswrapper[4792]: I0309 09:08:03.717998 4792 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 09:08:03 crc kubenswrapper[4792]: I0309 09:08:03.718217 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 09:08:03 crc 
kubenswrapper[4792]: I0309 09:08:03.719200 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:08:03 crc kubenswrapper[4792]: I0309 09:08:03.719234 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:08:03 crc kubenswrapper[4792]: I0309 09:08:03.719245 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:08:03 crc kubenswrapper[4792]: I0309 09:08:03.719774 4792 scope.go:117] "RemoveContainer" containerID="62ff4c7382c6f35a7bcfd4126e382b650475955a51ae685f336bbc7620276ce8" Mar 09 09:08:03 crc kubenswrapper[4792]: E0309 09:08:03.719925 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 09 09:08:04 crc kubenswrapper[4792]: E0309 09:08:04.478093 4792 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 09 09:08:04 crc kubenswrapper[4792]: I0309 09:08:04.503161 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 09:08:04 crc kubenswrapper[4792]: I0309 09:08:04.504783 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:08:04 crc kubenswrapper[4792]: I0309 09:08:04.504837 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:08:04 
crc kubenswrapper[4792]: I0309 09:08:04.504855 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:08:04 crc kubenswrapper[4792]: I0309 09:08:04.504893 4792 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 09 09:08:04 crc kubenswrapper[4792]: E0309 09:08:04.506885 4792 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 09 09:08:04 crc kubenswrapper[4792]: I0309 09:08:04.601584 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 09 09:08:05 crc kubenswrapper[4792]: W0309 09:08:05.250289 4792 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:anonymous" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope Mar 09 09:08:05 crc kubenswrapper[4792]: E0309 09:08:05.250335 4792 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" Mar 09 09:08:05 crc kubenswrapper[4792]: I0309 09:08:05.599909 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 09 09:08:05 crc kubenswrapper[4792]: E0309 09:08:05.758439 4792 eviction_manager.go:285] "Eviction manager: failed to get 
summary stats" err="failed to get node info: node \"crc\" not found" Mar 09 09:08:06 crc kubenswrapper[4792]: I0309 09:08:06.598919 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 09 09:08:07 crc kubenswrapper[4792]: I0309 09:08:07.603687 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 09 09:08:08 crc kubenswrapper[4792]: I0309 09:08:08.602892 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 09 09:08:09 crc kubenswrapper[4792]: I0309 09:08:09.600885 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 09 09:08:10 crc kubenswrapper[4792]: I0309 09:08:10.602674 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 09 09:08:11 crc kubenswrapper[4792]: I0309 09:08:11.083303 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 09 09:08:11 crc kubenswrapper[4792]: I0309 09:08:11.083450 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 09:08:11 crc kubenswrapper[4792]: I0309 
09:08:11.084680 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:08:11 crc kubenswrapper[4792]: I0309 09:08:11.084845 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:08:11 crc kubenswrapper[4792]: I0309 09:08:11.084941 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:08:11 crc kubenswrapper[4792]: E0309 09:08:11.483737 4792 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 09 09:08:11 crc kubenswrapper[4792]: I0309 09:08:11.507339 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 09:08:11 crc kubenswrapper[4792]: I0309 09:08:11.508960 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:08:11 crc kubenswrapper[4792]: I0309 09:08:11.509017 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:08:11 crc kubenswrapper[4792]: I0309 09:08:11.509033 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:08:11 crc kubenswrapper[4792]: I0309 09:08:11.509093 4792 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 09 09:08:11 crc kubenswrapper[4792]: E0309 09:08:11.514033 4792 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 09 09:08:11 crc kubenswrapper[4792]: I0309 09:08:11.599055 4792 csi_plugin.go:884] Failed to contact API 
server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 09 09:08:12 crc kubenswrapper[4792]: I0309 09:08:12.599247 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 09 09:08:13 crc kubenswrapper[4792]: I0309 09:08:13.603021 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 09 09:08:14 crc kubenswrapper[4792]: I0309 09:08:14.601342 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 09 09:08:14 crc kubenswrapper[4792]: I0309 09:08:14.662249 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 09:08:14 crc kubenswrapper[4792]: I0309 09:08:14.663613 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:08:14 crc kubenswrapper[4792]: I0309 09:08:14.663663 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:08:14 crc kubenswrapper[4792]: I0309 09:08:14.663683 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:08:14 crc kubenswrapper[4792]: I0309 09:08:14.664498 4792 scope.go:117] "RemoveContainer" containerID="62ff4c7382c6f35a7bcfd4126e382b650475955a51ae685f336bbc7620276ce8" Mar 09 09:08:14 crc 
kubenswrapper[4792]: E0309 09:08:14.664726 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 09 09:08:15 crc kubenswrapper[4792]: I0309 09:08:15.597281 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 09 09:08:15 crc kubenswrapper[4792]: E0309 09:08:15.759445 4792 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 09 09:08:16 crc kubenswrapper[4792]: I0309 09:08:16.600809 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 09 09:08:17 crc kubenswrapper[4792]: I0309 09:08:17.599526 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 09 09:08:18 crc kubenswrapper[4792]: E0309 09:08:18.489898 4792 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 09 09:08:18 crc kubenswrapper[4792]: I0309 09:08:18.514243 4792 kubelet_node_status.go:401] "Setting node 
annotation to enable volume controller attach/detach" Mar 09 09:08:18 crc kubenswrapper[4792]: I0309 09:08:18.517062 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:08:18 crc kubenswrapper[4792]: I0309 09:08:18.517127 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:08:18 crc kubenswrapper[4792]: I0309 09:08:18.517142 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:08:18 crc kubenswrapper[4792]: I0309 09:08:18.517166 4792 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 09 09:08:18 crc kubenswrapper[4792]: E0309 09:08:18.521490 4792 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 09 09:08:18 crc kubenswrapper[4792]: I0309 09:08:18.599615 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 09 09:08:19 crc kubenswrapper[4792]: I0309 09:08:19.598605 4792 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 09 09:08:20 crc kubenswrapper[4792]: I0309 09:08:20.046299 4792 csr.go:261] certificate signing request csr-pktxk is approved, waiting to be issued Mar 09 09:08:20 crc kubenswrapper[4792]: I0309 09:08:20.058995 4792 csr.go:257] certificate signing request csr-pktxk is issued Mar 09 09:08:20 crc kubenswrapper[4792]: I0309 09:08:20.177032 4792 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" 
Mar 09 09:08:20 crc kubenswrapper[4792]: I0309 09:08:20.448233 4792 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Mar 09 09:08:21 crc kubenswrapper[4792]: I0309 09:08:21.060630 4792 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-11-26 03:11:53.36748671 +0000 UTC Mar 09 09:08:21 crc kubenswrapper[4792]: I0309 09:08:21.060670 4792 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 6282h3m32.306819027s for next certificate rotation Mar 09 09:08:25 crc kubenswrapper[4792]: I0309 09:08:25.522377 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 09:08:25 crc kubenswrapper[4792]: I0309 09:08:25.524255 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:08:25 crc kubenswrapper[4792]: I0309 09:08:25.524305 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:08:25 crc kubenswrapper[4792]: I0309 09:08:25.524318 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:08:25 crc kubenswrapper[4792]: I0309 09:08:25.524440 4792 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 09 09:08:25 crc kubenswrapper[4792]: I0309 09:08:25.534915 4792 kubelet_node_status.go:115] "Node was previously registered" node="crc" Mar 09 09:08:25 crc kubenswrapper[4792]: I0309 09:08:25.535474 4792 kubelet_node_status.go:79] "Successfully registered node" node="crc" Mar 09 09:08:25 crc kubenswrapper[4792]: E0309 09:08:25.535567 4792 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Mar 09 09:08:25 crc kubenswrapper[4792]: I0309 09:08:25.538946 4792 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:08:25 crc kubenswrapper[4792]: I0309 09:08:25.539246 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:08:25 crc kubenswrapper[4792]: I0309 09:08:25.539381 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:08:25 crc kubenswrapper[4792]: I0309 09:08:25.539484 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:08:25 crc kubenswrapper[4792]: I0309 09:08:25.539553 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:08:25Z","lastTransitionTime":"2026-03-09T09:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:08:25 crc kubenswrapper[4792]: E0309 09:08:25.553363 4792 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:08:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:08:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:08:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:08:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e3b5ac96-f3df-45c5-a4ac-24aa5703690c\\\",\\\"systemUUID\\\":\\\"838abbcf-5467-42bb-9eb7-be30fe4962bb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 09:08:25 crc kubenswrapper[4792]: I0309 09:08:25.560336 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:08:25 crc kubenswrapper[4792]: I0309 09:08:25.560395 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:08:25 crc kubenswrapper[4792]: I0309 09:08:25.560421 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:08:25 crc kubenswrapper[4792]: I0309 09:08:25.560448 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:08:25 crc kubenswrapper[4792]: I0309 09:08:25.560465 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:08:25Z","lastTransitionTime":"2026-03-09T09:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:08:25 crc kubenswrapper[4792]: I0309 09:08:25.578749 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:08:25 crc kubenswrapper[4792]: I0309 09:08:25.578882 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:08:25 crc kubenswrapper[4792]: I0309 09:08:25.578948 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:08:25 crc kubenswrapper[4792]: I0309 09:08:25.579026 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:08:25 crc kubenswrapper[4792]: I0309 09:08:25.579110 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:08:25Z","lastTransitionTime":"2026-03-09T09:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:08:25 crc kubenswrapper[4792]: I0309 09:08:25.593869 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:08:25 crc kubenswrapper[4792]: I0309 09:08:25.594036 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:08:25 crc kubenswrapper[4792]: I0309 09:08:25.594125 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:08:25 crc kubenswrapper[4792]: I0309 09:08:25.594214 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:08:25 crc kubenswrapper[4792]: I0309 09:08:25.594305 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:08:25Z","lastTransitionTime":"2026-03-09T09:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:08:25 crc kubenswrapper[4792]: E0309 09:08:25.603765 4792 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:08:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:08:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:08:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:08:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e3b5ac96-f3df-45c5-a4ac-24aa5703690c\\\",\\\"systemUUID\\\":\\\"838abbcf-5467-42bb-9eb7-be30fe4962bb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 09:08:25 crc kubenswrapper[4792]: E0309 09:08:25.603867 4792 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 09 09:08:25 crc kubenswrapper[4792]: E0309 09:08:25.603892 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:08:25 crc kubenswrapper[4792]: E0309 09:08:25.704648 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:08:25 crc kubenswrapper[4792]: E0309 09:08:25.760278 4792 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 09 09:08:25 crc kubenswrapper[4792]: E0309 09:08:25.804775 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:08:25 crc kubenswrapper[4792]: E0309 09:08:25.904898 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:08:26 crc kubenswrapper[4792]: E0309 09:08:26.005558 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:08:26 crc kubenswrapper[4792]: E0309 09:08:26.105779 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:08:26 crc kubenswrapper[4792]: E0309 09:08:26.207223 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:08:26 crc kubenswrapper[4792]: E0309 09:08:26.307830 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:08:26 crc kubenswrapper[4792]: 
E0309 09:08:26.408614 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:08:26 crc kubenswrapper[4792]: E0309 09:08:26.509002 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:08:26 crc kubenswrapper[4792]: E0309 09:08:26.609641 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:08:26 crc kubenswrapper[4792]: I0309 09:08:26.662221 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 09:08:26 crc kubenswrapper[4792]: I0309 09:08:26.663372 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:08:26 crc kubenswrapper[4792]: I0309 09:08:26.663422 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:08:26 crc kubenswrapper[4792]: I0309 09:08:26.663440 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:08:26 crc kubenswrapper[4792]: I0309 09:08:26.664383 4792 scope.go:117] "RemoveContainer" containerID="62ff4c7382c6f35a7bcfd4126e382b650475955a51ae685f336bbc7620276ce8" Mar 09 09:08:26 crc kubenswrapper[4792]: E0309 09:08:26.709773 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:08:26 crc kubenswrapper[4792]: E0309 09:08:26.811036 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:08:26 crc kubenswrapper[4792]: E0309 09:08:26.911362 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:08:26 crc kubenswrapper[4792]: I0309 09:08:26.951460 4792 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 09 09:08:26 crc kubenswrapper[4792]: I0309 09:08:26.952805 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"420959a5229bf4ed5e6b94cf4a6685b96e4b50d13055cbccf66b6188e47bc770"} Mar 09 09:08:26 crc kubenswrapper[4792]: I0309 09:08:26.952910 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 09:08:26 crc kubenswrapper[4792]: I0309 09:08:26.953561 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:08:26 crc kubenswrapper[4792]: I0309 09:08:26.953596 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:08:26 crc kubenswrapper[4792]: I0309 09:08:26.953606 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:08:27 crc kubenswrapper[4792]: E0309 09:08:27.011767 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:08:27 crc kubenswrapper[4792]: E0309 09:08:27.112685 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:08:27 crc kubenswrapper[4792]: E0309 09:08:27.213748 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:08:27 crc kubenswrapper[4792]: E0309 09:08:27.314085 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:08:27 crc kubenswrapper[4792]: E0309 09:08:27.414707 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:08:27 
crc kubenswrapper[4792]: E0309 09:08:27.515570 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:08:27 crc kubenswrapper[4792]: E0309 09:08:27.616145 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:08:27 crc kubenswrapper[4792]: E0309 09:08:27.716710 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:08:27 crc kubenswrapper[4792]: E0309 09:08:27.816906 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:08:27 crc kubenswrapper[4792]: E0309 09:08:27.917410 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:08:27 crc kubenswrapper[4792]: I0309 09:08:27.956951 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 09 09:08:27 crc kubenswrapper[4792]: I0309 09:08:27.957766 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 09 09:08:27 crc kubenswrapper[4792]: I0309 09:08:27.959878 4792 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="420959a5229bf4ed5e6b94cf4a6685b96e4b50d13055cbccf66b6188e47bc770" exitCode=255 Mar 09 09:08:27 crc kubenswrapper[4792]: I0309 09:08:27.959912 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"420959a5229bf4ed5e6b94cf4a6685b96e4b50d13055cbccf66b6188e47bc770"} Mar 09 09:08:27 crc kubenswrapper[4792]: I0309 09:08:27.959950 4792 scope.go:117] "RemoveContainer" 
containerID="62ff4c7382c6f35a7bcfd4126e382b650475955a51ae685f336bbc7620276ce8" Mar 09 09:08:27 crc kubenswrapper[4792]: I0309 09:08:27.960165 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 09:08:27 crc kubenswrapper[4792]: I0309 09:08:27.961558 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:08:27 crc kubenswrapper[4792]: I0309 09:08:27.961620 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:08:27 crc kubenswrapper[4792]: I0309 09:08:27.961636 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:08:27 crc kubenswrapper[4792]: I0309 09:08:27.963937 4792 scope.go:117] "RemoveContainer" containerID="420959a5229bf4ed5e6b94cf4a6685b96e4b50d13055cbccf66b6188e47bc770" Mar 09 09:08:27 crc kubenswrapper[4792]: E0309 09:08:27.964263 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 09 09:08:28 crc kubenswrapper[4792]: E0309 09:08:28.018037 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:08:28 crc kubenswrapper[4792]: E0309 09:08:28.119200 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:08:28 crc kubenswrapper[4792]: E0309 09:08:28.219929 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:08:28 crc kubenswrapper[4792]: E0309 09:08:28.320746 4792 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:08:28 crc kubenswrapper[4792]: E0309 09:08:28.421601 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:08:28 crc kubenswrapper[4792]: E0309 09:08:28.521967 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:08:28 crc kubenswrapper[4792]: E0309 09:08:28.622666 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:08:28 crc kubenswrapper[4792]: E0309 09:08:28.724911 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:08:28 crc kubenswrapper[4792]: E0309 09:08:28.825618 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:08:28 crc kubenswrapper[4792]: E0309 09:08:28.925849 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:08:28 crc kubenswrapper[4792]: I0309 09:08:28.963338 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 09 09:08:29 crc kubenswrapper[4792]: E0309 09:08:29.026475 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:08:29 crc kubenswrapper[4792]: E0309 09:08:29.127270 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:08:29 crc kubenswrapper[4792]: E0309 09:08:29.227663 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:08:29 crc kubenswrapper[4792]: E0309 09:08:29.327742 4792 kubelet_node_status.go:503] "Error getting the current 
node from lister" err="node \"crc\" not found" Mar 09 09:08:29 crc kubenswrapper[4792]: E0309 09:08:29.428764 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:08:29 crc kubenswrapper[4792]: E0309 09:08:29.529452 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:08:29 crc kubenswrapper[4792]: E0309 09:08:29.630330 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:08:29 crc kubenswrapper[4792]: E0309 09:08:29.731482 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:08:29 crc kubenswrapper[4792]: E0309 09:08:29.832487 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:08:29 crc kubenswrapper[4792]: E0309 09:08:29.933634 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:08:30 crc kubenswrapper[4792]: E0309 09:08:30.034275 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:08:30 crc kubenswrapper[4792]: E0309 09:08:30.134525 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:08:30 crc kubenswrapper[4792]: I0309 09:08:30.225523 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 09:08:30 crc kubenswrapper[4792]: I0309 09:08:30.225744 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 09:08:30 crc kubenswrapper[4792]: I0309 09:08:30.227045 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:08:30 crc kubenswrapper[4792]: I0309 09:08:30.227225 4792 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:08:30 crc kubenswrapper[4792]: I0309 09:08:30.227341 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:08:30 crc kubenswrapper[4792]: I0309 09:08:30.228171 4792 scope.go:117] "RemoveContainer" containerID="420959a5229bf4ed5e6b94cf4a6685b96e4b50d13055cbccf66b6188e47bc770" Mar 09 09:08:30 crc kubenswrapper[4792]: E0309 09:08:30.228470 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 09 09:08:30 crc kubenswrapper[4792]: E0309 09:08:30.235337 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:08:30 crc kubenswrapper[4792]: E0309 09:08:30.336256 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:08:30 crc kubenswrapper[4792]: E0309 09:08:30.437132 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:08:30 crc kubenswrapper[4792]: E0309 09:08:30.537716 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:08:30 crc kubenswrapper[4792]: E0309 09:08:30.638430 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:08:30 crc kubenswrapper[4792]: E0309 09:08:30.739470 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:08:30 crc kubenswrapper[4792]: E0309 
09:08:30.840163 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:08:30 crc kubenswrapper[4792]: E0309 09:08:30.940759 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:08:31 crc kubenswrapper[4792]: E0309 09:08:31.041395 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:08:31 crc kubenswrapper[4792]: E0309 09:08:31.142380 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:08:31 crc kubenswrapper[4792]: E0309 09:08:31.243448 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:08:31 crc kubenswrapper[4792]: E0309 09:08:31.343596 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:08:31 crc kubenswrapper[4792]: E0309 09:08:31.444550 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:08:31 crc kubenswrapper[4792]: E0309 09:08:31.545542 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:08:31 crc kubenswrapper[4792]: E0309 09:08:31.646698 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:08:31 crc kubenswrapper[4792]: E0309 09:08:31.748667 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:08:31 crc kubenswrapper[4792]: E0309 09:08:31.850283 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:08:31 crc kubenswrapper[4792]: E0309 09:08:31.951454 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 
09:08:32 crc kubenswrapper[4792]: E0309 09:08:32.052464 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:08:32 crc kubenswrapper[4792]: E0309 09:08:32.152662 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:08:32 crc kubenswrapper[4792]: E0309 09:08:32.253130 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:08:32 crc kubenswrapper[4792]: E0309 09:08:32.354270 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:08:32 crc kubenswrapper[4792]: E0309 09:08:32.455399 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:08:32 crc kubenswrapper[4792]: E0309 09:08:32.556151 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:08:32 crc kubenswrapper[4792]: E0309 09:08:32.657149 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:08:32 crc kubenswrapper[4792]: E0309 09:08:32.757446 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:08:32 crc kubenswrapper[4792]: E0309 09:08:32.858202 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:08:32 crc kubenswrapper[4792]: E0309 09:08:32.959046 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:08:33 crc kubenswrapper[4792]: E0309 09:08:33.060221 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:08:33 crc kubenswrapper[4792]: E0309 09:08:33.160901 4792 kubelet_node_status.go:503] "Error getting the current node from 
lister" err="node \"crc\" not found" Mar 09 09:08:33 crc kubenswrapper[4792]: E0309 09:08:33.261177 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:08:33 crc kubenswrapper[4792]: E0309 09:08:33.361763 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:08:33 crc kubenswrapper[4792]: E0309 09:08:33.462233 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:08:33 crc kubenswrapper[4792]: E0309 09:08:33.562846 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:08:33 crc kubenswrapper[4792]: E0309 09:08:33.663875 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:08:33 crc kubenswrapper[4792]: I0309 09:08:33.717934 4792 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 09:08:33 crc kubenswrapper[4792]: I0309 09:08:33.718321 4792 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 09:08:33 crc kubenswrapper[4792]: I0309 09:08:33.720269 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:08:33 crc kubenswrapper[4792]: I0309 09:08:33.720332 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:08:33 crc kubenswrapper[4792]: I0309 09:08:33.720344 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:08:33 crc kubenswrapper[4792]: I0309 09:08:33.721320 4792 scope.go:117] "RemoveContainer" containerID="420959a5229bf4ed5e6b94cf4a6685b96e4b50d13055cbccf66b6188e47bc770" Mar 09 09:08:33 crc kubenswrapper[4792]: E0309 09:08:33.721546 
4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 09 09:08:33 crc kubenswrapper[4792]: E0309 09:08:33.764102 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:08:33 crc kubenswrapper[4792]: E0309 09:08:33.864923 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:08:33 crc kubenswrapper[4792]: E0309 09:08:33.966096 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:08:34 crc kubenswrapper[4792]: E0309 09:08:34.067136 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:08:34 crc kubenswrapper[4792]: E0309 09:08:34.167321 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:08:34 crc kubenswrapper[4792]: E0309 09:08:34.268660 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:08:34 crc kubenswrapper[4792]: E0309 09:08:34.369660 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:08:34 crc kubenswrapper[4792]: E0309 09:08:34.470789 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:08:34 crc kubenswrapper[4792]: E0309 09:08:34.571489 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:08:34 crc kubenswrapper[4792]: E0309 09:08:34.672527 
4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:08:34 crc kubenswrapper[4792]: E0309 09:08:34.773560 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:08:34 crc kubenswrapper[4792]: E0309 09:08:34.874039 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:08:34 crc kubenswrapper[4792]: E0309 09:08:34.975149 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:08:35 crc kubenswrapper[4792]: E0309 09:08:35.075624 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:08:35 crc kubenswrapper[4792]: E0309 09:08:35.176762 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:08:35 crc kubenswrapper[4792]: E0309 09:08:35.277400 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:08:35 crc kubenswrapper[4792]: I0309 09:08:35.322975 4792 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 09 09:08:35 crc kubenswrapper[4792]: E0309 09:08:35.378435 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:08:35 crc kubenswrapper[4792]: E0309 09:08:35.479376 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:08:35 crc kubenswrapper[4792]: E0309 09:08:35.579670 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:08:35 crc kubenswrapper[4792]: E0309 09:08:35.679820 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:08:35 crc 
kubenswrapper[4792]: E0309 09:08:35.760471 4792 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 09 09:08:35 crc kubenswrapper[4792]: E0309 09:08:35.780983 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:08:35 crc kubenswrapper[4792]: E0309 09:08:35.881439 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:08:35 crc kubenswrapper[4792]: E0309 09:08:35.945090 4792 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Mar 09 09:08:35 crc kubenswrapper[4792]: I0309 09:08:35.951988 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:08:35 crc kubenswrapper[4792]: I0309 09:08:35.952034 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:08:35 crc kubenswrapper[4792]: I0309 09:08:35.952044 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:08:35 crc kubenswrapper[4792]: I0309 09:08:35.952059 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:08:35 crc kubenswrapper[4792]: I0309 09:08:35.952094 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:08:35Z","lastTransitionTime":"2026-03-09T09:08:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:08:35 crc kubenswrapper[4792]: E0309 09:08:35.968307 4792 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:08:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:08:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:08:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:08:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e3b5ac96-f3df-45c5-a4ac-24aa5703690c\\\",\\\"systemUUID\\\":\\\"838abbcf-5467-42bb-9eb7-be30fe4962bb\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 09:08:35 crc kubenswrapper[4792]: I0309 09:08:35.974257 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:08:35 crc kubenswrapper[4792]: I0309 09:08:35.974302 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:08:35 crc kubenswrapper[4792]: I0309 09:08:35.974317 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:08:35 crc kubenswrapper[4792]: I0309 09:08:35.974335 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:08:35 crc kubenswrapper[4792]: I0309 09:08:35.974346 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:08:35Z","lastTransitionTime":"2026-03-09T09:08:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:08:35 crc kubenswrapper[4792]: E0309 09:08:35.993927 4792 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:08:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:08:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:08:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:08:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e3b5ac96-f3df-45c5-a4ac-24aa5703690c\\\",\\\"systemUUID\\\":\\\"838abbcf-5467-42bb-9eb7-be30fe4962bb\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 09:08:35 crc kubenswrapper[4792]: I0309 09:08:35.999133 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:08:35 crc kubenswrapper[4792]: I0309 09:08:35.999220 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:08:35 crc kubenswrapper[4792]: I0309 09:08:35.999243 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:08:35 crc kubenswrapper[4792]: I0309 09:08:35.999277 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:08:35 crc kubenswrapper[4792]: I0309 09:08:35.999300 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:08:35Z","lastTransitionTime":"2026-03-09T09:08:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:08:36 crc kubenswrapper[4792]: E0309 09:08:36.016587 4792 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:08:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:08:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:08:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:08:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e3b5ac96-f3df-45c5-a4ac-24aa5703690c\\\",\\\"systemUUID\\\":\\\"838abbcf-5467-42bb-9eb7-be30fe4962bb\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 09:08:36 crc kubenswrapper[4792]: I0309 09:08:36.021521 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:08:36 crc kubenswrapper[4792]: I0309 09:08:36.021571 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:08:36 crc kubenswrapper[4792]: I0309 09:08:36.021582 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:08:36 crc kubenswrapper[4792]: I0309 09:08:36.021602 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:08:36 crc kubenswrapper[4792]: I0309 09:08:36.021614 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:08:36Z","lastTransitionTime":"2026-03-09T09:08:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:08:36 crc kubenswrapper[4792]: E0309 09:08:36.036362 4792 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:08:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:08:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:08:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:08:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e3b5ac96-f3df-45c5-a4ac-24aa5703690c\\\",\\\"systemUUID\\\":\\\"838abbcf-5467-42bb-9eb7-be30fe4962bb\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 09:08:36 crc kubenswrapper[4792]: E0309 09:08:36.036614 4792 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 09 09:08:36 crc kubenswrapper[4792]: E0309 09:08:36.036638 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:08:36 crc kubenswrapper[4792]: E0309 09:08:36.136797 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:08:36 crc kubenswrapper[4792]: I0309 09:08:36.136851 4792 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 09 09:08:36 crc kubenswrapper[4792]: E0309 09:08:36.238101 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:08:36 crc kubenswrapper[4792]: E0309 09:08:36.338888 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:08:36 crc kubenswrapper[4792]: E0309 09:08:36.439444 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:08:36 crc kubenswrapper[4792]: E0309 09:08:36.539779 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:08:36 crc kubenswrapper[4792]: E0309 
09:08:36.639977 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:08:36 crc kubenswrapper[4792]: E0309 09:08:36.741267 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:08:36 crc kubenswrapper[4792]: E0309 09:08:36.842929 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:08:36 crc kubenswrapper[4792]: E0309 09:08:36.943912 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:08:37 crc kubenswrapper[4792]: E0309 09:08:37.044238 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:08:37 crc kubenswrapper[4792]: E0309 09:08:37.145220 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:08:37 crc kubenswrapper[4792]: E0309 09:08:37.246393 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:08:37 crc kubenswrapper[4792]: E0309 09:08:37.346961 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:08:37 crc kubenswrapper[4792]: E0309 09:08:37.447959 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:08:37 crc kubenswrapper[4792]: E0309 09:08:37.548525 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:08:37 crc kubenswrapper[4792]: E0309 09:08:37.649613 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:08:37 crc kubenswrapper[4792]: E0309 09:08:37.750184 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 
09:08:37 crc kubenswrapper[4792]: E0309 09:08:37.851372 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:08:37 crc kubenswrapper[4792]: E0309 09:08:37.952814 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:08:38 crc kubenswrapper[4792]: E0309 09:08:38.053290 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:08:38 crc kubenswrapper[4792]: E0309 09:08:38.154106 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:08:38 crc kubenswrapper[4792]: E0309 09:08:38.255030 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:08:38 crc kubenswrapper[4792]: E0309 09:08:38.355665 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:08:38 crc kubenswrapper[4792]: E0309 09:08:38.456791 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:08:38 crc kubenswrapper[4792]: E0309 09:08:38.556968 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:08:38 crc kubenswrapper[4792]: E0309 09:08:38.657994 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:08:38 crc kubenswrapper[4792]: E0309 09:08:38.758385 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:08:38 crc kubenswrapper[4792]: E0309 09:08:38.858648 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:08:38 crc kubenswrapper[4792]: E0309 09:08:38.959438 4792 kubelet_node_status.go:503] "Error getting the current node from 
lister" err="node \"crc\" not found" Mar 09 09:08:39 crc kubenswrapper[4792]: E0309 09:08:39.060057 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:08:39 crc kubenswrapper[4792]: E0309 09:08:39.161206 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:08:39 crc kubenswrapper[4792]: E0309 09:08:39.261976 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:08:39 crc kubenswrapper[4792]: E0309 09:08:39.362890 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:08:39 crc kubenswrapper[4792]: E0309 09:08:39.463851 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:08:39 crc kubenswrapper[4792]: I0309 09:08:39.495733 4792 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 09 09:08:39 crc kubenswrapper[4792]: E0309 09:08:39.564812 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:08:39 crc kubenswrapper[4792]: E0309 09:08:39.665138 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:08:39 crc kubenswrapper[4792]: E0309 09:08:39.766290 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:08:39 crc kubenswrapper[4792]: E0309 09:08:39.866656 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:08:39 crc kubenswrapper[4792]: E0309 09:08:39.966779 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:08:40 crc kubenswrapper[4792]: E0309 09:08:40.067333 4792 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:08:40 crc kubenswrapper[4792]: E0309 09:08:40.167696 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:08:40 crc kubenswrapper[4792]: E0309 09:08:40.268804 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:08:40 crc kubenswrapper[4792]: E0309 09:08:40.369641 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:08:40 crc kubenswrapper[4792]: E0309 09:08:40.470347 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:08:40 crc kubenswrapper[4792]: E0309 09:08:40.571391 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:08:40 crc kubenswrapper[4792]: E0309 09:08:40.672161 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:08:40 crc kubenswrapper[4792]: E0309 09:08:40.773158 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:08:40 crc kubenswrapper[4792]: E0309 09:08:40.873645 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:08:40 crc kubenswrapper[4792]: E0309 09:08:40.974694 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:08:41 crc kubenswrapper[4792]: E0309 09:08:41.075108 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:08:41 crc kubenswrapper[4792]: E0309 09:08:41.175283 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:08:41 crc 
kubenswrapper[4792]: E0309 09:08:41.276082 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:08:41 crc kubenswrapper[4792]: E0309 09:08:41.377003 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:08:41 crc kubenswrapper[4792]: E0309 09:08:41.477981 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:08:41 crc kubenswrapper[4792]: E0309 09:08:41.579116 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:08:41 crc kubenswrapper[4792]: E0309 09:08:41.679548 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:08:41 crc kubenswrapper[4792]: E0309 09:08:41.780779 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:08:41 crc kubenswrapper[4792]: E0309 09:08:41.881122 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:08:41 crc kubenswrapper[4792]: E0309 09:08:41.982174 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:08:42 crc kubenswrapper[4792]: E0309 09:08:42.082923 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:08:42 crc kubenswrapper[4792]: E0309 09:08:42.183117 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:08:42 crc kubenswrapper[4792]: E0309 09:08:42.283693 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:08:42 crc kubenswrapper[4792]: E0309 09:08:42.384429 4792 kubelet_node_status.go:503] "Error getting the current node from lister" 
err="node \"crc\" not found" Mar 09 09:08:42 crc kubenswrapper[4792]: E0309 09:08:42.485312 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:08:42 crc kubenswrapper[4792]: E0309 09:08:42.586056 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:08:42 crc kubenswrapper[4792]: E0309 09:08:42.687204 4792 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.703500 4792 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.789533 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.789571 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.789581 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.789607 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.789619 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:08:42Z","lastTransitionTime":"2026-03-09T09:08:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.843464 4792 apiserver.go:52] "Watching apiserver" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.848754 4792 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.849115 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-97tth","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ssk9p","openshift-multus/multus-additional-cni-plugins-4tprh","openshift-multus/multus-vgtc9","openshift-network-node-identity/network-node-identity-vrzqb","openshift-dns/node-resolver-k4kdn","openshift-image-registry/node-ca-fgk47","openshift-multus/network-metrics-daemon-fttpc","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-target-xd92c","openshift-ovn-kubernetes/ovnkube-node-lfm2j"] Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.849437 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.849529 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.849490 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.849450 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.849894 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-fgk47" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.849976 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-k4kdn" Mar 09 09:08:42 crc kubenswrapper[4792]: E0309 09:08:42.850026 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.849996 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.850233 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.850250 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-97tth" Mar 09 09:08:42 crc kubenswrapper[4792]: E0309 09:08:42.850317 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 09:08:42 crc kubenswrapper[4792]: E0309 09:08:42.850378 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.850770 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-vgtc9" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.850976 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-lfm2j" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.850877 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-4tprh" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.850794 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fttpc" Mar 09 09:08:42 crc kubenswrapper[4792]: E0309 09:08:42.851474 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fttpc" podUID="4711cce5-88a9-48c4-8e2e-522062e34a03" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.850892 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ssk9p" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.852355 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.852366 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.854593 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.856348 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.857598 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.857778 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.858154 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.858582 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.858864 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.858890 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.859346 
4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.859481 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.859605 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.859790 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.859876 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.860021 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.860184 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.860391 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.860527 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.860723 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.860827 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Mar 09 
09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.860932 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.861021 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.861129 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.861391 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.861428 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.861592 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.861777 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.861939 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.862035 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.862140 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.862231 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" 
Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.862321 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.862375 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.862547 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.862662 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.862882 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.874262 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.883617 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.893131 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.893158 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.893167 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.893179 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.893188 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:08:42Z","lastTransitionTime":"2026-03-09T09:08:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.894790 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.897826 4792 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.904314 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-k4kdn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cb7634b-66b7-4541-8e53-3e01a6cb41ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5jf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-k4kdn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.913424 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ssk9p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ceed39a1-2e4f-4630-bda0-57071ac26ee4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7tn4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7tn4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ssk9p\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.924687 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.933047 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.941885 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vgtc9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"626ea896-2e5c-4478-a7be-34a19acc242d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4rw87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vgtc9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.952959 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4tprh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b59ff3b-540d-4385-b02b-f68349bb74bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{
\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4tprh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.955361 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.955393 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.955413 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.955430 4792 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.955445 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.955461 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.955477 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.955492 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.955511 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.955526 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.955541 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.955556 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.955573 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.955587 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 09:08:42 crc 
kubenswrapper[4792]: I0309 09:08:42.955602 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.955618 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.955635 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.955651 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.955669 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.955683 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.955699 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.955715 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.955732 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.955758 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.955775 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.955790 
4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.955805 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.955821 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.955837 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.955853 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.955871 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: 
\"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.955888 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.955904 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.955919 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.955937 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.955953 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: 
\"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.955967 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.955983 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.956003 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.956026 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.956047 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.956063 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" 
(UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.956094 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.956116 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.956137 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.956160 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.956182 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 09:08:42 crc 
kubenswrapper[4792]: I0309 09:08:42.956203 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.956227 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.956249 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.956268 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.956289 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.956310 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod 
\"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.956329 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.956348 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.956368 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.956389 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.956407 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.956400 4792 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.956426 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.956447 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.956465 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.956489 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.956511 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: 
\"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.956534 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.956559 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.956580 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.956600 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.956621 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: 
\"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.956639 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.956660 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.956681 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.956699 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.956719 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.956738 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.956757 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.956779 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.956799 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.956819 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.956837 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 09 09:08:42 
crc kubenswrapper[4792]: I0309 09:08:42.956855 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.956879 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.956900 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.956922 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.956944 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.956965 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: 
\"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.956970 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.956987 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.957008 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.957032 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.957056 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod 
\"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.957096 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.957195 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.957218 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.957234 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.957251 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.957267 4792 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.957282 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.957301 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.957316 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.957332 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.957346 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod 
\"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.957362 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.957377 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.957393 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.957410 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.957425 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.957440 4792 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.957455 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.957471 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.957485 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.957501 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.957516 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod 
\"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.957532 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.957549 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.957564 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.957588 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.957603 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.957618 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.957633 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.957650 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.957665 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.957682 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.957700 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod 
\"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.957717 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.957732 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.957748 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.957763 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.957779 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.957793 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.957810 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.957826 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.957841 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.957857 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.957876 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 
09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.957893 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.957908 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.957924 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.957939 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.957955 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.957975 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: 
\"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.957991 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.958007 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.958022 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.958038 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.958054 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 
09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.958092 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.958112 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.958133 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.958156 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.958175 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.958191 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.958209 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.958225 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.958243 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.958259 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.958274 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: 
\"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.958290 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.958310 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.958332 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.958330 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.958357 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.958381 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.958407 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.958428 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.958450 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.958472 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.958495 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.958519 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.958538 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.958548 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.958554 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.958590 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.958610 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.958627 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.958644 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.958660 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" 
(UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.958675 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.958691 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.958708 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.958723 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.958743 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 09 
09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.958767 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.958790 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.958813 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.958834 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.958853 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.958878 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.958900 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.958920 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.958942 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.958964 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.958988 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.959012 4792 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.959036 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.959061 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.959138 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.959165 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.959188 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: 
\"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.959276 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/740550e5-d1a4-4f0c-8efd-1ccd8f9319e5-host-cni-netd\") pod \"ovnkube-node-lfm2j\" (UID: \"740550e5-d1a4-4f0c-8efd-1ccd8f9319e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-lfm2j" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.959549 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.959577 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/bd11045a-d746-4b42-872c-8b8d1dd2d515-proxy-tls\") pod \"machine-config-daemon-97tth\" (UID: \"bd11045a-d746-4b42-872c-8b8d1dd2d515\") " pod="openshift-machine-config-operator/machine-config-daemon-97tth" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.959597 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/740550e5-d1a4-4f0c-8efd-1ccd8f9319e5-var-lib-openvswitch\") pod \"ovnkube-node-lfm2j\" (UID: \"740550e5-d1a4-4f0c-8efd-1ccd8f9319e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-lfm2j" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.959621 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" 
(UniqueName: \"kubernetes.io/host-path/740550e5-d1a4-4f0c-8efd-1ccd8f9319e5-host-kubelet\") pod \"ovnkube-node-lfm2j\" (UID: \"740550e5-d1a4-4f0c-8efd-1ccd8f9319e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-lfm2j" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.959636 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/740550e5-d1a4-4f0c-8efd-1ccd8f9319e5-systemd-units\") pod \"ovnkube-node-lfm2j\" (UID: \"740550e5-d1a4-4f0c-8efd-1ccd8f9319e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-lfm2j" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.959652 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/2b59ff3b-540d-4385-b02b-f68349bb74bf-tuning-conf-dir\") pod \"multus-additional-cni-plugins-4tprh\" (UID: \"2b59ff3b-540d-4385-b02b-f68349bb74bf\") " pod="openshift-multus/multus-additional-cni-plugins-4tprh" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.959671 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.959691 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cpkhp\" (UniqueName: \"kubernetes.io/projected/84b6eb44-ca33-41a6-a951-2c66688ad860-kube-api-access-cpkhp\") pod \"node-ca-fgk47\" (UID: \"84b6eb44-ca33-41a6-a951-2c66688ad860\") " pod="openshift-image-registry/node-ca-fgk47" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.959707 4792 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/740550e5-d1a4-4f0c-8efd-1ccd8f9319e5-log-socket\") pod \"ovnkube-node-lfm2j\" (UID: \"740550e5-d1a4-4f0c-8efd-1ccd8f9319e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-lfm2j" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.959721 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/84b6eb44-ca33-41a6-a951-2c66688ad860-host\") pod \"node-ca-fgk47\" (UID: \"84b6eb44-ca33-41a6-a951-2c66688ad860\") " pod="openshift-image-registry/node-ca-fgk47" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.959736 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/740550e5-d1a4-4f0c-8efd-1ccd8f9319e5-host-run-netns\") pod \"ovnkube-node-lfm2j\" (UID: \"740550e5-d1a4-4f0c-8efd-1ccd8f9319e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-lfm2j" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.959751 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/626ea896-2e5c-4478-a7be-34a19acc242d-host-var-lib-cni-bin\") pod \"multus-vgtc9\" (UID: \"626ea896-2e5c-4478-a7be-34a19acc242d\") " pod="openshift-multus/multus-vgtc9" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.959766 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.959783 4792 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/2b59ff3b-540d-4385-b02b-f68349bb74bf-cnibin\") pod \"multus-additional-cni-plugins-4tprh\" (UID: \"2b59ff3b-540d-4385-b02b-f68349bb74bf\") " pod="openshift-multus/multus-additional-cni-plugins-4tprh" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.959801 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.959815 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/740550e5-d1a4-4f0c-8efd-1ccd8f9319e5-run-ovn\") pod \"ovnkube-node-lfm2j\" (UID: \"740550e5-d1a4-4f0c-8efd-1ccd8f9319e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-lfm2j" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.959830 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/626ea896-2e5c-4478-a7be-34a19acc242d-hostroot\") pod \"multus-vgtc9\" (UID: \"626ea896-2e5c-4478-a7be-34a19acc242d\") " pod="openshift-multus/multus-vgtc9" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.959845 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/bd11045a-d746-4b42-872c-8b8d1dd2d515-mcd-auth-proxy-config\") pod \"machine-config-daemon-97tth\" (UID: \"bd11045a-d746-4b42-872c-8b8d1dd2d515\") " pod="openshift-machine-config-operator/machine-config-daemon-97tth" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 
09:08:42.959862 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zdqpk\" (UniqueName: \"kubernetes.io/projected/bd11045a-d746-4b42-872c-8b8d1dd2d515-kube-api-access-zdqpk\") pod \"machine-config-daemon-97tth\" (UID: \"bd11045a-d746-4b42-872c-8b8d1dd2d515\") " pod="openshift-machine-config-operator/machine-config-daemon-97tth" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.959881 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.959899 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.959915 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/626ea896-2e5c-4478-a7be-34a19acc242d-cnibin\") pod \"multus-vgtc9\" (UID: \"626ea896-2e5c-4478-a7be-34a19acc242d\") " pod="openshift-multus/multus-vgtc9" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.959931 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/626ea896-2e5c-4478-a7be-34a19acc242d-host-run-netns\") pod \"multus-vgtc9\" (UID: \"626ea896-2e5c-4478-a7be-34a19acc242d\") " 
pod="openshift-multus/multus-vgtc9" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.959946 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/626ea896-2e5c-4478-a7be-34a19acc242d-multus-daemon-config\") pod \"multus-vgtc9\" (UID: \"626ea896-2e5c-4478-a7be-34a19acc242d\") " pod="openshift-multus/multus-vgtc9" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.959961 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/2b59ff3b-540d-4385-b02b-f68349bb74bf-os-release\") pod \"multus-additional-cni-plugins-4tprh\" (UID: \"2b59ff3b-540d-4385-b02b-f68349bb74bf\") " pod="openshift-multus/multus-additional-cni-plugins-4tprh" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.959976 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/2b59ff3b-540d-4385-b02b-f68349bb74bf-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-4tprh\" (UID: \"2b59ff3b-540d-4385-b02b-f68349bb74bf\") " pod="openshift-multus/multus-additional-cni-plugins-4tprh" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.959994 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/740550e5-d1a4-4f0c-8efd-1ccd8f9319e5-ovn-node-metrics-cert\") pod \"ovnkube-node-lfm2j\" (UID: \"740550e5-d1a4-4f0c-8efd-1ccd8f9319e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-lfm2j" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.960018 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod 
\"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.960052 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/2b59ff3b-540d-4385-b02b-f68349bb74bf-cni-binary-copy\") pod \"multus-additional-cni-plugins-4tprh\" (UID: \"2b59ff3b-540d-4385-b02b-f68349bb74bf\") " pod="openshift-multus/multus-additional-cni-plugins-4tprh" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.960100 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.960119 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/740550e5-d1a4-4f0c-8efd-1ccd8f9319e5-ovnkube-script-lib\") pod \"ovnkube-node-lfm2j\" (UID: \"740550e5-d1a4-4f0c-8efd-1ccd8f9319e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-lfm2j" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.960137 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/626ea896-2e5c-4478-a7be-34a19acc242d-multus-conf-dir\") pod \"multus-vgtc9\" (UID: \"626ea896-2e5c-4478-a7be-34a19acc242d\") " pod="openshift-multus/multus-vgtc9" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.960158 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: 
\"kubernetes.io/host-path/626ea896-2e5c-4478-a7be-34a19acc242d-multus-socket-dir-parent\") pod \"multus-vgtc9\" (UID: \"626ea896-2e5c-4478-a7be-34a19acc242d\") " pod="openshift-multus/multus-vgtc9" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.960202 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/626ea896-2e5c-4478-a7be-34a19acc242d-etc-kubernetes\") pod \"multus-vgtc9\" (UID: \"626ea896-2e5c-4478-a7be-34a19acc242d\") " pod="openshift-multus/multus-vgtc9" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.960220 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4711cce5-88a9-48c4-8e2e-522062e34a03-metrics-certs\") pod \"network-metrics-daemon-fttpc\" (UID: \"4711cce5-88a9-48c4-8e2e-522062e34a03\") " pod="openshift-multus/network-metrics-daemon-fttpc" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.960240 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.960261 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/bd11045a-d746-4b42-872c-8b8d1dd2d515-rootfs\") pod \"machine-config-daemon-97tth\" (UID: \"bd11045a-d746-4b42-872c-8b8d1dd2d515\") " pod="openshift-machine-config-operator/machine-config-daemon-97tth" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.960289 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"hosts-file\" (UniqueName: \"kubernetes.io/host-path/0cb7634b-66b7-4541-8e53-3e01a6cb41ca-hosts-file\") pod \"node-resolver-k4kdn\" (UID: \"0cb7634b-66b7-4541-8e53-3e01a6cb41ca\") " pod="openshift-dns/node-resolver-k4kdn" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.960307 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/740550e5-d1a4-4f0c-8efd-1ccd8f9319e5-node-log\") pod \"ovnkube-node-lfm2j\" (UID: \"740550e5-d1a4-4f0c-8efd-1ccd8f9319e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-lfm2j" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.960323 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gmxpq\" (UniqueName: \"kubernetes.io/projected/740550e5-d1a4-4f0c-8efd-1ccd8f9319e5-kube-api-access-gmxpq\") pod \"ovnkube-node-lfm2j\" (UID: \"740550e5-d1a4-4f0c-8efd-1ccd8f9319e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-lfm2j" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.960338 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/626ea896-2e5c-4478-a7be-34a19acc242d-host-var-lib-kubelet\") pod \"multus-vgtc9\" (UID: \"626ea896-2e5c-4478-a7be-34a19acc242d\") " pod="openshift-multus/multus-vgtc9" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.960354 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/84b6eb44-ca33-41a6-a951-2c66688ad860-serviceca\") pod \"node-ca-fgk47\" (UID: \"84b6eb44-ca33-41a6-a951-2c66688ad860\") " pod="openshift-image-registry/node-ca-fgk47" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.960369 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" 
(UniqueName: \"kubernetes.io/configmap/ceed39a1-2e4f-4630-bda0-57071ac26ee4-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-ssk9p\" (UID: \"ceed39a1-2e4f-4630-bda0-57071ac26ee4\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ssk9p" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.960384 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ceed39a1-2e4f-4630-bda0-57071ac26ee4-env-overrides\") pod \"ovnkube-control-plane-749d76644c-ssk9p\" (UID: \"ceed39a1-2e4f-4630-bda0-57071ac26ee4\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ssk9p" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.960400 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ceed39a1-2e4f-4630-bda0-57071ac26ee4-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-ssk9p\" (UID: \"ceed39a1-2e4f-4630-bda0-57071ac26ee4\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ssk9p" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.960420 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/626ea896-2e5c-4478-a7be-34a19acc242d-os-release\") pod \"multus-vgtc9\" (UID: \"626ea896-2e5c-4478-a7be-34a19acc242d\") " pod="openshift-multus/multus-vgtc9" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.960437 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7tn4n\" (UniqueName: \"kubernetes.io/projected/ceed39a1-2e4f-4630-bda0-57071ac26ee4-kube-api-access-7tn4n\") pod \"ovnkube-control-plane-749d76644c-ssk9p\" (UID: \"ceed39a1-2e4f-4630-bda0-57071ac26ee4\") " 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ssk9p" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.960452 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/740550e5-d1a4-4f0c-8efd-1ccd8f9319e5-etc-openvswitch\") pod \"ovnkube-node-lfm2j\" (UID: \"740550e5-d1a4-4f0c-8efd-1ccd8f9319e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-lfm2j" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.960467 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2b59ff3b-540d-4385-b02b-f68349bb74bf-system-cni-dir\") pod \"multus-additional-cni-plugins-4tprh\" (UID: \"2b59ff3b-540d-4385-b02b-f68349bb74bf\") " pod="openshift-multus/multus-additional-cni-plugins-4tprh" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.960483 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/626ea896-2e5c-4478-a7be-34a19acc242d-host-var-lib-cni-multus\") pod \"multus-vgtc9\" (UID: \"626ea896-2e5c-4478-a7be-34a19acc242d\") " pod="openshift-multus/multus-vgtc9" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.960504 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/626ea896-2e5c-4478-a7be-34a19acc242d-host-run-multus-certs\") pod \"multus-vgtc9\" (UID: \"626ea896-2e5c-4478-a7be-34a19acc242d\") " pod="openshift-multus/multus-vgtc9" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.960525 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/740550e5-d1a4-4f0c-8efd-1ccd8f9319e5-run-openvswitch\") pod 
\"ovnkube-node-lfm2j\" (UID: \"740550e5-d1a4-4f0c-8efd-1ccd8f9319e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-lfm2j" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.960546 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/740550e5-d1a4-4f0c-8efd-1ccd8f9319e5-host-run-ovn-kubernetes\") pod \"ovnkube-node-lfm2j\" (UID: \"740550e5-d1a4-4f0c-8efd-1ccd8f9319e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-lfm2j" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.960571 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/740550e5-d1a4-4f0c-8efd-1ccd8f9319e5-ovnkube-config\") pod \"ovnkube-node-lfm2j\" (UID: \"740550e5-d1a4-4f0c-8efd-1ccd8f9319e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-lfm2j" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.960589 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/626ea896-2e5c-4478-a7be-34a19acc242d-cni-binary-copy\") pod \"multus-vgtc9\" (UID: \"626ea896-2e5c-4478-a7be-34a19acc242d\") " pod="openshift-multus/multus-vgtc9" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.960609 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.960627 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: 
\"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.960644 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d4bnr\" (UniqueName: \"kubernetes.io/projected/2b59ff3b-540d-4385-b02b-f68349bb74bf-kube-api-access-d4bnr\") pod \"multus-additional-cni-plugins-4tprh\" (UID: \"2b59ff3b-540d-4385-b02b-f68349bb74bf\") " pod="openshift-multus/multus-additional-cni-plugins-4tprh" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.960662 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cjrbm\" (UniqueName: \"kubernetes.io/projected/4711cce5-88a9-48c4-8e2e-522062e34a03-kube-api-access-cjrbm\") pod \"network-metrics-daemon-fttpc\" (UID: \"4711cce5-88a9-48c4-8e2e-522062e34a03\") " pod="openshift-multus/network-metrics-daemon-fttpc" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.960679 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/740550e5-d1a4-4f0c-8efd-1ccd8f9319e5-run-systemd\") pod \"ovnkube-node-lfm2j\" (UID: \"740550e5-d1a4-4f0c-8efd-1ccd8f9319e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-lfm2j" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.960698 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/740550e5-d1a4-4f0c-8efd-1ccd8f9319e5-host-cni-bin\") pod \"ovnkube-node-lfm2j\" (UID: \"740550e5-d1a4-4f0c-8efd-1ccd8f9319e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-lfm2j" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.960715 4792 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/740550e5-d1a4-4f0c-8efd-1ccd8f9319e5-env-overrides\") pod \"ovnkube-node-lfm2j\" (UID: \"740550e5-d1a4-4f0c-8efd-1ccd8f9319e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-lfm2j" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.960731 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/626ea896-2e5c-4478-a7be-34a19acc242d-multus-cni-dir\") pod \"multus-vgtc9\" (UID: \"626ea896-2e5c-4478-a7be-34a19acc242d\") " pod="openshift-multus/multus-vgtc9" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.960748 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.960765 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/740550e5-d1a4-4f0c-8efd-1ccd8f9319e5-host-slash\") pod \"ovnkube-node-lfm2j\" (UID: \"740550e5-d1a4-4f0c-8efd-1ccd8f9319e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-lfm2j" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.960781 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/626ea896-2e5c-4478-a7be-34a19acc242d-host-run-k8s-cni-cncf-io\") pod \"multus-vgtc9\" (UID: \"626ea896-2e5c-4478-a7be-34a19acc242d\") " pod="openshift-multus/multus-vgtc9" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.960797 4792 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4rw87\" (UniqueName: \"kubernetes.io/projected/626ea896-2e5c-4478-a7be-34a19acc242d-kube-api-access-4rw87\") pod \"multus-vgtc9\" (UID: \"626ea896-2e5c-4478-a7be-34a19acc242d\") " pod="openshift-multus/multus-vgtc9" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.960812 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.960830 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.960848 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/740550e5-d1a4-4f0c-8efd-1ccd8f9319e5-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-lfm2j\" (UID: \"740550e5-d1a4-4f0c-8efd-1ccd8f9319e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-lfm2j" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.960866 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vh5jf\" (UniqueName: \"kubernetes.io/projected/0cb7634b-66b7-4541-8e53-3e01a6cb41ca-kube-api-access-vh5jf\") pod \"node-resolver-k4kdn\" (UID: \"0cb7634b-66b7-4541-8e53-3e01a6cb41ca\") " 
pod="openshift-dns/node-resolver-k4kdn" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.960881 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/626ea896-2e5c-4478-a7be-34a19acc242d-system-cni-dir\") pod \"multus-vgtc9\" (UID: \"626ea896-2e5c-4478-a7be-34a19acc242d\") " pod="openshift-multus/multus-vgtc9" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.960921 4792 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.960933 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.960944 4792 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.961404 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.961537 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.965146 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.965872 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.966014 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.966174 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.967278 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.967382 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.967328 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.967490 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.967514 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.967603 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.967696 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.967706 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.967747 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.968004 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.968124 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.968205 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.968388 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:08:42 crc kubenswrapper[4792]: E0309 09:08:42.968397 4792 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 09 09:08:42 crc kubenswrapper[4792]: E0309 09:08:42.968476 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-09 09:08:43.468455848 +0000 UTC m=+88.498656600 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.968840 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.968963 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.969232 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.969405 4792 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.969556 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.970352 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.970546 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:08:42 crc kubenswrapper[4792]: E0309 09:08:42.970662 4792 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 09 09:08:42 crc kubenswrapper[4792]: E0309 09:08:42.970709 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2026-03-09 09:08:43.470693774 +0000 UTC m=+88.500894526 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.970883 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.971355 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.972376 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.973210 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.973586 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.973631 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.973878 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.974011 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.974061 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.974164 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.974181 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.974389 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.974598 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.974748 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.974764 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.974835 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.975732 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.976119 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.976279 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.976414 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.976420 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.977016 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.977117 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.977340 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.977557 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.978013 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.978226 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.979056 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.979609 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.979921 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.980900 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.981229 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.981492 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.981749 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.982011 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.982117 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.982234 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.982274 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.982279 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.982296 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.982307 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.982583 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.982663 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.982834 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.983269 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.983305 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:08:42 crc kubenswrapper[4792]: E0309 09:08:42.986480 4792 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 09 09:08:42 crc kubenswrapper[4792]: E0309 09:08:42.986501 4792 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 09 09:08:42 crc kubenswrapper[4792]: E0309 09:08:42.986511 4792 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 09:08:42 crc kubenswrapper[4792]: E0309 09:08:42.986555 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-09 09:08:43.486542017 +0000 UTC m=+88.516742769 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.989421 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.989703 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.989820 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.990514 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.990686 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.991764 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:08:42 crc kubenswrapper[4792]: I0309 09:08:42.992054 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:42.997152 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:42.997499 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:08:43 crc kubenswrapper[4792]: E0309 09:08:42.997642 4792 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 09 09:08:43 crc kubenswrapper[4792]: E0309 09:08:42.997659 4792 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 09 09:08:43 crc kubenswrapper[4792]: E0309 09:08:42.997670 4792 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 09:08:43 crc kubenswrapper[4792]: E0309 09:08:42.997709 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr 
podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-09 09:08:43.497695494 +0000 UTC m=+88.527896236 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:42.998172 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be 
located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:42.998409 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:42.998440 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:42.998619 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:42.998672 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:42.998835 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:42.999221 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:42.999290 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:42.999494 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:42.999511 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:42.999518 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:42.999531 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:42.999539 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:08:42Z","lastTransitionTime":"2026-03-09T09:08:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:42.999562 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:42.999631 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:42.999651 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:42.999866 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:42.999934 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.000253 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.000482 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.000778 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.001090 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.001309 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.002689 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.002891 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.002964 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 09 09:08:43 crc kubenswrapper[4792]: E0309 09:08:43.003118 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 09:08:43.503022576 +0000 UTC m=+88.533223338 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.003206 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.003663 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.003694 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.004190 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.004264 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.004580 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.004606 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.004818 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.005016 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.005249 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.005677 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.005903 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.006197 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.006475 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.006651 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.006872 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.007029 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.007739 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.008108 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.009756 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.011133 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.012080 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.012122 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.012158 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.012179 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.012453 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.012459 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.012681 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.014150 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.018044 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.018498 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.019312 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.020491 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.020501 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.020542 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.020676 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.021600 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.021615 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.021651 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.021788 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.021887 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.021928 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.021959 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.022039 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.022120 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.023263 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-97tth" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd11045a-d746-4b42-872c-8b8d1dd2d515\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zdqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zdqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-97tth\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.024739 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.025308 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.025514 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.025584 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.025613 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.025514 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.025695 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.025965 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.026083 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.026143 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.025863 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.026155 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.026347 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.026434 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.026522 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.026614 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.026632 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.026752 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.026853 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.026852 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.026884 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.027244 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.027325 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.027423 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.027449 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.027473 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.027653 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.028347 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.028390 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.028490 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.028517 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.028541 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.028552 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.028594 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.028610 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.028710 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.028906 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.029086 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.029442 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.029094 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.030249 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.030525 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.030764 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.031369 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.031398 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.032221 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.033903 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.033934 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.033997 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.036896 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.044256 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fgk47" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84b6eb44-ca33-41a6-a951-2c66688ad860\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cpkhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fgk47\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.052851 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). 
InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.059377 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.059728 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lfm2j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"740550e5-d1a4-4f0c-8efd-1ccd8f9319e5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready 
status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20
99482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"re
startCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn
kube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\"
:\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lfm2j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.064348 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.067244 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.068528 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-k4kdn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cb7634b-66b7-4541-8e53-3e01a6cb41ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5jf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-k4kdn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.069152 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/84b6eb44-ca33-41a6-a951-2c66688ad860-host\") pod \"node-ca-fgk47\" (UID: \"84b6eb44-ca33-41a6-a951-2c66688ad860\") " pod="openshift-image-registry/node-ca-fgk47" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.069218 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/84b6eb44-ca33-41a6-a951-2c66688ad860-host\") pod \"node-ca-fgk47\" (UID: \"84b6eb44-ca33-41a6-a951-2c66688ad860\") " 
pod="openshift-image-registry/node-ca-fgk47" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.069249 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/740550e5-d1a4-4f0c-8efd-1ccd8f9319e5-host-run-netns\") pod \"ovnkube-node-lfm2j\" (UID: \"740550e5-d1a4-4f0c-8efd-1ccd8f9319e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-lfm2j" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.069285 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/740550e5-d1a4-4f0c-8efd-1ccd8f9319e5-host-run-netns\") pod \"ovnkube-node-lfm2j\" (UID: \"740550e5-d1a4-4f0c-8efd-1ccd8f9319e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-lfm2j" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.069309 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/626ea896-2e5c-4478-a7be-34a19acc242d-host-var-lib-cni-bin\") pod \"multus-vgtc9\" (UID: \"626ea896-2e5c-4478-a7be-34a19acc242d\") " pod="openshift-multus/multus-vgtc9" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.069341 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/626ea896-2e5c-4478-a7be-34a19acc242d-host-var-lib-cni-bin\") pod \"multus-vgtc9\" (UID: \"626ea896-2e5c-4478-a7be-34a19acc242d\") " pod="openshift-multus/multus-vgtc9" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.069366 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.069396 
4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.069420 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/2b59ff3b-540d-4385-b02b-f68349bb74bf-cnibin\") pod \"multus-additional-cni-plugins-4tprh\" (UID: \"2b59ff3b-540d-4385-b02b-f68349bb74bf\") " pod="openshift-multus/multus-additional-cni-plugins-4tprh" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.069449 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/2b59ff3b-540d-4385-b02b-f68349bb74bf-cnibin\") pod \"multus-additional-cni-plugins-4tprh\" (UID: \"2b59ff3b-540d-4385-b02b-f68349bb74bf\") " pod="openshift-multus/multus-additional-cni-plugins-4tprh" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.069480 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/740550e5-d1a4-4f0c-8efd-1ccd8f9319e5-run-ovn\") pod \"ovnkube-node-lfm2j\" (UID: \"740550e5-d1a4-4f0c-8efd-1ccd8f9319e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-lfm2j" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.069501 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/626ea896-2e5c-4478-a7be-34a19acc242d-hostroot\") pod \"multus-vgtc9\" (UID: \"626ea896-2e5c-4478-a7be-34a19acc242d\") " pod="openshift-multus/multus-vgtc9" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.069526 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/bd11045a-d746-4b42-872c-8b8d1dd2d515-mcd-auth-proxy-config\") pod \"machine-config-daemon-97tth\" (UID: \"bd11045a-d746-4b42-872c-8b8d1dd2d515\") " pod="openshift-machine-config-operator/machine-config-daemon-97tth" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.069547 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zdqpk\" (UniqueName: \"kubernetes.io/projected/bd11045a-d746-4b42-872c-8b8d1dd2d515-kube-api-access-zdqpk\") pod \"machine-config-daemon-97tth\" (UID: \"bd11045a-d746-4b42-872c-8b8d1dd2d515\") " pod="openshift-machine-config-operator/machine-config-daemon-97tth" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.069579 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/626ea896-2e5c-4478-a7be-34a19acc242d-cnibin\") pod \"multus-vgtc9\" (UID: \"626ea896-2e5c-4478-a7be-34a19acc242d\") " pod="openshift-multus/multus-vgtc9" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.069600 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/626ea896-2e5c-4478-a7be-34a19acc242d-host-run-netns\") pod \"multus-vgtc9\" (UID: \"626ea896-2e5c-4478-a7be-34a19acc242d\") " pod="openshift-multus/multus-vgtc9" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.069706 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/626ea896-2e5c-4478-a7be-34a19acc242d-multus-daemon-config\") pod \"multus-vgtc9\" (UID: \"626ea896-2e5c-4478-a7be-34a19acc242d\") " pod="openshift-multus/multus-vgtc9" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.069805 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: 
\"kubernetes.io/host-path/2b59ff3b-540d-4385-b02b-f68349bb74bf-os-release\") pod \"multus-additional-cni-plugins-4tprh\" (UID: \"2b59ff3b-540d-4385-b02b-f68349bb74bf\") " pod="openshift-multus/multus-additional-cni-plugins-4tprh" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.069836 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/2b59ff3b-540d-4385-b02b-f68349bb74bf-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-4tprh\" (UID: \"2b59ff3b-540d-4385-b02b-f68349bb74bf\") " pod="openshift-multus/multus-additional-cni-plugins-4tprh" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.069860 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/740550e5-d1a4-4f0c-8efd-1ccd8f9319e5-ovn-node-metrics-cert\") pod \"ovnkube-node-lfm2j\" (UID: \"740550e5-d1a4-4f0c-8efd-1ccd8f9319e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-lfm2j" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.069895 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/2b59ff3b-540d-4385-b02b-f68349bb74bf-cni-binary-copy\") pod \"multus-additional-cni-plugins-4tprh\" (UID: \"2b59ff3b-540d-4385-b02b-f68349bb74bf\") " pod="openshift-multus/multus-additional-cni-plugins-4tprh" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.069916 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/740550e5-d1a4-4f0c-8efd-1ccd8f9319e5-ovnkube-script-lib\") pod \"ovnkube-node-lfm2j\" (UID: \"740550e5-d1a4-4f0c-8efd-1ccd8f9319e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-lfm2j" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.069938 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/626ea896-2e5c-4478-a7be-34a19acc242d-multus-conf-dir\") pod \"multus-vgtc9\" (UID: \"626ea896-2e5c-4478-a7be-34a19acc242d\") " pod="openshift-multus/multus-vgtc9" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.069983 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/626ea896-2e5c-4478-a7be-34a19acc242d-multus-socket-dir-parent\") pod \"multus-vgtc9\" (UID: \"626ea896-2e5c-4478-a7be-34a19acc242d\") " pod="openshift-multus/multus-vgtc9" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.070004 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/626ea896-2e5c-4478-a7be-34a19acc242d-etc-kubernetes\") pod \"multus-vgtc9\" (UID: \"626ea896-2e5c-4478-a7be-34a19acc242d\") " pod="openshift-multus/multus-vgtc9" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.070028 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4711cce5-88a9-48c4-8e2e-522062e34a03-metrics-certs\") pod \"network-metrics-daemon-fttpc\" (UID: \"4711cce5-88a9-48c4-8e2e-522062e34a03\") " pod="openshift-multus/network-metrics-daemon-fttpc" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.070050 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/bd11045a-d746-4b42-872c-8b8d1dd2d515-rootfs\") pod \"machine-config-daemon-97tth\" (UID: \"bd11045a-d746-4b42-872c-8b8d1dd2d515\") " pod="openshift-machine-config-operator/machine-config-daemon-97tth" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.070089 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: 
\"kubernetes.io/host-path/0cb7634b-66b7-4541-8e53-3e01a6cb41ca-hosts-file\") pod \"node-resolver-k4kdn\" (UID: \"0cb7634b-66b7-4541-8e53-3e01a6cb41ca\") " pod="openshift-dns/node-resolver-k4kdn" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.070111 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/740550e5-d1a4-4f0c-8efd-1ccd8f9319e5-node-log\") pod \"ovnkube-node-lfm2j\" (UID: \"740550e5-d1a4-4f0c-8efd-1ccd8f9319e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-lfm2j" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.070133 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gmxpq\" (UniqueName: \"kubernetes.io/projected/740550e5-d1a4-4f0c-8efd-1ccd8f9319e5-kube-api-access-gmxpq\") pod \"ovnkube-node-lfm2j\" (UID: \"740550e5-d1a4-4f0c-8efd-1ccd8f9319e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-lfm2j" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.070153 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/626ea896-2e5c-4478-a7be-34a19acc242d-host-var-lib-kubelet\") pod \"multus-vgtc9\" (UID: \"626ea896-2e5c-4478-a7be-34a19acc242d\") " pod="openshift-multus/multus-vgtc9" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.070174 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/84b6eb44-ca33-41a6-a951-2c66688ad860-serviceca\") pod \"node-ca-fgk47\" (UID: \"84b6eb44-ca33-41a6-a951-2c66688ad860\") " pod="openshift-image-registry/node-ca-fgk47" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.070196 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ceed39a1-2e4f-4630-bda0-57071ac26ee4-ovnkube-config\") pod 
\"ovnkube-control-plane-749d76644c-ssk9p\" (UID: \"ceed39a1-2e4f-4630-bda0-57071ac26ee4\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ssk9p" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.070218 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ceed39a1-2e4f-4630-bda0-57071ac26ee4-env-overrides\") pod \"ovnkube-control-plane-749d76644c-ssk9p\" (UID: \"ceed39a1-2e4f-4630-bda0-57071ac26ee4\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ssk9p" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.070240 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ceed39a1-2e4f-4630-bda0-57071ac26ee4-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-ssk9p\" (UID: \"ceed39a1-2e4f-4630-bda0-57071ac26ee4\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ssk9p" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.070263 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/626ea896-2e5c-4478-a7be-34a19acc242d-os-release\") pod \"multus-vgtc9\" (UID: \"626ea896-2e5c-4478-a7be-34a19acc242d\") " pod="openshift-multus/multus-vgtc9" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.070283 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7tn4n\" (UniqueName: \"kubernetes.io/projected/ceed39a1-2e4f-4630-bda0-57071ac26ee4-kube-api-access-7tn4n\") pod \"ovnkube-control-plane-749d76644c-ssk9p\" (UID: \"ceed39a1-2e4f-4630-bda0-57071ac26ee4\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ssk9p" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.070304 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" 
(UniqueName: \"kubernetes.io/host-path/740550e5-d1a4-4f0c-8efd-1ccd8f9319e5-etc-openvswitch\") pod \"ovnkube-node-lfm2j\" (UID: \"740550e5-d1a4-4f0c-8efd-1ccd8f9319e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-lfm2j" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.070325 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2b59ff3b-540d-4385-b02b-f68349bb74bf-system-cni-dir\") pod \"multus-additional-cni-plugins-4tprh\" (UID: \"2b59ff3b-540d-4385-b02b-f68349bb74bf\") " pod="openshift-multus/multus-additional-cni-plugins-4tprh" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.070347 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/626ea896-2e5c-4478-a7be-34a19acc242d-host-var-lib-cni-multus\") pod \"multus-vgtc9\" (UID: \"626ea896-2e5c-4478-a7be-34a19acc242d\") " pod="openshift-multus/multus-vgtc9" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.070367 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/626ea896-2e5c-4478-a7be-34a19acc242d-host-run-multus-certs\") pod \"multus-vgtc9\" (UID: \"626ea896-2e5c-4478-a7be-34a19acc242d\") " pod="openshift-multus/multus-vgtc9" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.070388 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/740550e5-d1a4-4f0c-8efd-1ccd8f9319e5-run-openvswitch\") pod \"ovnkube-node-lfm2j\" (UID: \"740550e5-d1a4-4f0c-8efd-1ccd8f9319e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-lfm2j" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.070408 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/740550e5-d1a4-4f0c-8efd-1ccd8f9319e5-host-run-ovn-kubernetes\") pod \"ovnkube-node-lfm2j\" (UID: \"740550e5-d1a4-4f0c-8efd-1ccd8f9319e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-lfm2j" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.070428 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/740550e5-d1a4-4f0c-8efd-1ccd8f9319e5-ovnkube-config\") pod \"ovnkube-node-lfm2j\" (UID: \"740550e5-d1a4-4f0c-8efd-1ccd8f9319e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-lfm2j" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.070447 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/626ea896-2e5c-4478-a7be-34a19acc242d-cni-binary-copy\") pod \"multus-vgtc9\" (UID: \"626ea896-2e5c-4478-a7be-34a19acc242d\") " pod="openshift-multus/multus-vgtc9" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.070493 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d4bnr\" (UniqueName: \"kubernetes.io/projected/2b59ff3b-540d-4385-b02b-f68349bb74bf-kube-api-access-d4bnr\") pod \"multus-additional-cni-plugins-4tprh\" (UID: \"2b59ff3b-540d-4385-b02b-f68349bb74bf\") " pod="openshift-multus/multus-additional-cni-plugins-4tprh" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.070519 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cjrbm\" (UniqueName: \"kubernetes.io/projected/4711cce5-88a9-48c4-8e2e-522062e34a03-kube-api-access-cjrbm\") pod \"network-metrics-daemon-fttpc\" (UID: \"4711cce5-88a9-48c4-8e2e-522062e34a03\") " pod="openshift-multus/network-metrics-daemon-fttpc" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.070539 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: 
\"kubernetes.io/host-path/740550e5-d1a4-4f0c-8efd-1ccd8f9319e5-run-systemd\") pod \"ovnkube-node-lfm2j\" (UID: \"740550e5-d1a4-4f0c-8efd-1ccd8f9319e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-lfm2j" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.070558 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/740550e5-d1a4-4f0c-8efd-1ccd8f9319e5-host-cni-bin\") pod \"ovnkube-node-lfm2j\" (UID: \"740550e5-d1a4-4f0c-8efd-1ccd8f9319e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-lfm2j" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.070578 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/740550e5-d1a4-4f0c-8efd-1ccd8f9319e5-env-overrides\") pod \"ovnkube-node-lfm2j\" (UID: \"740550e5-d1a4-4f0c-8efd-1ccd8f9319e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-lfm2j" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.070600 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/626ea896-2e5c-4478-a7be-34a19acc242d-multus-cni-dir\") pod \"multus-vgtc9\" (UID: \"626ea896-2e5c-4478-a7be-34a19acc242d\") " pod="openshift-multus/multus-vgtc9" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.070620 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/740550e5-d1a4-4f0c-8efd-1ccd8f9319e5-host-slash\") pod \"ovnkube-node-lfm2j\" (UID: \"740550e5-d1a4-4f0c-8efd-1ccd8f9319e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-lfm2j" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.070641 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/626ea896-2e5c-4478-a7be-34a19acc242d-host-run-k8s-cni-cncf-io\") pod 
\"multus-vgtc9\" (UID: \"626ea896-2e5c-4478-a7be-34a19acc242d\") " pod="openshift-multus/multus-vgtc9" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.070661 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4rw87\" (UniqueName: \"kubernetes.io/projected/626ea896-2e5c-4478-a7be-34a19acc242d-kube-api-access-4rw87\") pod \"multus-vgtc9\" (UID: \"626ea896-2e5c-4478-a7be-34a19acc242d\") " pod="openshift-multus/multus-vgtc9" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.070683 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.070705 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/740550e5-d1a4-4f0c-8efd-1ccd8f9319e5-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-lfm2j\" (UID: \"740550e5-d1a4-4f0c-8efd-1ccd8f9319e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-lfm2j" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.070727 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vh5jf\" (UniqueName: \"kubernetes.io/projected/0cb7634b-66b7-4541-8e53-3e01a6cb41ca-kube-api-access-vh5jf\") pod \"node-resolver-k4kdn\" (UID: \"0cb7634b-66b7-4541-8e53-3e01a6cb41ca\") " pod="openshift-dns/node-resolver-k4kdn" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.070749 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/626ea896-2e5c-4478-a7be-34a19acc242d-system-cni-dir\") pod \"multus-vgtc9\" 
(UID: \"626ea896-2e5c-4478-a7be-34a19acc242d\") " pod="openshift-multus/multus-vgtc9" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.070769 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/740550e5-d1a4-4f0c-8efd-1ccd8f9319e5-host-cni-netd\") pod \"ovnkube-node-lfm2j\" (UID: \"740550e5-d1a4-4f0c-8efd-1ccd8f9319e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-lfm2j" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.070788 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/bd11045a-d746-4b42-872c-8b8d1dd2d515-proxy-tls\") pod \"machine-config-daemon-97tth\" (UID: \"bd11045a-d746-4b42-872c-8b8d1dd2d515\") " pod="openshift-machine-config-operator/machine-config-daemon-97tth" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.070810 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/740550e5-d1a4-4f0c-8efd-1ccd8f9319e5-var-lib-openvswitch\") pod \"ovnkube-node-lfm2j\" (UID: \"740550e5-d1a4-4f0c-8efd-1ccd8f9319e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-lfm2j" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.070836 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/740550e5-d1a4-4f0c-8efd-1ccd8f9319e5-host-kubelet\") pod \"ovnkube-node-lfm2j\" (UID: \"740550e5-d1a4-4f0c-8efd-1ccd8f9319e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-lfm2j" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.070860 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/740550e5-d1a4-4f0c-8efd-1ccd8f9319e5-systemd-units\") pod \"ovnkube-node-lfm2j\" (UID: \"740550e5-d1a4-4f0c-8efd-1ccd8f9319e5\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-lfm2j" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.070886 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/2b59ff3b-540d-4385-b02b-f68349bb74bf-tuning-conf-dir\") pod \"multus-additional-cni-plugins-4tprh\" (UID: \"2b59ff3b-540d-4385-b02b-f68349bb74bf\") " pod="openshift-multus/multus-additional-cni-plugins-4tprh" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.070921 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cpkhp\" (UniqueName: \"kubernetes.io/projected/84b6eb44-ca33-41a6-a951-2c66688ad860-kube-api-access-cpkhp\") pod \"node-ca-fgk47\" (UID: \"84b6eb44-ca33-41a6-a951-2c66688ad860\") " pod="openshift-image-registry/node-ca-fgk47" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.070942 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/740550e5-d1a4-4f0c-8efd-1ccd8f9319e5-log-socket\") pod \"ovnkube-node-lfm2j\" (UID: \"740550e5-d1a4-4f0c-8efd-1ccd8f9319e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-lfm2j" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.071025 4792 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.071041 4792 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.071054 4792 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on 
node \"crc\" DevicePath \"\"" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.071082 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.071096 4792 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.071107 4792 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.071117 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.071129 4792 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.071140 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.071152 4792 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Mar 09 09:08:43 crc 
kubenswrapper[4792]: I0309 09:08:43.071154 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/740550e5-d1a4-4f0c-8efd-1ccd8f9319e5-host-cni-bin\") pod \"ovnkube-node-lfm2j\" (UID: \"740550e5-d1a4-4f0c-8efd-1ccd8f9319e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-lfm2j" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.071165 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.071169 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/740550e5-d1a4-4f0c-8efd-1ccd8f9319e5-run-ovn\") pod \"ovnkube-node-lfm2j\" (UID: \"740550e5-d1a4-4f0c-8efd-1ccd8f9319e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-lfm2j" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.071180 4792 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.071203 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/740550e5-d1a4-4f0c-8efd-1ccd8f9319e5-log-socket\") pod \"ovnkube-node-lfm2j\" (UID: \"740550e5-d1a4-4f0c-8efd-1ccd8f9319e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-lfm2j" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.071211 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.071225 4792 
reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.071240 4792 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.071253 4792 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.071266 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.071277 4792 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.071289 4792 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.071301 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.071312 4792 reconciler_common.go:293] "Volume detached for volume 
\"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.071354 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.071365 4792 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.071376 4792 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.071388 4792 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.071400 4792 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.071410 4792 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.071421 4792 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" 
DevicePath \"\"" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.071433 4792 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.071446 4792 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.071458 4792 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.071469 4792 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.071481 4792 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.071492 4792 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.071502 4792 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.071513 4792 reconciler_common.go:293] "Volume detached for volume 
\"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.071545 4792 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.071559 4792 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.071571 4792 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.071600 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.071614 4792 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.071625 4792 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.071636 4792 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.071647 4792 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.071658 4792 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.071671 4792 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.071682 4792 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.071694 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.071708 4792 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.071720 4792 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: 
\"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.071734 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.071745 4792 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.071757 4792 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.071768 4792 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.071779 4792 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.071791 4792 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.071800 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ceed39a1-2e4f-4630-bda0-57071ac26ee4-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-ssk9p\" 
(UID: \"ceed39a1-2e4f-4630-bda0-57071ac26ee4\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ssk9p" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.071810 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/740550e5-d1a4-4f0c-8efd-1ccd8f9319e5-env-overrides\") pod \"ovnkube-node-lfm2j\" (UID: \"740550e5-d1a4-4f0c-8efd-1ccd8f9319e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-lfm2j" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.071835 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/626ea896-2e5c-4478-a7be-34a19acc242d-hostroot\") pod \"multus-vgtc9\" (UID: \"626ea896-2e5c-4478-a7be-34a19acc242d\") " pod="openshift-multus/multus-vgtc9" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.071863 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/740550e5-d1a4-4f0c-8efd-1ccd8f9319e5-host-slash\") pod \"ovnkube-node-lfm2j\" (UID: \"740550e5-d1a4-4f0c-8efd-1ccd8f9319e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-lfm2j" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.071901 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/626ea896-2e5c-4478-a7be-34a19acc242d-host-run-k8s-cni-cncf-io\") pod \"multus-vgtc9\" (UID: \"626ea896-2e5c-4478-a7be-34a19acc242d\") " pod="openshift-multus/multus-vgtc9" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.071957 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/626ea896-2e5c-4478-a7be-34a19acc242d-multus-cni-dir\") pod \"multus-vgtc9\" (UID: \"626ea896-2e5c-4478-a7be-34a19acc242d\") " pod="openshift-multus/multus-vgtc9" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 
09:08:43.072180 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.072217 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/740550e5-d1a4-4f0c-8efd-1ccd8f9319e5-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-lfm2j\" (UID: \"740550e5-d1a4-4f0c-8efd-1ccd8f9319e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-lfm2j" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.072380 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/626ea896-2e5c-4478-a7be-34a19acc242d-system-cni-dir\") pod \"multus-vgtc9\" (UID: \"626ea896-2e5c-4478-a7be-34a19acc242d\") " pod="openshift-multus/multus-vgtc9" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.072409 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/740550e5-d1a4-4f0c-8efd-1ccd8f9319e5-host-cni-netd\") pod \"ovnkube-node-lfm2j\" (UID: \"740550e5-d1a4-4f0c-8efd-1ccd8f9319e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-lfm2j" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.073261 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/740550e5-d1a4-4f0c-8efd-1ccd8f9319e5-var-lib-openvswitch\") pod \"ovnkube-node-lfm2j\" (UID: \"740550e5-d1a4-4f0c-8efd-1ccd8f9319e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-lfm2j" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.073312 4792 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/740550e5-d1a4-4f0c-8efd-1ccd8f9319e5-host-kubelet\") pod \"ovnkube-node-lfm2j\" (UID: \"740550e5-d1a4-4f0c-8efd-1ccd8f9319e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-lfm2j" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.073337 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/740550e5-d1a4-4f0c-8efd-1ccd8f9319e5-systemd-units\") pod \"ovnkube-node-lfm2j\" (UID: \"740550e5-d1a4-4f0c-8efd-1ccd8f9319e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-lfm2j" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.073447 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ceed39a1-2e4f-4630-bda0-57071ac26ee4-env-overrides\") pod \"ovnkube-control-plane-749d76644c-ssk9p\" (UID: \"ceed39a1-2e4f-4630-bda0-57071ac26ee4\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ssk9p" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.073662 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/626ea896-2e5c-4478-a7be-34a19acc242d-host-run-multus-certs\") pod \"multus-vgtc9\" (UID: \"626ea896-2e5c-4478-a7be-34a19acc242d\") " pod="openshift-multus/multus-vgtc9" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.073731 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/740550e5-d1a4-4f0c-8efd-1ccd8f9319e5-run-openvswitch\") pod \"ovnkube-node-lfm2j\" (UID: \"740550e5-d1a4-4f0c-8efd-1ccd8f9319e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-lfm2j" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.074034 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/740550e5-d1a4-4f0c-8efd-1ccd8f9319e5-host-run-ovn-kubernetes\") pod \"ovnkube-node-lfm2j\" (UID: \"740550e5-d1a4-4f0c-8efd-1ccd8f9319e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-lfm2j" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.076423 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/bd11045a-d746-4b42-872c-8b8d1dd2d515-mcd-auth-proxy-config\") pod \"machine-config-daemon-97tth\" (UID: \"bd11045a-d746-4b42-872c-8b8d1dd2d515\") " pod="openshift-machine-config-operator/machine-config-daemon-97tth" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.076465 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/740550e5-d1a4-4f0c-8efd-1ccd8f9319e5-ovnkube-script-lib\") pod \"ovnkube-node-lfm2j\" (UID: \"740550e5-d1a4-4f0c-8efd-1ccd8f9319e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-lfm2j" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.077092 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.077195 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/84b6eb44-ca33-41a6-a951-2c66688ad860-serviceca\") pod \"node-ca-fgk47\" (UID: \"84b6eb44-ca33-41a6-a951-2c66688ad860\") " pod="openshift-image-registry/node-ca-fgk47" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.077355 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fttpc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4711cce5-88a9-48c4-8e2e-522062e34a03\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjrbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjrbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fttpc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.077574 4792 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/740550e5-d1a4-4f0c-8efd-1ccd8f9319e5-run-systemd\") pod \"ovnkube-node-lfm2j\" (UID: \"740550e5-d1a4-4f0c-8efd-1ccd8f9319e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-lfm2j" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.077625 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/626ea896-2e5c-4478-a7be-34a19acc242d-multus-conf-dir\") pod \"multus-vgtc9\" (UID: \"626ea896-2e5c-4478-a7be-34a19acc242d\") " pod="openshift-multus/multus-vgtc9" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.077681 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/626ea896-2e5c-4478-a7be-34a19acc242d-cnibin\") pod \"multus-vgtc9\" (UID: \"626ea896-2e5c-4478-a7be-34a19acc242d\") " pod="openshift-multus/multus-vgtc9" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.077727 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/626ea896-2e5c-4478-a7be-34a19acc242d-host-run-netns\") pod \"multus-vgtc9\" (UID: \"626ea896-2e5c-4478-a7be-34a19acc242d\") " pod="openshift-multus/multus-vgtc9" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.078020 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/626ea896-2e5c-4478-a7be-34a19acc242d-multus-socket-dir-parent\") pod \"multus-vgtc9\" (UID: \"626ea896-2e5c-4478-a7be-34a19acc242d\") " pod="openshift-multus/multus-vgtc9" Mar 09 09:08:43 crc kubenswrapper[4792]: E0309 09:08:43.078204 4792 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 09 09:08:43 crc kubenswrapper[4792]: E0309 09:08:43.078324 4792 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/4711cce5-88a9-48c4-8e2e-522062e34a03-metrics-certs podName:4711cce5-88a9-48c4-8e2e-522062e34a03 nodeName:}" failed. No retries permitted until 2026-03-09 09:08:43.578305404 +0000 UTC m=+88.608506256 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4711cce5-88a9-48c4-8e2e-522062e34a03-metrics-certs") pod "network-metrics-daemon-fttpc" (UID: "4711cce5-88a9-48c4-8e2e-522062e34a03") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.078423 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/626ea896-2e5c-4478-a7be-34a19acc242d-host-var-lib-cni-multus\") pod \"multus-vgtc9\" (UID: \"626ea896-2e5c-4478-a7be-34a19acc242d\") " pod="openshift-multus/multus-vgtc9" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.078516 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2b59ff3b-540d-4385-b02b-f68349bb74bf-system-cni-dir\") pod \"multus-additional-cni-plugins-4tprh\" (UID: \"2b59ff3b-540d-4385-b02b-f68349bb74bf\") " pod="openshift-multus/multus-additional-cni-plugins-4tprh" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.080046 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/626ea896-2e5c-4478-a7be-34a19acc242d-cni-binary-copy\") pod \"multus-vgtc9\" (UID: \"626ea896-2e5c-4478-a7be-34a19acc242d\") " pod="openshift-multus/multus-vgtc9" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.081874 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/740550e5-d1a4-4f0c-8efd-1ccd8f9319e5-node-log\") pod \"ovnkube-node-lfm2j\" (UID: 
\"740550e5-d1a4-4f0c-8efd-1ccd8f9319e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-lfm2j" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.081997 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/0cb7634b-66b7-4541-8e53-3e01a6cb41ca-hosts-file\") pod \"node-resolver-k4kdn\" (UID: \"0cb7634b-66b7-4541-8e53-3e01a6cb41ca\") " pod="openshift-dns/node-resolver-k4kdn" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.071804 4792 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.088328 4792 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.088339 4792 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.088350 4792 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.088360 4792 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.088443 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: 
\"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.088455 4792 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.088465 4792 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.088474 4792 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.088483 4792 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.088493 4792 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.088503 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.088512 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 09 09:08:43 crc 
kubenswrapper[4792]: I0309 09:08:43.088521 4792 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.088530 4792 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.088539 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.088548 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.088557 4792 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.088566 4792 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.088574 4792 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.088586 4792 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.088596 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.088606 4792 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.088614 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.088623 4792 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.088633 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.088642 4792 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.088650 4792 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" 
(UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.088659 4792 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.088668 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.088679 4792 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.088688 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.088696 4792 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.088704 4792 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.088712 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") 
on node \"crc\" DevicePath \"\"" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.088721 4792 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.088729 4792 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.088739 4792 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.088748 4792 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.088757 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.088765 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.088773 4792 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.088783 
4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.088791 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.088800 4792 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.088810 4792 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.088819 4792 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.088827 4792 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.088835 4792 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.088844 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: 
\"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.088852 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.088860 4792 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.088868 4792 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.088876 4792 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.088885 4792 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.088894 4792 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.088903 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") 
on node \"crc\" DevicePath \"\"" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.088912 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.088920 4792 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.088927 4792 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.088943 4792 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.088951 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.088960 4792 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.088968 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.088978 4792 
reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.088986 4792 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.088994 4792 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.089002 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.089010 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.089018 4792 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.089027 4792 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.089034 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: 
\"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.089042 4792 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.089052 4792 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.089059 4792 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.089086 4792 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.089096 4792 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.089105 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.089120 4792 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on 
node \"crc\" DevicePath \"\"" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.089128 4792 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.089137 4792 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.089145 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.089153 4792 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.089161 4792 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.089169 4792 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.089178 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.089186 4792 
reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.089195 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.089203 4792 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.089210 4792 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.089220 4792 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.089228 4792 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.089236 4792 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.089244 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: 
\"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.089252 4792 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.089260 4792 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.089268 4792 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.089276 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.089284 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.089292 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.089300 4792 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") 
on node \"crc\" DevicePath \"\"" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.089309 4792 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.089317 4792 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.089324 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.089332 4792 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.089339 4792 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.089347 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.089355 4792 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.089363 4792 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" 
(UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.089372 4792 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.089380 4792 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.089387 4792 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.089396 4792 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.089405 4792 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.089413 4792 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.089422 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: 
\"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.089431 4792 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.089439 4792 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.089447 4792 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.089454 4792 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.089462 4792 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.089472 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.089480 4792 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.089488 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.089497 4792 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.089505 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.089515 4792 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.089523 4792 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.089531 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.089542 4792 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: 
\"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.089550 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.089559 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.089567 4792 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.088118 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/626ea896-2e5c-4478-a7be-34a19acc242d-multus-daemon-config\") pod \"multus-vgtc9\" (UID: \"626ea896-2e5c-4478-a7be-34a19acc242d\") " pod="openshift-multus/multus-vgtc9" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.082019 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/626ea896-2e5c-4478-a7be-34a19acc242d-etc-kubernetes\") pod \"multus-vgtc9\" (UID: \"626ea896-2e5c-4478-a7be-34a19acc242d\") " pod="openshift-multus/multus-vgtc9" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.082036 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/626ea896-2e5c-4478-a7be-34a19acc242d-host-var-lib-kubelet\") pod \"multus-vgtc9\" (UID: \"626ea896-2e5c-4478-a7be-34a19acc242d\") " 
pod="openshift-multus/multus-vgtc9" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.090105 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/bd11045a-d746-4b42-872c-8b8d1dd2d515-proxy-tls\") pod \"machine-config-daemon-97tth\" (UID: \"bd11045a-d746-4b42-872c-8b8d1dd2d515\") " pod="openshift-machine-config-operator/machine-config-daemon-97tth" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.090502 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vh5jf\" (UniqueName: \"kubernetes.io/projected/0cb7634b-66b7-4541-8e53-3e01a6cb41ca-kube-api-access-vh5jf\") pod \"node-resolver-k4kdn\" (UID: \"0cb7634b-66b7-4541-8e53-3e01a6cb41ca\") " pod="openshift-dns/node-resolver-k4kdn" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.083505 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/2b59ff3b-540d-4385-b02b-f68349bb74bf-os-release\") pod \"multus-additional-cni-plugins-4tprh\" (UID: \"2b59ff3b-540d-4385-b02b-f68349bb74bf\") " pod="openshift-multus/multus-additional-cni-plugins-4tprh" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.082901 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/740550e5-d1a4-4f0c-8efd-1ccd8f9319e5-ovnkube-config\") pod \"ovnkube-node-lfm2j\" (UID: \"740550e5-d1a4-4f0c-8efd-1ccd8f9319e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-lfm2j" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.082937 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/626ea896-2e5c-4478-a7be-34a19acc242d-os-release\") pod \"multus-vgtc9\" (UID: \"626ea896-2e5c-4478-a7be-34a19acc242d\") " pod="openshift-multus/multus-vgtc9" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.082965 4792 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/740550e5-d1a4-4f0c-8efd-1ccd8f9319e5-etc-openvswitch\") pod \"ovnkube-node-lfm2j\" (UID: \"740550e5-d1a4-4f0c-8efd-1ccd8f9319e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-lfm2j" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.083459 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/2b59ff3b-540d-4385-b02b-f68349bb74bf-cni-binary-copy\") pod \"multus-additional-cni-plugins-4tprh\" (UID: \"2b59ff3b-540d-4385-b02b-f68349bb74bf\") " pod="openshift-multus/multus-additional-cni-plugins-4tprh" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.084508 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/2b59ff3b-540d-4385-b02b-f68349bb74bf-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-4tprh\" (UID: \"2b59ff3b-540d-4385-b02b-f68349bb74bf\") " pod="openshift-multus/multus-additional-cni-plugins-4tprh" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.083582 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/bd11045a-d746-4b42-872c-8b8d1dd2d515-rootfs\") pod \"machine-config-daemon-97tth\" (UID: \"bd11045a-d746-4b42-872c-8b8d1dd2d515\") " pod="openshift-machine-config-operator/machine-config-daemon-97tth" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.084853 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ceed39a1-2e4f-4630-bda0-57071ac26ee4-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-ssk9p\" (UID: \"ceed39a1-2e4f-4630-bda0-57071ac26ee4\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ssk9p" Mar 09 09:08:43 crc 
kubenswrapper[4792]: I0309 09:08:43.087537 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.092391 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cjrbm\" (UniqueName: \"kubernetes.io/projected/4711cce5-88a9-48c4-8e2e-522062e34a03-kube-api-access-cjrbm\") pod \"network-metrics-daemon-fttpc\" (UID: \"4711cce5-88a9-48c4-8e2e-522062e34a03\") " pod="openshift-multus/network-metrics-daemon-fttpc" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.099296 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cpkhp\" (UniqueName: \"kubernetes.io/projected/84b6eb44-ca33-41a6-a951-2c66688ad860-kube-api-access-cpkhp\") pod \"node-ca-fgk47\" (UID: \"84b6eb44-ca33-41a6-a951-2c66688ad860\") " pod="openshift-image-registry/node-ca-fgk47" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.099759 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4rw87\" (UniqueName: \"kubernetes.io/projected/626ea896-2e5c-4478-a7be-34a19acc242d-kube-api-access-4rw87\") pod \"multus-vgtc9\" (UID: \"626ea896-2e5c-4478-a7be-34a19acc242d\") " pod="openshift-multus/multus-vgtc9" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.100649 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/740550e5-d1a4-4f0c-8efd-1ccd8f9319e5-ovn-node-metrics-cert\") pod \"ovnkube-node-lfm2j\" (UID: \"740550e5-d1a4-4f0c-8efd-1ccd8f9319e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-lfm2j" Mar 09 09:08:43 crc 
kubenswrapper[4792]: I0309 09:08:43.101209 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7tn4n\" (UniqueName: \"kubernetes.io/projected/ceed39a1-2e4f-4630-bda0-57071ac26ee4-kube-api-access-7tn4n\") pod \"ovnkube-control-plane-749d76644c-ssk9p\" (UID: \"ceed39a1-2e4f-4630-bda0-57071ac26ee4\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ssk9p" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.102573 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.102600 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.102609 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.102612 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gmxpq\" (UniqueName: \"kubernetes.io/projected/740550e5-d1a4-4f0c-8efd-1ccd8f9319e5-kube-api-access-gmxpq\") pod \"ovnkube-node-lfm2j\" (UID: \"740550e5-d1a4-4f0c-8efd-1ccd8f9319e5\") " pod="openshift-ovn-kubernetes/ovnkube-node-lfm2j" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.102622 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.102692 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:08:43Z","lastTransitionTime":"2026-03-09T09:08:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.102794 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d4bnr\" (UniqueName: \"kubernetes.io/projected/2b59ff3b-540d-4385-b02b-f68349bb74bf-kube-api-access-d4bnr\") pod \"multus-additional-cni-plugins-4tprh\" (UID: \"2b59ff3b-540d-4385-b02b-f68349bb74bf\") " pod="openshift-multus/multus-additional-cni-plugins-4tprh" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.106154 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zdqpk\" (UniqueName: \"kubernetes.io/projected/bd11045a-d746-4b42-872c-8b8d1dd2d515-kube-api-access-zdqpk\") pod \"machine-config-daemon-97tth\" (UID: \"bd11045a-d746-4b42-872c-8b8d1dd2d515\") " pod="openshift-machine-config-operator/machine-config-daemon-97tth" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.118871 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/2b59ff3b-540d-4385-b02b-f68349bb74bf-tuning-conf-dir\") pod \"multus-additional-cni-plugins-4tprh\" (UID: \"2b59ff3b-540d-4385-b02b-f68349bb74bf\") " pod="openshift-multus/multus-additional-cni-plugins-4tprh" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.177193 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.189968 4792 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.189996 4792 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 09:08:43 crc kubenswrapper[4792]: W0309 09:08:43.191413 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37a5e44f_9a88_4405_be8a_b645485e7312.slice/crio-a951adacb22df31757bc9ef665b8ebcb7725ab32ffefe74e475a61c882c659f4 WatchSource:0}: Error finding container a951adacb22df31757bc9ef665b8ebcb7725ab32ffefe74e475a61c882c659f4: Status 404 returned error can't find the container with id a951adacb22df31757bc9ef665b8ebcb7725ab32ffefe74e475a61c882c659f4 Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.194277 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.213289 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-97tth" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.213360 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-k4kdn" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.215053 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.215115 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.215131 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.215151 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.215166 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:08:43Z","lastTransitionTime":"2026-03-09T09:08:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.221257 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-fgk47" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.228416 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.235651 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-vgtc9" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.242581 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-lfm2j" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.249652 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-4tprh" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.265526 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ssk9p" Mar 09 09:08:43 crc kubenswrapper[4792]: W0309 09:08:43.280727 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-7a55e1ba6e76592e85cf197b035e13a12ab38e960ea0fa99eafa961194e96ccc WatchSource:0}: Error finding container 7a55e1ba6e76592e85cf197b035e13a12ab38e960ea0fa99eafa961194e96ccc: Status 404 returned error can't find the container with id 7a55e1ba6e76592e85cf197b035e13a12ab38e960ea0fa99eafa961194e96ccc Mar 09 09:08:43 crc kubenswrapper[4792]: W0309 09:08:43.282623 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0cb7634b_66b7_4541_8e53_3e01a6cb41ca.slice/crio-bc4e32cf00eec63ccde19fb6eb61c111bbb1e4c4fd1d0d04972c13f7a65f043f WatchSource:0}: Error finding container bc4e32cf00eec63ccde19fb6eb61c111bbb1e4c4fd1d0d04972c13f7a65f043f: Status 404 returned error can't find the container with id bc4e32cf00eec63ccde19fb6eb61c111bbb1e4c4fd1d0d04972c13f7a65f043f Mar 09 09:08:43 crc kubenswrapper[4792]: W0309 09:08:43.293680 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod740550e5_d1a4_4f0c_8efd_1ccd8f9319e5.slice/crio-a5c2dc4c4f3013e1c887f80aebacfbc5517d8d00bc6fda475aa52ec17b3d5b7b WatchSource:0}: Error finding container a5c2dc4c4f3013e1c887f80aebacfbc5517d8d00bc6fda475aa52ec17b3d5b7b: Status 404 returned 
error can't find the container with id a5c2dc4c4f3013e1c887f80aebacfbc5517d8d00bc6fda475aa52ec17b3d5b7b Mar 09 09:08:43 crc kubenswrapper[4792]: W0309 09:08:43.299040 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod84b6eb44_ca33_41a6_a951_2c66688ad860.slice/crio-afc3ab9d95dcc7e6057a3d6c394696f2a00d2302fcdecfaec6cd742194e0a219 WatchSource:0}: Error finding container afc3ab9d95dcc7e6057a3d6c394696f2a00d2302fcdecfaec6cd742194e0a219: Status 404 returned error can't find the container with id afc3ab9d95dcc7e6057a3d6c394696f2a00d2302fcdecfaec6cd742194e0a219 Mar 09 09:08:43 crc kubenswrapper[4792]: W0309 09:08:43.300495 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbd11045a_d746_4b42_872c_8b8d1dd2d515.slice/crio-db1913132ac753c8bb1a12feb3f2e4d2a9e8bcda5478a54a48c8e5e16a5c3fa1 WatchSource:0}: Error finding container db1913132ac753c8bb1a12feb3f2e4d2a9e8bcda5478a54a48c8e5e16a5c3fa1: Status 404 returned error can't find the container with id db1913132ac753c8bb1a12feb3f2e4d2a9e8bcda5478a54a48c8e5e16a5c3fa1 Mar 09 09:08:43 crc kubenswrapper[4792]: W0309 09:08:43.310580 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod626ea896_2e5c_4478_a7be_34a19acc242d.slice/crio-70da56c10bf37bb1abe6cd558ce95db3fceec48835a6c97082d76995a32760bb WatchSource:0}: Error finding container 70da56c10bf37bb1abe6cd558ce95db3fceec48835a6c97082d76995a32760bb: Status 404 returned error can't find the container with id 70da56c10bf37bb1abe6cd558ce95db3fceec48835a6c97082d76995a32760bb Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.328915 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.328949 4792 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.328985 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.329004 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.329021 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:08:43Z","lastTransitionTime":"2026-03-09T09:08:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.431147 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.431190 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.431201 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.431218 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.431229 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:08:43Z","lastTransitionTime":"2026-03-09T09:08:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.493722 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.493812 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.493857 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 09:08:43 crc kubenswrapper[4792]: E0309 09:08:43.493861 4792 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 09 09:08:43 crc kubenswrapper[4792]: E0309 09:08:43.493917 4792 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 09 09:08:43 crc kubenswrapper[4792]: E0309 09:08:43.493934 4792 projected.go:288] Couldn't 
get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 09 09:08:43 crc kubenswrapper[4792]: E0309 09:08:43.493945 4792 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 09:08:43 crc kubenswrapper[4792]: E0309 09:08:43.493972 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-09 09:08:44.493956232 +0000 UTC m=+89.524156984 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 09 09:08:43 crc kubenswrapper[4792]: E0309 09:08:43.493987 4792 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 09 09:08:43 crc kubenswrapper[4792]: E0309 09:08:43.493988 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-09 09:08:44.493982773 +0000 UTC m=+89.524183525 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 09:08:43 crc kubenswrapper[4792]: E0309 09:08:43.494017 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-09 09:08:44.494007783 +0000 UTC m=+89.524208535 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.533779 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.533826 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.533837 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.533851 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.533860 4792 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:08:43Z","lastTransitionTime":"2026-03-09T09:08:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.594360 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.594580 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4711cce5-88a9-48c4-8e2e-522062e34a03-metrics-certs\") pod \"network-metrics-daemon-fttpc\" (UID: \"4711cce5-88a9-48c4-8e2e-522062e34a03\") " pod="openshift-multus/network-metrics-daemon-fttpc" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.594640 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 09:08:43 crc kubenswrapper[4792]: E0309 09:08:43.594774 4792 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 09 09:08:43 crc kubenswrapper[4792]: E0309 09:08:43.594798 4792 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 09 09:08:43 crc kubenswrapper[4792]: E0309 09:08:43.594811 4792 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 09:08:43 crc kubenswrapper[4792]: E0309 09:08:43.594857 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-09 09:08:44.594842046 +0000 UTC m=+89.625042798 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 09:08:43 crc kubenswrapper[4792]: E0309 09:08:43.595225 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 09:08:44.595215326 +0000 UTC m=+89.625416078 (durationBeforeRetry 1s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:08:43 crc kubenswrapper[4792]: E0309 09:08:43.595283 4792 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 09 09:08:43 crc kubenswrapper[4792]: E0309 09:08:43.595308 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4711cce5-88a9-48c4-8e2e-522062e34a03-metrics-certs podName:4711cce5-88a9-48c4-8e2e-522062e34a03 nodeName:}" failed. No retries permitted until 2026-03-09 09:08:44.595300258 +0000 UTC m=+89.625501010 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4711cce5-88a9-48c4-8e2e-522062e34a03-metrics-certs") pod "network-metrics-daemon-fttpc" (UID: "4711cce5-88a9-48c4-8e2e-522062e34a03") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.636314 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.636361 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.636372 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.636389 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.636406 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:08:43Z","lastTransitionTime":"2026-03-09T09:08:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.667505 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.668746 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.670969 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.673375 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.675472 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.676312 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.677797 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.678914 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" 
path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.680590 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.681211 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.681794 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.682969 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.683659 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.684931 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.685613 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.686777 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" 
path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.687562 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.688036 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.689490 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.690440 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.690985 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.692296 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.693610 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.695336 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" 
path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.696366 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.698057 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.699405 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.700700 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.702647 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.703668 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.704384 4792 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.704506 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.707332 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.708292 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.709027 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.712026 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.713367 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.714426 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.715144 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.716516 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.717268 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.718404 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.720402 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.721788 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.722586 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.723852 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.725541 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.727645 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.728413 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.729426 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.729967 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.730625 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.731860 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.733624 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.738947 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.738984 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:08:43 crc 
kubenswrapper[4792]: I0309 09:08:43.738994 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.739012 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.739024 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:08:43Z","lastTransitionTime":"2026-03-09T09:08:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.841436 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.841471 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.841480 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.841494 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.841504 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:08:43Z","lastTransitionTime":"2026-03-09T09:08:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.943853 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.943899 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.943911 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.943931 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:08:43 crc kubenswrapper[4792]: I0309 09:08:43.943942 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:08:43Z","lastTransitionTime":"2026-03-09T09:08:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:08:44 crc kubenswrapper[4792]: I0309 09:08:44.017051 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-97tth" event={"ID":"bd11045a-d746-4b42-872c-8b8d1dd2d515","Type":"ContainerStarted","Data":"96248640b95891716f26355ace06d60da675ab3aa8086e6a7c94ad528fc1357d"} Mar 09 09:08:44 crc kubenswrapper[4792]: I0309 09:08:44.017118 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-97tth" event={"ID":"bd11045a-d746-4b42-872c-8b8d1dd2d515","Type":"ContainerStarted","Data":"d060627a577507a2b0030b6aea753d50e0c6766ac4876d95ac5d9d3401f9b818"} Mar 09 09:08:44 crc kubenswrapper[4792]: I0309 09:08:44.017130 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-97tth" event={"ID":"bd11045a-d746-4b42-872c-8b8d1dd2d515","Type":"ContainerStarted","Data":"db1913132ac753c8bb1a12feb3f2e4d2a9e8bcda5478a54a48c8e5e16a5c3fa1"} Mar 09 09:08:44 crc kubenswrapper[4792]: I0309 09:08:44.021480 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"da622175994cf951cfa730455ea5a163271bc993fc5a1268ed072b944a612524"} Mar 09 09:08:44 crc kubenswrapper[4792]: I0309 09:08:44.021522 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"a951adacb22df31757bc9ef665b8ebcb7725ab32ffefe74e475a61c882c659f4"} Mar 09 09:08:44 crc kubenswrapper[4792]: I0309 09:08:44.022947 4792 generic.go:334] "Generic (PLEG): container finished" podID="740550e5-d1a4-4f0c-8efd-1ccd8f9319e5" containerID="9abf97193d88754a49bcc12e1cfbe83371778e157d028dd3d1084abfa8526822" exitCode=0 Mar 09 09:08:44 
crc kubenswrapper[4792]: I0309 09:08:44.022999 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lfm2j" event={"ID":"740550e5-d1a4-4f0c-8efd-1ccd8f9319e5","Type":"ContainerDied","Data":"9abf97193d88754a49bcc12e1cfbe83371778e157d028dd3d1084abfa8526822"} Mar 09 09:08:44 crc kubenswrapper[4792]: I0309 09:08:44.023018 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lfm2j" event={"ID":"740550e5-d1a4-4f0c-8efd-1ccd8f9319e5","Type":"ContainerStarted","Data":"a5c2dc4c4f3013e1c887f80aebacfbc5517d8d00bc6fda475aa52ec17b3d5b7b"} Mar 09 09:08:44 crc kubenswrapper[4792]: I0309 09:08:44.025786 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ssk9p" event={"ID":"ceed39a1-2e4f-4630-bda0-57071ac26ee4","Type":"ContainerStarted","Data":"6a6364e9d51e6626caf8b49f387f23eb4bfdfc13e149fb8847b7a1dd1637bf79"} Mar 09 09:08:44 crc kubenswrapper[4792]: I0309 09:08:44.025812 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ssk9p" event={"ID":"ceed39a1-2e4f-4630-bda0-57071ac26ee4","Type":"ContainerStarted","Data":"648c51e35617b16c1bc2e0f86ea31ee9256943628595864fc27951fce17197cb"} Mar 09 09:08:44 crc kubenswrapper[4792]: I0309 09:08:44.025823 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ssk9p" event={"ID":"ceed39a1-2e4f-4630-bda0-57071ac26ee4","Type":"ContainerStarted","Data":"8ad59a0615df08a23124fc2005975e4e5df292dad921eaa02dcb7b318bf027ad"} Mar 09 09:08:44 crc kubenswrapper[4792]: I0309 09:08:44.027089 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-k4kdn" event={"ID":"0cb7634b-66b7-4541-8e53-3e01a6cb41ca","Type":"ContainerStarted","Data":"d395567d530702b468a31ec780cf0dcf356e2a07f4a28ee50329b702d8e53594"} Mar 09 09:08:44 crc kubenswrapper[4792]: I0309 
09:08:44.027117 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-k4kdn" event={"ID":"0cb7634b-66b7-4541-8e53-3e01a6cb41ca","Type":"ContainerStarted","Data":"bc4e32cf00eec63ccde19fb6eb61c111bbb1e4c4fd1d0d04972c13f7a65f043f"} Mar 09 09:08:44 crc kubenswrapper[4792]: I0309 09:08:44.028781 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-vgtc9" event={"ID":"626ea896-2e5c-4478-a7be-34a19acc242d","Type":"ContainerStarted","Data":"57f7829120f56a8ab9ff342c3d9fd043ca5559518f8d818c0306764f491f4b3a"} Mar 09 09:08:44 crc kubenswrapper[4792]: I0309 09:08:44.028807 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-vgtc9" event={"ID":"626ea896-2e5c-4478-a7be-34a19acc242d","Type":"ContainerStarted","Data":"70da56c10bf37bb1abe6cd558ce95db3fceec48835a6c97082d76995a32760bb"} Mar 09 09:08:44 crc kubenswrapper[4792]: I0309 09:08:44.029999 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"c354d7e66b46e912fcfe35052db0a24dd7d505bcbe5a893c3ddfdac46c72d729"} Mar 09 09:08:44 crc kubenswrapper[4792]: I0309 09:08:44.031486 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"4403f41d10eba801e3729ecffdc397e603f3c4899d1a62e02cb35418645d1ca4"} Mar 09 09:08:44 crc kubenswrapper[4792]: I0309 09:08:44.031560 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"655b419f4453a7f0fff411f579326c7fda157c08267b1dcf23f7e1d11b684c20"} Mar 09 09:08:44 crc kubenswrapper[4792]: I0309 09:08:44.031577 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"7a55e1ba6e76592e85cf197b035e13a12ab38e960ea0fa99eafa961194e96ccc"} Mar 09 09:08:44 crc kubenswrapper[4792]: I0309 09:08:44.033458 4792 generic.go:334] "Generic (PLEG): container finished" podID="2b59ff3b-540d-4385-b02b-f68349bb74bf" containerID="54059591e4b093ca2c60bc4ea9f0b0a6c44077e07ebaa66bab76d10f0d2f0f9f" exitCode=0 Mar 09 09:08:44 crc kubenswrapper[4792]: I0309 09:08:44.033517 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4tprh" event={"ID":"2b59ff3b-540d-4385-b02b-f68349bb74bf","Type":"ContainerDied","Data":"54059591e4b093ca2c60bc4ea9f0b0a6c44077e07ebaa66bab76d10f0d2f0f9f"} Mar 09 09:08:44 crc kubenswrapper[4792]: I0309 09:08:44.033537 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4tprh" event={"ID":"2b59ff3b-540d-4385-b02b-f68349bb74bf","Type":"ContainerStarted","Data":"c8aab6f969b961bec47ff6161c073254e3ee35df70d4ba6f225dba35cadb74b5"} Mar 09 09:08:44 crc kubenswrapper[4792]: I0309 09:08:44.035569 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fgk47" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"84b6eb44-ca33-41a6-a951-2c66688ad860\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cpkhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fgk47\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 09:08:44 crc kubenswrapper[4792]: I0309 09:08:44.037199 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-fgk47" event={"ID":"84b6eb44-ca33-41a6-a951-2c66688ad860","Type":"ContainerStarted","Data":"c3b4aea0ac4cc05af599b411f32e712d0955e5052afc55c0d0be66e9a1249223"} Mar 09 09:08:44 crc kubenswrapper[4792]: I0309 09:08:44.037235 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-fgk47" 
event={"ID":"84b6eb44-ca33-41a6-a951-2c66688ad860","Type":"ContainerStarted","Data":"afc3ab9d95dcc7e6057a3d6c394696f2a00d2302fcdecfaec6cd742194e0a219"} Mar 09 09:08:44 crc kubenswrapper[4792]: I0309 09:08:44.050480 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:08:44 crc kubenswrapper[4792]: I0309 09:08:44.050525 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:08:44 crc kubenswrapper[4792]: I0309 09:08:44.050538 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:08:44 crc kubenswrapper[4792]: I0309 09:08:44.050555 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:08:44 crc kubenswrapper[4792]: I0309 09:08:44.050567 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:08:44Z","lastTransitionTime":"2026-03-09T09:08:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:08:44 crc kubenswrapper[4792]: I0309 09:08:44.054174 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lfm2j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"740550e5-d1a4-4f0c-8efd-1ccd8f9319e5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lfm2j\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 09:08:44 crc kubenswrapper[4792]: I0309 09:08:44.065215 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-k4kdn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cb7634b-66b7-4541-8e53-3e01a6cb41ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5jf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-k4kdn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 09:08:44 crc kubenswrapper[4792]: I0309 09:08:44.072898 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fttpc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4711cce5-88a9-48c4-8e2e-522062e34a03\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjrbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjrbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fttpc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 09:08:44 crc kubenswrapper[4792]: I0309 09:08:44.080631 4792 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 09:08:44 crc kubenswrapper[4792]: I0309 09:08:44.087835 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 09:08:44 crc kubenswrapper[4792]: I0309 09:08:44.096563 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vgtc9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"626ea896-2e5c-4478-a7be-34a19acc242d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4rw87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vgtc9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 09:08:44 crc kubenswrapper[4792]: I0309 09:08:44.110350 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4tprh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b59ff3b-540d-4385-b02b-f68349bb74bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{
\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4tprh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 09:08:44 crc kubenswrapper[4792]: I0309 09:08:44.129701 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ssk9p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ceed39a1-2e4f-4630-bda0-57071ac26ee4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7tn4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7tn4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168
.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ssk9p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 09:08:44 crc kubenswrapper[4792]: I0309 09:08:44.138004 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Mar 09 09:08:44 crc kubenswrapper[4792]: I0309 09:08:44.146816 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 09:08:44 crc kubenswrapper[4792]: I0309 09:08:44.153177 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:08:44 crc kubenswrapper[4792]: I0309 09:08:44.153233 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:08:44 crc kubenswrapper[4792]: I0309 09:08:44.153243 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:08:44 crc kubenswrapper[4792]: I0309 09:08:44.153258 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:08:44 crc kubenswrapper[4792]: I0309 09:08:44.153268 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:08:44Z","lastTransitionTime":"2026-03-09T09:08:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:08:44 crc kubenswrapper[4792]: I0309 09:08:44.161029 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:44Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:44 crc kubenswrapper[4792]: I0309 09:08:44.174277 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-97tth" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd11045a-d746-4b42-872c-8b8d1dd2d515\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96248640b95891716f26355ace06d60da675ab3aa8086e6a7c94ad528fc1357d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zdqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d060627a577507a2b0030b6aea753d50e0c6766a
c4876d95ac5d9d3401f9b818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zdqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-97tth\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:44Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:44 crc kubenswrapper[4792]: I0309 09:08:44.187839 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:44Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:44 crc kubenswrapper[4792]: I0309 09:08:44.210440 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4tprh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b59ff3b-540d-4385-b02b-f68349bb74bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin 
routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54059591e4b093ca2c60bc4ea9f0b0a6c44077e07ebaa66bab76d10f0d2f0f9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"st
ate\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54059591e4b093ca2c60bc4ea9f0b0a6c44077e07ebaa66bab76d10f0d2f0f9f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:08:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\
\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-bin
ary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4tprh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:44Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:44 crc kubenswrapper[4792]: I0309 09:08:44.223639 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ssk9p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ceed39a1-2e4f-4630-bda0-57071ac26ee4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://648c51e35617b16c1bc2e0f86ea31ee9256943628595864fc27951fce17197cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7tn4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a6364e9d51e6626caf8b49f387f23eb4bfdf
c13e149fb8847b7a1dd1637bf79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7tn4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ssk9p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:44Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:44 crc kubenswrapper[4792]: I0309 09:08:44.238661 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da622175994cf951cfa730455ea5a163271bc993fc5a1268ed072b944a612524\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:44Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:44 crc kubenswrapper[4792]: I0309 09:08:44.252178 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:44Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:44 crc kubenswrapper[4792]: I0309 09:08:44.255142 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:08:44 crc kubenswrapper[4792]: I0309 09:08:44.255176 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:08:44 crc kubenswrapper[4792]: I0309 09:08:44.255184 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:08:44 crc kubenswrapper[4792]: I0309 09:08:44.255199 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:08:44 crc kubenswrapper[4792]: I0309 09:08:44.255208 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:08:44Z","lastTransitionTime":"2026-03-09T09:08:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 09:08:44 crc kubenswrapper[4792]: I0309 09:08:44.271175 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vgtc9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"626ea896-2e5c-4478-a7be-34a19acc242d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57f7829120f56a8ab9ff342c3d9fd043ca5559518f8d818c0306764f491f4b3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/et
c/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4rw87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vgtc9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:44Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:44 crc kubenswrapper[4792]: I0309 09:08:44.285008 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:44Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:44 crc kubenswrapper[4792]: I0309 09:08:44.307199 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4403f41d10eba801e3729ecffdc397e603f3c4899d1a62e02cb35418645d1ca4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://655b419f4453a7f0fff411f579326c7fda157c08267b1dcf23f7e1d11b684c20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:44Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:44 crc kubenswrapper[4792]: I0309 09:08:44.319975 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:44Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:44 crc kubenswrapper[4792]: I0309 09:08:44.333436 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:44Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:44 crc kubenswrapper[4792]: I0309 09:08:44.343612 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-97tth" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd11045a-d746-4b42-872c-8b8d1dd2d515\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96248640b95891716f26355ace06d60da675ab3aa8086e6a7c94ad528fc1357d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zdqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d060627a577507a2b0030b6aea753d50e0c6766a
c4876d95ac5d9d3401f9b818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zdqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-97tth\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:44Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:44 crc kubenswrapper[4792]: I0309 09:08:44.358348 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:08:44 crc kubenswrapper[4792]: I0309 09:08:44.358372 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:08:44 crc kubenswrapper[4792]: I0309 09:08:44.358380 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:08:44 crc 
kubenswrapper[4792]: I0309 09:08:44.358395 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:08:44 crc kubenswrapper[4792]: I0309 09:08:44.358404 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:08:44Z","lastTransitionTime":"2026-03-09T09:08:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 09:08:44 crc kubenswrapper[4792]: I0309 09:08:44.362442 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lfm2j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"740550e5-d1a4-4f0c-8efd-1ccd8f9319e5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919
d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCou
nt\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-con
troller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/va
r/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9abf97193d88754a49bcc12e1cfbe83371778e157d028dd3d1084abfa8526822\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9abf97193d88754a49bcc12e1cfbe83371778e157d028dd3d1084abfa8526822\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:08:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"vol
umeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lfm2j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:44Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:44 crc kubenswrapper[4792]: I0309 09:08:44.372869 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fgk47" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84b6eb44-ca33-41a6-a951-2c66688ad860\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3b4aea0ac4
cc05af599b411f32e712d0955e5052afc55c0d0be66e9a1249223\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cpkhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fgk47\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:44Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:44 crc kubenswrapper[4792]: I0309 09:08:44.380423 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-k4kdn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cb7634b-66b7-4541-8e53-3e01a6cb41ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d395567d530702b468a31ec780cf0dcf356e2a07f4a28ee50329b702d8e53594\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5jf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-k4kdn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:44Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:44 crc kubenswrapper[4792]: I0309 09:08:44.391556 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fttpc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4711cce5-88a9-48c4-8e2e-522062e34a03\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjrbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjrbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fttpc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:44Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:44 crc 
kubenswrapper[4792]: I0309 09:08:44.461113 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:08:44 crc kubenswrapper[4792]: I0309 09:08:44.461173 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:08:44 crc kubenswrapper[4792]: I0309 09:08:44.461185 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:08:44 crc kubenswrapper[4792]: I0309 09:08:44.461205 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:08:44 crc kubenswrapper[4792]: I0309 09:08:44.461218 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:08:44Z","lastTransitionTime":"2026-03-09T09:08:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:08:44 crc kubenswrapper[4792]: I0309 09:08:44.503623 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 09:08:44 crc kubenswrapper[4792]: I0309 09:08:44.503999 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 09:08:44 crc kubenswrapper[4792]: I0309 09:08:44.504023 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 09:08:44 crc kubenswrapper[4792]: E0309 09:08:44.503819 4792 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 09 09:08:44 crc kubenswrapper[4792]: E0309 09:08:44.504162 4792 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 09 09:08:44 crc kubenswrapper[4792]: E0309 09:08:44.504171 4792 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-09 09:08:46.504149119 +0000 UTC m=+91.534349881 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 09 09:08:44 crc kubenswrapper[4792]: E0309 09:08:44.504215 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-09 09:08:46.50420208 +0000 UTC m=+91.534402832 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 09 09:08:44 crc kubenswrapper[4792]: E0309 09:08:44.504366 4792 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 09 09:08:44 crc kubenswrapper[4792]: E0309 09:08:44.504429 4792 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 09 09:08:44 crc kubenswrapper[4792]: E0309 09:08:44.504458 4792 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 09:08:44 crc kubenswrapper[4792]: E0309 09:08:44.504543 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-09 09:08:46.504518648 +0000 UTC m=+91.534719390 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 09:08:44 crc kubenswrapper[4792]: I0309 09:08:44.563371 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:08:44 crc kubenswrapper[4792]: I0309 09:08:44.563405 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:08:44 crc kubenswrapper[4792]: I0309 09:08:44.563418 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:08:44 crc kubenswrapper[4792]: I0309 09:08:44.563436 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:08:44 crc kubenswrapper[4792]: I0309 09:08:44.563448 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:08:44Z","lastTransitionTime":"2026-03-09T09:08:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:08:44 crc kubenswrapper[4792]: I0309 09:08:44.604897 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 09:08:44 crc kubenswrapper[4792]: I0309 09:08:44.605010 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 09:08:44 crc kubenswrapper[4792]: I0309 09:08:44.605060 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4711cce5-88a9-48c4-8e2e-522062e34a03-metrics-certs\") pod \"network-metrics-daemon-fttpc\" (UID: \"4711cce5-88a9-48c4-8e2e-522062e34a03\") " pod="openshift-multus/network-metrics-daemon-fttpc" Mar 09 09:08:44 crc kubenswrapper[4792]: E0309 09:08:44.605246 4792 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 09 09:08:44 crc kubenswrapper[4792]: E0309 09:08:44.605301 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4711cce5-88a9-48c4-8e2e-522062e34a03-metrics-certs podName:4711cce5-88a9-48c4-8e2e-522062e34a03 nodeName:}" failed. No retries permitted until 2026-03-09 09:08:46.605284989 +0000 UTC m=+91.635485751 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4711cce5-88a9-48c4-8e2e-522062e34a03-metrics-certs") pod "network-metrics-daemon-fttpc" (UID: "4711cce5-88a9-48c4-8e2e-522062e34a03") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 09 09:08:44 crc kubenswrapper[4792]: E0309 09:08:44.605600 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 09:08:46.605589607 +0000 UTC m=+91.635790369 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:08:44 crc kubenswrapper[4792]: E0309 09:08:44.605672 4792 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 09 09:08:44 crc kubenswrapper[4792]: E0309 09:08:44.605685 4792 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 09 09:08:44 crc kubenswrapper[4792]: E0309 09:08:44.605699 4792 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 
09:08:44 crc kubenswrapper[4792]: E0309 09:08:44.605727 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-09 09:08:46.60571933 +0000 UTC m=+91.635920092 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 09:08:44 crc kubenswrapper[4792]: I0309 09:08:44.661530 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 09:08:44 crc kubenswrapper[4792]: E0309 09:08:44.661667 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 09:08:44 crc kubenswrapper[4792]: I0309 09:08:44.661742 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 09:08:44 crc kubenswrapper[4792]: E0309 09:08:44.661802 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 09:08:44 crc kubenswrapper[4792]: I0309 09:08:44.662335 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fttpc" Mar 09 09:08:44 crc kubenswrapper[4792]: E0309 09:08:44.662505 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fttpc" podUID="4711cce5-88a9-48c4-8e2e-522062e34a03" Mar 09 09:08:44 crc kubenswrapper[4792]: I0309 09:08:44.662578 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 09:08:44 crc kubenswrapper[4792]: E0309 09:08:44.662639 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 09:08:44 crc kubenswrapper[4792]: I0309 09:08:44.665746 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:08:44 crc kubenswrapper[4792]: I0309 09:08:44.665779 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:08:44 crc kubenswrapper[4792]: I0309 09:08:44.665792 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:08:44 crc kubenswrapper[4792]: I0309 09:08:44.665809 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:08:44 crc kubenswrapper[4792]: I0309 09:08:44.665821 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:08:44Z","lastTransitionTime":"2026-03-09T09:08:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:08:44 crc kubenswrapper[4792]: I0309 09:08:44.767750 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:08:44 crc kubenswrapper[4792]: I0309 09:08:44.767788 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:08:44 crc kubenswrapper[4792]: I0309 09:08:44.767801 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:08:44 crc kubenswrapper[4792]: I0309 09:08:44.767849 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:08:44 crc kubenswrapper[4792]: I0309 09:08:44.767860 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:08:44Z","lastTransitionTime":"2026-03-09T09:08:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:08:44 crc kubenswrapper[4792]: I0309 09:08:44.869985 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:08:44 crc kubenswrapper[4792]: I0309 09:08:44.870029 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:08:44 crc kubenswrapper[4792]: I0309 09:08:44.870044 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:08:44 crc kubenswrapper[4792]: I0309 09:08:44.870060 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:08:44 crc kubenswrapper[4792]: I0309 09:08:44.870091 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:08:44Z","lastTransitionTime":"2026-03-09T09:08:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:08:44 crc kubenswrapper[4792]: I0309 09:08:44.973889 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:08:44 crc kubenswrapper[4792]: I0309 09:08:44.973934 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:08:44 crc kubenswrapper[4792]: I0309 09:08:44.973948 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:08:44 crc kubenswrapper[4792]: I0309 09:08:44.973967 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:08:44 crc kubenswrapper[4792]: I0309 09:08:44.973981 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:08:44Z","lastTransitionTime":"2026-03-09T09:08:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:08:45 crc kubenswrapper[4792]: I0309 09:08:45.042583 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4tprh" event={"ID":"2b59ff3b-540d-4385-b02b-f68349bb74bf","Type":"ContainerStarted","Data":"86174a85f5a0f642224f0a1f175358d157641043ca6e1b88e119e7de171b5582"} Mar 09 09:08:45 crc kubenswrapper[4792]: I0309 09:08:45.046856 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lfm2j" event={"ID":"740550e5-d1a4-4f0c-8efd-1ccd8f9319e5","Type":"ContainerStarted","Data":"4c80e0ef9426b7a764e7117a789d07cf4cf940a90f38fe3ce6b230f9bbd21bfc"} Mar 09 09:08:45 crc kubenswrapper[4792]: I0309 09:08:45.046882 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lfm2j" event={"ID":"740550e5-d1a4-4f0c-8efd-1ccd8f9319e5","Type":"ContainerStarted","Data":"8f72b0194cacf6d5d0c95ba804286d822d2f2e5a0f385c4c5fdf8559bf6240c6"} Mar 09 09:08:45 crc kubenswrapper[4792]: I0309 09:08:45.046892 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lfm2j" event={"ID":"740550e5-d1a4-4f0c-8efd-1ccd8f9319e5","Type":"ContainerStarted","Data":"9e0f50edd29f0791cc076a3a2974b7456aaf2a96534da791b083248dc84fa6af"} Mar 09 09:08:45 crc kubenswrapper[4792]: I0309 09:08:45.046901 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lfm2j" event={"ID":"740550e5-d1a4-4f0c-8efd-1ccd8f9319e5","Type":"ContainerStarted","Data":"d93911614f6785ac12751349f50ab00c0716955c72dc48866083013e172cf3c9"} Mar 09 09:08:45 crc kubenswrapper[4792]: I0309 09:08:45.046910 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lfm2j" event={"ID":"740550e5-d1a4-4f0c-8efd-1ccd8f9319e5","Type":"ContainerStarted","Data":"edf00db622558a346d238b2df6e90a686dc913634a1b5b4e8b010b5bf09a7290"} Mar 09 09:08:45 crc 
kubenswrapper[4792]: I0309 09:08:45.046920 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lfm2j" event={"ID":"740550e5-d1a4-4f0c-8efd-1ccd8f9319e5","Type":"ContainerStarted","Data":"5fc091b21251a54d9eca892667bd681e944b35b6407316a8252562e837a1e265"} Mar 09 09:08:45 crc kubenswrapper[4792]: I0309 09:08:45.061993 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da622175994cf951cfa730455ea5a163271bc993fc5a1268ed072b944a612524\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/ser
ving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:45Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:45 crc kubenswrapper[4792]: I0309 09:08:45.073556 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:45Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:45 crc kubenswrapper[4792]: I0309 09:08:45.076031 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:08:45 crc kubenswrapper[4792]: I0309 09:08:45.076084 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:08:45 crc kubenswrapper[4792]: I0309 09:08:45.076097 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:08:45 crc 
kubenswrapper[4792]: I0309 09:08:45.076114 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:08:45 crc kubenswrapper[4792]: I0309 09:08:45.076127 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:08:45Z","lastTransitionTime":"2026-03-09T09:08:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 09:08:45 crc kubenswrapper[4792]: I0309 09:08:45.085413 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vgtc9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"626ea896-2e5c-4478-a7be-34a19acc242d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57f7829120f56a8ab9ff342c3d9fd043ca5559518f8d818c0306764f491f4b3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413b
dcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4rw87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Dis
abled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vgtc9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:45Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:45 crc kubenswrapper[4792]: I0309 09:08:45.098664 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4tprh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b59ff3b-540d-4385-b02b-f68349bb74bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54059591e4b093ca2c60bc4ea9f0b0a6c44077e07ebaa66bab76d10f0d2f0f9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54059591e4b093ca2c60bc4ea9f0b0a6c44077e07ebaa66bab76d10f0d2f0f9f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:08:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86174a85f5a0f642224f0a1f175358d157641043ca6e1b88e119e7de171b5582\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"st
arted\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\
\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4tprh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:45Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:45 crc kubenswrapper[4792]: I0309 09:08:45.108489 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ssk9p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ceed39a1-2e4f-4630-bda0-57071ac26ee4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://648c51e35617b16c1bc2e0f86ea31ee9256943628595864fc27951fce17197cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7tn4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a6364e9d51e6626caf8b49f387f23eb4bfdf
c13e149fb8847b7a1dd1637bf79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7tn4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ssk9p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:45Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:45 crc kubenswrapper[4792]: I0309 09:08:45.125445 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:45Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:45 crc kubenswrapper[4792]: I0309 09:08:45.138020 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:45Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:45 crc kubenswrapper[4792]: I0309 09:08:45.151777 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-97tth" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd11045a-d746-4b42-872c-8b8d1dd2d515\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96248640b95891716f26355ace06d60da675ab3aa8086e6a7c94ad528fc1357d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zdqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d060627a577507a2b0030b6aea753d50e0c6766a
c4876d95ac5d9d3401f9b818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zdqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-97tth\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:45Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:45 crc kubenswrapper[4792]: I0309 09:08:45.215331 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:45Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:45 crc kubenswrapper[4792]: I0309 09:08:45.220911 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:08:45 crc kubenswrapper[4792]: I0309 09:08:45.220959 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:08:45 crc kubenswrapper[4792]: I0309 09:08:45.220972 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:08:45 crc kubenswrapper[4792]: I0309 09:08:45.220990 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:08:45 crc kubenswrapper[4792]: I0309 09:08:45.221003 4792 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:08:45Z","lastTransitionTime":"2026-03-09T09:08:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 09:08:45 crc kubenswrapper[4792]: I0309 09:08:45.232466 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4403f41d10eba801e3729ecffdc397e603f3c4899d1a62e02cb35418645d1ca4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-ide
ntity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://655b419f4453a7f0fff411f579326c7fda157c08267b1dcf23f7e1d11b684c20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:45Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:45 crc kubenswrapper[4792]: I0309 09:08:45.242433 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fgk47" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"84b6eb44-ca33-41a6-a951-2c66688ad860\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3b4aea0ac4cc05af599b411f32e712d0955e5052afc55c0d0be66e9a1249223\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cpkhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fgk47\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:45Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:45 crc kubenswrapper[4792]: I0309 09:08:45.258034 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lfm2j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"740550e5-d1a4-4f0c-8efd-1ccd8f9319e5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9abf97193d88754a49bcc12e1cfbe83371778e157d028dd3d1084abfa8526822\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9abf97193d88754a49bcc12e1cfbe83371778e157d028dd3d1084abfa8526822\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:08:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lfm2j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:45Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:45 crc kubenswrapper[4792]: I0309 09:08:45.267305 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-k4kdn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cb7634b-66b7-4541-8e53-3e01a6cb41ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d395567d530702b468a31ec780cf0dcf356e2a07f4a28ee50329b702d8e53594\\\",\\\"image\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5jf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-k4kdn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:45Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:45 crc kubenswrapper[4792]: I0309 09:08:45.276536 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fttpc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4711cce5-88a9-48c4-8e2e-522062e34a03\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjrbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjrbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fttpc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:45Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:45 crc 
kubenswrapper[4792]: I0309 09:08:45.324640 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:08:45 crc kubenswrapper[4792]: I0309 09:08:45.324683 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:08:45 crc kubenswrapper[4792]: I0309 09:08:45.324696 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:08:45 crc kubenswrapper[4792]: I0309 09:08:45.324717 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:08:45 crc kubenswrapper[4792]: I0309 09:08:45.324728 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:08:45Z","lastTransitionTime":"2026-03-09T09:08:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:08:45 crc kubenswrapper[4792]: I0309 09:08:45.427460 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:08:45 crc kubenswrapper[4792]: I0309 09:08:45.427508 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:08:45 crc kubenswrapper[4792]: I0309 09:08:45.427518 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:08:45 crc kubenswrapper[4792]: I0309 09:08:45.427545 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:08:45 crc kubenswrapper[4792]: I0309 09:08:45.427556 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:08:45Z","lastTransitionTime":"2026-03-09T09:08:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:08:45 crc kubenswrapper[4792]: I0309 09:08:45.530911 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:08:45 crc kubenswrapper[4792]: I0309 09:08:45.530963 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:08:45 crc kubenswrapper[4792]: I0309 09:08:45.530981 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:08:45 crc kubenswrapper[4792]: I0309 09:08:45.531010 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:08:45 crc kubenswrapper[4792]: I0309 09:08:45.531027 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:08:45Z","lastTransitionTime":"2026-03-09T09:08:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:08:45 crc kubenswrapper[4792]: I0309 09:08:45.633756 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:08:45 crc kubenswrapper[4792]: I0309 09:08:45.633783 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:08:45 crc kubenswrapper[4792]: I0309 09:08:45.633792 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:08:45 crc kubenswrapper[4792]: I0309 09:08:45.633807 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:08:45 crc kubenswrapper[4792]: I0309 09:08:45.633816 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:08:45Z","lastTransitionTime":"2026-03-09T09:08:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:08:45 crc kubenswrapper[4792]: I0309 09:08:45.674594 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4403f41d10eba801e3729ecffdc397e603f3c4899d1a62e02cb35418645d1ca4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://655b419f4453a7f0fff411f579326c7fda157c08267b1dcf23f7e1d11b684c20\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:45Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:45 crc kubenswrapper[4792]: I0309 09:08:45.687203 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:45Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:45 crc kubenswrapper[4792]: I0309 09:08:45.701270 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:45Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:45 crc kubenswrapper[4792]: I0309 09:08:45.713120 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-97tth" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd11045a-d746-4b42-872c-8b8d1dd2d515\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96248640b95891716f26355ace06d60da675ab3aa8086e6a7c94ad528fc1357d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zdqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d060627a577507a2b0030b6aea753d50e0c6766a
c4876d95ac5d9d3401f9b818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zdqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-97tth\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:45Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:45 crc kubenswrapper[4792]: I0309 09:08:45.724575 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:45Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:45 crc kubenswrapper[4792]: I0309 09:08:45.735941 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:08:45 crc kubenswrapper[4792]: I0309 09:08:45.736145 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:08:45 crc kubenswrapper[4792]: I0309 09:08:45.736317 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:08:45 crc kubenswrapper[4792]: I0309 09:08:45.736477 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:08:45 crc kubenswrapper[4792]: I0309 09:08:45.736644 4792 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:08:45Z","lastTransitionTime":"2026-03-09T09:08:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 09:08:45 crc kubenswrapper[4792]: I0309 09:08:45.737997 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fgk47" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84b6eb44-ca33-41a6-a951-2c66688ad860\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3b4aea0ac4cc05af599b411f32e712d0955e5052afc55c0d0be66e9a1249223\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":tr
ue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cpkhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fgk47\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:45Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:45 crc kubenswrapper[4792]: I0309 09:08:45.797623 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lfm2j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"740550e5-d1a4-4f0c-8efd-1ccd8f9319e5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9abf97193d88754a49bcc12e1cfbe83371778e157d028dd3d1084abfa8526822\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9abf97193d88754a49bcc12e1cfbe83371778e157d028dd3d1084abfa8526822\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:08:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lfm2j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:45Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:45 crc kubenswrapper[4792]: I0309 09:08:45.826491 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-k4kdn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cb7634b-66b7-4541-8e53-3e01a6cb41ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d395567d530702b468a31ec780cf0dcf356e2a07f4a28ee50329b702d8e53594\\\",\\\"image\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5jf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-k4kdn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:45Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:45 crc kubenswrapper[4792]: I0309 09:08:45.839227 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:08:45 crc kubenswrapper[4792]: I0309 09:08:45.839267 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:08:45 crc kubenswrapper[4792]: I0309 09:08:45.839278 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:08:45 crc kubenswrapper[4792]: I0309 09:08:45.839294 4792 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeNotReady" Mar 09 09:08:45 crc kubenswrapper[4792]: I0309 09:08:45.839305 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:08:45Z","lastTransitionTime":"2026-03-09T09:08:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 09:08:45 crc kubenswrapper[4792]: I0309 09:08:45.853818 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fttpc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4711cce5-88a9-48c4-8e2e-522062e34a03\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjrbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjrbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fttpc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:45Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:45 crc 
kubenswrapper[4792]: I0309 09:08:45.878243 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da622175994cf951cfa730455ea5a163271bc993fc5a1268ed072b944a612524\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:45Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:45 crc kubenswrapper[4792]: I0309 09:08:45.895557 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:45Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:45 crc kubenswrapper[4792]: I0309 09:08:45.912769 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vgtc9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"626ea896-2e5c-4478-a7be-34a19acc242d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57f7829120f56a8ab9ff342c3d9fd043ca5559518f8d818c0306764f491f4b3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4rw87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vgtc9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:45Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:45 crc kubenswrapper[4792]: I0309 09:08:45.929550 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4tprh" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b59ff3b-540d-4385-b02b-f68349bb74bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54059591e4b093ca2c60bc4ea9f0b0a6c44077e07ebaa66bab76d10f0d2f0f9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54059591e4b093ca2c60bc4ea9f0b0a6c44077e07ebaa66bab76d10f0d2f0f9f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:08:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86174a85f5a0f642224f0a1f175358d157641043ca6e1b88e119e7de171b5582\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath
\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOn
ly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4tprh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:45Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:45 crc kubenswrapper[4792]: I0309 09:08:45.941589 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:08:45 crc kubenswrapper[4792]: I0309 09:08:45.941637 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:08:45 crc kubenswrapper[4792]: I0309 09:08:45.941651 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:08:45 crc kubenswrapper[4792]: I0309 09:08:45.941668 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 
09:08:45 crc kubenswrapper[4792]: I0309 09:08:45.941681 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:08:45Z","lastTransitionTime":"2026-03-09T09:08:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 09:08:45 crc kubenswrapper[4792]: I0309 09:08:45.952046 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ssk9p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ceed39a1-2e4f-4630-bda0-57071ac26ee4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://648c51e35617b16c1bc2e0f86ea31ee9256943628595864fc27951fce17197cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7tn4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a6364e9d51e6626caf8b49f387f23eb4bfdfc13e149fb8847b7a1dd1637bf79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7tn4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ssk9p\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:45Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:46 crc kubenswrapper[4792]: I0309 09:08:46.044790 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:08:46 crc kubenswrapper[4792]: I0309 09:08:46.045176 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:08:46 crc kubenswrapper[4792]: I0309 09:08:46.045189 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:08:46 crc kubenswrapper[4792]: I0309 09:08:46.045207 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:08:46 crc kubenswrapper[4792]: I0309 09:08:46.045219 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:08:46Z","lastTransitionTime":"2026-03-09T09:08:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:08:46 crc kubenswrapper[4792]: I0309 09:08:46.050824 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"ab83a4df67ef256deece5dabe54496035a360833d9b5e926a1045196734d2c24"} Mar 09 09:08:46 crc kubenswrapper[4792]: I0309 09:08:46.053255 4792 generic.go:334] "Generic (PLEG): container finished" podID="2b59ff3b-540d-4385-b02b-f68349bb74bf" containerID="86174a85f5a0f642224f0a1f175358d157641043ca6e1b88e119e7de171b5582" exitCode=0 Mar 09 09:08:46 crc kubenswrapper[4792]: I0309 09:08:46.053292 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4tprh" event={"ID":"2b59ff3b-540d-4385-b02b-f68349bb74bf","Type":"ContainerDied","Data":"86174a85f5a0f642224f0a1f175358d157641043ca6e1b88e119e7de171b5582"} Mar 09 09:08:46 crc kubenswrapper[4792]: I0309 09:08:46.067664 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4403f41d10eba801e3729ecffdc397e603f3c4899d1a62e02cb35418645d1ca4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://655b419f4453a7f0fff411f579326c7fda157c08267b1dcf23f7e1d11b684c20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:46Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:46 crc kubenswrapper[4792]: I0309 09:08:46.087810 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:46Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:46 crc kubenswrapper[4792]: I0309 09:08:46.096154 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:08:46 crc kubenswrapper[4792]: I0309 09:08:46.096187 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:08:46 crc 
kubenswrapper[4792]: I0309 09:08:46.096195 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:08:46 crc kubenswrapper[4792]: I0309 09:08:46.096209 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:08:46 crc kubenswrapper[4792]: I0309 09:08:46.096218 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:08:46Z","lastTransitionTime":"2026-03-09T09:08:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 09:08:46 crc kubenswrapper[4792]: I0309 09:08:46.104802 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:46Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:46 crc kubenswrapper[4792]: E0309 09:08:46.119111 4792 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:08:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:08:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:08:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:08:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e3b5ac96-f3df-45c5-a4ac-24aa5703690c\\\",\\\"systemUUID\\\":\\\"838abbcf-5467-42bb-9eb7-be30fe4962bb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:46Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:46 crc kubenswrapper[4792]: I0309 09:08:46.123252 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:08:46 crc kubenswrapper[4792]: I0309 09:08:46.123285 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:08:46 crc kubenswrapper[4792]: I0309 09:08:46.123294 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:08:46 crc kubenswrapper[4792]: I0309 09:08:46.123308 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:08:46 crc kubenswrapper[4792]: I0309 09:08:46.123321 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:08:46Z","lastTransitionTime":"2026-03-09T09:08:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:08:46 crc kubenswrapper[4792]: I0309 09:08:46.125658 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-97tth" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd11045a-d746-4b42-872c-8b8d1dd2d515\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96248640b95891716f26355ace06d60da675ab3aa8086e6a7c94ad528fc1357d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zdqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d060627a577507a2b0030b6aea753d50e0c6766ac4876d95ac5d9d3401f9b818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zdqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-97tth\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:46Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:46 crc kubenswrapper[4792]: E0309 09:08:46.142422 4792 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:08:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:08:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:08:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:08:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e3b5ac96-f3df-45c5-a4ac-24aa5703690c\\\",\\\"systemUUID\\\":\\\"838abbcf-5467-42bb-9eb7-be30fe4962bb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:46Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:46 crc kubenswrapper[4792]: I0309 09:08:46.146838 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:08:46 crc kubenswrapper[4792]: I0309 09:08:46.146866 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:08:46 crc kubenswrapper[4792]: I0309 09:08:46.146875 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:08:46 crc kubenswrapper[4792]: I0309 09:08:46.146888 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:08:46 crc kubenswrapper[4792]: I0309 09:08:46.146896 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:08:46Z","lastTransitionTime":"2026-03-09T09:08:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:08:46 crc kubenswrapper[4792]: I0309 09:08:46.152638 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab83a4df67ef256deece5dabe54496035a360833d9b5e926a1045196734d2c24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:46Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:46 crc kubenswrapper[4792]: E0309 09:08:46.162992 4792 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:08:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:08:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:08:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:08:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:46Z\\\",\\\"message\\\":\\\"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redh
at/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99
d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815
\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\"
:448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e3b5ac96-f3df-45c5-a4ac-24aa5703690c\\\",\\\"systemUUID\\\":\\\"838abbcf-5467-42bb-9eb7-be30fe4962bb\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:46Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:46 crc kubenswrapper[4792]: I0309 09:08:46.166133 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:08:46 crc kubenswrapper[4792]: I0309 09:08:46.166175 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:08:46 crc kubenswrapper[4792]: I0309 09:08:46.166189 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:08:46 crc kubenswrapper[4792]: I0309 09:08:46.166234 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:08:46 crc kubenswrapper[4792]: I0309 09:08:46.166245 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:08:46Z","lastTransitionTime":"2026-03-09T09:08:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:08:46 crc kubenswrapper[4792]: I0309 09:08:46.173377 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fgk47" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84b6eb44-ca33-41a6-a951-2c66688ad860\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3b4aea0ac4cc05af599b411f32e712d0955e5052afc55c0d0be66e9a1249223\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cpkhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fgk47\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:46Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:46 crc kubenswrapper[4792]: E0309 09:08:46.194478 4792 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:08:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:08:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:46Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:08:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:08:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e3b5ac96-f3df-45c5-a4ac-24aa5703690c\\\",\\\"systemUUID\\\":\\\"838abbcf-5467-42bb-9eb7-be30fe4962bb\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:46Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:46 crc kubenswrapper[4792]: I0309 09:08:46.198482 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:08:46 crc kubenswrapper[4792]: I0309 09:08:46.198520 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:08:46 crc kubenswrapper[4792]: I0309 09:08:46.198529 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:08:46 crc kubenswrapper[4792]: I0309 09:08:46.198543 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:08:46 crc kubenswrapper[4792]: I0309 09:08:46.198553 4792 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:08:46Z","lastTransitionTime":"2026-03-09T09:08:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 09:08:46 crc kubenswrapper[4792]: I0309 09:08:46.203205 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lfm2j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"740550e5-d1a4-4f0c-8efd-1ccd8f9319e5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9abf97193d88754a49bcc12e1cfbe83371778e157d028dd3d1084abfa8526822\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9abf97193d88754a49bcc12e1cfbe83371778e157d028dd3d1084abfa8526822\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:08:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lfm2j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:46Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:46 crc kubenswrapper[4792]: E0309 09:08:46.209608 4792 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:08:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:08:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:46Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:08:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:08:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e3b5ac96-f3df-45c5-a4ac-24aa5703690c\\\",\\\"systemUUID\\\":\\\"838abbcf-5467-42bb-9eb7-be30fe4962bb\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:46Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:46 crc kubenswrapper[4792]: E0309 09:08:46.209870 4792 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 09 09:08:46 crc kubenswrapper[4792]: I0309 09:08:46.211156 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:08:46 crc kubenswrapper[4792]: I0309 09:08:46.211241 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:08:46 crc kubenswrapper[4792]: I0309 09:08:46.211331 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:08:46 crc kubenswrapper[4792]: I0309 09:08:46.211392 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:08:46 crc kubenswrapper[4792]: I0309 09:08:46.211461 4792 setters.go:603] "Node became not 
ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:08:46Z","lastTransitionTime":"2026-03-09T09:08:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 09:08:46 crc kubenswrapper[4792]: I0309 09:08:46.238333 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-k4kdn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cb7634b-66b7-4541-8e53-3e01a6cb41ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d395567d530702b468a31ec780cf0dcf356e2a07f4a28ee50329b702d8e53594\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolv
er\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5jf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-k4kdn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:46Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:46 crc kubenswrapper[4792]: I0309 09:08:46.256150 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fttpc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4711cce5-88a9-48c4-8e2e-522062e34a03\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjrbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjrbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fttpc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:46Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:46 crc 
kubenswrapper[4792]: I0309 09:08:46.272310 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da622175994cf951cfa730455ea5a163271bc993fc5a1268ed072b944a612524\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:46Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:46 crc kubenswrapper[4792]: I0309 09:08:46.284471 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:46Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:46 crc kubenswrapper[4792]: I0309 09:08:46.303242 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vgtc9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"626ea896-2e5c-4478-a7be-34a19acc242d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57f7829120f56a8ab9ff342c3d9fd043ca5559518f8d818c0306764f491f4b3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4rw87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vgtc9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:46Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:46 crc kubenswrapper[4792]: I0309 09:08:46.313646 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:08:46 crc 
kubenswrapper[4792]: I0309 09:08:46.313670 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:08:46 crc kubenswrapper[4792]: I0309 09:08:46.313678 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:08:46 crc kubenswrapper[4792]: I0309 09:08:46.313692 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:08:46 crc kubenswrapper[4792]: I0309 09:08:46.313700 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:08:46Z","lastTransitionTime":"2026-03-09T09:08:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 09:08:46 crc kubenswrapper[4792]: I0309 09:08:46.320412 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4tprh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b59ff3b-540d-4385-b02b-f68349bb74bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54059591e4b093ca2c60bc4ea9f0b0a6c44077e07ebaa66bab76d10f0d2f0f9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://54059591e4b093ca2c60bc4ea9f0b0a6c44077e07ebaa66bab76d10f0d2f0f9f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:08:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86174a85f5a0f642224f0a1f175358d157641043ca6e1b88e119e7de171b5582\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"i
mage\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\
":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4tprh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:46Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:46 crc kubenswrapper[4792]: I0309 09:08:46.329466 4792 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ssk9p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ceed39a1-2e4f-4630-bda0-57071ac26ee4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://648c51e35617b16c1bc2e0f86ea31ee9256943628595864fc27951fce17197cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7tn4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadO
nly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a6364e9d51e6626caf8b49f387f23eb4bfdfc13e149fb8847b7a1dd1637bf79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7tn4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ssk9p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:46Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:46 crc kubenswrapper[4792]: I0309 09:08:46.343321 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da622175994cf951cfa730455ea5a163271bc993fc5a1268ed072b944a612524\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:46Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:46 crc kubenswrapper[4792]: I0309 09:08:46.355498 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:46Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:46 crc kubenswrapper[4792]: I0309 09:08:46.367809 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vgtc9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"626ea896-2e5c-4478-a7be-34a19acc242d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57f7829120f56a8ab9ff342c3d9fd043ca5559518f8d818c0306764f491f4b3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4rw87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vgtc9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:46Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:46 crc kubenswrapper[4792]: I0309 09:08:46.381023 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4tprh" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b59ff3b-540d-4385-b02b-f68349bb74bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54059591e4b093ca2c60bc4ea9f0b0a6c44077e07ebaa66bab76d10f0d2f0f9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54059591e4b093ca2c60bc4ea9f0b0a6c44077e07ebaa66bab76d10f0d2f0f9f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:08:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86174a85f5a0f642224f0a1f175358d157641043ca6e1b88e119e7de171b5582\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86174a85f5a0f642224f0a1f175358d157641043ca6e1b88e119e7de171b5582\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:08:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:08:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4tprh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:46Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:46 crc kubenswrapper[4792]: I0309 09:08:46.393686 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ssk9p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ceed39a1-2e4f-4630-bda0-57071ac26ee4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://648c51e35617b16c1bc2e0f86ea31ee9256943628595864fc27951fce17197cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7tn4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a6364e9d51e6626caf8b49f387f23eb4bfdf
c13e149fb8847b7a1dd1637bf79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7tn4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ssk9p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:46Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:46 crc kubenswrapper[4792]: I0309 09:08:46.403787 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4403f41d10eba801e3729ecffdc397e603f3c4899d1a62e02cb35418645d1ca4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://655b419f4453a7f0fff411f579326c7fda157c08267b1dcf23f7e1d11b684c20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:46Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:46 crc kubenswrapper[4792]: I0309 09:08:46.413506 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:46Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:46 crc kubenswrapper[4792]: I0309 09:08:46.415931 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:08:46 crc kubenswrapper[4792]: I0309 09:08:46.415977 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:08:46 crc 
kubenswrapper[4792]: I0309 09:08:46.415990 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:08:46 crc kubenswrapper[4792]: I0309 09:08:46.416009 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:08:46 crc kubenswrapper[4792]: I0309 09:08:46.416019 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:08:46Z","lastTransitionTime":"2026-03-09T09:08:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 09:08:46 crc kubenswrapper[4792]: I0309 09:08:46.423920 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:46Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:46 crc kubenswrapper[4792]: I0309 09:08:46.432394 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-97tth" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd11045a-d746-4b42-872c-8b8d1dd2d515\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96248640b95891716f26355ace06d60da675ab3aa8086e6a7c94ad528fc1357d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zdqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d060627a577507a2b0030b6aea753d50e0c6766a
c4876d95ac5d9d3401f9b818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zdqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-97tth\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:46Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:46 crc kubenswrapper[4792]: I0309 09:08:46.440824 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab83a4df67ef256deece5dabe54496035a360833d9b5e926a1045196734d2c24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-09T09:08:46Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:46 crc kubenswrapper[4792]: I0309 09:08:46.449495 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fgk47" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84b6eb44-ca33-41a6-a951-2c66688ad860\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3b4aea0ac4cc05af599b411f32e712d0955e5052afc55c0d0be66e9a1249223\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cpkhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fgk47\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:46Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:46 crc kubenswrapper[4792]: I0309 09:08:46.466870 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lfm2j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"740550e5-d1a4-4f0c-8efd-1ccd8f9319e5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9abf97193d88754a49bcc12e1cfbe83371778e157d028dd3d1084abfa8526822\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\"
:{\\\"containerID\\\":\\\"cri-o://9abf97193d88754a49bcc12e1cfbe83371778e157d028dd3d1084abfa8526822\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:08:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lfm2j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:46Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:46 crc kubenswrapper[4792]: I0309 09:08:46.475655 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-k4kdn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cb7634b-66b7-4541-8e53-3e01a6cb41ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d395567d530702b468a31ec780cf0dcf356e2a07f4a28ee50329b702d8e53594\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5jf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-k4kdn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:46Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:46 crc kubenswrapper[4792]: I0309 09:08:46.483972 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fttpc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4711cce5-88a9-48c4-8e2e-522062e34a03\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjrbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjrbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fttpc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:46Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:46 crc 
kubenswrapper[4792]: I0309 09:08:46.518884 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:08:46 crc kubenswrapper[4792]: I0309 09:08:46.518916 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:08:46 crc kubenswrapper[4792]: I0309 09:08:46.518924 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:08:46 crc kubenswrapper[4792]: I0309 09:08:46.518938 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:08:46 crc kubenswrapper[4792]: I0309 09:08:46.518947 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:08:46Z","lastTransitionTime":"2026-03-09T09:08:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:08:46 crc kubenswrapper[4792]: I0309 09:08:46.529388 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 09:08:46 crc kubenswrapper[4792]: I0309 09:08:46.529465 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 09:08:46 crc kubenswrapper[4792]: I0309 09:08:46.529490 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 09:08:46 crc kubenswrapper[4792]: E0309 09:08:46.529539 4792 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 09 09:08:46 crc kubenswrapper[4792]: E0309 09:08:46.529584 4792 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 09 09:08:46 crc kubenswrapper[4792]: E0309 09:08:46.529586 4792 projected.go:288] Couldn't get configMap 
openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 09 09:08:46 crc kubenswrapper[4792]: E0309 09:08:46.529630 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-09 09:08:50.529607297 +0000 UTC m=+95.559808049 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 09 09:08:46 crc kubenswrapper[4792]: E0309 09:08:46.529654 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-09 09:08:50.529644518 +0000 UTC m=+95.559845350 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 09 09:08:46 crc kubenswrapper[4792]: E0309 09:08:46.529659 4792 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 09 09:08:46 crc kubenswrapper[4792]: E0309 09:08:46.529672 4792 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 09:08:46 crc kubenswrapper[4792]: E0309 09:08:46.529735 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-09 09:08:50.529699829 +0000 UTC m=+95.559900651 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 09:08:46 crc kubenswrapper[4792]: I0309 09:08:46.620939 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:08:46 crc kubenswrapper[4792]: I0309 09:08:46.620979 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:08:46 crc kubenswrapper[4792]: I0309 09:08:46.620993 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:08:46 crc kubenswrapper[4792]: I0309 09:08:46.621014 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:08:46 crc kubenswrapper[4792]: I0309 09:08:46.621027 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:08:46Z","lastTransitionTime":"2026-03-09T09:08:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:08:46 crc kubenswrapper[4792]: I0309 09:08:46.630414 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 09:08:46 crc kubenswrapper[4792]: I0309 09:08:46.630556 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 09:08:46 crc kubenswrapper[4792]: E0309 09:08:46.630580 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 09:08:50.630558154 +0000 UTC m=+95.660758926 (durationBeforeRetry 4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:08:46 crc kubenswrapper[4792]: I0309 09:08:46.630658 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4711cce5-88a9-48c4-8e2e-522062e34a03-metrics-certs\") pod \"network-metrics-daemon-fttpc\" (UID: \"4711cce5-88a9-48c4-8e2e-522062e34a03\") " pod="openshift-multus/network-metrics-daemon-fttpc" Mar 09 09:08:46 crc kubenswrapper[4792]: E0309 09:08:46.630697 4792 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 09 09:08:46 crc kubenswrapper[4792]: E0309 09:08:46.630717 4792 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 09 09:08:46 crc kubenswrapper[4792]: E0309 09:08:46.630730 4792 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 09:08:46 crc kubenswrapper[4792]: E0309 09:08:46.630791 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. 
No retries permitted until 2026-03-09 09:08:50.630771779 +0000 UTC m=+95.660972571 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 09:08:46 crc kubenswrapper[4792]: E0309 09:08:46.630802 4792 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 09 09:08:46 crc kubenswrapper[4792]: E0309 09:08:46.630847 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4711cce5-88a9-48c4-8e2e-522062e34a03-metrics-certs podName:4711cce5-88a9-48c4-8e2e-522062e34a03 nodeName:}" failed. No retries permitted until 2026-03-09 09:08:50.630834031 +0000 UTC m=+95.661034883 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4711cce5-88a9-48c4-8e2e-522062e34a03-metrics-certs") pod "network-metrics-daemon-fttpc" (UID: "4711cce5-88a9-48c4-8e2e-522062e34a03") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 09 09:08:46 crc kubenswrapper[4792]: I0309 09:08:46.662160 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 09:08:46 crc kubenswrapper[4792]: I0309 09:08:46.662194 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 09:08:46 crc kubenswrapper[4792]: I0309 09:08:46.662185 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-fttpc" Mar 09 09:08:46 crc kubenswrapper[4792]: I0309 09:08:46.662165 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 09:08:46 crc kubenswrapper[4792]: E0309 09:08:46.662329 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 09:08:46 crc kubenswrapper[4792]: E0309 09:08:46.662446 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 09:08:46 crc kubenswrapper[4792]: E0309 09:08:46.662633 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 09:08:46 crc kubenswrapper[4792]: E0309 09:08:46.662723 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fttpc" podUID="4711cce5-88a9-48c4-8e2e-522062e34a03" Mar 09 09:08:46 crc kubenswrapper[4792]: I0309 09:08:46.723629 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:08:46 crc kubenswrapper[4792]: I0309 09:08:46.723660 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:08:46 crc kubenswrapper[4792]: I0309 09:08:46.723687 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:08:46 crc kubenswrapper[4792]: I0309 09:08:46.723699 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:08:46 crc kubenswrapper[4792]: I0309 09:08:46.723709 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:08:46Z","lastTransitionTime":"2026-03-09T09:08:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:08:46 crc kubenswrapper[4792]: I0309 09:08:46.830566 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:08:46 crc kubenswrapper[4792]: I0309 09:08:46.830597 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:08:46 crc kubenswrapper[4792]: I0309 09:08:46.830606 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:08:46 crc kubenswrapper[4792]: I0309 09:08:46.830618 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:08:46 crc kubenswrapper[4792]: I0309 09:08:46.830626 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:08:46Z","lastTransitionTime":"2026-03-09T09:08:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:08:46 crc kubenswrapper[4792]: I0309 09:08:46.933045 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:08:46 crc kubenswrapper[4792]: I0309 09:08:46.933102 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:08:46 crc kubenswrapper[4792]: I0309 09:08:46.933111 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:08:46 crc kubenswrapper[4792]: I0309 09:08:46.933124 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:08:46 crc kubenswrapper[4792]: I0309 09:08:46.933151 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:08:46Z","lastTransitionTime":"2026-03-09T09:08:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:08:47 crc kubenswrapper[4792]: I0309 09:08:47.035723 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:08:47 crc kubenswrapper[4792]: I0309 09:08:47.035755 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:08:47 crc kubenswrapper[4792]: I0309 09:08:47.035765 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:08:47 crc kubenswrapper[4792]: I0309 09:08:47.035779 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:08:47 crc kubenswrapper[4792]: I0309 09:08:47.035788 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:08:47Z","lastTransitionTime":"2026-03-09T09:08:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:08:47 crc kubenswrapper[4792]: I0309 09:08:47.059530 4792 generic.go:334] "Generic (PLEG): container finished" podID="2b59ff3b-540d-4385-b02b-f68349bb74bf" containerID="fb6462fecadedf78b23e5a2276698144ffea837649f3c15664a573c9724f7780" exitCode=0 Mar 09 09:08:47 crc kubenswrapper[4792]: I0309 09:08:47.059846 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4tprh" event={"ID":"2b59ff3b-540d-4385-b02b-f68349bb74bf","Type":"ContainerDied","Data":"fb6462fecadedf78b23e5a2276698144ffea837649f3c15664a573c9724f7780"} Mar 09 09:08:47 crc kubenswrapper[4792]: I0309 09:08:47.075527 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-k4kdn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cb7634b-66b7-4541-8e53-3e01a6cb41ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d395567d530702b468a31ec780cf0dcf356e2a07f4a28ee50329b702d8e53594\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf
2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5jf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-k4kdn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:47Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:47 crc kubenswrapper[4792]: I0309 09:08:47.094829 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fttpc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4711cce5-88a9-48c4-8e2e-522062e34a03\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjrbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjrbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fttpc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:47Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:47 crc 
kubenswrapper[4792]: I0309 09:08:47.115428 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da622175994cf951cfa730455ea5a163271bc993fc5a1268ed072b944a612524\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:47Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:47 crc kubenswrapper[4792]: I0309 09:08:47.129476 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:47Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:47 crc kubenswrapper[4792]: I0309 09:08:47.138148 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:08:47 crc kubenswrapper[4792]: I0309 09:08:47.138174 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:08:47 crc kubenswrapper[4792]: I0309 09:08:47.138183 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:08:47 crc kubenswrapper[4792]: I0309 09:08:47.138197 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:08:47 crc kubenswrapper[4792]: I0309 09:08:47.138206 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:08:47Z","lastTransitionTime":"2026-03-09T09:08:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 09:08:47 crc kubenswrapper[4792]: I0309 09:08:47.144417 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vgtc9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"626ea896-2e5c-4478-a7be-34a19acc242d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57f7829120f56a8ab9ff342c3d9fd043ca5559518f8d818c0306764f491f4b3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/et
c/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4rw87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vgtc9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:47Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:47 crc kubenswrapper[4792]: I0309 09:08:47.158222 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4tprh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b59ff3b-540d-4385-b02b-f68349bb74bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54059591e4b093ca2c60bc4ea9f0b0a6c44077e07ebaa66bab76d10f0d2f0f9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54059591e4b093ca2c60bc4ea9f0b0a6c44077e07ebaa66bab76d10f0d2f0f9f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:08:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86174a85f5a0f642224f0a1f175358d157641043ca6e1b88e119e7de171b5582\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86174a85f5a0f642224f0a1f175358d157641043ca6e1b88e119e7de171b5582\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:08:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:08:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb6462fecadedf78b23e5a2276698144ffea837649f3c15664a573c9724f7780\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb6462fecadedf78b23e5a2276698144ffea837649f3c15664a573c9724f7780\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:08:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4tprh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:47Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:47 crc kubenswrapper[4792]: I0309 
09:08:47.172456 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ssk9p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ceed39a1-2e4f-4630-bda0-57071ac26ee4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://648c51e35617b16c1bc2e0f86ea31ee9256943628595864fc27951fce17197cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"n
ame\\\":\\\"kube-api-access-7tn4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a6364e9d51e6626caf8b49f387f23eb4bfdfc13e149fb8847b7a1dd1637bf79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7tn4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ssk9p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:47Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:47 crc kubenswrapper[4792]: I0309 09:08:47.183910 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:47Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:47 crc kubenswrapper[4792]: I0309 09:08:47.197903 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:47Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:47 crc kubenswrapper[4792]: I0309 09:08:47.208394 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-97tth" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd11045a-d746-4b42-872c-8b8d1dd2d515\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96248640b95891716f26355ace06d60da675ab3aa8086e6a7c94ad528fc1357d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zdqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d060627a577507a2b0030b6aea753d50e0c6766a
c4876d95ac5d9d3401f9b818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zdqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-97tth\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:47Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:47 crc kubenswrapper[4792]: I0309 09:08:47.220478 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab83a4df67ef256deece5dabe54496035a360833d9b5e926a1045196734d2c24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-09T09:08:47Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:47 crc kubenswrapper[4792]: I0309 09:08:47.232393 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4403f41d10eba801e3729ecffdc397e603f3c4899d1a62e02cb35418645d1ca4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://655b419f4453a7f
0fff411f579326c7fda157c08267b1dcf23f7e1d11b684c20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:47Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:47 crc kubenswrapper[4792]: I0309 09:08:47.240416 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:08:47 crc kubenswrapper[4792]: I0309 09:08:47.240453 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:08:47 crc kubenswrapper[4792]: I0309 09:08:47.240464 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:08:47 crc kubenswrapper[4792]: I0309 09:08:47.240480 4792 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:08:47 crc kubenswrapper[4792]: I0309 09:08:47.240492 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:08:47Z","lastTransitionTime":"2026-03-09T09:08:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 09:08:47 crc kubenswrapper[4792]: I0309 09:08:47.244772 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fgk47" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84b6eb44-ca33-41a6-a951-2c66688ad860\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3b4aea0ac4cc05af599b411f32e712d0955e5052afc55c0d0be66e9a1249223\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cpkhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fgk47\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:47Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:47 crc kubenswrapper[4792]: I0309 09:08:47.262180 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lfm2j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"740550e5-d1a4-4f0c-8efd-1ccd8f9319e5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9abf97193d88754a49bcc12e1cfbe83371778e157d028dd3d1084abfa8526822\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9abf97193d88754a49bcc12e1cfbe83371778e157d028dd3d1084abfa8526822\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:08:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lfm2j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:47Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:47 crc kubenswrapper[4792]: I0309 09:08:47.343724 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:08:47 crc kubenswrapper[4792]: I0309 09:08:47.343751 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:08:47 crc kubenswrapper[4792]: I0309 09:08:47.343760 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:08:47 crc kubenswrapper[4792]: I0309 09:08:47.343773 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:08:47 crc kubenswrapper[4792]: I0309 09:08:47.343782 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:08:47Z","lastTransitionTime":"2026-03-09T09:08:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:08:47 crc kubenswrapper[4792]: I0309 09:08:47.446146 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:08:47 crc kubenswrapper[4792]: I0309 09:08:47.446185 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:08:47 crc kubenswrapper[4792]: I0309 09:08:47.446196 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:08:47 crc kubenswrapper[4792]: I0309 09:08:47.446213 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:08:47 crc kubenswrapper[4792]: I0309 09:08:47.446227 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:08:47Z","lastTransitionTime":"2026-03-09T09:08:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:08:47 crc kubenswrapper[4792]: I0309 09:08:47.549431 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:08:47 crc kubenswrapper[4792]: I0309 09:08:47.549488 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:08:47 crc kubenswrapper[4792]: I0309 09:08:47.549499 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:08:47 crc kubenswrapper[4792]: I0309 09:08:47.549520 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:08:47 crc kubenswrapper[4792]: I0309 09:08:47.549532 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:08:47Z","lastTransitionTime":"2026-03-09T09:08:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:08:47 crc kubenswrapper[4792]: I0309 09:08:47.651325 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:08:47 crc kubenswrapper[4792]: I0309 09:08:47.651358 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:08:47 crc kubenswrapper[4792]: I0309 09:08:47.651368 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:08:47 crc kubenswrapper[4792]: I0309 09:08:47.651386 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:08:47 crc kubenswrapper[4792]: I0309 09:08:47.651396 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:08:47Z","lastTransitionTime":"2026-03-09T09:08:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:08:47 crc kubenswrapper[4792]: I0309 09:08:47.680627 4792 scope.go:117] "RemoveContainer" containerID="420959a5229bf4ed5e6b94cf4a6685b96e4b50d13055cbccf66b6188e47bc770" Mar 09 09:08:47 crc kubenswrapper[4792]: I0309 09:08:47.680793 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 09 09:08:47 crc kubenswrapper[4792]: E0309 09:08:47.680816 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 09 09:08:47 crc kubenswrapper[4792]: I0309 09:08:47.753577 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:08:47 crc kubenswrapper[4792]: I0309 09:08:47.753654 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:08:47 crc kubenswrapper[4792]: I0309 09:08:47.753669 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:08:47 crc kubenswrapper[4792]: I0309 09:08:47.753693 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:08:47 crc kubenswrapper[4792]: I0309 09:08:47.753706 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:08:47Z","lastTransitionTime":"2026-03-09T09:08:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:08:47 crc kubenswrapper[4792]: I0309 09:08:47.856355 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:08:47 crc kubenswrapper[4792]: I0309 09:08:47.856406 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:08:47 crc kubenswrapper[4792]: I0309 09:08:47.856422 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:08:47 crc kubenswrapper[4792]: I0309 09:08:47.856441 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:08:47 crc kubenswrapper[4792]: I0309 09:08:47.856454 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:08:47Z","lastTransitionTime":"2026-03-09T09:08:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:08:47 crc kubenswrapper[4792]: I0309 09:08:47.958628 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:08:47 crc kubenswrapper[4792]: I0309 09:08:47.958689 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:08:47 crc kubenswrapper[4792]: I0309 09:08:47.958702 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:08:47 crc kubenswrapper[4792]: I0309 09:08:47.958715 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:08:47 crc kubenswrapper[4792]: I0309 09:08:47.958725 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:08:47Z","lastTransitionTime":"2026-03-09T09:08:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:08:48 crc kubenswrapper[4792]: I0309 09:08:48.061242 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:08:48 crc kubenswrapper[4792]: I0309 09:08:48.061274 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:08:48 crc kubenswrapper[4792]: I0309 09:08:48.061290 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:08:48 crc kubenswrapper[4792]: I0309 09:08:48.061306 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:08:48 crc kubenswrapper[4792]: I0309 09:08:48.061315 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:08:48Z","lastTransitionTime":"2026-03-09T09:08:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:08:48 crc kubenswrapper[4792]: I0309 09:08:48.065030 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lfm2j" event={"ID":"740550e5-d1a4-4f0c-8efd-1ccd8f9319e5","Type":"ContainerStarted","Data":"72a93106360fd23f597cf8fb963aca72a606f82557a5b17125a969e5b5d5918f"} Mar 09 09:08:48 crc kubenswrapper[4792]: I0309 09:08:48.067186 4792 generic.go:334] "Generic (PLEG): container finished" podID="2b59ff3b-540d-4385-b02b-f68349bb74bf" containerID="42715848a7dace5208b36242c41aaa8f3c96fe5f7f6320b8acd5f2ea52a338bb" exitCode=0 Mar 09 09:08:48 crc kubenswrapper[4792]: I0309 09:08:48.067710 4792 scope.go:117] "RemoveContainer" containerID="420959a5229bf4ed5e6b94cf4a6685b96e4b50d13055cbccf66b6188e47bc770" Mar 09 09:08:48 crc kubenswrapper[4792]: E0309 09:08:48.067875 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 09 09:08:48 crc kubenswrapper[4792]: I0309 09:08:48.067970 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4tprh" event={"ID":"2b59ff3b-540d-4385-b02b-f68349bb74bf","Type":"ContainerDied","Data":"42715848a7dace5208b36242c41aaa8f3c96fe5f7f6320b8acd5f2ea52a338bb"} Mar 09 09:08:48 crc kubenswrapper[4792]: I0309 09:08:48.083324 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da622175994cf951cfa730455ea5a163271bc993fc5a1268ed072b944a612524\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:48Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:48 crc kubenswrapper[4792]: I0309 09:08:48.101287 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:48Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:48 crc kubenswrapper[4792]: I0309 09:08:48.112648 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vgtc9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"626ea896-2e5c-4478-a7be-34a19acc242d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57f7829120f56a8ab9ff342c3d9fd043ca5559518f8d818c0306764f491f4b3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4rw87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vgtc9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:48Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:48 crc kubenswrapper[4792]: I0309 09:08:48.127012 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4tprh" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b59ff3b-540d-4385-b02b-f68349bb74bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54059591e4b093ca2c60bc4ea9f0b0a6c44077e07ebaa66bab76d10f0d2f0f9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54059591e4b093ca2c60bc4ea9f0b0a6c44077e07ebaa66bab76d10f0d2f0f9f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:08:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86174a85f5a0f642224f0a1f175358d157641043ca6e1b88e119e7de171b5582\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86174a85f5a0f642224f0a1f175358d157641043ca6e1b88e119e7de171b5582\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:08:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:08:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb6462fecadedf78b23e5a2276698144ffea837649f3c15664a573c9724f7780\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb6462fecadedf78b23e5a2276698144ffea837649f3c15664a573c9724f7780\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:08:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42715848a7dace5208b36242c41aaa8f3c96fe5f7f6320b8acd5f2ea52a338bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42715848a7dace5208b36242c41aaa8f3c96fe5f7f6320b8acd5f2ea52a338bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:08:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:08:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-4tprh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:48Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:48 crc kubenswrapper[4792]: I0309 09:08:48.137436 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ssk9p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ceed39a1-2e4f-4630-bda0-57071ac26ee4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://648c51e35617b16c1bc2e0f86ea31ee9256943628595864fc27951fce17197cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-
proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7tn4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a6364e9d51e6626caf8b49f387f23eb4bfdfc13e149fb8847b7a1dd1637bf79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7tn4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ssk9p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": 
tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:48Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:48 crc kubenswrapper[4792]: I0309 09:08:48.150189 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4403f41d10eba801e3729ecffdc397e603f3c4899d1a62e02cb35418645d1ca4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disable
d\\\"}]},{\\\"containerID\\\":\\\"cri-o://655b419f4453a7f0fff411f579326c7fda157c08267b1dcf23f7e1d11b684c20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:48Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:48 crc kubenswrapper[4792]: I0309 09:08:48.165702 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:48Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:48 crc kubenswrapper[4792]: I0309 09:08:48.170116 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:08:48 crc kubenswrapper[4792]: I0309 09:08:48.170147 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:08:48 crc kubenswrapper[4792]: I0309 09:08:48.170157 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:08:48 crc kubenswrapper[4792]: I0309 09:08:48.170173 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:08:48 crc kubenswrapper[4792]: I0309 09:08:48.170183 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:08:48Z","lastTransitionTime":"2026-03-09T09:08:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 09:08:48 crc kubenswrapper[4792]: I0309 09:08:48.177670 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:48Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:48 crc kubenswrapper[4792]: I0309 09:08:48.188708 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-97tth" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd11045a-d746-4b42-872c-8b8d1dd2d515\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96248640b95891716f26355ace06d60da675ab3aa8086e6a7c94ad528fc1357d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zdqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d060627a577507a2b0030b6aea753d50e0c6766a
c4876d95ac5d9d3401f9b818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zdqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-97tth\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:48Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:48 crc kubenswrapper[4792]: I0309 09:08:48.198778 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab83a4df67ef256deece5dabe54496035a360833d9b5e926a1045196734d2c24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-09T09:08:48Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:48 crc kubenswrapper[4792]: I0309 09:08:48.213773 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7e46817-10cf-448c-8a2a-154f1c322ce6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1070465f72d99ed22e913112259837db7d789c0a072b40956088f4a70162c41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://314838f53bc19a9f3eb7fd9d3f5473b23a177f2a3068d91f1b0420c27910d409\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://226eecaea6fec5a3ae93063702c719edf3908636a2862b9f50a874f494a19ccf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://420959a5229bf4ed5e6b94cf4a6685b96e4b50d13055cbccf66b6188e47bc770\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://420959a5229bf4ed5e6b94cf4a6685b96e4b50d13055cbccf66b6188e47bc770\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T09:08:27Z\\\",\\\"message\\\":\\\"le observer\\\\nW0309 09:08:27.070870 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0309 09:08:27.070998 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 09:08:27.071818 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2825204129/tls.crt::/tmp/serving-cert-2825204129/tls.key\\\\\\\"\\\\nI0309 09:08:27.485720 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0309 09:08:27.487343 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0309 09:08:27.487361 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0309 09:08:27.487383 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0309 09:08:27.487388 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0309 09:08:27.490810 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0309 09:08:27.490923 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 09:08:27.490952 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 09:08:27.490979 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0309 09:08:27.491005 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0309 09:08:27.491032 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0309 09:08:27.491060 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0309 09:08:27.490826 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0309 09:08:27.491911 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T09:08:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf484f9a832b0e147a17ec53e64cfcda5e37f8bf1f764ddc35215a079994b71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:18Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e250b106997f151ae9c435aca2ab3d3d821f40e826afa9ff744443a6b808571\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e250b106997f151ae9c435aca2ab3d3d821f40e826afa9ff744443a6b808571\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:07:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:07:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:48Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:48 crc kubenswrapper[4792]: I0309 09:08:48.227027 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fgk47" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84b6eb44-ca33-41a6-a951-2c66688ad860\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3b4aea0ac4cc05af599b411f32e712d0955e5052afc55c0d0be66e9a1249223\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cpkhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fgk47\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:48Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:48 crc kubenswrapper[4792]: I0309 09:08:48.246268 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lfm2j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"740550e5-d1a4-4f0c-8efd-1ccd8f9319e5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9abf97193d88754a49bcc12e1cfbe83371778e157d028dd3d1084abfa8526822\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9abf97193d88754a49bcc12e1cfbe83371778e157d028dd3d1084abfa8526822\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:08:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lfm2j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:48Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:48 crc kubenswrapper[4792]: I0309 09:08:48.257210 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-k4kdn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cb7634b-66b7-4541-8e53-3e01a6cb41ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d395567d530702b468a31ec780cf0dcf356e2a07f4a28ee50329b702d8e53594\\\",\\\"image\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5jf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-k4kdn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:48Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:48 crc kubenswrapper[4792]: I0309 09:08:48.268234 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fttpc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4711cce5-88a9-48c4-8e2e-522062e34a03\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjrbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjrbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fttpc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:48Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:48 crc 
kubenswrapper[4792]: I0309 09:08:48.272626 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:08:48 crc kubenswrapper[4792]: I0309 09:08:48.272668 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:08:48 crc kubenswrapper[4792]: I0309 09:08:48.272679 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:08:48 crc kubenswrapper[4792]: I0309 09:08:48.272695 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:08:48 crc kubenswrapper[4792]: I0309 09:08:48.272704 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:08:48Z","lastTransitionTime":"2026-03-09T09:08:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:08:48 crc kubenswrapper[4792]: I0309 09:08:48.375103 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:08:48 crc kubenswrapper[4792]: I0309 09:08:48.375146 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:08:48 crc kubenswrapper[4792]: I0309 09:08:48.375158 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:08:48 crc kubenswrapper[4792]: I0309 09:08:48.375176 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:08:48 crc kubenswrapper[4792]: I0309 09:08:48.375188 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:08:48Z","lastTransitionTime":"2026-03-09T09:08:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:08:48 crc kubenswrapper[4792]: I0309 09:08:48.477645 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:08:48 crc kubenswrapper[4792]: I0309 09:08:48.477673 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:08:48 crc kubenswrapper[4792]: I0309 09:08:48.477682 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:08:48 crc kubenswrapper[4792]: I0309 09:08:48.477699 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:08:48 crc kubenswrapper[4792]: I0309 09:08:48.477709 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:08:48Z","lastTransitionTime":"2026-03-09T09:08:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:08:48 crc kubenswrapper[4792]: I0309 09:08:48.580477 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:08:48 crc kubenswrapper[4792]: I0309 09:08:48.580529 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:08:48 crc kubenswrapper[4792]: I0309 09:08:48.580545 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:08:48 crc kubenswrapper[4792]: I0309 09:08:48.580568 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:08:48 crc kubenswrapper[4792]: I0309 09:08:48.580582 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:08:48Z","lastTransitionTime":"2026-03-09T09:08:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 09:08:48 crc kubenswrapper[4792]: I0309 09:08:48.662120 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 09:08:48 crc kubenswrapper[4792]: I0309 09:08:48.662154 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fttpc" Mar 09 09:08:48 crc kubenswrapper[4792]: I0309 09:08:48.662163 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 09:08:48 crc kubenswrapper[4792]: E0309 09:08:48.662244 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 09:08:48 crc kubenswrapper[4792]: I0309 09:08:48.662172 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 09:08:48 crc kubenswrapper[4792]: E0309 09:08:48.662390 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fttpc" podUID="4711cce5-88a9-48c4-8e2e-522062e34a03" Mar 09 09:08:48 crc kubenswrapper[4792]: E0309 09:08:48.662468 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 09:08:48 crc kubenswrapper[4792]: E0309 09:08:48.662516 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 09:08:48 crc kubenswrapper[4792]: I0309 09:08:48.670823 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Mar 09 09:08:48 crc kubenswrapper[4792]: I0309 09:08:48.682504 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:08:48 crc kubenswrapper[4792]: I0309 09:08:48.682533 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:08:48 crc kubenswrapper[4792]: I0309 09:08:48.682541 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:08:48 crc kubenswrapper[4792]: I0309 09:08:48.682555 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:08:48 crc kubenswrapper[4792]: I0309 09:08:48.682564 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:08:48Z","lastTransitionTime":"2026-03-09T09:08:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:08:48 crc kubenswrapper[4792]: I0309 09:08:48.784707 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:08:48 crc kubenswrapper[4792]: I0309 09:08:48.784752 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:08:48 crc kubenswrapper[4792]: I0309 09:08:48.784764 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:08:48 crc kubenswrapper[4792]: I0309 09:08:48.784784 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:08:48 crc kubenswrapper[4792]: I0309 09:08:48.784796 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:08:48Z","lastTransitionTime":"2026-03-09T09:08:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:08:48 crc kubenswrapper[4792]: I0309 09:08:48.890129 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:08:48 crc kubenswrapper[4792]: I0309 09:08:48.890171 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:08:48 crc kubenswrapper[4792]: I0309 09:08:48.890187 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:08:48 crc kubenswrapper[4792]: I0309 09:08:48.890208 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:08:48 crc kubenswrapper[4792]: I0309 09:08:48.890225 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:08:48Z","lastTransitionTime":"2026-03-09T09:08:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:08:48 crc kubenswrapper[4792]: I0309 09:08:48.992339 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:08:48 crc kubenswrapper[4792]: I0309 09:08:48.992362 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:08:48 crc kubenswrapper[4792]: I0309 09:08:48.992370 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:08:48 crc kubenswrapper[4792]: I0309 09:08:48.992382 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:08:48 crc kubenswrapper[4792]: I0309 09:08:48.992391 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:08:48Z","lastTransitionTime":"2026-03-09T09:08:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:08:49 crc kubenswrapper[4792]: I0309 09:08:49.075364 4792 generic.go:334] "Generic (PLEG): container finished" podID="2b59ff3b-540d-4385-b02b-f68349bb74bf" containerID="7b4eb0778c3ebce9cfec4abfead327d02f8d401aa11ce1578a7cbba1b9fe8854" exitCode=0 Mar 09 09:08:49 crc kubenswrapper[4792]: I0309 09:08:49.076221 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4tprh" event={"ID":"2b59ff3b-540d-4385-b02b-f68349bb74bf","Type":"ContainerDied","Data":"7b4eb0778c3ebce9cfec4abfead327d02f8d401aa11ce1578a7cbba1b9fe8854"} Mar 09 09:08:49 crc kubenswrapper[4792]: I0309 09:08:49.095595 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7e46817-10cf-448c-8a2a-154f1c322ce6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1070465f72d99ed22e913112259837db7d789c0a072b40956088f4a70162c41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://314838f53bc19a9f3eb7fd9d3f5473b23a177f2a3068d91f1b0420c27910d409\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://226eecaea6fec5a3ae93063702c719edf3908636a2862b9f50a874f494a19ccf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://420959a5229bf4ed5e6b94cf4a6685b96e4b50d13055cbccf66b6188e47bc770\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://420959a5229bf4ed5e6b94cf4a6685b96e4b50d13055cbccf66b6188e47bc770\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T09:08:27Z\\\",\\\"message\\\":\\\"le observer\\\\nW0309 09:08:27.070870 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0309 09:08:27.070998 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 09:08:27.071818 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2825204129/tls.crt::/tmp/serving-cert-2825204129/tls.key\\\\\\\"\\\\nI0309 09:08:27.485720 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0309 09:08:27.487343 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0309 09:08:27.487361 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0309 09:08:27.487383 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0309 09:08:27.487388 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0309 09:08:27.490810 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0309 09:08:27.490923 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 09:08:27.490952 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 09:08:27.490979 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0309 09:08:27.491005 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0309 09:08:27.491032 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0309 09:08:27.491060 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0309 09:08:27.490826 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0309 09:08:27.491911 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T09:08:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf484f9a832b0e147a17ec53e64cfcda5e37f8bf1f764ddc35215a079994b71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:18Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e250b106997f151ae9c435aca2ab3d3d821f40e826afa9ff744443a6b808571\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e250b106997f151ae9c435aca2ab3d3d821f40e826afa9ff744443a6b808571\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:07:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:07:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:49Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:49 crc kubenswrapper[4792]: I0309 09:08:49.096165 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:08:49 crc kubenswrapper[4792]: I0309 09:08:49.096204 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:08:49 crc kubenswrapper[4792]: I0309 09:08:49.096222 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:08:49 crc kubenswrapper[4792]: I0309 09:08:49.096245 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:08:49 crc kubenswrapper[4792]: I0309 09:08:49.096292 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:08:49Z","lastTransitionTime":"2026-03-09T09:08:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:08:49 crc kubenswrapper[4792]: I0309 09:08:49.111455 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fgk47" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84b6eb44-ca33-41a6-a951-2c66688ad860\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3b4aea0ac4cc05af599b411f32e712d0955e5052afc55c0d0be66e9a1249223\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cpkhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fgk47\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:49Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:49 crc kubenswrapper[4792]: I0309 09:08:49.137344 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lfm2j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"740550e5-d1a4-4f0c-8efd-1ccd8f9319e5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9abf97193d88754a49bcc12e1cfbe83371778e157d028dd3d1084abfa8526822\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\"
:{\\\"containerID\\\":\\\"cri-o://9abf97193d88754a49bcc12e1cfbe83371778e157d028dd3d1084abfa8526822\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:08:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lfm2j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:49Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:49 crc kubenswrapper[4792]: I0309 09:08:49.148608 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"274366f4-bdf7-4516-9559-b90b9947999e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f6d16099e4ca2921b039c7fac87c3a9b3ee4780783ad52207c77ea2891942d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76e023bbfb6d4a1c42830654b83b26c24cddefd808ee765ebef670e8b10910b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76e023bbfb6d4a1c42830654b83b26c24cddefd808ee765ebef670e8b10910b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:07:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:07:15Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:49Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:49 crc kubenswrapper[4792]: I0309 09:08:49.163252 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-k4kdn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cb7634b-66b7-4541-8e53-3e01a6cb41ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d395567d530702b468a31ec780cf0dcf356e2a07f4a28ee50329b702d8e53594\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5jf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-k4kdn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:49Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:49 crc kubenswrapper[4792]: I0309 09:08:49.198544 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:08:49 crc kubenswrapper[4792]: I0309 09:08:49.198580 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:08:49 crc kubenswrapper[4792]: I0309 09:08:49.198590 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:08:49 crc kubenswrapper[4792]: I0309 09:08:49.198605 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:08:49 crc kubenswrapper[4792]: I0309 09:08:49.198615 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:08:49Z","lastTransitionTime":"2026-03-09T09:08:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:08:49 crc kubenswrapper[4792]: I0309 09:08:49.202644 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fttpc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4711cce5-88a9-48c4-8e2e-522062e34a03\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjrbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjrbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fttpc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:49Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:49 crc 
kubenswrapper[4792]: I0309 09:08:49.217416 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da622175994cf951cfa730455ea5a163271bc993fc5a1268ed072b944a612524\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:49Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:49 crc kubenswrapper[4792]: I0309 09:08:49.230123 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:49Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:49 crc kubenswrapper[4792]: I0309 09:08:49.245602 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vgtc9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"626ea896-2e5c-4478-a7be-34a19acc242d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57f7829120f56a8ab9ff342c3d9fd043ca5559518f8d818c0306764f491f4b3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4rw87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vgtc9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:49Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:49 crc kubenswrapper[4792]: I0309 09:08:49.262216 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4tprh" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b59ff3b-540d-4385-b02b-f68349bb74bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54059591e4b093ca2c60bc4ea9f0b0a6c44077e07ebaa66bab76d10f0d2f0f9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54059591e4b093ca2c60bc4ea9f0b0a6c44077e07ebaa66bab76d10f0d2f0f9f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:08:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86174a85f5a0f642224f0a1f175358d157641043ca6e1b88e119e7de171b5582\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86174a85f5a0f642224f0a1f175358d157641043ca6e1b88e119e7de171b5582\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:08:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:08:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb6462fecadedf78b23e5a2276698144ffea837649f3c15664a573c9724f7780\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb6462fecadedf78b23e5a2276698144ffea837649f3c15664a573c9724f7780\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:08:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42715848a7dace5208b36242c41aaa8f3c96fe5f7f6320b8acd5f2ea52a338bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42715848a7dace5208b36242c41aaa8f3c96fe5f7f6320b8acd5f2ea52a338bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:08:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:08:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b4eb0778c3ebce9cfec4abfead327d02f8d401aa11ce1578a7cbba1b9fe8854\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b4eb0778c3ebce9cfec4abfead327d02f8d401aa11ce1578a7cbba1b9fe8854\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:08:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4tprh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:49Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:49 crc kubenswrapper[4792]: I0309 09:08:49.274751 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ssk9p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ceed39a1-2e4f-4630-bda0-57071ac26ee4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://648c51e35617b16c1bc2e0f86ea31ee9256943628595864fc27951fce17197cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7tn4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a6364e9d51e6626caf8b49f387f23eb4bfdf
c13e149fb8847b7a1dd1637bf79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7tn4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ssk9p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:49Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:49 crc kubenswrapper[4792]: I0309 09:08:49.290967 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4403f41d10eba801e3729ecffdc397e603f3c4899d1a62e02cb35418645d1ca4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://655b419f4453a7f0fff411f579326c7fda157c08267b1dcf23f7e1d11b684c20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:49Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:49 crc kubenswrapper[4792]: I0309 09:08:49.300729 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:08:49 crc kubenswrapper[4792]: I0309 09:08:49.300754 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:08:49 crc kubenswrapper[4792]: I0309 09:08:49.300764 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:08:49 crc kubenswrapper[4792]: I0309 09:08:49.300782 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:08:49 crc kubenswrapper[4792]: I0309 09:08:49.300794 4792 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:08:49Z","lastTransitionTime":"2026-03-09T09:08:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 09:08:49 crc kubenswrapper[4792]: I0309 09:08:49.304712 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:49Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:49 crc kubenswrapper[4792]: I0309 09:08:49.324392 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:49Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:49 crc kubenswrapper[4792]: I0309 09:08:49.336313 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-97tth" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd11045a-d746-4b42-872c-8b8d1dd2d515\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96248640b95891716f26355ace06d60da675ab3aa8086e6a7c94ad528fc1357d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zdqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d060627a577507a2b0030b6aea753d50e0c6766a
c4876d95ac5d9d3401f9b818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zdqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-97tth\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:49Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:49 crc kubenswrapper[4792]: I0309 09:08:49.353588 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab83a4df67ef256deece5dabe54496035a360833d9b5e926a1045196734d2c24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-09T09:08:49Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:49 crc kubenswrapper[4792]: I0309 09:08:49.402645 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:08:49 crc kubenswrapper[4792]: I0309 09:08:49.402681 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:08:49 crc kubenswrapper[4792]: I0309 09:08:49.402690 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:08:49 crc kubenswrapper[4792]: I0309 09:08:49.402704 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:08:49 crc kubenswrapper[4792]: I0309 09:08:49.402715 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:08:49Z","lastTransitionTime":"2026-03-09T09:08:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:08:49 crc kubenswrapper[4792]: I0309 09:08:49.504891 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:08:49 crc kubenswrapper[4792]: I0309 09:08:49.504927 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:08:49 crc kubenswrapper[4792]: I0309 09:08:49.504936 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:08:49 crc kubenswrapper[4792]: I0309 09:08:49.504950 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:08:49 crc kubenswrapper[4792]: I0309 09:08:49.504959 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:08:49Z","lastTransitionTime":"2026-03-09T09:08:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:08:49 crc kubenswrapper[4792]: I0309 09:08:49.607717 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:08:49 crc kubenswrapper[4792]: I0309 09:08:49.607745 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:08:49 crc kubenswrapper[4792]: I0309 09:08:49.607753 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:08:49 crc kubenswrapper[4792]: I0309 09:08:49.607767 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:08:49 crc kubenswrapper[4792]: I0309 09:08:49.607775 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:08:49Z","lastTransitionTime":"2026-03-09T09:08:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:08:49 crc kubenswrapper[4792]: I0309 09:08:49.710894 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:08:49 crc kubenswrapper[4792]: I0309 09:08:49.710957 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:08:49 crc kubenswrapper[4792]: I0309 09:08:49.710974 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:08:49 crc kubenswrapper[4792]: I0309 09:08:49.710998 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:08:49 crc kubenswrapper[4792]: I0309 09:08:49.711014 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:08:49Z","lastTransitionTime":"2026-03-09T09:08:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:08:49 crc kubenswrapper[4792]: I0309 09:08:49.813970 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:08:49 crc kubenswrapper[4792]: I0309 09:08:49.814004 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:08:49 crc kubenswrapper[4792]: I0309 09:08:49.814014 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:08:49 crc kubenswrapper[4792]: I0309 09:08:49.814030 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:08:49 crc kubenswrapper[4792]: I0309 09:08:49.814043 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:08:49Z","lastTransitionTime":"2026-03-09T09:08:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:08:49 crc kubenswrapper[4792]: I0309 09:08:49.920948 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:08:49 crc kubenswrapper[4792]: I0309 09:08:49.920974 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:08:49 crc kubenswrapper[4792]: I0309 09:08:49.920982 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:08:49 crc kubenswrapper[4792]: I0309 09:08:49.920994 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:08:49 crc kubenswrapper[4792]: I0309 09:08:49.921002 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:08:49Z","lastTransitionTime":"2026-03-09T09:08:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:08:50 crc kubenswrapper[4792]: I0309 09:08:50.023419 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:08:50 crc kubenswrapper[4792]: I0309 09:08:50.023446 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:08:50 crc kubenswrapper[4792]: I0309 09:08:50.023454 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:08:50 crc kubenswrapper[4792]: I0309 09:08:50.023466 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:08:50 crc kubenswrapper[4792]: I0309 09:08:50.023475 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:08:50Z","lastTransitionTime":"2026-03-09T09:08:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:08:50 crc kubenswrapper[4792]: I0309 09:08:50.081104 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lfm2j" event={"ID":"740550e5-d1a4-4f0c-8efd-1ccd8f9319e5","Type":"ContainerStarted","Data":"f07e7f862027f74671d62bbe39e3c53e3422377c13efbf27056f0c4d12ff522a"} Mar 09 09:08:50 crc kubenswrapper[4792]: I0309 09:08:50.081471 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-lfm2j" Mar 09 09:08:50 crc kubenswrapper[4792]: I0309 09:08:50.081491 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-lfm2j" Mar 09 09:08:50 crc kubenswrapper[4792]: I0309 09:08:50.081500 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-lfm2j" Mar 09 09:08:50 crc kubenswrapper[4792]: I0309 09:08:50.085287 4792 generic.go:334] "Generic (PLEG): container finished" podID="2b59ff3b-540d-4385-b02b-f68349bb74bf" containerID="87e9a77e1689ff7f403824349c08b6d6c85b408875ff852f56090cf3fe2bb487" exitCode=0 Mar 09 09:08:50 crc kubenswrapper[4792]: I0309 09:08:50.085523 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4tprh" event={"ID":"2b59ff3b-540d-4385-b02b-f68349bb74bf","Type":"ContainerDied","Data":"87e9a77e1689ff7f403824349c08b6d6c85b408875ff852f56090cf3fe2bb487"} Mar 09 09:08:50 crc kubenswrapper[4792]: I0309 09:08:50.096562 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4tprh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b59ff3b-540d-4385-b02b-f68349bb74bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54059591e4b093ca2c60bc4ea9f0b0a6c44077e07ebaa66bab76d10f0d2f0f9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54059591e4b093ca2c60bc4ea9f0b0a6c44077e07ebaa66bab76d10f0d2f0f9f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:08:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86174a85f5a0f642224f0a1f175358d157641043ca6e1b88e119e7de171b5582\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86174a85f5a0f642224f0a1f175358d157641043ca6e1b88e119e7de171b5582\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:08:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:08:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb6462fecadedf78b23e5a2276698144ffea837649f3c15664a573c9724f7780\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb6462fecadedf78b23e5a2276698144ffea837649f3c15664a573c9724f7780\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:08:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42715848a7dace5208b36242c41aaa8f3c96fe5f7f6320b8acd5f2ea52a338bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42715848a7dace5208b36242c41aaa8f3c96fe5f7f6320b8acd5f2ea52a338bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:08:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:08:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b4eb0778c3ebce9cfec4abfead327d02f8d401aa11ce1578a7cbba1b9fe8854\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b4eb0778c3ebce9cfec4abfead327d02f8d401aa11ce1578a7cbba1b9fe8854\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:08:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4tprh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:50Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:50 crc kubenswrapper[4792]: I0309 09:08:50.106500 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ssk9p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ceed39a1-2e4f-4630-bda0-57071ac26ee4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://648c51e35617b16c1bc2e0f86ea31ee9256943628595864fc27951fce17197cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7tn4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a6364e9d51e6626caf8b49f387f23eb4bfdf
c13e149fb8847b7a1dd1637bf79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7tn4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ssk9p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:50Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:50 crc kubenswrapper[4792]: I0309 09:08:50.108104 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-lfm2j" Mar 09 09:08:50 crc kubenswrapper[4792]: I0309 09:08:50.110886 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-lfm2j" Mar 09 09:08:50 crc kubenswrapper[4792]: I0309 09:08:50.125594 4792 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:08:50 crc kubenswrapper[4792]: I0309 09:08:50.125623 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:08:50 crc kubenswrapper[4792]: I0309 09:08:50.125630 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:08:50 crc kubenswrapper[4792]: I0309 09:08:50.125643 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:08:50 crc kubenswrapper[4792]: I0309 09:08:50.125652 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:08:50Z","lastTransitionTime":"2026-03-09T09:08:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:08:50 crc kubenswrapper[4792]: I0309 09:08:50.126494 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da622175994cf951cfa730455ea5a163271bc993fc5a1268ed072b944a612524\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:50Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:50 crc kubenswrapper[4792]: I0309 09:08:50.137632 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:50Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:50 crc kubenswrapper[4792]: I0309 09:08:50.148983 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vgtc9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"626ea896-2e5c-4478-a7be-34a19acc242d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57f7829120f56a8ab9ff342c3d9fd043ca5559518f8d818c0306764f491f4b3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4rw87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vgtc9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:50Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:50 crc kubenswrapper[4792]: I0309 09:08:50.161647 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab83a4df67ef256deece5dabe54496035a360833d9b5e926a1045196734d2c24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2026-03-09T09:08:50Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:50 crc kubenswrapper[4792]: I0309 09:08:50.174012 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4403f41d10eba801e3729ecffdc397e603f3c4899d1a62e02cb35418645d1ca4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o:/
/655b419f4453a7f0fff411f579326c7fda157c08267b1dcf23f7e1d11b684c20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:50Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:50 crc kubenswrapper[4792]: I0309 09:08:50.184951 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:50Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:50 crc kubenswrapper[4792]: I0309 09:08:50.197349 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:50Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:50 crc kubenswrapper[4792]: I0309 09:08:50.206873 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-97tth" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd11045a-d746-4b42-872c-8b8d1dd2d515\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96248640b95891716f26355ace06d60da675ab3aa8086e6a7c94ad528fc1357d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zdqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d060627a577507a2b0030b6aea753d50e0c6766a
c4876d95ac5d9d3401f9b818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zdqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-97tth\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:50Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:50 crc kubenswrapper[4792]: I0309 09:08:50.225841 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lfm2j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"740550e5-d1a4-4f0c-8efd-1ccd8f9319e5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d93911614f6785ac12751349f50ab00c0716955c72dc48866083013e172cf3c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e0f50edd29f0791cc076a3a2974b7456aaf2a96534da791b083248dc84fa6af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c80e0ef9426b7a764e7117a789d07cf4cf940a90f38fe3ce6b230f9bbd21bfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f72b0194cacf6d5d0c95ba804286d822d2f2e5a0f385c4c5fdf8559bf6240c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:44Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edf00db622558a346d238b2df6e90a686dc913634a1b5b4e8b010b5bf09a7290\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5fc091b21251a54d9eca892667bd681e944b35b6407316a8252562e837a1e265\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f07e7f862027f74671d62bbe39e3c53e3422377c13efbf27056f0c4d12ff522a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72a93106360fd23f597cf8fb963aca72a606f82557a5b17125a969e5b5d5918f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9abf97193d88754a49bcc12e1cfbe83371778e157d028dd3d1084abfa8526822\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9abf97193d88754a49bcc12e1cfbe83371778e157d028dd3d1084abfa8526822\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:08:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Ru
nning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lfm2j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:50Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:50 crc kubenswrapper[4792]: I0309 09:08:50.230054 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:08:50 crc kubenswrapper[4792]: I0309 09:08:50.230117 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:08:50 crc kubenswrapper[4792]: I0309 09:08:50.230127 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:08:50 crc kubenswrapper[4792]: I0309 09:08:50.230143 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:08:50 crc kubenswrapper[4792]: I0309 09:08:50.230154 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:08:50Z","lastTransitionTime":"2026-03-09T09:08:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:08:50 crc kubenswrapper[4792]: I0309 09:08:50.238821 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7e46817-10cf-448c-8a2a-154f1c322ce6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1070465f72d99ed22e913112259837db7d789c0a072b40956088f4a70162c41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://314838f53bc19a9f3eb7fd9d3f5473b23a177f2a3068d91f1b0420c27910d409\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://226eecaea6fec5a3ae93063702c719edf3908636a2862b9f50a874f494a19ccf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://420959a5229bf4ed5e6b94cf4a6685b96e4b50d13055cbccf66b6188e47bc770\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://420959a5229bf4ed5e6b94cf4a6685b96e4b50d13055cbccf66b6188e47bc770\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T09:08:27Z\\\",\\\"message\\\":\\\"le observer\\\\nW0309 09:08:27.070870 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0309 09:08:27.070998 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 09:08:27.071818 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2825204129/tls.crt::/tmp/serving-cert-2825204129/tls.key\\\\\\\"\\\\nI0309 09:08:27.485720 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0309 09:08:27.487343 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0309 09:08:27.487361 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0309 09:08:27.487383 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0309 09:08:27.487388 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0309 09:08:27.490810 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0309 09:08:27.490923 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 09:08:27.490952 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 09:08:27.490979 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0309 09:08:27.491005 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0309 09:08:27.491032 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0309 09:08:27.491060 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0309 09:08:27.490826 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0309 09:08:27.491911 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T09:08:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf484f9a832b0e147a17ec53e64cfcda5e37f8bf1f764ddc35215a079994b71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:18Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e250b106997f151ae9c435aca2ab3d3d821f40e826afa9ff744443a6b808571\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e250b106997f151ae9c435aca2ab3d3d821f40e826afa9ff744443a6b808571\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:07:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:07:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:50Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:50 crc kubenswrapper[4792]: I0309 09:08:50.292874 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fgk47" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84b6eb44-ca33-41a6-a951-2c66688ad860\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3b4aea0ac4cc05af599b411f32e712d0955e5052afc55c0d0be66e9a1249223\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cpkhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fgk47\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:50Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:50 crc kubenswrapper[4792]: I0309 09:08:50.301316 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"274366f4-bdf7-4516-9559-b90b9947999e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f6d16099e4ca2921b039c7fac87c3a9b3ee4780783ad52207c77ea2891942d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76e023bbfb6d4a1c42830654b83b26c24cddefd808ee765ebef670e8b10910b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76e023bbfb6d4a1c42830654b83b26c24cddefd808ee765ebef670e8b10910b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:07:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:07:15Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:50Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:50 crc kubenswrapper[4792]: I0309 09:08:50.312433 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-k4kdn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cb7634b-66b7-4541-8e53-3e01a6cb41ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d395567d530702b468a31ec780cf0dcf356e2a07f4a28ee50329b702d8e53594\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5jf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-k4kdn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:50Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:50 crc kubenswrapper[4792]: I0309 09:08:50.323996 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fttpc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4711cce5-88a9-48c4-8e2e-522062e34a03\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjrbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjrbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fttpc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:50Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:50 crc 
kubenswrapper[4792]: I0309 09:08:50.332226 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:08:50 crc kubenswrapper[4792]: I0309 09:08:50.332258 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:08:50 crc kubenswrapper[4792]: I0309 09:08:50.332268 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:08:50 crc kubenswrapper[4792]: I0309 09:08:50.332282 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:08:50 crc kubenswrapper[4792]: I0309 09:08:50.332292 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:08:50Z","lastTransitionTime":"2026-03-09T09:08:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:08:50 crc kubenswrapper[4792]: I0309 09:08:50.332459 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fttpc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4711cce5-88a9-48c4-8e2e-522062e34a03\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjrbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjrbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fttpc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:50Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:50 crc 
kubenswrapper[4792]: I0309 09:08:50.342794 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"274366f4-bdf7-4516-9559-b90b9947999e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f6d16099e4ca2921b039c7fac87c3a9b3ee4780783ad52207c77ea2891942d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\
":\\\"cri-o://76e023bbfb6d4a1c42830654b83b26c24cddefd808ee765ebef670e8b10910b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76e023bbfb6d4a1c42830654b83b26c24cddefd808ee765ebef670e8b10910b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:07:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:07:15Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:50Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:50 crc kubenswrapper[4792]: I0309 09:08:50.352185 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-k4kdn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cb7634b-66b7-4541-8e53-3e01a6cb41ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d395567d530702b468a31ec780cf0dcf356e2a07f4a28ee50329b702d8e53594\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5jf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-k4kdn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:50Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:50 crc kubenswrapper[4792]: I0309 09:08:50.362462 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:50Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:50 crc kubenswrapper[4792]: I0309 09:08:50.373018 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vgtc9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"626ea896-2e5c-4478-a7be-34a19acc242d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57f7829120f56a8ab9ff342c3d9fd043ca5559518f8d818c0306764f491f4b3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4rw87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vgtc9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:50Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:50 crc kubenswrapper[4792]: I0309 09:08:50.386729 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4tprh" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b59ff3b-540d-4385-b02b-f68349bb74bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54059591e4b093ca2c60bc4ea9f0b0a6c44077e07ebaa66bab76d10f0d2f0f9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54059591e4b093ca2c60bc4ea9f0b0a6c44077e07ebaa66bab76d10f0d2f0f9f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:08:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86174a85f5a0f642224f0a1f175358d157641043ca6e1b88e119e7de171b5582\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86174a85f5a0f642224f0a1f175358d157641043ca6e1b88e119e7de171b5582\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:08:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:08:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb6462fecadedf78b23e5a2276698144ffea837649f3c15664a573c9724f7780\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb6462fecadedf78b23e5a2276698144ffea837649f3c15664a573c9724f7780\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:08:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42715848a7dace5208b36242c41aaa8f3c96fe5f7f6320b8acd5f2ea52a338bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42715848a7dace5208b36242c41aaa8f3c96fe5f7f6320b8acd5f2ea52a338bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:08:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:08:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b4eb0778c3ebce9cfec4abfead327d02f8d401aa11ce1578a7cbba1b9fe8854\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b4eb0778c3ebce9cfec4abfead327d02f8d401aa11ce1578a7cbba1b9fe8854\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:08:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87e9a77e1689ff7f403824349c08b6d6c85b408875ff852f56090cf3fe2bb487\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87e9a77e1689ff7f403824349c08b6d6c85b408875ff852f56090cf3fe2bb487\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:08:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:08:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4tprh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:50Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:50 crc kubenswrapper[4792]: I0309 09:08:50.397645 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ssk9p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ceed39a1-2e4f-4630-bda0-57071ac26ee4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://648c51e35617b16c1bc2e0f86ea31ee9256943628595864fc27951fce17197cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7tn4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a6364e9d51e6626caf8b49f387f23eb4bfdf
c13e149fb8847b7a1dd1637bf79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7tn4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ssk9p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:50Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:50 crc kubenswrapper[4792]: I0309 09:08:50.408180 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da622175994cf951cfa730455ea5a163271bc993fc5a1268ed072b944a612524\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:50Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:50 crc kubenswrapper[4792]: I0309 09:08:50.418058 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:50Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:50 crc kubenswrapper[4792]: I0309 09:08:50.428237 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-97tth" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd11045a-d746-4b42-872c-8b8d1dd2d515\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96248640b95891716f26355ace06d60da675ab3aa8086e6a7c94ad528fc1357d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zdqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d060627a577507a2b0030b6aea753d50e0c6766a
c4876d95ac5d9d3401f9b818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zdqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-97tth\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:50Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:50 crc kubenswrapper[4792]: I0309 09:08:50.435397 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:08:50 crc kubenswrapper[4792]: I0309 09:08:50.435429 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:08:50 crc kubenswrapper[4792]: I0309 09:08:50.435436 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:08:50 crc 
kubenswrapper[4792]: I0309 09:08:50.435450 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:08:50 crc kubenswrapper[4792]: I0309 09:08:50.435459 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:08:50Z","lastTransitionTime":"2026-03-09T09:08:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 09:08:50 crc kubenswrapper[4792]: I0309 09:08:50.437659 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab83a4df67ef256deece5dabe54496035a360833d9b5e926a1045196734d2c24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:50Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:50 crc kubenswrapper[4792]: I0309 09:08:50.447317 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4403f41d10eba801e3729ecffdc397e603f3c4899d1a62e02cb35418645d1ca4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://655b419f4453a7f0fff411f579326c7fda157c08267b1dcf23f7e1d11b684c20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:50Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:50 crc kubenswrapper[4792]: I0309 09:08:50.458563 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:50Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:50 crc kubenswrapper[4792]: I0309 09:08:50.466646 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fgk47" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"84b6eb44-ca33-41a6-a951-2c66688ad860\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3b4aea0ac4cc05af599b411f32e712d0955e5052afc55c0d0be66e9a1249223\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cpkhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fgk47\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:50Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:50 crc kubenswrapper[4792]: I0309 09:08:50.482112 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lfm2j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"740550e5-d1a4-4f0c-8efd-1ccd8f9319e5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d93911614f6785ac12751349f50ab00c0716955c72dc48866083013e172cf3c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e0f50edd29f0791cc076a3a2974b7456aaf2a96534da791b083248dc84fa6af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c80e0ef9426b7a764e7117a789d07cf4cf940a90f38fe3ce6b230f9bbd21bfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f72b0194cacf6d5d0c95ba804286d822d2f2e5a0f385c4c5fdf8559bf6240c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:44Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edf00db622558a346d238b2df6e90a686dc913634a1b5b4e8b010b5bf09a7290\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5fc091b21251a54d9eca892667bd681e944b35b6407316a8252562e837a1e265\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f07e7f862027f74671d62bbe39e3c53e3422377c13efbf27056f0c4d12ff522a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72a93106360fd23f597cf8fb963aca72a606f82557a5b17125a969e5b5d5918f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9abf97193d88754a49bcc12e1cfbe83371778e157d028dd3d1084abfa8526822\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9abf97193d88754a49bcc12e1cfbe83371778e157d028dd3d1084abfa8526822\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:08:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lfm2j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:50Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:50 crc kubenswrapper[4792]: I0309 09:08:50.497738 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7e46817-10cf-448c-8a2a-154f1c322ce6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1070465f72d99ed22e913112259837db7d789c0a072b40956088f4a70162c41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://314838f53bc19a9f3eb7fd9d3f5473b23a177f2a3068d91f1b0420c27910d409\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://226eecaea6fec5a3ae93063702c719edf3908636a2862b9f50a874f494a19ccf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://420959a5229bf4ed5e6b94cf4a6685b96e4b50d13055cbccf66b6188e47bc770\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://420959a5229bf4ed5e6b94cf4a6685b96e4b50d13055cbccf66b6188e47bc770\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T09:08:27Z\\\",\\\"message\\\":\\\"le observer\\\\nW0309 09:08:27.070870 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0309 09:08:27.070998 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 09:08:27.071818 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2825204129/tls.crt::/tmp/serving-cert-2825204129/tls.key\\\\\\\"\\\\nI0309 09:08:27.485720 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0309 09:08:27.487343 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0309 09:08:27.487361 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0309 09:08:27.487383 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0309 09:08:27.487388 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0309 09:08:27.490810 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0309 09:08:27.490923 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 09:08:27.490952 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 09:08:27.490979 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0309 09:08:27.491005 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0309 09:08:27.491032 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0309 09:08:27.491060 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0309 09:08:27.490826 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0309 09:08:27.491911 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T09:08:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf484f9a832b0e147a17ec53e64cfcda5e37f8bf1f764ddc35215a079994b71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:18Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e250b106997f151ae9c435aca2ab3d3d821f40e826afa9ff744443a6b808571\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e250b106997f151ae9c435aca2ab3d3d821f40e826afa9ff744443a6b808571\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:07:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:07:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:50Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:50 crc kubenswrapper[4792]: I0309 09:08:50.537229 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:08:50 crc kubenswrapper[4792]: I0309 09:08:50.537284 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:08:50 crc kubenswrapper[4792]: I0309 09:08:50.537297 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:08:50 crc kubenswrapper[4792]: I0309 09:08:50.537315 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:08:50 crc kubenswrapper[4792]: I0309 09:08:50.537326 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:08:50Z","lastTransitionTime":"2026-03-09T09:08:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:08:50 crc kubenswrapper[4792]: I0309 09:08:50.571738 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 09:08:50 crc kubenswrapper[4792]: I0309 09:08:50.571795 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 09:08:50 crc kubenswrapper[4792]: I0309 09:08:50.571831 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 09:08:50 crc kubenswrapper[4792]: E0309 09:08:50.571902 4792 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 09 09:08:50 crc kubenswrapper[4792]: E0309 09:08:50.571935 4792 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 09 09:08:50 crc kubenswrapper[4792]: E0309 09:08:50.571937 4792 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: 
object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 09 09:08:50 crc kubenswrapper[4792]: E0309 09:08:50.571950 4792 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 09 09:08:50 crc kubenswrapper[4792]: E0309 09:08:50.571965 4792 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 09:08:50 crc kubenswrapper[4792]: E0309 09:08:50.571985 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-09 09:08:58.571963103 +0000 UTC m=+103.602163855 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 09 09:08:50 crc kubenswrapper[4792]: E0309 09:08:50.572008 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-09 09:08:58.571998184 +0000 UTC m=+103.602199056 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 09:08:50 crc kubenswrapper[4792]: E0309 09:08:50.572024 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-09 09:08:58.572018175 +0000 UTC m=+103.602218927 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 09 09:08:50 crc kubenswrapper[4792]: I0309 09:08:50.640342 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:08:50 crc kubenswrapper[4792]: I0309 09:08:50.640388 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:08:50 crc kubenswrapper[4792]: I0309 09:08:50.640397 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:08:50 crc kubenswrapper[4792]: I0309 09:08:50.640416 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:08:50 crc kubenswrapper[4792]: I0309 09:08:50.640426 4792 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:08:50Z","lastTransitionTime":"2026-03-09T09:08:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 09:08:50 crc kubenswrapper[4792]: I0309 09:08:50.661646 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 09:08:50 crc kubenswrapper[4792]: I0309 09:08:50.661738 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 09:08:50 crc kubenswrapper[4792]: I0309 09:08:50.661754 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fttpc" Mar 09 09:08:50 crc kubenswrapper[4792]: I0309 09:08:50.661653 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 09:08:50 crc kubenswrapper[4792]: E0309 09:08:50.661940 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 09:08:50 crc kubenswrapper[4792]: E0309 09:08:50.662097 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 09:08:50 crc kubenswrapper[4792]: E0309 09:08:50.662206 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fttpc" podUID="4711cce5-88a9-48c4-8e2e-522062e34a03" Mar 09 09:08:50 crc kubenswrapper[4792]: E0309 09:08:50.662444 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 09:08:50 crc kubenswrapper[4792]: I0309 09:08:50.672577 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 09:08:50 crc kubenswrapper[4792]: I0309 09:08:50.672694 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 09:08:50 crc kubenswrapper[4792]: I0309 09:08:50.672748 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4711cce5-88a9-48c4-8e2e-522062e34a03-metrics-certs\") pod \"network-metrics-daemon-fttpc\" (UID: \"4711cce5-88a9-48c4-8e2e-522062e34a03\") " pod="openshift-multus/network-metrics-daemon-fttpc" Mar 09 09:08:50 crc kubenswrapper[4792]: E0309 09:08:50.672841 4792 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 09 09:08:50 crc kubenswrapper[4792]: E0309 09:08:50.672876 4792 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 09 09:08:50 crc kubenswrapper[4792]: E0309 09:08:50.672884 4792 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered 
Mar 09 09:08:50 crc kubenswrapper[4792]: E0309 09:08:50.672954 4792 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 09:08:50 crc kubenswrapper[4792]: E0309 09:08:50.672846 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 09:08:58.672827857 +0000 UTC m=+103.703028609 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:08:50 crc kubenswrapper[4792]: E0309 09:08:50.673021 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4711cce5-88a9-48c4-8e2e-522062e34a03-metrics-certs podName:4711cce5-88a9-48c4-8e2e-522062e34a03 nodeName:}" failed. No retries permitted until 2026-03-09 09:08:58.673001071 +0000 UTC m=+103.703201923 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4711cce5-88a9-48c4-8e2e-522062e34a03-metrics-certs") pod "network-metrics-daemon-fttpc" (UID: "4711cce5-88a9-48c4-8e2e-522062e34a03") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 09 09:08:50 crc kubenswrapper[4792]: E0309 09:08:50.673040 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-09 09:08:58.673030252 +0000 UTC m=+103.703231134 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 09:08:50 crc kubenswrapper[4792]: I0309 09:08:50.742873 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:08:50 crc kubenswrapper[4792]: I0309 09:08:50.742916 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:08:50 crc kubenswrapper[4792]: I0309 09:08:50.742928 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:08:50 crc kubenswrapper[4792]: I0309 09:08:50.742945 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:08:50 crc kubenswrapper[4792]: I0309 09:08:50.742956 4792 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:08:50Z","lastTransitionTime":"2026-03-09T09:08:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 09:08:50 crc kubenswrapper[4792]: I0309 09:08:50.845443 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:08:50 crc kubenswrapper[4792]: I0309 09:08:50.845484 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:08:50 crc kubenswrapper[4792]: I0309 09:08:50.845495 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:08:50 crc kubenswrapper[4792]: I0309 09:08:50.845509 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:08:50 crc kubenswrapper[4792]: I0309 09:08:50.845519 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:08:50Z","lastTransitionTime":"2026-03-09T09:08:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:08:50 crc kubenswrapper[4792]: I0309 09:08:50.948777 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:08:50 crc kubenswrapper[4792]: I0309 09:08:50.948815 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:08:50 crc kubenswrapper[4792]: I0309 09:08:50.948824 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:08:50 crc kubenswrapper[4792]: I0309 09:08:50.948838 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:08:50 crc kubenswrapper[4792]: I0309 09:08:50.948847 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:08:50Z","lastTransitionTime":"2026-03-09T09:08:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:08:51 crc kubenswrapper[4792]: I0309 09:08:51.051426 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:08:51 crc kubenswrapper[4792]: I0309 09:08:51.051475 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:08:51 crc kubenswrapper[4792]: I0309 09:08:51.051486 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:08:51 crc kubenswrapper[4792]: I0309 09:08:51.051506 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:08:51 crc kubenswrapper[4792]: I0309 09:08:51.051517 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:08:51Z","lastTransitionTime":"2026-03-09T09:08:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:08:51 crc kubenswrapper[4792]: I0309 09:08:51.091896 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4tprh" event={"ID":"2b59ff3b-540d-4385-b02b-f68349bb74bf","Type":"ContainerStarted","Data":"54d1a12d5c015eebcdc6e5d6e253c40a4dd476bbc12be0c63665aa6bc091f72f"} Mar 09 09:08:51 crc kubenswrapper[4792]: I0309 09:08:51.118750 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fgk47" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84b6eb44-ca33-41a6-a951-2c66688ad860\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3b4aea0ac4cc05af599b411f32e712d0955e5052afc55c0d0be66e9a1249223\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cpkhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fgk47\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:51Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:51 crc kubenswrapper[4792]: I0309 09:08:51.138119 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lfm2j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"740550e5-d1a4-4f0c-8efd-1ccd8f9319e5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d93911614f6785ac12751349f50ab00c0716955c72dc48866083013e172cf3c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e0f50edd29f0791cc076a3a2974b7456aaf2a96534da791b083248dc84fa6af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c80e0ef9426b7a764e7117a789d07cf4cf940a90f38fe3ce6b230f9bbd21bfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f72b0194cacf6d5d0c95ba804286d822d2f2e5a0f385c4c5fdf8559bf6240c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:44Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edf00db622558a346d238b2df6e90a686dc913634a1b5b4e8b010b5bf09a7290\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5fc091b21251a54d9eca892667bd681e944b35b6407316a8252562e837a1e265\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f07e7f862027f74671d62bbe39e3c53e3422377c13efbf27056f0c4d12ff522a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72a93106360fd23f597cf8fb963aca72a606f82557a5b17125a969e5b5d5918f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9abf97193d88754a49bcc12e1cfbe83371778e157d028dd3d1084abfa8526822\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9abf97193d88754a49bcc12e1cfbe83371778e157d028dd3d1084abfa8526822\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:08:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lfm2j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:51Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:51 crc kubenswrapper[4792]: I0309 09:08:51.151288 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7e46817-10cf-448c-8a2a-154f1c322ce6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1070465f72d99ed22e913112259837db7d789c0a072b40956088f4a70162c41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://314838f53bc19a9f3eb7fd9d3f5473b23a177f2a3068d91f1b0420c27910d409\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://226eecaea6fec5a3ae93063702c719edf3908636a2862b9f50a874f494a19ccf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://420959a5229bf4ed5e6b94cf4a6685b96e4b50d13055cbccf66b6188e47bc770\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://420959a5229bf4ed5e6b94cf4a6685b96e4b50d13055cbccf66b6188e47bc770\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T09:08:27Z\\\",\\\"message\\\":\\\"le observer\\\\nW0309 09:08:27.070870 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0309 09:08:27.070998 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 09:08:27.071818 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2825204129/tls.crt::/tmp/serving-cert-2825204129/tls.key\\\\\\\"\\\\nI0309 09:08:27.485720 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0309 09:08:27.487343 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0309 09:08:27.487361 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0309 09:08:27.487383 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0309 09:08:27.487388 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0309 09:08:27.490810 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0309 09:08:27.490923 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 09:08:27.490952 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 09:08:27.490979 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0309 09:08:27.491005 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0309 09:08:27.491032 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0309 09:08:27.491060 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0309 09:08:27.490826 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0309 09:08:27.491911 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T09:08:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf484f9a832b0e147a17ec53e64cfcda5e37f8bf1f764ddc35215a079994b71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:18Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e250b106997f151ae9c435aca2ab3d3d821f40e826afa9ff744443a6b808571\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e250b106997f151ae9c435aca2ab3d3d821f40e826afa9ff744443a6b808571\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:07:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:07:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:51Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:51 crc kubenswrapper[4792]: I0309 09:08:51.154005 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:08:51 crc kubenswrapper[4792]: I0309 09:08:51.154038 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:08:51 crc kubenswrapper[4792]: I0309 09:08:51.154047 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:08:51 crc kubenswrapper[4792]: I0309 09:08:51.154063 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:08:51 crc kubenswrapper[4792]: I0309 09:08:51.154093 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:08:51Z","lastTransitionTime":"2026-03-09T09:08:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:08:51 crc kubenswrapper[4792]: I0309 09:08:51.161801 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-k4kdn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cb7634b-66b7-4541-8e53-3e01a6cb41ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d395567d530702b468a31ec780cf0dcf356e2a07f4a28ee50329b702d8e53594\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5jf\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-k4kdn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:51Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:51 crc kubenswrapper[4792]: I0309 09:08:51.171143 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fttpc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4711cce5-88a9-48c4-8e2e-522062e34a03\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjrbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjrbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fttpc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:51Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:51 crc 
kubenswrapper[4792]: I0309 09:08:51.179994 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"274366f4-bdf7-4516-9559-b90b9947999e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f6d16099e4ca2921b039c7fac87c3a9b3ee4780783ad52207c77ea2891942d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\
":\\\"cri-o://76e023bbfb6d4a1c42830654b83b26c24cddefd808ee765ebef670e8b10910b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76e023bbfb6d4a1c42830654b83b26c24cddefd808ee765ebef670e8b10910b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:07:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:07:15Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:51Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:51 crc kubenswrapper[4792]: I0309 09:08:51.192879 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da622175994cf951cfa730455ea5a163271bc993fc5a1268ed072b944a612524\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:51Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:51 crc kubenswrapper[4792]: I0309 09:08:51.203922 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:51Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:51 crc kubenswrapper[4792]: I0309 09:08:51.217396 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vgtc9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"626ea896-2e5c-4478-a7be-34a19acc242d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57f7829120f56a8ab9ff342c3d9fd043ca5559518f8d818c0306764f491f4b3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4rw87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vgtc9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:51Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:51 crc kubenswrapper[4792]: I0309 09:08:51.230306 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4tprh" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b59ff3b-540d-4385-b02b-f68349bb74bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54d1a12d5c015eebcdc6e5d6e253c40a4dd476bbc12be0c63665aa6bc091f72f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54059591e4b093ca2c60bc4ea9f0b0a6c44077e07ebaa66ba
b76d10f0d2f0f9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54059591e4b093ca2c60bc4ea9f0b0a6c44077e07ebaa66bab76d10f0d2f0f9f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:08:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86174a85f5a0f642224f0a1f175358d157641043ca6e1b88e119e7de171b5582\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86174a85f5a0f642224f0a1f175358d157641043ca6e1b88e119e7de171b5582\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:08:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:
08:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb6462fecadedf78b23e5a2276698144ffea837649f3c15664a573c9724f7780\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb6462fecadedf78b23e5a2276698144ffea837649f3c15664a573c9724f7780\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:08:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://42715848a7dace5208b36242c41aaa8f3c96fe5f7f6320b8acd5f2ea52a338bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42715848a7dace5208b36242c41aaa8f3c96fe5f7f6320b8acd5f2ea52a338bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:08:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:08:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b4eb0778c3ebce9cfec4abfead327d02f8d401aa11ce1578a7cbba1b9fe8854\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b4eb0778c3ebce9cfec4abfead327d02f8d401aa11ce1578a7cbba1b9fe8854\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:08:48Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87e9a77e1689ff7f403824349c08b6d6c85b408875ff852f56090cf3fe2bb487\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87e9a77e1689ff7f403824349c08b6d6c85b408875ff852f56090cf3fe2bb487\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:08:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:08:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4tprh\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:51Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:51 crc kubenswrapper[4792]: I0309 09:08:51.241395 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ssk9p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ceed39a1-2e4f-4630-bda0-57071ac26ee4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://648c51e35617b16c1bc2e0f86ea31ee9256943628595864fc27951fce17197cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"
state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7tn4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a6364e9d51e6626caf8b49f387f23eb4bfdfc13e149fb8847b7a1dd1637bf79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7tn4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ssk9p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet 
valid: current time 2026-03-09T09:08:51Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:51 crc kubenswrapper[4792]: I0309 09:08:51.253170 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:51Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:51 crc kubenswrapper[4792]: I0309 09:08:51.256540 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:08:51 crc kubenswrapper[4792]: I0309 09:08:51.256577 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:08:51 crc kubenswrapper[4792]: I0309 09:08:51.256589 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:08:51 crc kubenswrapper[4792]: I0309 09:08:51.256606 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:08:51 crc kubenswrapper[4792]: I0309 09:08:51.256618 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:08:51Z","lastTransitionTime":"2026-03-09T09:08:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 09:08:51 crc kubenswrapper[4792]: I0309 09:08:51.265349 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:51Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:51 crc kubenswrapper[4792]: I0309 09:08:51.276306 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-97tth" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd11045a-d746-4b42-872c-8b8d1dd2d515\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96248640b95891716f26355ace06d60da675ab3aa8086e6a7c94ad528fc1357d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zdqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d060627a577507a2b0030b6aea753d50e0c6766a
c4876d95ac5d9d3401f9b818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zdqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-97tth\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:51Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:51 crc kubenswrapper[4792]: I0309 09:08:51.292134 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab83a4df67ef256deece5dabe54496035a360833d9b5e926a1045196734d2c24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-09T09:08:51Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:51 crc kubenswrapper[4792]: I0309 09:08:51.308474 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4403f41d10eba801e3729ecffdc397e603f3c4899d1a62e02cb35418645d1ca4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://655b419f4453a7f
0fff411f579326c7fda157c08267b1dcf23f7e1d11b684c20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:51Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:51 crc kubenswrapper[4792]: I0309 09:08:51.358886 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:08:51 crc kubenswrapper[4792]: I0309 09:08:51.358922 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:08:51 crc kubenswrapper[4792]: I0309 09:08:51.358932 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:08:51 crc kubenswrapper[4792]: I0309 09:08:51.358947 4792 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:08:51 crc kubenswrapper[4792]: I0309 09:08:51.358958 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:08:51Z","lastTransitionTime":"2026-03-09T09:08:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 09:08:51 crc kubenswrapper[4792]: I0309 09:08:51.461429 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:08:51 crc kubenswrapper[4792]: I0309 09:08:51.461455 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:08:51 crc kubenswrapper[4792]: I0309 09:08:51.461463 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:08:51 crc kubenswrapper[4792]: I0309 09:08:51.461476 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:08:51 crc kubenswrapper[4792]: I0309 09:08:51.461484 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:08:51Z","lastTransitionTime":"2026-03-09T09:08:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:08:51 crc kubenswrapper[4792]: I0309 09:08:51.563584 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:08:51 crc kubenswrapper[4792]: I0309 09:08:51.563621 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:08:51 crc kubenswrapper[4792]: I0309 09:08:51.563629 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:08:51 crc kubenswrapper[4792]: I0309 09:08:51.563646 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:08:51 crc kubenswrapper[4792]: I0309 09:08:51.563656 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:08:51Z","lastTransitionTime":"2026-03-09T09:08:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:08:51 crc kubenswrapper[4792]: I0309 09:08:51.665339 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:08:51 crc kubenswrapper[4792]: I0309 09:08:51.665378 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:08:51 crc kubenswrapper[4792]: I0309 09:08:51.665392 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:08:51 crc kubenswrapper[4792]: I0309 09:08:51.665464 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:08:51 crc kubenswrapper[4792]: I0309 09:08:51.665478 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:08:51Z","lastTransitionTime":"2026-03-09T09:08:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:08:51 crc kubenswrapper[4792]: I0309 09:08:51.766992 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:08:51 crc kubenswrapper[4792]: I0309 09:08:51.767035 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:08:51 crc kubenswrapper[4792]: I0309 09:08:51.767045 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:08:51 crc kubenswrapper[4792]: I0309 09:08:51.767062 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:08:51 crc kubenswrapper[4792]: I0309 09:08:51.767089 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:08:51Z","lastTransitionTime":"2026-03-09T09:08:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:08:51 crc kubenswrapper[4792]: I0309 09:08:51.869172 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:08:51 crc kubenswrapper[4792]: I0309 09:08:51.869220 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:08:51 crc kubenswrapper[4792]: I0309 09:08:51.869230 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:08:51 crc kubenswrapper[4792]: I0309 09:08:51.869246 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:08:51 crc kubenswrapper[4792]: I0309 09:08:51.869259 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:08:51Z","lastTransitionTime":"2026-03-09T09:08:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:08:51 crc kubenswrapper[4792]: I0309 09:08:51.971678 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:08:51 crc kubenswrapper[4792]: I0309 09:08:51.971711 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:08:51 crc kubenswrapper[4792]: I0309 09:08:51.971721 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:08:51 crc kubenswrapper[4792]: I0309 09:08:51.971734 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:08:51 crc kubenswrapper[4792]: I0309 09:08:51.971761 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:08:51Z","lastTransitionTime":"2026-03-09T09:08:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:08:52 crc kubenswrapper[4792]: I0309 09:08:52.073752 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:08:52 crc kubenswrapper[4792]: I0309 09:08:52.073789 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:08:52 crc kubenswrapper[4792]: I0309 09:08:52.073800 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:08:52 crc kubenswrapper[4792]: I0309 09:08:52.073814 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:08:52 crc kubenswrapper[4792]: I0309 09:08:52.073823 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:08:52Z","lastTransitionTime":"2026-03-09T09:08:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:08:52 crc kubenswrapper[4792]: I0309 09:08:52.095259 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lfm2j_740550e5-d1a4-4f0c-8efd-1ccd8f9319e5/ovnkube-controller/0.log" Mar 09 09:08:52 crc kubenswrapper[4792]: I0309 09:08:52.097563 4792 generic.go:334] "Generic (PLEG): container finished" podID="740550e5-d1a4-4f0c-8efd-1ccd8f9319e5" containerID="f07e7f862027f74671d62bbe39e3c53e3422377c13efbf27056f0c4d12ff522a" exitCode=1 Mar 09 09:08:52 crc kubenswrapper[4792]: I0309 09:08:52.097598 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lfm2j" event={"ID":"740550e5-d1a4-4f0c-8efd-1ccd8f9319e5","Type":"ContainerDied","Data":"f07e7f862027f74671d62bbe39e3c53e3422377c13efbf27056f0c4d12ff522a"} Mar 09 09:08:52 crc kubenswrapper[4792]: I0309 09:08:52.098328 4792 scope.go:117] "RemoveContainer" containerID="f07e7f862027f74671d62bbe39e3c53e3422377c13efbf27056f0c4d12ff522a" Mar 09 09:08:52 crc kubenswrapper[4792]: I0309 09:08:52.110201 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fttpc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4711cce5-88a9-48c4-8e2e-522062e34a03\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with 
unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjrbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjrbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fttpc\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:52Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:52 crc kubenswrapper[4792]: I0309 09:08:52.130142 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"274366f4-bdf7-4516-9559-b90b9947999e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f6d16099e4ca2921b039c7fac87c3a9b3ee4780783ad52207c77ea2891942d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:17Z\\\"}},\\\"volumeMo
unts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76e023bbfb6d4a1c42830654b83b26c24cddefd808ee765ebef670e8b10910b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76e023bbfb6d4a1c42830654b83b26c24cddefd808ee765ebef670e8b10910b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:07:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:07:15Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:52Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:52 crc kubenswrapper[4792]: I0309 09:08:52.140781 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-k4kdn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cb7634b-66b7-4541-8e53-3e01a6cb41ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d395567d530702b468a31ec780cf0dcf356e2a07f4a28ee50329b702d8e53594\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5jf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-k4kdn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:52Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:52 crc kubenswrapper[4792]: I0309 09:08:52.150458 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:52Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:52 crc kubenswrapper[4792]: I0309 09:08:52.160127 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vgtc9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"626ea896-2e5c-4478-a7be-34a19acc242d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57f7829120f56a8ab9ff342c3d9fd043ca5559518f8d818c0306764f491f4b3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4rw87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vgtc9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:52Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:52 crc kubenswrapper[4792]: I0309 09:08:52.171517 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4tprh" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b59ff3b-540d-4385-b02b-f68349bb74bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54d1a12d5c015eebcdc6e5d6e253c40a4dd476bbc12be0c63665aa6bc091f72f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54059591e4b093ca2c60bc4ea9f0b0a6c44077e07ebaa66ba
b76d10f0d2f0f9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54059591e4b093ca2c60bc4ea9f0b0a6c44077e07ebaa66bab76d10f0d2f0f9f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:08:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86174a85f5a0f642224f0a1f175358d157641043ca6e1b88e119e7de171b5582\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86174a85f5a0f642224f0a1f175358d157641043ca6e1b88e119e7de171b5582\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:08:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:
08:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb6462fecadedf78b23e5a2276698144ffea837649f3c15664a573c9724f7780\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb6462fecadedf78b23e5a2276698144ffea837649f3c15664a573c9724f7780\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:08:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://42715848a7dace5208b36242c41aaa8f3c96fe5f7f6320b8acd5f2ea52a338bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42715848a7dace5208b36242c41aaa8f3c96fe5f7f6320b8acd5f2ea52a338bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:08:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:08:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b4eb0778c3ebce9cfec4abfead327d02f8d401aa11ce1578a7cbba1b9fe8854\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b4eb0778c3ebce9cfec4abfead327d02f8d401aa11ce1578a7cbba1b9fe8854\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:08:48Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87e9a77e1689ff7f403824349c08b6d6c85b408875ff852f56090cf3fe2bb487\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87e9a77e1689ff7f403824349c08b6d6c85b408875ff852f56090cf3fe2bb487\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:08:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:08:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4tprh\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:52Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:52 crc kubenswrapper[4792]: I0309 09:08:52.176309 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:08:52 crc kubenswrapper[4792]: I0309 09:08:52.176346 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:08:52 crc kubenswrapper[4792]: I0309 09:08:52.176361 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:08:52 crc kubenswrapper[4792]: I0309 09:08:52.176382 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:08:52 crc kubenswrapper[4792]: I0309 09:08:52.176398 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:08:52Z","lastTransitionTime":"2026-03-09T09:08:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:08:52 crc kubenswrapper[4792]: I0309 09:08:52.183109 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ssk9p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ceed39a1-2e4f-4630-bda0-57071ac26ee4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://648c51e35617b16c1bc2e0f86ea31ee9256943628595864fc27951fce17197cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7tn4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a6364e9d51e6626caf8b49f387f23eb4bfdfc13e149fb8847b7a1dd1637bf79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7tn4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ssk9p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:52Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:52 crc kubenswrapper[4792]: I0309 09:08:52.194467 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da622175994cf951cfa730455ea5a163271bc993fc5a1268ed072b944a612524\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed 
to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:52Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:52 crc kubenswrapper[4792]: I0309 09:08:52.206836 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:52Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:52 crc kubenswrapper[4792]: I0309 09:08:52.219845 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-97tth" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd11045a-d746-4b42-872c-8b8d1dd2d515\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96248640b95891716f26355ace06d60da675ab3aa8086e6a7c94ad528fc1357d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zdqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d060627a577507a2b0030b6aea753d50e0c6766a
c4876d95ac5d9d3401f9b818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zdqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-97tth\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:52Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:52 crc kubenswrapper[4792]: I0309 09:08:52.232049 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab83a4df67ef256deece5dabe54496035a360833d9b5e926a1045196734d2c24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-09T09:08:52Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:52 crc kubenswrapper[4792]: I0309 09:08:52.250192 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4403f41d10eba801e3729ecffdc397e603f3c4899d1a62e02cb35418645d1ca4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://655b419f4453a7f
0fff411f579326c7fda157c08267b1dcf23f7e1d11b684c20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:52Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:52 crc kubenswrapper[4792]: I0309 09:08:52.262315 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:52Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:52 crc kubenswrapper[4792]: I0309 09:08:52.278015 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fgk47" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"84b6eb44-ca33-41a6-a951-2c66688ad860\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3b4aea0ac4cc05af599b411f32e712d0955e5052afc55c0d0be66e9a1249223\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cpkhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fgk47\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:52Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:52 crc kubenswrapper[4792]: I0309 09:08:52.285339 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:08:52 crc kubenswrapper[4792]: I0309 09:08:52.285376 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:08:52 crc kubenswrapper[4792]: I0309 09:08:52.285385 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:08:52 crc kubenswrapper[4792]: I0309 09:08:52.285400 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:08:52 crc kubenswrapper[4792]: I0309 09:08:52.285410 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:08:52Z","lastTransitionTime":"2026-03-09T09:08:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:08:52 crc kubenswrapper[4792]: I0309 09:08:52.301012 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lfm2j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"740550e5-d1a4-4f0c-8efd-1ccd8f9319e5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d93911614f6785ac12751349f50ab00c0716955c72dc48866083013e172cf3c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e0f50edd29f0791cc076a3a2974b7456aaf2a96534da791b083248dc84fa6af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c80e0ef9426b7a764e7117a789d07cf4cf940a90f38fe3ce6b230f9bbd21bfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f72b0194cacf6d5d0c95ba804286d822d2f2e5a0f385c4c5fdf8559bf6240c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:44Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edf00db622558a346d238b2df6e90a686dc913634a1b5b4e8b010b5bf09a7290\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5fc091b21251a54d9eca892667bd681e944b35b6407316a8252562e837a1e265\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f07e7f862027f74671d62bbe39e3c53e3422377c13efbf27056f0c4d12ff522a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f07e7f862027f74671d62bbe39e3c53e3422377c13efbf27056f0c4d12ff522a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T09:08:51Z\\\",\\\"message\\\":\\\"o-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0309 09:08:51.912046 6504 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0309 09:08:51.912093 6504 handler.go:190] 
Sending *v1.Pod event handler 6 for removal\\\\nI0309 09:08:51.912115 6504 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0309 09:08:51.912459 6504 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0309 09:08:51.912487 6504 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0309 09:08:51.912501 6504 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0309 09:08:51.912522 6504 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0309 09:08:51.912539 6504 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0309 09:08:51.913156 6504 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0309 09:08:51.913214 6504 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0309 09:08:51.913281 6504 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0309 09:08:51.913312 6504 factory.go:656] Stopping watch factory\\\\nI0309 09:08:51.913334 6504 ovnkube.go:599] Stopped ovnkube\\\\nI0309 09:08:51.913421 6504 handler.go:208] Removed *v1.Node event handler 2\\\\nI0309 09:08:51.913435 6504 handler.go:208] Removed *v1.Node event handler 7\\\\nI0309 
09:08:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T09:08:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72a93106360fd23f597cf8fb963aca72a606f82557a5b17125a969e5b5d5918f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9abf97193d88754a49bcc12e1cfbe83371778e157d028dd3d1084abfa8526822\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9abf97193d88754a49bcc12e1cfbe83371778e157d028dd3d1084abfa8526
822\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:08:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lfm2j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:52Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:52 crc kubenswrapper[4792]: I0309 09:08:52.313648 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7e46817-10cf-448c-8a2a-154f1c322ce6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:15Z\\\",\\\"message\\\":\\\"containers 
with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1070465f72d99ed22e913112259837db7d789c0a072b40956088f4a70162c41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://314838f53bc19a9f3eb7fd9d3f5473b23a177f2a3068d91f1b0420c27910d409\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"co
ntainerID\\\":\\\"cri-o://226eecaea6fec5a3ae93063702c719edf3908636a2862b9f50a874f494a19ccf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://420959a5229bf4ed5e6b94cf4a6685b96e4b50d13055cbccf66b6188e47bc770\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://420959a5229bf4ed5e6b94cf4a6685b96e4b50d13055cbccf66b6188e47bc770\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T09:08:27Z\\\",\\\"message\\\":\\\"le observer\\\\nW0309 09:08:27.070870 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0309 09:08:27.070998 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 09:08:27.071818 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2825204129/tls.crt::/tmp/serving-cert-2825204129/tls.key\\\\\\\"\\\\nI0309 09:08:27.485720 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0309 09:08:27.487343 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0309 09:08:27.487361 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0309 09:08:27.487383 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0309 09:08:27.487388 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0309 09:08:27.490810 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0309 09:08:27.490923 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 09:08:27.490952 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 09:08:27.490979 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0309 09:08:27.491005 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0309 09:08:27.491032 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0309 09:08:27.491060 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0309 09:08:27.490826 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0309 09:08:27.491911 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T09:08:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf484f9a832b0e147a17ec53e64cfcda5e37f8bf1f764ddc35215a079994b71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:18Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e250b106997f151ae9c435aca2ab3d3d821f40e826afa9ff744443a6b808571\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e250b106997f151ae9c435aca2ab3d3d821f40e826afa9ff744443a6b808571\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:07:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:07:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:52Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:52 crc kubenswrapper[4792]: I0309 09:08:52.387715 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:08:52 crc kubenswrapper[4792]: I0309 09:08:52.387754 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:08:52 crc kubenswrapper[4792]: I0309 09:08:52.387765 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:08:52 crc kubenswrapper[4792]: I0309 09:08:52.387781 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:08:52 crc kubenswrapper[4792]: I0309 09:08:52.387793 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:08:52Z","lastTransitionTime":"2026-03-09T09:08:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:08:52 crc kubenswrapper[4792]: I0309 09:08:52.490205 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:08:52 crc kubenswrapper[4792]: I0309 09:08:52.490240 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:08:52 crc kubenswrapper[4792]: I0309 09:08:52.490250 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:08:52 crc kubenswrapper[4792]: I0309 09:08:52.490267 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:08:52 crc kubenswrapper[4792]: I0309 09:08:52.490282 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:08:52Z","lastTransitionTime":"2026-03-09T09:08:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:08:52 crc kubenswrapper[4792]: I0309 09:08:52.592149 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:08:52 crc kubenswrapper[4792]: I0309 09:08:52.592180 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:08:52 crc kubenswrapper[4792]: I0309 09:08:52.592192 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:08:52 crc kubenswrapper[4792]: I0309 09:08:52.592208 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:08:52 crc kubenswrapper[4792]: I0309 09:08:52.592220 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:08:52Z","lastTransitionTime":"2026-03-09T09:08:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 09:08:52 crc kubenswrapper[4792]: I0309 09:08:52.662183 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 09:08:52 crc kubenswrapper[4792]: E0309 09:08:52.662289 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 09:08:52 crc kubenswrapper[4792]: I0309 09:08:52.662569 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 09:08:52 crc kubenswrapper[4792]: E0309 09:08:52.662618 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 09:08:52 crc kubenswrapper[4792]: I0309 09:08:52.662657 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fttpc" Mar 09 09:08:52 crc kubenswrapper[4792]: E0309 09:08:52.662700 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fttpc" podUID="4711cce5-88a9-48c4-8e2e-522062e34a03" Mar 09 09:08:52 crc kubenswrapper[4792]: I0309 09:08:52.662732 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 09:08:52 crc kubenswrapper[4792]: E0309 09:08:52.662768 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 09:08:52 crc kubenswrapper[4792]: I0309 09:08:52.694733 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:08:52 crc kubenswrapper[4792]: I0309 09:08:52.694759 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:08:52 crc kubenswrapper[4792]: I0309 09:08:52.694767 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:08:52 crc kubenswrapper[4792]: I0309 09:08:52.694780 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:08:52 crc kubenswrapper[4792]: I0309 09:08:52.694788 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:08:52Z","lastTransitionTime":"2026-03-09T09:08:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:08:52 crc kubenswrapper[4792]: I0309 09:08:52.796740 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:08:52 crc kubenswrapper[4792]: I0309 09:08:52.796790 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:08:52 crc kubenswrapper[4792]: I0309 09:08:52.796800 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:08:52 crc kubenswrapper[4792]: I0309 09:08:52.796813 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:08:52 crc kubenswrapper[4792]: I0309 09:08:52.796830 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:08:52Z","lastTransitionTime":"2026-03-09T09:08:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:08:52 crc kubenswrapper[4792]: I0309 09:08:52.898706 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:08:52 crc kubenswrapper[4792]: I0309 09:08:52.898735 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:08:52 crc kubenswrapper[4792]: I0309 09:08:52.898744 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:08:52 crc kubenswrapper[4792]: I0309 09:08:52.898757 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:08:52 crc kubenswrapper[4792]: I0309 09:08:52.898765 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:08:52Z","lastTransitionTime":"2026-03-09T09:08:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:08:53 crc kubenswrapper[4792]: I0309 09:08:53.001309 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:08:53 crc kubenswrapper[4792]: I0309 09:08:53.001335 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:08:53 crc kubenswrapper[4792]: I0309 09:08:53.001343 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:08:53 crc kubenswrapper[4792]: I0309 09:08:53.001357 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:08:53 crc kubenswrapper[4792]: I0309 09:08:53.001366 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:08:53Z","lastTransitionTime":"2026-03-09T09:08:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:08:53 crc kubenswrapper[4792]: I0309 09:08:53.101707 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lfm2j_740550e5-d1a4-4f0c-8efd-1ccd8f9319e5/ovnkube-controller/1.log" Mar 09 09:08:53 crc kubenswrapper[4792]: I0309 09:08:53.103251 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lfm2j_740550e5-d1a4-4f0c-8efd-1ccd8f9319e5/ovnkube-controller/0.log" Mar 09 09:08:53 crc kubenswrapper[4792]: I0309 09:08:53.103364 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:08:53 crc kubenswrapper[4792]: I0309 09:08:53.103390 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:08:53 crc kubenswrapper[4792]: I0309 09:08:53.103398 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:08:53 crc kubenswrapper[4792]: I0309 09:08:53.103410 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:08:53 crc kubenswrapper[4792]: I0309 09:08:53.103419 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:08:53Z","lastTransitionTime":"2026-03-09T09:08:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:08:53 crc kubenswrapper[4792]: I0309 09:08:53.105529 4792 generic.go:334] "Generic (PLEG): container finished" podID="740550e5-d1a4-4f0c-8efd-1ccd8f9319e5" containerID="f2644dfb318576d2bee9a191d85edd623387dafd72ab37ff2a4309a552b738e0" exitCode=1 Mar 09 09:08:53 crc kubenswrapper[4792]: I0309 09:08:53.105557 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lfm2j" event={"ID":"740550e5-d1a4-4f0c-8efd-1ccd8f9319e5","Type":"ContainerDied","Data":"f2644dfb318576d2bee9a191d85edd623387dafd72ab37ff2a4309a552b738e0"} Mar 09 09:08:53 crc kubenswrapper[4792]: I0309 09:08:53.105584 4792 scope.go:117] "RemoveContainer" containerID="f07e7f862027f74671d62bbe39e3c53e3422377c13efbf27056f0c4d12ff522a" Mar 09 09:08:53 crc kubenswrapper[4792]: I0309 09:08:53.106150 4792 scope.go:117] "RemoveContainer" containerID="f2644dfb318576d2bee9a191d85edd623387dafd72ab37ff2a4309a552b738e0" Mar 09 09:08:53 crc kubenswrapper[4792]: E0309 09:08:53.106283 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-lfm2j_openshift-ovn-kubernetes(740550e5-d1a4-4f0c-8efd-1ccd8f9319e5)\"" pod="openshift-ovn-kubernetes/ovnkube-node-lfm2j" podUID="740550e5-d1a4-4f0c-8efd-1ccd8f9319e5" Mar 09 09:08:53 crc kubenswrapper[4792]: I0309 09:08:53.129085 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:53Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:53 crc kubenswrapper[4792]: I0309 09:08:53.142238 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vgtc9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"626ea896-2e5c-4478-a7be-34a19acc242d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57f7829120f56a8ab9ff342c3d9fd043ca5559518f8d818c0306764f491f4b3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4rw87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vgtc9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:53Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:53 crc kubenswrapper[4792]: I0309 09:08:53.155596 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4tprh" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b59ff3b-540d-4385-b02b-f68349bb74bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54d1a12d5c015eebcdc6e5d6e253c40a4dd476bbc12be0c63665aa6bc091f72f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54059591e4b093ca2c60bc4ea9f0b0a6c44077e07ebaa66ba
b76d10f0d2f0f9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54059591e4b093ca2c60bc4ea9f0b0a6c44077e07ebaa66bab76d10f0d2f0f9f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:08:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86174a85f5a0f642224f0a1f175358d157641043ca6e1b88e119e7de171b5582\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86174a85f5a0f642224f0a1f175358d157641043ca6e1b88e119e7de171b5582\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:08:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:
08:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb6462fecadedf78b23e5a2276698144ffea837649f3c15664a573c9724f7780\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb6462fecadedf78b23e5a2276698144ffea837649f3c15664a573c9724f7780\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:08:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://42715848a7dace5208b36242c41aaa8f3c96fe5f7f6320b8acd5f2ea52a338bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42715848a7dace5208b36242c41aaa8f3c96fe5f7f6320b8acd5f2ea52a338bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:08:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:08:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b4eb0778c3ebce9cfec4abfead327d02f8d401aa11ce1578a7cbba1b9fe8854\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b4eb0778c3ebce9cfec4abfead327d02f8d401aa11ce1578a7cbba1b9fe8854\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:08:48Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87e9a77e1689ff7f403824349c08b6d6c85b408875ff852f56090cf3fe2bb487\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87e9a77e1689ff7f403824349c08b6d6c85b408875ff852f56090cf3fe2bb487\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:08:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:08:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4tprh\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:53Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:53 crc kubenswrapper[4792]: I0309 09:08:53.165253 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ssk9p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ceed39a1-2e4f-4630-bda0-57071ac26ee4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://648c51e35617b16c1bc2e0f86ea31ee9256943628595864fc27951fce17197cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"
state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7tn4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a6364e9d51e6626caf8b49f387f23eb4bfdfc13e149fb8847b7a1dd1637bf79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7tn4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ssk9p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet 
valid: current time 2026-03-09T09:08:53Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:53 crc kubenswrapper[4792]: I0309 09:08:53.176304 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da622175994cf951cfa730455ea5a163271bc993fc5a1268ed072b944a612524\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"
}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:53Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:53 crc kubenswrapper[4792]: I0309 09:08:53.188210 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:53Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:53 crc kubenswrapper[4792]: I0309 09:08:53.199441 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-97tth" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd11045a-d746-4b42-872c-8b8d1dd2d515\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96248640b95891716f26355ace06d60da675ab3aa8086e6a7c94ad528fc1357d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zdqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d060627a577507a2b0030b6aea753d50e0c6766a
c4876d95ac5d9d3401f9b818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zdqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-97tth\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:53Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:53 crc kubenswrapper[4792]: I0309 09:08:53.205846 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:08:53 crc kubenswrapper[4792]: I0309 09:08:53.205892 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:08:53 crc kubenswrapper[4792]: I0309 09:08:53.205909 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:08:53 crc 
kubenswrapper[4792]: I0309 09:08:53.205930 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:08:53 crc kubenswrapper[4792]: I0309 09:08:53.205944 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:08:53Z","lastTransitionTime":"2026-03-09T09:08:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 09:08:53 crc kubenswrapper[4792]: I0309 09:08:53.212291 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab83a4df67ef256deece5dabe54496035a360833d9b5e926a1045196734d2c24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:53Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:53 crc kubenswrapper[4792]: I0309 09:08:53.225270 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4403f41d10eba801e3729ecffdc397e603f3c4899d1a62e02cb35418645d1ca4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://655b419f4453a7f0fff411f579326c7fda157c08267b1dcf23f7e1d11b684c20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:53Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:53 crc kubenswrapper[4792]: I0309 09:08:53.243216 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:53Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:53 crc kubenswrapper[4792]: I0309 09:08:53.256270 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fgk47" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"84b6eb44-ca33-41a6-a951-2c66688ad860\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3b4aea0ac4cc05af599b411f32e712d0955e5052afc55c0d0be66e9a1249223\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cpkhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fgk47\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:53Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:53 crc kubenswrapper[4792]: I0309 09:08:53.280458 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lfm2j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"740550e5-d1a4-4f0c-8efd-1ccd8f9319e5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d93911614f6785ac12751349f50ab00c0716955c72dc48866083013e172cf3c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e0f50edd29f0791cc076a3a2974b7456aaf2a96534da791b083248dc84fa6af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c80e0ef9426b7a764e7117a789d07cf4cf940a90f38fe3ce6b230f9bbd21bfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f72b0194cacf6d5d0c95ba804286d822d2f2e5a0f385c4c5fdf8559bf6240c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:44Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edf00db622558a346d238b2df6e90a686dc913634a1b5b4e8b010b5bf09a7290\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5fc091b21251a54d9eca892667bd681e944b35b6407316a8252562e837a1e265\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2644dfb318576d2bee9a191d85edd623387dafd72ab37ff2a4309a552b738e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f07e7f862027f74671d62bbe39e3c53e3422377c13efbf27056f0c4d12ff522a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T09:08:51Z\\\",\\\"message\\\":\\\"o-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0309 09:08:51.912046 6504 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0309 09:08:51.912093 6504 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0309 09:08:51.912115 6504 handler.go:208] Removed *v1.Pod event handler 
6\\\\nI0309 09:08:51.912459 6504 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0309 09:08:51.912487 6504 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0309 09:08:51.912501 6504 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0309 09:08:51.912522 6504 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0309 09:08:51.912539 6504 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0309 09:08:51.913156 6504 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0309 09:08:51.913214 6504 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0309 09:08:51.913281 6504 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0309 09:08:51.913312 6504 factory.go:656] Stopping watch factory\\\\nI0309 09:08:51.913334 6504 ovnkube.go:599] Stopped ovnkube\\\\nI0309 09:08:51.913421 6504 handler.go:208] Removed *v1.Node event handler 2\\\\nI0309 09:08:51.913435 6504 handler.go:208] Removed *v1.Node event handler 7\\\\nI0309 09:08:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T09:08:49Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2644dfb318576d2bee9a191d85edd623387dafd72ab37ff2a4309a552b738e0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T09:08:52Z\\\",\\\"message\\\":\\\"eighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.244:9393:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d8772e82-b0a4-4596-87d3-3d517c13344b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0309 09:08:52.970850 6651 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", 
\\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-ingress-canary/ingress-canary\\\\\\\"}\\\\nI0309 09:08:52.971742 6651 services_controller.go:360] Finished syncing service ingress-canary on namespace openshift-ingress-canary for network=default : 2.04927ms\\\\nI0309 09:08:52.971757 6651 services_controller.go:356] Processing sync for service openshift-cluster-samples-operator/metrics for network=default\\\\nI0309 09:08:52.971763 6651 services_controller.go:360] Finished syncing service metrics on namespace openshift-cluster-samples-operator for network=default : 7.191µs\\\\nI0309 09:08:52.971772 6651 services_controller.go:356] Processing sync for service openshift-console/console for network=default\\\\nF0309 09:08:52.971780 6651 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to sha\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T09:08:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"
host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72a93106360fd23f597cf8fb963aca72a606f82557a5b17125a969e5b5d5918f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\
\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9abf97193d88754a49bcc12e1cfbe83371778e157d028dd3d1084abfa8526822\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9abf97193d88754a49bcc12e1cfbe83371778e157d028dd3d1084abfa8526822\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:08:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lfm2j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:53Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:53 crc kubenswrapper[4792]: I0309 09:08:53.293223 
4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7e46817-10cf-448c-8a2a-154f1c322ce6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1070465f72d99ed22e913112259837db7d789c0a072b40956088f4a70162c41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\
\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://314838f53bc19a9f3eb7fd9d3f5473b23a177f2a3068d91f1b0420c27910d409\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://226eecaea6fec5a3ae93063702c719edf3908636a2862b9f50a874f494a19ccf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://420959a5229bf4ed5e6b94cf4a6685b96e4b50d13055cbccf66b6188e47bc770\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814
a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://420959a5229bf4ed5e6b94cf4a6685b96e4b50d13055cbccf66b6188e47bc770\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T09:08:27Z\\\",\\\"message\\\":\\\"le observer\\\\nW0309 09:08:27.070870 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0309 09:08:27.070998 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 09:08:27.071818 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2825204129/tls.crt::/tmp/serving-cert-2825204129/tls.key\\\\\\\"\\\\nI0309 09:08:27.485720 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0309 09:08:27.487343 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0309 09:08:27.487361 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0309 09:08:27.487383 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0309 09:08:27.487388 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0309 09:08:27.490810 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0309 09:08:27.490923 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 09:08:27.490952 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 09:08:27.490979 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0309 09:08:27.491005 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0309 09:08:27.491032 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0309 09:08:27.491060 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0309 09:08:27.490826 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0309 09:08:27.491911 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T09:08:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf484f9a832b0e147a17ec53e64cfcda5e37f8bf1f764ddc35215a079994b71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:18Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\
\"cri-o://1e250b106997f151ae9c435aca2ab3d3d821f40e826afa9ff744443a6b808571\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e250b106997f151ae9c435aca2ab3d3d821f40e826afa9ff744443a6b808571\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:07:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:07:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:53Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:53 crc kubenswrapper[4792]: I0309 09:08:53.303886 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fttpc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4711cce5-88a9-48c4-8e2e-522062e34a03\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjrbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjrbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fttpc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:53Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:53 crc 
kubenswrapper[4792]: I0309 09:08:53.307545 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:08:53 crc kubenswrapper[4792]: I0309 09:08:53.307604 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:08:53 crc kubenswrapper[4792]: I0309 09:08:53.307622 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:08:53 crc kubenswrapper[4792]: I0309 09:08:53.307643 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:08:53 crc kubenswrapper[4792]: I0309 09:08:53.307655 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:08:53Z","lastTransitionTime":"2026-03-09T09:08:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:08:53 crc kubenswrapper[4792]: I0309 09:08:53.316877 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"274366f4-bdf7-4516-9559-b90b9947999e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f6d16099e4ca2921b039c7fac87c3a9b3ee4780783ad52207c77ea2891942d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11
\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76e023bbfb6d4a1c42830654b83b26c24cddefd808ee765ebef670e8b10910b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76e023bbfb6d4a1c42830654b83b26c24cddefd808ee765ebef670e8b10910b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:07:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:07:15Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:53Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:53 crc kubenswrapper[4792]: I0309 09:08:53.327833 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-k4kdn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cb7634b-66b7-4541-8e53-3e01a6cb41ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d395567d530702b468a31ec780cf0dcf356e2a07f4a28ee50329b702d8e53594\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5jf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-k4kdn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:53Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:53 crc kubenswrapper[4792]: I0309 09:08:53.409959 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:08:53 crc kubenswrapper[4792]: I0309 09:08:53.409985 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:08:53 crc kubenswrapper[4792]: I0309 09:08:53.409992 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:08:53 crc kubenswrapper[4792]: I0309 09:08:53.410004 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:08:53 crc kubenswrapper[4792]: I0309 09:08:53.410012 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:08:53Z","lastTransitionTime":"2026-03-09T09:08:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:08:53 crc kubenswrapper[4792]: I0309 09:08:53.512149 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:08:53 crc kubenswrapper[4792]: I0309 09:08:53.512178 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:08:53 crc kubenswrapper[4792]: I0309 09:08:53.512188 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:08:53 crc kubenswrapper[4792]: I0309 09:08:53.512202 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:08:53 crc kubenswrapper[4792]: I0309 09:08:53.512211 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:08:53Z","lastTransitionTime":"2026-03-09T09:08:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:08:53 crc kubenswrapper[4792]: I0309 09:08:53.613885 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:08:53 crc kubenswrapper[4792]: I0309 09:08:53.613909 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:08:53 crc kubenswrapper[4792]: I0309 09:08:53.613917 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:08:53 crc kubenswrapper[4792]: I0309 09:08:53.613929 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:08:53 crc kubenswrapper[4792]: I0309 09:08:53.613938 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:08:53Z","lastTransitionTime":"2026-03-09T09:08:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:08:53 crc kubenswrapper[4792]: I0309 09:08:53.716112 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:08:53 crc kubenswrapper[4792]: I0309 09:08:53.716158 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:08:53 crc kubenswrapper[4792]: I0309 09:08:53.716170 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:08:53 crc kubenswrapper[4792]: I0309 09:08:53.716185 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:08:53 crc kubenswrapper[4792]: I0309 09:08:53.716196 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:08:53Z","lastTransitionTime":"2026-03-09T09:08:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:08:53 crc kubenswrapper[4792]: I0309 09:08:53.818496 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:08:53 crc kubenswrapper[4792]: I0309 09:08:53.818542 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:08:53 crc kubenswrapper[4792]: I0309 09:08:53.818551 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:08:53 crc kubenswrapper[4792]: I0309 09:08:53.818565 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:08:53 crc kubenswrapper[4792]: I0309 09:08:53.818574 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:08:53Z","lastTransitionTime":"2026-03-09T09:08:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:08:53 crc kubenswrapper[4792]: I0309 09:08:53.920724 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:08:53 crc kubenswrapper[4792]: I0309 09:08:53.920747 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:08:53 crc kubenswrapper[4792]: I0309 09:08:53.920755 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:08:53 crc kubenswrapper[4792]: I0309 09:08:53.920768 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:08:53 crc kubenswrapper[4792]: I0309 09:08:53.920777 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:08:53Z","lastTransitionTime":"2026-03-09T09:08:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:08:54 crc kubenswrapper[4792]: I0309 09:08:54.023245 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:08:54 crc kubenswrapper[4792]: I0309 09:08:54.023281 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:08:54 crc kubenswrapper[4792]: I0309 09:08:54.023293 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:08:54 crc kubenswrapper[4792]: I0309 09:08:54.023309 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:08:54 crc kubenswrapper[4792]: I0309 09:08:54.023320 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:08:54Z","lastTransitionTime":"2026-03-09T09:08:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:08:54 crc kubenswrapper[4792]: I0309 09:08:54.111146 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lfm2j_740550e5-d1a4-4f0c-8efd-1ccd8f9319e5/ovnkube-controller/1.log" Mar 09 09:08:54 crc kubenswrapper[4792]: I0309 09:08:54.115681 4792 scope.go:117] "RemoveContainer" containerID="f2644dfb318576d2bee9a191d85edd623387dafd72ab37ff2a4309a552b738e0" Mar 09 09:08:54 crc kubenswrapper[4792]: E0309 09:08:54.115888 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-lfm2j_openshift-ovn-kubernetes(740550e5-d1a4-4f0c-8efd-1ccd8f9319e5)\"" pod="openshift-ovn-kubernetes/ovnkube-node-lfm2j" podUID="740550e5-d1a4-4f0c-8efd-1ccd8f9319e5" Mar 09 09:08:54 crc kubenswrapper[4792]: I0309 09:08:54.125990 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:08:54 crc kubenswrapper[4792]: I0309 09:08:54.126040 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:08:54 crc kubenswrapper[4792]: I0309 09:08:54.126055 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:08:54 crc kubenswrapper[4792]: I0309 09:08:54.126121 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:08:54 crc kubenswrapper[4792]: I0309 09:08:54.126140 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:08:54Z","lastTransitionTime":"2026-03-09T09:08:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI 
configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 09:08:54 crc kubenswrapper[4792]: I0309 09:08:54.132012 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fgk47" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84b6eb44-ca33-41a6-a951-2c66688ad860\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3b4aea0ac4cc05af599b411f32e712d0955e5052afc55c0d0be66e9a1249223\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},
{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cpkhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fgk47\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:54Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:54 crc kubenswrapper[4792]: I0309 09:08:54.152994 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lfm2j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"740550e5-d1a4-4f0c-8efd-1ccd8f9319e5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d93911614f6785ac12751349f50ab00c0716955c72dc48866083013e172cf3c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e0f50edd29f0791cc076a3a2974b7456aaf2a96534da791b083248dc84fa6af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c80e0ef9426b7a764e7117a789d07cf4cf940a90f38fe3ce6b230f9bbd21bfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f72b0194cacf6d5d0c95ba804286d822d2f2e5a0f385c4c5fdf8559bf6240c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edf00db622558a346d238b2df6e90a686dc913634a1b5b4e8b010b5bf09a7290\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5fc091b21251a54d9eca892667bd681e944b35b6407316a8252562e837a1e265\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2644dfb318576d2bee9a191d85edd623387dafd72ab37ff2a4309a552b738e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2644dfb318576d2bee9a191d85edd623387dafd72ab37ff2a4309a552b738e0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T09:08:52Z\\\",\\\"message\\\":\\\"eighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} 
selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.244:9393:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d8772e82-b0a4-4596-87d3-3d517c13344b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0309 09:08:52.970850 6651 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-ingress-canary/ingress-canary\\\\\\\"}\\\\nI0309 09:08:52.971742 6651 services_controller.go:360] Finished syncing service ingress-canary on namespace openshift-ingress-canary for network=default : 2.04927ms\\\\nI0309 09:08:52.971757 6651 services_controller.go:356] Processing sync for service openshift-cluster-samples-operator/metrics for network=default\\\\nI0309 09:08:52.971763 6651 services_controller.go:360] Finished syncing service metrics on namespace openshift-cluster-samples-operator for network=default : 7.191µs\\\\nI0309 09:08:52.971772 6651 services_controller.go:356] Processing sync for service openshift-console/console for network=default\\\\nF0309 09:08:52.971780 6651 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to sha\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T09:08:52Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-lfm2j_openshift-ovn-kubernetes(740550e5-d1a4-4f0c-8efd-1ccd8f9319e5)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72a93106360fd23f597cf8fb963aca72a606f82557a5b17125a969e5b5d5918f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9abf97193d88754a49bcc12e1cfbe83371778e157d028dd3d1084abfa8526822\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9abf97193d88754a49
bcc12e1cfbe83371778e157d028dd3d1084abfa8526822\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:08:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lfm2j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:54Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:54 crc kubenswrapper[4792]: I0309 09:08:54.168411 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7e46817-10cf-448c-8a2a-154f1c322ce6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1070465f72d99ed22e913112259837db7d789c0a072b40956088f4a70162c41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://314838f53bc19a9f3eb7fd9d3f5473b23a177f2a3068d91f1b0420c27910d409\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://226eecaea6fec5a3ae93063702c719edf3908636a2862b9f50a874f494a19ccf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://420959a5229bf4ed5e6b94cf4a6685b96e4b50d13055cbccf66b6188e47bc770\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://420959a5229bf4ed5e6b94cf4a6685b96e4b50d13055cbccf66b6188e47bc770\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T09:08:27Z\\\",\\\"message\\\":\\\"le observer\\\\nW0309 09:08:27.070870 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0309 09:08:27.070998 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 09:08:27.071818 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2825204129/tls.crt::/tmp/serving-cert-2825204129/tls.key\\\\\\\"\\\\nI0309 09:08:27.485720 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0309 09:08:27.487343 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0309 09:08:27.487361 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0309 09:08:27.487383 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0309 09:08:27.487388 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0309 09:08:27.490810 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0309 09:08:27.490923 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 09:08:27.490952 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 09:08:27.490979 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0309 09:08:27.491005 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0309 09:08:27.491032 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0309 09:08:27.491060 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0309 09:08:27.490826 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0309 09:08:27.491911 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T09:08:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf484f9a832b0e147a17ec53e64cfcda5e37f8bf1f764ddc35215a079994b71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:18Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e250b106997f151ae9c435aca2ab3d3d821f40e826afa9ff744443a6b808571\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e2
50b106997f151ae9c435aca2ab3d3d821f40e826afa9ff744443a6b808571\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:07:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:07:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:54Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:54 crc kubenswrapper[4792]: I0309 09:08:54.179317 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"274366f4-bdf7-4516-9559-b90b9947999e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f6d16099e4ca2921b039c7fac87c3a9b3ee4780783ad52207c77ea2891942d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76e023bbfb6d4a1c42830654b83b26c24cddefd808ee765ebef670e8b10910b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76e023bbfb6d4a1c42830654b83b26c24cddefd808ee765ebef670e8b10910b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:07:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:07:15Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:54Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:54 crc kubenswrapper[4792]: I0309 09:08:54.189846 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-k4kdn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cb7634b-66b7-4541-8e53-3e01a6cb41ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d395567d530702b468a31ec780cf0dcf356e2a07f4a28ee50329b702d8e53594\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5jf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-k4kdn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:54Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:54 crc kubenswrapper[4792]: I0309 09:08:54.205114 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fttpc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4711cce5-88a9-48c4-8e2e-522062e34a03\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjrbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjrbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fttpc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:54Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:54 crc 
kubenswrapper[4792]: I0309 09:08:54.218608 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vgtc9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"626ea896-2e5c-4478-a7be-34a19acc242d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57f7829120f56a8ab9ff342c3d9fd043ca5559518f8d818c0306764f491f4b3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\"
:\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4rw87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vgtc9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:54Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:54 crc 
kubenswrapper[4792]: I0309 09:08:54.227996 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:08:54 crc kubenswrapper[4792]: I0309 09:08:54.228052 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:08:54 crc kubenswrapper[4792]: I0309 09:08:54.228061 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:08:54 crc kubenswrapper[4792]: I0309 09:08:54.228104 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:08:54 crc kubenswrapper[4792]: I0309 09:08:54.228116 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:08:54Z","lastTransitionTime":"2026-03-09T09:08:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:08:54 crc kubenswrapper[4792]: I0309 09:08:54.235766 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4tprh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b59ff3b-540d-4385-b02b-f68349bb74bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54d1a12d5c015eebcdc6e5d6e253c40a4dd476bbc12be0c63665aa6bc091f72f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54059591e4b093ca2c60bc4ea9f0b0a6c44077e07ebaa66bab76d10f0d2f0f9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54059591e4b093ca2c60bc4ea9f0b0a6c44077e07ebaa66bab76d10f0d2f0f9f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:08:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86174a85f5a0f642224f0a1f175358d157641043ca6e1b88e119e7de171b5582\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://86174a85f5a0f642224f0a1f175358d157641043ca6e1b88e119e7de171b5582\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:08:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:08:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb6462fecadedf78b23e5a2276698144ffea837649f3c15664a573c9724f7780\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb6462fecadedf78b23e5a2276698144ffea837649f3c15664a573c9724f7780\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:08:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42715848a7dace5208b36242c41aaa8f3c96fe5f7f6320b8acd5f2ea52a338bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42715848a7dace5208b36242c41aaa8f3c96fe5f7f6320b8acd5f2ea52a338bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:08:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:08:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b4eb0778c3ebce9cfec4abfead327d02f8d401aa11ce1578a7cbba1b9fe8854\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b4eb0778c3ebce9cfec4abfead327d02f8d401aa11ce1578a7cbba1b9fe8854\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:08:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87e9a77e1689ff7f403824349c08b6d6c85b408875ff852f56090cf3fe2bb487\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87e9a77e1689ff7f403824349c08b6d6c85b408875ff852f56090cf3fe2bb487\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:08:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:08:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4tprh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:54Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:54 crc kubenswrapper[4792]: I0309 09:08:54.247081 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ssk9p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ceed39a1-2e4f-4630-bda0-57071ac26ee4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://648c51e35617b16c1bc2e0f86ea31ee9256943628595864fc27951fce17197cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7tn4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a6364e9d51e6626caf8b49f387f23eb4bfdfc13e149fb8847b7a1dd1637bf79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7tn4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ssk9p\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:54Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:54 crc kubenswrapper[4792]: I0309 09:08:54.262814 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da622175994cf951cfa730455ea5a163271bc993fc5a1268ed072b944a612524\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/se
rving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:54Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:54 crc kubenswrapper[4792]: I0309 09:08:54.273276 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:54Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:54 crc kubenswrapper[4792]: I0309 09:08:54.283415 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-97tth" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd11045a-d746-4b42-872c-8b8d1dd2d515\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96248640b95891716f26355ace06d60da675ab3aa8086e6a7c94ad528fc1357d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zdqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d060627a577507a2b0030b6aea753d50e0c6766a
c4876d95ac5d9d3401f9b818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zdqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-97tth\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:54Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:54 crc kubenswrapper[4792]: I0309 09:08:54.293289 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab83a4df67ef256deece5dabe54496035a360833d9b5e926a1045196734d2c24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-09T09:08:54Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:54 crc kubenswrapper[4792]: I0309 09:08:54.305490 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4403f41d10eba801e3729ecffdc397e603f3c4899d1a62e02cb35418645d1ca4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://655b419f4453a7f
0fff411f579326c7fda157c08267b1dcf23f7e1d11b684c20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:54Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:54 crc kubenswrapper[4792]: I0309 09:08:54.317045 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:54Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:54 crc kubenswrapper[4792]: I0309 09:08:54.327296 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:54Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:54 crc kubenswrapper[4792]: I0309 09:08:54.329782 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:08:54 crc kubenswrapper[4792]: I0309 09:08:54.329818 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:08:54 crc kubenswrapper[4792]: I0309 09:08:54.329826 4792 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:08:54 crc kubenswrapper[4792]: I0309 09:08:54.329840 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:08:54 crc kubenswrapper[4792]: I0309 09:08:54.329849 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:08:54Z","lastTransitionTime":"2026-03-09T09:08:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 09:08:54 crc kubenswrapper[4792]: I0309 09:08:54.432039 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:08:54 crc kubenswrapper[4792]: I0309 09:08:54.432108 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:08:54 crc kubenswrapper[4792]: I0309 09:08:54.432118 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:08:54 crc kubenswrapper[4792]: I0309 09:08:54.432135 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:08:54 crc kubenswrapper[4792]: I0309 09:08:54.432146 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:08:54Z","lastTransitionTime":"2026-03-09T09:08:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:08:54 crc kubenswrapper[4792]: I0309 09:08:54.534085 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:08:54 crc kubenswrapper[4792]: I0309 09:08:54.534120 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:08:54 crc kubenswrapper[4792]: I0309 09:08:54.534128 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:08:54 crc kubenswrapper[4792]: I0309 09:08:54.534142 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:08:54 crc kubenswrapper[4792]: I0309 09:08:54.534152 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:08:54Z","lastTransitionTime":"2026-03-09T09:08:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:08:54 crc kubenswrapper[4792]: I0309 09:08:54.636233 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:08:54 crc kubenswrapper[4792]: I0309 09:08:54.636274 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:08:54 crc kubenswrapper[4792]: I0309 09:08:54.636285 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:08:54 crc kubenswrapper[4792]: I0309 09:08:54.636331 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:08:54 crc kubenswrapper[4792]: I0309 09:08:54.636343 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:08:54Z","lastTransitionTime":"2026-03-09T09:08:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 09:08:54 crc kubenswrapper[4792]: I0309 09:08:54.661721 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 09:08:54 crc kubenswrapper[4792]: I0309 09:08:54.661828 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 09:08:54 crc kubenswrapper[4792]: E0309 09:08:54.661928 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 09:08:54 crc kubenswrapper[4792]: I0309 09:08:54.661953 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 09:08:54 crc kubenswrapper[4792]: I0309 09:08:54.661968 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fttpc" Mar 09 09:08:54 crc kubenswrapper[4792]: E0309 09:08:54.662064 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 09:08:54 crc kubenswrapper[4792]: E0309 09:08:54.662137 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 09:08:54 crc kubenswrapper[4792]: E0309 09:08:54.662191 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-fttpc" podUID="4711cce5-88a9-48c4-8e2e-522062e34a03" Mar 09 09:08:54 crc kubenswrapper[4792]: I0309 09:08:54.738158 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:08:54 crc kubenswrapper[4792]: I0309 09:08:54.738188 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:08:54 crc kubenswrapper[4792]: I0309 09:08:54.738197 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:08:54 crc kubenswrapper[4792]: I0309 09:08:54.738209 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:08:54 crc kubenswrapper[4792]: I0309 09:08:54.738218 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:08:54Z","lastTransitionTime":"2026-03-09T09:08:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:08:54 crc kubenswrapper[4792]: I0309 09:08:54.840770 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:08:54 crc kubenswrapper[4792]: I0309 09:08:54.840808 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:08:54 crc kubenswrapper[4792]: I0309 09:08:54.840817 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:08:54 crc kubenswrapper[4792]: I0309 09:08:54.840830 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:08:54 crc kubenswrapper[4792]: I0309 09:08:54.840840 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:08:54Z","lastTransitionTime":"2026-03-09T09:08:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:08:54 crc kubenswrapper[4792]: I0309 09:08:54.943961 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:08:54 crc kubenswrapper[4792]: I0309 09:08:54.944282 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:08:54 crc kubenswrapper[4792]: I0309 09:08:54.944417 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:08:54 crc kubenswrapper[4792]: I0309 09:08:54.944602 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:08:54 crc kubenswrapper[4792]: I0309 09:08:54.944802 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:08:54Z","lastTransitionTime":"2026-03-09T09:08:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:08:55 crc kubenswrapper[4792]: I0309 09:08:55.047678 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:08:55 crc kubenswrapper[4792]: I0309 09:08:55.048339 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:08:55 crc kubenswrapper[4792]: I0309 09:08:55.048479 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:08:55 crc kubenswrapper[4792]: I0309 09:08:55.048652 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:08:55 crc kubenswrapper[4792]: I0309 09:08:55.048814 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:08:55Z","lastTransitionTime":"2026-03-09T09:08:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:08:55 crc kubenswrapper[4792]: I0309 09:08:55.151634 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:08:55 crc kubenswrapper[4792]: I0309 09:08:55.152477 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:08:55 crc kubenswrapper[4792]: I0309 09:08:55.152715 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:08:55 crc kubenswrapper[4792]: I0309 09:08:55.152860 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:08:55 crc kubenswrapper[4792]: I0309 09:08:55.152980 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:08:55Z","lastTransitionTime":"2026-03-09T09:08:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:08:55 crc kubenswrapper[4792]: I0309 09:08:55.256149 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:08:55 crc kubenswrapper[4792]: I0309 09:08:55.256381 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:08:55 crc kubenswrapper[4792]: I0309 09:08:55.256465 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:08:55 crc kubenswrapper[4792]: I0309 09:08:55.256580 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:08:55 crc kubenswrapper[4792]: I0309 09:08:55.256662 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:08:55Z","lastTransitionTime":"2026-03-09T09:08:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:08:55 crc kubenswrapper[4792]: I0309 09:08:55.358938 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:08:55 crc kubenswrapper[4792]: I0309 09:08:55.359342 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:08:55 crc kubenswrapper[4792]: I0309 09:08:55.359457 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:08:55 crc kubenswrapper[4792]: I0309 09:08:55.359629 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:08:55 crc kubenswrapper[4792]: I0309 09:08:55.359751 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:08:55Z","lastTransitionTime":"2026-03-09T09:08:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:08:55 crc kubenswrapper[4792]: I0309 09:08:55.463382 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:08:55 crc kubenswrapper[4792]: I0309 09:08:55.463438 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:08:55 crc kubenswrapper[4792]: I0309 09:08:55.463454 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:08:55 crc kubenswrapper[4792]: I0309 09:08:55.463471 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:08:55 crc kubenswrapper[4792]: I0309 09:08:55.463482 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:08:55Z","lastTransitionTime":"2026-03-09T09:08:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:08:55 crc kubenswrapper[4792]: I0309 09:08:55.566270 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:08:55 crc kubenswrapper[4792]: I0309 09:08:55.566306 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:08:55 crc kubenswrapper[4792]: I0309 09:08:55.566319 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:08:55 crc kubenswrapper[4792]: I0309 09:08:55.566335 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:08:55 crc kubenswrapper[4792]: I0309 09:08:55.566346 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:08:55Z","lastTransitionTime":"2026-03-09T09:08:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:08:55 crc kubenswrapper[4792]: I0309 09:08:55.669954 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:08:55 crc kubenswrapper[4792]: I0309 09:08:55.670001 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:08:55 crc kubenswrapper[4792]: I0309 09:08:55.670013 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:08:55 crc kubenswrapper[4792]: I0309 09:08:55.670030 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:08:55 crc kubenswrapper[4792]: I0309 09:08:55.670042 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:08:55Z","lastTransitionTime":"2026-03-09T09:08:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:08:55 crc kubenswrapper[4792]: I0309 09:08:55.674472 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"274366f4-bdf7-4516-9559-b90b9947999e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f6d16099e4ca2921b039c7fac87c3a9b3ee4780783ad52207c77ea2891942d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11
\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76e023bbfb6d4a1c42830654b83b26c24cddefd808ee765ebef670e8b10910b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76e023bbfb6d4a1c42830654b83b26c24cddefd808ee765ebef670e8b10910b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:07:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:07:15Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:55Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:55 crc kubenswrapper[4792]: I0309 09:08:55.687175 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-k4kdn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cb7634b-66b7-4541-8e53-3e01a6cb41ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d395567d530702b468a31ec780cf0dcf356e2a07f4a28ee50329b702d8e53594\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5jf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-k4kdn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:55Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:55 crc kubenswrapper[4792]: I0309 09:08:55.697719 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fttpc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4711cce5-88a9-48c4-8e2e-522062e34a03\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjrbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjrbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fttpc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:55Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:55 crc 
kubenswrapper[4792]: I0309 09:08:55.709965 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ssk9p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ceed39a1-2e4f-4630-bda0-57071ac26ee4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://648c51e35617b16c1bc2e0f86ea31ee9256943628595864fc27951fce17197cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7tn4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a6364e9d51e6626caf8b49f387f23eb4bfdfc13e149fb8847b7a1dd1637bf79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7tn4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ssk9p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:55Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:55 crc kubenswrapper[4792]: I0309 09:08:55.725672 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da622175994cf951cfa730455ea5a163271bc993fc5a1268ed072b944a612524\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:55Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:55 crc kubenswrapper[4792]: I0309 09:08:55.740331 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:55Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:55 crc kubenswrapper[4792]: I0309 09:08:55.752246 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vgtc9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"626ea896-2e5c-4478-a7be-34a19acc242d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57f7829120f56a8ab9ff342c3d9fd043ca5559518f8d818c0306764f491f4b3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4rw87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vgtc9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:55Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:55 crc kubenswrapper[4792]: I0309 09:08:55.765656 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4tprh" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b59ff3b-540d-4385-b02b-f68349bb74bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54d1a12d5c015eebcdc6e5d6e253c40a4dd476bbc12be0c63665aa6bc091f72f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54059591e4b093ca2c60bc4ea9f0b0a6c44077e07ebaa66ba
b76d10f0d2f0f9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54059591e4b093ca2c60bc4ea9f0b0a6c44077e07ebaa66bab76d10f0d2f0f9f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:08:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86174a85f5a0f642224f0a1f175358d157641043ca6e1b88e119e7de171b5582\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86174a85f5a0f642224f0a1f175358d157641043ca6e1b88e119e7de171b5582\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:08:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:
08:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb6462fecadedf78b23e5a2276698144ffea837649f3c15664a573c9724f7780\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb6462fecadedf78b23e5a2276698144ffea837649f3c15664a573c9724f7780\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:08:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://42715848a7dace5208b36242c41aaa8f3c96fe5f7f6320b8acd5f2ea52a338bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42715848a7dace5208b36242c41aaa8f3c96fe5f7f6320b8acd5f2ea52a338bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:08:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:08:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b4eb0778c3ebce9cfec4abfead327d02f8d401aa11ce1578a7cbba1b9fe8854\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b4eb0778c3ebce9cfec4abfead327d02f8d401aa11ce1578a7cbba1b9fe8854\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:08:48Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87e9a77e1689ff7f403824349c08b6d6c85b408875ff852f56090cf3fe2bb487\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87e9a77e1689ff7f403824349c08b6d6c85b408875ff852f56090cf3fe2bb487\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:08:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:08:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4tprh\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:55Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:55 crc kubenswrapper[4792]: I0309 09:08:55.774281 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:08:55 crc kubenswrapper[4792]: I0309 09:08:55.774336 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:08:55 crc kubenswrapper[4792]: I0309 09:08:55.774343 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:08:55 crc kubenswrapper[4792]: I0309 09:08:55.774357 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:08:55 crc kubenswrapper[4792]: I0309 09:08:55.774365 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:08:55Z","lastTransitionTime":"2026-03-09T09:08:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:08:55 crc kubenswrapper[4792]: I0309 09:08:55.777816 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4403f41d10eba801e3729ecffdc397e603f3c4899d1a62e02cb35418645d1ca4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://655b419f4453a7f0fff411f579326c7fda157c08267b1dcf23f7e1d11b684c20\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:55Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:55 crc kubenswrapper[4792]: I0309 09:08:55.789027 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:55Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:55 crc kubenswrapper[4792]: I0309 09:08:55.799545 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:55Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:55 crc kubenswrapper[4792]: I0309 09:08:55.809594 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-97tth" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd11045a-d746-4b42-872c-8b8d1dd2d515\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96248640b95891716f26355ace06d60da675ab3aa8086e6a7c94ad528fc1357d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zdqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d060627a577507a2b0030b6aea753d50e0c6766a
c4876d95ac5d9d3401f9b818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zdqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-97tth\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:55Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:55 crc kubenswrapper[4792]: I0309 09:08:55.821270 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab83a4df67ef256deece5dabe54496035a360833d9b5e926a1045196734d2c24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-09T09:08:55Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:55 crc kubenswrapper[4792]: I0309 09:08:55.833863 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7e46817-10cf-448c-8a2a-154f1c322ce6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1070465f72d99ed22e913112259837db7d789c0a072b40956088f4a70162c41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://314838f53bc19a9f3eb7fd9d3f5473b23a177f2a3068d91f1b0420c27910d409\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://226eecaea6fec5a3ae93063702c719edf3908636a2862b9f50a874f494a19ccf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://420959a5229bf4ed5e6b94cf4a6685b96e4b50d13055cbccf66b6188e47bc770\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://420959a5229bf4ed5e6b94cf4a6685b96e4b50d13055cbccf66b6188e47bc770\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T09:08:27Z\\\",\\\"message\\\":\\\"le observer\\\\nW0309 09:08:27.070870 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0309 09:08:27.070998 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 09:08:27.071818 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2825204129/tls.crt::/tmp/serving-cert-2825204129/tls.key\\\\\\\"\\\\nI0309 09:08:27.485720 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0309 09:08:27.487343 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0309 09:08:27.487361 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0309 09:08:27.487383 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0309 09:08:27.487388 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0309 09:08:27.490810 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0309 09:08:27.490923 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 09:08:27.490952 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 09:08:27.490979 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0309 09:08:27.491005 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0309 09:08:27.491032 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0309 09:08:27.491060 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0309 09:08:27.490826 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0309 09:08:27.491911 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T09:08:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf484f9a832b0e147a17ec53e64cfcda5e37f8bf1f764ddc35215a079994b71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:18Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e250b106997f151ae9c435aca2ab3d3d821f40e826afa9ff744443a6b808571\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e250b106997f151ae9c435aca2ab3d3d821f40e826afa9ff744443a6b808571\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:07:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:07:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:55Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:55 crc kubenswrapper[4792]: I0309 09:08:55.846331 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fgk47" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84b6eb44-ca33-41a6-a951-2c66688ad860\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3b4aea0ac4cc05af599b411f32e712d0955e5052afc55c0d0be66e9a1249223\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cpkhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fgk47\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:55Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:55 crc kubenswrapper[4792]: I0309 09:08:55.880218 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:08:55 crc kubenswrapper[4792]: I0309 09:08:55.880243 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:08:55 crc kubenswrapper[4792]: I0309 09:08:55.880251 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:08:55 crc kubenswrapper[4792]: I0309 09:08:55.880264 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 
09:08:55 crc kubenswrapper[4792]: I0309 09:08:55.880273 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:08:55Z","lastTransitionTime":"2026-03-09T09:08:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 09:08:55 crc kubenswrapper[4792]: I0309 09:08:55.891885 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lfm2j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"740550e5-d1a4-4f0c-8efd-1ccd8f9319e5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d93911614f6785ac12751349f50ab00c0716955c72dc48866083013e172cf3c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e0f50edd29f0791cc076a3a2974b7456aaf2a96534da791b083248dc84fa6af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c80e0ef9426b7a764e7117a789d07cf4cf940a90f38fe3ce6b230f9bbd21bfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f72b0194cacf6d5d0c95ba804286d822d2f2e5a0f385c4c5fdf8559bf6240c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:44Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edf00db622558a346d238b2df6e90a686dc913634a1b5b4e8b010b5bf09a7290\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5fc091b21251a54d9eca892667bd681e944b35b6407316a8252562e837a1e265\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2644dfb318576d2bee9a191d85edd623387dafd72ab37ff2a4309a552b738e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2644dfb318576d2bee9a191d85edd623387dafd72ab37ff2a4309a552b738e0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T09:08:52Z\\\",\\\"message\\\":\\\"eighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.244:9393:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d8772e82-b0a4-4596-87d3-3d517c13344b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e 
Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0309 09:08:52.970850 6651 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-ingress-canary/ingress-canary\\\\\\\"}\\\\nI0309 09:08:52.971742 6651 services_controller.go:360] Finished syncing service ingress-canary on namespace openshift-ingress-canary for network=default : 2.04927ms\\\\nI0309 09:08:52.971757 6651 services_controller.go:356] Processing sync for service openshift-cluster-samples-operator/metrics for network=default\\\\nI0309 09:08:52.971763 6651 services_controller.go:360] Finished syncing service metrics on namespace openshift-cluster-samples-operator for network=default : 7.191µs\\\\nI0309 09:08:52.971772 6651 services_controller.go:356] Processing sync for service openshift-console/console for network=default\\\\nF0309 09:08:52.971780 6651 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to sha\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T09:08:52Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-lfm2j_openshift-ovn-kubernetes(740550e5-d1a4-4f0c-8efd-1ccd8f9319e5)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72a93106360fd23f597cf8fb963aca72a606f82557a5b17125a969e5b5d5918f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9abf97193d88754a49bcc12e1cfbe83371778e157d028dd3d1084abfa8526822\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9abf97193d88754a49
bcc12e1cfbe83371778e157d028dd3d1084abfa8526822\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:08:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lfm2j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:55Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:55 crc kubenswrapper[4792]: I0309 09:08:55.982756 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:08:55 crc kubenswrapper[4792]: I0309 09:08:55.982794 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:08:55 crc kubenswrapper[4792]: I0309 09:08:55.982801 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:08:55 crc kubenswrapper[4792]: I0309 09:08:55.982814 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:08:55 crc kubenswrapper[4792]: I0309 09:08:55.982822 4792 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:08:55Z","lastTransitionTime":"2026-03-09T09:08:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 09:08:56 crc kubenswrapper[4792]: I0309 09:08:56.085154 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:08:56 crc kubenswrapper[4792]: I0309 09:08:56.085193 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:08:56 crc kubenswrapper[4792]: I0309 09:08:56.085204 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:08:56 crc kubenswrapper[4792]: I0309 09:08:56.085219 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:08:56 crc kubenswrapper[4792]: I0309 09:08:56.085230 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:08:56Z","lastTransitionTime":"2026-03-09T09:08:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:08:56 crc kubenswrapper[4792]: I0309 09:08:56.187429 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:08:56 crc kubenswrapper[4792]: I0309 09:08:56.187458 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:08:56 crc kubenswrapper[4792]: I0309 09:08:56.187468 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:08:56 crc kubenswrapper[4792]: I0309 09:08:56.187483 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:08:56 crc kubenswrapper[4792]: I0309 09:08:56.187494 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:08:56Z","lastTransitionTime":"2026-03-09T09:08:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:08:56 crc kubenswrapper[4792]: I0309 09:08:56.256803 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:08:56 crc kubenswrapper[4792]: I0309 09:08:56.256834 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:08:56 crc kubenswrapper[4792]: I0309 09:08:56.256844 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:08:56 crc kubenswrapper[4792]: I0309 09:08:56.256858 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:08:56 crc kubenswrapper[4792]: I0309 09:08:56.256867 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:08:56Z","lastTransitionTime":"2026-03-09T09:08:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:08:56 crc kubenswrapper[4792]: E0309 09:08:56.275546 4792 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:08:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:08:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:08:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:08:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e3b5ac96-f3df-45c5-a4ac-24aa5703690c\\\",\\\"systemUUID\\\":\\\"838abbcf-5467-42bb-9eb7-be30fe4962bb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:56Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:56 crc kubenswrapper[4792]: I0309 09:08:56.280013 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:08:56 crc kubenswrapper[4792]: I0309 09:08:56.280096 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:08:56 crc kubenswrapper[4792]: I0309 09:08:56.280114 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:08:56 crc kubenswrapper[4792]: I0309 09:08:56.280137 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:08:56 crc kubenswrapper[4792]: I0309 09:08:56.280152 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:08:56Z","lastTransitionTime":"2026-03-09T09:08:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:08:56 crc kubenswrapper[4792]: E0309 09:08:56.292938 4792 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:08:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:08:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:08:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:08:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e3b5ac96-f3df-45c5-a4ac-24aa5703690c\\\",\\\"systemUUID\\\":\\\"838abbcf-5467-42bb-9eb7-be30fe4962bb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:56Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:56 crc kubenswrapper[4792]: I0309 09:08:56.296840 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:08:56 crc kubenswrapper[4792]: I0309 09:08:56.296865 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:08:56 crc kubenswrapper[4792]: I0309 09:08:56.296875 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:08:56 crc kubenswrapper[4792]: I0309 09:08:56.296888 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:08:56 crc kubenswrapper[4792]: I0309 09:08:56.296897 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:08:56Z","lastTransitionTime":"2026-03-09T09:08:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:08:56 crc kubenswrapper[4792]: E0309 09:08:56.312618 4792 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:08:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:08:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:08:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:08:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e3b5ac96-f3df-45c5-a4ac-24aa5703690c\\\",\\\"systemUUID\\\":\\\"838abbcf-5467-42bb-9eb7-be30fe4962bb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:56Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:56 crc kubenswrapper[4792]: I0309 09:08:56.315891 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:08:56 crc kubenswrapper[4792]: I0309 09:08:56.316003 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:08:56 crc kubenswrapper[4792]: I0309 09:08:56.316100 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:08:56 crc kubenswrapper[4792]: I0309 09:08:56.316177 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:08:56 crc kubenswrapper[4792]: I0309 09:08:56.316241 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:08:56Z","lastTransitionTime":"2026-03-09T09:08:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:08:56 crc kubenswrapper[4792]: E0309 09:08:56.327094 4792 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:08:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:08:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:08:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:08:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e3b5ac96-f3df-45c5-a4ac-24aa5703690c\\\",\\\"systemUUID\\\":\\\"838abbcf-5467-42bb-9eb7-be30fe4962bb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:56Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:56 crc kubenswrapper[4792]: I0309 09:08:56.330948 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:08:56 crc kubenswrapper[4792]: I0309 09:08:56.330997 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:08:56 crc kubenswrapper[4792]: I0309 09:08:56.331014 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:08:56 crc kubenswrapper[4792]: I0309 09:08:56.331036 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:08:56 crc kubenswrapper[4792]: I0309 09:08:56.331054 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:08:56Z","lastTransitionTime":"2026-03-09T09:08:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:08:56 crc kubenswrapper[4792]: E0309 09:08:56.344694 4792 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:08:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:08:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:08:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:08:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e3b5ac96-f3df-45c5-a4ac-24aa5703690c\\\",\\\"systemUUID\\\":\\\"838abbcf-5467-42bb-9eb7-be30fe4962bb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:08:56Z is after 2025-08-24T17:21:41Z" Mar 09 09:08:56 crc kubenswrapper[4792]: E0309 09:08:56.344804 4792 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 09 09:08:56 crc kubenswrapper[4792]: I0309 09:08:56.346448 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:08:56 crc kubenswrapper[4792]: I0309 09:08:56.346471 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:08:56 crc kubenswrapper[4792]: I0309 09:08:56.346479 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:08:56 crc kubenswrapper[4792]: I0309 09:08:56.346492 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:08:56 crc kubenswrapper[4792]: I0309 09:08:56.346501 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:08:56Z","lastTransitionTime":"2026-03-09T09:08:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:08:56 crc kubenswrapper[4792]: I0309 09:08:56.448701 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:08:56 crc kubenswrapper[4792]: I0309 09:08:56.448909 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:08:56 crc kubenswrapper[4792]: I0309 09:08:56.448990 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:08:56 crc kubenswrapper[4792]: I0309 09:08:56.449089 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:08:56 crc kubenswrapper[4792]: I0309 09:08:56.449149 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:08:56Z","lastTransitionTime":"2026-03-09T09:08:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:08:56 crc kubenswrapper[4792]: I0309 09:08:56.551402 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:08:56 crc kubenswrapper[4792]: I0309 09:08:56.551664 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:08:56 crc kubenswrapper[4792]: I0309 09:08:56.551729 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:08:56 crc kubenswrapper[4792]: I0309 09:08:56.551795 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:08:56 crc kubenswrapper[4792]: I0309 09:08:56.551856 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:08:56Z","lastTransitionTime":"2026-03-09T09:08:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:08:56 crc kubenswrapper[4792]: I0309 09:08:56.653454 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:08:56 crc kubenswrapper[4792]: I0309 09:08:56.653488 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:08:56 crc kubenswrapper[4792]: I0309 09:08:56.653498 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:08:56 crc kubenswrapper[4792]: I0309 09:08:56.653513 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:08:56 crc kubenswrapper[4792]: I0309 09:08:56.653524 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:08:56Z","lastTransitionTime":"2026-03-09T09:08:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 09:08:56 crc kubenswrapper[4792]: I0309 09:08:56.661993 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 09:08:56 crc kubenswrapper[4792]: I0309 09:08:56.662002 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 09:08:56 crc kubenswrapper[4792]: I0309 09:08:56.662019 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 09:08:56 crc kubenswrapper[4792]: I0309 09:08:56.661992 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-fttpc" Mar 09 09:08:56 crc kubenswrapper[4792]: E0309 09:08:56.662113 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 09:08:56 crc kubenswrapper[4792]: E0309 09:08:56.662233 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 09:08:56 crc kubenswrapper[4792]: E0309 09:08:56.662360 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fttpc" podUID="4711cce5-88a9-48c4-8e2e-522062e34a03" Mar 09 09:08:56 crc kubenswrapper[4792]: E0309 09:08:56.662427 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 09:08:56 crc kubenswrapper[4792]: I0309 09:08:56.755519 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:08:56 crc kubenswrapper[4792]: I0309 09:08:56.755737 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:08:56 crc kubenswrapper[4792]: I0309 09:08:56.755822 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:08:56 crc kubenswrapper[4792]: I0309 09:08:56.755904 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:08:56 crc kubenswrapper[4792]: I0309 09:08:56.755980 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:08:56Z","lastTransitionTime":"2026-03-09T09:08:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:08:56 crc kubenswrapper[4792]: I0309 09:08:56.858680 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:08:56 crc kubenswrapper[4792]: I0309 09:08:56.858735 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:08:56 crc kubenswrapper[4792]: I0309 09:08:56.858749 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:08:56 crc kubenswrapper[4792]: I0309 09:08:56.858767 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:08:56 crc kubenswrapper[4792]: I0309 09:08:56.858779 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:08:56Z","lastTransitionTime":"2026-03-09T09:08:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:08:56 crc kubenswrapper[4792]: I0309 09:08:56.960892 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:08:56 crc kubenswrapper[4792]: I0309 09:08:56.960935 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:08:56 crc kubenswrapper[4792]: I0309 09:08:56.960947 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:08:56 crc kubenswrapper[4792]: I0309 09:08:56.960964 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:08:56 crc kubenswrapper[4792]: I0309 09:08:56.960979 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:08:56Z","lastTransitionTime":"2026-03-09T09:08:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:08:57 crc kubenswrapper[4792]: I0309 09:08:57.064422 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:08:57 crc kubenswrapper[4792]: I0309 09:08:57.064689 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:08:57 crc kubenswrapper[4792]: I0309 09:08:57.064806 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:08:57 crc kubenswrapper[4792]: I0309 09:08:57.064897 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:08:57 crc kubenswrapper[4792]: I0309 09:08:57.064986 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:08:57Z","lastTransitionTime":"2026-03-09T09:08:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:08:57 crc kubenswrapper[4792]: I0309 09:08:57.168016 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:08:57 crc kubenswrapper[4792]: I0309 09:08:57.168302 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:08:57 crc kubenswrapper[4792]: I0309 09:08:57.168386 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:08:57 crc kubenswrapper[4792]: I0309 09:08:57.168468 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:08:57 crc kubenswrapper[4792]: I0309 09:08:57.168593 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:08:57Z","lastTransitionTime":"2026-03-09T09:08:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:08:57 crc kubenswrapper[4792]: I0309 09:08:57.270283 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:08:57 crc kubenswrapper[4792]: I0309 09:08:57.270323 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:08:57 crc kubenswrapper[4792]: I0309 09:08:57.270337 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:08:57 crc kubenswrapper[4792]: I0309 09:08:57.270355 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:08:57 crc kubenswrapper[4792]: I0309 09:08:57.270369 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:08:57Z","lastTransitionTime":"2026-03-09T09:08:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:08:57 crc kubenswrapper[4792]: I0309 09:08:57.372710 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:08:57 crc kubenswrapper[4792]: I0309 09:08:57.372765 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:08:57 crc kubenswrapper[4792]: I0309 09:08:57.372780 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:08:57 crc kubenswrapper[4792]: I0309 09:08:57.372796 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:08:57 crc kubenswrapper[4792]: I0309 09:08:57.372805 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:08:57Z","lastTransitionTime":"2026-03-09T09:08:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:08:57 crc kubenswrapper[4792]: I0309 09:08:57.475152 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:08:57 crc kubenswrapper[4792]: I0309 09:08:57.475201 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:08:57 crc kubenswrapper[4792]: I0309 09:08:57.475212 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:08:57 crc kubenswrapper[4792]: I0309 09:08:57.475231 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:08:57 crc kubenswrapper[4792]: I0309 09:08:57.475245 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:08:57Z","lastTransitionTime":"2026-03-09T09:08:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:08:57 crc kubenswrapper[4792]: I0309 09:08:57.586861 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:08:57 crc kubenswrapper[4792]: I0309 09:08:57.586954 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:08:57 crc kubenswrapper[4792]: I0309 09:08:57.586974 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:08:57 crc kubenswrapper[4792]: I0309 09:08:57.586999 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:08:57 crc kubenswrapper[4792]: I0309 09:08:57.587024 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:08:57Z","lastTransitionTime":"2026-03-09T09:08:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:08:57 crc kubenswrapper[4792]: I0309 09:08:57.690148 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:08:57 crc kubenswrapper[4792]: I0309 09:08:57.690196 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:08:57 crc kubenswrapper[4792]: I0309 09:08:57.690210 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:08:57 crc kubenswrapper[4792]: I0309 09:08:57.690228 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:08:57 crc kubenswrapper[4792]: I0309 09:08:57.690242 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:08:57Z","lastTransitionTime":"2026-03-09T09:08:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:08:57 crc kubenswrapper[4792]: I0309 09:08:57.792501 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:08:57 crc kubenswrapper[4792]: I0309 09:08:57.792535 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:08:57 crc kubenswrapper[4792]: I0309 09:08:57.792546 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:08:57 crc kubenswrapper[4792]: I0309 09:08:57.792561 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:08:57 crc kubenswrapper[4792]: I0309 09:08:57.792572 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:08:57Z","lastTransitionTime":"2026-03-09T09:08:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:08:57 crc kubenswrapper[4792]: I0309 09:08:57.895474 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:08:57 crc kubenswrapper[4792]: I0309 09:08:57.895572 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:08:57 crc kubenswrapper[4792]: I0309 09:08:57.895591 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:08:57 crc kubenswrapper[4792]: I0309 09:08:57.895612 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:08:57 crc kubenswrapper[4792]: I0309 09:08:57.895626 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:08:57Z","lastTransitionTime":"2026-03-09T09:08:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:08:57 crc kubenswrapper[4792]: I0309 09:08:57.998783 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:08:57 crc kubenswrapper[4792]: I0309 09:08:57.998833 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:08:57 crc kubenswrapper[4792]: I0309 09:08:57.998843 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:08:57 crc kubenswrapper[4792]: I0309 09:08:57.998857 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:08:57 crc kubenswrapper[4792]: I0309 09:08:57.998865 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:08:57Z","lastTransitionTime":"2026-03-09T09:08:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:08:58 crc kubenswrapper[4792]: I0309 09:08:58.102427 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:08:58 crc kubenswrapper[4792]: I0309 09:08:58.102471 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:08:58 crc kubenswrapper[4792]: I0309 09:08:58.102480 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:08:58 crc kubenswrapper[4792]: I0309 09:08:58.102495 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:08:58 crc kubenswrapper[4792]: I0309 09:08:58.102506 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:08:58Z","lastTransitionTime":"2026-03-09T09:08:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:08:58 crc kubenswrapper[4792]: I0309 09:08:58.205394 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:08:58 crc kubenswrapper[4792]: I0309 09:08:58.205426 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:08:58 crc kubenswrapper[4792]: I0309 09:08:58.205437 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:08:58 crc kubenswrapper[4792]: I0309 09:08:58.205450 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:08:58 crc kubenswrapper[4792]: I0309 09:08:58.205461 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:08:58Z","lastTransitionTime":"2026-03-09T09:08:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:08:58 crc kubenswrapper[4792]: I0309 09:08:58.307851 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:08:58 crc kubenswrapper[4792]: I0309 09:08:58.307890 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:08:58 crc kubenswrapper[4792]: I0309 09:08:58.307898 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:08:58 crc kubenswrapper[4792]: I0309 09:08:58.307912 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:08:58 crc kubenswrapper[4792]: I0309 09:08:58.307922 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:08:58Z","lastTransitionTime":"2026-03-09T09:08:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:08:58 crc kubenswrapper[4792]: I0309 09:08:58.410219 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:08:58 crc kubenswrapper[4792]: I0309 09:08:58.410266 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:08:58 crc kubenswrapper[4792]: I0309 09:08:58.410276 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:08:58 crc kubenswrapper[4792]: I0309 09:08:58.410292 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:08:58 crc kubenswrapper[4792]: I0309 09:08:58.410304 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:08:58Z","lastTransitionTime":"2026-03-09T09:08:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:08:58 crc kubenswrapper[4792]: I0309 09:08:58.512787 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:08:58 crc kubenswrapper[4792]: I0309 09:08:58.512813 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:08:58 crc kubenswrapper[4792]: I0309 09:08:58.512822 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:08:58 crc kubenswrapper[4792]: I0309 09:08:58.512834 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:08:58 crc kubenswrapper[4792]: I0309 09:08:58.512842 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:08:58Z","lastTransitionTime":"2026-03-09T09:08:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:08:58 crc kubenswrapper[4792]: I0309 09:08:58.614581 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:08:58 crc kubenswrapper[4792]: I0309 09:08:58.614634 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:08:58 crc kubenswrapper[4792]: I0309 09:08:58.614653 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:08:58 crc kubenswrapper[4792]: I0309 09:08:58.614670 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:08:58 crc kubenswrapper[4792]: I0309 09:08:58.614683 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:08:58Z","lastTransitionTime":"2026-03-09T09:08:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:08:58 crc kubenswrapper[4792]: I0309 09:08:58.647197 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 09:08:58 crc kubenswrapper[4792]: I0309 09:08:58.647267 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 09:08:58 crc kubenswrapper[4792]: I0309 09:08:58.647293 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 09:08:58 crc kubenswrapper[4792]: E0309 09:08:58.647392 4792 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 09 09:08:58 crc kubenswrapper[4792]: E0309 09:08:58.647438 4792 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 09 09:08:58 crc kubenswrapper[4792]: E0309 09:08:58.647453 4792 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-09 09:09:14.647437713 +0000 UTC m=+119.677638475 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 09 09:08:58 crc kubenswrapper[4792]: E0309 09:08:58.647460 4792 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 09 09:08:58 crc kubenswrapper[4792]: E0309 09:08:58.647544 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-09 09:09:14.647524935 +0000 UTC m=+119.677725687 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 09 09:08:58 crc kubenswrapper[4792]: E0309 09:08:58.647552 4792 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 09 09:08:58 crc kubenswrapper[4792]: E0309 09:08:58.647593 4792 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 09:08:58 crc kubenswrapper[4792]: E0309 09:08:58.647629 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-09 09:09:14.647622107 +0000 UTC m=+119.677822859 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 09:08:58 crc kubenswrapper[4792]: I0309 09:08:58.661568 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 09:08:58 crc kubenswrapper[4792]: I0309 09:08:58.661670 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fttpc" Mar 09 09:08:58 crc kubenswrapper[4792]: E0309 09:08:58.661758 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 09:08:58 crc kubenswrapper[4792]: I0309 09:08:58.661798 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 09:08:58 crc kubenswrapper[4792]: I0309 09:08:58.661885 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 09:08:58 crc kubenswrapper[4792]: E0309 09:08:58.661958 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 09:08:58 crc kubenswrapper[4792]: E0309 09:08:58.662027 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 09:08:58 crc kubenswrapper[4792]: E0309 09:08:58.662047 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fttpc" podUID="4711cce5-88a9-48c4-8e2e-522062e34a03" Mar 09 09:08:58 crc kubenswrapper[4792]: I0309 09:08:58.679793 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Mar 09 09:08:58 crc kubenswrapper[4792]: I0309 09:08:58.717326 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:08:58 crc kubenswrapper[4792]: I0309 09:08:58.717366 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:08:58 crc kubenswrapper[4792]: I0309 09:08:58.717377 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:08:58 crc kubenswrapper[4792]: I0309 09:08:58.717391 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:08:58 crc kubenswrapper[4792]: I0309 09:08:58.717401 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:08:58Z","lastTransitionTime":"2026-03-09T09:08:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:08:58 crc kubenswrapper[4792]: I0309 09:08:58.747854 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 09:08:58 crc kubenswrapper[4792]: I0309 09:08:58.747974 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4711cce5-88a9-48c4-8e2e-522062e34a03-metrics-certs\") pod \"network-metrics-daemon-fttpc\" (UID: \"4711cce5-88a9-48c4-8e2e-522062e34a03\") " pod="openshift-multus/network-metrics-daemon-fttpc" Mar 09 09:08:58 crc kubenswrapper[4792]: I0309 09:08:58.748022 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 09:08:58 crc kubenswrapper[4792]: E0309 09:08:58.748100 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 09:09:14.748063731 +0000 UTC m=+119.778264483 (durationBeforeRetry 16s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:08:58 crc kubenswrapper[4792]: E0309 09:08:58.748220 4792 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 09 09:08:58 crc kubenswrapper[4792]: E0309 09:08:58.748243 4792 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 09 09:08:58 crc kubenswrapper[4792]: E0309 09:08:58.748328 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4711cce5-88a9-48c4-8e2e-522062e34a03-metrics-certs podName:4711cce5-88a9-48c4-8e2e-522062e34a03 nodeName:}" failed. No retries permitted until 2026-03-09 09:09:14.748308197 +0000 UTC m=+119.778508979 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4711cce5-88a9-48c4-8e2e-522062e34a03-metrics-certs") pod "network-metrics-daemon-fttpc" (UID: "4711cce5-88a9-48c4-8e2e-522062e34a03") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 09 09:08:58 crc kubenswrapper[4792]: E0309 09:08:58.748258 4792 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 09 09:08:58 crc kubenswrapper[4792]: E0309 09:08:58.748378 4792 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 09:08:58 crc kubenswrapper[4792]: E0309 09:08:58.748442 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-09 09:09:14.748421789 +0000 UTC m=+119.778622541 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 09:08:58 crc kubenswrapper[4792]: I0309 09:08:58.819827 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:08:58 crc kubenswrapper[4792]: I0309 09:08:58.819869 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:08:58 crc kubenswrapper[4792]: I0309 09:08:58.819880 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:08:58 crc kubenswrapper[4792]: I0309 09:08:58.819920 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:08:58 crc kubenswrapper[4792]: I0309 09:08:58.819933 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:08:58Z","lastTransitionTime":"2026-03-09T09:08:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:08:58 crc kubenswrapper[4792]: I0309 09:08:58.921842 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:08:58 crc kubenswrapper[4792]: I0309 09:08:58.921890 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:08:58 crc kubenswrapper[4792]: I0309 09:08:58.921899 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:08:58 crc kubenswrapper[4792]: I0309 09:08:58.921912 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:08:58 crc kubenswrapper[4792]: I0309 09:08:58.921923 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:08:58Z","lastTransitionTime":"2026-03-09T09:08:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:08:59 crc kubenswrapper[4792]: I0309 09:08:59.024690 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:08:59 crc kubenswrapper[4792]: I0309 09:08:59.024718 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:08:59 crc kubenswrapper[4792]: I0309 09:08:59.024746 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:08:59 crc kubenswrapper[4792]: I0309 09:08:59.024759 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:08:59 crc kubenswrapper[4792]: I0309 09:08:59.024767 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:08:59Z","lastTransitionTime":"2026-03-09T09:08:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:08:59 crc kubenswrapper[4792]: I0309 09:08:59.127979 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:08:59 crc kubenswrapper[4792]: I0309 09:08:59.128027 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:08:59 crc kubenswrapper[4792]: I0309 09:08:59.128038 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:08:59 crc kubenswrapper[4792]: I0309 09:08:59.128055 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:08:59 crc kubenswrapper[4792]: I0309 09:08:59.128081 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:08:59Z","lastTransitionTime":"2026-03-09T09:08:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:08:59 crc kubenswrapper[4792]: I0309 09:08:59.230790 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:08:59 crc kubenswrapper[4792]: I0309 09:08:59.230889 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:08:59 crc kubenswrapper[4792]: I0309 09:08:59.230906 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:08:59 crc kubenswrapper[4792]: I0309 09:08:59.230934 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:08:59 crc kubenswrapper[4792]: I0309 09:08:59.230956 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:08:59Z","lastTransitionTime":"2026-03-09T09:08:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:08:59 crc kubenswrapper[4792]: I0309 09:08:59.333131 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:08:59 crc kubenswrapper[4792]: I0309 09:08:59.333191 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:08:59 crc kubenswrapper[4792]: I0309 09:08:59.333208 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:08:59 crc kubenswrapper[4792]: I0309 09:08:59.333231 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:08:59 crc kubenswrapper[4792]: I0309 09:08:59.333247 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:08:59Z","lastTransitionTime":"2026-03-09T09:08:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:08:59 crc kubenswrapper[4792]: I0309 09:08:59.436544 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:08:59 crc kubenswrapper[4792]: I0309 09:08:59.436677 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:08:59 crc kubenswrapper[4792]: I0309 09:08:59.436696 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:08:59 crc kubenswrapper[4792]: I0309 09:08:59.436721 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:08:59 crc kubenswrapper[4792]: I0309 09:08:59.436738 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:08:59Z","lastTransitionTime":"2026-03-09T09:08:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:08:59 crc kubenswrapper[4792]: I0309 09:08:59.539741 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:08:59 crc kubenswrapper[4792]: I0309 09:08:59.539801 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:08:59 crc kubenswrapper[4792]: I0309 09:08:59.539811 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:08:59 crc kubenswrapper[4792]: I0309 09:08:59.539824 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:08:59 crc kubenswrapper[4792]: I0309 09:08:59.539834 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:08:59Z","lastTransitionTime":"2026-03-09T09:08:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:08:59 crc kubenswrapper[4792]: I0309 09:08:59.643376 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:08:59 crc kubenswrapper[4792]: I0309 09:08:59.643444 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:08:59 crc kubenswrapper[4792]: I0309 09:08:59.643460 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:08:59 crc kubenswrapper[4792]: I0309 09:08:59.643494 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:08:59 crc kubenswrapper[4792]: I0309 09:08:59.643508 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:08:59Z","lastTransitionTime":"2026-03-09T09:08:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:08:59 crc kubenswrapper[4792]: I0309 09:08:59.746510 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:08:59 crc kubenswrapper[4792]: I0309 09:08:59.746577 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:08:59 crc kubenswrapper[4792]: I0309 09:08:59.746614 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:08:59 crc kubenswrapper[4792]: I0309 09:08:59.746639 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:08:59 crc kubenswrapper[4792]: I0309 09:08:59.746656 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:08:59Z","lastTransitionTime":"2026-03-09T09:08:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:08:59 crc kubenswrapper[4792]: I0309 09:08:59.849335 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:08:59 crc kubenswrapper[4792]: I0309 09:08:59.849372 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:08:59 crc kubenswrapper[4792]: I0309 09:08:59.849383 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:08:59 crc kubenswrapper[4792]: I0309 09:08:59.849399 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:08:59 crc kubenswrapper[4792]: I0309 09:08:59.849410 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:08:59Z","lastTransitionTime":"2026-03-09T09:08:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:08:59 crc kubenswrapper[4792]: I0309 09:08:59.952483 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:08:59 crc kubenswrapper[4792]: I0309 09:08:59.952540 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:08:59 crc kubenswrapper[4792]: I0309 09:08:59.952549 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:08:59 crc kubenswrapper[4792]: I0309 09:08:59.952564 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:08:59 crc kubenswrapper[4792]: I0309 09:08:59.952572 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:08:59Z","lastTransitionTime":"2026-03-09T09:08:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:09:00 crc kubenswrapper[4792]: I0309 09:09:00.055548 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:09:00 crc kubenswrapper[4792]: I0309 09:09:00.055591 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:09:00 crc kubenswrapper[4792]: I0309 09:09:00.055600 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:09:00 crc kubenswrapper[4792]: I0309 09:09:00.055616 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:09:00 crc kubenswrapper[4792]: I0309 09:09:00.055626 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:09:00Z","lastTransitionTime":"2026-03-09T09:09:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:09:00 crc kubenswrapper[4792]: I0309 09:09:00.158126 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:09:00 crc kubenswrapper[4792]: I0309 09:09:00.158169 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:09:00 crc kubenswrapper[4792]: I0309 09:09:00.158181 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:09:00 crc kubenswrapper[4792]: I0309 09:09:00.158198 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:09:00 crc kubenswrapper[4792]: I0309 09:09:00.158212 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:09:00Z","lastTransitionTime":"2026-03-09T09:09:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:09:00 crc kubenswrapper[4792]: I0309 09:09:00.261181 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:09:00 crc kubenswrapper[4792]: I0309 09:09:00.261257 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:09:00 crc kubenswrapper[4792]: I0309 09:09:00.261274 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:09:00 crc kubenswrapper[4792]: I0309 09:09:00.261297 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:09:00 crc kubenswrapper[4792]: I0309 09:09:00.261314 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:09:00Z","lastTransitionTime":"2026-03-09T09:09:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:09:00 crc kubenswrapper[4792]: I0309 09:09:00.363358 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:09:00 crc kubenswrapper[4792]: I0309 09:09:00.363433 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:09:00 crc kubenswrapper[4792]: I0309 09:09:00.363443 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:09:00 crc kubenswrapper[4792]: I0309 09:09:00.363458 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:09:00 crc kubenswrapper[4792]: I0309 09:09:00.363468 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:09:00Z","lastTransitionTime":"2026-03-09T09:09:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:09:00 crc kubenswrapper[4792]: I0309 09:09:00.465347 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:09:00 crc kubenswrapper[4792]: I0309 09:09:00.465405 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:09:00 crc kubenswrapper[4792]: I0309 09:09:00.465420 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:09:00 crc kubenswrapper[4792]: I0309 09:09:00.465437 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:09:00 crc kubenswrapper[4792]: I0309 09:09:00.465448 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:09:00Z","lastTransitionTime":"2026-03-09T09:09:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:09:00 crc kubenswrapper[4792]: I0309 09:09:00.567292 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:09:00 crc kubenswrapper[4792]: I0309 09:09:00.567323 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:09:00 crc kubenswrapper[4792]: I0309 09:09:00.567333 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:09:00 crc kubenswrapper[4792]: I0309 09:09:00.567348 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:09:00 crc kubenswrapper[4792]: I0309 09:09:00.567358 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:09:00Z","lastTransitionTime":"2026-03-09T09:09:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 09:09:00 crc kubenswrapper[4792]: I0309 09:09:00.662345 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 09:09:00 crc kubenswrapper[4792]: I0309 09:09:00.662345 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 09:09:00 crc kubenswrapper[4792]: I0309 09:09:00.662380 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 09:09:00 crc kubenswrapper[4792]: I0309 09:09:00.662423 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-fttpc" Mar 09 09:09:00 crc kubenswrapper[4792]: E0309 09:09:00.662553 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 09:09:00 crc kubenswrapper[4792]: E0309 09:09:00.662607 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fttpc" podUID="4711cce5-88a9-48c4-8e2e-522062e34a03" Mar 09 09:09:00 crc kubenswrapper[4792]: E0309 09:09:00.662644 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 09:09:00 crc kubenswrapper[4792]: E0309 09:09:00.662663 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 09:09:00 crc kubenswrapper[4792]: I0309 09:09:00.670244 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:09:00 crc kubenswrapper[4792]: I0309 09:09:00.670270 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:09:00 crc kubenswrapper[4792]: I0309 09:09:00.670277 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:09:00 crc kubenswrapper[4792]: I0309 09:09:00.670289 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:09:00 crc kubenswrapper[4792]: I0309 09:09:00.670299 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:09:00Z","lastTransitionTime":"2026-03-09T09:09:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:09:00 crc kubenswrapper[4792]: I0309 09:09:00.773799 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:09:00 crc kubenswrapper[4792]: I0309 09:09:00.773916 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:09:00 crc kubenswrapper[4792]: I0309 09:09:00.773940 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:09:00 crc kubenswrapper[4792]: I0309 09:09:00.773970 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:09:00 crc kubenswrapper[4792]: I0309 09:09:00.773990 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:09:00Z","lastTransitionTime":"2026-03-09T09:09:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:09:00 crc kubenswrapper[4792]: I0309 09:09:00.876908 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:09:00 crc kubenswrapper[4792]: I0309 09:09:00.876944 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:09:00 crc kubenswrapper[4792]: I0309 09:09:00.876956 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:09:00 crc kubenswrapper[4792]: I0309 09:09:00.876975 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:09:00 crc kubenswrapper[4792]: I0309 09:09:00.876988 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:09:00Z","lastTransitionTime":"2026-03-09T09:09:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:09:00 crc kubenswrapper[4792]: I0309 09:09:00.980780 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:09:00 crc kubenswrapper[4792]: I0309 09:09:00.980853 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:09:00 crc kubenswrapper[4792]: I0309 09:09:00.980881 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:09:00 crc kubenswrapper[4792]: I0309 09:09:00.980912 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:09:00 crc kubenswrapper[4792]: I0309 09:09:00.980937 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:09:00Z","lastTransitionTime":"2026-03-09T09:09:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:09:01 crc kubenswrapper[4792]: I0309 09:09:01.085496 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:09:01 crc kubenswrapper[4792]: I0309 09:09:01.085550 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:09:01 crc kubenswrapper[4792]: I0309 09:09:01.085567 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:09:01 crc kubenswrapper[4792]: I0309 09:09:01.085589 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:09:01 crc kubenswrapper[4792]: I0309 09:09:01.085607 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:09:01Z","lastTransitionTime":"2026-03-09T09:09:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:09:01 crc kubenswrapper[4792]: I0309 09:09:01.188145 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:09:01 crc kubenswrapper[4792]: I0309 09:09:01.188216 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:09:01 crc kubenswrapper[4792]: I0309 09:09:01.188238 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:09:01 crc kubenswrapper[4792]: I0309 09:09:01.188267 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:09:01 crc kubenswrapper[4792]: I0309 09:09:01.188289 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:09:01Z","lastTransitionTime":"2026-03-09T09:09:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:09:01 crc kubenswrapper[4792]: I0309 09:09:01.291218 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:09:01 crc kubenswrapper[4792]: I0309 09:09:01.291276 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:09:01 crc kubenswrapper[4792]: I0309 09:09:01.291296 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:09:01 crc kubenswrapper[4792]: I0309 09:09:01.291320 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:09:01 crc kubenswrapper[4792]: I0309 09:09:01.291337 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:09:01Z","lastTransitionTime":"2026-03-09T09:09:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:09:01 crc kubenswrapper[4792]: I0309 09:09:01.394241 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:09:01 crc kubenswrapper[4792]: I0309 09:09:01.394304 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:09:01 crc kubenswrapper[4792]: I0309 09:09:01.394322 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:09:01 crc kubenswrapper[4792]: I0309 09:09:01.394349 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:09:01 crc kubenswrapper[4792]: I0309 09:09:01.394369 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:09:01Z","lastTransitionTime":"2026-03-09T09:09:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:09:01 crc kubenswrapper[4792]: I0309 09:09:01.496724 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:09:01 crc kubenswrapper[4792]: I0309 09:09:01.496770 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:09:01 crc kubenswrapper[4792]: I0309 09:09:01.496785 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:09:01 crc kubenswrapper[4792]: I0309 09:09:01.496807 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:09:01 crc kubenswrapper[4792]: I0309 09:09:01.496822 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:09:01Z","lastTransitionTime":"2026-03-09T09:09:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:09:01 crc kubenswrapper[4792]: I0309 09:09:01.599660 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:09:01 crc kubenswrapper[4792]: I0309 09:09:01.599716 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:09:01 crc kubenswrapper[4792]: I0309 09:09:01.599734 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:09:01 crc kubenswrapper[4792]: I0309 09:09:01.599759 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:09:01 crc kubenswrapper[4792]: I0309 09:09:01.599777 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:09:01Z","lastTransitionTime":"2026-03-09T09:09:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:09:01 crc kubenswrapper[4792]: I0309 09:09:01.662901 4792 scope.go:117] "RemoveContainer" containerID="420959a5229bf4ed5e6b94cf4a6685b96e4b50d13055cbccf66b6188e47bc770" Mar 09 09:09:01 crc kubenswrapper[4792]: E0309 09:09:01.663446 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 09 09:09:01 crc kubenswrapper[4792]: I0309 09:09:01.702513 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:09:01 crc kubenswrapper[4792]: I0309 09:09:01.702564 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:09:01 crc kubenswrapper[4792]: I0309 09:09:01.702579 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:09:01 crc kubenswrapper[4792]: I0309 09:09:01.702601 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:09:01 crc kubenswrapper[4792]: I0309 09:09:01.702619 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:09:01Z","lastTransitionTime":"2026-03-09T09:09:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:09:01 crc kubenswrapper[4792]: I0309 09:09:01.805042 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:09:01 crc kubenswrapper[4792]: I0309 09:09:01.805157 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:09:01 crc kubenswrapper[4792]: I0309 09:09:01.805171 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:09:01 crc kubenswrapper[4792]: I0309 09:09:01.805191 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:09:01 crc kubenswrapper[4792]: I0309 09:09:01.805202 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:09:01Z","lastTransitionTime":"2026-03-09T09:09:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:09:01 crc kubenswrapper[4792]: I0309 09:09:01.907817 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:09:01 crc kubenswrapper[4792]: I0309 09:09:01.907880 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:09:01 crc kubenswrapper[4792]: I0309 09:09:01.907896 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:09:01 crc kubenswrapper[4792]: I0309 09:09:01.907922 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:09:01 crc kubenswrapper[4792]: I0309 09:09:01.907939 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:09:01Z","lastTransitionTime":"2026-03-09T09:09:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:09:02 crc kubenswrapper[4792]: I0309 09:09:02.011435 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:09:02 crc kubenswrapper[4792]: I0309 09:09:02.011502 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:09:02 crc kubenswrapper[4792]: I0309 09:09:02.011518 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:09:02 crc kubenswrapper[4792]: I0309 09:09:02.011542 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:09:02 crc kubenswrapper[4792]: I0309 09:09:02.011559 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:09:02Z","lastTransitionTime":"2026-03-09T09:09:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:09:02 crc kubenswrapper[4792]: I0309 09:09:02.114774 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:09:02 crc kubenswrapper[4792]: I0309 09:09:02.114846 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:09:02 crc kubenswrapper[4792]: I0309 09:09:02.114886 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:09:02 crc kubenswrapper[4792]: I0309 09:09:02.114916 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:09:02 crc kubenswrapper[4792]: I0309 09:09:02.114937 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:09:02Z","lastTransitionTime":"2026-03-09T09:09:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:09:02 crc kubenswrapper[4792]: I0309 09:09:02.217598 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:09:02 crc kubenswrapper[4792]: I0309 09:09:02.217661 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:09:02 crc kubenswrapper[4792]: I0309 09:09:02.217682 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:09:02 crc kubenswrapper[4792]: I0309 09:09:02.217708 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:09:02 crc kubenswrapper[4792]: I0309 09:09:02.217737 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:09:02Z","lastTransitionTime":"2026-03-09T09:09:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:09:02 crc kubenswrapper[4792]: I0309 09:09:02.321322 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:09:02 crc kubenswrapper[4792]: I0309 09:09:02.321397 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:09:02 crc kubenswrapper[4792]: I0309 09:09:02.321416 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:09:02 crc kubenswrapper[4792]: I0309 09:09:02.321448 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:09:02 crc kubenswrapper[4792]: I0309 09:09:02.321478 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:09:02Z","lastTransitionTime":"2026-03-09T09:09:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:09:02 crc kubenswrapper[4792]: I0309 09:09:02.426777 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:09:02 crc kubenswrapper[4792]: I0309 09:09:02.426812 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:09:02 crc kubenswrapper[4792]: I0309 09:09:02.426821 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:09:02 crc kubenswrapper[4792]: I0309 09:09:02.426835 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:09:02 crc kubenswrapper[4792]: I0309 09:09:02.426845 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:09:02Z","lastTransitionTime":"2026-03-09T09:09:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:09:02 crc kubenswrapper[4792]: I0309 09:09:02.529761 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:09:02 crc kubenswrapper[4792]: I0309 09:09:02.529827 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:09:02 crc kubenswrapper[4792]: I0309 09:09:02.529846 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:09:02 crc kubenswrapper[4792]: I0309 09:09:02.529871 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:09:02 crc kubenswrapper[4792]: I0309 09:09:02.529889 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:09:02Z","lastTransitionTime":"2026-03-09T09:09:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:09:02 crc kubenswrapper[4792]: I0309 09:09:02.632472 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:09:02 crc kubenswrapper[4792]: I0309 09:09:02.632524 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:09:02 crc kubenswrapper[4792]: I0309 09:09:02.632537 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:09:02 crc kubenswrapper[4792]: I0309 09:09:02.632556 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:09:02 crc kubenswrapper[4792]: I0309 09:09:02.632571 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:09:02Z","lastTransitionTime":"2026-03-09T09:09:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 09:09:02 crc kubenswrapper[4792]: I0309 09:09:02.662230 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 09:09:02 crc kubenswrapper[4792]: I0309 09:09:02.662370 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 09:09:02 crc kubenswrapper[4792]: E0309 09:09:02.662609 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 09:09:02 crc kubenswrapper[4792]: I0309 09:09:02.662648 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 09:09:02 crc kubenswrapper[4792]: I0309 09:09:02.662679 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fttpc" Mar 09 09:09:02 crc kubenswrapper[4792]: E0309 09:09:02.662845 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 09:09:02 crc kubenswrapper[4792]: E0309 09:09:02.662937 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 09:09:02 crc kubenswrapper[4792]: E0309 09:09:02.663018 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-fttpc" podUID="4711cce5-88a9-48c4-8e2e-522062e34a03" Mar 09 09:09:02 crc kubenswrapper[4792]: I0309 09:09:02.735360 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:09:02 crc kubenswrapper[4792]: I0309 09:09:02.735406 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:09:02 crc kubenswrapper[4792]: I0309 09:09:02.735417 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:09:02 crc kubenswrapper[4792]: I0309 09:09:02.735439 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:09:02 crc kubenswrapper[4792]: I0309 09:09:02.735452 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:09:02Z","lastTransitionTime":"2026-03-09T09:09:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:09:02 crc kubenswrapper[4792]: I0309 09:09:02.838205 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:09:02 crc kubenswrapper[4792]: I0309 09:09:02.838244 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:09:02 crc kubenswrapper[4792]: I0309 09:09:02.838254 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:09:02 crc kubenswrapper[4792]: I0309 09:09:02.838269 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:09:02 crc kubenswrapper[4792]: I0309 09:09:02.838279 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:09:02Z","lastTransitionTime":"2026-03-09T09:09:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:09:02 crc kubenswrapper[4792]: I0309 09:09:02.941462 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:09:02 crc kubenswrapper[4792]: I0309 09:09:02.941496 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:09:02 crc kubenswrapper[4792]: I0309 09:09:02.941505 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:09:02 crc kubenswrapper[4792]: I0309 09:09:02.941521 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:09:02 crc kubenswrapper[4792]: I0309 09:09:02.941529 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:09:02Z","lastTransitionTime":"2026-03-09T09:09:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:09:03 crc kubenswrapper[4792]: I0309 09:09:03.043975 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:09:03 crc kubenswrapper[4792]: I0309 09:09:03.044063 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:09:03 crc kubenswrapper[4792]: I0309 09:09:03.044144 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:09:03 crc kubenswrapper[4792]: I0309 09:09:03.044182 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:09:03 crc kubenswrapper[4792]: I0309 09:09:03.044217 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:09:03Z","lastTransitionTime":"2026-03-09T09:09:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:09:03 crc kubenswrapper[4792]: I0309 09:09:03.146577 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:09:03 crc kubenswrapper[4792]: I0309 09:09:03.146743 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:09:03 crc kubenswrapper[4792]: I0309 09:09:03.146767 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:09:03 crc kubenswrapper[4792]: I0309 09:09:03.146836 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:09:03 crc kubenswrapper[4792]: I0309 09:09:03.146855 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:09:03Z","lastTransitionTime":"2026-03-09T09:09:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:09:03 crc kubenswrapper[4792]: I0309 09:09:03.249350 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:09:03 crc kubenswrapper[4792]: I0309 09:09:03.249415 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:09:03 crc kubenswrapper[4792]: I0309 09:09:03.249458 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:09:03 crc kubenswrapper[4792]: I0309 09:09:03.249488 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:09:03 crc kubenswrapper[4792]: I0309 09:09:03.249511 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:09:03Z","lastTransitionTime":"2026-03-09T09:09:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:09:03 crc kubenswrapper[4792]: I0309 09:09:03.352430 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:09:03 crc kubenswrapper[4792]: I0309 09:09:03.352476 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:09:03 crc kubenswrapper[4792]: I0309 09:09:03.352492 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:09:03 crc kubenswrapper[4792]: I0309 09:09:03.352517 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:09:03 crc kubenswrapper[4792]: I0309 09:09:03.352534 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:09:03Z","lastTransitionTime":"2026-03-09T09:09:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:09:03 crc kubenswrapper[4792]: I0309 09:09:03.455566 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:09:03 crc kubenswrapper[4792]: I0309 09:09:03.455626 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:09:03 crc kubenswrapper[4792]: I0309 09:09:03.455643 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:09:03 crc kubenswrapper[4792]: I0309 09:09:03.455668 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:09:03 crc kubenswrapper[4792]: I0309 09:09:03.455704 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:09:03Z","lastTransitionTime":"2026-03-09T09:09:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:09:03 crc kubenswrapper[4792]: I0309 09:09:03.558815 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:09:03 crc kubenswrapper[4792]: I0309 09:09:03.558877 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:09:03 crc kubenswrapper[4792]: I0309 09:09:03.558899 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:09:03 crc kubenswrapper[4792]: I0309 09:09:03.558968 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:09:03 crc kubenswrapper[4792]: I0309 09:09:03.559031 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:09:03Z","lastTransitionTime":"2026-03-09T09:09:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:09:03 crc kubenswrapper[4792]: I0309 09:09:03.662366 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:09:03 crc kubenswrapper[4792]: I0309 09:09:03.662419 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:09:03 crc kubenswrapper[4792]: I0309 09:09:03.662439 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:09:03 crc kubenswrapper[4792]: I0309 09:09:03.662460 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:09:03 crc kubenswrapper[4792]: I0309 09:09:03.662475 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:09:03Z","lastTransitionTime":"2026-03-09T09:09:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:09:03 crc kubenswrapper[4792]: I0309 09:09:03.766233 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:09:03 crc kubenswrapper[4792]: I0309 09:09:03.766290 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:09:03 crc kubenswrapper[4792]: I0309 09:09:03.766307 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:09:03 crc kubenswrapper[4792]: I0309 09:09:03.766329 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:09:03 crc kubenswrapper[4792]: I0309 09:09:03.766347 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:09:03Z","lastTransitionTime":"2026-03-09T09:09:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:09:03 crc kubenswrapper[4792]: I0309 09:09:03.869283 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:09:03 crc kubenswrapper[4792]: I0309 09:09:03.869338 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:09:03 crc kubenswrapper[4792]: I0309 09:09:03.869357 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:09:03 crc kubenswrapper[4792]: I0309 09:09:03.869380 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:09:03 crc kubenswrapper[4792]: I0309 09:09:03.869399 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:09:03Z","lastTransitionTime":"2026-03-09T09:09:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:09:03 crc kubenswrapper[4792]: I0309 09:09:03.972763 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:09:03 crc kubenswrapper[4792]: I0309 09:09:03.972812 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:09:03 crc kubenswrapper[4792]: I0309 09:09:03.972833 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:09:03 crc kubenswrapper[4792]: I0309 09:09:03.972860 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:09:03 crc kubenswrapper[4792]: I0309 09:09:03.972885 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:09:03Z","lastTransitionTime":"2026-03-09T09:09:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:09:04 crc kubenswrapper[4792]: I0309 09:09:04.075188 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:09:04 crc kubenswrapper[4792]: I0309 09:09:04.075241 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:09:04 crc kubenswrapper[4792]: I0309 09:09:04.075257 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:09:04 crc kubenswrapper[4792]: I0309 09:09:04.075280 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:09:04 crc kubenswrapper[4792]: I0309 09:09:04.075296 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:09:04Z","lastTransitionTime":"2026-03-09T09:09:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:09:04 crc kubenswrapper[4792]: I0309 09:09:04.177548 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:09:04 crc kubenswrapper[4792]: I0309 09:09:04.177591 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:09:04 crc kubenswrapper[4792]: I0309 09:09:04.177602 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:09:04 crc kubenswrapper[4792]: I0309 09:09:04.177616 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:09:04 crc kubenswrapper[4792]: I0309 09:09:04.177624 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:09:04Z","lastTransitionTime":"2026-03-09T09:09:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:09:04 crc kubenswrapper[4792]: I0309 09:09:04.280210 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:09:04 crc kubenswrapper[4792]: I0309 09:09:04.280247 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:09:04 crc kubenswrapper[4792]: I0309 09:09:04.280261 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:09:04 crc kubenswrapper[4792]: I0309 09:09:04.280280 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:09:04 crc kubenswrapper[4792]: I0309 09:09:04.280295 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:09:04Z","lastTransitionTime":"2026-03-09T09:09:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:09:04 crc kubenswrapper[4792]: I0309 09:09:04.382710 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:09:04 crc kubenswrapper[4792]: I0309 09:09:04.382749 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:09:04 crc kubenswrapper[4792]: I0309 09:09:04.382761 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:09:04 crc kubenswrapper[4792]: I0309 09:09:04.382778 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:09:04 crc kubenswrapper[4792]: I0309 09:09:04.382789 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:09:04Z","lastTransitionTime":"2026-03-09T09:09:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:09:04 crc kubenswrapper[4792]: I0309 09:09:04.484631 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:09:04 crc kubenswrapper[4792]: I0309 09:09:04.485229 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:09:04 crc kubenswrapper[4792]: I0309 09:09:04.485248 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:09:04 crc kubenswrapper[4792]: I0309 09:09:04.485265 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:09:04 crc kubenswrapper[4792]: I0309 09:09:04.485275 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:09:04Z","lastTransitionTime":"2026-03-09T09:09:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:09:04 crc kubenswrapper[4792]: I0309 09:09:04.588540 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:09:04 crc kubenswrapper[4792]: I0309 09:09:04.588606 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:09:04 crc kubenswrapper[4792]: I0309 09:09:04.588627 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:09:04 crc kubenswrapper[4792]: I0309 09:09:04.588646 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:09:04 crc kubenswrapper[4792]: I0309 09:09:04.588663 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:09:04Z","lastTransitionTime":"2026-03-09T09:09:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 09:09:04 crc kubenswrapper[4792]: I0309 09:09:04.661888 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 09:09:04 crc kubenswrapper[4792]: I0309 09:09:04.661950 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 09:09:04 crc kubenswrapper[4792]: I0309 09:09:04.661901 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 09:09:04 crc kubenswrapper[4792]: E0309 09:09:04.662108 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 09:09:04 crc kubenswrapper[4792]: E0309 09:09:04.662248 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 09:09:04 crc kubenswrapper[4792]: I0309 09:09:04.662267 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fttpc" Mar 09 09:09:04 crc kubenswrapper[4792]: E0309 09:09:04.662360 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 09:09:04 crc kubenswrapper[4792]: E0309 09:09:04.662478 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fttpc" podUID="4711cce5-88a9-48c4-8e2e-522062e34a03" Mar 09 09:09:04 crc kubenswrapper[4792]: I0309 09:09:04.691187 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:09:04 crc kubenswrapper[4792]: I0309 09:09:04.691237 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:09:04 crc kubenswrapper[4792]: I0309 09:09:04.691248 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:09:04 crc kubenswrapper[4792]: I0309 09:09:04.691265 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:09:04 crc kubenswrapper[4792]: I0309 09:09:04.691279 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:09:04Z","lastTransitionTime":"2026-03-09T09:09:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:09:04 crc kubenswrapper[4792]: I0309 09:09:04.794038 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:09:04 crc kubenswrapper[4792]: I0309 09:09:04.794155 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:09:04 crc kubenswrapper[4792]: I0309 09:09:04.794184 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:09:04 crc kubenswrapper[4792]: I0309 09:09:04.794212 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:09:04 crc kubenswrapper[4792]: I0309 09:09:04.794232 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:09:04Z","lastTransitionTime":"2026-03-09T09:09:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:09:04 crc kubenswrapper[4792]: I0309 09:09:04.897136 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:09:04 crc kubenswrapper[4792]: I0309 09:09:04.897182 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:09:04 crc kubenswrapper[4792]: I0309 09:09:04.897199 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:09:04 crc kubenswrapper[4792]: I0309 09:09:04.897220 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:09:04 crc kubenswrapper[4792]: I0309 09:09:04.897235 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:09:04Z","lastTransitionTime":"2026-03-09T09:09:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:09:04 crc kubenswrapper[4792]: I0309 09:09:04.999615 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:09:04 crc kubenswrapper[4792]: I0309 09:09:04.999672 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:09:04 crc kubenswrapper[4792]: I0309 09:09:04.999694 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:09:05 crc kubenswrapper[4792]: I0309 09:09:04.999721 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:09:05 crc kubenswrapper[4792]: I0309 09:09:04.999741 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:09:04Z","lastTransitionTime":"2026-03-09T09:09:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:09:05 crc kubenswrapper[4792]: I0309 09:09:05.102891 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:09:05 crc kubenswrapper[4792]: I0309 09:09:05.102928 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:09:05 crc kubenswrapper[4792]: I0309 09:09:05.102944 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:09:05 crc kubenswrapper[4792]: I0309 09:09:05.102959 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:09:05 crc kubenswrapper[4792]: I0309 09:09:05.102969 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:09:05Z","lastTransitionTime":"2026-03-09T09:09:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:09:05 crc kubenswrapper[4792]: I0309 09:09:05.205361 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:09:05 crc kubenswrapper[4792]: I0309 09:09:05.205403 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:09:05 crc kubenswrapper[4792]: I0309 09:09:05.205413 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:09:05 crc kubenswrapper[4792]: I0309 09:09:05.205428 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:09:05 crc kubenswrapper[4792]: I0309 09:09:05.205439 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:09:05Z","lastTransitionTime":"2026-03-09T09:09:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:09:05 crc kubenswrapper[4792]: I0309 09:09:05.307528 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:09:05 crc kubenswrapper[4792]: I0309 09:09:05.307570 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:09:05 crc kubenswrapper[4792]: I0309 09:09:05.307579 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:09:05 crc kubenswrapper[4792]: I0309 09:09:05.307593 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:09:05 crc kubenswrapper[4792]: I0309 09:09:05.307603 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:09:05Z","lastTransitionTime":"2026-03-09T09:09:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:09:05 crc kubenswrapper[4792]: I0309 09:09:05.410171 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:09:05 crc kubenswrapper[4792]: I0309 09:09:05.410207 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:09:05 crc kubenswrapper[4792]: I0309 09:09:05.410218 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:09:05 crc kubenswrapper[4792]: I0309 09:09:05.410233 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:09:05 crc kubenswrapper[4792]: I0309 09:09:05.410243 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:09:05Z","lastTransitionTime":"2026-03-09T09:09:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:09:05 crc kubenswrapper[4792]: I0309 09:09:05.514867 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:09:05 crc kubenswrapper[4792]: I0309 09:09:05.515124 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:09:05 crc kubenswrapper[4792]: I0309 09:09:05.515153 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:09:05 crc kubenswrapper[4792]: I0309 09:09:05.515184 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:09:05 crc kubenswrapper[4792]: I0309 09:09:05.515206 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:09:05Z","lastTransitionTime":"2026-03-09T09:09:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:09:05 crc kubenswrapper[4792]: I0309 09:09:05.618551 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:09:05 crc kubenswrapper[4792]: I0309 09:09:05.618611 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:09:05 crc kubenswrapper[4792]: I0309 09:09:05.618628 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:09:05 crc kubenswrapper[4792]: I0309 09:09:05.618652 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:09:05 crc kubenswrapper[4792]: I0309 09:09:05.618670 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:09:05Z","lastTransitionTime":"2026-03-09T09:09:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:09:05 crc kubenswrapper[4792]: I0309 09:09:05.664025 4792 scope.go:117] "RemoveContainer" containerID="f2644dfb318576d2bee9a191d85edd623387dafd72ab37ff2a4309a552b738e0" Mar 09 09:09:05 crc kubenswrapper[4792]: I0309 09:09:05.683344 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ssk9p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ceed39a1-2e4f-4630-bda0-57071ac26ee4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://648c51e35617b16c1bc2e0f86ea31ee9256943628595864fc27951fce17197cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volu
meMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7tn4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a6364e9d51e6626caf8b49f387f23eb4bfdfc13e149fb8847b7a1dd1637bf79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7tn4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ssk9p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:05Z is after 2025-08-24T17:21:41Z" Mar 09 
09:09:05 crc kubenswrapper[4792]: I0309 09:09:05.716929 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a861a55a-f3b0-4a55-a961-7798ef57d3c0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48162a23c55321a8b6855318c9d71661a5cca0af913d359711cf6685b332994e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"
data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58926f05ec9e42e1dc7aa5c3ab9950537c031f3838fb18036ce2a84b2b2ce147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1cc1ddaa2a5291284d4281d64b3d0aebf06cee6ef23da3a608ca06b240c8e4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a8bf3c896aae857de56db687c87ea164520667dc71fa495e3c396b478ab472e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26
702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d9d3cba4dfbe99ce715a9ee40af9171a41ccb193306f9b106588cc4b5f620e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea228ca60beeed2978d4efcc092828a381aaf0e00e0baf2a733c75cb15642aac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":
false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea228ca60beeed2978d4efcc092828a381aaf0e00e0baf2a733c75cb15642aac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:07:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e86823681f24457a83404aa4739988af2b8abf63506d49c1db09bee7501a7548\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e86823681f24457a83404aa4739988af2b8abf63506d49c1db09bee7501a7548\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:07:17Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://12c05ec878f2865338912d3d41f970b88835d7e7d9719a119eb4d06955850617\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://12c05ec878f2865338912d3d41f970b88835d7e7d9719a119eb4d06955850617\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:07:18Z\\\"}},\\\"volumeMounts\\
\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:07:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:05Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:05 crc kubenswrapper[4792]: I0309 09:09:05.723875 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:09:05 crc kubenswrapper[4792]: I0309 09:09:05.724184 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:09:05 crc kubenswrapper[4792]: I0309 09:09:05.724193 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:09:05 crc kubenswrapper[4792]: I0309 09:09:05.724208 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:09:05 crc kubenswrapper[4792]: I0309 09:09:05.724217 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:09:05Z","lastTransitionTime":"2026-03-09T09:09:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:09:05 crc kubenswrapper[4792]: I0309 09:09:05.738098 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da622175994cf951cfa730455ea5a163271bc993fc5a1268ed072b944a612524\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:05Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:05 crc kubenswrapper[4792]: I0309 09:09:05.753937 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:05Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:05 crc kubenswrapper[4792]: I0309 09:09:05.770676 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vgtc9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"626ea896-2e5c-4478-a7be-34a19acc242d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57f7829120f56a8ab9ff342c3d9fd043ca5559518f8d818c0306764f491f4b3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4rw87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vgtc9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:05Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:05 crc kubenswrapper[4792]: I0309 09:09:05.788482 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4tprh" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b59ff3b-540d-4385-b02b-f68349bb74bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54d1a12d5c015eebcdc6e5d6e253c40a4dd476bbc12be0c63665aa6bc091f72f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54059591e4b093ca2c60bc4ea9f0b0a6c44077e07ebaa66ba
b76d10f0d2f0f9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54059591e4b093ca2c60bc4ea9f0b0a6c44077e07ebaa66bab76d10f0d2f0f9f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:08:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86174a85f5a0f642224f0a1f175358d157641043ca6e1b88e119e7de171b5582\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86174a85f5a0f642224f0a1f175358d157641043ca6e1b88e119e7de171b5582\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:08:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:
08:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb6462fecadedf78b23e5a2276698144ffea837649f3c15664a573c9724f7780\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb6462fecadedf78b23e5a2276698144ffea837649f3c15664a573c9724f7780\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:08:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://42715848a7dace5208b36242c41aaa8f3c96fe5f7f6320b8acd5f2ea52a338bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42715848a7dace5208b36242c41aaa8f3c96fe5f7f6320b8acd5f2ea52a338bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:08:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:08:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b4eb0778c3ebce9cfec4abfead327d02f8d401aa11ce1578a7cbba1b9fe8854\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b4eb0778c3ebce9cfec4abfead327d02f8d401aa11ce1578a7cbba1b9fe8854\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:08:48Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87e9a77e1689ff7f403824349c08b6d6c85b408875ff852f56090cf3fe2bb487\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87e9a77e1689ff7f403824349c08b6d6c85b408875ff852f56090cf3fe2bb487\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:08:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:08:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4tprh\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:05Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:05 crc kubenswrapper[4792]: I0309 09:09:05.806953 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4403f41d10eba801e3729ecffdc397e603f3c4899d1a62e02cb35418645d1ca4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://655b419f4453a7f0fff411f579326c7fda157c08267b1dcf23f7e1d11b684c20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:05Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:05 crc kubenswrapper[4792]: I0309 09:09:05.826721 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:09:05 crc kubenswrapper[4792]: I0309 09:09:05.826757 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:09:05 crc kubenswrapper[4792]: I0309 
09:09:05.826768 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:09:05 crc kubenswrapper[4792]: I0309 09:09:05.826784 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:09:05 crc kubenswrapper[4792]: I0309 09:09:05.826795 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:09:05Z","lastTransitionTime":"2026-03-09T09:09:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 09:09:05 crc kubenswrapper[4792]: I0309 09:09:05.828744 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:05Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:05 crc kubenswrapper[4792]: I0309 09:09:05.841869 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:05Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:05 crc kubenswrapper[4792]: I0309 09:09:05.855348 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-97tth" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd11045a-d746-4b42-872c-8b8d1dd2d515\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96248640b95891716f26355ace06d60da675ab3aa8086e6a7c94ad528fc1357d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zdqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d060627a577507a2b0030b6aea753d50e0c6766a
c4876d95ac5d9d3401f9b818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zdqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-97tth\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:05Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:05 crc kubenswrapper[4792]: I0309 09:09:05.874742 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab83a4df67ef256deece5dabe54496035a360833d9b5e926a1045196734d2c24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-09T09:09:05Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:05 crc kubenswrapper[4792]: I0309 09:09:05.896692 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7e46817-10cf-448c-8a2a-154f1c322ce6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1070465f72d99ed22e913112259837db7d789c0a072b40956088f4a70162c41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://314838f53bc19a9f3eb7fd9d3f5473b23a177f2a3068d91f1b0420c27910d409\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://226eecaea6fec5a3ae93063702c719edf3908636a2862b9f50a874f494a19ccf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://420959a5229bf4ed5e6b94cf4a6685b96e4b50d13055cbccf66b6188e47bc770\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://420959a5229bf4ed5e6b94cf4a6685b96e4b50d13055cbccf66b6188e47bc770\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T09:08:27Z\\\",\\\"message\\\":\\\"le observer\\\\nW0309 09:08:27.070870 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0309 09:08:27.070998 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 09:08:27.071818 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2825204129/tls.crt::/tmp/serving-cert-2825204129/tls.key\\\\\\\"\\\\nI0309 09:08:27.485720 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0309 09:08:27.487343 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0309 09:08:27.487361 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0309 09:08:27.487383 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0309 09:08:27.487388 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0309 09:08:27.490810 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0309 09:08:27.490923 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 09:08:27.490952 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 09:08:27.490979 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0309 09:08:27.491005 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0309 09:08:27.491032 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0309 09:08:27.491060 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0309 09:08:27.490826 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0309 09:08:27.491911 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T09:08:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf484f9a832b0e147a17ec53e64cfcda5e37f8bf1f764ddc35215a079994b71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:18Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e250b106997f151ae9c435aca2ab3d3d821f40e826afa9ff744443a6b808571\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e250b106997f151ae9c435aca2ab3d3d821f40e826afa9ff744443a6b808571\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:07:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:07:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:05Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:05 crc kubenswrapper[4792]: I0309 09:09:05.908167 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fgk47" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84b6eb44-ca33-41a6-a951-2c66688ad860\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3b4aea0ac4cc05af599b411f32e712d0955e5052afc55c0d0be66e9a1249223\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cpkhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fgk47\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:05Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:05 crc kubenswrapper[4792]: I0309 09:09:05.925540 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lfm2j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"740550e5-d1a4-4f0c-8efd-1ccd8f9319e5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d93911614f6785ac12751349f50ab00c0716955c72dc48866083013e172cf3c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e0f50edd29f0791cc076a3a2974b7456aaf2a96534da791b083248dc84fa6af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c80e0ef9426b7a764e7117a789d07cf4cf940a90f38fe3ce6b230f9bbd21bfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f72b0194cacf6d5d0c95ba804286d822d2f2e5a0f385c4c5fdf8559bf6240c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:44Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edf00db622558a346d238b2df6e90a686dc913634a1b5b4e8b010b5bf09a7290\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5fc091b21251a54d9eca892667bd681e944b35b6407316a8252562e837a1e265\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2644dfb318576d2bee9a191d85edd623387dafd72ab37ff2a4309a552b738e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2644dfb318576d2bee9a191d85edd623387dafd72ab37ff2a4309a552b738e0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T09:08:52Z\\\",\\\"message\\\":\\\"eighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.244:9393:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d8772e82-b0a4-4596-87d3-3d517c13344b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e 
Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0309 09:08:52.970850 6651 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-ingress-canary/ingress-canary\\\\\\\"}\\\\nI0309 09:08:52.971742 6651 services_controller.go:360] Finished syncing service ingress-canary on namespace openshift-ingress-canary for network=default : 2.04927ms\\\\nI0309 09:08:52.971757 6651 services_controller.go:356] Processing sync for service openshift-cluster-samples-operator/metrics for network=default\\\\nI0309 09:08:52.971763 6651 services_controller.go:360] Finished syncing service metrics on namespace openshift-cluster-samples-operator for network=default : 7.191µs\\\\nI0309 09:08:52.971772 6651 services_controller.go:356] Processing sync for service openshift-console/console for network=default\\\\nF0309 09:08:52.971780 6651 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to sha\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T09:08:52Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-lfm2j_openshift-ovn-kubernetes(740550e5-d1a4-4f0c-8efd-1ccd8f9319e5)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72a93106360fd23f597cf8fb963aca72a606f82557a5b17125a969e5b5d5918f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9abf97193d88754a49bcc12e1cfbe83371778e157d028dd3d1084abfa8526822\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9abf97193d88754a49
bcc12e1cfbe83371778e157d028dd3d1084abfa8526822\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:08:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lfm2j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:05Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:05 crc kubenswrapper[4792]: I0309 09:09:05.929207 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:09:05 crc kubenswrapper[4792]: I0309 09:09:05.929230 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:09:05 crc kubenswrapper[4792]: I0309 09:09:05.929241 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:09:05 crc kubenswrapper[4792]: I0309 09:09:05.929259 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:09:05 crc kubenswrapper[4792]: I0309 09:09:05.929270 4792 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:09:05Z","lastTransitionTime":"2026-03-09T09:09:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 09:09:05 crc kubenswrapper[4792]: I0309 09:09:05.936198 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"274366f4-bdf7-4516-9559-b90b9947999e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f6d16099e4ca2921b039c7fac87c3a9b3ee4780783ad52207c77ea2891942d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76e023bbfb6d4a1c42830654b83b26c24cddefd808ee765ebef670e8b10910b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76e023bbfb6d4a1c42830654b83b26c24cddefd808ee765ebef670e8b10910b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:07:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:07:15Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:05Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:05 crc kubenswrapper[4792]: I0309 09:09:05.953047 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-k4kdn" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cb7634b-66b7-4541-8e53-3e01a6cb41ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d395567d530702b468a31ec780cf0dcf356e2a07f4a28ee50329b702d8e53594\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5jf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\
"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-k4kdn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:05Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:05 crc kubenswrapper[4792]: I0309 09:09:05.963996 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fttpc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4711cce5-88a9-48c4-8e2e-522062e34a03\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjrbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjrbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fttpc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:05Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:06 crc 
kubenswrapper[4792]: I0309 09:09:06.031945 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:09:06 crc kubenswrapper[4792]: I0309 09:09:06.031981 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:09:06 crc kubenswrapper[4792]: I0309 09:09:06.031994 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:09:06 crc kubenswrapper[4792]: I0309 09:09:06.032012 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:09:06 crc kubenswrapper[4792]: I0309 09:09:06.032023 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:09:06Z","lastTransitionTime":"2026-03-09T09:09:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:09:06 crc kubenswrapper[4792]: I0309 09:09:06.134516 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:09:06 crc kubenswrapper[4792]: I0309 09:09:06.134719 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:09:06 crc kubenswrapper[4792]: I0309 09:09:06.134866 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:09:06 crc kubenswrapper[4792]: I0309 09:09:06.135028 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:09:06 crc kubenswrapper[4792]: I0309 09:09:06.135224 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:09:06Z","lastTransitionTime":"2026-03-09T09:09:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:09:06 crc kubenswrapper[4792]: I0309 09:09:06.156706 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lfm2j_740550e5-d1a4-4f0c-8efd-1ccd8f9319e5/ovnkube-controller/1.log" Mar 09 09:09:06 crc kubenswrapper[4792]: I0309 09:09:06.159399 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lfm2j" event={"ID":"740550e5-d1a4-4f0c-8efd-1ccd8f9319e5","Type":"ContainerStarted","Data":"48ec991bd21bc3115bde4a99b484cd6e45e2e97d8a3a28dbbdf4e1d8323c6b66"} Mar 09 09:09:06 crc kubenswrapper[4792]: I0309 09:09:06.159778 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-lfm2j" Mar 09 09:09:06 crc kubenswrapper[4792]: I0309 09:09:06.173289 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da622175994cf951cfa730455ea5a163271bc993fc5a1268ed072b944a612524\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa388
11c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:06Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:06 crc kubenswrapper[4792]: I0309 09:09:06.183833 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:06Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:06 crc kubenswrapper[4792]: I0309 09:09:06.197329 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vgtc9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"626ea896-2e5c-4478-a7be-34a19acc242d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57f7829120f56a8ab9ff342c3d9fd043ca5559518f8d818c0306764f491f4b3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4rw87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vgtc9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:06Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:06 crc kubenswrapper[4792]: I0309 09:09:06.209320 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4tprh" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b59ff3b-540d-4385-b02b-f68349bb74bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54d1a12d5c015eebcdc6e5d6e253c40a4dd476bbc12be0c63665aa6bc091f72f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54059591e4b093ca2c60bc4ea9f0b0a6c44077e07ebaa66ba
b76d10f0d2f0f9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54059591e4b093ca2c60bc4ea9f0b0a6c44077e07ebaa66bab76d10f0d2f0f9f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:08:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86174a85f5a0f642224f0a1f175358d157641043ca6e1b88e119e7de171b5582\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86174a85f5a0f642224f0a1f175358d157641043ca6e1b88e119e7de171b5582\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:08:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:
08:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb6462fecadedf78b23e5a2276698144ffea837649f3c15664a573c9724f7780\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb6462fecadedf78b23e5a2276698144ffea837649f3c15664a573c9724f7780\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:08:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://42715848a7dace5208b36242c41aaa8f3c96fe5f7f6320b8acd5f2ea52a338bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42715848a7dace5208b36242c41aaa8f3c96fe5f7f6320b8acd5f2ea52a338bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:08:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:08:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b4eb0778c3ebce9cfec4abfead327d02f8d401aa11ce1578a7cbba1b9fe8854\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b4eb0778c3ebce9cfec4abfead327d02f8d401aa11ce1578a7cbba1b9fe8854\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:08:48Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87e9a77e1689ff7f403824349c08b6d6c85b408875ff852f56090cf3fe2bb487\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87e9a77e1689ff7f403824349c08b6d6c85b408875ff852f56090cf3fe2bb487\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:08:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:08:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4tprh\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:06Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:06 crc kubenswrapper[4792]: I0309 09:09:06.218540 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ssk9p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ceed39a1-2e4f-4630-bda0-57071ac26ee4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://648c51e35617b16c1bc2e0f86ea31ee9256943628595864fc27951fce17197cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"
state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7tn4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a6364e9d51e6626caf8b49f387f23eb4bfdfc13e149fb8847b7a1dd1637bf79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7tn4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ssk9p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet 
valid: current time 2026-03-09T09:09:06Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:06 crc kubenswrapper[4792]: I0309 09:09:06.237904 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:09:06 crc kubenswrapper[4792]: I0309 09:09:06.238139 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:09:06 crc kubenswrapper[4792]: I0309 09:09:06.238206 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:09:06 crc kubenswrapper[4792]: I0309 09:09:06.238300 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:09:06 crc kubenswrapper[4792]: I0309 09:09:06.238360 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:09:06Z","lastTransitionTime":"2026-03-09T09:09:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:09:06 crc kubenswrapper[4792]: I0309 09:09:06.242413 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a861a55a-f3b0-4a55-a961-7798ef57d3c0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48162a23c55321a8b6855318c9d71661a5cca0af913d359711cf6685b332994e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58926f05ec9e42e1dc7aa5c3ab9950537c031f3838fb18036ce2a84b2b2ce147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1cc1ddaa2a5291284d4281d64b3d0aebf06cee6ef23da3a608ca06b240c8e4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a8bf3c896aae857de56db687c87ea164520667dc71fa495e3c396b478ab472e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d9d3cba4dfbe99ce715a9ee40af9171a41ccb193306f9b106588cc4b5f620e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea228ca60beeed2978d4efcc092828a381aaf0e00e0baf2a733c75cb15642aac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea228ca60beeed2978d4efcc092828a381aaf0e00e0baf2a733c75cb15642aac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:07:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e86823681f24457a83404aa4739988af2b8abf63506d49c1db09bee7501a7548\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e86823681f24457a83404aa4739988af2b8abf63506d49c1db09bee7501a7548\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:07:17Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://12c05ec878f2865338912d3d41f970b88835d7e7d9719a119eb4d06955850617\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://12c05ec878f2865338912d3d41f970b88835d7e7d9719a119eb4d06955850617\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-03-09T09:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:07:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:06Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:06 crc kubenswrapper[4792]: I0309 09:09:06.255760 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:06Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:06 crc kubenswrapper[4792]: I0309 09:09:06.270137 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:06Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:06 crc kubenswrapper[4792]: I0309 09:09:06.285342 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-97tth" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd11045a-d746-4b42-872c-8b8d1dd2d515\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96248640b95891716f26355ace06d60da675ab3aa8086e6a7c94ad528fc1357d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zdqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d060627a577507a2b0030b6aea753d50e0c6766a
c4876d95ac5d9d3401f9b818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zdqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-97tth\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:06Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:06 crc kubenswrapper[4792]: I0309 09:09:06.301213 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab83a4df67ef256deece5dabe54496035a360833d9b5e926a1045196734d2c24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-09T09:09:06Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:06 crc kubenswrapper[4792]: I0309 09:09:06.317604 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4403f41d10eba801e3729ecffdc397e603f3c4899d1a62e02cb35418645d1ca4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://655b419f4453a7f
0fff411f579326c7fda157c08267b1dcf23f7e1d11b684c20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:06Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:06 crc kubenswrapper[4792]: I0309 09:09:06.332296 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fgk47" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"84b6eb44-ca33-41a6-a951-2c66688ad860\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3b4aea0ac4cc05af599b411f32e712d0955e5052afc55c0d0be66e9a1249223\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cpkhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fgk47\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:06Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:06 crc kubenswrapper[4792]: I0309 09:09:06.341188 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:09:06 crc kubenswrapper[4792]: I0309 09:09:06.341216 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:09:06 crc kubenswrapper[4792]: I0309 09:09:06.341226 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:09:06 crc kubenswrapper[4792]: I0309 09:09:06.341245 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:09:06 crc kubenswrapper[4792]: I0309 09:09:06.341258 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:09:06Z","lastTransitionTime":"2026-03-09T09:09:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:09:06 crc kubenswrapper[4792]: I0309 09:09:06.352036 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lfm2j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"740550e5-d1a4-4f0c-8efd-1ccd8f9319e5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d93911614f6785ac12751349f50ab00c0716955c72dc48866083013e172cf3c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e0f50edd29f0791cc076a3a2974b7456aaf2a96534da791b083248dc84fa6af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c80e0ef9426b7a764e7117a789d07cf4cf940a90f38fe3ce6b230f9bbd21bfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f72b0194cacf6d5d0c95ba804286d822d2f2e5a0f385c4c5fdf8559bf6240c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:44Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edf00db622558a346d238b2df6e90a686dc913634a1b5b4e8b010b5bf09a7290\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5fc091b21251a54d9eca892667bd681e944b35b6407316a8252562e837a1e265\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48ec991bd21bc3115bde4a99b484cd6e45e2e97d8a3a28dbbdf4e1d8323c6b66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2644dfb318576d2bee9a191d85edd623387dafd72ab37ff2a4309a552b738e0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T09:08:52Z\\\",\\\"message\\\":\\\"eighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.244:9393:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d8772e82-b0a4-4596-87d3-3d517c13344b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e 
Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0309 09:08:52.970850 6651 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-ingress-canary/ingress-canary\\\\\\\"}\\\\nI0309 09:08:52.971742 6651 services_controller.go:360] Finished syncing service ingress-canary on namespace openshift-ingress-canary for network=default : 2.04927ms\\\\nI0309 09:08:52.971757 6651 services_controller.go:356] Processing sync for service openshift-cluster-samples-operator/metrics for network=default\\\\nI0309 09:08:52.971763 6651 services_controller.go:360] Finished syncing service metrics on namespace openshift-cluster-samples-operator for network=default : 7.191µs\\\\nI0309 09:08:52.971772 6651 services_controller.go:356] Processing sync for service openshift-console/console for network=default\\\\nF0309 09:08:52.971780 6651 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to 
sha\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T09:08:52Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:09:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\
\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72a93106360fd23f597cf8fb963aca72a606f82557a5b17125a969e5b5d5918f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9abf97193d88754a49bcc12e1cfbe83371778e157d028dd3d1084abfa8526822\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\
\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9abf97193d88754a49bcc12e1cfbe83371778e157d028dd3d1084abfa8526822\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:08:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lfm2j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:06Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:06 crc kubenswrapper[4792]: I0309 09:09:06.364337 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7e46817-10cf-448c-8a2a-154f1c322ce6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1070465f72d99ed22e913112259837db7d789c0a072b40956088f4a70162c41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://314838f53bc19a9f3eb7fd9d3f5473b23a177f2a3068d91f1b0420c27910d409\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://226eecaea6fec5a3ae93063702c719edf3908636a2862b9f50a874f494a19ccf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://420959a5229bf4ed5e6b94cf4a6685b96e4b50d13055cbccf66b6188e47bc770\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://420959a5229bf4ed5e6b94cf4a6685b96e4b50d13055cbccf66b6188e47bc770\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T09:08:27Z\\\",\\\"message\\\":\\\"le observer\\\\nW0309 09:08:27.070870 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0309 09:08:27.070998 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 09:08:27.071818 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2825204129/tls.crt::/tmp/serving-cert-2825204129/tls.key\\\\\\\"\\\\nI0309 09:08:27.485720 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0309 09:08:27.487343 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0309 09:08:27.487361 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0309 09:08:27.487383 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0309 09:08:27.487388 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0309 09:08:27.490810 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0309 09:08:27.490923 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 09:08:27.490952 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 09:08:27.490979 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0309 09:08:27.491005 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0309 09:08:27.491032 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0309 09:08:27.491060 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0309 09:08:27.490826 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0309 09:08:27.491911 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T09:08:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf484f9a832b0e147a17ec53e64cfcda5e37f8bf1f764ddc35215a079994b71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:18Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e250b106997f151ae9c435aca2ab3d3d821f40e826afa9ff744443a6b808571\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e2
50b106997f151ae9c435aca2ab3d3d821f40e826afa9ff744443a6b808571\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:07:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:07:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:06Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:06 crc kubenswrapper[4792]: I0309 09:09:06.372564 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-k4kdn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cb7634b-66b7-4541-8e53-3e01a6cb41ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d395567d530702b468a31ec780cf0dcf356e2a07f4a28ee50329b702d8e53594\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5jf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-k4kdn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:06Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:06 crc kubenswrapper[4792]: I0309 09:09:06.381032 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fttpc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4711cce5-88a9-48c4-8e2e-522062e34a03\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjrbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjrbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fttpc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:06Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:06 crc 
kubenswrapper[4792]: I0309 09:09:06.388719 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"274366f4-bdf7-4516-9559-b90b9947999e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f6d16099e4ca2921b039c7fac87c3a9b3ee4780783ad52207c77ea2891942d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\
":\\\"cri-o://76e023bbfb6d4a1c42830654b83b26c24cddefd808ee765ebef670e8b10910b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76e023bbfb6d4a1c42830654b83b26c24cddefd808ee765ebef670e8b10910b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:07:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:07:15Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:06Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:06 crc kubenswrapper[4792]: I0309 09:09:06.443738 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:09:06 crc kubenswrapper[4792]: I0309 09:09:06.443765 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:09:06 crc kubenswrapper[4792]: I0309 09:09:06.443773 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:09:06 crc kubenswrapper[4792]: I0309 
09:09:06.443787 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:09:06 crc kubenswrapper[4792]: I0309 09:09:06.443795 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:09:06Z","lastTransitionTime":"2026-03-09T09:09:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 09:09:06 crc kubenswrapper[4792]: I0309 09:09:06.472114 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:09:06 crc kubenswrapper[4792]: I0309 09:09:06.472142 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:09:06 crc kubenswrapper[4792]: I0309 09:09:06.472151 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:09:06 crc kubenswrapper[4792]: I0309 09:09:06.472162 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:09:06 crc kubenswrapper[4792]: I0309 09:09:06.472170 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:09:06Z","lastTransitionTime":"2026-03-09T09:09:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:09:06 crc kubenswrapper[4792]: E0309 09:09:06.483171 4792 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:09:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:09:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:09:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:09:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:09:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:09:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:09:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:09:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e3b5ac96-f3df-45c5-a4ac-24aa5703690c\\\",\\\"systemUUID\\\":\\\"838abbcf-5467-42bb-9eb7-be30fe4962bb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:06Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:06 crc kubenswrapper[4792]: I0309 09:09:06.486863 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:09:06 crc kubenswrapper[4792]: I0309 09:09:06.487051 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:09:06 crc kubenswrapper[4792]: I0309 09:09:06.487151 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:09:06 crc kubenswrapper[4792]: I0309 09:09:06.487235 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:09:06 crc kubenswrapper[4792]: I0309 09:09:06.487292 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:09:06Z","lastTransitionTime":"2026-03-09T09:09:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:09:06 crc kubenswrapper[4792]: E0309 09:09:06.496868 4792 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:09:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:09:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:09:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:09:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:09:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:09:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:09:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:09:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e3b5ac96-f3df-45c5-a4ac-24aa5703690c\\\",\\\"systemUUID\\\":\\\"838abbcf-5467-42bb-9eb7-be30fe4962bb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:06Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:06 crc kubenswrapper[4792]: I0309 09:09:06.499543 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:09:06 crc kubenswrapper[4792]: I0309 09:09:06.499563 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:09:06 crc kubenswrapper[4792]: I0309 09:09:06.499570 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:09:06 crc kubenswrapper[4792]: I0309 09:09:06.499583 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:09:06 crc kubenswrapper[4792]: I0309 09:09:06.499592 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:09:06Z","lastTransitionTime":"2026-03-09T09:09:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:09:06 crc kubenswrapper[4792]: E0309 09:09:06.508597 4792 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:09:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:09:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:09:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:09:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:09:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:09:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:09:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:09:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e3b5ac96-f3df-45c5-a4ac-24aa5703690c\\\",\\\"systemUUID\\\":\\\"838abbcf-5467-42bb-9eb7-be30fe4962bb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:06Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:06 crc kubenswrapper[4792]: I0309 09:09:06.511244 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:09:06 crc kubenswrapper[4792]: I0309 09:09:06.511410 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:09:06 crc kubenswrapper[4792]: I0309 09:09:06.511533 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:09:06 crc kubenswrapper[4792]: I0309 09:09:06.511622 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:09:06 crc kubenswrapper[4792]: I0309 09:09:06.511680 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:09:06Z","lastTransitionTime":"2026-03-09T09:09:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:09:06 crc kubenswrapper[4792]: E0309 09:09:06.522328 4792 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:09:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:09:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:09:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:09:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:09:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:09:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:09:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:09:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[ ...identical image list omitted (verbatim repeat of the preceding patch attempt)... ],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e3b5ac96-f3df-45c5-a4ac-24aa5703690c\\\",\\\"systemUUID\\\":\\\"838abbcf-5467-42bb-9eb7-be30fe4962bb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:06Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:06 crc kubenswrapper[4792]: I0309 09:09:06.525595 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:09:06 crc kubenswrapper[4792]: I0309 09:09:06.525659 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:09:06 crc kubenswrapper[4792]: I0309 09:09:06.525676 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:09:06 crc kubenswrapper[4792]: I0309 09:09:06.525699 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:09:06 crc kubenswrapper[4792]: I0309 09:09:06.525717 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:09:06Z","lastTransitionTime":"2026-03-09T09:09:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:09:06 crc kubenswrapper[4792]: E0309 09:09:06.537250 4792 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:09:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:09:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:09:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:09:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:09:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:09:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:09:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:09:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e3b5ac96-f3df-45c5-a4ac-24aa5703690c\\\",\\\"systemUUID\\\":\\\"838abbcf-5467-42bb-9eb7-be30fe4962bb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:06Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:06 crc kubenswrapper[4792]: E0309 09:09:06.537569 4792 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 09 09:09:06 crc kubenswrapper[4792]: I0309 09:09:06.545683 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:09:06 crc kubenswrapper[4792]: I0309 09:09:06.545719 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:09:06 crc kubenswrapper[4792]: I0309 09:09:06.545732 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:09:06 crc kubenswrapper[4792]: I0309 09:09:06.545750 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:09:06 crc kubenswrapper[4792]: I0309 09:09:06.545762 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:09:06Z","lastTransitionTime":"2026-03-09T09:09:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:09:06 crc kubenswrapper[4792]: I0309 09:09:06.648104 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:09:06 crc kubenswrapper[4792]: I0309 09:09:06.648289 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:09:06 crc kubenswrapper[4792]: I0309 09:09:06.648347 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:09:06 crc kubenswrapper[4792]: I0309 09:09:06.648411 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:09:06 crc kubenswrapper[4792]: I0309 09:09:06.648499 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:09:06Z","lastTransitionTime":"2026-03-09T09:09:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 09:09:06 crc kubenswrapper[4792]: I0309 09:09:06.661676 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 09:09:06 crc kubenswrapper[4792]: I0309 09:09:06.661755 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 09:09:06 crc kubenswrapper[4792]: I0309 09:09:06.661689 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fttpc" Mar 09 09:09:06 crc kubenswrapper[4792]: I0309 09:09:06.661777 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 09:09:06 crc kubenswrapper[4792]: E0309 09:09:06.661977 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 09:09:06 crc kubenswrapper[4792]: E0309 09:09:06.662110 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fttpc" podUID="4711cce5-88a9-48c4-8e2e-522062e34a03" Mar 09 09:09:06 crc kubenswrapper[4792]: E0309 09:09:06.662237 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 09:09:06 crc kubenswrapper[4792]: E0309 09:09:06.662355 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 09:09:06 crc kubenswrapper[4792]: I0309 09:09:06.751620 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:09:06 crc kubenswrapper[4792]: I0309 09:09:06.751714 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:09:06 crc kubenswrapper[4792]: I0309 09:09:06.751770 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:09:06 crc kubenswrapper[4792]: I0309 09:09:06.751792 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:09:06 crc kubenswrapper[4792]: I0309 09:09:06.751840 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:09:06Z","lastTransitionTime":"2026-03-09T09:09:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:09:06 crc kubenswrapper[4792]: I0309 09:09:06.855371 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:09:06 crc kubenswrapper[4792]: I0309 09:09:06.855416 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:09:06 crc kubenswrapper[4792]: I0309 09:09:06.855427 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:09:06 crc kubenswrapper[4792]: I0309 09:09:06.855445 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:09:06 crc kubenswrapper[4792]: I0309 09:09:06.855459 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:09:06Z","lastTransitionTime":"2026-03-09T09:09:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:09:06 crc kubenswrapper[4792]: I0309 09:09:06.958224 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:09:06 crc kubenswrapper[4792]: I0309 09:09:06.958294 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:09:06 crc kubenswrapper[4792]: I0309 09:09:06.958311 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:09:06 crc kubenswrapper[4792]: I0309 09:09:06.958334 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:09:06 crc kubenswrapper[4792]: I0309 09:09:06.958353 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:09:06Z","lastTransitionTime":"2026-03-09T09:09:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:09:07 crc kubenswrapper[4792]: I0309 09:09:07.061428 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:09:07 crc kubenswrapper[4792]: I0309 09:09:07.061487 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:09:07 crc kubenswrapper[4792]: I0309 09:09:07.061522 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:09:07 crc kubenswrapper[4792]: I0309 09:09:07.061551 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:09:07 crc kubenswrapper[4792]: I0309 09:09:07.061573 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:09:07Z","lastTransitionTime":"2026-03-09T09:09:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:09:07 crc kubenswrapper[4792]: I0309 09:09:07.163111 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:09:07 crc kubenswrapper[4792]: I0309 09:09:07.163146 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:09:07 crc kubenswrapper[4792]: I0309 09:09:07.163154 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:09:07 crc kubenswrapper[4792]: I0309 09:09:07.163167 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:09:07 crc kubenswrapper[4792]: I0309 09:09:07.163175 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:09:07Z","lastTransitionTime":"2026-03-09T09:09:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:09:07 crc kubenswrapper[4792]: I0309 09:09:07.163831 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lfm2j_740550e5-d1a4-4f0c-8efd-1ccd8f9319e5/ovnkube-controller/2.log" Mar 09 09:09:07 crc kubenswrapper[4792]: I0309 09:09:07.164323 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lfm2j_740550e5-d1a4-4f0c-8efd-1ccd8f9319e5/ovnkube-controller/1.log" Mar 09 09:09:07 crc kubenswrapper[4792]: I0309 09:09:07.166013 4792 generic.go:334] "Generic (PLEG): container finished" podID="740550e5-d1a4-4f0c-8efd-1ccd8f9319e5" containerID="48ec991bd21bc3115bde4a99b484cd6e45e2e97d8a3a28dbbdf4e1d8323c6b66" exitCode=1 Mar 09 09:09:07 crc kubenswrapper[4792]: I0309 09:09:07.166039 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lfm2j" event={"ID":"740550e5-d1a4-4f0c-8efd-1ccd8f9319e5","Type":"ContainerDied","Data":"48ec991bd21bc3115bde4a99b484cd6e45e2e97d8a3a28dbbdf4e1d8323c6b66"} Mar 09 09:09:07 crc kubenswrapper[4792]: I0309 09:09:07.166063 4792 scope.go:117] "RemoveContainer" containerID="f2644dfb318576d2bee9a191d85edd623387dafd72ab37ff2a4309a552b738e0" Mar 09 09:09:07 crc kubenswrapper[4792]: I0309 09:09:07.166685 4792 scope.go:117] "RemoveContainer" containerID="48ec991bd21bc3115bde4a99b484cd6e45e2e97d8a3a28dbbdf4e1d8323c6b66" Mar 09 09:09:07 crc kubenswrapper[4792]: E0309 09:09:07.166804 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-lfm2j_openshift-ovn-kubernetes(740550e5-d1a4-4f0c-8efd-1ccd8f9319e5)\"" pod="openshift-ovn-kubernetes/ovnkube-node-lfm2j" podUID="740550e5-d1a4-4f0c-8efd-1ccd8f9319e5" Mar 09 09:09:07 crc kubenswrapper[4792]: I0309 09:09:07.181937 4792 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-image-registry/node-ca-fgk47" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84b6eb44-ca33-41a6-a951-2c66688ad860\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3b4aea0ac4cc05af599b411f32e712d0955e5052afc55c0d0be66e9a1249223\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cpkhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\
\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fgk47\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:07Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:07 crc kubenswrapper[4792]: I0309 09:09:07.202394 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lfm2j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"740550e5-d1a4-4f0c-8efd-1ccd8f9319e5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d93911614f6785ac12751349f50ab00c0716955c72dc48866083013e172cf3c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e0f50edd29f0791cc076a3a2974b7456aaf2a96534da791b083248dc84fa6af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c80e0ef9426b7a764e7117a789d07cf4cf940a90f38fe3ce6b230f9bbd21bfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f72b0194cacf6d5d0c95ba804286d822d2f2e5a0f385c4c5fdf8559bf6240c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:44Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edf00db622558a346d238b2df6e90a686dc913634a1b5b4e8b010b5bf09a7290\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5fc091b21251a54d9eca892667bd681e944b35b6407316a8252562e837a1e265\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48ec991bd21bc3115bde4a99b484cd6e45e2e97d8a3a28dbbdf4e1d8323c6b66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2644dfb318576d2bee9a191d85edd623387dafd72ab37ff2a4309a552b738e0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T09:08:52Z\\\",\\\"message\\\":\\\"eighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.244:9393:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d8772e82-b0a4-4596-87d3-3d517c13344b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e 
Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0309 09:08:52.970850 6651 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-ingress-canary/ingress-canary\\\\\\\"}\\\\nI0309 09:08:52.971742 6651 services_controller.go:360] Finished syncing service ingress-canary on namespace openshift-ingress-canary for network=default : 2.04927ms\\\\nI0309 09:08:52.971757 6651 services_controller.go:356] Processing sync for service openshift-cluster-samples-operator/metrics for network=default\\\\nI0309 09:08:52.971763 6651 services_controller.go:360] Finished syncing service metrics on namespace openshift-cluster-samples-operator for network=default : 7.191µs\\\\nI0309 09:08:52.971772 6651 services_controller.go:356] Processing sync for service openshift-console/console for network=default\\\\nF0309 09:08:52.971780 6651 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to sha\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T09:08:52Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48ec991bd21bc3115bde4a99b484cd6e45e2e97d8a3a28dbbdf4e1d8323c6b66\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T09:09:06Z\\\",\\\"message\\\":\\\"45 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-config-operator/machine-config-controller\\\\\\\"}\\\\nI0309 09:09:06.535081 6845 obj_retry.go:365] Adding new object: *v1.Pod openshift-dns/node-resolver-k4kdn\\\\nI0309 09:09:06.535093 6845 services_controller.go:360] Finished syncing service 
machine-config-controller on namespace openshift-machine-config-operator for network=default : 1.521797ms\\\\nI0309 09:09:06.535099 6845 ovn.go:134] Ensuring zone local for Pod openshift-dns/node-resolver-k4kdn in node crc\\\\nI0309 09:09:06.535103 6845 obj_retry.go:386] Retry successful for *v1.Pod openshift-dns/node-resolver-k4kdn after 0 failed attempt(s)\\\\nI0309 09:09:06.535109 6845 default_network_controller.go:776] Recording success event on pod openshift-dns/node-resolver-k4kdn\\\\nI0309 09:09:06.535110 6845 services_controller.go:356] Processing sync for service openshift-multus/multus-admission-controller for network=default\\\\nF0309 09:09:06.535119 6845 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not ad\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T09:09:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountP
ath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72a93106360fd23f597cf8fb963aca72a606f82557a5b17125a969e5b5d5918f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\
":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9abf97193d88754a49bcc12e1cfbe83371778e157d028dd3d1084abfa8526822\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9abf97193d88754a49bcc12e1cfbe83371778e157d028dd3d1084abfa8526822\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:08:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lfm2j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:07Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:07 crc 
kubenswrapper[4792]: I0309 09:09:07.215189 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7e46817-10cf-448c-8a2a-154f1c322ce6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1070465f72d99ed22e913112259837db7d789c0a072b40956088f4a70162c41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://314838f53bc19a9f3eb7fd9d3f5473b23a177f2a3068d91f1b0420c27910d409\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://226eecaea6fec5a3ae93063702c719edf3908636a2862b9f50a874f494a19ccf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://420959a5229bf4ed5e6b94cf4a6685b96e4b50d13055cbccf66b6188e47bc770\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://420959a5229bf4ed5e6b94cf4a6685b96e4b50d13055cbccf66b6188e47bc770\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T09:08:27Z\\\",\\\"message\\\":\\\"le observer\\\\nW0309 09:08:27.070870 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0309 09:08:27.070998 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 09:08:27.071818 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2825204129/tls.crt::/tmp/serving-cert-2825204129/tls.key\\\\\\\"\\\\nI0309 09:08:27.485720 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0309 09:08:27.487343 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0309 09:08:27.487361 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0309 09:08:27.487383 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0309 09:08:27.487388 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0309 09:08:27.490810 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0309 09:08:27.490923 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 09:08:27.490952 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 09:08:27.490979 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0309 09:08:27.491005 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0309 09:08:27.491032 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0309 09:08:27.491060 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0309 09:08:27.490826 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0309 09:08:27.491911 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T09:08:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf484f9a832b0e147a17ec53e64cfcda5e37f8bf1f764ddc35215a079994b71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:18Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e250b106997f151ae9c435aca2ab3d3d821f40e826afa9ff744443a6b808571\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e250b106997f151ae9c435aca2ab3d3d821f40e826afa9ff744443a6b808571\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:07:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:07:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:07Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:07 crc kubenswrapper[4792]: I0309 09:09:07.225282 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fttpc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4711cce5-88a9-48c4-8e2e-522062e34a03\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjrbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjrbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fttpc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:07Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:07 crc 
kubenswrapper[4792]: I0309 09:09:07.235676 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"274366f4-bdf7-4516-9559-b90b9947999e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f6d16099e4ca2921b039c7fac87c3a9b3ee4780783ad52207c77ea2891942d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\
":\\\"cri-o://76e023bbfb6d4a1c42830654b83b26c24cddefd808ee765ebef670e8b10910b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76e023bbfb6d4a1c42830654b83b26c24cddefd808ee765ebef670e8b10910b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:07:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:07:15Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:07Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:07 crc kubenswrapper[4792]: I0309 09:09:07.246620 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-k4kdn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cb7634b-66b7-4541-8e53-3e01a6cb41ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d395567d530702b468a31ec780cf0dcf356e2a07f4a28ee50329b702d8e53594\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5jf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-k4kdn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:07Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:07 crc kubenswrapper[4792]: I0309 09:09:07.262295 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:07Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:07 crc kubenswrapper[4792]: I0309 09:09:07.269327 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:09:07 crc kubenswrapper[4792]: I0309 09:09:07.269372 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:09:07 crc kubenswrapper[4792]: I0309 09:09:07.269384 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:09:07 crc kubenswrapper[4792]: I0309 09:09:07.269407 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:09:07 crc kubenswrapper[4792]: I0309 09:09:07.269423 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:09:07Z","lastTransitionTime":"2026-03-09T09:09:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 09:09:07 crc kubenswrapper[4792]: I0309 09:09:07.281289 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vgtc9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"626ea896-2e5c-4478-a7be-34a19acc242d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57f7829120f56a8ab9ff342c3d9fd043ca5559518f8d818c0306764f491f4b3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/et
c/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4rw87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vgtc9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:07Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:07 crc kubenswrapper[4792]: I0309 09:09:07.297700 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4tprh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b59ff3b-540d-4385-b02b-f68349bb74bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54d1a12d5c015eebcdc6e5d6e253c40a4dd476bbc12be0c63665aa6bc091f72f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54059591e4b093ca2c60bc4ea9f0b0a6c44077e07ebaa66bab76d10f0d2f0f9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54059591e4b093ca2c60bc4ea9f0b0a6c44077e07ebaa66bab76d10f0d2f0f9f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:08:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86174a85f5a0f642224f0a1f175358d157641043ca6e1b88e119e7de171b5582\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86174a85f5a0f642224f0a1f175358d157641043ca6e1b88e119e7de171b5582\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:08:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:08:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb6462fecadedf78b23e5a2276698144ffea837649f3c15664a573c9724f7780\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb6462fecadedf78b23e5a2276698144ffea837649f3c15664a573c9724f7780\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:08:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/hos
t/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42715848a7dace5208b36242c41aaa8f3c96fe5f7f6320b8acd5f2ea52a338bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42715848a7dace5208b36242c41aaa8f3c96fe5f7f6320b8acd5f2ea52a338bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:08:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:08:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b4eb0778c3ebce9cfec4abfead327d02f8d401aa11ce1578a7cbba1b9fe8854\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b4eb0778c3ebce9cfec4abfead327d02f8d401aa11ce1578a7cbba1b9fe8854\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:08:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87e9a77e1689ff7f403824349c08b6d6c85b408875ff852f56090cf3fe2bb487\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87e9a77e1689ff7f403824349c08b6d6c85b408875ff852f56090cf3fe2bb487\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:08:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:08:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4tprh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:07Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:07 crc kubenswrapper[4792]: I0309 09:09:07.310836 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ssk9p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ceed39a1-2e4f-4630-bda0-57071ac26ee4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://648c51e35617b16c1bc2e0f86ea31ee9256943628595864fc27951fce17197cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f
4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7tn4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a6364e9d51e6626caf8b49f387f23eb4bfdfc13e149fb8847b7a1dd1637bf79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7tn4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod 
\"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ssk9p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:07Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:07 crc kubenswrapper[4792]: I0309 09:09:07.329539 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a861a55a-f3b0-4a55-a961-7798ef57d3c0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48162a23c55321a8b6855318c9d71661a5cca0af913d359711cf6685b332994e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\
\\":\\\"2026-03-09T09:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58926f05ec9e42e1dc7aa5c3ab9950537c031f3838fb18036ce2a84b2b2ce147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1cc1ddaa2a5291284d4281d64b3d0aebf06cee6ef23da3a608ca06b240c8e4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes
/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a8bf3c896aae857de56db687c87ea164520667dc71fa495e3c396b478ab472e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d9d3cba4dfbe99ce715a9ee40af9171a41ccb193306f9b106588cc4b5f620e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea228ca60beeed2978d4efcc092828a381aaf0e00e0baf2a733c75cb15642aa
c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea228ca60beeed2978d4efcc092828a381aaf0e00e0baf2a733c75cb15642aac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:07:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e86823681f24457a83404aa4739988af2b8abf63506d49c1db09bee7501a7548\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e86823681f24457a83404aa4739988af2b8abf63506d49c1db09bee7501a7548\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:07:17Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://12c05ec878f2865338912d3d41f970b88835d7e7d9719a119eb4d06955850617\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\"
:\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://12c05ec878f2865338912d3d41f970b88835d7e7d9719a119eb4d06955850617\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:07:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:07Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:07 crc kubenswrapper[4792]: I0309 09:09:07.344147 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da622175994cf951cfa730455ea5a163271bc993fc5a1268ed072b944a612524\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:07Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:07 crc kubenswrapper[4792]: I0309 09:09:07.359382 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:07Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:07 crc kubenswrapper[4792]: I0309 09:09:07.372501 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:09:07 crc kubenswrapper[4792]: I0309 09:09:07.372678 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:09:07 crc kubenswrapper[4792]: I0309 09:09:07.372834 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:09:07 crc kubenswrapper[4792]: I0309 09:09:07.372984 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:09:07 crc kubenswrapper[4792]: I0309 09:09:07.373128 4792 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:09:07Z","lastTransitionTime":"2026-03-09T09:09:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 09:09:07 crc kubenswrapper[4792]: I0309 09:09:07.376247 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-97tth" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd11045a-d746-4b42-872c-8b8d1dd2d515\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96248640b95891716f26355ace06d60da675ab3aa8086e6a7c94ad528fc1357d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-
rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zdqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d060627a577507a2b0030b6aea753d50e0c6766ac4876d95ac5d9d3401f9b818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zdqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-97tth\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-03-09T09:09:07Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:07 crc kubenswrapper[4792]: I0309 09:09:07.388921 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab83a4df67ef256deece5dabe54496035a360833d9b5e926a1045196734d2c24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:07Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:07 crc kubenswrapper[4792]: I0309 09:09:07.405635 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4403f41d10eba801e3729ecffdc397e603f3c4899d1a62e02cb35418645d1ca4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-conf
ig\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://655b419f4453a7f0fff411f579326c7fda157c08267b1dcf23f7e1d11b684c20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:07Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:07 crc kubenswrapper[4792]: I0309 09:09:07.419317 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:07Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:07 crc kubenswrapper[4792]: I0309 09:09:07.475667 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:09:07 crc kubenswrapper[4792]: I0309 09:09:07.475715 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:09:07 crc kubenswrapper[4792]: I0309 09:09:07.475736 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:09:07 crc kubenswrapper[4792]: I0309 09:09:07.475760 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:09:07 crc kubenswrapper[4792]: I0309 09:09:07.475777 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:09:07Z","lastTransitionTime":"2026-03-09T09:09:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 09:09:07 crc kubenswrapper[4792]: I0309 09:09:07.577736 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:09:07 crc kubenswrapper[4792]: I0309 09:09:07.578279 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:09:07 crc kubenswrapper[4792]: I0309 09:09:07.578358 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:09:07 crc kubenswrapper[4792]: I0309 09:09:07.578443 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:09:07 crc kubenswrapper[4792]: I0309 09:09:07.578554 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:09:07Z","lastTransitionTime":"2026-03-09T09:09:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:09:07 crc kubenswrapper[4792]: I0309 09:09:07.680339 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:09:07 crc kubenswrapper[4792]: I0309 09:09:07.680380 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:09:07 crc kubenswrapper[4792]: I0309 09:09:07.680390 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:09:07 crc kubenswrapper[4792]: I0309 09:09:07.680406 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:09:07 crc kubenswrapper[4792]: I0309 09:09:07.680416 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:09:07Z","lastTransitionTime":"2026-03-09T09:09:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:09:07 crc kubenswrapper[4792]: I0309 09:09:07.782992 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:09:07 crc kubenswrapper[4792]: I0309 09:09:07.783044 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:09:07 crc kubenswrapper[4792]: I0309 09:09:07.783061 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:09:07 crc kubenswrapper[4792]: I0309 09:09:07.783123 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:09:07 crc kubenswrapper[4792]: I0309 09:09:07.783160 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:09:07Z","lastTransitionTime":"2026-03-09T09:09:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:09:07 crc kubenswrapper[4792]: I0309 09:09:07.885640 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:09:07 crc kubenswrapper[4792]: I0309 09:09:07.885676 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:09:07 crc kubenswrapper[4792]: I0309 09:09:07.885686 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:09:07 crc kubenswrapper[4792]: I0309 09:09:07.885702 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:09:07 crc kubenswrapper[4792]: I0309 09:09:07.885714 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:09:07Z","lastTransitionTime":"2026-03-09T09:09:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:09:07 crc kubenswrapper[4792]: I0309 09:09:07.988237 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:09:07 crc kubenswrapper[4792]: I0309 09:09:07.988288 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:09:07 crc kubenswrapper[4792]: I0309 09:09:07.988304 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:09:07 crc kubenswrapper[4792]: I0309 09:09:07.988325 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:09:07 crc kubenswrapper[4792]: I0309 09:09:07.988342 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:09:07Z","lastTransitionTime":"2026-03-09T09:09:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:09:08 crc kubenswrapper[4792]: I0309 09:09:08.091581 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:09:08 crc kubenswrapper[4792]: I0309 09:09:08.091647 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:09:08 crc kubenswrapper[4792]: I0309 09:09:08.091666 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:09:08 crc kubenswrapper[4792]: I0309 09:09:08.091690 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:09:08 crc kubenswrapper[4792]: I0309 09:09:08.091706 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:09:08Z","lastTransitionTime":"2026-03-09T09:09:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:09:08 crc kubenswrapper[4792]: I0309 09:09:08.172439 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lfm2j_740550e5-d1a4-4f0c-8efd-1ccd8f9319e5/ovnkube-controller/2.log" Mar 09 09:09:08 crc kubenswrapper[4792]: I0309 09:09:08.176908 4792 scope.go:117] "RemoveContainer" containerID="48ec991bd21bc3115bde4a99b484cd6e45e2e97d8a3a28dbbdf4e1d8323c6b66" Mar 09 09:09:08 crc kubenswrapper[4792]: E0309 09:09:08.177098 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-lfm2j_openshift-ovn-kubernetes(740550e5-d1a4-4f0c-8efd-1ccd8f9319e5)\"" pod="openshift-ovn-kubernetes/ovnkube-node-lfm2j" podUID="740550e5-d1a4-4f0c-8efd-1ccd8f9319e5" Mar 09 09:09:08 crc kubenswrapper[4792]: I0309 09:09:08.194503 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:09:08 crc kubenswrapper[4792]: I0309 09:09:08.194560 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:09:08 crc kubenswrapper[4792]: I0309 09:09:08.194576 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:09:08 crc kubenswrapper[4792]: I0309 09:09:08.194595 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:09:08 crc kubenswrapper[4792]: I0309 09:09:08.194611 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:09:08Z","lastTransitionTime":"2026-03-09T09:09:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI 
configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 09:09:08 crc kubenswrapper[4792]: I0309 09:09:08.200689 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a861a55a-f3b0-4a55-a961-7798ef57d3c0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48162a23c55321a8b6855318c9d71661a5cca0af913d359711cf6685b332994e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\
\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58926f05ec9e42e1dc7aa5c3ab9950537c031f3838fb18036ce2a84b2b2ce147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1cc1ddaa2a5291284d4281d64b3d0aebf06cee6ef23da3a608ca06b240c8e4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a8bf3c896aae857de56db687c87ea164520667dc71fa495e3c396b478ab472e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"
imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d9d3cba4dfbe99ce715a9ee40af9171a41ccb193306f9b106588cc4b5f620e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea228ca60beeed2978d4efcc092828a381aaf0e00e0baf2a733c75cb15642aac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastStat
e\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea228ca60beeed2978d4efcc092828a381aaf0e00e0baf2a733c75cb15642aac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:07:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e86823681f24457a83404aa4739988af2b8abf63506d49c1db09bee7501a7548\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e86823681f24457a83404aa4739988af2b8abf63506d49c1db09bee7501a7548\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:07:17Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://12c05ec878f2865338912d3d41f970b88835d7e7d9719a119eb4d06955850617\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://12c05ec878f2865338912d3d41f970b88835d7e7d9719a119eb4d06955850617\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:07:18Z\\\",\\\"re
ason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:07:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:08Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:08 crc kubenswrapper[4792]: I0309 09:09:08.214615 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da622175994cf951cfa730455ea5a163271bc993fc5a1268ed072b944a612524\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:08Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:08 crc kubenswrapper[4792]: I0309 09:09:08.238282 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:08Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:08 crc kubenswrapper[4792]: I0309 09:09:08.259924 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vgtc9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"626ea896-2e5c-4478-a7be-34a19acc242d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57f7829120f56a8ab9ff342c3d9fd043ca5559518f8d818c0306764f491f4b3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4rw87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vgtc9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:08Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:08 crc kubenswrapper[4792]: I0309 09:09:08.278128 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4tprh" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b59ff3b-540d-4385-b02b-f68349bb74bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54d1a12d5c015eebcdc6e5d6e253c40a4dd476bbc12be0c63665aa6bc091f72f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54059591e4b093ca2c60bc4ea9f0b0a6c44077e07ebaa66ba
b76d10f0d2f0f9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54059591e4b093ca2c60bc4ea9f0b0a6c44077e07ebaa66bab76d10f0d2f0f9f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:08:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86174a85f5a0f642224f0a1f175358d157641043ca6e1b88e119e7de171b5582\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86174a85f5a0f642224f0a1f175358d157641043ca6e1b88e119e7de171b5582\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:08:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:
08:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb6462fecadedf78b23e5a2276698144ffea837649f3c15664a573c9724f7780\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb6462fecadedf78b23e5a2276698144ffea837649f3c15664a573c9724f7780\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:08:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://42715848a7dace5208b36242c41aaa8f3c96fe5f7f6320b8acd5f2ea52a338bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42715848a7dace5208b36242c41aaa8f3c96fe5f7f6320b8acd5f2ea52a338bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:08:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:08:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b4eb0778c3ebce9cfec4abfead327d02f8d401aa11ce1578a7cbba1b9fe8854\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b4eb0778c3ebce9cfec4abfead327d02f8d401aa11ce1578a7cbba1b9fe8854\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:08:48Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87e9a77e1689ff7f403824349c08b6d6c85b408875ff852f56090cf3fe2bb487\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87e9a77e1689ff7f403824349c08b6d6c85b408875ff852f56090cf3fe2bb487\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:08:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:08:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4tprh\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:08Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:08 crc kubenswrapper[4792]: I0309 09:09:08.291563 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ssk9p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ceed39a1-2e4f-4630-bda0-57071ac26ee4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://648c51e35617b16c1bc2e0f86ea31ee9256943628595864fc27951fce17197cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"
state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7tn4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a6364e9d51e6626caf8b49f387f23eb4bfdfc13e149fb8847b7a1dd1637bf79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7tn4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ssk9p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet 
valid: current time 2026-03-09T09:09:08Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:08 crc kubenswrapper[4792]: I0309 09:09:08.297382 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:09:08 crc kubenswrapper[4792]: I0309 09:09:08.297424 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:09:08 crc kubenswrapper[4792]: I0309 09:09:08.297439 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:09:08 crc kubenswrapper[4792]: I0309 09:09:08.297458 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:09:08 crc kubenswrapper[4792]: I0309 09:09:08.297472 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:09:08Z","lastTransitionTime":"2026-03-09T09:09:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:09:08 crc kubenswrapper[4792]: I0309 09:09:08.310875 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4403f41d10eba801e3729ecffdc397e603f3c4899d1a62e02cb35418645d1ca4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://655b419f4453a7f0fff411f579326c7fda157c08267b1dcf23f7e1d11b684c20\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:08Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:08 crc kubenswrapper[4792]: I0309 09:09:08.325309 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:08Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:08 crc kubenswrapper[4792]: I0309 09:09:08.351867 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:08Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:08 crc kubenswrapper[4792]: I0309 09:09:08.364761 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-97tth" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd11045a-d746-4b42-872c-8b8d1dd2d515\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96248640b95891716f26355ace06d60da675ab3aa8086e6a7c94ad528fc1357d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zdqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d060627a577507a2b0030b6aea753d50e0c6766a
c4876d95ac5d9d3401f9b818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zdqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-97tth\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:08Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:08 crc kubenswrapper[4792]: I0309 09:09:08.378829 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab83a4df67ef256deece5dabe54496035a360833d9b5e926a1045196734d2c24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-09T09:09:08Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:08 crc kubenswrapper[4792]: I0309 09:09:08.390498 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7e46817-10cf-448c-8a2a-154f1c322ce6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1070465f72d99ed22e913112259837db7d789c0a072b40956088f4a70162c41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://314838f53bc19a9f3eb7fd9d3f5473b23a177f2a3068d91f1b0420c27910d409\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://226eecaea6fec5a3ae93063702c719edf3908636a2862b9f50a874f494a19ccf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://420959a5229bf4ed5e6b94cf4a6685b96e4b50d13055cbccf66b6188e47bc770\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://420959a5229bf4ed5e6b94cf4a6685b96e4b50d13055cbccf66b6188e47bc770\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T09:08:27Z\\\",\\\"message\\\":\\\"le observer\\\\nW0309 09:08:27.070870 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0309 09:08:27.070998 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 09:08:27.071818 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2825204129/tls.crt::/tmp/serving-cert-2825204129/tls.key\\\\\\\"\\\\nI0309 09:08:27.485720 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0309 09:08:27.487343 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0309 09:08:27.487361 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0309 09:08:27.487383 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0309 09:08:27.487388 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0309 09:08:27.490810 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0309 09:08:27.490923 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 09:08:27.490952 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 09:08:27.490979 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0309 09:08:27.491005 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0309 09:08:27.491032 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0309 09:08:27.491060 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0309 09:08:27.490826 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0309 09:08:27.491911 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T09:08:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf484f9a832b0e147a17ec53e64cfcda5e37f8bf1f764ddc35215a079994b71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:18Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e250b106997f151ae9c435aca2ab3d3d821f40e826afa9ff744443a6b808571\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e250b106997f151ae9c435aca2ab3d3d821f40e826afa9ff744443a6b808571\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:07:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:07:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:08Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:08 crc kubenswrapper[4792]: I0309 09:09:08.399788 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fgk47" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84b6eb44-ca33-41a6-a951-2c66688ad860\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3b4aea0ac4cc05af599b411f32e712d0955e5052afc55c0d0be66e9a1249223\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cpkhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fgk47\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:08Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:08 crc kubenswrapper[4792]: I0309 09:09:08.399876 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:09:08 crc kubenswrapper[4792]: I0309 09:09:08.399922 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:09:08 crc kubenswrapper[4792]: I0309 09:09:08.399950 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:09:08 crc kubenswrapper[4792]: I0309 09:09:08.399968 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 
09:09:08 crc kubenswrapper[4792]: I0309 09:09:08.399978 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:09:08Z","lastTransitionTime":"2026-03-09T09:09:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 09:09:08 crc kubenswrapper[4792]: I0309 09:09:08.416600 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lfm2j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"740550e5-d1a4-4f0c-8efd-1ccd8f9319e5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d93911614f6785ac12751349f50ab00c0716955c72dc48866083013e172cf3c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e0f50edd29f0791cc076a3a2974b7456aaf2a96534da791b083248dc84fa6af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c80e0ef9426b7a764e7117a789d07cf4cf940a90f38fe3ce6b230f9bbd21bfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f72b0194cacf6d5d0c95ba804286d822d2f2e5a0f385c4c5fdf8559bf6240c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:44Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edf00db622558a346d238b2df6e90a686dc913634a1b5b4e8b010b5bf09a7290\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5fc091b21251a54d9eca892667bd681e944b35b6407316a8252562e837a1e265\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48ec991bd21bc3115bde4a99b484cd6e45e2e97d8a3a28dbbdf4e1d8323c6b66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48ec991bd21bc3115bde4a99b484cd6e45e2e97d8a3a28dbbdf4e1d8323c6b66\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T09:09:06Z\\\",\\\"message\\\":\\\"45 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-config-operator/machine-config-controller\\\\\\\"}\\\\nI0309 09:09:06.535081 6845 obj_retry.go:365] Adding new object: *v1.Pod 
openshift-dns/node-resolver-k4kdn\\\\nI0309 09:09:06.535093 6845 services_controller.go:360] Finished syncing service machine-config-controller on namespace openshift-machine-config-operator for network=default : 1.521797ms\\\\nI0309 09:09:06.535099 6845 ovn.go:134] Ensuring zone local for Pod openshift-dns/node-resolver-k4kdn in node crc\\\\nI0309 09:09:06.535103 6845 obj_retry.go:386] Retry successful for *v1.Pod openshift-dns/node-resolver-k4kdn after 0 failed attempt(s)\\\\nI0309 09:09:06.535109 6845 default_network_controller.go:776] Recording success event on pod openshift-dns/node-resolver-k4kdn\\\\nI0309 09:09:06.535110 6845 services_controller.go:356] Processing sync for service openshift-multus/multus-admission-controller for network=default\\\\nF0309 09:09:06.535119 6845 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not ad\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T09:09:05Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-lfm2j_openshift-ovn-kubernetes(740550e5-d1a4-4f0c-8efd-1ccd8f9319e5)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72a93106360fd23f597cf8fb963aca72a606f82557a5b17125a969e5b5d5918f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9abf97193d88754a49bcc12e1cfbe83371778e157d028dd3d1084abfa8526822\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9abf97193d88754a49
bcc12e1cfbe83371778e157d028dd3d1084abfa8526822\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:08:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lfm2j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:08Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:08 crc kubenswrapper[4792]: I0309 09:09:08.426047 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"274366f4-bdf7-4516-9559-b90b9947999e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f6d16099e4ca2921b039c7fac87c3a9b3ee4780783ad52207c77ea2891942d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76e023bbfb6d4a1c42830654b83b26c24cddefd808ee765ebef670e8b10910b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76e023bbfb6d4a1c42830654b83b26c24cddefd808ee765ebef670e8b10910b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:07:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:07:15Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:08Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:08 crc kubenswrapper[4792]: I0309 09:09:08.437690 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-k4kdn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cb7634b-66b7-4541-8e53-3e01a6cb41ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d395567d530702b468a31ec780cf0dcf356e2a07f4a28ee50329b702d8e53594\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5jf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-k4kdn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:08Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:08 crc kubenswrapper[4792]: I0309 09:09:08.451363 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fttpc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4711cce5-88a9-48c4-8e2e-522062e34a03\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjrbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjrbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fttpc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:08Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:08 crc 
kubenswrapper[4792]: I0309 09:09:08.502272 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:09:08 crc kubenswrapper[4792]: I0309 09:09:08.502320 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:09:08 crc kubenswrapper[4792]: I0309 09:09:08.502337 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:09:08 crc kubenswrapper[4792]: I0309 09:09:08.502357 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:09:08 crc kubenswrapper[4792]: I0309 09:09:08.502373 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:09:08Z","lastTransitionTime":"2026-03-09T09:09:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:09:08 crc kubenswrapper[4792]: I0309 09:09:08.604700 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:09:08 crc kubenswrapper[4792]: I0309 09:09:08.604844 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:09:08 crc kubenswrapper[4792]: I0309 09:09:08.604872 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:09:08 crc kubenswrapper[4792]: I0309 09:09:08.604901 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:09:08 crc kubenswrapper[4792]: I0309 09:09:08.604924 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:09:08Z","lastTransitionTime":"2026-03-09T09:09:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 09:09:08 crc kubenswrapper[4792]: I0309 09:09:08.661706 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 09:09:08 crc kubenswrapper[4792]: I0309 09:09:08.661749 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fttpc" Mar 09 09:09:08 crc kubenswrapper[4792]: I0309 09:09:08.661749 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 09:09:08 crc kubenswrapper[4792]: E0309 09:09:08.661842 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 09:09:08 crc kubenswrapper[4792]: I0309 09:09:08.661925 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 09:09:08 crc kubenswrapper[4792]: E0309 09:09:08.662111 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 09:09:08 crc kubenswrapper[4792]: E0309 09:09:08.662200 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 09:09:08 crc kubenswrapper[4792]: E0309 09:09:08.662342 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fttpc" podUID="4711cce5-88a9-48c4-8e2e-522062e34a03" Mar 09 09:09:08 crc kubenswrapper[4792]: I0309 09:09:08.707952 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:09:08 crc kubenswrapper[4792]: I0309 09:09:08.708005 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:09:08 crc kubenswrapper[4792]: I0309 09:09:08.708027 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:09:08 crc kubenswrapper[4792]: I0309 09:09:08.708055 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:09:08 crc kubenswrapper[4792]: I0309 09:09:08.708107 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:09:08Z","lastTransitionTime":"2026-03-09T09:09:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:09:08 crc kubenswrapper[4792]: I0309 09:09:08.811510 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:09:08 crc kubenswrapper[4792]: I0309 09:09:08.811558 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:09:08 crc kubenswrapper[4792]: I0309 09:09:08.811573 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:09:08 crc kubenswrapper[4792]: I0309 09:09:08.811593 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:09:08 crc kubenswrapper[4792]: I0309 09:09:08.811608 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:09:08Z","lastTransitionTime":"2026-03-09T09:09:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:09:08 crc kubenswrapper[4792]: I0309 09:09:08.914581 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:09:08 crc kubenswrapper[4792]: I0309 09:09:08.914630 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:09:08 crc kubenswrapper[4792]: I0309 09:09:08.914640 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:09:08 crc kubenswrapper[4792]: I0309 09:09:08.914655 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:09:08 crc kubenswrapper[4792]: I0309 09:09:08.914664 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:09:08Z","lastTransitionTime":"2026-03-09T09:09:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:09:09 crc kubenswrapper[4792]: I0309 09:09:09.017457 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:09:09 crc kubenswrapper[4792]: I0309 09:09:09.017520 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:09:09 crc kubenswrapper[4792]: I0309 09:09:09.017531 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:09:09 crc kubenswrapper[4792]: I0309 09:09:09.017545 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:09:09 crc kubenswrapper[4792]: I0309 09:09:09.017555 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:09:09Z","lastTransitionTime":"2026-03-09T09:09:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:09:09 crc kubenswrapper[4792]: I0309 09:09:09.120893 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:09:09 crc kubenswrapper[4792]: I0309 09:09:09.120975 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:09:09 crc kubenswrapper[4792]: I0309 09:09:09.120992 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:09:09 crc kubenswrapper[4792]: I0309 09:09:09.121473 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:09:09 crc kubenswrapper[4792]: I0309 09:09:09.121529 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:09:09Z","lastTransitionTime":"2026-03-09T09:09:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:09:09 crc kubenswrapper[4792]: I0309 09:09:09.224274 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:09:09 crc kubenswrapper[4792]: I0309 09:09:09.224319 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:09:09 crc kubenswrapper[4792]: I0309 09:09:09.224331 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:09:09 crc kubenswrapper[4792]: I0309 09:09:09.224349 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:09:09 crc kubenswrapper[4792]: I0309 09:09:09.224360 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:09:09Z","lastTransitionTime":"2026-03-09T09:09:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:09:09 crc kubenswrapper[4792]: I0309 09:09:09.326473 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:09:09 crc kubenswrapper[4792]: I0309 09:09:09.326519 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:09:09 crc kubenswrapper[4792]: I0309 09:09:09.326530 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:09:09 crc kubenswrapper[4792]: I0309 09:09:09.326547 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:09:09 crc kubenswrapper[4792]: I0309 09:09:09.326560 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:09:09Z","lastTransitionTime":"2026-03-09T09:09:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:09:09 crc kubenswrapper[4792]: I0309 09:09:09.429569 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:09:09 crc kubenswrapper[4792]: I0309 09:09:09.429624 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:09:09 crc kubenswrapper[4792]: I0309 09:09:09.429646 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:09:09 crc kubenswrapper[4792]: I0309 09:09:09.429672 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:09:09 crc kubenswrapper[4792]: I0309 09:09:09.429692 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:09:09Z","lastTransitionTime":"2026-03-09T09:09:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:09:09 crc kubenswrapper[4792]: I0309 09:09:09.531407 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:09:09 crc kubenswrapper[4792]: I0309 09:09:09.531487 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:09:09 crc kubenswrapper[4792]: I0309 09:09:09.531508 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:09:09 crc kubenswrapper[4792]: I0309 09:09:09.531533 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:09:09 crc kubenswrapper[4792]: I0309 09:09:09.531551 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:09:09Z","lastTransitionTime":"2026-03-09T09:09:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:09:09 crc kubenswrapper[4792]: I0309 09:09:09.634261 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:09:09 crc kubenswrapper[4792]: I0309 09:09:09.634293 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:09:09 crc kubenswrapper[4792]: I0309 09:09:09.634303 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:09:09 crc kubenswrapper[4792]: I0309 09:09:09.634321 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:09:09 crc kubenswrapper[4792]: I0309 09:09:09.634331 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:09:09Z","lastTransitionTime":"2026-03-09T09:09:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:09:09 crc kubenswrapper[4792]: I0309 09:09:09.737294 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:09:09 crc kubenswrapper[4792]: I0309 09:09:09.737325 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:09:09 crc kubenswrapper[4792]: I0309 09:09:09.737335 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:09:09 crc kubenswrapper[4792]: I0309 09:09:09.737348 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:09:09 crc kubenswrapper[4792]: I0309 09:09:09.737357 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:09:09Z","lastTransitionTime":"2026-03-09T09:09:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:09:09 crc kubenswrapper[4792]: I0309 09:09:09.840783 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:09:09 crc kubenswrapper[4792]: I0309 09:09:09.840871 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:09:09 crc kubenswrapper[4792]: I0309 09:09:09.840893 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:09:09 crc kubenswrapper[4792]: I0309 09:09:09.840919 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:09:09 crc kubenswrapper[4792]: I0309 09:09:09.840941 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:09:09Z","lastTransitionTime":"2026-03-09T09:09:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:09:09 crc kubenswrapper[4792]: I0309 09:09:09.943607 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:09:09 crc kubenswrapper[4792]: I0309 09:09:09.943667 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:09:09 crc kubenswrapper[4792]: I0309 09:09:09.943683 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:09:09 crc kubenswrapper[4792]: I0309 09:09:09.943706 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:09:09 crc kubenswrapper[4792]: I0309 09:09:09.943723 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:09:09Z","lastTransitionTime":"2026-03-09T09:09:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:09:10 crc kubenswrapper[4792]: I0309 09:09:10.046617 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:09:10 crc kubenswrapper[4792]: I0309 09:09:10.046676 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:09:10 crc kubenswrapper[4792]: I0309 09:09:10.046695 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:09:10 crc kubenswrapper[4792]: I0309 09:09:10.046724 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:09:10 crc kubenswrapper[4792]: I0309 09:09:10.046746 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:09:10Z","lastTransitionTime":"2026-03-09T09:09:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:09:10 crc kubenswrapper[4792]: I0309 09:09:10.149774 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:09:10 crc kubenswrapper[4792]: I0309 09:09:10.149835 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:09:10 crc kubenswrapper[4792]: I0309 09:09:10.149852 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:09:10 crc kubenswrapper[4792]: I0309 09:09:10.149876 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:09:10 crc kubenswrapper[4792]: I0309 09:09:10.149893 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:09:10Z","lastTransitionTime":"2026-03-09T09:09:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:09:10 crc kubenswrapper[4792]: I0309 09:09:10.253566 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:09:10 crc kubenswrapper[4792]: I0309 09:09:10.253631 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:09:10 crc kubenswrapper[4792]: I0309 09:09:10.253649 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:09:10 crc kubenswrapper[4792]: I0309 09:09:10.253684 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:09:10 crc kubenswrapper[4792]: I0309 09:09:10.253702 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:09:10Z","lastTransitionTime":"2026-03-09T09:09:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:09:10 crc kubenswrapper[4792]: I0309 09:09:10.359164 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:09:10 crc kubenswrapper[4792]: I0309 09:09:10.359231 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:09:10 crc kubenswrapper[4792]: I0309 09:09:10.359249 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:09:10 crc kubenswrapper[4792]: I0309 09:09:10.359279 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:09:10 crc kubenswrapper[4792]: I0309 09:09:10.359308 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:09:10Z","lastTransitionTime":"2026-03-09T09:09:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:09:10 crc kubenswrapper[4792]: I0309 09:09:10.462763 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:09:10 crc kubenswrapper[4792]: I0309 09:09:10.462859 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:09:10 crc kubenswrapper[4792]: I0309 09:09:10.462916 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:09:10 crc kubenswrapper[4792]: I0309 09:09:10.462940 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:09:10 crc kubenswrapper[4792]: I0309 09:09:10.463018 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:09:10Z","lastTransitionTime":"2026-03-09T09:09:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:09:10 crc kubenswrapper[4792]: I0309 09:09:10.565967 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:09:10 crc kubenswrapper[4792]: I0309 09:09:10.566032 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:09:10 crc kubenswrapper[4792]: I0309 09:09:10.566054 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:09:10 crc kubenswrapper[4792]: I0309 09:09:10.566118 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:09:10 crc kubenswrapper[4792]: I0309 09:09:10.566143 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:09:10Z","lastTransitionTime":"2026-03-09T09:09:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 09:09:10 crc kubenswrapper[4792]: I0309 09:09:10.661828 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 09:09:10 crc kubenswrapper[4792]: I0309 09:09:10.661885 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 09:09:10 crc kubenswrapper[4792]: I0309 09:09:10.661939 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 09:09:10 crc kubenswrapper[4792]: E0309 09:09:10.662602 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 09:09:10 crc kubenswrapper[4792]: E0309 09:09:10.662678 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 09:09:10 crc kubenswrapper[4792]: E0309 09:09:10.662761 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 09:09:10 crc kubenswrapper[4792]: I0309 09:09:10.663346 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-fttpc" Mar 09 09:09:10 crc kubenswrapper[4792]: E0309 09:09:10.663563 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fttpc" podUID="4711cce5-88a9-48c4-8e2e-522062e34a03" Mar 09 09:09:10 crc kubenswrapper[4792]: I0309 09:09:10.671399 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:09:10 crc kubenswrapper[4792]: I0309 09:09:10.671465 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:09:10 crc kubenswrapper[4792]: I0309 09:09:10.671486 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:09:10 crc kubenswrapper[4792]: I0309 09:09:10.671525 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:09:10 crc kubenswrapper[4792]: I0309 09:09:10.671564 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:09:10Z","lastTransitionTime":"2026-03-09T09:09:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:09:10 crc kubenswrapper[4792]: I0309 09:09:10.775816 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:09:10 crc kubenswrapper[4792]: I0309 09:09:10.775861 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:09:10 crc kubenswrapper[4792]: I0309 09:09:10.775871 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:09:10 crc kubenswrapper[4792]: I0309 09:09:10.775887 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:09:10 crc kubenswrapper[4792]: I0309 09:09:10.775896 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:09:10Z","lastTransitionTime":"2026-03-09T09:09:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:09:10 crc kubenswrapper[4792]: I0309 09:09:10.878497 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:09:10 crc kubenswrapper[4792]: I0309 09:09:10.878560 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:09:10 crc kubenswrapper[4792]: I0309 09:09:10.878578 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:09:10 crc kubenswrapper[4792]: I0309 09:09:10.878603 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:09:10 crc kubenswrapper[4792]: I0309 09:09:10.878621 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:09:10Z","lastTransitionTime":"2026-03-09T09:09:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:09:10 crc kubenswrapper[4792]: I0309 09:09:10.981744 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:09:10 crc kubenswrapper[4792]: I0309 09:09:10.981818 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:09:10 crc kubenswrapper[4792]: I0309 09:09:10.981839 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:09:10 crc kubenswrapper[4792]: I0309 09:09:10.981870 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:09:10 crc kubenswrapper[4792]: I0309 09:09:10.981893 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:09:10Z","lastTransitionTime":"2026-03-09T09:09:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:09:11 crc kubenswrapper[4792]: I0309 09:09:11.084916 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:09:11 crc kubenswrapper[4792]: I0309 09:09:11.084991 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:09:11 crc kubenswrapper[4792]: I0309 09:09:11.085012 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:09:11 crc kubenswrapper[4792]: I0309 09:09:11.085035 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:09:11 crc kubenswrapper[4792]: I0309 09:09:11.085055 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:09:11Z","lastTransitionTime":"2026-03-09T09:09:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:09:11 crc kubenswrapper[4792]: I0309 09:09:11.188850 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:09:11 crc kubenswrapper[4792]: I0309 09:09:11.188927 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:09:11 crc kubenswrapper[4792]: I0309 09:09:11.188951 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:09:11 crc kubenswrapper[4792]: I0309 09:09:11.188984 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:09:11 crc kubenswrapper[4792]: I0309 09:09:11.189008 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:09:11Z","lastTransitionTime":"2026-03-09T09:09:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:09:11 crc kubenswrapper[4792]: I0309 09:09:11.292405 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:09:11 crc kubenswrapper[4792]: I0309 09:09:11.292485 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:09:11 crc kubenswrapper[4792]: I0309 09:09:11.292511 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:09:11 crc kubenswrapper[4792]: I0309 09:09:11.292548 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:09:11 crc kubenswrapper[4792]: I0309 09:09:11.292569 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:09:11Z","lastTransitionTime":"2026-03-09T09:09:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:09:11 crc kubenswrapper[4792]: I0309 09:09:11.396177 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:09:11 crc kubenswrapper[4792]: I0309 09:09:11.396243 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:09:11 crc kubenswrapper[4792]: I0309 09:09:11.396260 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:09:11 crc kubenswrapper[4792]: I0309 09:09:11.396284 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:09:11 crc kubenswrapper[4792]: I0309 09:09:11.396304 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:09:11Z","lastTransitionTime":"2026-03-09T09:09:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:09:11 crc kubenswrapper[4792]: I0309 09:09:11.498914 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:09:11 crc kubenswrapper[4792]: I0309 09:09:11.498972 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:09:11 crc kubenswrapper[4792]: I0309 09:09:11.498989 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:09:11 crc kubenswrapper[4792]: I0309 09:09:11.499011 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:09:11 crc kubenswrapper[4792]: I0309 09:09:11.499028 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:09:11Z","lastTransitionTime":"2026-03-09T09:09:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:09:11 crc kubenswrapper[4792]: I0309 09:09:11.602010 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:09:11 crc kubenswrapper[4792]: I0309 09:09:11.602063 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:09:11 crc kubenswrapper[4792]: I0309 09:09:11.602097 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:09:11 crc kubenswrapper[4792]: I0309 09:09:11.602116 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:09:11 crc kubenswrapper[4792]: I0309 09:09:11.602130 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:09:11Z","lastTransitionTime":"2026-03-09T09:09:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:09:11 crc kubenswrapper[4792]: I0309 09:09:11.704570 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:09:11 crc kubenswrapper[4792]: I0309 09:09:11.704621 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:09:11 crc kubenswrapper[4792]: I0309 09:09:11.704639 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:09:11 crc kubenswrapper[4792]: I0309 09:09:11.704662 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:09:11 crc kubenswrapper[4792]: I0309 09:09:11.704679 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:09:11Z","lastTransitionTime":"2026-03-09T09:09:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:09:11 crc kubenswrapper[4792]: I0309 09:09:11.807322 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:09:11 crc kubenswrapper[4792]: I0309 09:09:11.807382 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:09:11 crc kubenswrapper[4792]: I0309 09:09:11.807393 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:09:11 crc kubenswrapper[4792]: I0309 09:09:11.807408 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:09:11 crc kubenswrapper[4792]: I0309 09:09:11.807419 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:09:11Z","lastTransitionTime":"2026-03-09T09:09:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:09:11 crc kubenswrapper[4792]: I0309 09:09:11.910807 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:09:11 crc kubenswrapper[4792]: I0309 09:09:11.910868 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:09:11 crc kubenswrapper[4792]: I0309 09:09:11.910889 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:09:11 crc kubenswrapper[4792]: I0309 09:09:11.910920 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:09:11 crc kubenswrapper[4792]: I0309 09:09:11.910941 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:09:11Z","lastTransitionTime":"2026-03-09T09:09:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:09:12 crc kubenswrapper[4792]: I0309 09:09:12.013931 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:09:12 crc kubenswrapper[4792]: I0309 09:09:12.013995 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:09:12 crc kubenswrapper[4792]: I0309 09:09:12.014017 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:09:12 crc kubenswrapper[4792]: I0309 09:09:12.014050 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:09:12 crc kubenswrapper[4792]: I0309 09:09:12.014107 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:09:12Z","lastTransitionTime":"2026-03-09T09:09:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:09:12 crc kubenswrapper[4792]: I0309 09:09:12.118006 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:09:12 crc kubenswrapper[4792]: I0309 09:09:12.118045 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:09:12 crc kubenswrapper[4792]: I0309 09:09:12.118053 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:09:12 crc kubenswrapper[4792]: I0309 09:09:12.118087 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:09:12 crc kubenswrapper[4792]: I0309 09:09:12.118096 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:09:12Z","lastTransitionTime":"2026-03-09T09:09:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:09:12 crc kubenswrapper[4792]: I0309 09:09:12.220587 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:09:12 crc kubenswrapper[4792]: I0309 09:09:12.220710 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:09:12 crc kubenswrapper[4792]: I0309 09:09:12.220736 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:09:12 crc kubenswrapper[4792]: I0309 09:09:12.220766 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:09:12 crc kubenswrapper[4792]: I0309 09:09:12.220788 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:09:12Z","lastTransitionTime":"2026-03-09T09:09:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:09:12 crc kubenswrapper[4792]: I0309 09:09:12.323706 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:09:12 crc kubenswrapper[4792]: I0309 09:09:12.324022 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:09:12 crc kubenswrapper[4792]: I0309 09:09:12.324258 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:09:12 crc kubenswrapper[4792]: I0309 09:09:12.324462 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:09:12 crc kubenswrapper[4792]: I0309 09:09:12.324666 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:09:12Z","lastTransitionTime":"2026-03-09T09:09:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:09:12 crc kubenswrapper[4792]: I0309 09:09:12.427821 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:09:12 crc kubenswrapper[4792]: I0309 09:09:12.427890 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:09:12 crc kubenswrapper[4792]: I0309 09:09:12.427914 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:09:12 crc kubenswrapper[4792]: I0309 09:09:12.427950 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:09:12 crc kubenswrapper[4792]: I0309 09:09:12.428105 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:09:12Z","lastTransitionTime":"2026-03-09T09:09:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:09:12 crc kubenswrapper[4792]: I0309 09:09:12.531922 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:09:12 crc kubenswrapper[4792]: I0309 09:09:12.531993 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:09:12 crc kubenswrapper[4792]: I0309 09:09:12.532009 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:09:12 crc kubenswrapper[4792]: I0309 09:09:12.532034 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:09:12 crc kubenswrapper[4792]: I0309 09:09:12.532051 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:09:12Z","lastTransitionTime":"2026-03-09T09:09:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:09:12 crc kubenswrapper[4792]: I0309 09:09:12.635209 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:09:12 crc kubenswrapper[4792]: I0309 09:09:12.635279 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:09:12 crc kubenswrapper[4792]: I0309 09:09:12.635298 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:09:12 crc kubenswrapper[4792]: I0309 09:09:12.635323 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:09:12 crc kubenswrapper[4792]: I0309 09:09:12.635340 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:09:12Z","lastTransitionTime":"2026-03-09T09:09:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 09:09:12 crc kubenswrapper[4792]: I0309 09:09:12.662264 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 09:09:12 crc kubenswrapper[4792]: I0309 09:09:12.662291 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fttpc" Mar 09 09:09:12 crc kubenswrapper[4792]: I0309 09:09:12.662306 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 09:09:12 crc kubenswrapper[4792]: I0309 09:09:12.662395 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 09:09:12 crc kubenswrapper[4792]: E0309 09:09:12.662421 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 09:09:12 crc kubenswrapper[4792]: E0309 09:09:12.662533 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fttpc" podUID="4711cce5-88a9-48c4-8e2e-522062e34a03" Mar 09 09:09:12 crc kubenswrapper[4792]: E0309 09:09:12.662675 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 09:09:12 crc kubenswrapper[4792]: E0309 09:09:12.662852 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 09:09:12 crc kubenswrapper[4792]: I0309 09:09:12.679248 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Mar 09 09:09:12 crc kubenswrapper[4792]: I0309 09:09:12.738725 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:09:12 crc kubenswrapper[4792]: I0309 09:09:12.738770 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:09:12 crc kubenswrapper[4792]: I0309 09:09:12.738810 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:09:12 crc kubenswrapper[4792]: I0309 09:09:12.738837 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:09:12 crc kubenswrapper[4792]: I0309 09:09:12.738855 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:09:12Z","lastTransitionTime":"2026-03-09T09:09:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:09:12 crc kubenswrapper[4792]: I0309 09:09:12.841557 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:09:12 crc kubenswrapper[4792]: I0309 09:09:12.841617 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:09:12 crc kubenswrapper[4792]: I0309 09:09:12.841641 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:09:12 crc kubenswrapper[4792]: I0309 09:09:12.841670 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:09:12 crc kubenswrapper[4792]: I0309 09:09:12.841689 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:09:12Z","lastTransitionTime":"2026-03-09T09:09:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:09:12 crc kubenswrapper[4792]: I0309 09:09:12.946902 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:09:12 crc kubenswrapper[4792]: I0309 09:09:12.946969 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:09:12 crc kubenswrapper[4792]: I0309 09:09:12.946992 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:09:12 crc kubenswrapper[4792]: I0309 09:09:12.947022 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:09:12 crc kubenswrapper[4792]: I0309 09:09:12.947039 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:09:12Z","lastTransitionTime":"2026-03-09T09:09:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:09:13 crc kubenswrapper[4792]: I0309 09:09:13.049134 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:09:13 crc kubenswrapper[4792]: I0309 09:09:13.049416 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:09:13 crc kubenswrapper[4792]: I0309 09:09:13.049647 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:09:13 crc kubenswrapper[4792]: I0309 09:09:13.049885 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:09:13 crc kubenswrapper[4792]: I0309 09:09:13.050138 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:09:13Z","lastTransitionTime":"2026-03-09T09:09:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:09:13 crc kubenswrapper[4792]: I0309 09:09:13.153052 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:09:13 crc kubenswrapper[4792]: I0309 09:09:13.153160 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:09:13 crc kubenswrapper[4792]: I0309 09:09:13.153183 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:09:13 crc kubenswrapper[4792]: I0309 09:09:13.153209 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:09:13 crc kubenswrapper[4792]: I0309 09:09:13.153234 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:09:13Z","lastTransitionTime":"2026-03-09T09:09:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:09:13 crc kubenswrapper[4792]: I0309 09:09:13.256526 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:09:13 crc kubenswrapper[4792]: I0309 09:09:13.256751 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:09:13 crc kubenswrapper[4792]: I0309 09:09:13.256869 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:09:13 crc kubenswrapper[4792]: I0309 09:09:13.256955 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:09:13 crc kubenswrapper[4792]: I0309 09:09:13.257030 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:09:13Z","lastTransitionTime":"2026-03-09T09:09:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:09:13 crc kubenswrapper[4792]: I0309 09:09:13.359906 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:09:13 crc kubenswrapper[4792]: I0309 09:09:13.359958 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:09:13 crc kubenswrapper[4792]: I0309 09:09:13.359977 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:09:13 crc kubenswrapper[4792]: I0309 09:09:13.359998 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:09:13 crc kubenswrapper[4792]: I0309 09:09:13.360015 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:09:13Z","lastTransitionTime":"2026-03-09T09:09:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:09:13 crc kubenswrapper[4792]: I0309 09:09:13.463281 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:09:13 crc kubenswrapper[4792]: I0309 09:09:13.463632 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:09:13 crc kubenswrapper[4792]: I0309 09:09:13.463809 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:09:13 crc kubenswrapper[4792]: I0309 09:09:13.463977 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:09:13 crc kubenswrapper[4792]: I0309 09:09:13.464250 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:09:13Z","lastTransitionTime":"2026-03-09T09:09:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:09:13 crc kubenswrapper[4792]: I0309 09:09:13.566272 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:09:13 crc kubenswrapper[4792]: I0309 09:09:13.566312 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:09:13 crc kubenswrapper[4792]: I0309 09:09:13.566328 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:09:13 crc kubenswrapper[4792]: I0309 09:09:13.566351 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:09:13 crc kubenswrapper[4792]: I0309 09:09:13.566367 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:09:13Z","lastTransitionTime":"2026-03-09T09:09:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:09:13 crc kubenswrapper[4792]: I0309 09:09:13.668942 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:09:13 crc kubenswrapper[4792]: I0309 09:09:13.668989 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:09:13 crc kubenswrapper[4792]: I0309 09:09:13.669004 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:09:13 crc kubenswrapper[4792]: I0309 09:09:13.669026 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:09:13 crc kubenswrapper[4792]: I0309 09:09:13.669043 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:09:13Z","lastTransitionTime":"2026-03-09T09:09:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:09:13 crc kubenswrapper[4792]: I0309 09:09:13.771923 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:09:13 crc kubenswrapper[4792]: I0309 09:09:13.772919 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:09:13 crc kubenswrapper[4792]: I0309 09:09:13.773123 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:09:13 crc kubenswrapper[4792]: I0309 09:09:13.773296 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:09:13 crc kubenswrapper[4792]: I0309 09:09:13.773420 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:09:13Z","lastTransitionTime":"2026-03-09T09:09:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:09:13 crc kubenswrapper[4792]: I0309 09:09:13.876503 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:09:13 crc kubenswrapper[4792]: I0309 09:09:13.876783 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:09:13 crc kubenswrapper[4792]: I0309 09:09:13.876795 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:09:13 crc kubenswrapper[4792]: I0309 09:09:13.876813 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:09:13 crc kubenswrapper[4792]: I0309 09:09:13.876825 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:09:13Z","lastTransitionTime":"2026-03-09T09:09:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:09:13 crc kubenswrapper[4792]: I0309 09:09:13.979719 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:09:13 crc kubenswrapper[4792]: I0309 09:09:13.979998 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:09:13 crc kubenswrapper[4792]: I0309 09:09:13.980232 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:09:13 crc kubenswrapper[4792]: I0309 09:09:13.980401 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:09:13 crc kubenswrapper[4792]: I0309 09:09:13.980535 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:09:13Z","lastTransitionTime":"2026-03-09T09:09:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:09:14 crc kubenswrapper[4792]: I0309 09:09:14.083238 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:09:14 crc kubenswrapper[4792]: I0309 09:09:14.083288 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:09:14 crc kubenswrapper[4792]: I0309 09:09:14.083302 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:09:14 crc kubenswrapper[4792]: I0309 09:09:14.083319 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:09:14 crc kubenswrapper[4792]: I0309 09:09:14.083331 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:09:14Z","lastTransitionTime":"2026-03-09T09:09:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:09:14 crc kubenswrapper[4792]: I0309 09:09:14.185991 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:09:14 crc kubenswrapper[4792]: I0309 09:09:14.186049 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:09:14 crc kubenswrapper[4792]: I0309 09:09:14.186106 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:09:14 crc kubenswrapper[4792]: I0309 09:09:14.186131 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:09:14 crc kubenswrapper[4792]: I0309 09:09:14.186148 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:09:14Z","lastTransitionTime":"2026-03-09T09:09:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:09:14 crc kubenswrapper[4792]: I0309 09:09:14.288573 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:09:14 crc kubenswrapper[4792]: I0309 09:09:14.288613 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:09:14 crc kubenswrapper[4792]: I0309 09:09:14.288623 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:09:14 crc kubenswrapper[4792]: I0309 09:09:14.288641 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:09:14 crc kubenswrapper[4792]: I0309 09:09:14.288654 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:09:14Z","lastTransitionTime":"2026-03-09T09:09:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:09:14 crc kubenswrapper[4792]: I0309 09:09:14.391134 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:09:14 crc kubenswrapper[4792]: I0309 09:09:14.391246 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:09:14 crc kubenswrapper[4792]: I0309 09:09:14.391265 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:09:14 crc kubenswrapper[4792]: I0309 09:09:14.391347 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:09:14 crc kubenswrapper[4792]: I0309 09:09:14.391370 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:09:14Z","lastTransitionTime":"2026-03-09T09:09:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:09:14 crc kubenswrapper[4792]: I0309 09:09:14.493517 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:09:14 crc kubenswrapper[4792]: I0309 09:09:14.493553 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:09:14 crc kubenswrapper[4792]: I0309 09:09:14.493563 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:09:14 crc kubenswrapper[4792]: I0309 09:09:14.493575 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:09:14 crc kubenswrapper[4792]: I0309 09:09:14.493585 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:09:14Z","lastTransitionTime":"2026-03-09T09:09:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:09:14 crc kubenswrapper[4792]: I0309 09:09:14.596264 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:09:14 crc kubenswrapper[4792]: I0309 09:09:14.596427 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:09:14 crc kubenswrapper[4792]: I0309 09:09:14.596516 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:09:14 crc kubenswrapper[4792]: I0309 09:09:14.596602 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:09:14 crc kubenswrapper[4792]: I0309 09:09:14.596689 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:09:14Z","lastTransitionTime":"2026-03-09T09:09:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 09:09:14 crc kubenswrapper[4792]: I0309 09:09:14.662166 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 09:09:14 crc kubenswrapper[4792]: E0309 09:09:14.662332 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 09:09:14 crc kubenswrapper[4792]: I0309 09:09:14.662184 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 09:09:14 crc kubenswrapper[4792]: E0309 09:09:14.662466 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 09:09:14 crc kubenswrapper[4792]: I0309 09:09:14.662186 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fttpc" Mar 09 09:09:14 crc kubenswrapper[4792]: E0309 09:09:14.662570 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fttpc" podUID="4711cce5-88a9-48c4-8e2e-522062e34a03" Mar 09 09:09:14 crc kubenswrapper[4792]: I0309 09:09:14.662184 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 09:09:14 crc kubenswrapper[4792]: E0309 09:09:14.662667 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 09:09:14 crc kubenswrapper[4792]: I0309 09:09:14.699516 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:09:14 crc kubenswrapper[4792]: I0309 09:09:14.699563 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:09:14 crc kubenswrapper[4792]: I0309 09:09:14.699578 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:09:14 crc kubenswrapper[4792]: I0309 09:09:14.699609 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:09:14 crc kubenswrapper[4792]: I0309 09:09:14.699621 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:09:14Z","lastTransitionTime":"2026-03-09T09:09:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:09:14 crc kubenswrapper[4792]: I0309 09:09:14.730860 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 09:09:14 crc kubenswrapper[4792]: I0309 09:09:14.731286 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 09:09:14 crc kubenswrapper[4792]: E0309 09:09:14.731031 4792 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 09 09:09:14 crc kubenswrapper[4792]: E0309 09:09:14.731401 4792 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 09 09:09:14 crc kubenswrapper[4792]: E0309 09:09:14.731738 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-09 09:09:46.731581554 +0000 UTC m=+151.761782316 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 09 09:09:14 crc kubenswrapper[4792]: E0309 09:09:14.731696 4792 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 09 09:09:14 crc kubenswrapper[4792]: E0309 09:09:14.731792 4792 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 09 09:09:14 crc kubenswrapper[4792]: E0309 09:09:14.731806 4792 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 09:09:14 crc kubenswrapper[4792]: E0309 09:09:14.731814 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-09 09:09:46.731769499 +0000 UTC m=+151.761970291 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 09 09:09:14 crc kubenswrapper[4792]: E0309 09:09:14.731841 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-09 09:09:46.73182984 +0000 UTC m=+151.762030602 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 09:09:14 crc kubenswrapper[4792]: I0309 09:09:14.732258 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 09:09:14 crc kubenswrapper[4792]: I0309 09:09:14.802497 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:09:14 crc kubenswrapper[4792]: I0309 09:09:14.802569 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:09:14 crc kubenswrapper[4792]: I0309 
09:09:14.802592 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:09:14 crc kubenswrapper[4792]: I0309 09:09:14.802621 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:09:14 crc kubenswrapper[4792]: I0309 09:09:14.802642 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:09:14Z","lastTransitionTime":"2026-03-09T09:09:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 09:09:14 crc kubenswrapper[4792]: I0309 09:09:14.833181 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 09:09:14 crc kubenswrapper[4792]: E0309 09:09:14.833441 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 09:09:46.833409931 +0000 UTC m=+151.863610723 (durationBeforeRetry 32s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:09:14 crc kubenswrapper[4792]: I0309 09:09:14.833564 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 09:09:14 crc kubenswrapper[4792]: I0309 09:09:14.833720 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4711cce5-88a9-48c4-8e2e-522062e34a03-metrics-certs\") pod \"network-metrics-daemon-fttpc\" (UID: \"4711cce5-88a9-48c4-8e2e-522062e34a03\") " pod="openshift-multus/network-metrics-daemon-fttpc" Mar 09 09:09:14 crc kubenswrapper[4792]: E0309 09:09:14.833893 4792 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 09 09:09:14 crc kubenswrapper[4792]: E0309 09:09:14.833912 4792 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 09 09:09:14 crc kubenswrapper[4792]: E0309 09:09:14.833975 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4711cce5-88a9-48c4-8e2e-522062e34a03-metrics-certs podName:4711cce5-88a9-48c4-8e2e-522062e34a03 nodeName:}" failed. 
No retries permitted until 2026-03-09 09:09:46.833948715 +0000 UTC m=+151.864149497 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4711cce5-88a9-48c4-8e2e-522062e34a03-metrics-certs") pod "network-metrics-daemon-fttpc" (UID: "4711cce5-88a9-48c4-8e2e-522062e34a03") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 09 09:09:14 crc kubenswrapper[4792]: E0309 09:09:14.833993 4792 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 09 09:09:14 crc kubenswrapper[4792]: E0309 09:09:14.834127 4792 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 09:09:14 crc kubenswrapper[4792]: E0309 09:09:14.834245 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-09 09:09:46.834220882 +0000 UTC m=+151.864421674 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 09:09:14 crc kubenswrapper[4792]: I0309 09:09:14.905345 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:09:14 crc kubenswrapper[4792]: I0309 09:09:14.905388 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:09:14 crc kubenswrapper[4792]: I0309 09:09:14.905406 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:09:14 crc kubenswrapper[4792]: I0309 09:09:14.905424 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:09:14 crc kubenswrapper[4792]: I0309 09:09:14.905436 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:09:14Z","lastTransitionTime":"2026-03-09T09:09:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:09:15 crc kubenswrapper[4792]: I0309 09:09:15.007926 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:09:15 crc kubenswrapper[4792]: I0309 09:09:15.007990 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:09:15 crc kubenswrapper[4792]: I0309 09:09:15.008015 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:09:15 crc kubenswrapper[4792]: I0309 09:09:15.008046 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:09:15 crc kubenswrapper[4792]: I0309 09:09:15.008099 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:09:15Z","lastTransitionTime":"2026-03-09T09:09:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:09:15 crc kubenswrapper[4792]: I0309 09:09:15.111973 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:09:15 crc kubenswrapper[4792]: I0309 09:09:15.112854 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:09:15 crc kubenswrapper[4792]: I0309 09:09:15.113252 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:09:15 crc kubenswrapper[4792]: I0309 09:09:15.113493 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:09:15 crc kubenswrapper[4792]: I0309 09:09:15.113675 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:09:15Z","lastTransitionTime":"2026-03-09T09:09:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:09:15 crc kubenswrapper[4792]: I0309 09:09:15.216550 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:09:15 crc kubenswrapper[4792]: I0309 09:09:15.216612 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:09:15 crc kubenswrapper[4792]: I0309 09:09:15.216633 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:09:15 crc kubenswrapper[4792]: I0309 09:09:15.216675 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:09:15 crc kubenswrapper[4792]: I0309 09:09:15.216707 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:09:15Z","lastTransitionTime":"2026-03-09T09:09:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:09:15 crc kubenswrapper[4792]: I0309 09:09:15.320002 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:09:15 crc kubenswrapper[4792]: I0309 09:09:15.320056 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:09:15 crc kubenswrapper[4792]: I0309 09:09:15.320111 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:09:15 crc kubenswrapper[4792]: I0309 09:09:15.320136 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:09:15 crc kubenswrapper[4792]: I0309 09:09:15.320153 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:09:15Z","lastTransitionTime":"2026-03-09T09:09:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:09:15 crc kubenswrapper[4792]: I0309 09:09:15.423314 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:09:15 crc kubenswrapper[4792]: I0309 09:09:15.423387 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:09:15 crc kubenswrapper[4792]: I0309 09:09:15.423405 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:09:15 crc kubenswrapper[4792]: I0309 09:09:15.423429 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:09:15 crc kubenswrapper[4792]: I0309 09:09:15.423506 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:09:15Z","lastTransitionTime":"2026-03-09T09:09:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:09:15 crc kubenswrapper[4792]: I0309 09:09:15.526280 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:09:15 crc kubenswrapper[4792]: I0309 09:09:15.526408 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:09:15 crc kubenswrapper[4792]: I0309 09:09:15.526426 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:09:15 crc kubenswrapper[4792]: I0309 09:09:15.526449 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:09:15 crc kubenswrapper[4792]: I0309 09:09:15.526527 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:09:15Z","lastTransitionTime":"2026-03-09T09:09:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:09:15 crc kubenswrapper[4792]: E0309 09:09:15.626849 4792 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Mar 09 09:09:15 crc kubenswrapper[4792]: I0309 09:09:15.685257 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72206549-3056-4411-ad65-3bfc0456b8a9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://652ed63f63bc7b81328792679f59e5d748feb5114a97f57df7ba90f3d272feff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f14066d111e6fa1a8c98be79fb37f7a32d143d503ec51e49a36d039b7d464b4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T09:07:43Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 
10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0309 09:07:17.893452 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0309 09:07:17.896032 1 observer_polling.go:159] Starting file observer\\\\nI0309 09:07:17.940019 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0309 09:07:17.944655 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0309 09:07:43.887477 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0309 09:07:43.887591 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T09:07:17Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82feb47e68b8db8323ed2c02d83e92016fa30e024581d7e361ba07a08919e2ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://544366c65452de29fd10c69bba980e991f5e2a3a09e98a9e66a050b1a06d4280\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3d9cf24a9d5a60bcaea9c6889c23037b8b47d7eb60c2458147579cd9ec75176\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:07:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:15Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:15 crc kubenswrapper[4792]: I0309 09:09:15.702775 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fgk47" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"84b6eb44-ca33-41a6-a951-2c66688ad860\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3b4aea0ac4cc05af599b411f32e712d0955e5052afc55c0d0be66e9a1249223\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cpkhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fgk47\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:15Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:15 crc kubenswrapper[4792]: I0309 09:09:15.734657 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lfm2j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"740550e5-d1a4-4f0c-8efd-1ccd8f9319e5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d93911614f6785ac12751349f50ab00c0716955c72dc48866083013e172cf3c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e0f50edd29f0791cc076a3a2974b7456aaf2a96534da791b083248dc84fa6af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c80e0ef9426b7a764e7117a789d07cf4cf940a90f38fe3ce6b230f9bbd21bfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f72b0194cacf6d5d0c95ba804286d822d2f2e5a0f385c4c5fdf8559bf6240c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:44Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edf00db622558a346d238b2df6e90a686dc913634a1b5b4e8b010b5bf09a7290\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5fc091b21251a54d9eca892667bd681e944b35b6407316a8252562e837a1e265\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48ec991bd21bc3115bde4a99b484cd6e45e2e97d8a3a28dbbdf4e1d8323c6b66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48ec991bd21bc3115bde4a99b484cd6e45e2e97d8a3a28dbbdf4e1d8323c6b66\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T09:09:06Z\\\",\\\"message\\\":\\\"45 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-config-operator/machine-config-controller\\\\\\\"}\\\\nI0309 09:09:06.535081 6845 obj_retry.go:365] Adding new object: *v1.Pod 
openshift-dns/node-resolver-k4kdn\\\\nI0309 09:09:06.535093 6845 services_controller.go:360] Finished syncing service machine-config-controller on namespace openshift-machine-config-operator for network=default : 1.521797ms\\\\nI0309 09:09:06.535099 6845 ovn.go:134] Ensuring zone local for Pod openshift-dns/node-resolver-k4kdn in node crc\\\\nI0309 09:09:06.535103 6845 obj_retry.go:386] Retry successful for *v1.Pod openshift-dns/node-resolver-k4kdn after 0 failed attempt(s)\\\\nI0309 09:09:06.535109 6845 default_network_controller.go:776] Recording success event on pod openshift-dns/node-resolver-k4kdn\\\\nI0309 09:09:06.535110 6845 services_controller.go:356] Processing sync for service openshift-multus/multus-admission-controller for network=default\\\\nF0309 09:09:06.535119 6845 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not ad\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T09:09:05Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-lfm2j_openshift-ovn-kubernetes(740550e5-d1a4-4f0c-8efd-1ccd8f9319e5)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72a93106360fd23f597cf8fb963aca72a606f82557a5b17125a969e5b5d5918f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9abf97193d88754a49bcc12e1cfbe83371778e157d028dd3d1084abfa8526822\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9abf97193d88754a49
bcc12e1cfbe83371778e157d028dd3d1084abfa8526822\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:08:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lfm2j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:15Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:15 crc kubenswrapper[4792]: I0309 09:09:15.758716 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7e46817-10cf-448c-8a2a-154f1c322ce6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1070465f72d99ed22e913112259837db7d789c0a072b40956088f4a70162c41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://314838f53bc19a9f3eb7fd9d3f5473b23a177f2a3068d91f1b0420c27910d409\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://226eecaea6fec5a3ae93063702c719edf3908636a2862b9f50a874f494a19ccf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://420959a5229bf4ed5e6b94cf4a6685b96e4b50d13055cbccf66b6188e47bc770\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://420959a5229bf4ed5e6b94cf4a6685b96e4b50d13055cbccf66b6188e47bc770\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T09:08:27Z\\\",\\\"message\\\":\\\"le observer\\\\nW0309 09:08:27.070870 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0309 09:08:27.070998 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 09:08:27.071818 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2825204129/tls.crt::/tmp/serving-cert-2825204129/tls.key\\\\\\\"\\\\nI0309 09:08:27.485720 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0309 09:08:27.487343 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0309 09:08:27.487361 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0309 09:08:27.487383 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0309 09:08:27.487388 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0309 09:08:27.490810 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0309 09:08:27.490923 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 09:08:27.490952 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 09:08:27.490979 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0309 09:08:27.491005 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0309 09:08:27.491032 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0309 09:08:27.491060 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0309 09:08:27.490826 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0309 09:08:27.491911 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T09:08:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf484f9a832b0e147a17ec53e64cfcda5e37f8bf1f764ddc35215a079994b71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:18Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e250b106997f151ae9c435aca2ab3d3d821f40e826afa9ff744443a6b808571\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e2
50b106997f151ae9c435aca2ab3d3d821f40e826afa9ff744443a6b808571\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:07:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:07:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:15Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:15 crc kubenswrapper[4792]: I0309 09:09:15.774936 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-k4kdn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cb7634b-66b7-4541-8e53-3e01a6cb41ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d395567d530702b468a31ec780cf0dcf356e2a07f4a28ee50329b702d8e53594\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5jf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-k4kdn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:15Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:15 crc kubenswrapper[4792]: E0309 09:09:15.776461 4792 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 09 09:09:15 crc kubenswrapper[4792]: I0309 09:09:15.791455 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fttpc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4711cce5-88a9-48c4-8e2e-522062e34a03\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjrbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjrbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fttpc\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:15Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:15 crc kubenswrapper[4792]: I0309 09:09:15.802974 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"274366f4-bdf7-4516-9559-b90b9947999e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f6d16099e4ca2921b039c7fac87c3a9b3ee4780783ad52207c77ea2891942d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPat
h\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76e023bbfb6d4a1c42830654b83b26c24cddefd808ee765ebef670e8b10910b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76e023bbfb6d4a1c42830654b83b26c24cddefd808ee765ebef670e8b10910b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:07:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:07:15Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:15Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:15 crc kubenswrapper[4792]: I0309 09:09:15.818427 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da622175994cf951cfa730455ea5a163271bc993fc5a1268ed072b944a612524\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:15Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:15 crc kubenswrapper[4792]: I0309 09:09:15.832434 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:15Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:15 crc kubenswrapper[4792]: I0309 09:09:15.846782 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vgtc9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"626ea896-2e5c-4478-a7be-34a19acc242d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57f7829120f56a8ab9ff342c3d9fd043ca5559518f8d818c0306764f491f4b3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4rw87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vgtc9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:15Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:15 crc kubenswrapper[4792]: I0309 09:09:15.862062 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4tprh" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b59ff3b-540d-4385-b02b-f68349bb74bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54d1a12d5c015eebcdc6e5d6e253c40a4dd476bbc12be0c63665aa6bc091f72f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54059591e4b093ca2c60bc4ea9f0b0a6c44077e07ebaa66ba
b76d10f0d2f0f9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54059591e4b093ca2c60bc4ea9f0b0a6c44077e07ebaa66bab76d10f0d2f0f9f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:08:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86174a85f5a0f642224f0a1f175358d157641043ca6e1b88e119e7de171b5582\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86174a85f5a0f642224f0a1f175358d157641043ca6e1b88e119e7de171b5582\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:08:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:
08:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb6462fecadedf78b23e5a2276698144ffea837649f3c15664a573c9724f7780\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb6462fecadedf78b23e5a2276698144ffea837649f3c15664a573c9724f7780\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:08:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://42715848a7dace5208b36242c41aaa8f3c96fe5f7f6320b8acd5f2ea52a338bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42715848a7dace5208b36242c41aaa8f3c96fe5f7f6320b8acd5f2ea52a338bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:08:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:08:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b4eb0778c3ebce9cfec4abfead327d02f8d401aa11ce1578a7cbba1b9fe8854\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b4eb0778c3ebce9cfec4abfead327d02f8d401aa11ce1578a7cbba1b9fe8854\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:08:48Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87e9a77e1689ff7f403824349c08b6d6c85b408875ff852f56090cf3fe2bb487\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87e9a77e1689ff7f403824349c08b6d6c85b408875ff852f56090cf3fe2bb487\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:08:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:08:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4tprh\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:15Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:15 crc kubenswrapper[4792]: I0309 09:09:15.872171 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ssk9p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ceed39a1-2e4f-4630-bda0-57071ac26ee4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://648c51e35617b16c1bc2e0f86ea31ee9256943628595864fc27951fce17197cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"
state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7tn4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a6364e9d51e6626caf8b49f387f23eb4bfdfc13e149fb8847b7a1dd1637bf79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7tn4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ssk9p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet 
valid: current time 2026-03-09T09:09:15Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:15 crc kubenswrapper[4792]: I0309 09:09:15.891869 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a861a55a-f3b0-4a55-a961-7798ef57d3c0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48162a23c55321a8b6855318c9d71661a5cca0af913d359711cf6685b332994e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\
":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58926f05ec9e42e1dc7aa5c3ab9950537c031f3838fb18036ce2a84b2b2ce147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1cc1ddaa2a5291284d4281d64b3d0aebf06cee6ef23da3a608ca06b240c8e4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a8bf3c896aae857de56db687c87ea164520667dc71fa495e3c396b478ab472e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d9d3cba4dfbe99ce715a9ee40af9171a41ccb193306f9b106588cc4b5f620e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea228ca60beeed2978d4efcc092828a381aaf0e00e0baf2a733c75cb15642aac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"na
me\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea228ca60beeed2978d4efcc092828a381aaf0e00e0baf2a733c75cb15642aac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:07:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e86823681f24457a83404aa4739988af2b8abf63506d49c1db09bee7501a7548\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e86823681f24457a83404aa4739988af2b8abf63506d49c1db09bee7501a7548\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:07:17Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://12c05ec878f2865338912d3d41f970b88835d7e7d9719a119eb4d06955850617\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://12c05ec878f2865338912d3d41f970b88835d7e7d9719a119eb4d06955850617\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:07:18Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-09T09:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:07:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:15Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:15 crc kubenswrapper[4792]: I0309 09:09:15.904652 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:15Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:15 crc kubenswrapper[4792]: I0309 09:09:15.919025 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:15Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:15 crc kubenswrapper[4792]: I0309 09:09:15.931603 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-97tth" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd11045a-d746-4b42-872c-8b8d1dd2d515\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96248640b95891716f26355ace06d60da675ab3aa8086e6a7c94ad528fc1357d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zdqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d060627a577507a2b0030b6aea753d50e0c6766a
c4876d95ac5d9d3401f9b818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zdqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-97tth\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:15Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:15 crc kubenswrapper[4792]: I0309 09:09:15.946186 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab83a4df67ef256deece5dabe54496035a360833d9b5e926a1045196734d2c24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-09T09:09:15Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:15 crc kubenswrapper[4792]: I0309 09:09:15.974748 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4403f41d10eba801e3729ecffdc397e603f3c4899d1a62e02cb35418645d1ca4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://655b419f4453a7f
0fff411f579326c7fda157c08267b1dcf23f7e1d11b684c20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:15Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:16 crc kubenswrapper[4792]: I0309 09:09:16.662051 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 09:09:16 crc kubenswrapper[4792]: E0309 09:09:16.662331 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 09:09:16 crc kubenswrapper[4792]: I0309 09:09:16.662422 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 09:09:16 crc kubenswrapper[4792]: I0309 09:09:16.662463 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 09:09:16 crc kubenswrapper[4792]: E0309 09:09:16.662516 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 09:09:16 crc kubenswrapper[4792]: E0309 09:09:16.663103 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 09:09:16 crc kubenswrapper[4792]: I0309 09:09:16.663290 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-fttpc" Mar 09 09:09:16 crc kubenswrapper[4792]: E0309 09:09:16.663469 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fttpc" podUID="4711cce5-88a9-48c4-8e2e-522062e34a03" Mar 09 09:09:16 crc kubenswrapper[4792]: I0309 09:09:16.663701 4792 scope.go:117] "RemoveContainer" containerID="420959a5229bf4ed5e6b94cf4a6685b96e4b50d13055cbccf66b6188e47bc770" Mar 09 09:09:16 crc kubenswrapper[4792]: I0309 09:09:16.687448 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:09:16 crc kubenswrapper[4792]: I0309 09:09:16.687534 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:09:16 crc kubenswrapper[4792]: I0309 09:09:16.687558 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:09:16 crc kubenswrapper[4792]: I0309 09:09:16.687593 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:09:16 crc kubenswrapper[4792]: I0309 09:09:16.687630 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:09:16Z","lastTransitionTime":"2026-03-09T09:09:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:09:16 crc kubenswrapper[4792]: E0309 09:09:16.712486 4792 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:09:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:09:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:09:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:09:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:09:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:09:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:09:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:09:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e3b5ac96-f3df-45c5-a4ac-24aa5703690c\\\",\\\"systemUUID\\\":\\\"838abbcf-5467-42bb-9eb7-be30fe4962bb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:16Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:16 crc kubenswrapper[4792]: I0309 09:09:16.719845 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:09:16 crc kubenswrapper[4792]: I0309 09:09:16.719913 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:09:16 crc kubenswrapper[4792]: I0309 09:09:16.719932 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:09:16 crc kubenswrapper[4792]: I0309 09:09:16.719965 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:09:16 crc kubenswrapper[4792]: I0309 09:09:16.719987 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:09:16Z","lastTransitionTime":"2026-03-09T09:09:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:09:16 crc kubenswrapper[4792]: E0309 09:09:16.743762 4792 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:09:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:09:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:09:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:09:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:09:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:09:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:09:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:09:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e3b5ac96-f3df-45c5-a4ac-24aa5703690c\\\",\\\"systemUUID\\\":\\\"838abbcf-5467-42bb-9eb7-be30fe4962bb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:16Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:16 crc kubenswrapper[4792]: I0309 09:09:16.748955 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:09:16 crc kubenswrapper[4792]: I0309 09:09:16.749027 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:09:16 crc kubenswrapper[4792]: I0309 09:09:16.749094 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:09:16 crc kubenswrapper[4792]: I0309 09:09:16.749122 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:09:16 crc kubenswrapper[4792]: I0309 09:09:16.749146 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:09:16Z","lastTransitionTime":"2026-03-09T09:09:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:09:16 crc kubenswrapper[4792]: E0309 09:09:16.770598 4792 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:09:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:09:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:09:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:09:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:09:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:09:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:09:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:09:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e3b5ac96-f3df-45c5-a4ac-24aa5703690c\\\",\\\"systemUUID\\\":\\\"838abbcf-5467-42bb-9eb7-be30fe4962bb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:16Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:16 crc kubenswrapper[4792]: I0309 09:09:16.777187 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:09:16 crc kubenswrapper[4792]: I0309 09:09:16.777248 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:09:16 crc kubenswrapper[4792]: I0309 09:09:16.777268 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:09:16 crc kubenswrapper[4792]: I0309 09:09:16.777295 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:09:16 crc kubenswrapper[4792]: I0309 09:09:16.777318 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:09:16Z","lastTransitionTime":"2026-03-09T09:09:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:09:16 crc kubenswrapper[4792]: E0309 09:09:16.802241 4792 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:09:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:09:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:09:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:09:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:09:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:09:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:09:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:09:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e3b5ac96-f3df-45c5-a4ac-24aa5703690c\\\",\\\"systemUUID\\\":\\\"838abbcf-5467-42bb-9eb7-be30fe4962bb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:16Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:16 crc kubenswrapper[4792]: I0309 09:09:16.806939 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:09:16 crc kubenswrapper[4792]: I0309 09:09:16.807029 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:09:16 crc kubenswrapper[4792]: I0309 09:09:16.807051 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:09:16 crc kubenswrapper[4792]: I0309 09:09:16.807121 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:09:16 crc kubenswrapper[4792]: I0309 09:09:16.807149 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:09:16Z","lastTransitionTime":"2026-03-09T09:09:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:09:16 crc kubenswrapper[4792]: E0309 09:09:16.833269 4792 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:09:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:09:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:09:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:09:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:09:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:09:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:09:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:09:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e3b5ac96-f3df-45c5-a4ac-24aa5703690c\\\",\\\"systemUUID\\\":\\\"838abbcf-5467-42bb-9eb7-be30fe4962bb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:16Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:16 crc kubenswrapper[4792]: E0309 09:09:16.833712 4792 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 09 09:09:17 crc kubenswrapper[4792]: I0309 09:09:17.210944 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 09 09:09:17 crc kubenswrapper[4792]: I0309 09:09:17.213434 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"53c53d788666585dd6b1fb214b1f79df5bffadc17ff401b1bed44c93e41258dc"} Mar 09 09:09:17 crc kubenswrapper[4792]: I0309 09:09:17.214047 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 09:09:17 crc kubenswrapper[4792]: I0309 09:09:17.233025 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-97tth" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd11045a-d746-4b42-872c-8b8d1dd2d515\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96248640b95891716f26355ace06d60da675ab3aa8086e6a7c94ad528fc1357d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zdqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d060627a577507a2b0030b6aea753d50e0c6766a
c4876d95ac5d9d3401f9b818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zdqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-97tth\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:17Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:17 crc kubenswrapper[4792]: I0309 09:09:17.246822 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab83a4df67ef256deece5dabe54496035a360833d9b5e926a1045196734d2c24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-09T09:09:17Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:17 crc kubenswrapper[4792]: I0309 09:09:17.262609 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4403f41d10eba801e3729ecffdc397e603f3c4899d1a62e02cb35418645d1ca4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://655b419f4453a7f
0fff411f579326c7fda157c08267b1dcf23f7e1d11b684c20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:17Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:17 crc kubenswrapper[4792]: I0309 09:09:17.276101 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:17Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:17 crc kubenswrapper[4792]: I0309 09:09:17.288057 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:17Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:17 crc kubenswrapper[4792]: I0309 09:09:17.298222 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fgk47" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"84b6eb44-ca33-41a6-a951-2c66688ad860\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3b4aea0ac4cc05af599b411f32e712d0955e5052afc55c0d0be66e9a1249223\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cpkhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fgk47\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:17Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:17 crc kubenswrapper[4792]: I0309 09:09:17.317396 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lfm2j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"740550e5-d1a4-4f0c-8efd-1ccd8f9319e5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d93911614f6785ac12751349f50ab00c0716955c72dc48866083013e172cf3c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e0f50edd29f0791cc076a3a2974b7456aaf2a96534da791b083248dc84fa6af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c80e0ef9426b7a764e7117a789d07cf4cf940a90f38fe3ce6b230f9bbd21bfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f72b0194cacf6d5d0c95ba804286d822d2f2e5a0f385c4c5fdf8559bf6240c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:44Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edf00db622558a346d238b2df6e90a686dc913634a1b5b4e8b010b5bf09a7290\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5fc091b21251a54d9eca892667bd681e944b35b6407316a8252562e837a1e265\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48ec991bd21bc3115bde4a99b484cd6e45e2e97d8a3a28dbbdf4e1d8323c6b66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48ec991bd21bc3115bde4a99b484cd6e45e2e97d8a3a28dbbdf4e1d8323c6b66\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T09:09:06Z\\\",\\\"message\\\":\\\"45 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-config-operator/machine-config-controller\\\\\\\"}\\\\nI0309 09:09:06.535081 6845 obj_retry.go:365] Adding new object: *v1.Pod 
openshift-dns/node-resolver-k4kdn\\\\nI0309 09:09:06.535093 6845 services_controller.go:360] Finished syncing service machine-config-controller on namespace openshift-machine-config-operator for network=default : 1.521797ms\\\\nI0309 09:09:06.535099 6845 ovn.go:134] Ensuring zone local for Pod openshift-dns/node-resolver-k4kdn in node crc\\\\nI0309 09:09:06.535103 6845 obj_retry.go:386] Retry successful for *v1.Pod openshift-dns/node-resolver-k4kdn after 0 failed attempt(s)\\\\nI0309 09:09:06.535109 6845 default_network_controller.go:776] Recording success event on pod openshift-dns/node-resolver-k4kdn\\\\nI0309 09:09:06.535110 6845 services_controller.go:356] Processing sync for service openshift-multus/multus-admission-controller for network=default\\\\nF0309 09:09:06.535119 6845 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not ad\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T09:09:05Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-lfm2j_openshift-ovn-kubernetes(740550e5-d1a4-4f0c-8efd-1ccd8f9319e5)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72a93106360fd23f597cf8fb963aca72a606f82557a5b17125a969e5b5d5918f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9abf97193d88754a49bcc12e1cfbe83371778e157d028dd3d1084abfa8526822\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9abf97193d88754a49
bcc12e1cfbe83371778e157d028dd3d1084abfa8526822\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:08:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lfm2j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:17Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:17 crc kubenswrapper[4792]: I0309 09:09:17.334005 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7e46817-10cf-448c-8a2a-154f1c322ce6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1070465f72d99ed22e913112259837db7d789c0a072b40956088f4a70162c41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://314838f53bc19a9f3eb7fd9d3f5473b23a177f2a3068d91f1b0420c27910d409\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://226eecaea6fec5a3ae93063702c719edf3908636a2862b9f50a874f494a19ccf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53c53d788666585dd6b1fb214b1f79df5bffadc17ff401b1bed44c93e41258dc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://420959a5229bf4ed5e6b94cf4a6685b96e4b50d13055cbccf66b6188e47bc770\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T09:08:27Z\\\",\\\"message\\\":\\\"le observer\\\\nW0309 09:08:27.070870 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0309 09:08:27.070998 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 09:08:27.071818 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2825204129/tls.crt::/tmp/serving-cert-2825204129/tls.key\\\\\\\"\\\\nI0309 09:08:27.485720 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0309 09:08:27.487343 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0309 09:08:27.487361 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0309 09:08:27.487383 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0309 09:08:27.487388 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0309 09:08:27.490810 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0309 09:08:27.490923 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 09:08:27.490952 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 09:08:27.490979 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0309 09:08:27.491005 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0309 09:08:27.491032 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0309 09:08:27.491060 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0309 09:08:27.490826 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0309 09:08:27.491911 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T09:08:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:09:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf484f9a832b0e147a17ec53e64cfcda5e37f8bf1f764ddc35215a079994b71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:18Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e250b106997f151ae9c435aca2ab3d3d821f40e826afa9ff744443a6b808571\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e250b106997f151ae9c435aca2ab3d3d821f40e826afa9ff744443a6b808571\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:07:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-09T09:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:07:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:17Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:17 crc kubenswrapper[4792]: I0309 09:09:17.353949 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72206549-3056-4411-ad65-3bfc0456b8a9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://652ed63f63bc7b81328792679f59e5d748feb5114a97f57df7ba90f3d272feff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f14066d111e6fa1a8c98be79fb37f7a32d143d503ec51e49a36d039b7d464b4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T09:07:43Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0309 09:07:17.893452 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0309 09:07:17.896032 1 observer_polling.go:159] Starting file observer\\\\nI0309 09:07:17.940019 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0309 09:07:17.944655 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0309 09:07:43.887477 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0309 09:07:43.887591 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T09:07:17Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82feb47e68b8db8323ed2c02d83e92016fa30e024581d7e361ba07a08919e2ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/
kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://544366c65452de29fd10c69bba980e991f5e2a3a09e98a9e66a050b1a06d4280\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3d9cf24a9d5a60bcaea9c6889c23037b8b47d7eb60c2458147579cd9ec75176\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\
"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:07:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:17Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:17 crc kubenswrapper[4792]: I0309 09:09:17.368032 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"274366f4-bdf7-4516-9559-b90b9947999e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f6d16099e4ca2921b039c7fac87c3a9b3ee4780783ad52207c77ea2891942d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube
-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76e023bbfb6d4a1c42830654b83b26c24cddefd808ee765ebef670e8b10910b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76e023bbfb6d4a1c42830654b83b26c24cddefd808ee765ebef670e8b10910b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:07:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:07:15Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:17Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:17 crc kubenswrapper[4792]: I0309 09:09:17.379190 4792 
status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-k4kdn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cb7634b-66b7-4541-8e53-3e01a6cb41ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d395567d530702b468a31ec780cf0dcf356e2a07f4a28ee50329b702d8e53594\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5jf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"
hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-k4kdn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:17Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:17 crc kubenswrapper[4792]: I0309 09:09:17.388895 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fttpc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4711cce5-88a9-48c4-8e2e-522062e34a03\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjrbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjrbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fttpc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:17Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:17 crc 
kubenswrapper[4792]: I0309 09:09:17.398979 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vgtc9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"626ea896-2e5c-4478-a7be-34a19acc242d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57f7829120f56a8ab9ff342c3d9fd043ca5559518f8d818c0306764f491f4b3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\"
:\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4rw87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vgtc9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:17Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:17 crc 
kubenswrapper[4792]: I0309 09:09:17.418002 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4tprh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b59ff3b-540d-4385-b02b-f68349bb74bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54d1a12d5c015eebcdc6e5d6e253c40a4dd476bbc12be0c63665aa6bc091f72f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.12
6.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54059591e4b093ca2c60bc4ea9f0b0a6c44077e07ebaa66bab76d10f0d2f0f9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54059591e4b093ca2c60bc4ea9f0b0a6c44077e07ebaa66bab76d10f0d2f0f9f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:08:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86174a85f5a0f642224f0a1f175358d157641043ca6e1b88e119e7de171b5582\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86174a85f5a0f642224f0a1f17535
8d157641043ca6e1b88e119e7de171b5582\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:08:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:08:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb6462fecadedf78b23e5a2276698144ffea837649f3c15664a573c9724f7780\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb6462fecadedf78b23e5a2276698144ffea837649f3c15664a573c9724f7780\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:08:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/ru
n/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42715848a7dace5208b36242c41aaa8f3c96fe5f7f6320b8acd5f2ea52a338bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42715848a7dace5208b36242c41aaa8f3c96fe5f7f6320b8acd5f2ea52a338bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:08:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:08:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b4eb0778c3ebce9cfec4abfead327d02f8d401aa11ce1578a7cbba1b9fe8854\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\
":{\\\"containerID\\\":\\\"cri-o://7b4eb0778c3ebce9cfec4abfead327d02f8d401aa11ce1578a7cbba1b9fe8854\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:08:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87e9a77e1689ff7f403824349c08b6d6c85b408875ff852f56090cf3fe2bb487\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87e9a77e1689ff7f403824349c08b6d6c85b408875ff852f56090cf3fe2bb487\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:08:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:08:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\
\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4tprh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:17Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:17 crc kubenswrapper[4792]: I0309 09:09:17.430947 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ssk9p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ceed39a1-2e4f-4630-bda0-57071ac26ee4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://648c51e35617b16c1bc2e0f86ea31ee9256943628595864fc27951fce17197cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318
bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7tn4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a6364e9d51e6626caf8b49f387f23eb4bfdfc13e149fb8847b7a1dd1637bf79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7tn4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ssk9p\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:17Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:17 crc kubenswrapper[4792]: I0309 09:09:17.450192 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a861a55a-f3b0-4a55-a961-7798ef57d3c0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48162a23c55321a8b6855318c9d71661a5cca0af913d359711cf6685b332994e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\
":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58926f05ec9e42e1dc7aa5c3ab9950537c031f3838fb18036ce2a84b2b2ce147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1cc1ddaa2a5291284d4281d64b3d0aebf06cee6ef23da3a608ca06b240c8e4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a8bf3c896aae857de56db687c87ea164
520667dc71fa495e3c396b478ab472e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d9d3cba4dfbe99ce715a9ee40af9171a41ccb193306f9b106588cc4b5f620e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea228ca60beeed2978d4efcc092828a381aaf0e00e0baf2a733c75cb15642aac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e
49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea228ca60beeed2978d4efcc092828a381aaf0e00e0baf2a733c75cb15642aac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:07:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e86823681f24457a83404aa4739988af2b8abf63506d49c1db09bee7501a7548\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e86823681f24457a83404aa4739988af2b8abf63506d49c1db09bee7501a7548\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:07:17Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://12c05ec878f2865338912d3d41f970b88835d7e7d9719a119eb4d06955850617\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminate
d\\\":{\\\"containerID\\\":\\\"cri-o://12c05ec878f2865338912d3d41f970b88835d7e7d9719a119eb4d06955850617\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:07:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:17Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:17 crc kubenswrapper[4792]: I0309 09:09:17.467629 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da622175994cf951cfa730455ea5a163271bc993fc5a1268ed072b944a612524\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:17Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:17 crc kubenswrapper[4792]: I0309 09:09:17.483160 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:17Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:18 crc kubenswrapper[4792]: I0309 09:09:18.662050 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fttpc" Mar 09 09:09:18 crc kubenswrapper[4792]: I0309 09:09:18.662172 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 09:09:18 crc kubenswrapper[4792]: E0309 09:09:18.662486 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fttpc" podUID="4711cce5-88a9-48c4-8e2e-522062e34a03" Mar 09 09:09:18 crc kubenswrapper[4792]: I0309 09:09:18.662568 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 09:09:18 crc kubenswrapper[4792]: E0309 09:09:18.662716 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 09:09:18 crc kubenswrapper[4792]: E0309 09:09:18.662826 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 09:09:18 crc kubenswrapper[4792]: I0309 09:09:18.663200 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 09:09:18 crc kubenswrapper[4792]: E0309 09:09:18.663681 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 09:09:18 crc kubenswrapper[4792]: I0309 09:09:18.664220 4792 scope.go:117] "RemoveContainer" containerID="48ec991bd21bc3115bde4a99b484cd6e45e2e97d8a3a28dbbdf4e1d8323c6b66" Mar 09 09:09:18 crc kubenswrapper[4792]: E0309 09:09:18.664511 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-lfm2j_openshift-ovn-kubernetes(740550e5-d1a4-4f0c-8efd-1ccd8f9319e5)\"" pod="openshift-ovn-kubernetes/ovnkube-node-lfm2j" podUID="740550e5-d1a4-4f0c-8efd-1ccd8f9319e5" Mar 09 09:09:20 crc kubenswrapper[4792]: I0309 09:09:20.661354 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 09:09:20 crc kubenswrapper[4792]: I0309 09:09:20.661417 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 09:09:20 crc kubenswrapper[4792]: I0309 09:09:20.661453 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fttpc" Mar 09 09:09:20 crc kubenswrapper[4792]: I0309 09:09:20.661572 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 09:09:20 crc kubenswrapper[4792]: E0309 09:09:20.661571 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 09:09:20 crc kubenswrapper[4792]: E0309 09:09:20.661658 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 09:09:20 crc kubenswrapper[4792]: E0309 09:09:20.661806 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fttpc" podUID="4711cce5-88a9-48c4-8e2e-522062e34a03" Mar 09 09:09:20 crc kubenswrapper[4792]: E0309 09:09:20.661963 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 09:09:20 crc kubenswrapper[4792]: E0309 09:09:20.777642 4792 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 09 09:09:22 crc kubenswrapper[4792]: I0309 09:09:22.662101 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 09:09:22 crc kubenswrapper[4792]: I0309 09:09:22.662145 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fttpc" Mar 09 09:09:22 crc kubenswrapper[4792]: E0309 09:09:22.662238 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 09:09:22 crc kubenswrapper[4792]: I0309 09:09:22.662301 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 09:09:22 crc kubenswrapper[4792]: I0309 09:09:22.662340 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 09:09:22 crc kubenswrapper[4792]: E0309 09:09:22.662470 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 09:09:22 crc kubenswrapper[4792]: E0309 09:09:22.662603 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 09:09:22 crc kubenswrapper[4792]: E0309 09:09:22.662726 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fttpc" podUID="4711cce5-88a9-48c4-8e2e-522062e34a03" Mar 09 09:09:24 crc kubenswrapper[4792]: I0309 09:09:24.661623 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 09:09:24 crc kubenswrapper[4792]: I0309 09:09:24.661765 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 09:09:24 crc kubenswrapper[4792]: E0309 09:09:24.661907 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 09:09:24 crc kubenswrapper[4792]: I0309 09:09:24.662316 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fttpc" Mar 09 09:09:24 crc kubenswrapper[4792]: I0309 09:09:24.662327 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 09:09:24 crc kubenswrapper[4792]: E0309 09:09:24.662422 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 09:09:24 crc kubenswrapper[4792]: E0309 09:09:24.662570 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 09:09:24 crc kubenswrapper[4792]: E0309 09:09:24.662734 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-fttpc" podUID="4711cce5-88a9-48c4-8e2e-522062e34a03" Mar 09 09:09:25 crc kubenswrapper[4792]: I0309 09:09:25.688115 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4403f41d10eba801e3729ecffdc397e603f3c4899d1a62e02cb35418645d1ca4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://655b419f445
3a7f0fff411f579326c7fda157c08267b1dcf23f7e1d11b684c20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:25Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:25 crc kubenswrapper[4792]: I0309 09:09:25.709861 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:25Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:25 crc kubenswrapper[4792]: I0309 09:09:25.729988 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:25Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:25 crc kubenswrapper[4792]: I0309 09:09:25.746593 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-97tth" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd11045a-d746-4b42-872c-8b8d1dd2d515\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96248640b95891716f26355ace06d60da675ab3aa8086e6a7c94ad528fc1357d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zdqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d060627a577507a2b0030b6aea753d50e0c6766a
c4876d95ac5d9d3401f9b818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zdqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-97tth\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:25Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:25 crc kubenswrapper[4792]: I0309 09:09:25.762836 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab83a4df67ef256deece5dabe54496035a360833d9b5e926a1045196734d2c24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-09T09:09:25Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:25 crc kubenswrapper[4792]: E0309 09:09:25.779466 4792 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 09 09:09:25 crc kubenswrapper[4792]: I0309 09:09:25.787151 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7e46817-10cf-448c-8a2a-154f1c322ce6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1070465f72d99ed22e913112259837db7d789c0a072b40956088f4a70162c41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://314838f53bc19a9f3eb7fd9d3f5473b23a177f2a3068d91f1b0420c27910d409\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://226eecaea6fec5a3ae93063702c719edf3908636a2862b9f50a874f494a19ccf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53c53d788666585dd6b1fb214b1f79df5bffadc17ff401b1bed44c93e41258dc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://420959a5229bf4ed5e6b94cf4a6685b96e4b50d13055cbccf66b6188e47bc770\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T09:08:27Z\\\",\\\"message\\\":\\\"le observer\\\\nW0309 09:08:27.070870 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0309 09:08:27.070998 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 09:08:27.071818 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2825204129/tls.crt::/tmp/serving-cert-2825204129/tls.key\\\\\\\"\\\\nI0309 09:08:27.485720 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0309 09:08:27.487343 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0309 09:08:27.487361 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0309 09:08:27.487383 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0309 09:08:27.487388 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0309 09:08:27.490810 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0309 09:08:27.490923 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 09:08:27.490952 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 09:08:27.490979 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0309 09:08:27.491005 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0309 09:08:27.491032 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0309 09:08:27.491060 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0309 09:08:27.490826 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0309 09:08:27.491911 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T09:08:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:09:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf484f9a832b0e147a17ec53e64cfcda5e37f8bf1f764ddc35215a079994b71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:18Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e250b106997f151ae9c435aca2ab3d3d821f40e826afa9ff744443a6b808571\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e250b106997f151ae9c435aca2ab3d3d821f40e826afa9ff744443a6b808571\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:07:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-09T09:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:07:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:25Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:25 crc kubenswrapper[4792]: I0309 09:09:25.809007 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72206549-3056-4411-ad65-3bfc0456b8a9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://652ed63f63bc7b81328792679f59e5d748feb5114a97f57df7ba90f3d272feff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f14066d111e6fa1a8c98be79fb37f7a32d143d503ec51e49a36d039b7d464b4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T09:07:43Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0309 09:07:17.893452 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0309 09:07:17.896032 1 observer_polling.go:159] Starting file observer\\\\nI0309 09:07:17.940019 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0309 09:07:17.944655 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0309 09:07:43.887477 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0309 09:07:43.887591 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T09:07:17Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82feb47e68b8db8323ed2c02d83e92016fa30e024581d7e361ba07a08919e2ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/
kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://544366c65452de29fd10c69bba980e991f5e2a3a09e98a9e66a050b1a06d4280\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3d9cf24a9d5a60bcaea9c6889c23037b8b47d7eb60c2458147579cd9ec75176\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\
"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:07:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:25Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:25 crc kubenswrapper[4792]: I0309 09:09:25.821940 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fgk47" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84b6eb44-ca33-41a6-a951-2c66688ad860\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3b4aea0ac4cc05af599b411f32e712d0955e5052afc55c0d0be66e9a1249223\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a
4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cpkhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fgk47\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:25Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:25 crc kubenswrapper[4792]: I0309 09:09:25.838827 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lfm2j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"740550e5-d1a4-4f0c-8efd-1ccd8f9319e5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d93911614f6785ac12751349f50ab00c0716955c72dc48866083013e172cf3c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e0f50edd29f0791cc076a3a2974b7456aaf2a96534da791b083248dc84fa6af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c80e0ef9426b7a764e7117a789d07cf4cf940a90f38fe3ce6b230f9bbd21bfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f72b0194cacf6d5d0c95ba804286d822d2f2e5a0f385c4c5fdf8559bf6240c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:44Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edf00db622558a346d238b2df6e90a686dc913634a1b5b4e8b010b5bf09a7290\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5fc091b21251a54d9eca892667bd681e944b35b6407316a8252562e837a1e265\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48ec991bd21bc3115bde4a99b484cd6e45e2e97d8a3a28dbbdf4e1d8323c6b66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48ec991bd21bc3115bde4a99b484cd6e45e2e97d8a3a28dbbdf4e1d8323c6b66\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T09:09:06Z\\\",\\\"message\\\":\\\"45 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-config-operator/machine-config-controller\\\\\\\"}\\\\nI0309 09:09:06.535081 6845 obj_retry.go:365] Adding new object: *v1.Pod 
openshift-dns/node-resolver-k4kdn\\\\nI0309 09:09:06.535093 6845 services_controller.go:360] Finished syncing service machine-config-controller on namespace openshift-machine-config-operator for network=default : 1.521797ms\\\\nI0309 09:09:06.535099 6845 ovn.go:134] Ensuring zone local for Pod openshift-dns/node-resolver-k4kdn in node crc\\\\nI0309 09:09:06.535103 6845 obj_retry.go:386] Retry successful for *v1.Pod openshift-dns/node-resolver-k4kdn after 0 failed attempt(s)\\\\nI0309 09:09:06.535109 6845 default_network_controller.go:776] Recording success event on pod openshift-dns/node-resolver-k4kdn\\\\nI0309 09:09:06.535110 6845 services_controller.go:356] Processing sync for service openshift-multus/multus-admission-controller for network=default\\\\nF0309 09:09:06.535119 6845 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not ad\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T09:09:05Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-lfm2j_openshift-ovn-kubernetes(740550e5-d1a4-4f0c-8efd-1ccd8f9319e5)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72a93106360fd23f597cf8fb963aca72a606f82557a5b17125a969e5b5d5918f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9abf97193d88754a49bcc12e1cfbe83371778e157d028dd3d1084abfa8526822\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9abf97193d88754a49
bcc12e1cfbe83371778e157d028dd3d1084abfa8526822\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:08:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lfm2j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:25Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:25 crc kubenswrapper[4792]: I0309 09:09:25.847831 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"274366f4-bdf7-4516-9559-b90b9947999e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f6d16099e4ca2921b039c7fac87c3a9b3ee4780783ad52207c77ea2891942d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76e023bbfb6d4a1c42830654b83b26c24cddefd808ee765ebef670e8b10910b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76e023bbfb6d4a1c42830654b83b26c24cddefd808ee765ebef670e8b10910b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:07:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:07:15Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:25Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:25 crc kubenswrapper[4792]: I0309 09:09:25.856438 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-k4kdn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cb7634b-66b7-4541-8e53-3e01a6cb41ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d395567d530702b468a31ec780cf0dcf356e2a07f4a28ee50329b702d8e53594\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5jf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-k4kdn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:25Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:25 crc kubenswrapper[4792]: I0309 09:09:25.864768 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fttpc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4711cce5-88a9-48c4-8e2e-522062e34a03\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjrbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjrbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fttpc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:25Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:25 crc 
kubenswrapper[4792]: I0309 09:09:25.881621 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a861a55a-f3b0-4a55-a961-7798ef57d3c0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48162a23c55321a8b6855318c9d71661a5cca0af913d359711cf6685b332994e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://58926f05ec9e42e1dc7aa5c3ab9950537c031f3838fb18036ce2a84b2b2ce147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1cc1ddaa2a5291284d4281d64b3d0aebf06cee6ef23da3a608ca06b240c8e4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a8bf3c896aae857de56db687c87ea164520667dc71fa495e3c396b478ab472e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d9d3cba4dfbe99ce715a9ee40af9171a41ccb193306f9b106588cc4b5f620e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea228ca60beeed2978d4efcc092828a381aaf0e00e0baf2a733c75cb15642aac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea228ca60beeed2978d4efcc092828a381aaf0e00e0baf2a733c75cb15642aac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:07:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e86823681f24457a83404aa4739988af2b8abf63506d49c1db09bee7501a7548\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e86823681f24457a83404aa4739988af2b8abf63506d49c1db09bee7501a7548\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:07:17Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://12c05ec878f2865338912d3d41f970b88835d7e7d9719a119eb4d06955850617\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://12c05ec878f2865338912d3d41f970b88835d7e7d9719a119eb4d06955850617\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:07:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:25Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:25 crc kubenswrapper[4792]: I0309 09:09:25.893769 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da622175994cf951cfa730455ea5a163271bc993fc5a1268ed072b944a612524\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:25Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:25 crc kubenswrapper[4792]: I0309 09:09:25.904742 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:25Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:25 crc kubenswrapper[4792]: I0309 09:09:25.917520 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vgtc9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"626ea896-2e5c-4478-a7be-34a19acc242d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57f7829120f56a8ab9ff342c3d9fd043ca5559518f8d818c0306764f491f4b3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4rw87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vgtc9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:25Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:25 crc kubenswrapper[4792]: I0309 09:09:25.928939 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4tprh" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b59ff3b-540d-4385-b02b-f68349bb74bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54d1a12d5c015eebcdc6e5d6e253c40a4dd476bbc12be0c63665aa6bc091f72f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54059591e4b093ca2c60bc4ea9f0b0a6c44077e07ebaa66ba
b76d10f0d2f0f9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54059591e4b093ca2c60bc4ea9f0b0a6c44077e07ebaa66bab76d10f0d2f0f9f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:08:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86174a85f5a0f642224f0a1f175358d157641043ca6e1b88e119e7de171b5582\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86174a85f5a0f642224f0a1f175358d157641043ca6e1b88e119e7de171b5582\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:08:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:
08:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb6462fecadedf78b23e5a2276698144ffea837649f3c15664a573c9724f7780\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb6462fecadedf78b23e5a2276698144ffea837649f3c15664a573c9724f7780\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:08:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://42715848a7dace5208b36242c41aaa8f3c96fe5f7f6320b8acd5f2ea52a338bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42715848a7dace5208b36242c41aaa8f3c96fe5f7f6320b8acd5f2ea52a338bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:08:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:08:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b4eb0778c3ebce9cfec4abfead327d02f8d401aa11ce1578a7cbba1b9fe8854\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b4eb0778c3ebce9cfec4abfead327d02f8d401aa11ce1578a7cbba1b9fe8854\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:08:48Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87e9a77e1689ff7f403824349c08b6d6c85b408875ff852f56090cf3fe2bb487\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87e9a77e1689ff7f403824349c08b6d6c85b408875ff852f56090cf3fe2bb487\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:08:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:08:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4tprh\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:25Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:25 crc kubenswrapper[4792]: I0309 09:09:25.937698 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ssk9p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ceed39a1-2e4f-4630-bda0-57071ac26ee4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://648c51e35617b16c1bc2e0f86ea31ee9256943628595864fc27951fce17197cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"
state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7tn4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a6364e9d51e6626caf8b49f387f23eb4bfdfc13e149fb8847b7a1dd1637bf79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7tn4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ssk9p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet 
valid: current time 2026-03-09T09:09:25Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:26 crc kubenswrapper[4792]: I0309 09:09:26.662227 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 09:09:26 crc kubenswrapper[4792]: I0309 09:09:26.662252 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fttpc" Mar 09 09:09:26 crc kubenswrapper[4792]: I0309 09:09:26.662289 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 09:09:26 crc kubenswrapper[4792]: E0309 09:09:26.662350 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 09:09:26 crc kubenswrapper[4792]: E0309 09:09:26.662458 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 09:09:26 crc kubenswrapper[4792]: E0309 09:09:26.662600 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-fttpc" podUID="4711cce5-88a9-48c4-8e2e-522062e34a03" Mar 09 09:09:26 crc kubenswrapper[4792]: I0309 09:09:26.662942 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 09:09:26 crc kubenswrapper[4792]: E0309 09:09:26.663136 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 09:09:27 crc kubenswrapper[4792]: I0309 09:09:27.080312 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:09:27 crc kubenswrapper[4792]: I0309 09:09:27.080348 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:09:27 crc kubenswrapper[4792]: I0309 09:09:27.080357 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:09:27 crc kubenswrapper[4792]: I0309 09:09:27.080369 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:09:27 crc kubenswrapper[4792]: I0309 09:09:27.080378 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:09:27Z","lastTransitionTime":"2026-03-09T09:09:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:09:27 crc kubenswrapper[4792]: E0309 09:09:27.095801 4792 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:09:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:09:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:09:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:09:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:09:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:09:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:09:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:09:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e3b5ac96-f3df-45c5-a4ac-24aa5703690c\\\",\\\"systemUUID\\\":\\\"838abbcf-5467-42bb-9eb7-be30fe4962bb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:27Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:27 crc kubenswrapper[4792]: I0309 09:09:27.101143 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:09:27 crc kubenswrapper[4792]: I0309 09:09:27.101182 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:09:27 crc kubenswrapper[4792]: I0309 09:09:27.101196 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:09:27 crc kubenswrapper[4792]: I0309 09:09:27.101216 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:09:27 crc kubenswrapper[4792]: I0309 09:09:27.101231 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:09:27Z","lastTransitionTime":"2026-03-09T09:09:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:09:27 crc kubenswrapper[4792]: E0309 09:09:27.120653 4792 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:09:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:09:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:09:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:09:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:09:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:09:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:09:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:09:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e3b5ac96-f3df-45c5-a4ac-24aa5703690c\\\",\\\"systemUUID\\\":\\\"838abbcf-5467-42bb-9eb7-be30fe4962bb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:27Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:27 crc kubenswrapper[4792]: I0309 09:09:27.123988 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:09:27 crc kubenswrapper[4792]: I0309 09:09:27.124020 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:09:27 crc kubenswrapper[4792]: I0309 09:09:27.124028 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:09:27 crc kubenswrapper[4792]: I0309 09:09:27.124041 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:09:27 crc kubenswrapper[4792]: I0309 09:09:27.124063 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:09:27Z","lastTransitionTime":"2026-03-09T09:09:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:09:27 crc kubenswrapper[4792]: E0309 09:09:27.135834 4792 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:09:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:09:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:09:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:09:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:09:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:09:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:09:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:09:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e3b5ac96-f3df-45c5-a4ac-24aa5703690c\\\",\\\"systemUUID\\\":\\\"838abbcf-5467-42bb-9eb7-be30fe4962bb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:27Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:27 crc kubenswrapper[4792]: I0309 09:09:27.139572 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:09:27 crc kubenswrapper[4792]: I0309 09:09:27.139607 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:09:27 crc kubenswrapper[4792]: I0309 09:09:27.139616 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:09:27 crc kubenswrapper[4792]: I0309 09:09:27.139628 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:09:27 crc kubenswrapper[4792]: I0309 09:09:27.139638 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:09:27Z","lastTransitionTime":"2026-03-09T09:09:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:09:27 crc kubenswrapper[4792]: E0309 09:09:27.156451 4792 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:09:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:09:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:09:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:09:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:09:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:09:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:09:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:09:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e3b5ac96-f3df-45c5-a4ac-24aa5703690c\\\",\\\"systemUUID\\\":\\\"838abbcf-5467-42bb-9eb7-be30fe4962bb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:27Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:27 crc kubenswrapper[4792]: I0309 09:09:27.160766 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:09:27 crc kubenswrapper[4792]: I0309 09:09:27.160831 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:09:27 crc kubenswrapper[4792]: I0309 09:09:27.160851 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:09:27 crc kubenswrapper[4792]: I0309 09:09:27.160881 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:09:27 crc kubenswrapper[4792]: I0309 09:09:27.160899 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:09:27Z","lastTransitionTime":"2026-03-09T09:09:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:09:27 crc kubenswrapper[4792]: E0309 09:09:27.178784 4792 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:09:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:09:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:09:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:09:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:09:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:09:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:09:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:09:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e3b5ac96-f3df-45c5-a4ac-24aa5703690c\\\",\\\"systemUUID\\\":\\\"838abbcf-5467-42bb-9eb7-be30fe4962bb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:27Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:27 crc kubenswrapper[4792]: E0309 09:09:27.178889 4792 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 09 09:09:28 crc kubenswrapper[4792]: I0309 09:09:28.661441 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fttpc" Mar 09 09:09:28 crc kubenswrapper[4792]: I0309 09:09:28.661495 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 09:09:28 crc kubenswrapper[4792]: I0309 09:09:28.661495 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 09:09:28 crc kubenswrapper[4792]: I0309 09:09:28.661584 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 09:09:28 crc kubenswrapper[4792]: E0309 09:09:28.661675 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-fttpc" podUID="4711cce5-88a9-48c4-8e2e-522062e34a03" Mar 09 09:09:28 crc kubenswrapper[4792]: E0309 09:09:28.661932 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 09:09:28 crc kubenswrapper[4792]: E0309 09:09:28.661997 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 09:09:28 crc kubenswrapper[4792]: E0309 09:09:28.662140 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 09:09:29 crc kubenswrapper[4792]: I0309 09:09:29.663895 4792 scope.go:117] "RemoveContainer" containerID="48ec991bd21bc3115bde4a99b484cd6e45e2e97d8a3a28dbbdf4e1d8323c6b66" Mar 09 09:09:30 crc kubenswrapper[4792]: I0309 09:09:30.228490 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 09:09:30 crc kubenswrapper[4792]: I0309 09:09:30.244950 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4403f41d10eba801e3729ecffdc397e603f3c4899d1a62e02cb35418645d1ca4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\
\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://655b419f4453a7f0fff411f579326c7fda157c08267b1dcf23f7e1d11b684c20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:30Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:30 crc kubenswrapper[4792]: I0309 09:09:30.258588 4792 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lfm2j_740550e5-d1a4-4f0c-8efd-1ccd8f9319e5/ovnkube-controller/2.log" Mar 09 09:09:30 crc kubenswrapper[4792]: I0309 09:09:30.261532 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lfm2j" event={"ID":"740550e5-d1a4-4f0c-8efd-1ccd8f9319e5","Type":"ContainerStarted","Data":"1244a18a5a9df2128cb16421f7d44aba05bb92b6a91b26fc2845847a9b4c91bf"} Mar 09 09:09:30 crc kubenswrapper[4792]: I0309 09:09:30.261971 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-lfm2j" Mar 09 09:09:30 crc kubenswrapper[4792]: I0309 09:09:30.264796 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:30Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:30 crc kubenswrapper[4792]: I0309 09:09:30.279373 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:30Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:30 crc kubenswrapper[4792]: I0309 09:09:30.291482 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-97tth" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd11045a-d746-4b42-872c-8b8d1dd2d515\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96248640b95891716f26355ace06d60da675ab3aa8086e6a7c94ad528fc1357d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zdqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d060627a577507a2b0030b6aea753d50e0c6766a
c4876d95ac5d9d3401f9b818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zdqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-97tth\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:30Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:30 crc kubenswrapper[4792]: I0309 09:09:30.303500 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab83a4df67ef256deece5dabe54496035a360833d9b5e926a1045196734d2c24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-09T09:09:30Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:30 crc kubenswrapper[4792]: I0309 09:09:30.324182 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7e46817-10cf-448c-8a2a-154f1c322ce6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:09:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:09:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1070465f72d99ed22e913112259837db7d789c0a072b40956088f4a70162c41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\"
:\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://314838f53bc19a9f3eb7fd9d3f5473b23a177f2a3068d91f1b0420c27910d409\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://226eecaea6fec5a3ae93063702c719edf3908636a2862b9f50a874f494a19ccf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53c53d788666585dd6b1fb214b1f79df5bffadc17ff401b1bed44c93e41258dc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiser
ver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://420959a5229bf4ed5e6b94cf4a6685b96e4b50d13055cbccf66b6188e47bc770\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T09:08:27Z\\\",\\\"message\\\":\\\"le observer\\\\nW0309 09:08:27.070870 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0309 09:08:27.070998 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 09:08:27.071818 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2825204129/tls.crt::/tmp/serving-cert-2825204129/tls.key\\\\\\\"\\\\nI0309 09:08:27.485720 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0309 09:08:27.487343 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0309 09:08:27.487361 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0309 09:08:27.487383 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0309 09:08:27.487388 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0309 09:08:27.490810 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0309 09:08:27.490923 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 09:08:27.490952 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 09:08:27.490979 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0309 09:08:27.491005 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0309 
09:08:27.491032 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0309 09:08:27.491060 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0309 09:08:27.490826 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0309 09:08:27.491911 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T09:08:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:09:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf484f9a832b0e147a17ec53e64cfcda5e37f8bf1f764ddc35215a079994b71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:18Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e250b106997f151ae9c435aca2ab3d3d821f40e826afa9ff744443a6b808571\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/oc
p-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e250b106997f151ae9c435aca2ab3d3d821f40e826afa9ff744443a6b808571\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:07:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:07:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:30Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:30 crc kubenswrapper[4792]: I0309 09:09:30.341110 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"72206549-3056-4411-ad65-3bfc0456b8a9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://652ed63f63bc7b81328792679f59e5d748feb5114a97f57df7ba90f3d272feff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f14066d111e6fa1a8c98be79fb37f7a32d143d503ec51e49a36d039b7d464b4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T09:07:43Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0309 09:07:17.893452 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0309 09:07:17.896032 1 observer_polling.go:159] Starting file observer\\\\nI0309 09:07:17.940019 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0309 09:07:17.944655 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0309 09:07:43.887477 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0309 09:07:43.887591 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T09:07:17Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82feb47e68b8db8323ed2c02d83e92016fa30e024581d7e361ba07a08919e2ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\
\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://544366c65452de29fd10c69bba980e991f5e2a3a09e98a9e66a050b1a06d4280\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3d9cf24a9d5a60bcaea9c6889c23037b8b47d7eb60c2458147579cd9ec75176\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"n
ame\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:07:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:30Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:30 crc kubenswrapper[4792]: I0309 09:09:30.352828 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fgk47" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84b6eb44-ca33-41a6-a951-2c66688ad860\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3b4aea0ac4cc05af599b411f32e712d0955e5052afc55c0d0be66e9a1249223\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/
ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cpkhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fgk47\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:30Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:30 crc kubenswrapper[4792]: I0309 09:09:30.376913 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lfm2j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"740550e5-d1a4-4f0c-8efd-1ccd8f9319e5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d93911614f6785ac12751349f50ab00c0716955c72dc48866083013e172cf3c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e0f50edd29f0791cc076a3a2974b7456aaf2a96534da791b083248dc84fa6af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c80e0ef9426b7a764e7117a789d07cf4cf940a90f38fe3ce6b230f9bbd21bfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f72b0194cacf6d5d0c95ba804286d822d2f2e5a0f385c4c5fdf8559bf6240c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:44Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edf00db622558a346d238b2df6e90a686dc913634a1b5b4e8b010b5bf09a7290\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5fc091b21251a54d9eca892667bd681e944b35b6407316a8252562e837a1e265\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48ec991bd21bc3115bde4a99b484cd6e45e2e97d8a3a28dbbdf4e1d8323c6b66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48ec991bd21bc3115bde4a99b484cd6e45e2e97d8a3a28dbbdf4e1d8323c6b66\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T09:09:06Z\\\",\\\"message\\\":\\\"45 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-config-operator/machine-config-controller\\\\\\\"}\\\\nI0309 09:09:06.535081 6845 obj_retry.go:365] Adding new object: *v1.Pod 
openshift-dns/node-resolver-k4kdn\\\\nI0309 09:09:06.535093 6845 services_controller.go:360] Finished syncing service machine-config-controller on namespace openshift-machine-config-operator for network=default : 1.521797ms\\\\nI0309 09:09:06.535099 6845 ovn.go:134] Ensuring zone local for Pod openshift-dns/node-resolver-k4kdn in node crc\\\\nI0309 09:09:06.535103 6845 obj_retry.go:386] Retry successful for *v1.Pod openshift-dns/node-resolver-k4kdn after 0 failed attempt(s)\\\\nI0309 09:09:06.535109 6845 default_network_controller.go:776] Recording success event on pod openshift-dns/node-resolver-k4kdn\\\\nI0309 09:09:06.535110 6845 services_controller.go:356] Processing sync for service openshift-multus/multus-admission-controller for network=default\\\\nF0309 09:09:06.535119 6845 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not ad\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T09:09:05Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-lfm2j_openshift-ovn-kubernetes(740550e5-d1a4-4f0c-8efd-1ccd8f9319e5)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72a93106360fd23f597cf8fb963aca72a606f82557a5b17125a969e5b5d5918f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9abf97193d88754a49bcc12e1cfbe83371778e157d028dd3d1084abfa8526822\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9abf97193d88754a49
bcc12e1cfbe83371778e157d028dd3d1084abfa8526822\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:08:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lfm2j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:30Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:30 crc kubenswrapper[4792]: I0309 09:09:30.392500 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"274366f4-bdf7-4516-9559-b90b9947999e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f6d16099e4ca2921b039c7fac87c3a9b3ee4780783ad52207c77ea2891942d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76e023bbfb6d4a1c42830654b83b26c24cddefd808ee765ebef670e8b10910b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76e023bbfb6d4a1c42830654b83b26c24cddefd808ee765ebef670e8b10910b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:07:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:07:15Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:30Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:30 crc kubenswrapper[4792]: I0309 09:09:30.405599 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-k4kdn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cb7634b-66b7-4541-8e53-3e01a6cb41ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d395567d530702b468a31ec780cf0dcf356e2a07f4a28ee50329b702d8e53594\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5jf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-k4kdn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:30Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:30 crc kubenswrapper[4792]: I0309 09:09:30.415468 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fttpc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4711cce5-88a9-48c4-8e2e-522062e34a03\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjrbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjrbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fttpc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:30Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:30 crc 
kubenswrapper[4792]: I0309 09:09:30.442989 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a861a55a-f3b0-4a55-a961-7798ef57d3c0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48162a23c55321a8b6855318c9d71661a5cca0af913d359711cf6685b332994e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://58926f05ec9e42e1dc7aa5c3ab9950537c031f3838fb18036ce2a84b2b2ce147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1cc1ddaa2a5291284d4281d64b3d0aebf06cee6ef23da3a608ca06b240c8e4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a8bf3c896aae857de56db687c87ea164520667dc71fa495e3c396b478ab472e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d9d3cba4dfbe99ce715a9ee40af9171a41ccb193306f9b106588cc4b5f620e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea228ca60beeed2978d4efcc092828a381aaf0e00e0baf2a733c75cb15642aac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea228ca60beeed2978d4efcc092828a381aaf0e00e0baf2a733c75cb15642aac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:07:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e86823681f24457a83404aa4739988af2b8abf63506d49c1db09bee7501a7548\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e86823681f24457a83404aa4739988af2b8abf63506d49c1db09bee7501a7548\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:07:17Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://12c05ec878f2865338912d3d41f970b88835d7e7d9719a119eb4d06955850617\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://12c05ec878f2865338912d3d41f970b88835d7e7d9719a119eb4d06955850617\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:07:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:30Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:30 crc kubenswrapper[4792]: I0309 09:09:30.456289 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da622175994cf951cfa730455ea5a163271bc993fc5a1268ed072b944a612524\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:30Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:30 crc kubenswrapper[4792]: I0309 09:09:30.472430 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:30Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:30 crc kubenswrapper[4792]: I0309 09:09:30.487429 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vgtc9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"626ea896-2e5c-4478-a7be-34a19acc242d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57f7829120f56a8ab9ff342c3d9fd043ca5559518f8d818c0306764f491f4b3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4rw87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vgtc9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:30Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:30 crc kubenswrapper[4792]: I0309 09:09:30.504351 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4tprh" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b59ff3b-540d-4385-b02b-f68349bb74bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54d1a12d5c015eebcdc6e5d6e253c40a4dd476bbc12be0c63665aa6bc091f72f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54059591e4b093ca2c60bc4ea9f0b0a6c44077e07ebaa66ba
b76d10f0d2f0f9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54059591e4b093ca2c60bc4ea9f0b0a6c44077e07ebaa66bab76d10f0d2f0f9f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:08:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86174a85f5a0f642224f0a1f175358d157641043ca6e1b88e119e7de171b5582\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86174a85f5a0f642224f0a1f175358d157641043ca6e1b88e119e7de171b5582\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:08:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:
08:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb6462fecadedf78b23e5a2276698144ffea837649f3c15664a573c9724f7780\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb6462fecadedf78b23e5a2276698144ffea837649f3c15664a573c9724f7780\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:08:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://42715848a7dace5208b36242c41aaa8f3c96fe5f7f6320b8acd5f2ea52a338bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42715848a7dace5208b36242c41aaa8f3c96fe5f7f6320b8acd5f2ea52a338bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:08:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:08:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b4eb0778c3ebce9cfec4abfead327d02f8d401aa11ce1578a7cbba1b9fe8854\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b4eb0778c3ebce9cfec4abfead327d02f8d401aa11ce1578a7cbba1b9fe8854\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:08:48Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87e9a77e1689ff7f403824349c08b6d6c85b408875ff852f56090cf3fe2bb487\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87e9a77e1689ff7f403824349c08b6d6c85b408875ff852f56090cf3fe2bb487\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:08:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:08:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4tprh\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:30Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:30 crc kubenswrapper[4792]: I0309 09:09:30.518957 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ssk9p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ceed39a1-2e4f-4630-bda0-57071ac26ee4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://648c51e35617b16c1bc2e0f86ea31ee9256943628595864fc27951fce17197cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"
state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7tn4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a6364e9d51e6626caf8b49f387f23eb4bfdfc13e149fb8847b7a1dd1637bf79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7tn4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ssk9p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet 
valid: current time 2026-03-09T09:09:30Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:30 crc kubenswrapper[4792]: I0309 09:09:30.532028 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:30Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:30 crc kubenswrapper[4792]: I0309 09:09:30.546324 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vgtc9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"626ea896-2e5c-4478-a7be-34a19acc242d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57f7829120f56a8ab9ff342c3d9fd043ca5559518f8d818c0306764f491f4b3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4rw87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vgtc9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:30Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:30 crc kubenswrapper[4792]: I0309 09:09:30.562169 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4tprh" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b59ff3b-540d-4385-b02b-f68349bb74bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54d1a12d5c015eebcdc6e5d6e253c40a4dd476bbc12be0c63665aa6bc091f72f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54059591e4b093ca2c60bc4ea9f0b0a6c44077e07ebaa66ba
b76d10f0d2f0f9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54059591e4b093ca2c60bc4ea9f0b0a6c44077e07ebaa66bab76d10f0d2f0f9f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:08:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86174a85f5a0f642224f0a1f175358d157641043ca6e1b88e119e7de171b5582\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86174a85f5a0f642224f0a1f175358d157641043ca6e1b88e119e7de171b5582\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:08:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:
08:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb6462fecadedf78b23e5a2276698144ffea837649f3c15664a573c9724f7780\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb6462fecadedf78b23e5a2276698144ffea837649f3c15664a573c9724f7780\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:08:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://42715848a7dace5208b36242c41aaa8f3c96fe5f7f6320b8acd5f2ea52a338bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42715848a7dace5208b36242c41aaa8f3c96fe5f7f6320b8acd5f2ea52a338bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:08:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:08:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b4eb0778c3ebce9cfec4abfead327d02f8d401aa11ce1578a7cbba1b9fe8854\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b4eb0778c3ebce9cfec4abfead327d02f8d401aa11ce1578a7cbba1b9fe8854\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:08:48Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87e9a77e1689ff7f403824349c08b6d6c85b408875ff852f56090cf3fe2bb487\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87e9a77e1689ff7f403824349c08b6d6c85b408875ff852f56090cf3fe2bb487\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:08:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:08:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4tprh\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:30Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:30 crc kubenswrapper[4792]: I0309 09:09:30.574210 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ssk9p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ceed39a1-2e4f-4630-bda0-57071ac26ee4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://648c51e35617b16c1bc2e0f86ea31ee9256943628595864fc27951fce17197cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"
state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7tn4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a6364e9d51e6626caf8b49f387f23eb4bfdfc13e149fb8847b7a1dd1637bf79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7tn4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ssk9p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet 
valid: current time 2026-03-09T09:09:30Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:30 crc kubenswrapper[4792]: I0309 09:09:30.596729 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a861a55a-f3b0-4a55-a961-7798ef57d3c0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48162a23c55321a8b6855318c9d71661a5cca0af913d359711cf6685b332994e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\
":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58926f05ec9e42e1dc7aa5c3ab9950537c031f3838fb18036ce2a84b2b2ce147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1cc1ddaa2a5291284d4281d64b3d0aebf06cee6ef23da3a608ca06b240c8e4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a8bf3c896aae857de56db687c87ea164520667dc71fa495e3c396b478ab472e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d9d3cba4dfbe99ce715a9ee40af9171a41ccb193306f9b106588cc4b5f620e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea228ca60beeed2978d4efcc092828a381aaf0e00e0baf2a733c75cb15642aac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"na
me\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea228ca60beeed2978d4efcc092828a381aaf0e00e0baf2a733c75cb15642aac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:07:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e86823681f24457a83404aa4739988af2b8abf63506d49c1db09bee7501a7548\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e86823681f24457a83404aa4739988af2b8abf63506d49c1db09bee7501a7548\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:07:17Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://12c05ec878f2865338912d3d41f970b88835d7e7d9719a119eb4d06955850617\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://12c05ec878f2865338912d3d41f970b88835d7e7d9719a119eb4d06955850617\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:07:18Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-09T09:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:07:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:30Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:30 crc kubenswrapper[4792]: I0309 09:09:30.610727 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da622175994cf951cfa730455ea5a163271bc993fc5a1268ed072b944a612524\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:30Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:30 crc kubenswrapper[4792]: I0309 09:09:30.621407 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:30Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:30 crc kubenswrapper[4792]: I0309 09:09:30.631519 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-97tth" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd11045a-d746-4b42-872c-8b8d1dd2d515\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96248640b95891716f26355ace06d60da675ab3aa8086e6a7c94ad528fc1357d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zdqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d060627a577507a2b0030b6aea753d50e0c6766a
c4876d95ac5d9d3401f9b818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zdqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-97tth\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:30Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:30 crc kubenswrapper[4792]: I0309 09:09:30.641210 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab83a4df67ef256deece5dabe54496035a360833d9b5e926a1045196734d2c24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-09T09:09:30Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:30 crc kubenswrapper[4792]: I0309 09:09:30.651640 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4403f41d10eba801e3729ecffdc397e603f3c4899d1a62e02cb35418645d1ca4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://655b419f4453a7f
0fff411f579326c7fda157c08267b1dcf23f7e1d11b684c20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:30Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:30 crc kubenswrapper[4792]: I0309 09:09:30.661685 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 09:09:30 crc kubenswrapper[4792]: I0309 09:09:30.661704 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fttpc" Mar 09 09:09:30 crc kubenswrapper[4792]: I0309 09:09:30.661704 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 09:09:30 crc kubenswrapper[4792]: I0309 09:09:30.661841 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 09:09:30 crc kubenswrapper[4792]: E0309 09:09:30.661929 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fttpc" podUID="4711cce5-88a9-48c4-8e2e-522062e34a03" Mar 09 09:09:30 crc kubenswrapper[4792]: E0309 09:09:30.661836 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 09:09:30 crc kubenswrapper[4792]: E0309 09:09:30.661995 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 09:09:30 crc kubenswrapper[4792]: E0309 09:09:30.662035 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 09:09:30 crc kubenswrapper[4792]: I0309 09:09:30.662043 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The 
container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:30Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:30 crc kubenswrapper[4792]: I0309 09:09:30.670896 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fgk47" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"84b6eb44-ca33-41a6-a951-2c66688ad860\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3b4aea0ac4cc05af599b411f32e712d0955e5052afc55c0d0be66e9a1249223\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cpkhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fgk47\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:30Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:30 crc kubenswrapper[4792]: I0309 09:09:30.686882 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lfm2j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"740550e5-d1a4-4f0c-8efd-1ccd8f9319e5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d93911614f6785ac12751349f50ab00c0716955c72dc48866083013e172cf3c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e0f50edd29f0791cc076a3a2974b7456aaf2a96534da791b083248dc84fa6af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c80e0ef9426b7a764e7117a789d07cf4cf940a90f38fe3ce6b230f9bbd21bfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f72b0194cacf6d5d0c95ba804286d822d2f2e5a0f385c4c5fdf8559bf6240c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:44Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edf00db622558a346d238b2df6e90a686dc913634a1b5b4e8b010b5bf09a7290\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5fc091b21251a54d9eca892667bd681e944b35b6407316a8252562e837a1e265\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1244a18a5a9df2128cb16421f7d44aba05bb92b6a91b26fc2845847a9b4c91bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48ec991bd21bc3115bde4a99b484cd6e45e2e97d8a3a28dbbdf4e1d8323c6b66\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T09:09:06Z\\\",\\\"message\\\":\\\"45 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-config-operator/machine-config-controller\\\\\\\"}\\\\nI0309 09:09:06.535081 6845 obj_retry.go:365] Adding new object: *v1.Pod 
openshift-dns/node-resolver-k4kdn\\\\nI0309 09:09:06.535093 6845 services_controller.go:360] Finished syncing service machine-config-controller on namespace openshift-machine-config-operator for network=default : 1.521797ms\\\\nI0309 09:09:06.535099 6845 ovn.go:134] Ensuring zone local for Pod openshift-dns/node-resolver-k4kdn in node crc\\\\nI0309 09:09:06.535103 6845 obj_retry.go:386] Retry successful for *v1.Pod openshift-dns/node-resolver-k4kdn after 0 failed attempt(s)\\\\nI0309 09:09:06.535109 6845 default_network_controller.go:776] Recording success event on pod openshift-dns/node-resolver-k4kdn\\\\nI0309 09:09:06.535110 6845 services_controller.go:356] Processing sync for service openshift-multus/multus-admission-controller for network=default\\\\nF0309 09:09:06.535119 6845 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not 
ad\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T09:09:05Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\
":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72a93106360fd23f597cf8fb963aca72a606f82557a5b17125a969e5b5d5918f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9abf97193d88754a49bcc12e1cfbe83371778e157d028dd3d1084abfa8526822\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9abf97193d88754a49bcc12e1cfbe83371778e157d028dd3d1084abfa8526822\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:08:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lfm2j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:30Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:30 crc kubenswrapper[4792]: I0309 09:09:30.699188 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7e46817-10cf-448c-8a2a-154f1c322ce6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:09:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:09:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1070465f72d99ed22e913112259837db7d789c0a072b40956088f4a70162c41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://314838f53bc19a9f3eb7fd9d3f5473b23a177f2a3068d91f1b0420c27910d409\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://226eecaea6fec5a3ae93063702c719edf3908636a2862b9f50a874f494a19ccf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53c53d788666585dd6b1fb214b1f79df5bffadc17ff401b1bed44c93e41258dc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://420959a5229bf4ed5e6b94cf4a6685b96e4b50d13055cbccf66b6188e47bc770\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T09:08:27Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0309 09:08:27.070870 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0309 09:08:27.070998 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 09:08:27.071818 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2825204129/tls.crt::/tmp/serving-cert-2825204129/tls.key\\\\\\\"\\\\nI0309 09:08:27.485720 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0309 09:08:27.487343 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0309 09:08:27.487361 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0309 09:08:27.487383 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0309 09:08:27.487388 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0309 09:08:27.490810 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0309 09:08:27.490923 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 09:08:27.490952 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 09:08:27.490979 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0309 09:08:27.491005 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0309 09:08:27.491032 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0309 09:08:27.491060 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0309 09:08:27.490826 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0309 09:08:27.491911 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T09:08:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:09:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf484f9a832b0e147a17ec53e64cfcda5e37f8bf1f764ddc35215a079994b71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:18Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e250b106997f151ae9c435aca2ab3d3d821f40e826afa9ff744443a6b808571\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e250b106997f151ae9c435aca2ab3d3d
821f40e826afa9ff744443a6b808571\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:07:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:07:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:30Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:30 crc kubenswrapper[4792]: I0309 09:09:30.709602 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"72206549-3056-4411-ad65-3bfc0456b8a9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://652ed63f63bc7b81328792679f59e5d748feb5114a97f57df7ba90f3d272feff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f14066d111e6fa1a8c98be79fb37f7a32d143d503ec51e49a36d039b7d464b4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T09:07:43Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0309 09:07:17.893452 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0309 09:07:17.896032 1 observer_polling.go:159] Starting file observer\\\\nI0309 09:07:17.940019 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0309 09:07:17.944655 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0309 09:07:43.887477 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0309 09:07:43.887591 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T09:07:17Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82feb47e68b8db8323ed2c02d83e92016fa30e024581d7e361ba07a08919e2ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\
\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://544366c65452de29fd10c69bba980e991f5e2a3a09e98a9e66a050b1a06d4280\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3d9cf24a9d5a60bcaea9c6889c23037b8b47d7eb60c2458147579cd9ec75176\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"n
ame\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:07:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:30Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:30 crc kubenswrapper[4792]: I0309 09:09:30.718396 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fttpc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4711cce5-88a9-48c4-8e2e-522062e34a03\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjrbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjrbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fttpc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:30Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:30 crc 
kubenswrapper[4792]: I0309 09:09:30.727590 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"274366f4-bdf7-4516-9559-b90b9947999e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f6d16099e4ca2921b039c7fac87c3a9b3ee4780783ad52207c77ea2891942d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\
":\\\"cri-o://76e023bbfb6d4a1c42830654b83b26c24cddefd808ee765ebef670e8b10910b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76e023bbfb6d4a1c42830654b83b26c24cddefd808ee765ebef670e8b10910b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:07:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:07:15Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:30Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:30 crc kubenswrapper[4792]: I0309 09:09:30.736214 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-k4kdn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cb7634b-66b7-4541-8e53-3e01a6cb41ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d395567d530702b468a31ec780cf0dcf356e2a07f4a28ee50329b702d8e53594\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5jf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-k4kdn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:30Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:30 crc kubenswrapper[4792]: E0309 09:09:30.780552 4792 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 09 09:09:31 crc kubenswrapper[4792]: I0309 09:09:31.268024 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lfm2j_740550e5-d1a4-4f0c-8efd-1ccd8f9319e5/ovnkube-controller/3.log" Mar 09 09:09:31 crc kubenswrapper[4792]: I0309 09:09:31.269299 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lfm2j_740550e5-d1a4-4f0c-8efd-1ccd8f9319e5/ovnkube-controller/2.log" Mar 09 09:09:31 crc kubenswrapper[4792]: I0309 09:09:31.273997 4792 generic.go:334] "Generic (PLEG): container finished" podID="740550e5-d1a4-4f0c-8efd-1ccd8f9319e5" containerID="1244a18a5a9df2128cb16421f7d44aba05bb92b6a91b26fc2845847a9b4c91bf" exitCode=1 Mar 09 09:09:31 crc kubenswrapper[4792]: I0309 09:09:31.274059 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lfm2j" event={"ID":"740550e5-d1a4-4f0c-8efd-1ccd8f9319e5","Type":"ContainerDied","Data":"1244a18a5a9df2128cb16421f7d44aba05bb92b6a91b26fc2845847a9b4c91bf"} Mar 09 09:09:31 crc kubenswrapper[4792]: I0309 09:09:31.274139 4792 scope.go:117] "RemoveContainer" containerID="48ec991bd21bc3115bde4a99b484cd6e45e2e97d8a3a28dbbdf4e1d8323c6b66" 
Mar 09 09:09:31 crc kubenswrapper[4792]: I0309 09:09:31.274872 4792 scope.go:117] "RemoveContainer" containerID="1244a18a5a9df2128cb16421f7d44aba05bb92b6a91b26fc2845847a9b4c91bf" Mar 09 09:09:31 crc kubenswrapper[4792]: E0309 09:09:31.275088 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-lfm2j_openshift-ovn-kubernetes(740550e5-d1a4-4f0c-8efd-1ccd8f9319e5)\"" pod="openshift-ovn-kubernetes/ovnkube-node-lfm2j" podUID="740550e5-d1a4-4f0c-8efd-1ccd8f9319e5" Mar 09 09:09:31 crc kubenswrapper[4792]: I0309 09:09:31.295239 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"274366f4-bdf7-4516-9559-b90b9947999e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f6d16099e4ca2921b039c7fac87c3a9b3ee4780783ad52207c77ea2891942d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76e023bbfb6d4a1c42830654b83b26c24cddefd808ee765ebef670e8b10910b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76e023bbfb6d4a1c42830654b83b26c24cddefd808ee765ebef670e8b10910b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:07:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:07:15Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet 
valid: current time 2026-03-09T09:09:31Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:31 crc kubenswrapper[4792]: I0309 09:09:31.309378 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-k4kdn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cb7634b-66b7-4541-8e53-3e01a6cb41ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d395567d530702b468a31ec780cf0dcf356e2a07f4a28ee50329b702d8e53594\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-vh5jf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-k4kdn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:31Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:31 crc kubenswrapper[4792]: I0309 09:09:31.326696 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fttpc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4711cce5-88a9-48c4-8e2e-522062e34a03\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjrbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjrbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fttpc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:31Z is after 2025-08-24T17:21:41Z" Mar 
09 09:09:31 crc kubenswrapper[4792]: I0309 09:09:31.361352 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a861a55a-f3b0-4a55-a961-7798ef57d3c0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48162a23c55321a8b6855318c9d71661a5cca0af913d359711cf6685b332994e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\
\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58926f05ec9e42e1dc7aa5c3ab9950537c031f3838fb18036ce2a84b2b2ce147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1cc1ddaa2a5291284d4281d64b3d0aebf06cee6ef23da3a608ca06b240c8e4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a8bf3c896aae857de56db687c87ea164520667dc71fa495e3c396b478ab472e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847
b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d9d3cba4dfbe99ce715a9ee40af9171a41ccb193306f9b106588cc4b5f620e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea228ca60beeed2978d4efcc092828a381aaf0e00e0baf2a733c75cb15642aac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\
\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea228ca60beeed2978d4efcc092828a381aaf0e00e0baf2a733c75cb15642aac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:07:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e86823681f24457a83404aa4739988af2b8abf63506d49c1db09bee7501a7548\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e86823681f24457a83404aa4739988af2b8abf63506d49c1db09bee7501a7548\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:07:17Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://12c05ec878f2865338912d3d41f970b88835d7e7d9719a119eb4d06955850617\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://12c05ec878f2865338912d3d41f970b88835d7e7d9719a119eb4d06955850617\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:07:18Z\\\"}},\\\"volumeMount
s\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:07:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:31Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:31 crc kubenswrapper[4792]: I0309 09:09:31.375062 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da622175994cf951cfa730455ea5a163271bc993fc5a1268ed072b944a612524\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:31Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:31 crc kubenswrapper[4792]: I0309 09:09:31.387737 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:31Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:31 crc kubenswrapper[4792]: I0309 09:09:31.402185 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vgtc9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"626ea896-2e5c-4478-a7be-34a19acc242d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57f7829120f56a8ab9ff342c3d9fd043ca5559518f8d818c0306764f491f4b3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4rw87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vgtc9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:31Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:31 crc kubenswrapper[4792]: I0309 09:09:31.418786 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4tprh" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b59ff3b-540d-4385-b02b-f68349bb74bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54d1a12d5c015eebcdc6e5d6e253c40a4dd476bbc12be0c63665aa6bc091f72f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54059591e4b093ca2c60bc4ea9f0b0a6c44077e07ebaa66ba
b76d10f0d2f0f9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54059591e4b093ca2c60bc4ea9f0b0a6c44077e07ebaa66bab76d10f0d2f0f9f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:08:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86174a85f5a0f642224f0a1f175358d157641043ca6e1b88e119e7de171b5582\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86174a85f5a0f642224f0a1f175358d157641043ca6e1b88e119e7de171b5582\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:08:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:
08:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb6462fecadedf78b23e5a2276698144ffea837649f3c15664a573c9724f7780\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb6462fecadedf78b23e5a2276698144ffea837649f3c15664a573c9724f7780\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:08:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://42715848a7dace5208b36242c41aaa8f3c96fe5f7f6320b8acd5f2ea52a338bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42715848a7dace5208b36242c41aaa8f3c96fe5f7f6320b8acd5f2ea52a338bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:08:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:08:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b4eb0778c3ebce9cfec4abfead327d02f8d401aa11ce1578a7cbba1b9fe8854\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b4eb0778c3ebce9cfec4abfead327d02f8d401aa11ce1578a7cbba1b9fe8854\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:08:48Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87e9a77e1689ff7f403824349c08b6d6c85b408875ff852f56090cf3fe2bb487\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87e9a77e1689ff7f403824349c08b6d6c85b408875ff852f56090cf3fe2bb487\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:08:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:08:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4tprh\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:31Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:31 crc kubenswrapper[4792]: I0309 09:09:31.430621 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ssk9p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ceed39a1-2e4f-4630-bda0-57071ac26ee4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://648c51e35617b16c1bc2e0f86ea31ee9256943628595864fc27951fce17197cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"
state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7tn4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a6364e9d51e6626caf8b49f387f23eb4bfdfc13e149fb8847b7a1dd1637bf79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7tn4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ssk9p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet 
valid: current time 2026-03-09T09:09:31Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:31 crc kubenswrapper[4792]: I0309 09:09:31.440887 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4403f41d10eba801e3729ecffdc397e603f3c4899d1a62e02cb35418645d1ca4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://655b419f4453a7f0fff411f579326c7fda157
c08267b1dcf23f7e1d11b684c20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:31Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:31 crc kubenswrapper[4792]: I0309 09:09:31.451162 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:31Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:31 crc kubenswrapper[4792]: I0309 09:09:31.461803 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:31Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:31 crc kubenswrapper[4792]: I0309 09:09:31.471545 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-97tth" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd11045a-d746-4b42-872c-8b8d1dd2d515\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96248640b95891716f26355ace06d60da675ab3aa8086e6a7c94ad528fc1357d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zdqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d060627a577507a2b0030b6aea753d50e0c6766a
c4876d95ac5d9d3401f9b818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zdqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-97tth\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:31Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:31 crc kubenswrapper[4792]: I0309 09:09:31.480516 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab83a4df67ef256deece5dabe54496035a360833d9b5e926a1045196734d2c24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-09T09:09:31Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:31 crc kubenswrapper[4792]: I0309 09:09:31.491450 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7e46817-10cf-448c-8a2a-154f1c322ce6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:09:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:09:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1070465f72d99ed22e913112259837db7d789c0a072b40956088f4a70162c41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\"
:\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://314838f53bc19a9f3eb7fd9d3f5473b23a177f2a3068d91f1b0420c27910d409\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://226eecaea6fec5a3ae93063702c719edf3908636a2862b9f50a874f494a19ccf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53c53d788666585dd6b1fb214b1f79df5bffadc17ff401b1bed44c93e41258dc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiser
ver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://420959a5229bf4ed5e6b94cf4a6685b96e4b50d13055cbccf66b6188e47bc770\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T09:08:27Z\\\",\\\"message\\\":\\\"le observer\\\\nW0309 09:08:27.070870 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0309 09:08:27.070998 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 09:08:27.071818 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2825204129/tls.crt::/tmp/serving-cert-2825204129/tls.key\\\\\\\"\\\\nI0309 09:08:27.485720 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0309 09:08:27.487343 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0309 09:08:27.487361 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0309 09:08:27.487383 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0309 09:08:27.487388 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0309 09:08:27.490810 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0309 09:08:27.490923 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 09:08:27.490952 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 09:08:27.490979 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0309 09:08:27.491005 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0309 
09:08:27.491032 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0309 09:08:27.491060 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0309 09:08:27.490826 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0309 09:08:27.491911 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T09:08:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:09:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf484f9a832b0e147a17ec53e64cfcda5e37f8bf1f764ddc35215a079994b71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:18Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e250b106997f151ae9c435aca2ab3d3d821f40e826afa9ff744443a6b808571\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/oc
p-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e250b106997f151ae9c435aca2ab3d3d821f40e826afa9ff744443a6b808571\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:07:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:07:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:31Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:31 crc kubenswrapper[4792]: I0309 09:09:31.502611 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"72206549-3056-4411-ad65-3bfc0456b8a9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://652ed63f63bc7b81328792679f59e5d748feb5114a97f57df7ba90f3d272feff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f14066d111e6fa1a8c98be79fb37f7a32d143d503ec51e49a36d039b7d464b4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T09:07:43Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0309 09:07:17.893452 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0309 09:07:17.896032 1 observer_polling.go:159] Starting file observer\\\\nI0309 09:07:17.940019 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0309 09:07:17.944655 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0309 09:07:43.887477 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0309 09:07:43.887591 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T09:07:17Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82feb47e68b8db8323ed2c02d83e92016fa30e024581d7e361ba07a08919e2ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\
\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://544366c65452de29fd10c69bba980e991f5e2a3a09e98a9e66a050b1a06d4280\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3d9cf24a9d5a60bcaea9c6889c23037b8b47d7eb60c2458147579cd9ec75176\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"n
ame\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:07:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:31Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:31 crc kubenswrapper[4792]: I0309 09:09:31.511601 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fgk47" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84b6eb44-ca33-41a6-a951-2c66688ad860\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3b4aea0ac4cc05af599b411f32e712d0955e5052afc55c0d0be66e9a1249223\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/
ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cpkhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fgk47\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:31Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:31 crc kubenswrapper[4792]: I0309 09:09:31.530797 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lfm2j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"740550e5-d1a4-4f0c-8efd-1ccd8f9319e5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d93911614f6785ac12751349f50ab00c0716955c72dc48866083013e172cf3c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e0f50edd29f0791cc076a3a2974b7456aaf2a96534da791b083248dc84fa6af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c80e0ef9426b7a764e7117a789d07cf4cf940a90f38fe3ce6b230f9bbd21bfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f72b0194cacf6d5d0c95ba804286d822d2f2e5a0f385c4c5fdf8559bf6240c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:44Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edf00db622558a346d238b2df6e90a686dc913634a1b5b4e8b010b5bf09a7290\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5fc091b21251a54d9eca892667bd681e944b35b6407316a8252562e837a1e265\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1244a18a5a9df2128cb16421f7d44aba05bb92b6a91b26fc2845847a9b4c91bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48ec991bd21bc3115bde4a99b484cd6e45e2e97d8a3a28dbbdf4e1d8323c6b66\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T09:09:06Z\\\",\\\"message\\\":\\\"45 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-config-operator/machine-config-controller\\\\\\\"}\\\\nI0309 09:09:06.535081 6845 obj_retry.go:365] Adding new object: *v1.Pod 
openshift-dns/node-resolver-k4kdn\\\\nI0309 09:09:06.535093 6845 services_controller.go:360] Finished syncing service machine-config-controller on namespace openshift-machine-config-operator for network=default : 1.521797ms\\\\nI0309 09:09:06.535099 6845 ovn.go:134] Ensuring zone local for Pod openshift-dns/node-resolver-k4kdn in node crc\\\\nI0309 09:09:06.535103 6845 obj_retry.go:386] Retry successful for *v1.Pod openshift-dns/node-resolver-k4kdn after 0 failed attempt(s)\\\\nI0309 09:09:06.535109 6845 default_network_controller.go:776] Recording success event on pod openshift-dns/node-resolver-k4kdn\\\\nI0309 09:09:06.535110 6845 services_controller.go:356] Processing sync for service openshift-multus/multus-admission-controller for network=default\\\\nF0309 09:09:06.535119 6845 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not ad\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T09:09:05Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1244a18a5a9df2128cb16421f7d44aba05bb92b6a91b26fc2845847a9b4c91bf\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T09:09:30Z\\\",\\\"message\\\":\\\" failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:30Z is after 2025-08-24T17:21:41Z]\\\\nI0309 09:09:30.577419 7126 
services_controller.go:434] Service openshift-machine-config-operator/machine-config-operator retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{machine-config-operator openshift-machine-config-operator 8bc1afc2-8724-4135-84df-aee09f23af4c 4514 0 2025-02-23 05:12:24 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[k8s-app:machine-config-operator] map[include.release.openshift.io/ibm-cloud-managed:true include.release.openshift.io/self-managed-high-availability:true include.release.openshift.io/single-node-developer:true service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-secret-name:mco-proxy-tls service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [{config.openshift.io/v1 ClusterVersion version 9101b518-476b-4eea-8fa6-69b0534e5caa 0xc0075535db \\\\u003cnil\\\\u003e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Na\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T09:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\
\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72a93106360fd23f597cf8fb963aca72a606f82557a5b17125a969e5b5d5918f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env
\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9abf97193d88754a49bcc12e1cfbe83371778e157d028dd3d1084abfa8526822\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9abf97193d88754a49bcc12e1cfbe83371778e157d028dd3d1084abfa8526822\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:08:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lfm2j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:31Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:32 crc 
kubenswrapper[4792]: I0309 09:09:32.279753 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-vgtc9_626ea896-2e5c-4478-a7be-34a19acc242d/kube-multus/0.log" Mar 09 09:09:32 crc kubenswrapper[4792]: I0309 09:09:32.279828 4792 generic.go:334] "Generic (PLEG): container finished" podID="626ea896-2e5c-4478-a7be-34a19acc242d" containerID="57f7829120f56a8ab9ff342c3d9fd043ca5559518f8d818c0306764f491f4b3a" exitCode=1 Mar 09 09:09:32 crc kubenswrapper[4792]: I0309 09:09:32.279919 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-vgtc9" event={"ID":"626ea896-2e5c-4478-a7be-34a19acc242d","Type":"ContainerDied","Data":"57f7829120f56a8ab9ff342c3d9fd043ca5559518f8d818c0306764f491f4b3a"} Mar 09 09:09:32 crc kubenswrapper[4792]: I0309 09:09:32.280465 4792 scope.go:117] "RemoveContainer" containerID="57f7829120f56a8ab9ff342c3d9fd043ca5559518f8d818c0306764f491f4b3a" Mar 09 09:09:32 crc kubenswrapper[4792]: I0309 09:09:32.286641 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lfm2j_740550e5-d1a4-4f0c-8efd-1ccd8f9319e5/ovnkube-controller/3.log" Mar 09 09:09:32 crc kubenswrapper[4792]: I0309 09:09:32.294594 4792 scope.go:117] "RemoveContainer" containerID="1244a18a5a9df2128cb16421f7d44aba05bb92b6a91b26fc2845847a9b4c91bf" Mar 09 09:09:32 crc kubenswrapper[4792]: E0309 09:09:32.294861 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-lfm2j_openshift-ovn-kubernetes(740550e5-d1a4-4f0c-8efd-1ccd8f9319e5)\"" pod="openshift-ovn-kubernetes/ovnkube-node-lfm2j" podUID="740550e5-d1a4-4f0c-8efd-1ccd8f9319e5" Mar 09 09:09:32 crc kubenswrapper[4792]: I0309 09:09:32.303807 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:32Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:32 crc kubenswrapper[4792]: I0309 09:09:32.325811 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vgtc9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"626ea896-2e5c-4478-a7be-34a19acc242d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:09:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:09:32Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57f7829120f56a8ab9ff342c3d9fd043ca5559518f8d818c0306764f491f4b3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57f7829120f56a8ab9ff342c3d9fd043ca5559518f8d818c0306764f491f4b3a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T09:09:31Z\\\",\\\"message\\\":\\\"2026-03-09T09:08:45+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_54d81461-dbc7-41f1-80f8-6581a60632be\\\\n2026-03-09T09:08:46+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_54d81461-dbc7-41f1-80f8-6581a60632be to /host/opt/cni/bin/\\\\n2026-03-09T09:08:46Z [verbose] multus-daemon started\\\\n2026-03-09T09:08:46Z [verbose] Readiness Indicator file check\\\\n2026-03-09T09:09:31Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4rw87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vgtc9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:32Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:32 crc kubenswrapper[4792]: I0309 09:09:32.350578 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4tprh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b59ff3b-540d-4385-b02b-f68349bb74bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54d1a12d5c015eebcdc6e5d6e253c40a4dd476bbc12be0c63665aa6bc091f72f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"
kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54059591e4b093ca2c60bc4ea9f0b0a6c44077e07ebaa66bab76d10f0d2f0f9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54059591e4b093ca2c60bc4ea9f0b0a6c44077e07ebaa66bab76d10f0d2f0f9f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:08:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86174a85f5a0f642224f0a1f175358d157641043ca6e1b88e119e7de171b5582\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddf
bb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86174a85f5a0f642224f0a1f175358d157641043ca6e1b88e119e7de171b5582\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:08:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:08:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb6462fecadedf78b23e5a2276698144ffea837649f3c15664a573c9724f7780\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb6462fecadedf78b23e5a2276698144ffea837649f3c15664a573c9724f7780\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:08:46Z\\\",\\\"reason\\
\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42715848a7dace5208b36242c41aaa8f3c96fe5f7f6320b8acd5f2ea52a338bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42715848a7dace5208b36242c41aaa8f3c96fe5f7f6320b8acd5f2ea52a338bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:08:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:08:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b4eb0778c3ebce9cfec4abfead327d02f8d401aa11ce1578a7cbba1b9fe8854\\\",\\\"image\\\":\\\"quay.io/o
penshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b4eb0778c3ebce9cfec4abfead327d02f8d401aa11ce1578a7cbba1b9fe8854\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:08:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87e9a77e1689ff7f403824349c08b6d6c85b408875ff852f56090cf3fe2bb487\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87e9a77e1689ff7f403824349c08b6d6c85b408875ff852f56090cf3fe2bb487\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:08:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:08:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mount
Path\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4tprh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:32Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:32 crc kubenswrapper[4792]: I0309 09:09:32.370842 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ssk9p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ceed39a1-2e4f-4630-bda0-57071ac26ee4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://648c51e35617b16c1bc2e0f86ea31ee9256943628595864fc27951fce17197cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7tn4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a6364e9d51e6626caf8b49f387f23eb4bfdf
c13e149fb8847b7a1dd1637bf79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7tn4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ssk9p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:32Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:32 crc kubenswrapper[4792]: I0309 09:09:32.404808 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a861a55a-f3b0-4a55-a961-7798ef57d3c0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48162a23c55321a8b6855318c9d71661a5cca0af913d359711cf6685b332994e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58926f05ec9e42e1dc7aa5c3ab9950537c031f3838fb18036ce2a84b2b2ce147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1cc1ddaa2a5291284d4281d64b3d0aebf06cee6ef23da3a608ca06b240c8e4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a8bf3c896aae857de56db687c87ea164520667dc71fa495e3c396b478ab472e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d9d3cba4dfbe99ce715a9ee40af9171a41ccb193306f9b106588cc4b5f620e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea228ca60beeed2978d4efcc092828a381aaf0e00e0baf2a733c75cb15642aac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea228ca60beeed2978d4efcc092828a381aaf0e00e0baf2a733c75cb15642aac\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-09T09:07:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e86823681f24457a83404aa4739988af2b8abf63506d49c1db09bee7501a7548\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e86823681f24457a83404aa4739988af2b8abf63506d49c1db09bee7501a7548\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:07:17Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://12c05ec878f2865338912d3d41f970b88835d7e7d9719a119eb4d06955850617\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://12c05ec878f2865338912d3d41f970b88835d7e7d9719a119eb4d06955850617\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:07:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:32Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:32 crc kubenswrapper[4792]: I0309 09:09:32.421282 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da622175994cf951cfa730455ea5a163271bc993fc5a1268ed072b944a612524\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:32Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:32 crc kubenswrapper[4792]: I0309 09:09:32.436846 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:32Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:32 crc kubenswrapper[4792]: I0309 09:09:32.449504 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-97tth" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd11045a-d746-4b42-872c-8b8d1dd2d515\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96248640b95891716f26355ace06d60da675ab3aa8086e6a7c94ad528fc1357d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zdqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d060627a577507a2b0030b6aea753d50e0c6766a
c4876d95ac5d9d3401f9b818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zdqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-97tth\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:32Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:32 crc kubenswrapper[4792]: I0309 09:09:32.468679 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab83a4df67ef256deece5dabe54496035a360833d9b5e926a1045196734d2c24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-09T09:09:32Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:32 crc kubenswrapper[4792]: I0309 09:09:32.484392 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4403f41d10eba801e3729ecffdc397e603f3c4899d1a62e02cb35418645d1ca4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://655b419f4453a7f
0fff411f579326c7fda157c08267b1dcf23f7e1d11b684c20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:32Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:32 crc kubenswrapper[4792]: I0309 09:09:32.497644 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:32Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:32 crc kubenswrapper[4792]: I0309 09:09:32.509544 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fgk47" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"84b6eb44-ca33-41a6-a951-2c66688ad860\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3b4aea0ac4cc05af599b411f32e712d0955e5052afc55c0d0be66e9a1249223\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cpkhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fgk47\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:32Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:32 crc kubenswrapper[4792]: I0309 09:09:32.528549 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lfm2j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"740550e5-d1a4-4f0c-8efd-1ccd8f9319e5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d93911614f6785ac12751349f50ab00c0716955c72dc48866083013e172cf3c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e0f50edd29f0791cc076a3a2974b7456aaf2a96534da791b083248dc84fa6af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c80e0ef9426b7a764e7117a789d07cf4cf940a90f38fe3ce6b230f9bbd21bfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f72b0194cacf6d5d0c95ba804286d822d2f2e5a0f385c4c5fdf8559bf6240c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:44Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edf00db622558a346d238b2df6e90a686dc913634a1b5b4e8b010b5bf09a7290\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5fc091b21251a54d9eca892667bd681e944b35b6407316a8252562e837a1e265\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1244a18a5a9df2128cb16421f7d44aba05bb92b6a91b26fc2845847a9b4c91bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48ec991bd21bc3115bde4a99b484cd6e45e2e97d8a3a28dbbdf4e1d8323c6b66\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T09:09:06Z\\\",\\\"message\\\":\\\"45 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-config-operator/machine-config-controller\\\\\\\"}\\\\nI0309 09:09:06.535081 6845 obj_retry.go:365] Adding new object: *v1.Pod 
openshift-dns/node-resolver-k4kdn\\\\nI0309 09:09:06.535093 6845 services_controller.go:360] Finished syncing service machine-config-controller on namespace openshift-machine-config-operator for network=default : 1.521797ms\\\\nI0309 09:09:06.535099 6845 ovn.go:134] Ensuring zone local for Pod openshift-dns/node-resolver-k4kdn in node crc\\\\nI0309 09:09:06.535103 6845 obj_retry.go:386] Retry successful for *v1.Pod openshift-dns/node-resolver-k4kdn after 0 failed attempt(s)\\\\nI0309 09:09:06.535109 6845 default_network_controller.go:776] Recording success event on pod openshift-dns/node-resolver-k4kdn\\\\nI0309 09:09:06.535110 6845 services_controller.go:356] Processing sync for service openshift-multus/multus-admission-controller for network=default\\\\nF0309 09:09:06.535119 6845 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not ad\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T09:09:05Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1244a18a5a9df2128cb16421f7d44aba05bb92b6a91b26fc2845847a9b4c91bf\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T09:09:30Z\\\",\\\"message\\\":\\\" failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:30Z is after 2025-08-24T17:21:41Z]\\\\nI0309 09:09:30.577419 7126 
services_controller.go:434] Service openshift-machine-config-operator/machine-config-operator retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{machine-config-operator openshift-machine-config-operator 8bc1afc2-8724-4135-84df-aee09f23af4c 4514 0 2025-02-23 05:12:24 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[k8s-app:machine-config-operator] map[include.release.openshift.io/ibm-cloud-managed:true include.release.openshift.io/self-managed-high-availability:true include.release.openshift.io/single-node-developer:true service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-secret-name:mco-proxy-tls service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [{config.openshift.io/v1 ClusterVersion version 9101b518-476b-4eea-8fa6-69b0534e5caa 0xc0075535db \\\\u003cnil\\\\u003e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Na\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T09:09:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\
\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72a93106360fd23f597cf8fb963aca72a606f82557a5b17125a969e5b5d5918f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env
\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9abf97193d88754a49bcc12e1cfbe83371778e157d028dd3d1084abfa8526822\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9abf97193d88754a49bcc12e1cfbe83371778e157d028dd3d1084abfa8526822\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:08:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lfm2j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:32Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:32 crc 
kubenswrapper[4792]: I0309 09:09:32.541964 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7e46817-10cf-448c-8a2a-154f1c322ce6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:09:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:09:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1070465f72d99ed22e913112259837db7d789c0a072b40956088f4a70162c41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://314838f53bc19a
9f3eb7fd9d3f5473b23a177f2a3068d91f1b0420c27910d409\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://226eecaea6fec5a3ae93063702c719edf3908636a2862b9f50a874f494a19ccf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53c53d788666585dd6b1fb214b1f79df5bffadc17ff401b1bed44c93e41258dc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://420959a5229bf4ed5e6b94cf4a6685b96e4b50d13055cbccf66b6188e47bc770\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T09:08:27Z\\\",\\\"message\\\":\\\"le observer\\\\nW0309 09:08:27.070870 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0309 09:08:27.070998 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 09:08:27.071818 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2825204129/tls.crt::/tmp/serving-cert-2825204129/tls.key\\\\\\\"\\\\nI0309 09:08:27.485720 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0309 09:08:27.487343 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0309 09:08:27.487361 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0309 09:08:27.487383 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0309 09:08:27.487388 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0309 09:08:27.490810 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0309 09:08:27.490923 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 09:08:27.490952 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 09:08:27.490979 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0309 09:08:27.491005 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0309 09:08:27.491032 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0309 
09:08:27.491060 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0309 09:08:27.490826 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0309 09:08:27.491911 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T09:08:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:09:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf484f9a832b0e147a17ec53e64cfcda5e37f8bf1f764ddc35215a079994b71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:18Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e250b106997f151ae9c435aca2ab3d3d821f40e826afa9ff744443a6b808571\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\
\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e250b106997f151ae9c435aca2ab3d3d821f40e826afa9ff744443a6b808571\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:07:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:07:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:32Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:32 crc kubenswrapper[4792]: I0309 09:09:32.553400 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"72206549-3056-4411-ad65-3bfc0456b8a9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://652ed63f63bc7b81328792679f59e5d748feb5114a97f57df7ba90f3d272feff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f14066d111e6fa1a8c98be79fb37f7a32d143d503ec51e49a36d039b7d464b4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T09:07:43Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0309 09:07:17.893452 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0309 09:07:17.896032 1 observer_polling.go:159] Starting file observer\\\\nI0309 09:07:17.940019 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0309 09:07:17.944655 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0309 09:07:43.887477 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0309 09:07:43.887591 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T09:07:17Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82feb47e68b8db8323ed2c02d83e92016fa30e024581d7e361ba07a08919e2ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\
\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://544366c65452de29fd10c69bba980e991f5e2a3a09e98a9e66a050b1a06d4280\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3d9cf24a9d5a60bcaea9c6889c23037b8b47d7eb60c2458147579cd9ec75176\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"n
ame\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:07:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:32Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:32 crc kubenswrapper[4792]: I0309 09:09:32.566520 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fttpc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4711cce5-88a9-48c4-8e2e-522062e34a03\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjrbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjrbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fttpc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:32Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:32 crc 
kubenswrapper[4792]: I0309 09:09:32.583215 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"274366f4-bdf7-4516-9559-b90b9947999e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f6d16099e4ca2921b039c7fac87c3a9b3ee4780783ad52207c77ea2891942d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\
":\\\"cri-o://76e023bbfb6d4a1c42830654b83b26c24cddefd808ee765ebef670e8b10910b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76e023bbfb6d4a1c42830654b83b26c24cddefd808ee765ebef670e8b10910b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:07:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:07:15Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:32Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:32 crc kubenswrapper[4792]: I0309 09:09:32.596455 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-k4kdn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cb7634b-66b7-4541-8e53-3e01a6cb41ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d395567d530702b468a31ec780cf0dcf356e2a07f4a28ee50329b702d8e53594\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5jf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-k4kdn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:32Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:32 crc kubenswrapper[4792]: I0309 09:09:32.609471 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ssk9p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ceed39a1-2e4f-4630-bda0-57071ac26ee4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://648c51e35617b16c1bc2e0f86ea31ee9256943628595864fc27951fce17197cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d7
93426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7tn4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a6364e9d51e6626caf8b49f387f23eb4bfdfc13e149fb8847b7a1dd1637bf79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7tn4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ssk9p\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:32Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:32 crc kubenswrapper[4792]: I0309 09:09:32.641877 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a861a55a-f3b0-4a55-a961-7798ef57d3c0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48162a23c55321a8b6855318c9d71661a5cca0af913d359711cf6685b332994e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\
":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58926f05ec9e42e1dc7aa5c3ab9950537c031f3838fb18036ce2a84b2b2ce147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1cc1ddaa2a5291284d4281d64b3d0aebf06cee6ef23da3a608ca06b240c8e4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a8bf3c896aae857de56db687c87ea164
520667dc71fa495e3c396b478ab472e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d9d3cba4dfbe99ce715a9ee40af9171a41ccb193306f9b106588cc4b5f620e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea228ca60beeed2978d4efcc092828a381aaf0e00e0baf2a733c75cb15642aac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e
49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea228ca60beeed2978d4efcc092828a381aaf0e00e0baf2a733c75cb15642aac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:07:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e86823681f24457a83404aa4739988af2b8abf63506d49c1db09bee7501a7548\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e86823681f24457a83404aa4739988af2b8abf63506d49c1db09bee7501a7548\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:07:17Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://12c05ec878f2865338912d3d41f970b88835d7e7d9719a119eb4d06955850617\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminate
d\\\":{\\\"containerID\\\":\\\"cri-o://12c05ec878f2865338912d3d41f970b88835d7e7d9719a119eb4d06955850617\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:07:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:32Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:32 crc kubenswrapper[4792]: I0309 09:09:32.661678 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 09:09:32 crc kubenswrapper[4792]: E0309 09:09:32.661795 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 09:09:32 crc kubenswrapper[4792]: I0309 09:09:32.661957 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 09:09:32 crc kubenswrapper[4792]: E0309 09:09:32.662000 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 09:09:32 crc kubenswrapper[4792]: I0309 09:09:32.662117 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fttpc" Mar 09 09:09:32 crc kubenswrapper[4792]: E0309 09:09:32.662166 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fttpc" podUID="4711cce5-88a9-48c4-8e2e-522062e34a03" Mar 09 09:09:32 crc kubenswrapper[4792]: I0309 09:09:32.662246 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 09:09:32 crc kubenswrapper[4792]: E0309 09:09:32.662287 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 09:09:32 crc kubenswrapper[4792]: I0309 09:09:32.663395 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da622175994cf951cfa730455ea5a163271bc993fc5a1268ed072b944a612524\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\
":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:32Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:32 crc kubenswrapper[4792]: I0309 09:09:32.679664 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:32Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:32 crc kubenswrapper[4792]: I0309 09:09:32.694563 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vgtc9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"626ea896-2e5c-4478-a7be-34a19acc242d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:09:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:09:32Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57f7829120f56a8ab9ff342c3d9fd043ca5559518f8d818c0306764f491f4b3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57f7829120f56a8ab9ff342c3d9fd043ca5559518f8d818c0306764f491f4b3a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T09:09:31Z\\\",\\\"message\\\":\\\"2026-03-09T09:08:45+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_54d81461-dbc7-41f1-80f8-6581a60632be\\\\n2026-03-09T09:08:46+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_54d81461-dbc7-41f1-80f8-6581a60632be to /host/opt/cni/bin/\\\\n2026-03-09T09:08:46Z [verbose] multus-daemon started\\\\n2026-03-09T09:08:46Z [verbose] Readiness Indicator file check\\\\n2026-03-09T09:09:31Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4rw87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vgtc9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:32Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:32 crc kubenswrapper[4792]: I0309 09:09:32.744347 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4tprh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b59ff3b-540d-4385-b02b-f68349bb74bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54d1a12d5c015eebcdc6e5d6e253c40a4dd476bbc12be0c63665aa6bc091f72f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"
kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54059591e4b093ca2c60bc4ea9f0b0a6c44077e07ebaa66bab76d10f0d2f0f9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54059591e4b093ca2c60bc4ea9f0b0a6c44077e07ebaa66bab76d10f0d2f0f9f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:08:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86174a85f5a0f642224f0a1f175358d157641043ca6e1b88e119e7de171b5582\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddf
bb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86174a85f5a0f642224f0a1f175358d157641043ca6e1b88e119e7de171b5582\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:08:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:08:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb6462fecadedf78b23e5a2276698144ffea837649f3c15664a573c9724f7780\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb6462fecadedf78b23e5a2276698144ffea837649f3c15664a573c9724f7780\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:08:46Z\\\",\\\"reason\\
\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42715848a7dace5208b36242c41aaa8f3c96fe5f7f6320b8acd5f2ea52a338bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42715848a7dace5208b36242c41aaa8f3c96fe5f7f6320b8acd5f2ea52a338bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:08:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:08:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b4eb0778c3ebce9cfec4abfead327d02f8d401aa11ce1578a7cbba1b9fe8854\\\",\\\"image\\\":\\\"quay.io/o
penshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b4eb0778c3ebce9cfec4abfead327d02f8d401aa11ce1578a7cbba1b9fe8854\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:08:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87e9a77e1689ff7f403824349c08b6d6c85b408875ff852f56090cf3fe2bb487\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87e9a77e1689ff7f403824349c08b6d6c85b408875ff852f56090cf3fe2bb487\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:08:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:08:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mount
Path\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4tprh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:32Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:32 crc kubenswrapper[4792]: I0309 09:09:32.765096 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4403f41d10eba801e3729ecffdc397e603f3c4899d1a62e02cb35418645d1ca4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://655b419f4453a7f0fff411f579326c7fda157c08267b1dcf23f7e1d11b684c20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:32Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:32 crc kubenswrapper[4792]: I0309 09:09:32.787832 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:32Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:32 crc kubenswrapper[4792]: I0309 09:09:32.806643 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:32Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:32 crc kubenswrapper[4792]: I0309 09:09:32.815388 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-97tth" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd11045a-d746-4b42-872c-8b8d1dd2d515\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96248640b95891716f26355ace06d60da675ab3aa8086e6a7c94ad528fc1357d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zdqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d060627a577507a2b0030b6aea753d50e0c6766a
c4876d95ac5d9d3401f9b818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zdqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-97tth\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:32Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:32 crc kubenswrapper[4792]: I0309 09:09:32.828487 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab83a4df67ef256deece5dabe54496035a360833d9b5e926a1045196734d2c24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-09T09:09:32Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:32 crc kubenswrapper[4792]: I0309 09:09:32.841981 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7e46817-10cf-448c-8a2a-154f1c322ce6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:09:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:09:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1070465f72d99ed22e913112259837db7d789c0a072b40956088f4a70162c41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\"
:\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://314838f53bc19a9f3eb7fd9d3f5473b23a177f2a3068d91f1b0420c27910d409\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://226eecaea6fec5a3ae93063702c719edf3908636a2862b9f50a874f494a19ccf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53c53d788666585dd6b1fb214b1f79df5bffadc17ff401b1bed44c93e41258dc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiser
ver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://420959a5229bf4ed5e6b94cf4a6685b96e4b50d13055cbccf66b6188e47bc770\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T09:08:27Z\\\",\\\"message\\\":\\\"le observer\\\\nW0309 09:08:27.070870 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0309 09:08:27.070998 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 09:08:27.071818 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2825204129/tls.crt::/tmp/serving-cert-2825204129/tls.key\\\\\\\"\\\\nI0309 09:08:27.485720 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0309 09:08:27.487343 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0309 09:08:27.487361 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0309 09:08:27.487383 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0309 09:08:27.487388 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0309 09:08:27.490810 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0309 09:08:27.490923 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 09:08:27.490952 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 09:08:27.490979 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0309 09:08:27.491005 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0309 
09:08:27.491032 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0309 09:08:27.491060 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0309 09:08:27.490826 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0309 09:08:27.491911 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T09:08:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:09:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf484f9a832b0e147a17ec53e64cfcda5e37f8bf1f764ddc35215a079994b71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:18Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e250b106997f151ae9c435aca2ab3d3d821f40e826afa9ff744443a6b808571\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/oc
p-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e250b106997f151ae9c435aca2ab3d3d821f40e826afa9ff744443a6b808571\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:07:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:07:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:32Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:32 crc kubenswrapper[4792]: I0309 09:09:32.853706 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"72206549-3056-4411-ad65-3bfc0456b8a9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://652ed63f63bc7b81328792679f59e5d748feb5114a97f57df7ba90f3d272feff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f14066d111e6fa1a8c98be79fb37f7a32d143d503ec51e49a36d039b7d464b4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T09:07:43Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0309 09:07:17.893452 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0309 09:07:17.896032 1 observer_polling.go:159] Starting file observer\\\\nI0309 09:07:17.940019 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0309 09:07:17.944655 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0309 09:07:43.887477 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0309 09:07:43.887591 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T09:07:17Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82feb47e68b8db8323ed2c02d83e92016fa30e024581d7e361ba07a08919e2ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\
\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://544366c65452de29fd10c69bba980e991f5e2a3a09e98a9e66a050b1a06d4280\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3d9cf24a9d5a60bcaea9c6889c23037b8b47d7eb60c2458147579cd9ec75176\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"n
ame\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:07:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:32Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:32 crc kubenswrapper[4792]: I0309 09:09:32.864005 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fgk47" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84b6eb44-ca33-41a6-a951-2c66688ad860\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3b4aea0ac4cc05af599b411f32e712d0955e5052afc55c0d0be66e9a1249223\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/
ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cpkhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fgk47\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:32Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:32 crc kubenswrapper[4792]: I0309 09:09:32.882113 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lfm2j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"740550e5-d1a4-4f0c-8efd-1ccd8f9319e5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d93911614f6785ac12751349f50ab00c0716955c72dc48866083013e172cf3c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e0f50edd29f0791cc076a3a2974b7456aaf2a96534da791b083248dc84fa6af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c80e0ef9426b7a764e7117a789d07cf4cf940a90f38fe3ce6b230f9bbd21bfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f72b0194cacf6d5d0c95ba804286d822d2f2e5a0f385c4c5fdf8559bf6240c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:44Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edf00db622558a346d238b2df6e90a686dc913634a1b5b4e8b010b5bf09a7290\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5fc091b21251a54d9eca892667bd681e944b35b6407316a8252562e837a1e265\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1244a18a5a9df2128cb16421f7d44aba05bb92b6a91b26fc2845847a9b4c91bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1244a18a5a9df2128cb16421f7d44aba05bb92b6a91b26fc2845847a9b4c91bf\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T09:09:30Z\\\",\\\"message\\\":\\\" failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is 
not yet valid: current time 2026-03-09T09:09:30Z is after 2025-08-24T17:21:41Z]\\\\nI0309 09:09:30.577419 7126 services_controller.go:434] Service openshift-machine-config-operator/machine-config-operator retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{machine-config-operator openshift-machine-config-operator 8bc1afc2-8724-4135-84df-aee09f23af4c 4514 0 2025-02-23 05:12:24 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[k8s-app:machine-config-operator] map[include.release.openshift.io/ibm-cloud-managed:true include.release.openshift.io/self-managed-high-availability:true include.release.openshift.io/single-node-developer:true service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-secret-name:mco-proxy-tls service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [{config.openshift.io/v1 ClusterVersion version 9101b518-476b-4eea-8fa6-69b0534e5caa 0xc0075535db \\\\u003cnil\\\\u003e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Na\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T09:09:29Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-lfm2j_openshift-ovn-kubernetes(740550e5-d1a4-4f0c-8efd-1ccd8f9319e5)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72a93106360fd23f597cf8fb963aca72a606f82557a5b17125a969e5b5d5918f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9abf97193d88754a49bcc12e1cfbe83371778e157d028dd3d1084abfa8526822\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9abf97193d88754a49
bcc12e1cfbe83371778e157d028dd3d1084abfa8526822\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:08:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lfm2j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:32Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:32 crc kubenswrapper[4792]: I0309 09:09:32.892578 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"274366f4-bdf7-4516-9559-b90b9947999e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f6d16099e4ca2921b039c7fac87c3a9b3ee4780783ad52207c77ea2891942d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76e023bbfb6d4a1c42830654b83b26c24cddefd808ee765ebef670e8b10910b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76e023bbfb6d4a1c42830654b83b26c24cddefd808ee765ebef670e8b10910b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:07:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:07:15Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:32Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:32 crc kubenswrapper[4792]: I0309 09:09:32.901441 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-k4kdn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cb7634b-66b7-4541-8e53-3e01a6cb41ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d395567d530702b468a31ec780cf0dcf356e2a07f4a28ee50329b702d8e53594\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5jf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-k4kdn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:32Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:32 crc kubenswrapper[4792]: I0309 09:09:32.910346 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fttpc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4711cce5-88a9-48c4-8e2e-522062e34a03\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjrbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjrbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fttpc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:32Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:33 crc 
kubenswrapper[4792]: I0309 09:09:33.324037 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-vgtc9_626ea896-2e5c-4478-a7be-34a19acc242d/kube-multus/0.log" Mar 09 09:09:33 crc kubenswrapper[4792]: I0309 09:09:33.324182 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-vgtc9" event={"ID":"626ea896-2e5c-4478-a7be-34a19acc242d","Type":"ContainerStarted","Data":"1062e61dca9fb971dffc9cd101c8b11ac94fc421dad88dc86f8e9df3fa2c93c4"} Mar 09 09:09:33 crc kubenswrapper[4792]: I0309 09:09:33.337345 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"274366f4-bdf7-4516-9559-b90b9947999e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f6d16099e4ca2921b039c7fac87c3a9b3ee4780783ad52207c77ea2891942d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastSta
te\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76e023bbfb6d4a1c42830654b83b26c24cddefd808ee765ebef670e8b10910b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76e023bbfb6d4a1c42830654b83b26c24cddefd808ee765ebef670e8b10910b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:07:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:07:15Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:33Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:33 crc kubenswrapper[4792]: I0309 
09:09:33.349688 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-k4kdn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cb7634b-66b7-4541-8e53-3e01a6cb41ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d395567d530702b468a31ec780cf0dcf356e2a07f4a28ee50329b702d8e53594\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5jf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"19
2.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-k4kdn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:33Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:33 crc kubenswrapper[4792]: I0309 09:09:33.363110 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fttpc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4711cce5-88a9-48c4-8e2e-522062e34a03\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjrbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjrbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fttpc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:33Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:33 crc 
kubenswrapper[4792]: I0309 09:09:33.392118 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a861a55a-f3b0-4a55-a961-7798ef57d3c0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48162a23c55321a8b6855318c9d71661a5cca0af913d359711cf6685b332994e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://58926f05ec9e42e1dc7aa5c3ab9950537c031f3838fb18036ce2a84b2b2ce147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1cc1ddaa2a5291284d4281d64b3d0aebf06cee6ef23da3a608ca06b240c8e4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a8bf3c896aae857de56db687c87ea164520667dc71fa495e3c396b478ab472e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d9d3cba4dfbe99ce715a9ee40af9171a41ccb193306f9b106588cc4b5f620e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea228ca60beeed2978d4efcc092828a381aaf0e00e0baf2a733c75cb15642aac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea228ca60beeed2978d4efcc092828a381aaf0e00e0baf2a733c75cb15642aac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:07:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e86823681f24457a83404aa4739988af2b8abf63506d49c1db09bee7501a7548\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e86823681f24457a83404aa4739988af2b8abf63506d49c1db09bee7501a7548\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:07:17Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://12c05ec878f2865338912d3d41f970b88835d7e7d9719a119eb4d06955850617\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://12c05ec878f2865338912d3d41f970b88835d7e7d9719a119eb4d06955850617\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:07:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:33Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:33 crc kubenswrapper[4792]: I0309 09:09:33.409509 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da622175994cf951cfa730455ea5a163271bc993fc5a1268ed072b944a612524\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:33Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:33 crc kubenswrapper[4792]: I0309 09:09:33.426228 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:33Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:33 crc kubenswrapper[4792]: I0309 09:09:33.443571 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vgtc9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"626ea896-2e5c-4478-a7be-34a19acc242d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:09:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:09:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1062e61dca9fb971dffc9cd101c8b11ac94fc421dad88dc86f8e9df3fa2c93c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57f7829120f56a8ab9ff342c3d9fd043ca5559518f8d818c0306764f491f4b3a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T09:09:31Z\\\",\\\"message\\\":\\\"2026-03-09T09:08:45+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_54d81461-dbc7-41f1-80f8-6581a60632be\\\\n2026-03-09T09:08:46+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_54d81461-dbc7-41f1-80f8-6581a60632be to /host/opt/cni/bin/\\\\n2026-03-09T09:08:46Z [verbose] multus-daemon started\\\\n2026-03-09T09:08:46Z [verbose] 
Readiness Indicator file check\\\\n2026-03-09T09:09:31Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4rw87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vgtc9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:33Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:33 crc kubenswrapper[4792]: I0309 09:09:33.460437 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4tprh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b59ff3b-540d-4385-b02b-f68349bb74bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54d
1a12d5c015eebcdc6e5d6e253c40a4dd476bbc12be0c63665aa6bc091f72f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54059591e4b093ca2c60bc4ea9f0b0a6c44077e07ebaa66bab76d10f0d2f0f9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54059591e4b093ca2c60bc4ea9f0b0a6c44077e07ebaa66bab76d10f0d2f0f9f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:08:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly
\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86174a85f5a0f642224f0a1f175358d157641043ca6e1b88e119e7de171b5582\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86174a85f5a0f642224f0a1f175358d157641043ca6e1b88e119e7de171b5582\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:08:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:08:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb6462fecadedf78b23e5a2276698144ffea837649f3c15664a573c9724f7780\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203b
b2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb6462fecadedf78b23e5a2276698144ffea837649f3c15664a573c9724f7780\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:08:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42715848a7dace5208b36242c41aaa8f3c96fe5f7f6320b8acd5f2ea52a338bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42715848a7dace5208b36242c41aaa8f3c96fe5f7f6320b8acd5f2ea52a338bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:08:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:08:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-rel
ease\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b4eb0778c3ebce9cfec4abfead327d02f8d401aa11ce1578a7cbba1b9fe8854\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b4eb0778c3ebce9cfec4abfead327d02f8d401aa11ce1578a7cbba1b9fe8854\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:08:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87e9a77e1689ff7f403824349c08b6d6c85b408875ff852f56090cf3fe2bb487\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-c
ni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87e9a77e1689ff7f403824349c08b6d6c85b408875ff852f56090cf3fe2bb487\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:08:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:08:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4tprh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:33Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:33 crc kubenswrapper[4792]: I0309 09:09:33.476021 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ssk9p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ceed39a1-2e4f-4630-bda0-57071ac26ee4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://648c51e35617b16c1bc2e0f86ea31ee9256943628595864fc27951fce17197cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7tn4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a6364e9d51e6626caf8b49f387f23eb4bfdf
c13e149fb8847b7a1dd1637bf79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7tn4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ssk9p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:33Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:33 crc kubenswrapper[4792]: I0309 09:09:33.489207 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4403f41d10eba801e3729ecffdc397e603f3c4899d1a62e02cb35418645d1ca4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://655b419f4453a7f0fff411f579326c7fda157c08267b1dcf23f7e1d11b684c20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:33Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:33 crc kubenswrapper[4792]: I0309 09:09:33.500872 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:33Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:33 crc kubenswrapper[4792]: I0309 09:09:33.510889 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:33Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:33 crc kubenswrapper[4792]: I0309 09:09:33.521117 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-97tth" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd11045a-d746-4b42-872c-8b8d1dd2d515\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96248640b95891716f26355ace06d60da675ab3aa8086e6a7c94ad528fc1357d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zdqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d060627a577507a2b0030b6aea753d50e0c6766a
c4876d95ac5d9d3401f9b818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zdqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-97tth\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:33Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:33 crc kubenswrapper[4792]: I0309 09:09:33.533014 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab83a4df67ef256deece5dabe54496035a360833d9b5e926a1045196734d2c24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-09T09:09:33Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:33 crc kubenswrapper[4792]: I0309 09:09:33.547260 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7e46817-10cf-448c-8a2a-154f1c322ce6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:09:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:09:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1070465f72d99ed22e913112259837db7d789c0a072b40956088f4a70162c41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\"
:\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://314838f53bc19a9f3eb7fd9d3f5473b23a177f2a3068d91f1b0420c27910d409\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://226eecaea6fec5a3ae93063702c719edf3908636a2862b9f50a874f494a19ccf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53c53d788666585dd6b1fb214b1f79df5bffadc17ff401b1bed44c93e41258dc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiser
ver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://420959a5229bf4ed5e6b94cf4a6685b96e4b50d13055cbccf66b6188e47bc770\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T09:08:27Z\\\",\\\"message\\\":\\\"le observer\\\\nW0309 09:08:27.070870 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0309 09:08:27.070998 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 09:08:27.071818 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2825204129/tls.crt::/tmp/serving-cert-2825204129/tls.key\\\\\\\"\\\\nI0309 09:08:27.485720 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0309 09:08:27.487343 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0309 09:08:27.487361 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0309 09:08:27.487383 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0309 09:08:27.487388 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0309 09:08:27.490810 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0309 09:08:27.490923 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 09:08:27.490952 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 09:08:27.490979 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0309 09:08:27.491005 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0309 
09:08:27.491032 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0309 09:08:27.491060 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0309 09:08:27.490826 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0309 09:08:27.491911 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T09:08:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:09:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf484f9a832b0e147a17ec53e64cfcda5e37f8bf1f764ddc35215a079994b71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:18Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e250b106997f151ae9c435aca2ab3d3d821f40e826afa9ff744443a6b808571\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/oc
p-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e250b106997f151ae9c435aca2ab3d3d821f40e826afa9ff744443a6b808571\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:07:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:07:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:33Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:33 crc kubenswrapper[4792]: I0309 09:09:33.559135 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"72206549-3056-4411-ad65-3bfc0456b8a9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://652ed63f63bc7b81328792679f59e5d748feb5114a97f57df7ba90f3d272feff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f14066d111e6fa1a8c98be79fb37f7a32d143d503ec51e49a36d039b7d464b4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T09:07:43Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0309 09:07:17.893452 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0309 09:07:17.896032 1 observer_polling.go:159] Starting file observer\\\\nI0309 09:07:17.940019 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0309 09:07:17.944655 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0309 09:07:43.887477 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0309 09:07:43.887591 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T09:07:17Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82feb47e68b8db8323ed2c02d83e92016fa30e024581d7e361ba07a08919e2ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\
\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://544366c65452de29fd10c69bba980e991f5e2a3a09e98a9e66a050b1a06d4280\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3d9cf24a9d5a60bcaea9c6889c23037b8b47d7eb60c2458147579cd9ec75176\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"n
ame\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:07:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:33Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:33 crc kubenswrapper[4792]: I0309 09:09:33.569369 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fgk47" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84b6eb44-ca33-41a6-a951-2c66688ad860\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3b4aea0ac4cc05af599b411f32e712d0955e5052afc55c0d0be66e9a1249223\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/
ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cpkhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fgk47\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:33Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:33 crc kubenswrapper[4792]: I0309 09:09:33.592594 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lfm2j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"740550e5-d1a4-4f0c-8efd-1ccd8f9319e5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d93911614f6785ac12751349f50ab00c0716955c72dc48866083013e172cf3c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e0f50edd29f0791cc076a3a2974b7456aaf2a96534da791b083248dc84fa6af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c80e0ef9426b7a764e7117a789d07cf4cf940a90f38fe3ce6b230f9bbd21bfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f72b0194cacf6d5d0c95ba804286d822d2f2e5a0f385c4c5fdf8559bf6240c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:44Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edf00db622558a346d238b2df6e90a686dc913634a1b5b4e8b010b5bf09a7290\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5fc091b21251a54d9eca892667bd681e944b35b6407316a8252562e837a1e265\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1244a18a5a9df2128cb16421f7d44aba05bb92b6a91b26fc2845847a9b4c91bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1244a18a5a9df2128cb16421f7d44aba05bb92b6a91b26fc2845847a9b4c91bf\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T09:09:30Z\\\",\\\"message\\\":\\\" failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is 
not yet valid: current time 2026-03-09T09:09:30Z is after 2025-08-24T17:21:41Z]\\\\nI0309 09:09:30.577419 7126 services_controller.go:434] Service openshift-machine-config-operator/machine-config-operator retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{machine-config-operator openshift-machine-config-operator 8bc1afc2-8724-4135-84df-aee09f23af4c 4514 0 2025-02-23 05:12:24 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[k8s-app:machine-config-operator] map[include.release.openshift.io/ibm-cloud-managed:true include.release.openshift.io/self-managed-high-availability:true include.release.openshift.io/single-node-developer:true service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-secret-name:mco-proxy-tls service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [{config.openshift.io/v1 ClusterVersion version 9101b518-476b-4eea-8fa6-69b0534e5caa 0xc0075535db \\\\u003cnil\\\\u003e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Na\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T09:09:29Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-lfm2j_openshift-ovn-kubernetes(740550e5-d1a4-4f0c-8efd-1ccd8f9319e5)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72a93106360fd23f597cf8fb963aca72a606f82557a5b17125a969e5b5d5918f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9abf97193d88754a49bcc12e1cfbe83371778e157d028dd3d1084abfa8526822\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9abf97193d88754a49
bcc12e1cfbe83371778e157d028dd3d1084abfa8526822\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:08:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lfm2j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:33Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:34 crc kubenswrapper[4792]: I0309 09:09:34.661739 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 09:09:34 crc kubenswrapper[4792]: I0309 09:09:34.661749 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 09:09:34 crc kubenswrapper[4792]: I0309 09:09:34.661744 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 09:09:34 crc kubenswrapper[4792]: E0309 09:09:34.662171 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 09:09:34 crc kubenswrapper[4792]: E0309 09:09:34.661926 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 09:09:34 crc kubenswrapper[4792]: I0309 09:09:34.661780 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fttpc" Mar 09 09:09:34 crc kubenswrapper[4792]: E0309 09:09:34.662264 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 09:09:34 crc kubenswrapper[4792]: E0309 09:09:34.662370 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-fttpc" podUID="4711cce5-88a9-48c4-8e2e-522062e34a03" Mar 09 09:09:35 crc kubenswrapper[4792]: I0309 09:09:35.679109 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4403f41d10eba801e3729ecffdc397e603f3c4899d1a62e02cb35418645d1ca4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://655b419f445
3a7f0fff411f579326c7fda157c08267b1dcf23f7e1d11b684c20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:35Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:35 crc kubenswrapper[4792]: I0309 09:09:35.704536 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:35Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:35 crc kubenswrapper[4792]: I0309 09:09:35.719480 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:35Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:35 crc kubenswrapper[4792]: I0309 09:09:35.734300 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-97tth" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd11045a-d746-4b42-872c-8b8d1dd2d515\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96248640b95891716f26355ace06d60da675ab3aa8086e6a7c94ad528fc1357d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zdqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d060627a577507a2b0030b6aea753d50e0c6766a
c4876d95ac5d9d3401f9b818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zdqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-97tth\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:35Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:35 crc kubenswrapper[4792]: I0309 09:09:35.751796 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab83a4df67ef256deece5dabe54496035a360833d9b5e926a1045196734d2c24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-09T09:09:35Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:35 crc kubenswrapper[4792]: I0309 09:09:35.769215 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7e46817-10cf-448c-8a2a-154f1c322ce6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:09:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:09:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1070465f72d99ed22e913112259837db7d789c0a072b40956088f4a70162c41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\"
:\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://314838f53bc19a9f3eb7fd9d3f5473b23a177f2a3068d91f1b0420c27910d409\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://226eecaea6fec5a3ae93063702c719edf3908636a2862b9f50a874f494a19ccf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53c53d788666585dd6b1fb214b1f79df5bffadc17ff401b1bed44c93e41258dc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiser
ver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://420959a5229bf4ed5e6b94cf4a6685b96e4b50d13055cbccf66b6188e47bc770\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T09:08:27Z\\\",\\\"message\\\":\\\"le observer\\\\nW0309 09:08:27.070870 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0309 09:08:27.070998 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 09:08:27.071818 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2825204129/tls.crt::/tmp/serving-cert-2825204129/tls.key\\\\\\\"\\\\nI0309 09:08:27.485720 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0309 09:08:27.487343 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0309 09:08:27.487361 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0309 09:08:27.487383 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0309 09:08:27.487388 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0309 09:08:27.490810 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0309 09:08:27.490923 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 09:08:27.490952 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 09:08:27.490979 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0309 09:08:27.491005 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0309 
09:08:27.491032 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0309 09:08:27.491060 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0309 09:08:27.490826 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0309 09:08:27.491911 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T09:08:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:09:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf484f9a832b0e147a17ec53e64cfcda5e37f8bf1f764ddc35215a079994b71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:18Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e250b106997f151ae9c435aca2ab3d3d821f40e826afa9ff744443a6b808571\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/oc
p-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e250b106997f151ae9c435aca2ab3d3d821f40e826afa9ff744443a6b808571\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:07:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:07:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:35Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:35 crc kubenswrapper[4792]: E0309 09:09:35.781593 4792 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 09 09:09:35 crc kubenswrapper[4792]: I0309 09:09:35.791914 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72206549-3056-4411-ad65-3bfc0456b8a9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://652ed63f63bc7b81328792679f59e5d748feb5114a97f57df7ba90f3d272feff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f14066d111e6fa1a8c98be79fb37f7a32d143d503ec51e49a36d039b7d464b4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T09:07:43Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start 
--config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0309 09:07:17.893452 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0309 09:07:17.896032 1 observer_polling.go:159] Starting file observer\\\\nI0309 09:07:17.940019 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0309 09:07:17.944655 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0309 09:07:43.887477 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0309 09:07:43.887591 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T09:07:17Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82feb47e68b8db8323ed2c02d83e92016fa30e024581d7e361ba07a08919e2ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://544366c65452de29fd10c69bba980e991f5e2a3a09e98a9e66a050b1a06d4280\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3d9cf24a9d5a60bcaea9c6889c23037b8b47d7eb60c2458147579cd9ec75176\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:07:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:35Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:35 crc kubenswrapper[4792]: I0309 09:09:35.806984 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fgk47" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"84b6eb44-ca33-41a6-a951-2c66688ad860\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3b4aea0ac4cc05af599b411f32e712d0955e5052afc55c0d0be66e9a1249223\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cpkhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fgk47\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:35Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:35 crc kubenswrapper[4792]: I0309 09:09:35.829275 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lfm2j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"740550e5-d1a4-4f0c-8efd-1ccd8f9319e5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d93911614f6785ac12751349f50ab00c0716955c72dc48866083013e172cf3c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e0f50edd29f0791cc076a3a2974b7456aaf2a96534da791b083248dc84fa6af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c80e0ef9426b7a764e7117a789d07cf4cf940a90f38fe3ce6b230f9bbd21bfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f72b0194cacf6d5d0c95ba804286d822d2f2e5a0f385c4c5fdf8559bf6240c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:44Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edf00db622558a346d238b2df6e90a686dc913634a1b5b4e8b010b5bf09a7290\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5fc091b21251a54d9eca892667bd681e944b35b6407316a8252562e837a1e265\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1244a18a5a9df2128cb16421f7d44aba05bb92b6a91b26fc2845847a9b4c91bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1244a18a5a9df2128cb16421f7d44aba05bb92b6a91b26fc2845847a9b4c91bf\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T09:09:30Z\\\",\\\"message\\\":\\\" failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is 
not yet valid: current time 2026-03-09T09:09:30Z is after 2025-08-24T17:21:41Z]\\\\nI0309 09:09:30.577419 7126 services_controller.go:434] Service openshift-machine-config-operator/machine-config-operator retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{machine-config-operator openshift-machine-config-operator 8bc1afc2-8724-4135-84df-aee09f23af4c 4514 0 2025-02-23 05:12:24 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[k8s-app:machine-config-operator] map[include.release.openshift.io/ibm-cloud-managed:true include.release.openshift.io/self-managed-high-availability:true include.release.openshift.io/single-node-developer:true service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-secret-name:mco-proxy-tls service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [{config.openshift.io/v1 ClusterVersion version 9101b518-476b-4eea-8fa6-69b0534e5caa 0xc0075535db \\\\u003cnil\\\\u003e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Na\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T09:09:29Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-lfm2j_openshift-ovn-kubernetes(740550e5-d1a4-4f0c-8efd-1ccd8f9319e5)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72a93106360fd23f597cf8fb963aca72a606f82557a5b17125a969e5b5d5918f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9abf97193d88754a49bcc12e1cfbe83371778e157d028dd3d1084abfa8526822\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9abf97193d88754a49
bcc12e1cfbe83371778e157d028dd3d1084abfa8526822\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:08:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lfm2j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:35Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:35 crc kubenswrapper[4792]: I0309 09:09:35.840238 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"274366f4-bdf7-4516-9559-b90b9947999e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f6d16099e4ca2921b039c7fac87c3a9b3ee4780783ad52207c77ea2891942d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76e023bbfb6d4a1c42830654b83b26c24cddefd808ee765ebef670e8b10910b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76e023bbfb6d4a1c42830654b83b26c24cddefd808ee765ebef670e8b10910b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:07:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:07:15Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:35Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:35 crc kubenswrapper[4792]: I0309 09:09:35.848789 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-k4kdn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cb7634b-66b7-4541-8e53-3e01a6cb41ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d395567d530702b468a31ec780cf0dcf356e2a07f4a28ee50329b702d8e53594\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5jf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-k4kdn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:35Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:35 crc kubenswrapper[4792]: I0309 09:09:35.857887 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fttpc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4711cce5-88a9-48c4-8e2e-522062e34a03\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjrbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjrbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fttpc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:35Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:35 crc 
kubenswrapper[4792]: I0309 09:09:35.875770 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a861a55a-f3b0-4a55-a961-7798ef57d3c0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48162a23c55321a8b6855318c9d71661a5cca0af913d359711cf6685b332994e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://58926f05ec9e42e1dc7aa5c3ab9950537c031f3838fb18036ce2a84b2b2ce147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1cc1ddaa2a5291284d4281d64b3d0aebf06cee6ef23da3a608ca06b240c8e4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a8bf3c896aae857de56db687c87ea164520667dc71fa495e3c396b478ab472e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d9d3cba4dfbe99ce715a9ee40af9171a41ccb193306f9b106588cc4b5f620e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea228ca60beeed2978d4efcc092828a381aaf0e00e0baf2a733c75cb15642aac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea228ca60beeed2978d4efcc092828a381aaf0e00e0baf2a733c75cb15642aac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:07:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e86823681f24457a83404aa4739988af2b8abf63506d49c1db09bee7501a7548\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e86823681f24457a83404aa4739988af2b8abf63506d49c1db09bee7501a7548\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:07:17Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://12c05ec878f2865338912d3d41f970b88835d7e7d9719a119eb4d06955850617\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://12c05ec878f2865338912d3d41f970b88835d7e7d9719a119eb4d06955850617\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:07:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:35Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:35 crc kubenswrapper[4792]: I0309 09:09:35.888876 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da622175994cf951cfa730455ea5a163271bc993fc5a1268ed072b944a612524\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:35Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:35 crc kubenswrapper[4792]: I0309 09:09:35.903779 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:35Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:35 crc kubenswrapper[4792]: I0309 09:09:35.918608 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vgtc9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"626ea896-2e5c-4478-a7be-34a19acc242d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:09:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:09:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1062e61dca9fb971dffc9cd101c8b11ac94fc421dad88dc86f8e9df3fa2c93c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57f7829120f56a8ab9ff342c3d9fd043ca5559518f8d818c0306764f491f4b3a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T09:09:31Z\\\",\\\"message\\\":\\\"2026-03-09T09:08:45+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_54d81461-dbc7-41f1-80f8-6581a60632be\\\\n2026-03-09T09:08:46+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_54d81461-dbc7-41f1-80f8-6581a60632be to /host/opt/cni/bin/\\\\n2026-03-09T09:08:46Z [verbose] multus-daemon started\\\\n2026-03-09T09:08:46Z [verbose] 
Readiness Indicator file check\\\\n2026-03-09T09:09:31Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4rw87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vgtc9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:35Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:35 crc kubenswrapper[4792]: I0309 09:09:35.933126 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4tprh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b59ff3b-540d-4385-b02b-f68349bb74bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54d
1a12d5c015eebcdc6e5d6e253c40a4dd476bbc12be0c63665aa6bc091f72f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54059591e4b093ca2c60bc4ea9f0b0a6c44077e07ebaa66bab76d10f0d2f0f9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54059591e4b093ca2c60bc4ea9f0b0a6c44077e07ebaa66bab76d10f0d2f0f9f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:08:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly
\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86174a85f5a0f642224f0a1f175358d157641043ca6e1b88e119e7de171b5582\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86174a85f5a0f642224f0a1f175358d157641043ca6e1b88e119e7de171b5582\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:08:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:08:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb6462fecadedf78b23e5a2276698144ffea837649f3c15664a573c9724f7780\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203b
b2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb6462fecadedf78b23e5a2276698144ffea837649f3c15664a573c9724f7780\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:08:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42715848a7dace5208b36242c41aaa8f3c96fe5f7f6320b8acd5f2ea52a338bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42715848a7dace5208b36242c41aaa8f3c96fe5f7f6320b8acd5f2ea52a338bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:08:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:08:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-rel
ease\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b4eb0778c3ebce9cfec4abfead327d02f8d401aa11ce1578a7cbba1b9fe8854\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b4eb0778c3ebce9cfec4abfead327d02f8d401aa11ce1578a7cbba1b9fe8854\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:08:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87e9a77e1689ff7f403824349c08b6d6c85b408875ff852f56090cf3fe2bb487\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-c
ni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87e9a77e1689ff7f403824349c08b6d6c85b408875ff852f56090cf3fe2bb487\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:08:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:08:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4tprh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:35Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:35 crc kubenswrapper[4792]: I0309 09:09:35.972145 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ssk9p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ceed39a1-2e4f-4630-bda0-57071ac26ee4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://648c51e35617b16c1bc2e0f86ea31ee9256943628595864fc27951fce17197cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7tn4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a6364e9d51e6626caf8b49f387f23eb4bfdf
c13e149fb8847b7a1dd1637bf79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7tn4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ssk9p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:35Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:36 crc kubenswrapper[4792]: I0309 09:09:36.661893 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 09:09:36 crc kubenswrapper[4792]: E0309 09:09:36.662019 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 09:09:36 crc kubenswrapper[4792]: I0309 09:09:36.661898 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 09:09:36 crc kubenswrapper[4792]: I0309 09:09:36.662027 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fttpc" Mar 09 09:09:36 crc kubenswrapper[4792]: E0309 09:09:36.662129 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 09:09:36 crc kubenswrapper[4792]: E0309 09:09:36.662180 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-fttpc" podUID="4711cce5-88a9-48c4-8e2e-522062e34a03" Mar 09 09:09:36 crc kubenswrapper[4792]: I0309 09:09:36.661924 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 09:09:36 crc kubenswrapper[4792]: E0309 09:09:36.662245 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 09:09:37 crc kubenswrapper[4792]: I0309 09:09:37.501570 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:09:37 crc kubenswrapper[4792]: I0309 09:09:37.501655 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:09:37 crc kubenswrapper[4792]: I0309 09:09:37.501674 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:09:37 crc kubenswrapper[4792]: I0309 09:09:37.501704 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:09:37 crc kubenswrapper[4792]: I0309 09:09:37.501729 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:09:37Z","lastTransitionTime":"2026-03-09T09:09:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:09:37 crc kubenswrapper[4792]: E0309 09:09:37.526704 4792 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:09:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:09:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:09:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:09:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:09:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:09:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:09:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:09:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e3b5ac96-f3df-45c5-a4ac-24aa5703690c\\\",\\\"systemUUID\\\":\\\"838abbcf-5467-42bb-9eb7-be30fe4962bb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:37Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:37 crc kubenswrapper[4792]: I0309 09:09:37.534866 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:09:37 crc kubenswrapper[4792]: I0309 09:09:37.534932 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:09:37 crc kubenswrapper[4792]: I0309 09:09:37.534950 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:09:37 crc kubenswrapper[4792]: I0309 09:09:37.534977 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:09:37 crc kubenswrapper[4792]: I0309 09:09:37.534995 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:09:37Z","lastTransitionTime":"2026-03-09T09:09:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:09:37 crc kubenswrapper[4792]: E0309 09:09:37.556471 4792 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:09:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:09:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:09:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:09:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:09:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:09:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:09:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:09:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e3b5ac96-f3df-45c5-a4ac-24aa5703690c\\\",\\\"systemUUID\\\":\\\"838abbcf-5467-42bb-9eb7-be30fe4962bb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:37Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:37 crc kubenswrapper[4792]: I0309 09:09:37.560910 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:09:37 crc kubenswrapper[4792]: I0309 09:09:37.560964 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:09:37 crc kubenswrapper[4792]: I0309 09:09:37.560981 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:09:37 crc kubenswrapper[4792]: I0309 09:09:37.561007 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:09:37 crc kubenswrapper[4792]: I0309 09:09:37.561025 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:09:37Z","lastTransitionTime":"2026-03-09T09:09:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Has your network provider started?"}
Has your network provider started?"} Mar 09 09:09:37 crc kubenswrapper[4792]: E0309 09:09:37.629762 4792 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:09:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:09:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:09:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:09:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:09:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:09:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:09:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:09:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e3b5ac96-f3df-45c5-a4ac-24aa5703690c\\\",\\\"systemUUID\\\":\\\"838abbcf-5467-42bb-9eb7-be30fe4962bb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:37Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:37 crc kubenswrapper[4792]: E0309 09:09:37.630138 4792 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 09 09:09:37 crc kubenswrapper[4792]: I0309 09:09:37.676084 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Mar 09 09:09:38 crc kubenswrapper[4792]: I0309 09:09:38.662267 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 09:09:38 crc kubenswrapper[4792]: I0309 09:09:38.663335 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 09:09:38 crc kubenswrapper[4792]: E0309 09:09:38.663507 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 09:09:38 crc kubenswrapper[4792]: I0309 09:09:38.663556 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-fttpc" Mar 09 09:09:38 crc kubenswrapper[4792]: E0309 09:09:38.663666 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 09:09:38 crc kubenswrapper[4792]: I0309 09:09:38.663781 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 09:09:38 crc kubenswrapper[4792]: E0309 09:09:38.663866 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fttpc" podUID="4711cce5-88a9-48c4-8e2e-522062e34a03" Mar 09 09:09:38 crc kubenswrapper[4792]: E0309 09:09:38.664027 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 09:09:40 crc kubenswrapper[4792]: I0309 09:09:40.662188 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 09:09:40 crc kubenswrapper[4792]: I0309 09:09:40.662275 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fttpc" Mar 09 09:09:40 crc kubenswrapper[4792]: I0309 09:09:40.662320 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 09:09:40 crc kubenswrapper[4792]: I0309 09:09:40.662328 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 09:09:40 crc kubenswrapper[4792]: E0309 09:09:40.662686 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 09:09:40 crc kubenswrapper[4792]: E0309 09:09:40.662919 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fttpc" podUID="4711cce5-88a9-48c4-8e2e-522062e34a03" Mar 09 09:09:40 crc kubenswrapper[4792]: E0309 09:09:40.662982 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 09:09:40 crc kubenswrapper[4792]: E0309 09:09:40.663115 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 09:09:40 crc kubenswrapper[4792]: E0309 09:09:40.782719 4792 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 09 09:09:42 crc kubenswrapper[4792]: I0309 09:09:42.662296 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 09:09:42 crc kubenswrapper[4792]: I0309 09:09:42.662350 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 09:09:42 crc kubenswrapper[4792]: I0309 09:09:42.662434 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 09:09:42 crc kubenswrapper[4792]: I0309 09:09:42.662296 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-fttpc" Mar 09 09:09:42 crc kubenswrapper[4792]: E0309 09:09:42.662505 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 09:09:42 crc kubenswrapper[4792]: E0309 09:09:42.662641 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 09:09:42 crc kubenswrapper[4792]: E0309 09:09:42.662803 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fttpc" podUID="4711cce5-88a9-48c4-8e2e-522062e34a03" Mar 09 09:09:42 crc kubenswrapper[4792]: E0309 09:09:42.662940 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 09:09:44 crc kubenswrapper[4792]: I0309 09:09:44.662008 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 09:09:44 crc kubenswrapper[4792]: I0309 09:09:44.662015 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fttpc" Mar 09 09:09:44 crc kubenswrapper[4792]: I0309 09:09:44.662018 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 09:09:44 crc kubenswrapper[4792]: I0309 09:09:44.662112 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 09:09:44 crc kubenswrapper[4792]: E0309 09:09:44.662217 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 09:09:44 crc kubenswrapper[4792]: E0309 09:09:44.662357 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 09:09:44 crc kubenswrapper[4792]: E0309 09:09:44.662516 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fttpc" podUID="4711cce5-88a9-48c4-8e2e-522062e34a03" Mar 09 09:09:44 crc kubenswrapper[4792]: E0309 09:09:44.662602 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 09:09:45 crc kubenswrapper[4792]: I0309 09:09:45.685015 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with 
unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:45Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:45 crc kubenswrapper[4792]: I0309 09:09:45.704854 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vgtc9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"626ea896-2e5c-4478-a7be-34a19acc242d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:09:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:09:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1062e61dca9fb971dffc9cd101c8b11ac94fc421dad88dc86f8e9df3fa2c93c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57f7829120f56a8ab9ff342c3d9fd043ca5559518f8d818c0306764f491f4b3a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T09:09:31Z\\\",\\\"message\\\":\\\"2026-03-09T09:08:45+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_54d81461-dbc7-41f1-80f8-6581a60632be\\\\n2026-03-09T09:08:46+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_54d81461-dbc7-41f1-80f8-6581a60632be to /host/opt/cni/bin/\\\\n2026-03-09T09:08:46Z [verbose] multus-daemon started\\\\n2026-03-09T09:08:46Z [verbose] 
Readiness Indicator file check\\\\n2026-03-09T09:09:31Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:09:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4rw87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vgtc9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:45Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:45 crc kubenswrapper[4792]: I0309 09:09:45.729728 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4tprh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b59ff3b-540d-4385-b02b-f68349bb74bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54d
1a12d5c015eebcdc6e5d6e253c40a4dd476bbc12be0c63665aa6bc091f72f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54059591e4b093ca2c60bc4ea9f0b0a6c44077e07ebaa66bab76d10f0d2f0f9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54059591e4b093ca2c60bc4ea9f0b0a6c44077e07ebaa66bab76d10f0d2f0f9f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:08:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly
\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86174a85f5a0f642224f0a1f175358d157641043ca6e1b88e119e7de171b5582\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86174a85f5a0f642224f0a1f175358d157641043ca6e1b88e119e7de171b5582\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:08:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:08:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb6462fecadedf78b23e5a2276698144ffea837649f3c15664a573c9724f7780\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203b
b2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb6462fecadedf78b23e5a2276698144ffea837649f3c15664a573c9724f7780\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:08:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42715848a7dace5208b36242c41aaa8f3c96fe5f7f6320b8acd5f2ea52a338bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42715848a7dace5208b36242c41aaa8f3c96fe5f7f6320b8acd5f2ea52a338bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:08:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:08:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-rel
ease\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b4eb0778c3ebce9cfec4abfead327d02f8d401aa11ce1578a7cbba1b9fe8854\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b4eb0778c3ebce9cfec4abfead327d02f8d401aa11ce1578a7cbba1b9fe8854\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:08:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://87e9a77e1689ff7f403824349c08b6d6c85b408875ff852f56090cf3fe2bb487\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-c
ni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87e9a77e1689ff7f403824349c08b6d6c85b408875ff852f56090cf3fe2bb487\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:08:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:08:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d4bnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4tprh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:45Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:45 crc kubenswrapper[4792]: I0309 09:09:45.750982 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ssk9p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ceed39a1-2e4f-4630-bda0-57071ac26ee4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://648c51e35617b16c1bc2e0f86ea31ee9256943628595864fc27951fce17197cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7tn4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a6364e9d51e6626caf8b49f387f23eb4bfdf
c13e149fb8847b7a1dd1637bf79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7tn4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ssk9p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:45Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:45 crc kubenswrapper[4792]: I0309 09:09:45.774102 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a861a55a-f3b0-4a55-a961-7798ef57d3c0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48162a23c55321a8b6855318c9d71661a5cca0af913d359711cf6685b332994e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58926f05ec9e42e1dc7aa5c3ab9950537c031f3838fb18036ce2a84b2b2ce147\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1cc1ddaa2a5291284d4281d64b3d0aebf06cee6ef23da3a608ca06b240c8e4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a8bf3c896aae857de56db687c87ea164520667dc71fa495e3c396b478ab472e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d9d3cba4dfbe99ce715a9ee40af9171a41ccb193306f9b106588cc4b5f620e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea228ca60beeed2978d4efcc092828a381aaf0e00e0baf2a733c75cb15642aac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea228ca60beeed2978d4efcc092828a381aaf0e00e0baf2a733c75cb15642aac\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-09T09:07:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e86823681f24457a83404aa4739988af2b8abf63506d49c1db09bee7501a7548\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e86823681f24457a83404aa4739988af2b8abf63506d49c1db09bee7501a7548\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:07:17Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://12c05ec878f2865338912d3d41f970b88835d7e7d9719a119eb4d06955850617\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://12c05ec878f2865338912d3d41f970b88835d7e7d9719a119eb4d06955850617\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:07:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:45Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:45 crc kubenswrapper[4792]: E0309 09:09:45.783439 4792 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 09 09:09:45 crc kubenswrapper[4792]: I0309 09:09:45.798804 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da622175994cf951cfa730455ea5a163271bc993fc5a1268ed072b944a612524\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:45Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:45 crc kubenswrapper[4792]: I0309 09:09:45.812980 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:45Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:45 crc kubenswrapper[4792]: I0309 09:09:45.828745 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-97tth" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd11045a-d746-4b42-872c-8b8d1dd2d515\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96248640b95891716f26355ace06d60da675ab3aa8086e6a7c94ad528fc1357d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zdqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d060627a577507a2b0030b6aea753d50e0c6766a
c4876d95ac5d9d3401f9b818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zdqpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-97tth\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:45Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:45 crc kubenswrapper[4792]: I0309 09:09:45.848475 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab83a4df67ef256deece5dabe54496035a360833d9b5e926a1045196734d2c24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-09T09:09:45Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:45 crc kubenswrapper[4792]: I0309 09:09:45.865845 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4403f41d10eba801e3729ecffdc397e603f3c4899d1a62e02cb35418645d1ca4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://655b419f4453a7f
0fff411f579326c7fda157c08267b1dcf23f7e1d11b684c20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:45Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:45 crc kubenswrapper[4792]: I0309 09:09:45.878968 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:45Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:45 crc kubenswrapper[4792]: I0309 09:09:45.893628 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aba4e576-a76c-47d5-b0d0-423915a13c1e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://009a3753ff77a9835a1148b320e30a1456103acfa142693bc6835d86911c3f96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fed911be84528f8d1ff84a7935ed4cec34862d88edfc4fe0315e0c4146c2fceb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08e5e8d9946c1c7ec4aa7ad59c1c8630fb44d2e7312e4a3b9c4e11d821fd76c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2775cf70a937d3b2d439abcc43d3f09389f296ff4e0b9339b30f9b2f4d5a28bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://2775cf70a937d3b2d439abcc43d3f09389f296ff4e0b9339b30f9b2f4d5a28bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:07:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:07:16Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:07:15Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:45Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:45 crc kubenswrapper[4792]: I0309 09:09:45.907110 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-fgk47" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84b6eb44-ca33-41a6-a951-2c66688ad860\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3b
4aea0ac4cc05af599b411f32e712d0955e5052afc55c0d0be66e9a1249223\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cpkhp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-fgk47\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:45Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:45 crc kubenswrapper[4792]: I0309 09:09:45.934003 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lfm2j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"740550e5-d1a4-4f0c-8efd-1ccd8f9319e5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d93911614f6785ac12751349f50ab00c0716955c72dc48866083013e172cf3c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e0f50edd29f0791cc076a3a2974b7456aaf2a96534da791b083248dc84fa6af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c80e0ef9426b7a764e7117a789d07cf4cf940a90f38fe3ce6b230f9bbd21bfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f72b0194cacf6d5d0c95ba804286d822d2f2e5a0f385c4c5fdf8559bf6240c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:44Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edf00db622558a346d238b2df6e90a686dc913634a1b5b4e8b010b5bf09a7290\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5fc091b21251a54d9eca892667bd681e944b35b6407316a8252562e837a1e265\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1244a18a5a9df2128cb16421f7d44aba05bb92b6a91b26fc2845847a9b4c91bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1244a18a5a9df2128cb16421f7d44aba05bb92b6a91b26fc2845847a9b4c91bf\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T09:09:30Z\\\",\\\"message\\\":\\\" failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is 
not yet valid: current time 2026-03-09T09:09:30Z is after 2025-08-24T17:21:41Z]\\\\nI0309 09:09:30.577419 7126 services_controller.go:434] Service openshift-machine-config-operator/machine-config-operator retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{machine-config-operator openshift-machine-config-operator 8bc1afc2-8724-4135-84df-aee09f23af4c 4514 0 2025-02-23 05:12:24 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[k8s-app:machine-config-operator] map[include.release.openshift.io/ibm-cloud-managed:true include.release.openshift.io/self-managed-high-availability:true include.release.openshift.io/single-node-developer:true service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-secret-name:mco-proxy-tls service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [{config.openshift.io/v1 ClusterVersion version 9101b518-476b-4eea-8fa6-69b0534e5caa 0xc0075535db \\\\u003cnil\\\\u003e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Na\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T09:09:29Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-lfm2j_openshift-ovn-kubernetes(740550e5-d1a4-4f0c-8efd-1ccd8f9319e5)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72a93106360fd23f597cf8fb963aca72a606f82557a5b17125a969e5b5d5918f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9abf97193d88754a49bcc12e1cfbe83371778e157d028dd3d1084abfa8526822\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9abf97193d88754a49
bcc12e1cfbe83371778e157d028dd3d1084abfa8526822\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:08:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gmxpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lfm2j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:45Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:45 crc kubenswrapper[4792]: I0309 09:09:45.948630 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7e46817-10cf-448c-8a2a-154f1c322ce6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:09:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:09:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1070465f72d99ed22e913112259837db7d789c0a072b40956088f4a70162c41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://314838f53bc19a9f3eb7fd9d3f5473b23a177f2a3068d91f1b0420c27910d409\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://226eecaea6fec5a3ae93063702c719edf3908636a2862b9f50a874f494a19ccf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53c53d788666585dd6b1fb214b1f79df5bffadc17ff401b1bed44c93e41258dc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://420959a5229bf4ed5e6b94cf4a6685b96e4b50d13055cbccf66b6188e47bc770\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T09:08:27Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0309 09:08:27.070870 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0309 09:08:27.070998 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 09:08:27.071818 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2825204129/tls.crt::/tmp/serving-cert-2825204129/tls.key\\\\\\\"\\\\nI0309 09:08:27.485720 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0309 09:08:27.487343 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0309 09:08:27.487361 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0309 09:08:27.487383 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0309 09:08:27.487388 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0309 09:08:27.490810 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0309 09:08:27.490923 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 09:08:27.490952 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 09:08:27.490979 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0309 09:08:27.491005 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0309 09:08:27.491032 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0309 09:08:27.491060 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0309 09:08:27.490826 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0309 09:08:27.491911 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T09:08:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:09:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf484f9a832b0e147a17ec53e64cfcda5e37f8bf1f764ddc35215a079994b71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:18Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e250b106997f151ae9c435aca2ab3d3d821f40e826afa9ff744443a6b808571\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e250b106997f151ae9c435aca2ab3d3d
821f40e826afa9ff744443a6b808571\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:07:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:07:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:45Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:45 crc kubenswrapper[4792]: I0309 09:09:45.961670 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"72206549-3056-4411-ad65-3bfc0456b8a9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://652ed63f63bc7b81328792679f59e5d748feb5114a97f57df7ba90f3d272feff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f14066d111e6fa1a8c98be79fb37f7a32d143d503ec51e49a36d039b7d464b4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T09:07:43Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0309 09:07:17.893452 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0309 09:07:17.896032 1 observer_polling.go:159] Starting file observer\\\\nI0309 09:07:17.940019 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0309 09:07:17.944655 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0309 09:07:43.887477 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0309 09:07:43.887591 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T09:07:17Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82feb47e68b8db8323ed2c02d83e92016fa30e024581d7e361ba07a08919e2ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\
\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://544366c65452de29fd10c69bba980e991f5e2a3a09e98a9e66a050b1a06d4280\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3d9cf24a9d5a60bcaea9c6889c23037b8b47d7eb60c2458147579cd9ec75176\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"n
ame\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:07:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:45Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:45 crc kubenswrapper[4792]: I0309 09:09:45.975264 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fttpc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4711cce5-88a9-48c4-8e2e-522062e34a03\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjrbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjrbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fttpc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:45Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:45 crc 
kubenswrapper[4792]: I0309 09:09:45.989545 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"274366f4-bdf7-4516-9559-b90b9947999e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f6d16099e4ca2921b039c7fac87c3a9b3ee4780783ad52207c77ea2891942d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\
":\\\"cri-o://76e023bbfb6d4a1c42830654b83b26c24cddefd808ee765ebef670e8b10910b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76e023bbfb6d4a1c42830654b83b26c24cddefd808ee765ebef670e8b10910b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:07:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:07:15Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:45Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:46 crc kubenswrapper[4792]: I0309 09:09:46.003997 4792 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-k4kdn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cb7634b-66b7-4541-8e53-3e01a6cb41ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:08:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d395567d530702b468a31ec780cf0dcf356e2a07f4a28ee50329b702d8e53594\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vh5jf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:08:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-k4kdn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:46Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:46 crc kubenswrapper[4792]: I0309 09:09:46.661815 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 09:09:46 crc kubenswrapper[4792]: I0309 09:09:46.661837 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 09:09:46 crc kubenswrapper[4792]: I0309 09:09:46.661837 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 09:09:46 crc kubenswrapper[4792]: I0309 09:09:46.661998 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fttpc" Mar 09 09:09:46 crc kubenswrapper[4792]: E0309 09:09:46.662254 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 09:09:46 crc kubenswrapper[4792]: E0309 09:09:46.662435 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 09:09:46 crc kubenswrapper[4792]: E0309 09:09:46.662559 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 09:09:46 crc kubenswrapper[4792]: E0309 09:09:46.662708 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-fttpc" podUID="4711cce5-88a9-48c4-8e2e-522062e34a03" Mar 09 09:09:46 crc kubenswrapper[4792]: I0309 09:09:46.768767 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 09:09:46 crc kubenswrapper[4792]: I0309 09:09:46.768871 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 09:09:46 crc kubenswrapper[4792]: I0309 09:09:46.768911 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 09:09:46 crc kubenswrapper[4792]: E0309 09:09:46.769024 4792 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 09 09:09:46 crc kubenswrapper[4792]: E0309 09:09:46.769093 4792 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 09 09:09:46 crc kubenswrapper[4792]: E0309 09:09:46.769144 4792 projected.go:288] Couldn't get 
configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 09 09:09:46 crc kubenswrapper[4792]: E0309 09:09:46.769161 4792 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 09:09:46 crc kubenswrapper[4792]: E0309 09:09:46.769167 4792 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 09 09:09:46 crc kubenswrapper[4792]: E0309 09:09:46.769181 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-09 09:10:50.7691557 +0000 UTC m=+215.799356512 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 09 09:09:46 crc kubenswrapper[4792]: E0309 09:09:46.769338 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-09 09:10:50.769300364 +0000 UTC m=+215.799501196 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 09:09:46 crc kubenswrapper[4792]: E0309 09:09:46.769382 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-09 09:10:50.769365595 +0000 UTC m=+215.799566477 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 09 09:09:46 crc kubenswrapper[4792]: I0309 09:09:46.870271 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 09:09:46 crc kubenswrapper[4792]: E0309 09:09:46.870583 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 09:10:50.870541625 +0000 UTC m=+215.900742417 (durationBeforeRetry 1m4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:09:46 crc kubenswrapper[4792]: I0309 09:09:46.870697 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4711cce5-88a9-48c4-8e2e-522062e34a03-metrics-certs\") pod \"network-metrics-daemon-fttpc\" (UID: \"4711cce5-88a9-48c4-8e2e-522062e34a03\") " pod="openshift-multus/network-metrics-daemon-fttpc" Mar 09 09:09:46 crc kubenswrapper[4792]: I0309 09:09:46.870844 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 09:09:46 crc kubenswrapper[4792]: E0309 09:09:46.870905 4792 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 09 09:09:46 crc kubenswrapper[4792]: E0309 09:09:46.871026 4792 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 09 09:09:46 crc kubenswrapper[4792]: E0309 09:09:46.871052 4792 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 09 09:09:46 crc kubenswrapper[4792]: E0309 
09:09:46.871103 4792 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 09:09:46 crc kubenswrapper[4792]: E0309 09:09:46.871031 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4711cce5-88a9-48c4-8e2e-522062e34a03-metrics-certs podName:4711cce5-88a9-48c4-8e2e-522062e34a03 nodeName:}" failed. No retries permitted until 2026-03-09 09:10:50.870989888 +0000 UTC m=+215.901190680 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4711cce5-88a9-48c4-8e2e-522062e34a03-metrics-certs") pod "network-metrics-daemon-fttpc" (UID: "4711cce5-88a9-48c4-8e2e-522062e34a03") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 09 09:09:46 crc kubenswrapper[4792]: E0309 09:09:46.871194 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-09 09:10:50.871175693 +0000 UTC m=+215.901376485 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 09:09:47 crc kubenswrapper[4792]: I0309 09:09:47.663881 4792 scope.go:117] "RemoveContainer" containerID="1244a18a5a9df2128cb16421f7d44aba05bb92b6a91b26fc2845847a9b4c91bf" Mar 09 09:09:47 crc kubenswrapper[4792]: E0309 09:09:47.664782 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-lfm2j_openshift-ovn-kubernetes(740550e5-d1a4-4f0c-8efd-1ccd8f9319e5)\"" pod="openshift-ovn-kubernetes/ovnkube-node-lfm2j" podUID="740550e5-d1a4-4f0c-8efd-1ccd8f9319e5" Mar 09 09:09:47 crc kubenswrapper[4792]: I0309 09:09:47.791971 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:09:47 crc kubenswrapper[4792]: I0309 09:09:47.792007 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:09:47 crc kubenswrapper[4792]: I0309 09:09:47.792018 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:09:47 crc kubenswrapper[4792]: I0309 09:09:47.792032 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:09:47 crc kubenswrapper[4792]: I0309 09:09:47.792041 4792 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:09:47Z","lastTransitionTime":"2026-03-09T09:09:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 09:09:47 crc kubenswrapper[4792]: E0309 09:09:47.808003 4792 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:09:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:09:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:09:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:09:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:09:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:09:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:09:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:09:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e3b5ac96-f3df-45c5-a4ac-24aa5703690c\\\",\\\"systemUUID\\\":\\\"838abbcf-5467-42bb-9eb7-be30fe4962bb\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:47Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:47 crc kubenswrapper[4792]: I0309 09:09:47.812046 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:09:47 crc kubenswrapper[4792]: I0309 09:09:47.812116 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:09:47 crc kubenswrapper[4792]: I0309 09:09:47.812125 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:09:47 crc kubenswrapper[4792]: I0309 09:09:47.812139 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:09:47 crc kubenswrapper[4792]: I0309 09:09:47.812149 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:09:47Z","lastTransitionTime":"2026-03-09T09:09:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:09:47 crc kubenswrapper[4792]: E0309 09:09:47.825177 4792 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:09:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:09:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:09:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:09:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:09:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:09:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:09:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:09:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e3b5ac96-f3df-45c5-a4ac-24aa5703690c\\\",\\\"systemUUID\\\":\\\"838abbcf-5467-42bb-9eb7-be30fe4962bb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:47Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:47 crc kubenswrapper[4792]: I0309 09:09:47.830484 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:09:47 crc kubenswrapper[4792]: I0309 09:09:47.830523 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:09:47 crc kubenswrapper[4792]: I0309 09:09:47.830532 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:09:47 crc kubenswrapper[4792]: I0309 09:09:47.830547 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:09:47 crc kubenswrapper[4792]: I0309 09:09:47.830558 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:09:47Z","lastTransitionTime":"2026-03-09T09:09:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:09:47 crc kubenswrapper[4792]: E0309 09:09:47.844131 4792 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:09:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:09:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:09:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:09:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:09:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:09:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:09:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:09:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e3b5ac96-f3df-45c5-a4ac-24aa5703690c\\\",\\\"systemUUID\\\":\\\"838abbcf-5467-42bb-9eb7-be30fe4962bb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:47Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:47 crc kubenswrapper[4792]: I0309 09:09:47.848698 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:09:47 crc kubenswrapper[4792]: I0309 09:09:47.848775 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:09:47 crc kubenswrapper[4792]: I0309 09:09:47.848795 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:09:47 crc kubenswrapper[4792]: I0309 09:09:47.848823 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:09:47 crc kubenswrapper[4792]: I0309 09:09:47.848842 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:09:47Z","lastTransitionTime":"2026-03-09T09:09:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:09:47 crc kubenswrapper[4792]: E0309 09:09:47.863995 4792 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:09:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:09:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:09:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:09:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:09:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:09:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:09:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:09:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e3b5ac96-f3df-45c5-a4ac-24aa5703690c\\\",\\\"systemUUID\\\":\\\"838abbcf-5467-42bb-9eb7-be30fe4962bb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:47Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:47 crc kubenswrapper[4792]: I0309 09:09:47.868506 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:09:47 crc kubenswrapper[4792]: I0309 09:09:47.868551 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:09:47 crc kubenswrapper[4792]: I0309 09:09:47.868562 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:09:47 crc kubenswrapper[4792]: I0309 09:09:47.868580 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:09:47 crc kubenswrapper[4792]: I0309 09:09:47.868595 4792 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:09:47Z","lastTransitionTime":"2026-03-09T09:09:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:09:47 crc kubenswrapper[4792]: E0309 09:09:47.883104 4792 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148064Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608864Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:09:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:09:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:09:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:09:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:09:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:09:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:09:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:09:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e3b5ac96-f3df-45c5-a4ac-24aa5703690c\\\",\\\"systemUUID\\\":\\\"838abbcf-5467-42bb-9eb7-be30fe4962bb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:09:47Z is after 2025-08-24T17:21:41Z" Mar 09 09:09:47 crc kubenswrapper[4792]: E0309 09:09:47.883213 4792 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 09 09:09:48 crc kubenswrapper[4792]: I0309 09:09:48.661285 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 09:09:48 crc kubenswrapper[4792]: I0309 09:09:48.661343 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 09:09:48 crc kubenswrapper[4792]: I0309 09:09:48.661304 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fttpc" Mar 09 09:09:48 crc kubenswrapper[4792]: E0309 09:09:48.661416 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 09:09:48 crc kubenswrapper[4792]: I0309 09:09:48.661449 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 09:09:48 crc kubenswrapper[4792]: E0309 09:09:48.661561 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fttpc" podUID="4711cce5-88a9-48c4-8e2e-522062e34a03" Mar 09 09:09:48 crc kubenswrapper[4792]: E0309 09:09:48.661643 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 09:09:48 crc kubenswrapper[4792]: E0309 09:09:48.661700 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 09:09:50 crc kubenswrapper[4792]: I0309 09:09:50.662437 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 09:09:50 crc kubenswrapper[4792]: I0309 09:09:50.662488 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 09:09:50 crc kubenswrapper[4792]: E0309 09:09:50.662629 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 09:09:50 crc kubenswrapper[4792]: I0309 09:09:50.662659 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 09:09:50 crc kubenswrapper[4792]: I0309 09:09:50.662487 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fttpc" Mar 09 09:09:50 crc kubenswrapper[4792]: E0309 09:09:50.662797 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 09:09:50 crc kubenswrapper[4792]: E0309 09:09:50.662869 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 09:09:50 crc kubenswrapper[4792]: E0309 09:09:50.662942 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fttpc" podUID="4711cce5-88a9-48c4-8e2e-522062e34a03" Mar 09 09:09:50 crc kubenswrapper[4792]: E0309 09:09:50.785109 4792 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 09 09:09:52 crc kubenswrapper[4792]: I0309 09:09:52.661752 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fttpc" Mar 09 09:09:52 crc kubenswrapper[4792]: I0309 09:09:52.661844 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 09:09:52 crc kubenswrapper[4792]: I0309 09:09:52.661795 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 09:09:52 crc kubenswrapper[4792]: I0309 09:09:52.661752 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 09:09:52 crc kubenswrapper[4792]: E0309 09:09:52.662186 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fttpc" podUID="4711cce5-88a9-48c4-8e2e-522062e34a03" Mar 09 09:09:52 crc kubenswrapper[4792]: E0309 09:09:52.662322 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 09:09:52 crc kubenswrapper[4792]: E0309 09:09:52.662492 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 09:09:52 crc kubenswrapper[4792]: E0309 09:09:52.662618 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 09:09:54 crc kubenswrapper[4792]: I0309 09:09:54.661647 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 09:09:54 crc kubenswrapper[4792]: I0309 09:09:54.661705 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 09:09:54 crc kubenswrapper[4792]: E0309 09:09:54.661898 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 09:09:54 crc kubenswrapper[4792]: I0309 09:09:54.661954 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fttpc" Mar 09 09:09:54 crc kubenswrapper[4792]: I0309 09:09:54.662039 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 09:09:54 crc kubenswrapper[4792]: E0309 09:09:54.662175 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 09:09:54 crc kubenswrapper[4792]: E0309 09:09:54.662334 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 09:09:54 crc kubenswrapper[4792]: E0309 09:09:54.662550 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fttpc" podUID="4711cce5-88a9-48c4-8e2e-522062e34a03" Mar 09 09:09:55 crc kubenswrapper[4792]: I0309 09:09:55.695555 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=68.695542525 podStartE2EDuration="1m8.695542525s" podCreationTimestamp="2026-03-09 09:08:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:09:55.695336259 +0000 UTC m=+160.725537011" watchObservedRunningTime="2026-03-09 09:09:55.695542525 +0000 UTC m=+160.725743277" Mar 09 09:09:55 crc kubenswrapper[4792]: I0309 09:09:55.741492 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=43.741472862 podStartE2EDuration="43.741472862s" podCreationTimestamp="2026-03-09 09:09:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 
UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:09:55.721688293 +0000 UTC m=+160.751889045" watchObservedRunningTime="2026-03-09 09:09:55.741472862 +0000 UTC m=+160.771673614" Mar 09 09:09:55 crc kubenswrapper[4792]: I0309 09:09:55.759485 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=18.759458651 podStartE2EDuration="18.759458651s" podCreationTimestamp="2026-03-09 09:09:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:09:55.74311425 +0000 UTC m=+160.773315022" watchObservedRunningTime="2026-03-09 09:09:55.759458651 +0000 UTC m=+160.789659413" Mar 09 09:09:55 crc kubenswrapper[4792]: I0309 09:09:55.759641 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-fgk47" podStartSLOduration=118.759635446 podStartE2EDuration="1m58.759635446s" podCreationTimestamp="2026-03-09 09:07:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:09:55.759103801 +0000 UTC m=+160.789304563" watchObservedRunningTime="2026-03-09 09:09:55.759635446 +0000 UTC m=+160.789836218" Mar 09 09:09:55 crc kubenswrapper[4792]: E0309 09:09:55.786671 4792 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 09 09:09:55 crc kubenswrapper[4792]: I0309 09:09:55.808877 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=67.808858047 podStartE2EDuration="1m7.808858047s" podCreationTimestamp="2026-03-09 09:08:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:09:55.808147947 +0000 UTC m=+160.838348719" watchObservedRunningTime="2026-03-09 09:09:55.808858047 +0000 UTC m=+160.839058799" Mar 09 09:09:55 crc kubenswrapper[4792]: I0309 09:09:55.818873 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-k4kdn" podStartSLOduration=118.81885626 podStartE2EDuration="1m58.81885626s" podCreationTimestamp="2026-03-09 09:07:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:09:55.818410917 +0000 UTC m=+160.848611679" watchObservedRunningTime="2026-03-09 09:09:55.81885626 +0000 UTC m=+160.849057032" Mar 09 09:09:55 crc kubenswrapper[4792]: I0309 09:09:55.843350 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ssk9p" podStartSLOduration=117.843329851 podStartE2EDuration="1m57.843329851s" podCreationTimestamp="2026-03-09 09:07:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:09:55.842899949 +0000 UTC m=+160.873100701" watchObservedRunningTime="2026-03-09 09:09:55.843329851 +0000 UTC m=+160.873530613" Mar 09 09:09:55 crc kubenswrapper[4792]: I0309 09:09:55.868878 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=57.868858082 podStartE2EDuration="57.868858082s" 
podCreationTimestamp="2026-03-09 09:08:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:09:55.867498214 +0000 UTC m=+160.897699006" watchObservedRunningTime="2026-03-09 09:09:55.868858082 +0000 UTC m=+160.899058844" Mar 09 09:09:55 crc kubenswrapper[4792]: I0309 09:09:55.934401 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-4tprh" podStartSLOduration=117.934384324 podStartE2EDuration="1m57.934384324s" podCreationTimestamp="2026-03-09 09:07:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:09:55.933972753 +0000 UTC m=+160.964173515" watchObservedRunningTime="2026-03-09 09:09:55.934384324 +0000 UTC m=+160.964585076" Mar 09 09:09:55 crc kubenswrapper[4792]: I0309 09:09:55.934536 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-vgtc9" podStartSLOduration=117.934531068 podStartE2EDuration="1m57.934531068s" podCreationTimestamp="2026-03-09 09:07:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:09:55.916642472 +0000 UTC m=+160.946843234" watchObservedRunningTime="2026-03-09 09:09:55.934531068 +0000 UTC m=+160.964731820" Mar 09 09:09:55 crc kubenswrapper[4792]: I0309 09:09:55.988329 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podStartSLOduration=118.988314748 podStartE2EDuration="1m58.988314748s" podCreationTimestamp="2026-03-09 09:07:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:09:55.987895706 +0000 UTC m=+161.018096458" 
watchObservedRunningTime="2026-03-09 09:09:55.988314748 +0000 UTC m=+161.018515500" Mar 09 09:09:56 crc kubenswrapper[4792]: I0309 09:09:56.661428 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fttpc" Mar 09 09:09:56 crc kubenswrapper[4792]: I0309 09:09:56.661459 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 09:09:56 crc kubenswrapper[4792]: I0309 09:09:56.661461 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 09:09:56 crc kubenswrapper[4792]: I0309 09:09:56.661432 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 09:09:56 crc kubenswrapper[4792]: E0309 09:09:56.661575 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fttpc" podUID="4711cce5-88a9-48c4-8e2e-522062e34a03" Mar 09 09:09:56 crc kubenswrapper[4792]: E0309 09:09:56.661679 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 09:09:56 crc kubenswrapper[4792]: E0309 09:09:56.661753 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 09:09:56 crc kubenswrapper[4792]: E0309 09:09:56.661804 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 09:09:58 crc kubenswrapper[4792]: I0309 09:09:58.065722 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:09:58 crc kubenswrapper[4792]: I0309 09:09:58.065834 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:09:58 crc kubenswrapper[4792]: I0309 09:09:58.065853 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:09:58 crc kubenswrapper[4792]: I0309 09:09:58.065900 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:09:58 crc kubenswrapper[4792]: I0309 09:09:58.065917 4792 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:09:58Z","lastTransitionTime":"2026-03-09T09:09:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 09:09:58 crc kubenswrapper[4792]: I0309 09:09:58.140024 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-8rchv"] Mar 09 09:09:58 crc kubenswrapper[4792]: I0309 09:09:58.140919 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-8rchv" Mar 09 09:09:58 crc kubenswrapper[4792]: I0309 09:09:58.144016 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 09 09:09:58 crc kubenswrapper[4792]: I0309 09:09:58.145062 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 09 09:09:58 crc kubenswrapper[4792]: I0309 09:09:58.145870 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 09 09:09:58 crc kubenswrapper[4792]: I0309 09:09:58.145923 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Mar 09 09:09:58 crc kubenswrapper[4792]: I0309 09:09:58.217334 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/68148170-4b16-496b-863e-1e0079a9b58c-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-8rchv\" (UID: \"68148170-4b16-496b-863e-1e0079a9b58c\") " 
pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-8rchv" Mar 09 09:09:58 crc kubenswrapper[4792]: I0309 09:09:58.217381 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/68148170-4b16-496b-863e-1e0079a9b58c-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-8rchv\" (UID: \"68148170-4b16-496b-863e-1e0079a9b58c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-8rchv" Mar 09 09:09:58 crc kubenswrapper[4792]: I0309 09:09:58.217405 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/68148170-4b16-496b-863e-1e0079a9b58c-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-8rchv\" (UID: \"68148170-4b16-496b-863e-1e0079a9b58c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-8rchv" Mar 09 09:09:58 crc kubenswrapper[4792]: I0309 09:09:58.217445 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/68148170-4b16-496b-863e-1e0079a9b58c-service-ca\") pod \"cluster-version-operator-5c965bbfc6-8rchv\" (UID: \"68148170-4b16-496b-863e-1e0079a9b58c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-8rchv" Mar 09 09:09:58 crc kubenswrapper[4792]: I0309 09:09:58.217465 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/68148170-4b16-496b-863e-1e0079a9b58c-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-8rchv\" (UID: \"68148170-4b16-496b-863e-1e0079a9b58c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-8rchv" Mar 09 09:09:58 crc kubenswrapper[4792]: I0309 09:09:58.318290 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/68148170-4b16-496b-863e-1e0079a9b58c-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-8rchv\" (UID: \"68148170-4b16-496b-863e-1e0079a9b58c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-8rchv" Mar 09 09:09:58 crc kubenswrapper[4792]: I0309 09:09:58.318333 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/68148170-4b16-496b-863e-1e0079a9b58c-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-8rchv\" (UID: \"68148170-4b16-496b-863e-1e0079a9b58c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-8rchv" Mar 09 09:09:58 crc kubenswrapper[4792]: I0309 09:09:58.318357 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/68148170-4b16-496b-863e-1e0079a9b58c-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-8rchv\" (UID: \"68148170-4b16-496b-863e-1e0079a9b58c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-8rchv" Mar 09 09:09:58 crc kubenswrapper[4792]: I0309 09:09:58.318393 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/68148170-4b16-496b-863e-1e0079a9b58c-service-ca\") pod \"cluster-version-operator-5c965bbfc6-8rchv\" (UID: \"68148170-4b16-496b-863e-1e0079a9b58c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-8rchv" Mar 09 09:09:58 crc kubenswrapper[4792]: I0309 09:09:58.318401 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/68148170-4b16-496b-863e-1e0079a9b58c-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-8rchv\" (UID: \"68148170-4b16-496b-863e-1e0079a9b58c\") " 
pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-8rchv" Mar 09 09:09:58 crc kubenswrapper[4792]: I0309 09:09:58.318413 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/68148170-4b16-496b-863e-1e0079a9b58c-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-8rchv\" (UID: \"68148170-4b16-496b-863e-1e0079a9b58c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-8rchv" Mar 09 09:09:58 crc kubenswrapper[4792]: I0309 09:09:58.318528 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/68148170-4b16-496b-863e-1e0079a9b58c-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-8rchv\" (UID: \"68148170-4b16-496b-863e-1e0079a9b58c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-8rchv" Mar 09 09:09:58 crc kubenswrapper[4792]: I0309 09:09:58.319715 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/68148170-4b16-496b-863e-1e0079a9b58c-service-ca\") pod \"cluster-version-operator-5c965bbfc6-8rchv\" (UID: \"68148170-4b16-496b-863e-1e0079a9b58c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-8rchv" Mar 09 09:09:58 crc kubenswrapper[4792]: I0309 09:09:58.327727 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/68148170-4b16-496b-863e-1e0079a9b58c-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-8rchv\" (UID: \"68148170-4b16-496b-863e-1e0079a9b58c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-8rchv" Mar 09 09:09:58 crc kubenswrapper[4792]: I0309 09:09:58.340639 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/68148170-4b16-496b-863e-1e0079a9b58c-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-8rchv\" (UID: \"68148170-4b16-496b-863e-1e0079a9b58c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-8rchv" Mar 09 09:09:58 crc kubenswrapper[4792]: I0309 09:09:58.465249 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-8rchv" Mar 09 09:09:58 crc kubenswrapper[4792]: W0309 09:09:58.488263 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod68148170_4b16_496b_863e_1e0079a9b58c.slice/crio-bdf2e46dbea2961e2731fc8b97ec57a1ee60a7887f2cfc4bb3f5850472c5fa2a WatchSource:0}: Error finding container bdf2e46dbea2961e2731fc8b97ec57a1ee60a7887f2cfc4bb3f5850472c5fa2a: Status 404 returned error can't find the container with id bdf2e46dbea2961e2731fc8b97ec57a1ee60a7887f2cfc4bb3f5850472c5fa2a Mar 09 09:09:58 crc kubenswrapper[4792]: I0309 09:09:58.661750 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 09:09:58 crc kubenswrapper[4792]: I0309 09:09:58.661785 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 09:09:58 crc kubenswrapper[4792]: I0309 09:09:58.661821 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 09:09:58 crc kubenswrapper[4792]: I0309 09:09:58.661826 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-fttpc" Mar 09 09:09:58 crc kubenswrapper[4792]: E0309 09:09:58.661922 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 09:09:58 crc kubenswrapper[4792]: E0309 09:09:58.662036 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 09:09:58 crc kubenswrapper[4792]: E0309 09:09:58.662267 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 09:09:58 crc kubenswrapper[4792]: E0309 09:09:58.662476 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-fttpc" podUID="4711cce5-88a9-48c4-8e2e-522062e34a03"
Mar 09 09:09:58 crc kubenswrapper[4792]: I0309 09:09:58.682040 4792 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates
Mar 09 09:09:58 crc kubenswrapper[4792]: I0309 09:09:58.692421 4792 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146
Mar 09 09:09:59 crc kubenswrapper[4792]: I0309 09:09:59.421725 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-8rchv" event={"ID":"68148170-4b16-496b-863e-1e0079a9b58c","Type":"ContainerStarted","Data":"257982ccc5f3cd8bdf6c488b08d582dc9011edd6b7ce4a6da32024222c18aeab"}
Mar 09 09:09:59 crc kubenswrapper[4792]: I0309 09:09:59.421791 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-8rchv" event={"ID":"68148170-4b16-496b-863e-1e0079a9b58c","Type":"ContainerStarted","Data":"bdf2e46dbea2961e2731fc8b97ec57a1ee60a7887f2cfc4bb3f5850472c5fa2a"}
Mar 09 09:09:59 crc kubenswrapper[4792]: I0309 09:09:59.447813 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-8rchv" podStartSLOduration=121.447784326 podStartE2EDuration="2m1.447784326s" podCreationTimestamp="2026-03-09 09:07:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:09:59.446628184 +0000 UTC m=+164.476828936" watchObservedRunningTime="2026-03-09 09:09:59.447784326 +0000 UTC m=+164.477985068"
Mar 09 09:10:00 crc kubenswrapper[4792]: I0309 09:10:00.662527 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 09 09:10:00 crc kubenswrapper[4792]: I0309 09:10:00.662531 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 09 09:10:00 crc kubenswrapper[4792]: E0309 09:10:00.663336 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 09 09:10:00 crc kubenswrapper[4792]: I0309 09:10:00.662582 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 09 09:10:00 crc kubenswrapper[4792]: I0309 09:10:00.662566 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fttpc"
Mar 09 09:10:00 crc kubenswrapper[4792]: E0309 09:10:00.663465 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 09 09:10:00 crc kubenswrapper[4792]: E0309 09:10:00.663546 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fttpc" podUID="4711cce5-88a9-48c4-8e2e-522062e34a03"
Mar 09 09:10:00 crc kubenswrapper[4792]: E0309 09:10:00.664046 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 09 09:10:00 crc kubenswrapper[4792]: E0309 09:10:00.789023 4792 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Mar 09 09:10:02 crc kubenswrapper[4792]: I0309 09:10:02.661786 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 09 09:10:02 crc kubenswrapper[4792]: I0309 09:10:02.661831 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 09 09:10:02 crc kubenswrapper[4792]: I0309 09:10:02.661787 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fttpc"
Mar 09 09:10:02 crc kubenswrapper[4792]: E0309 09:10:02.661952 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 09 09:10:02 crc kubenswrapper[4792]: E0309 09:10:02.662141 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fttpc" podUID="4711cce5-88a9-48c4-8e2e-522062e34a03"
Mar 09 09:10:02 crc kubenswrapper[4792]: I0309 09:10:02.662237 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 09 09:10:02 crc kubenswrapper[4792]: E0309 09:10:02.662420 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 09 09:10:02 crc kubenswrapper[4792]: E0309 09:10:02.662665 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 09 09:10:02 crc kubenswrapper[4792]: I0309 09:10:02.662815 4792 scope.go:117] "RemoveContainer" containerID="1244a18a5a9df2128cb16421f7d44aba05bb92b6a91b26fc2845847a9b4c91bf"
Mar 09 09:10:02 crc kubenswrapper[4792]: E0309 09:10:02.662965 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-lfm2j_openshift-ovn-kubernetes(740550e5-d1a4-4f0c-8efd-1ccd8f9319e5)\"" pod="openshift-ovn-kubernetes/ovnkube-node-lfm2j" podUID="740550e5-d1a4-4f0c-8efd-1ccd8f9319e5"
Mar 09 09:10:04 crc kubenswrapper[4792]: I0309 09:10:04.661703 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 09 09:10:04 crc kubenswrapper[4792]: I0309 09:10:04.661874 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fttpc"
Mar 09 09:10:04 crc kubenswrapper[4792]: E0309 09:10:04.661943 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 09 09:10:04 crc kubenswrapper[4792]: I0309 09:10:04.662028 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 09 09:10:04 crc kubenswrapper[4792]: E0309 09:10:04.662251 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fttpc" podUID="4711cce5-88a9-48c4-8e2e-522062e34a03"
Mar 09 09:10:04 crc kubenswrapper[4792]: I0309 09:10:04.662376 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 09 09:10:04 crc kubenswrapper[4792]: E0309 09:10:04.662455 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 09 09:10:04 crc kubenswrapper[4792]: E0309 09:10:04.662571 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 09 09:10:05 crc kubenswrapper[4792]: E0309 09:10:05.790096 4792 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Mar 09 09:10:06 crc kubenswrapper[4792]: I0309 09:10:06.662010 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 09 09:10:06 crc kubenswrapper[4792]: I0309 09:10:06.662188 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 09 09:10:06 crc kubenswrapper[4792]: I0309 09:10:06.662184 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 09 09:10:06 crc kubenswrapper[4792]: I0309 09:10:06.662111 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fttpc"
Mar 09 09:10:06 crc kubenswrapper[4792]: E0309 09:10:06.662378 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 09 09:10:06 crc kubenswrapper[4792]: E0309 09:10:06.662522 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 09 09:10:06 crc kubenswrapper[4792]: E0309 09:10:06.662701 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fttpc" podUID="4711cce5-88a9-48c4-8e2e-522062e34a03"
Mar 09 09:10:06 crc kubenswrapper[4792]: E0309 09:10:06.662855 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 09 09:10:08 crc kubenswrapper[4792]: I0309 09:10:08.661398 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 09 09:10:08 crc kubenswrapper[4792]: I0309 09:10:08.661465 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fttpc"
Mar 09 09:10:08 crc kubenswrapper[4792]: I0309 09:10:08.661500 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 09 09:10:08 crc kubenswrapper[4792]: I0309 09:10:08.661566 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 09 09:10:08 crc kubenswrapper[4792]: E0309 09:10:08.661600 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fttpc" podUID="4711cce5-88a9-48c4-8e2e-522062e34a03"
Mar 09 09:10:08 crc kubenswrapper[4792]: E0309 09:10:08.661721 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 09 09:10:08 crc kubenswrapper[4792]: E0309 09:10:08.661851 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 09 09:10:08 crc kubenswrapper[4792]: E0309 09:10:08.661952 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 09 09:10:10 crc kubenswrapper[4792]: I0309 09:10:10.661450 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 09 09:10:10 crc kubenswrapper[4792]: I0309 09:10:10.661539 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 09 09:10:10 crc kubenswrapper[4792]: I0309 09:10:10.661543 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fttpc"
Mar 09 09:10:10 crc kubenswrapper[4792]: E0309 09:10:10.661702 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 09 09:10:10 crc kubenswrapper[4792]: I0309 09:10:10.661782 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 09 09:10:10 crc kubenswrapper[4792]: E0309 09:10:10.661904 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 09 09:10:10 crc kubenswrapper[4792]: E0309 09:10:10.662005 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 09 09:10:10 crc kubenswrapper[4792]: E0309 09:10:10.662157 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fttpc" podUID="4711cce5-88a9-48c4-8e2e-522062e34a03"
Mar 09 09:10:10 crc kubenswrapper[4792]: E0309 09:10:10.791910 4792 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Mar 09 09:10:12 crc kubenswrapper[4792]: I0309 09:10:12.662234 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 09 09:10:12 crc kubenswrapper[4792]: I0309 09:10:12.662294 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fttpc"
Mar 09 09:10:12 crc kubenswrapper[4792]: I0309 09:10:12.662350 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 09 09:10:12 crc kubenswrapper[4792]: E0309 09:10:12.662509 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 09 09:10:12 crc kubenswrapper[4792]: E0309 09:10:12.662671 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fttpc" podUID="4711cce5-88a9-48c4-8e2e-522062e34a03"
Mar 09 09:10:12 crc kubenswrapper[4792]: E0309 09:10:12.662917 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 09 09:10:12 crc kubenswrapper[4792]: I0309 09:10:12.662290 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 09 09:10:12 crc kubenswrapper[4792]: E0309 09:10:12.663395 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 09 09:10:14 crc kubenswrapper[4792]: I0309 09:10:14.662451 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 09 09:10:14 crc kubenswrapper[4792]: I0309 09:10:14.662489 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 09 09:10:14 crc kubenswrapper[4792]: I0309 09:10:14.662527 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fttpc"
Mar 09 09:10:14 crc kubenswrapper[4792]: I0309 09:10:14.662496 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 09 09:10:14 crc kubenswrapper[4792]: E0309 09:10:14.662697 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 09 09:10:14 crc kubenswrapper[4792]: E0309 09:10:14.662837 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fttpc" podUID="4711cce5-88a9-48c4-8e2e-522062e34a03"
Mar 09 09:10:14 crc kubenswrapper[4792]: E0309 09:10:14.662970 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 09 09:10:14 crc kubenswrapper[4792]: E0309 09:10:14.663154 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 09 09:10:15 crc kubenswrapper[4792]: I0309 09:10:15.665172 4792 scope.go:117] "RemoveContainer" containerID="1244a18a5a9df2128cb16421f7d44aba05bb92b6a91b26fc2845847a9b4c91bf"
Mar 09 09:10:15 crc kubenswrapper[4792]: E0309 09:10:15.792740 4792 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Mar 09 09:10:16 crc kubenswrapper[4792]: I0309 09:10:16.488393 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lfm2j_740550e5-d1a4-4f0c-8efd-1ccd8f9319e5/ovnkube-controller/3.log"
Mar 09 09:10:16 crc kubenswrapper[4792]: I0309 09:10:16.491724 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lfm2j" event={"ID":"740550e5-d1a4-4f0c-8efd-1ccd8f9319e5","Type":"ContainerStarted","Data":"1a6a0b78db67ad47051e996cb6c2c0467bd3c4f3682498fa3ba5e6224b6319bc"}
Mar 09 09:10:16 crc kubenswrapper[4792]: I0309 09:10:16.492307 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-lfm2j"
Mar 09 09:10:16 crc kubenswrapper[4792]: I0309 09:10:16.616351 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-lfm2j" podStartSLOduration=138.616327798 podStartE2EDuration="2m18.616327798s" podCreationTimestamp="2026-03-09 09:07:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:10:16.528448835 +0000 UTC m=+181.558649587" watchObservedRunningTime="2026-03-09 09:10:16.616327798 +0000 UTC m=+181.646528550"
Mar 09 09:10:16 crc kubenswrapper[4792]: I0309 09:10:16.617490 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-fttpc"]
Mar 09 09:10:16 crc kubenswrapper[4792]: I0309 09:10:16.617635 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fttpc"
Mar 09 09:10:16 crc kubenswrapper[4792]: E0309 09:10:16.617765 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fttpc" podUID="4711cce5-88a9-48c4-8e2e-522062e34a03"
Mar 09 09:10:16 crc kubenswrapper[4792]: I0309 09:10:16.662342 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 09 09:10:16 crc kubenswrapper[4792]: I0309 09:10:16.662367 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 09 09:10:16 crc kubenswrapper[4792]: I0309 09:10:16.662391 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 09 09:10:16 crc kubenswrapper[4792]: E0309 09:10:16.662500 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 09 09:10:16 crc kubenswrapper[4792]: E0309 09:10:16.662608 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 09 09:10:16 crc kubenswrapper[4792]: E0309 09:10:16.662758 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 09 09:10:18 crc kubenswrapper[4792]: I0309 09:10:18.661361 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 09 09:10:18 crc kubenswrapper[4792]: I0309 09:10:18.661458 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 09 09:10:18 crc kubenswrapper[4792]: I0309 09:10:18.661458 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fttpc"
Mar 09 09:10:18 crc kubenswrapper[4792]: I0309 09:10:18.661470 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 09 09:10:18 crc kubenswrapper[4792]: E0309 09:10:18.661836 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 09 09:10:18 crc kubenswrapper[4792]: E0309 09:10:18.661877 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fttpc" podUID="4711cce5-88a9-48c4-8e2e-522062e34a03"
Mar 09 09:10:18 crc kubenswrapper[4792]: E0309 09:10:18.661925 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 09 09:10:18 crc kubenswrapper[4792]: E0309 09:10:18.662041 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 09 09:10:20 crc kubenswrapper[4792]: I0309 09:10:20.662140 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fttpc"
Mar 09 09:10:20 crc kubenswrapper[4792]: E0309 09:10:20.662762 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fttpc" podUID="4711cce5-88a9-48c4-8e2e-522062e34a03"
Mar 09 09:10:20 crc kubenswrapper[4792]: I0309 09:10:20.662204 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 09 09:10:20 crc kubenswrapper[4792]: E0309 09:10:20.663000 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 09 09:10:20 crc kubenswrapper[4792]: I0309 09:10:20.662154 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 09 09:10:20 crc kubenswrapper[4792]: E0309 09:10:20.663193 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 09 09:10:20 crc kubenswrapper[4792]: I0309 09:10:20.662278 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 09 09:10:20 crc kubenswrapper[4792]: E0309 09:10:20.663367 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 09 09:10:22 crc kubenswrapper[4792]: I0309 09:10:22.661379 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 09 09:10:22 crc kubenswrapper[4792]: I0309 09:10:22.661463 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fttpc"
Mar 09 09:10:22 crc kubenswrapper[4792]: I0309 09:10:22.661480 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 09 09:10:22 crc kubenswrapper[4792]: I0309 09:10:22.661508 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 09 09:10:22 crc kubenswrapper[4792]: I0309 09:10:22.665941 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin"
Mar 09 09:10:22 crc kubenswrapper[4792]: I0309 09:10:22.666276 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Mar 09 09:10:22 crc kubenswrapper[4792]: I0309 09:10:22.666428 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert"
Mar 09 09:10:22 crc kubenswrapper[4792]: I0309 09:10:22.666583 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt"
Mar 09 09:10:22 crc kubenswrapper[4792]: I0309 09:10:22.666440 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt"
Mar 09 09:10:22 crc kubenswrapper[4792]: I0309 09:10:22.666671 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c"
Mar 09 09:10:28 crc kubenswrapper[4792]: I0309 09:10:28.913532 4792 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady"
Mar 09 09:10:28 crc kubenswrapper[4792]: I0309 09:10:28.961268 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-l8jxj"]
Mar 09 09:10:28 crc kubenswrapper[4792]: I0309 09:10:28.962021 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-l8jxj"
Mar 09 09:10:28 crc kubenswrapper[4792]: I0309 09:10:28.962045 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tw44d"]
Mar 09 09:10:28 crc kubenswrapper[4792]: I0309 09:10:28.962763 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-5tppx"]
Mar 09 09:10:28 crc kubenswrapper[4792]: I0309 09:10:28.962802 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tw44d"
Mar 09 09:10:28 crc kubenswrapper[4792]: I0309 09:10:28.963411 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-5tppx"
Mar 09 09:10:28 crc kubenswrapper[4792]: I0309 09:10:28.963532 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-p5kgd"]
Mar 09 09:10:28 crc kubenswrapper[4792]: I0309 09:10:28.964255 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-p5kgd"
Mar 09 09:10:28 crc kubenswrapper[4792]: I0309 09:10:28.966701 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-xllzt"]
Mar 09 09:10:28 crc kubenswrapper[4792]: I0309 09:10:28.967099 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-xllzt"
Mar 09 09:10:28 crc kubenswrapper[4792]: I0309 09:10:28.969573 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-56b7z"]
Mar 09 09:10:28 crc kubenswrapper[4792]: I0309 09:10:28.970213 4792 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-56b7z" Mar 09 09:10:28 crc kubenswrapper[4792]: I0309 09:10:28.971567 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-ncpc5"] Mar 09 09:10:28 crc kubenswrapper[4792]: I0309 09:10:28.972028 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-ncpc5" Mar 09 09:10:28 crc kubenswrapper[4792]: I0309 09:10:28.972406 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-dgsw8"] Mar 09 09:10:28 crc kubenswrapper[4792]: I0309 09:10:28.972686 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-dgsw8" Mar 09 09:10:28 crc kubenswrapper[4792]: I0309 09:10:28.973520 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-tl5jf"] Mar 09 09:10:28 crc kubenswrapper[4792]: I0309 09:10:28.974308 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Mar 09 09:10:28 crc kubenswrapper[4792]: I0309 09:10:28.974443 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-tl5jf" Mar 09 09:10:28 crc kubenswrapper[4792]: I0309 09:10:28.974554 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-8nr7h"] Mar 09 09:10:28 crc kubenswrapper[4792]: I0309 09:10:28.975015 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8nr7h" Mar 09 09:10:28 crc kubenswrapper[4792]: I0309 09:10:28.979910 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-ngdp7"] Mar 09 09:10:28 crc kubenswrapper[4792]: I0309 09:10:28.980539 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ngdp7" Mar 09 09:10:28 crc kubenswrapper[4792]: I0309 09:10:28.984011 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 09 09:10:28 crc kubenswrapper[4792]: I0309 09:10:28.985510 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Mar 09 09:10:28 crc kubenswrapper[4792]: I0309 09:10:28.985907 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 09 09:10:28 crc kubenswrapper[4792]: I0309 09:10:28.985907 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Mar 09 09:10:28 crc kubenswrapper[4792]: I0309 09:10:28.986058 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Mar 09 09:10:28 crc kubenswrapper[4792]: I0309 09:10:28.986220 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Mar 09 09:10:28 crc kubenswrapper[4792]: I0309 09:10:28.986248 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Mar 09 09:10:28 crc kubenswrapper[4792]: I0309 09:10:28.986398 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 09 09:10:28 crc kubenswrapper[4792]: I0309 09:10:28.986489 4792 reflector.go:368] Caches populated 
for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 09 09:10:28 crc kubenswrapper[4792]: I0309 09:10:28.986568 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 09 09:10:28 crc kubenswrapper[4792]: I0309 09:10:28.986639 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Mar 09 09:10:28 crc kubenswrapper[4792]: I0309 09:10:28.989474 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.026358 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.026469 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.030390 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.056167 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.056250 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.056372 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.056394 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-9ch2w"] Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.056965 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-9ch2w" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.056417 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.057083 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.056563 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.056661 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.057270 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.057287 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.056845 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.057387 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.057428 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.056855 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 
09:10:29.057474 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.056932 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.057005 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.058626 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.058785 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.058906 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.059105 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.059232 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.059754 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.059878 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.060021 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" 
Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.060182 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.060325 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.060505 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.060647 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.061142 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.061400 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.061539 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.061690 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.061818 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 09 09:10:29 crc kubenswrapper[4792]: W0309 09:10:29.061948 4792 reflector.go:561] object-"openshift-oauth-apiserver"/"trusted-ca-bundle": failed to list *v1.ConfigMap: configmaps "trusted-ca-bundle" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" 
in the namespace "openshift-oauth-apiserver": no relationship found between node 'crc' and this object Mar 09 09:10:29 crc kubenswrapper[4792]: E0309 09:10:29.062009 4792 reflector.go:158] "Unhandled Error" err="object-\"openshift-oauth-apiserver\"/\"trusted-ca-bundle\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"trusted-ca-bundle\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-oauth-apiserver\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.062095 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.062111 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.062216 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.062231 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.062297 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.062344 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.062407 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.062491 4792 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.062520 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.062610 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.062667 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.062720 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.062810 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.062871 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.062976 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.063128 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.063144 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-rl7tp"] Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.063249 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.063369 4792 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.062813 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.063822 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-rl7tp" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.066229 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.069883 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.070446 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.075140 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-49k84"] Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.075859 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-49k84" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.078376 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-jh5pl"] Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.078872 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-7f29j"] Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.079218 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7f29j" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.079552 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-jh5pl" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.092152 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-dpzj6"] Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.092865 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-5tppx"] Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.093003 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-dpzj6" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.093604 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-nz8fs"] Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.094370 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-nz8fs" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.094598 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-2v6nv"] Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.096875 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-znzvh"] Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.097337 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-znzvh" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.097666 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-2v6nv" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.126831 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.127027 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.128819 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-kcnhk"] Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.130973 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.131143 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.131564 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.131648 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.131738 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.158278 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5444994796-kcnhk" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.160555 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.168994 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.181258 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.181934 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.182320 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.182937 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.183041 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/62e49d0b-dfc1-48ab-bed7-7ba7fe8a4475-serving-cert\") pod \"apiserver-76f77b778f-l8jxj\" (UID: \"62e49d0b-dfc1-48ab-bed7-7ba7fe8a4475\") " pod="openshift-apiserver/apiserver-76f77b778f-l8jxj" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.183429 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/7c5109b5-4292-437f-8524-bb7d35147a71-encryption-config\") pod \"apiserver-7bbb656c7d-ngdp7\" (UID: \"7c5109b5-4292-437f-8524-bb7d35147a71\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ngdp7" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.183549 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0f3faf8f-f49a-4e99-93cc-50eecd2d2c3c-serving-cert\") pod \"controller-manager-879f6c89f-ncpc5\" (UID: \"0f3faf8f-f49a-4e99-93cc-50eecd2d2c3c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-ncpc5" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.183589 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/530ed995-d9d1-4aff-93b9-9b2a35194cd2-etcd-service-ca\") pod \"etcd-operator-b45778765-dgsw8\" (UID: \"530ed995-d9d1-4aff-93b9-9b2a35194cd2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-dgsw8" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.183708 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/62e49d0b-dfc1-48ab-bed7-7ba7fe8a4475-audit-dir\") pod \"apiserver-76f77b778f-l8jxj\" (UID: \"62e49d0b-dfc1-48ab-bed7-7ba7fe8a4475\") " pod="openshift-apiserver/apiserver-76f77b778f-l8jxj" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.183724 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.183821 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/62e49d0b-dfc1-48ab-bed7-7ba7fe8a4475-trusted-ca-bundle\") pod \"apiserver-76f77b778f-l8jxj\" (UID: \"62e49d0b-dfc1-48ab-bed7-7ba7fe8a4475\") " pod="openshift-apiserver/apiserver-76f77b778f-l8jxj" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.183849 4792 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/adf7cc5f-5027-4382-bac8-ed4f459fe424-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-tl5jf\" (UID: \"adf7cc5f-5027-4382-bac8-ed4f459fe424\") " pod="openshift-authentication/oauth-openshift-558db77b4-tl5jf" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.183874 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/fc2b2079-7189-4ca3-b398-2a1146b9c70f-images\") pod \"machine-api-operator-5694c8668f-56b7z\" (UID: \"fc2b2079-7189-4ca3-b398-2a1146b9c70f\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-56b7z" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.184010 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.184121 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.184644 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.184788 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.186010 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.185353 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nf5jw\" (UniqueName: 
\"kubernetes.io/projected/62e49d0b-dfc1-48ab-bed7-7ba7fe8a4475-kube-api-access-nf5jw\") pod \"apiserver-76f77b778f-l8jxj\" (UID: \"62e49d0b-dfc1-48ab-bed7-7ba7fe8a4475\") " pod="openshift-apiserver/apiserver-76f77b778f-l8jxj" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.186394 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/35d5435d-5c5f-4c5c-bca6-d2f5a89c25d8-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-5tppx\" (UID: \"35d5435d-5c5f-4c5c-bca6-d2f5a89c25d8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-5tppx" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.186416 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/35d5435d-5c5f-4c5c-bca6-d2f5a89c25d8-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-5tppx\" (UID: \"35d5435d-5c5f-4c5c-bca6-d2f5a89c25d8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-5tppx" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.186437 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f4a19004-24a2-4825-bca2-24d98c3d69cc-trusted-ca\") pod \"console-operator-58897d9998-xllzt\" (UID: \"f4a19004-24a2-4825-bca2-24d98c3d69cc\") " pod="openshift-console-operator/console-operator-58897d9998-xllzt" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.186476 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/62e49d0b-dfc1-48ab-bed7-7ba7fe8a4475-etcd-serving-ca\") pod \"apiserver-76f77b778f-l8jxj\" (UID: \"62e49d0b-dfc1-48ab-bed7-7ba7fe8a4475\") " pod="openshift-apiserver/apiserver-76f77b778f-l8jxj" Mar 09 09:10:29 
crc kubenswrapper[4792]: I0309 09:10:29.186497 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/adf7cc5f-5027-4382-bac8-ed4f459fe424-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-tl5jf\" (UID: \"adf7cc5f-5027-4382-bac8-ed4f459fe424\") " pod="openshift-authentication/oauth-openshift-558db77b4-tl5jf" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.186516 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/530ed995-d9d1-4aff-93b9-9b2a35194cd2-etcd-ca\") pod \"etcd-operator-b45778765-dgsw8\" (UID: \"530ed995-d9d1-4aff-93b9-9b2a35194cd2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-dgsw8" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.186536 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/62e49d0b-dfc1-48ab-bed7-7ba7fe8a4475-image-import-ca\") pod \"apiserver-76f77b778f-l8jxj\" (UID: \"62e49d0b-dfc1-48ab-bed7-7ba7fe8a4475\") " pod="openshift-apiserver/apiserver-76f77b778f-l8jxj" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.186552 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/adf7cc5f-5027-4382-bac8-ed4f459fe424-audit-dir\") pod \"oauth-openshift-558db77b4-tl5jf\" (UID: \"adf7cc5f-5027-4382-bac8-ed4f459fe424\") " pod="openshift-authentication/oauth-openshift-558db77b4-tl5jf" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.186574 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xj29r\" (UniqueName: \"kubernetes.io/projected/35d5435d-5c5f-4c5c-bca6-d2f5a89c25d8-kube-api-access-xj29r\") pod 
\"cluster-image-registry-operator-dc59b4c8b-5tppx\" (UID: \"35d5435d-5c5f-4c5c-bca6-d2f5a89c25d8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-5tppx" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.186592 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/7c5109b5-4292-437f-8524-bb7d35147a71-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-ngdp7\" (UID: \"7c5109b5-4292-437f-8524-bb7d35147a71\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ngdp7" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.186609 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/62e49d0b-dfc1-48ab-bed7-7ba7fe8a4475-encryption-config\") pod \"apiserver-76f77b778f-l8jxj\" (UID: \"62e49d0b-dfc1-48ab-bed7-7ba7fe8a4475\") " pod="openshift-apiserver/apiserver-76f77b778f-l8jxj" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.186626 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/adf7cc5f-5027-4382-bac8-ed4f459fe424-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-tl5jf\" (UID: \"adf7cc5f-5027-4382-bac8-ed4f459fe424\") " pod="openshift-authentication/oauth-openshift-558db77b4-tl5jf" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.186659 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wvkrg\" (UniqueName: \"kubernetes.io/projected/adf7cc5f-5027-4382-bac8-ed4f459fe424-kube-api-access-wvkrg\") pod \"oauth-openshift-558db77b4-tl5jf\" (UID: \"adf7cc5f-5027-4382-bac8-ed4f459fe424\") " pod="openshift-authentication/oauth-openshift-558db77b4-tl5jf" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 
09:10:29.186675 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/530ed995-d9d1-4aff-93b9-9b2a35194cd2-serving-cert\") pod \"etcd-operator-b45778765-dgsw8\" (UID: \"530ed995-d9d1-4aff-93b9-9b2a35194cd2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-dgsw8" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.186697 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/adf7cc5f-5027-4382-bac8-ed4f459fe424-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-tl5jf\" (UID: \"adf7cc5f-5027-4382-bac8-ed4f459fe424\") " pod="openshift-authentication/oauth-openshift-558db77b4-tl5jf" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.186714 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/adf7cc5f-5027-4382-bac8-ed4f459fe424-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-tl5jf\" (UID: \"adf7cc5f-5027-4382-bac8-ed4f459fe424\") " pod="openshift-authentication/oauth-openshift-558db77b4-tl5jf" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.186732 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/62e49d0b-dfc1-48ab-bed7-7ba7fe8a4475-node-pullsecrets\") pod \"apiserver-76f77b778f-l8jxj\" (UID: \"62e49d0b-dfc1-48ab-bed7-7ba7fe8a4475\") " pod="openshift-apiserver/apiserver-76f77b778f-l8jxj" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.186752 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/adf7cc5f-5027-4382-bac8-ed4f459fe424-audit-policies\") pod 
\"oauth-openshift-558db77b4-tl5jf\" (UID: \"adf7cc5f-5027-4382-bac8-ed4f459fe424\") " pod="openshift-authentication/oauth-openshift-558db77b4-tl5jf" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.186770 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pj5dk\" (UniqueName: \"kubernetes.io/projected/e4caea97-6238-4185-9e61-5a40aa699205-kube-api-access-pj5dk\") pod \"cluster-samples-operator-665b6dd947-tw44d\" (UID: \"e4caea97-6238-4185-9e61-5a40aa699205\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tw44d" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.186831 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g45ns\" (UniqueName: \"kubernetes.io/projected/e37d381c-62ae-489d-888d-320a3f7959cc-kube-api-access-g45ns\") pod \"dns-operator-744455d44c-p5kgd\" (UID: \"e37d381c-62ae-489d-888d-320a3f7959cc\") " pod="openshift-dns-operator/dns-operator-744455d44c-p5kgd" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.186857 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/fc2b2079-7189-4ca3-b398-2a1146b9c70f-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-56b7z\" (UID: \"fc2b2079-7189-4ca3-b398-2a1146b9c70f\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-56b7z" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.186872 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/7c5109b5-4292-437f-8524-bb7d35147a71-audit-policies\") pod \"apiserver-7bbb656c7d-ngdp7\" (UID: \"7c5109b5-4292-437f-8524-bb7d35147a71\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ngdp7" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.186889 
4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/adf7cc5f-5027-4382-bac8-ed4f459fe424-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-tl5jf\" (UID: \"adf7cc5f-5027-4382-bac8-ed4f459fe424\") " pod="openshift-authentication/oauth-openshift-558db77b4-tl5jf" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.186909 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/adf7cc5f-5027-4382-bac8-ed4f459fe424-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-tl5jf\" (UID: \"adf7cc5f-5027-4382-bac8-ed4f459fe424\") " pod="openshift-authentication/oauth-openshift-558db77b4-tl5jf" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.186928 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/560a23b3-6492-45d0-b273-df4e85bd9787-auth-proxy-config\") pod \"machine-approver-56656f9798-8nr7h\" (UID: \"560a23b3-6492-45d0-b273-df4e85bd9787\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8nr7h" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.186943 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/7c5109b5-4292-437f-8524-bb7d35147a71-etcd-client\") pod \"apiserver-7bbb656c7d-ngdp7\" (UID: \"7c5109b5-4292-437f-8524-bb7d35147a71\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ngdp7" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.186957 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sn4bc\" (UniqueName: 
\"kubernetes.io/projected/f4a19004-24a2-4825-bca2-24d98c3d69cc-kube-api-access-sn4bc\") pod \"console-operator-58897d9998-xllzt\" (UID: \"f4a19004-24a2-4825-bca2-24d98c3d69cc\") " pod="openshift-console-operator/console-operator-58897d9998-xllzt" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.186977 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mh6wx\" (UniqueName: \"kubernetes.io/projected/530ed995-d9d1-4aff-93b9-9b2a35194cd2-kube-api-access-mh6wx\") pod \"etcd-operator-b45778765-dgsw8\" (UID: \"530ed995-d9d1-4aff-93b9-9b2a35194cd2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-dgsw8" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.187006 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/adf7cc5f-5027-4382-bac8-ed4f459fe424-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-tl5jf\" (UID: \"adf7cc5f-5027-4382-bac8-ed4f459fe424\") " pod="openshift-authentication/oauth-openshift-558db77b4-tl5jf" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.187022 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0f3faf8f-f49a-4e99-93cc-50eecd2d2c3c-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-ncpc5\" (UID: \"0f3faf8f-f49a-4e99-93cc-50eecd2d2c3c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-ncpc5" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.187060 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7c5109b5-4292-437f-8524-bb7d35147a71-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-ngdp7\" (UID: \"7c5109b5-4292-437f-8524-bb7d35147a71\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ngdp7" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.187090 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7c5109b5-4292-437f-8524-bb7d35147a71-audit-dir\") pod \"apiserver-7bbb656c7d-ngdp7\" (UID: \"7c5109b5-4292-437f-8524-bb7d35147a71\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ngdp7" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.187106 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/560a23b3-6492-45d0-b273-df4e85bd9787-machine-approver-tls\") pod \"machine-approver-56656f9798-8nr7h\" (UID: \"560a23b3-6492-45d0-b273-df4e85bd9787\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8nr7h" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.187120 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g6hf6\" (UniqueName: \"kubernetes.io/projected/560a23b3-6492-45d0-b273-df4e85bd9787-kube-api-access-g6hf6\") pod \"machine-approver-56656f9798-8nr7h\" (UID: \"560a23b3-6492-45d0-b273-df4e85bd9787\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8nr7h" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.187135 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f3faf8f-f49a-4e99-93cc-50eecd2d2c3c-config\") pod \"controller-manager-879f6c89f-ncpc5\" (UID: \"0f3faf8f-f49a-4e99-93cc-50eecd2d2c3c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-ncpc5" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.187154 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zkrdg\" 
(UniqueName: \"kubernetes.io/projected/fc2b2079-7189-4ca3-b398-2a1146b9c70f-kube-api-access-zkrdg\") pod \"machine-api-operator-5694c8668f-56b7z\" (UID: \"fc2b2079-7189-4ca3-b398-2a1146b9c70f\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-56b7z" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.187170 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e37d381c-62ae-489d-888d-320a3f7959cc-metrics-tls\") pod \"dns-operator-744455d44c-p5kgd\" (UID: \"e37d381c-62ae-489d-888d-320a3f7959cc\") " pod="openshift-dns-operator/dns-operator-744455d44c-p5kgd" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.187196 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/62e49d0b-dfc1-48ab-bed7-7ba7fe8a4475-config\") pod \"apiserver-76f77b778f-l8jxj\" (UID: \"62e49d0b-dfc1-48ab-bed7-7ba7fe8a4475\") " pod="openshift-apiserver/apiserver-76f77b778f-l8jxj" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.187212 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vrsxh\" (UniqueName: \"kubernetes.io/projected/0f3faf8f-f49a-4e99-93cc-50eecd2d2c3c-kube-api-access-vrsxh\") pod \"controller-manager-879f6c89f-ncpc5\" (UID: \"0f3faf8f-f49a-4e99-93cc-50eecd2d2c3c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-ncpc5" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.187231 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/62e49d0b-dfc1-48ab-bed7-7ba7fe8a4475-etcd-client\") pod \"apiserver-76f77b778f-l8jxj\" (UID: \"62e49d0b-dfc1-48ab-bed7-7ba7fe8a4475\") " pod="openshift-apiserver/apiserver-76f77b778f-l8jxj" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.185661 
4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.187249 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fc2b2079-7189-4ca3-b398-2a1146b9c70f-config\") pod \"machine-api-operator-5694c8668f-56b7z\" (UID: \"fc2b2079-7189-4ca3-b398-2a1146b9c70f\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-56b7z" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.187264 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f4a19004-24a2-4825-bca2-24d98c3d69cc-serving-cert\") pod \"console-operator-58897d9998-xllzt\" (UID: \"f4a19004-24a2-4825-bca2-24d98c3d69cc\") " pod="openshift-console-operator/console-operator-58897d9998-xllzt" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.187280 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/530ed995-d9d1-4aff-93b9-9b2a35194cd2-config\") pod \"etcd-operator-b45778765-dgsw8\" (UID: \"530ed995-d9d1-4aff-93b9-9b2a35194cd2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-dgsw8" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.187298 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/62e49d0b-dfc1-48ab-bed7-7ba7fe8a4475-audit\") pod \"apiserver-76f77b778f-l8jxj\" (UID: \"62e49d0b-dfc1-48ab-bed7-7ba7fe8a4475\") " pod="openshift-apiserver/apiserver-76f77b778f-l8jxj" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.187315 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: 
\"kubernetes.io/secret/adf7cc5f-5027-4382-bac8-ed4f459fe424-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-tl5jf\" (UID: \"adf7cc5f-5027-4382-bac8-ed4f459fe424\") " pod="openshift-authentication/oauth-openshift-558db77b4-tl5jf" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.185702 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.187332 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zzcrq\" (UniqueName: \"kubernetes.io/projected/7c5109b5-4292-437f-8524-bb7d35147a71-kube-api-access-zzcrq\") pod \"apiserver-7bbb656c7d-ngdp7\" (UID: \"7c5109b5-4292-437f-8524-bb7d35147a71\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ngdp7" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.187348 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/560a23b3-6492-45d0-b273-df4e85bd9787-config\") pod \"machine-approver-56656f9798-8nr7h\" (UID: \"560a23b3-6492-45d0-b273-df4e85bd9787\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8nr7h" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.187370 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/e4caea97-6238-4185-9e61-5a40aa699205-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-tw44d\" (UID: \"e4caea97-6238-4185-9e61-5a40aa699205\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tw44d" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.187417 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: 
\"kubernetes.io/secret/530ed995-d9d1-4aff-93b9-9b2a35194cd2-etcd-client\") pod \"etcd-operator-b45778765-dgsw8\" (UID: \"530ed995-d9d1-4aff-93b9-9b2a35194cd2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-dgsw8" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.187436 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/adf7cc5f-5027-4382-bac8-ed4f459fe424-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-tl5jf\" (UID: \"adf7cc5f-5027-4382-bac8-ed4f459fe424\") " pod="openshift-authentication/oauth-openshift-558db77b4-tl5jf" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.187455 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/adf7cc5f-5027-4382-bac8-ed4f459fe424-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-tl5jf\" (UID: \"adf7cc5f-5027-4382-bac8-ed4f459fe424\") " pod="openshift-authentication/oauth-openshift-558db77b4-tl5jf" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.187474 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7c5109b5-4292-437f-8524-bb7d35147a71-serving-cert\") pod \"apiserver-7bbb656c7d-ngdp7\" (UID: \"7c5109b5-4292-437f-8524-bb7d35147a71\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ngdp7" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.187492 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f4a19004-24a2-4825-bca2-24d98c3d69cc-config\") pod \"console-operator-58897d9998-xllzt\" (UID: \"f4a19004-24a2-4825-bca2-24d98c3d69cc\") " pod="openshift-console-operator/console-operator-58897d9998-xllzt" Mar 09 
09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.187508 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0f3faf8f-f49a-4e99-93cc-50eecd2d2c3c-client-ca\") pod \"controller-manager-879f6c89f-ncpc5\" (UID: \"0f3faf8f-f49a-4e99-93cc-50eecd2d2c3c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-ncpc5" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.187533 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/35d5435d-5c5f-4c5c-bca6-d2f5a89c25d8-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-5tppx\" (UID: \"35d5435d-5c5f-4c5c-bca6-d2f5a89c25d8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-5tppx" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.191242 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zsj86"] Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.191771 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-2mjvb"] Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.192046 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-xgrs4"] Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.192407 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-xgrs4" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.192690 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zsj86" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.192850 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-2mjvb" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.202666 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.203528 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.203721 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.203911 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.203728 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.203911 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.204446 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.205658 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.206703 4792 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-console-operator"/"trusted-ca" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.208008 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-m9545"] Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.211404 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.212016 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.213792 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.220093 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.220675 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-7p469"] Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.220805 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.221002 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.221129 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-5wctw"] Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.221339 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-m9545" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.221445 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-fzxrs"] Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.221733 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-kmnjq"] Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.221866 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7p469" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.222194 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-kkrgv"] Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.222505 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-ltjn9"] Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.222652 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-fzxrs" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.222817 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-m7bwm"] Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.222921 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-5wctw" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.223259 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-k2zfx"] Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.223645 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-kkrgv" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.223878 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-kmnjq" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.223940 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-ltjn9" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.223673 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-4ckzv"] Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.224962 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-4ckzv" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.225397 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.225656 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-m7bwm" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.225883 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-k2zfx" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.228750 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.229328 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.232174 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-l8jxj"] Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.232599 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-bcj7g"] Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.233469 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-bcj7g" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.234841 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-tl5jf"] Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.237124 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-p5kgd"] Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.237829 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-7lxp2"] Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.238521 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-7lxp2" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.239925 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4vkn2"] Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.241326 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4vkn2" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.244497 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.258324 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29550780-48757"] Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.260395 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29550780-48757" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.263256 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550790-q6wbp"] Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.269461 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-ngdp7"] Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.269736 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550790-q6wbp" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.270499 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.279437 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-56b7z"] Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.289573 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.293448 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e37d381c-62ae-489d-888d-320a3f7959cc-metrics-tls\") pod \"dns-operator-744455d44c-p5kgd\" (UID: \"e37d381c-62ae-489d-888d-320a3f7959cc\") " pod="openshift-dns-operator/dns-operator-744455d44c-p5kgd" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.293483 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vrsxh\" (UniqueName: \"kubernetes.io/projected/0f3faf8f-f49a-4e99-93cc-50eecd2d2c3c-kube-api-access-vrsxh\") pod \"controller-manager-879f6c89f-ncpc5\" (UID: \"0f3faf8f-f49a-4e99-93cc-50eecd2d2c3c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-ncpc5" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.293506 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fc2b2079-7189-4ca3-b398-2a1146b9c70f-config\") pod \"machine-api-operator-5694c8668f-56b7z\" (UID: \"fc2b2079-7189-4ca3-b398-2a1146b9c70f\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-56b7z" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.293526 4792 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f4a19004-24a2-4825-bca2-24d98c3d69cc-serving-cert\") pod \"console-operator-58897d9998-xllzt\" (UID: \"f4a19004-24a2-4825-bca2-24d98c3d69cc\") " pod="openshift-console-operator/console-operator-58897d9998-xllzt" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.293551 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/894f7c69-0119-4c19-b205-9780fb52b06e-console-oauth-config\") pod \"console-f9d7485db-jh5pl\" (UID: \"894f7c69-0119-4c19-b205-9780fb52b06e\") " pod="openshift-console/console-f9d7485db-jh5pl" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.293572 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h2csp\" (UniqueName: \"kubernetes.io/projected/83d49252-7752-48bc-86b6-c604984cd533-kube-api-access-h2csp\") pod \"router-default-5444994796-kcnhk\" (UID: \"83d49252-7752-48bc-86b6-c604984cd533\") " pod="openshift-ingress/router-default-5444994796-kcnhk" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.293591 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/c952f4ee-a8f1-4c7a-bfd4-0da6e1f0ff7c-images\") pod \"machine-config-operator-74547568cd-7p469\" (UID: \"c952f4ee-a8f1-4c7a-bfd4-0da6e1f0ff7c\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7p469" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.293615 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/432c0903-faf2-4367-b378-730898f2dfcc-config\") pod \"kube-controller-manager-operator-78b949d7b-m9545\" (UID: \"432c0903-faf2-4367-b378-730898f2dfcc\") " 
pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-m9545" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.293637 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/adf7cc5f-5027-4382-bac8-ed4f459fe424-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-tl5jf\" (UID: \"adf7cc5f-5027-4382-bac8-ed4f459fe424\") " pod="openshift-authentication/oauth-openshift-558db77b4-tl5jf" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.293659 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zzcrq\" (UniqueName: \"kubernetes.io/projected/7c5109b5-4292-437f-8524-bb7d35147a71-kube-api-access-zzcrq\") pod \"apiserver-7bbb656c7d-ngdp7\" (UID: \"7c5109b5-4292-437f-8524-bb7d35147a71\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ngdp7" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.293683 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/530ed995-d9d1-4aff-93b9-9b2a35194cd2-etcd-client\") pod \"etcd-operator-b45778765-dgsw8\" (UID: \"530ed995-d9d1-4aff-93b9-9b2a35194cd2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-dgsw8" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.293703 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f5fb0f27-68dd-4280-84f5-b46851d6ab96-service-ca-bundle\") pod \"authentication-operator-69f744f599-rl7tp\" (UID: \"f5fb0f27-68dd-4280-84f5-b46851d6ab96\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-rl7tp" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.293724 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/adf7cc5f-5027-4382-bac8-ed4f459fe424-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-tl5jf\" (UID: \"adf7cc5f-5027-4382-bac8-ed4f459fe424\") " pod="openshift-authentication/oauth-openshift-558db77b4-tl5jf" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.293744 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7c5109b5-4292-437f-8524-bb7d35147a71-serving-cert\") pod \"apiserver-7bbb656c7d-ngdp7\" (UID: \"7c5109b5-4292-437f-8524-bb7d35147a71\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ngdp7" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.293764 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/16a87e53-5015-4e1b-bcd9-7dd81a4f6456-srv-cert\") pod \"olm-operator-6b444d44fb-ltjn9\" (UID: \"16a87e53-5015-4e1b-bcd9-7dd81a4f6456\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-ltjn9" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.293785 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m4qkv\" (UniqueName: \"kubernetes.io/projected/f5fb0f27-68dd-4280-84f5-b46851d6ab96-kube-api-access-m4qkv\") pod \"authentication-operator-69f744f599-rl7tp\" (UID: \"f5fb0f27-68dd-4280-84f5-b46851d6ab96\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-rl7tp" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.293806 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/74fbc42d-490e-49dc-9a95-cff2e3b6d1f6-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-dpzj6\" (UID: \"74fbc42d-490e-49dc-9a95-cff2e3b6d1f6\") " 
pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-dpzj6" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.293828 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/62e49d0b-dfc1-48ab-bed7-7ba7fe8a4475-audit-dir\") pod \"apiserver-76f77b778f-l8jxj\" (UID: \"62e49d0b-dfc1-48ab-bed7-7ba7fe8a4475\") " pod="openshift-apiserver/apiserver-76f77b778f-l8jxj" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.293847 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0f3faf8f-f49a-4e99-93cc-50eecd2d2c3c-serving-cert\") pod \"controller-manager-879f6c89f-ncpc5\" (UID: \"0f3faf8f-f49a-4e99-93cc-50eecd2d2c3c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-ncpc5" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.294033 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d2d7cc79-bb68-4db2-9e5a-edd00436b08e-config\") pod \"service-ca-operator-777779d784-bcj7g\" (UID: \"d2d7cc79-bb68-4db2-9e5a-edd00436b08e\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-bcj7g" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.294054 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/62e49d0b-dfc1-48ab-bed7-7ba7fe8a4475-trusted-ca-bundle\") pod \"apiserver-76f77b778f-l8jxj\" (UID: \"62e49d0b-dfc1-48ab-bed7-7ba7fe8a4475\") " pod="openshift-apiserver/apiserver-76f77b778f-l8jxj" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.294057 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/62e49d0b-dfc1-48ab-bed7-7ba7fe8a4475-audit-dir\") pod \"apiserver-76f77b778f-l8jxj\" (UID: 
\"62e49d0b-dfc1-48ab-bed7-7ba7fe8a4475\") " pod="openshift-apiserver/apiserver-76f77b778f-l8jxj" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.294093 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/fc2b2079-7189-4ca3-b398-2a1146b9c70f-images\") pod \"machine-api-operator-5694c8668f-56b7z\" (UID: \"fc2b2079-7189-4ca3-b398-2a1146b9c70f\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-56b7z" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.294113 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nf5jw\" (UniqueName: \"kubernetes.io/projected/62e49d0b-dfc1-48ab-bed7-7ba7fe8a4475-kube-api-access-nf5jw\") pod \"apiserver-76f77b778f-l8jxj\" (UID: \"62e49d0b-dfc1-48ab-bed7-7ba7fe8a4475\") " pod="openshift-apiserver/apiserver-76f77b778f-l8jxj" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.294141 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/35d5435d-5c5f-4c5c-bca6-d2f5a89c25d8-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-5tppx\" (UID: \"35d5435d-5c5f-4c5c-bca6-d2f5a89c25d8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-5tppx" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.294163 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d7p6q\" (UniqueName: \"kubernetes.io/projected/f39d8cd3-a63e-4aeb-9609-fe4c8ed6372b-kube-api-access-d7p6q\") pod \"downloads-7954f5f757-9ch2w\" (UID: \"f39d8cd3-a63e-4aeb-9609-fe4c8ed6372b\") " pod="openshift-console/downloads-7954f5f757-9ch2w" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.292992 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-9ch2w"] Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 
09:10:29.294241 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/62e49d0b-dfc1-48ab-bed7-7ba7fe8a4475-etcd-serving-ca\") pod \"apiserver-76f77b778f-l8jxj\" (UID: \"62e49d0b-dfc1-48ab-bed7-7ba7fe8a4475\") " pod="openshift-apiserver/apiserver-76f77b778f-l8jxj" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.294262 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/adf7cc5f-5027-4382-bac8-ed4f459fe424-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-tl5jf\" (UID: \"adf7cc5f-5027-4382-bac8-ed4f459fe424\") " pod="openshift-authentication/oauth-openshift-558db77b4-tl5jf" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.294281 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/530ed995-d9d1-4aff-93b9-9b2a35194cd2-etcd-ca\") pod \"etcd-operator-b45778765-dgsw8\" (UID: \"530ed995-d9d1-4aff-93b9-9b2a35194cd2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-dgsw8" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.294302 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/43baf1d3-85f7-4ce5-9650-260f0803cdab-signing-cabundle\") pod \"service-ca-9c57cc56f-7lxp2\" (UID: \"43baf1d3-85f7-4ce5-9650-260f0803cdab\") " pod="openshift-service-ca/service-ca-9c57cc56f-7lxp2" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.294320 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/894f7c69-0119-4c19-b205-9780fb52b06e-oauth-serving-cert\") pod \"console-f9d7485db-jh5pl\" (UID: \"894f7c69-0119-4c19-b205-9780fb52b06e\") " 
pod="openshift-console/console-f9d7485db-jh5pl" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.294338 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g5sx2\" (UniqueName: \"kubernetes.io/projected/d3c17466-62af-4a4b-bc43-a4eda5d974dd-kube-api-access-g5sx2\") pod \"openshift-config-operator-7777fb866f-49k84\" (UID: \"d3c17466-62af-4a4b-bc43-a4eda5d974dd\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-49k84" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.294356 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/83d49252-7752-48bc-86b6-c604984cd533-default-certificate\") pod \"router-default-5444994796-kcnhk\" (UID: \"83d49252-7752-48bc-86b6-c604984cd533\") " pod="openshift-ingress/router-default-5444994796-kcnhk" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.297375 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/530ed995-d9d1-4aff-93b9-9b2a35194cd2-etcd-ca\") pod \"etcd-operator-b45778765-dgsw8\" (UID: \"530ed995-d9d1-4aff-93b9-9b2a35194cd2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-dgsw8" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.297505 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/83d49252-7752-48bc-86b6-c604984cd533-service-ca-bundle\") pod \"router-default-5444994796-kcnhk\" (UID: \"83d49252-7752-48bc-86b6-c604984cd533\") " pod="openshift-ingress/router-default-5444994796-kcnhk" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.297540 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/c697ae9b-78ed-4463-9a32-6e6bd1593d70-metrics-tls\") pod \"ingress-operator-5b745b69d9-2v6nv\" (UID: \"c697ae9b-78ed-4463-9a32-6e6bd1593d70\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-2v6nv" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.297570 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/adf7cc5f-5027-4382-bac8-ed4f459fe424-audit-policies\") pod \"oauth-openshift-558db77b4-tl5jf\" (UID: \"adf7cc5f-5027-4382-bac8-ed4f459fe424\") " pod="openshift-authentication/oauth-openshift-558db77b4-tl5jf" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.297594 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pj5dk\" (UniqueName: \"kubernetes.io/projected/e4caea97-6238-4185-9e61-5a40aa699205-kube-api-access-pj5dk\") pod \"cluster-samples-operator-665b6dd947-tw44d\" (UID: \"e4caea97-6238-4185-9e61-5a40aa699205\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tw44d" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.297656 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/fc2b2079-7189-4ca3-b398-2a1146b9c70f-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-56b7z\" (UID: \"fc2b2079-7189-4ca3-b398-2a1146b9c70f\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-56b7z" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.297684 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2012f11e-ed51-4453-8f96-47269823dded-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-znzvh\" (UID: \"2012f11e-ed51-4453-8f96-47269823dded\") " 
pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-znzvh" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.297709 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/adf7cc5f-5027-4382-bac8-ed4f459fe424-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-tl5jf\" (UID: \"adf7cc5f-5027-4382-bac8-ed4f459fe424\") " pod="openshift-authentication/oauth-openshift-558db77b4-tl5jf" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.297733 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ea625b31-78ac-4c2b-8f73-3e5c74894fce-serving-cert\") pod \"route-controller-manager-6576b87f9c-7f29j\" (UID: \"ea625b31-78ac-4c2b-8f73-3e5c74894fce\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7f29j" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.297751 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k797x\" (UniqueName: \"kubernetes.io/projected/2012f11e-ed51-4453-8f96-47269823dded-kube-api-access-k797x\") pod \"openshift-controller-manager-operator-756b6f6bc6-znzvh\" (UID: \"2012f11e-ed51-4453-8f96-47269823dded\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-znzvh" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.297778 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-krplw\" (UniqueName: \"kubernetes.io/projected/894f7c69-0119-4c19-b205-9780fb52b06e-kube-api-access-krplw\") pod \"console-f9d7485db-jh5pl\" (UID: \"894f7c69-0119-4c19-b205-9780fb52b06e\") " pod="openshift-console/console-f9d7485db-jh5pl" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.297802 4792 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/aadae5cd-e840-4618-b021-d8ca0e9169bd-config-volume\") pod \"collect-profiles-29550780-48757\" (UID: \"aadae5cd-e840-4618-b021-d8ca0e9169bd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550780-48757" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.297958 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tw44d"] Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.297988 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-zv9nb"] Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.298769 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-zv9nb" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.299353 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/adf7cc5f-5027-4382-bac8-ed4f459fe424-audit-policies\") pod \"oauth-openshift-558db77b4-tl5jf\" (UID: \"adf7cc5f-5027-4382-bac8-ed4f459fe424\") " pod="openshift-authentication/oauth-openshift-558db77b4-tl5jf" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.301560 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/7c5109b5-4292-437f-8524-bb7d35147a71-etcd-client\") pod \"apiserver-7bbb656c7d-ngdp7\" (UID: \"7c5109b5-4292-437f-8524-bb7d35147a71\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ngdp7" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.301593 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sn4bc\" (UniqueName: \"kubernetes.io/projected/f4a19004-24a2-4825-bca2-24d98c3d69cc-kube-api-access-sn4bc\") 
pod \"console-operator-58897d9998-xllzt\" (UID: \"f4a19004-24a2-4825-bca2-24d98c3d69cc\") " pod="openshift-console-operator/console-operator-58897d9998-xllzt" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.301618 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/74fbc42d-490e-49dc-9a95-cff2e3b6d1f6-config\") pod \"kube-apiserver-operator-766d6c64bb-dpzj6\" (UID: \"74fbc42d-490e-49dc-9a95-cff2e3b6d1f6\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-dpzj6" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.301641 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p4ftr\" (UniqueName: \"kubernetes.io/projected/5e17c213-e2c1-4625-8487-2bdb28b0224d-kube-api-access-p4ftr\") pod \"multus-admission-controller-857f4d67dd-m7bwm\" (UID: \"5e17c213-e2c1-4625-8487-2bdb28b0224d\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-m7bwm" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.302134 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fc2b2079-7189-4ca3-b398-2a1146b9c70f-config\") pod \"machine-api-operator-5694c8668f-56b7z\" (UID: \"fc2b2079-7189-4ca3-b398-2a1146b9c70f\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-56b7z" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.302259 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-ncpc5"] Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.302298 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-rl7tp"] Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.302747 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" 
(UniqueName: \"kubernetes.io/configmap/fc2b2079-7189-4ca3-b398-2a1146b9c70f-images\") pod \"machine-api-operator-5694c8668f-56b7z\" (UID: \"fc2b2079-7189-4ca3-b398-2a1146b9c70f\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-56b7z" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.304905 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/d3c17466-62af-4a4b-bc43-a4eda5d974dd-available-featuregates\") pod \"openshift-config-operator-7777fb866f-49k84\" (UID: \"d3c17466-62af-4a4b-bc43-a4eda5d974dd\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-49k84" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.304937 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/43baf1d3-85f7-4ce5-9650-260f0803cdab-signing-key\") pod \"service-ca-9c57cc56f-7lxp2\" (UID: \"43baf1d3-85f7-4ce5-9650-260f0803cdab\") " pod="openshift-service-ca/service-ca-9c57cc56f-7lxp2" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.304978 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0f3faf8f-f49a-4e99-93cc-50eecd2d2c3c-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-ncpc5\" (UID: \"0f3faf8f-f49a-4e99-93cc-50eecd2d2c3c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-ncpc5" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.304999 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c697ae9b-78ed-4463-9a32-6e6bd1593d70-bound-sa-token\") pod \"ingress-operator-5b745b69d9-2v6nv\" (UID: \"c697ae9b-78ed-4463-9a32-6e6bd1593d70\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-2v6nv" Mar 09 09:10:29 
crc kubenswrapper[4792]: I0309 09:10:29.305022 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/74fbc42d-490e-49dc-9a95-cff2e3b6d1f6-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-dpzj6\" (UID: \"74fbc42d-490e-49dc-9a95-cff2e3b6d1f6\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-dpzj6" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.305093 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zkrdg\" (UniqueName: \"kubernetes.io/projected/fc2b2079-7189-4ca3-b398-2a1146b9c70f-kube-api-access-zkrdg\") pod \"machine-api-operator-5694c8668f-56b7z\" (UID: \"fc2b2079-7189-4ca3-b398-2a1146b9c70f\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-56b7z" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.305116 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7c5109b5-4292-437f-8524-bb7d35147a71-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-ngdp7\" (UID: \"7c5109b5-4292-437f-8524-bb7d35147a71\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ngdp7" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.305136 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f3faf8f-f49a-4e99-93cc-50eecd2d2c3c-config\") pod \"controller-manager-879f6c89f-ncpc5\" (UID: \"0f3faf8f-f49a-4e99-93cc-50eecd2d2c3c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-ncpc5" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.305158 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/83d49252-7752-48bc-86b6-c604984cd533-stats-auth\") pod \"router-default-5444994796-kcnhk\" (UID: 
\"83d49252-7752-48bc-86b6-c604984cd533\") " pod="openshift-ingress/router-default-5444994796-kcnhk" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.305182 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/62e49d0b-dfc1-48ab-bed7-7ba7fe8a4475-trusted-ca-bundle\") pod \"apiserver-76f77b778f-l8jxj\" (UID: \"62e49d0b-dfc1-48ab-bed7-7ba7fe8a4475\") " pod="openshift-apiserver/apiserver-76f77b778f-l8jxj" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.305181 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/35d5435d-5c5f-4c5c-bca6-d2f5a89c25d8-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-5tppx\" (UID: \"35d5435d-5c5f-4c5c-bca6-d2f5a89c25d8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-5tppx" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.305220 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/aadae5cd-e840-4618-b021-d8ca0e9169bd-secret-volume\") pod \"collect-profiles-29550780-48757\" (UID: \"aadae5cd-e840-4618-b021-d8ca0e9169bd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550780-48757" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.305243 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/62e49d0b-dfc1-48ab-bed7-7ba7fe8a4475-config\") pod \"apiserver-76f77b778f-l8jxj\" (UID: \"62e49d0b-dfc1-48ab-bed7-7ba7fe8a4475\") " pod="openshift-apiserver/apiserver-76f77b778f-l8jxj" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.305266 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-flhdw\" (UniqueName: 
\"kubernetes.io/projected/d2d7cc79-bb68-4db2-9e5a-edd00436b08e-kube-api-access-flhdw\") pod \"service-ca-operator-777779d784-bcj7g\" (UID: \"d2d7cc79-bb68-4db2-9e5a-edd00436b08e\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-bcj7g" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.305289 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/62e49d0b-dfc1-48ab-bed7-7ba7fe8a4475-etcd-client\") pod \"apiserver-76f77b778f-l8jxj\" (UID: \"62e49d0b-dfc1-48ab-bed7-7ba7fe8a4475\") " pod="openshift-apiserver/apiserver-76f77b778f-l8jxj" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.305310 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/530ed995-d9d1-4aff-93b9-9b2a35194cd2-config\") pod \"etcd-operator-b45778765-dgsw8\" (UID: \"530ed995-d9d1-4aff-93b9-9b2a35194cd2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-dgsw8" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.305332 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f5fb0f27-68dd-4280-84f5-b46851d6ab96-config\") pod \"authentication-operator-69f744f599-rl7tp\" (UID: \"f5fb0f27-68dd-4280-84f5-b46851d6ab96\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-rl7tp" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.305400 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vnvxt\" (UniqueName: \"kubernetes.io/projected/c952f4ee-a8f1-4c7a-bfd4-0da6e1f0ff7c-kube-api-access-vnvxt\") pod \"machine-config-operator-74547568cd-7p469\" (UID: \"c952f4ee-a8f1-4c7a-bfd4-0da6e1f0ff7c\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7p469" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 
09:10:29.305419 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/432c0903-faf2-4367-b378-730898f2dfcc-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-m9545\" (UID: \"432c0903-faf2-4367-b378-730898f2dfcc\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-m9545" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.305440 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/894f7c69-0119-4c19-b205-9780fb52b06e-service-ca\") pod \"console-f9d7485db-jh5pl\" (UID: \"894f7c69-0119-4c19-b205-9780fb52b06e\") " pod="openshift-console/console-f9d7485db-jh5pl" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.305460 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/62e49d0b-dfc1-48ab-bed7-7ba7fe8a4475-audit\") pod \"apiserver-76f77b778f-l8jxj\" (UID: \"62e49d0b-dfc1-48ab-bed7-7ba7fe8a4475\") " pod="openshift-apiserver/apiserver-76f77b778f-l8jxj" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.305482 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/560a23b3-6492-45d0-b273-df4e85bd9787-config\") pod \"machine-approver-56656f9798-8nr7h\" (UID: \"560a23b3-6492-45d0-b273-df4e85bd9787\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8nr7h" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.305502 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/e4caea97-6238-4185-9e61-5a40aa699205-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-tw44d\" (UID: \"e4caea97-6238-4185-9e61-5a40aa699205\") " 
pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tw44d" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.305523 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2012f11e-ed51-4453-8f96-47269823dded-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-znzvh\" (UID: \"2012f11e-ed51-4453-8f96-47269823dded\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-znzvh" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.305543 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c952f4ee-a8f1-4c7a-bfd4-0da6e1f0ff7c-auth-proxy-config\") pod \"machine-config-operator-74547568cd-7p469\" (UID: \"c952f4ee-a8f1-4c7a-bfd4-0da6e1f0ff7c\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7p469" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.305564 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ea625b31-78ac-4c2b-8f73-3e5c74894fce-config\") pod \"route-controller-manager-6576b87f9c-7f29j\" (UID: \"ea625b31-78ac-4c2b-8f73-3e5c74894fce\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7f29j" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.305585 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/894f7c69-0119-4c19-b205-9780fb52b06e-console-serving-cert\") pod \"console-f9d7485db-jh5pl\" (UID: \"894f7c69-0119-4c19-b205-9780fb52b06e\") " pod="openshift-console/console-f9d7485db-jh5pl" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.305607 4792 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/894f7c69-0119-4c19-b205-9780fb52b06e-trusted-ca-bundle\") pod \"console-f9d7485db-jh5pl\" (UID: \"894f7c69-0119-4c19-b205-9780fb52b06e\") " pod="openshift-console/console-f9d7485db-jh5pl" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.305706 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/35d5435d-5c5f-4c5c-bca6-d2f5a89c25d8-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-5tppx\" (UID: \"35d5435d-5c5f-4c5c-bca6-d2f5a89c25d8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-5tppx" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.305729 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/adf7cc5f-5027-4382-bac8-ed4f459fe424-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-tl5jf\" (UID: \"adf7cc5f-5027-4382-bac8-ed4f459fe424\") " pod="openshift-authentication/oauth-openshift-558db77b4-tl5jf" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.305749 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f4a19004-24a2-4825-bca2-24d98c3d69cc-config\") pod \"console-operator-58897d9998-xllzt\" (UID: \"f4a19004-24a2-4825-bca2-24d98c3d69cc\") " pod="openshift-console-operator/console-operator-58897d9998-xllzt" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.305768 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0f3faf8f-f49a-4e99-93cc-50eecd2d2c3c-client-ca\") pod \"controller-manager-879f6c89f-ncpc5\" (UID: \"0f3faf8f-f49a-4e99-93cc-50eecd2d2c3c\") " 
pod="openshift-controller-manager/controller-manager-879f6c89f-ncpc5" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.305788 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/62e49d0b-dfc1-48ab-bed7-7ba7fe8a4475-serving-cert\") pod \"apiserver-76f77b778f-l8jxj\" (UID: \"62e49d0b-dfc1-48ab-bed7-7ba7fe8a4475\") " pod="openshift-apiserver/apiserver-76f77b778f-l8jxj" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.305821 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/83d49252-7752-48bc-86b6-c604984cd533-metrics-certs\") pod \"router-default-5444994796-kcnhk\" (UID: \"83d49252-7752-48bc-86b6-c604984cd533\") " pod="openshift-ingress/router-default-5444994796-kcnhk" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.305843 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/7c5109b5-4292-437f-8524-bb7d35147a71-encryption-config\") pod \"apiserver-7bbb656c7d-ngdp7\" (UID: \"7c5109b5-4292-437f-8524-bb7d35147a71\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ngdp7" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.305862 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/530ed995-d9d1-4aff-93b9-9b2a35194cd2-etcd-service-ca\") pod \"etcd-operator-b45778765-dgsw8\" (UID: \"530ed995-d9d1-4aff-93b9-9b2a35194cd2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-dgsw8" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.305882 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f5fb0f27-68dd-4280-84f5-b46851d6ab96-serving-cert\") pod 
\"authentication-operator-69f744f599-rl7tp\" (UID: \"f5fb0f27-68dd-4280-84f5-b46851d6ab96\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-rl7tp" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.305905 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/adf7cc5f-5027-4382-bac8-ed4f459fe424-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-tl5jf\" (UID: \"adf7cc5f-5027-4382-bac8-ed4f459fe424\") " pod="openshift-authentication/oauth-openshift-558db77b4-tl5jf" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.305935 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d2d7cc79-bb68-4db2-9e5a-edd00436b08e-serving-cert\") pod \"service-ca-operator-777779d784-bcj7g\" (UID: \"d2d7cc79-bb68-4db2-9e5a-edd00436b08e\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-bcj7g" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.305956 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/35d5435d-5c5f-4c5c-bca6-d2f5a89c25d8-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-5tppx\" (UID: \"35d5435d-5c5f-4c5c-bca6-d2f5a89c25d8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-5tppx" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.305974 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f4a19004-24a2-4825-bca2-24d98c3d69cc-trusted-ca\") pod \"console-operator-58897d9998-xllzt\" (UID: \"f4a19004-24a2-4825-bca2-24d98c3d69cc\") " pod="openshift-console-operator/console-operator-58897d9998-xllzt" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.305996 4792 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ea625b31-78ac-4c2b-8f73-3e5c74894fce-client-ca\") pod \"route-controller-manager-6576b87f9c-7f29j\" (UID: \"ea625b31-78ac-4c2b-8f73-3e5c74894fce\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7f29j" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.306016 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xd8kq\" (UniqueName: \"kubernetes.io/projected/43baf1d3-85f7-4ce5-9650-260f0803cdab-kube-api-access-xd8kq\") pod \"service-ca-9c57cc56f-7lxp2\" (UID: \"43baf1d3-85f7-4ce5-9650-260f0803cdab\") " pod="openshift-service-ca/service-ca-9c57cc56f-7lxp2" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.306037 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k66nz\" (UniqueName: \"kubernetes.io/projected/ea625b31-78ac-4c2b-8f73-3e5c74894fce-kube-api-access-k66nz\") pod \"route-controller-manager-6576b87f9c-7f29j\" (UID: \"ea625b31-78ac-4c2b-8f73-3e5c74894fce\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7f29j" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.306058 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mn88s\" (UniqueName: \"kubernetes.io/projected/d3934c8c-f197-4ef6-ac5c-76560a192e50-kube-api-access-mn88s\") pod \"auto-csr-approver-29550790-q6wbp\" (UID: \"d3934c8c-f197-4ef6-ac5c-76560a192e50\") " pod="openshift-infra/auto-csr-approver-29550790-q6wbp" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.306282 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f3faf8f-f49a-4e99-93cc-50eecd2d2c3c-config\") pod \"controller-manager-879f6c89f-ncpc5\" (UID: 
\"0f3faf8f-f49a-4e99-93cc-50eecd2d2c3c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-ncpc5" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.306354 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-g7rwm"] Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.306374 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0f3faf8f-f49a-4e99-93cc-50eecd2d2c3c-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-ncpc5\" (UID: \"0f3faf8f-f49a-4e99-93cc-50eecd2d2c3c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-ncpc5" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.306701 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/62e49d0b-dfc1-48ab-bed7-7ba7fe8a4475-config\") pod \"apiserver-76f77b778f-l8jxj\" (UID: \"62e49d0b-dfc1-48ab-bed7-7ba7fe8a4475\") " pod="openshift-apiserver/apiserver-76f77b778f-l8jxj" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.307269 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-49k84"] Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.307406 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-g7rwm" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.307971 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f4a19004-24a2-4825-bca2-24d98c3d69cc-config\") pod \"console-operator-58897d9998-xllzt\" (UID: \"f4a19004-24a2-4825-bca2-24d98c3d69cc\") " pod="openshift-console-operator/console-operator-58897d9998-xllzt" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.308635 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/7c5109b5-4292-437f-8524-bb7d35147a71-etcd-client\") pod \"apiserver-7bbb656c7d-ngdp7\" (UID: \"7c5109b5-4292-437f-8524-bb7d35147a71\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ngdp7" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.308698 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/adf7cc5f-5027-4382-bac8-ed4f459fe424-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-tl5jf\" (UID: \"adf7cc5f-5027-4382-bac8-ed4f459fe424\") " pod="openshift-authentication/oauth-openshift-558db77b4-tl5jf" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.308759 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/adf7cc5f-5027-4382-bac8-ed4f459fe424-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-tl5jf\" (UID: \"adf7cc5f-5027-4382-bac8-ed4f459fe424\") " pod="openshift-authentication/oauth-openshift-558db77b4-tl5jf" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.309133 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0f3faf8f-f49a-4e99-93cc-50eecd2d2c3c-serving-cert\") pod 
\"controller-manager-879f6c89f-ncpc5\" (UID: \"0f3faf8f-f49a-4e99-93cc-50eecd2d2c3c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-ncpc5" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.309925 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-2v6nv"] Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.320093 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0f3faf8f-f49a-4e99-93cc-50eecd2d2c3c-client-ca\") pod \"controller-manager-879f6c89f-ncpc5\" (UID: \"0f3faf8f-f49a-4e99-93cc-50eecd2d2c3c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-ncpc5" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.311081 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/62e49d0b-dfc1-48ab-bed7-7ba7fe8a4475-etcd-serving-ca\") pod \"apiserver-76f77b778f-l8jxj\" (UID: \"62e49d0b-dfc1-48ab-bed7-7ba7fe8a4475\") " pod="openshift-apiserver/apiserver-76f77b778f-l8jxj" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.313045 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/530ed995-d9d1-4aff-93b9-9b2a35194cd2-config\") pod \"etcd-operator-b45778765-dgsw8\" (UID: \"530ed995-d9d1-4aff-93b9-9b2a35194cd2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-dgsw8" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.313220 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f4a19004-24a2-4825-bca2-24d98c3d69cc-serving-cert\") pod \"console-operator-58897d9998-xllzt\" (UID: \"f4a19004-24a2-4825-bca2-24d98c3d69cc\") " pod="openshift-console-operator/console-operator-58897d9998-xllzt" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.313612 4792 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/62e49d0b-dfc1-48ab-bed7-7ba7fe8a4475-audit\") pod \"apiserver-76f77b778f-l8jxj\" (UID: \"62e49d0b-dfc1-48ab-bed7-7ba7fe8a4475\") " pod="openshift-apiserver/apiserver-76f77b778f-l8jxj" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.313729 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/adf7cc5f-5027-4382-bac8-ed4f459fe424-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-tl5jf\" (UID: \"adf7cc5f-5027-4382-bac8-ed4f459fe424\") " pod="openshift-authentication/oauth-openshift-558db77b4-tl5jf" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.314024 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/530ed995-d9d1-4aff-93b9-9b2a35194cd2-etcd-client\") pod \"etcd-operator-b45778765-dgsw8\" (UID: \"530ed995-d9d1-4aff-93b9-9b2a35194cd2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-dgsw8" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.314039 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/560a23b3-6492-45d0-b273-df4e85bd9787-config\") pod \"machine-approver-56656f9798-8nr7h\" (UID: \"560a23b3-6492-45d0-b273-df4e85bd9787\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8nr7h" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.314271 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.314789 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: 
\"kubernetes.io/secret/16a87e53-5015-4e1b-bcd9-7dd81a4f6456-profile-collector-cert\") pod \"olm-operator-6b444d44fb-ltjn9\" (UID: \"16a87e53-5015-4e1b-bcd9-7dd81a4f6456\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-ltjn9" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.315127 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/62e49d0b-dfc1-48ab-bed7-7ba7fe8a4475-serving-cert\") pod \"apiserver-76f77b778f-l8jxj\" (UID: \"62e49d0b-dfc1-48ab-bed7-7ba7fe8a4475\") " pod="openshift-apiserver/apiserver-76f77b778f-l8jxj" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.316003 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/adf7cc5f-5027-4382-bac8-ed4f459fe424-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-tl5jf\" (UID: \"adf7cc5f-5027-4382-bac8-ed4f459fe424\") " pod="openshift-authentication/oauth-openshift-558db77b4-tl5jf" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.316698 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/530ed995-d9d1-4aff-93b9-9b2a35194cd2-etcd-service-ca\") pod \"etcd-operator-b45778765-dgsw8\" (UID: \"530ed995-d9d1-4aff-93b9-9b2a35194cd2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-dgsw8" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.316977 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/adf7cc5f-5027-4382-bac8-ed4f459fe424-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-tl5jf\" (UID: \"adf7cc5f-5027-4382-bac8-ed4f459fe424\") " pod="openshift-authentication/oauth-openshift-558db77b4-tl5jf" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.317489 4792 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f4a19004-24a2-4825-bca2-24d98c3d69cc-trusted-ca\") pod \"console-operator-58897d9998-xllzt\" (UID: \"f4a19004-24a2-4825-bca2-24d98c3d69cc\") " pod="openshift-console-operator/console-operator-58897d9998-xllzt" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.317962 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/adf7cc5f-5027-4382-bac8-ed4f459fe424-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-tl5jf\" (UID: \"adf7cc5f-5027-4382-bac8-ed4f459fe424\") " pod="openshift-authentication/oauth-openshift-558db77b4-tl5jf" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.318471 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/7c5109b5-4292-437f-8524-bb7d35147a71-encryption-config\") pod \"apiserver-7bbb656c7d-ngdp7\" (UID: \"7c5109b5-4292-437f-8524-bb7d35147a71\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ngdp7" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.318841 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e37d381c-62ae-489d-888d-320a3f7959cc-metrics-tls\") pod \"dns-operator-744455d44c-p5kgd\" (UID: \"e37d381c-62ae-489d-888d-320a3f7959cc\") " pod="openshift-dns-operator/dns-operator-744455d44c-p5kgd" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.318982 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/e4caea97-6238-4185-9e61-5a40aa699205-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-tw44d\" (UID: \"e4caea97-6238-4185-9e61-5a40aa699205\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tw44d" Mar 09 
09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.320279 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-7f29j"] Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.325680 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-nz8fs"] Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.325822 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-dgsw8"] Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.325991 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-7lxp2"] Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.310762 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7c5109b5-4292-437f-8524-bb7d35147a71-serving-cert\") pod \"apiserver-7bbb656c7d-ngdp7\" (UID: \"7c5109b5-4292-437f-8524-bb7d35147a71\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ngdp7" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.321411 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/35d5435d-5c5f-4c5c-bca6-d2f5a89c25d8-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-5tppx\" (UID: \"35d5435d-5c5f-4c5c-bca6-d2f5a89c25d8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-5tppx" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.323815 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xj29r\" (UniqueName: \"kubernetes.io/projected/35d5435d-5c5f-4c5c-bca6-d2f5a89c25d8-kube-api-access-xj29r\") pod \"cluster-image-registry-operator-dc59b4c8b-5tppx\" (UID: \"35d5435d-5c5f-4c5c-bca6-d2f5a89c25d8\") " 
pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-5tppx" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.326427 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/62e49d0b-dfc1-48ab-bed7-7ba7fe8a4475-etcd-client\") pod \"apiserver-76f77b778f-l8jxj\" (UID: \"62e49d0b-dfc1-48ab-bed7-7ba7fe8a4475\") " pod="openshift-apiserver/apiserver-76f77b778f-l8jxj" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.326535 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/62e49d0b-dfc1-48ab-bed7-7ba7fe8a4475-image-import-ca\") pod \"apiserver-76f77b778f-l8jxj\" (UID: \"62e49d0b-dfc1-48ab-bed7-7ba7fe8a4475\") " pod="openshift-apiserver/apiserver-76f77b778f-l8jxj" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.326723 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.326840 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/adf7cc5f-5027-4382-bac8-ed4f459fe424-audit-dir\") pod \"oauth-openshift-558db77b4-tl5jf\" (UID: \"adf7cc5f-5027-4382-bac8-ed4f459fe424\") " pod="openshift-authentication/oauth-openshift-558db77b4-tl5jf" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.326727 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/adf7cc5f-5027-4382-bac8-ed4f459fe424-audit-dir\") pod \"oauth-openshift-558db77b4-tl5jf\" (UID: \"adf7cc5f-5027-4382-bac8-ed4f459fe424\") " pod="openshift-authentication/oauth-openshift-558db77b4-tl5jf" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.327391 4792 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/432c0903-faf2-4367-b378-730898f2dfcc-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-m9545\" (UID: \"432c0903-faf2-4367-b378-730898f2dfcc\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-m9545" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.327589 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/7c5109b5-4292-437f-8524-bb7d35147a71-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-ngdp7\" (UID: \"7c5109b5-4292-437f-8524-bb7d35147a71\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ngdp7" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.327687 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/62e49d0b-dfc1-48ab-bed7-7ba7fe8a4475-encryption-config\") pod \"apiserver-76f77b778f-l8jxj\" (UID: \"62e49d0b-dfc1-48ab-bed7-7ba7fe8a4475\") " pod="openshift-apiserver/apiserver-76f77b778f-l8jxj" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.327864 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/adf7cc5f-5027-4382-bac8-ed4f459fe424-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-tl5jf\" (UID: \"adf7cc5f-5027-4382-bac8-ed4f459fe424\") " pod="openshift-authentication/oauth-openshift-558db77b4-tl5jf" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.330311 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/894f7c69-0119-4c19-b205-9780fb52b06e-console-config\") pod \"console-f9d7485db-jh5pl\" (UID: \"894f7c69-0119-4c19-b205-9780fb52b06e\") " 
pod="openshift-console/console-f9d7485db-jh5pl" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.328653 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/7c5109b5-4292-437f-8524-bb7d35147a71-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-ngdp7\" (UID: \"7c5109b5-4292-437f-8524-bb7d35147a71\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ngdp7" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.330793 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-slv6j\" (UniqueName: \"kubernetes.io/projected/aadae5cd-e840-4618-b021-d8ca0e9169bd-kube-api-access-slv6j\") pod \"collect-profiles-29550780-48757\" (UID: \"aadae5cd-e840-4618-b021-d8ca0e9169bd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550780-48757" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.329901 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/62e49d0b-dfc1-48ab-bed7-7ba7fe8a4475-image-import-ca\") pod \"apiserver-76f77b778f-l8jxj\" (UID: \"62e49d0b-dfc1-48ab-bed7-7ba7fe8a4475\") " pod="openshift-apiserver/apiserver-76f77b778f-l8jxj" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.331532 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wvkrg\" (UniqueName: \"kubernetes.io/projected/adf7cc5f-5027-4382-bac8-ed4f459fe424-kube-api-access-wvkrg\") pod \"oauth-openshift-558db77b4-tl5jf\" (UID: \"adf7cc5f-5027-4382-bac8-ed4f459fe424\") " pod="openshift-authentication/oauth-openshift-558db77b4-tl5jf" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.331764 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/530ed995-d9d1-4aff-93b9-9b2a35194cd2-serving-cert\") pod \"etcd-operator-b45778765-dgsw8\" 
(UID: \"530ed995-d9d1-4aff-93b9-9b2a35194cd2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-dgsw8" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.331903 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/adf7cc5f-5027-4382-bac8-ed4f459fe424-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-tl5jf\" (UID: \"adf7cc5f-5027-4382-bac8-ed4f459fe424\") " pod="openshift-authentication/oauth-openshift-558db77b4-tl5jf" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.332158 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/adf7cc5f-5027-4382-bac8-ed4f459fe424-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-tl5jf\" (UID: \"adf7cc5f-5027-4382-bac8-ed4f459fe424\") " pod="openshift-authentication/oauth-openshift-558db77b4-tl5jf" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.332377 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/62e49d0b-dfc1-48ab-bed7-7ba7fe8a4475-node-pullsecrets\") pod \"apiserver-76f77b778f-l8jxj\" (UID: \"62e49d0b-dfc1-48ab-bed7-7ba7fe8a4475\") " pod="openshift-apiserver/apiserver-76f77b778f-l8jxj" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.332641 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g45ns\" (UniqueName: \"kubernetes.io/projected/e37d381c-62ae-489d-888d-320a3f7959cc-kube-api-access-g45ns\") pod \"dns-operator-744455d44c-p5kgd\" (UID: \"e37d381c-62ae-489d-888d-320a3f7959cc\") " pod="openshift-dns-operator/dns-operator-744455d44c-p5kgd" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.332673 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/c697ae9b-78ed-4463-9a32-6e6bd1593d70-trusted-ca\") pod \"ingress-operator-5b745b69d9-2v6nv\" (UID: \"c697ae9b-78ed-4463-9a32-6e6bd1593d70\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-2v6nv" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.332697 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/5e17c213-e2c1-4625-8487-2bdb28b0224d-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-m7bwm\" (UID: \"5e17c213-e2c1-4625-8487-2bdb28b0224d\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-m7bwm" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.332726 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/7c5109b5-4292-437f-8524-bb7d35147a71-audit-policies\") pod \"apiserver-7bbb656c7d-ngdp7\" (UID: \"7c5109b5-4292-437f-8524-bb7d35147a71\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ngdp7" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.332752 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c8j8z\" (UniqueName: \"kubernetes.io/projected/16a87e53-5015-4e1b-bcd9-7dd81a4f6456-kube-api-access-c8j8z\") pod \"olm-operator-6b444d44fb-ltjn9\" (UID: \"16a87e53-5015-4e1b-bcd9-7dd81a4f6456\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-ltjn9" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.332781 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/adf7cc5f-5027-4382-bac8-ed4f459fe424-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-tl5jf\" (UID: \"adf7cc5f-5027-4382-bac8-ed4f459fe424\") " pod="openshift-authentication/oauth-openshift-558db77b4-tl5jf" Mar 09 09:10:29 crc 
kubenswrapper[4792]: I0309 09:10:29.332812 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f5fb0f27-68dd-4280-84f5-b46851d6ab96-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-rl7tp\" (UID: \"f5fb0f27-68dd-4280-84f5-b46851d6ab96\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-rl7tp" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.332837 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c952f4ee-a8f1-4c7a-bfd4-0da6e1f0ff7c-proxy-tls\") pod \"machine-config-operator-74547568cd-7p469\" (UID: \"c952f4ee-a8f1-4c7a-bfd4-0da6e1f0ff7c\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7p469" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.332859 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/560a23b3-6492-45d0-b273-df4e85bd9787-auth-proxy-config\") pod \"machine-approver-56656f9798-8nr7h\" (UID: \"560a23b3-6492-45d0-b273-df4e85bd9787\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8nr7h" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.332881 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-brxgd\" (UniqueName: \"kubernetes.io/projected/824d014f-b04c-4304-8911-091172950873-kube-api-access-brxgd\") pod \"migrator-59844c95c7-nz8fs\" (UID: \"824d014f-b04c-4304-8911-091172950873\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-nz8fs" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.332905 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mh6wx\" (UniqueName: 
\"kubernetes.io/projected/530ed995-d9d1-4aff-93b9-9b2a35194cd2-kube-api-access-mh6wx\") pod \"etcd-operator-b45778765-dgsw8\" (UID: \"530ed995-d9d1-4aff-93b9-9b2a35194cd2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-dgsw8" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.332928 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j5g9c\" (UniqueName: \"kubernetes.io/projected/c697ae9b-78ed-4463-9a32-6e6bd1593d70-kube-api-access-j5g9c\") pod \"ingress-operator-5b745b69d9-2v6nv\" (UID: \"c697ae9b-78ed-4463-9a32-6e6bd1593d70\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-2v6nv" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.332951 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d3c17466-62af-4a4b-bc43-a4eda5d974dd-serving-cert\") pod \"openshift-config-operator-7777fb866f-49k84\" (UID: \"d3c17466-62af-4a4b-bc43-a4eda5d974dd\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-49k84" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.332972 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/adf7cc5f-5027-4382-bac8-ed4f459fe424-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-tl5jf\" (UID: \"adf7cc5f-5027-4382-bac8-ed4f459fe424\") " pod="openshift-authentication/oauth-openshift-558db77b4-tl5jf" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.332993 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7c5109b5-4292-437f-8524-bb7d35147a71-audit-dir\") pod \"apiserver-7bbb656c7d-ngdp7\" (UID: \"7c5109b5-4292-437f-8524-bb7d35147a71\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ngdp7" Mar 09 09:10:29 crc kubenswrapper[4792]: 
I0309 09:10:29.333016 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/560a23b3-6492-45d0-b273-df4e85bd9787-machine-approver-tls\") pod \"machine-approver-56656f9798-8nr7h\" (UID: \"560a23b3-6492-45d0-b273-df4e85bd9787\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8nr7h" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.333036 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g6hf6\" (UniqueName: \"kubernetes.io/projected/560a23b3-6492-45d0-b273-df4e85bd9787-kube-api-access-g6hf6\") pod \"machine-approver-56656f9798-8nr7h\" (UID: \"560a23b3-6492-45d0-b273-df4e85bd9787\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8nr7h" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.332607 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/62e49d0b-dfc1-48ab-bed7-7ba7fe8a4475-node-pullsecrets\") pod \"apiserver-76f77b778f-l8jxj\" (UID: \"62e49d0b-dfc1-48ab-bed7-7ba7fe8a4475\") " pod="openshift-apiserver/apiserver-76f77b778f-l8jxj" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.333711 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/560a23b3-6492-45d0-b273-df4e85bd9787-auth-proxy-config\") pod \"machine-approver-56656f9798-8nr7h\" (UID: \"560a23b3-6492-45d0-b273-df4e85bd9787\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8nr7h" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.333804 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/7c5109b5-4292-437f-8524-bb7d35147a71-audit-policies\") pod \"apiserver-7bbb656c7d-ngdp7\" (UID: \"7c5109b5-4292-437f-8524-bb7d35147a71\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ngdp7" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.333823 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-kmnjq"] Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.333868 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7c5109b5-4292-437f-8524-bb7d35147a71-audit-dir\") pod \"apiserver-7bbb656c7d-ngdp7\" (UID: \"7c5109b5-4292-437f-8524-bb7d35147a71\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ngdp7" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.334231 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/adf7cc5f-5027-4382-bac8-ed4f459fe424-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-tl5jf\" (UID: \"adf7cc5f-5027-4382-bac8-ed4f459fe424\") " pod="openshift-authentication/oauth-openshift-558db77b4-tl5jf" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.334387 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/adf7cc5f-5027-4382-bac8-ed4f459fe424-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-tl5jf\" (UID: \"adf7cc5f-5027-4382-bac8-ed4f459fe424\") " pod="openshift-authentication/oauth-openshift-558db77b4-tl5jf" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.339178 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-5wctw"] Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.339210 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-2mjvb"] Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 
09:10:29.339697 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/62e49d0b-dfc1-48ab-bed7-7ba7fe8a4475-encryption-config\") pod \"apiserver-76f77b778f-l8jxj\" (UID: \"62e49d0b-dfc1-48ab-bed7-7ba7fe8a4475\") " pod="openshift-apiserver/apiserver-76f77b778f-l8jxj" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.341454 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/adf7cc5f-5027-4382-bac8-ed4f459fe424-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-tl5jf\" (UID: \"adf7cc5f-5027-4382-bac8-ed4f459fe424\") " pod="openshift-authentication/oauth-openshift-558db77b4-tl5jf" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.341782 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/adf7cc5f-5027-4382-bac8-ed4f459fe424-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-tl5jf\" (UID: \"adf7cc5f-5027-4382-bac8-ed4f459fe424\") " pod="openshift-authentication/oauth-openshift-558db77b4-tl5jf" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.342169 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/530ed995-d9d1-4aff-93b9-9b2a35194cd2-serving-cert\") pod \"etcd-operator-b45778765-dgsw8\" (UID: \"530ed995-d9d1-4aff-93b9-9b2a35194cd2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-dgsw8" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.342544 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/560a23b3-6492-45d0-b273-df4e85bd9787-machine-approver-tls\") pod \"machine-approver-56656f9798-8nr7h\" (UID: \"560a23b3-6492-45d0-b273-df4e85bd9787\") " 
pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8nr7h" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.342173 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/adf7cc5f-5027-4382-bac8-ed4f459fe424-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-tl5jf\" (UID: \"adf7cc5f-5027-4382-bac8-ed4f459fe424\") " pod="openshift-authentication/oauth-openshift-558db77b4-tl5jf" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.343762 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zsj86"] Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.345606 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.346296 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-xllzt"] Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.347471 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-7p469"] Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.350275 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-4pzr2"] Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.351372 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-jh5pl"] Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.351396 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-m9545"] Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.351510 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-4pzr2" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.351668 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/fc2b2079-7189-4ca3-b398-2a1146b9c70f-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-56b7z\" (UID: \"fc2b2079-7189-4ca3-b398-2a1146b9c70f\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-56b7z" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.354511 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-8kksk"] Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.356619 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-znzvh"] Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.356805 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-8kksk" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.357247 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-ltjn9"] Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.357927 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-bcj7g"] Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.359268 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-dpzj6"] Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.360315 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-4ckzv"] Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.361417 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-image-registry/image-registry-697d97f7c8-kkrgv"] Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.362942 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-8kksk"] Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.364177 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-zv9nb"] Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.365647 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-xgrs4"] Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.367062 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29550780-48757"] Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.368497 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-m7bwm"] Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.369574 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-fzxrs"] Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.370457 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.370896 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550790-q6wbp"] Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.372014 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-k2zfx"] Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.373516 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4vkn2"] Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.376304 4792 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-4pzr2"] Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.384474 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.405122 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.425422 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.433950 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/5e17c213-e2c1-4625-8487-2bdb28b0224d-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-m7bwm\" (UID: \"5e17c213-e2c1-4625-8487-2bdb28b0224d\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-m7bwm" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.434017 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c697ae9b-78ed-4463-9a32-6e6bd1593d70-trusted-ca\") pod \"ingress-operator-5b745b69d9-2v6nv\" (UID: \"c697ae9b-78ed-4463-9a32-6e6bd1593d70\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-2v6nv" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.434043 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c8j8z\" (UniqueName: \"kubernetes.io/projected/16a87e53-5015-4e1b-bcd9-7dd81a4f6456-kube-api-access-c8j8z\") pod \"olm-operator-6b444d44fb-ltjn9\" (UID: \"16a87e53-5015-4e1b-bcd9-7dd81a4f6456\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-ltjn9" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.434084 4792 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f5fb0f27-68dd-4280-84f5-b46851d6ab96-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-rl7tp\" (UID: \"f5fb0f27-68dd-4280-84f5-b46851d6ab96\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-rl7tp" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.434116 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c952f4ee-a8f1-4c7a-bfd4-0da6e1f0ff7c-proxy-tls\") pod \"machine-config-operator-74547568cd-7p469\" (UID: \"c952f4ee-a8f1-4c7a-bfd4-0da6e1f0ff7c\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7p469" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.434153 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-brxgd\" (UniqueName: \"kubernetes.io/projected/824d014f-b04c-4304-8911-091172950873-kube-api-access-brxgd\") pod \"migrator-59844c95c7-nz8fs\" (UID: \"824d014f-b04c-4304-8911-091172950873\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-nz8fs" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.434185 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j5g9c\" (UniqueName: \"kubernetes.io/projected/c697ae9b-78ed-4463-9a32-6e6bd1593d70-kube-api-access-j5g9c\") pod \"ingress-operator-5b745b69d9-2v6nv\" (UID: \"c697ae9b-78ed-4463-9a32-6e6bd1593d70\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-2v6nv" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.434212 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d3c17466-62af-4a4b-bc43-a4eda5d974dd-serving-cert\") pod \"openshift-config-operator-7777fb866f-49k84\" (UID: 
\"d3c17466-62af-4a4b-bc43-a4eda5d974dd\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-49k84" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.434271 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/894f7c69-0119-4c19-b205-9780fb52b06e-console-oauth-config\") pod \"console-f9d7485db-jh5pl\" (UID: \"894f7c69-0119-4c19-b205-9780fb52b06e\") " pod="openshift-console/console-f9d7485db-jh5pl" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.434302 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/432c0903-faf2-4367-b378-730898f2dfcc-config\") pod \"kube-controller-manager-operator-78b949d7b-m9545\" (UID: \"432c0903-faf2-4367-b378-730898f2dfcc\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-m9545" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.434331 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h2csp\" (UniqueName: \"kubernetes.io/projected/83d49252-7752-48bc-86b6-c604984cd533-kube-api-access-h2csp\") pod \"router-default-5444994796-kcnhk\" (UID: \"83d49252-7752-48bc-86b6-c604984cd533\") " pod="openshift-ingress/router-default-5444994796-kcnhk" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.434359 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/c952f4ee-a8f1-4c7a-bfd4-0da6e1f0ff7c-images\") pod \"machine-config-operator-74547568cd-7p469\" (UID: \"c952f4ee-a8f1-4c7a-bfd4-0da6e1f0ff7c\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7p469" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.434390 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/f5fb0f27-68dd-4280-84f5-b46851d6ab96-service-ca-bundle\") pod \"authentication-operator-69f744f599-rl7tp\" (UID: \"f5fb0f27-68dd-4280-84f5-b46851d6ab96\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-rl7tp" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.434412 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/16a87e53-5015-4e1b-bcd9-7dd81a4f6456-srv-cert\") pod \"olm-operator-6b444d44fb-ltjn9\" (UID: \"16a87e53-5015-4e1b-bcd9-7dd81a4f6456\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-ltjn9" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.434444 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m4qkv\" (UniqueName: \"kubernetes.io/projected/f5fb0f27-68dd-4280-84f5-b46851d6ab96-kube-api-access-m4qkv\") pod \"authentication-operator-69f744f599-rl7tp\" (UID: \"f5fb0f27-68dd-4280-84f5-b46851d6ab96\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-rl7tp" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.434471 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/74fbc42d-490e-49dc-9a95-cff2e3b6d1f6-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-dpzj6\" (UID: \"74fbc42d-490e-49dc-9a95-cff2e3b6d1f6\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-dpzj6" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.434495 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d2d7cc79-bb68-4db2-9e5a-edd00436b08e-config\") pod \"service-ca-operator-777779d784-bcj7g\" (UID: \"d2d7cc79-bb68-4db2-9e5a-edd00436b08e\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-bcj7g" Mar 09 09:10:29 crc 
kubenswrapper[4792]: I0309 09:10:29.434586 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d7p6q\" (UniqueName: \"kubernetes.io/projected/f39d8cd3-a63e-4aeb-9609-fe4c8ed6372b-kube-api-access-d7p6q\") pod \"downloads-7954f5f757-9ch2w\" (UID: \"f39d8cd3-a63e-4aeb-9609-fe4c8ed6372b\") " pod="openshift-console/downloads-7954f5f757-9ch2w" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.434616 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/43baf1d3-85f7-4ce5-9650-260f0803cdab-signing-cabundle\") pod \"service-ca-9c57cc56f-7lxp2\" (UID: \"43baf1d3-85f7-4ce5-9650-260f0803cdab\") " pod="openshift-service-ca/service-ca-9c57cc56f-7lxp2" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.434638 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/894f7c69-0119-4c19-b205-9780fb52b06e-oauth-serving-cert\") pod \"console-f9d7485db-jh5pl\" (UID: \"894f7c69-0119-4c19-b205-9780fb52b06e\") " pod="openshift-console/console-f9d7485db-jh5pl" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.434664 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g5sx2\" (UniqueName: \"kubernetes.io/projected/d3c17466-62af-4a4b-bc43-a4eda5d974dd-kube-api-access-g5sx2\") pod \"openshift-config-operator-7777fb866f-49k84\" (UID: \"d3c17466-62af-4a4b-bc43-a4eda5d974dd\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-49k84" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.434690 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/83d49252-7752-48bc-86b6-c604984cd533-default-certificate\") pod \"router-default-5444994796-kcnhk\" (UID: \"83d49252-7752-48bc-86b6-c604984cd533\") " 
pod="openshift-ingress/router-default-5444994796-kcnhk" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.434729 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/83d49252-7752-48bc-86b6-c604984cd533-service-ca-bundle\") pod \"router-default-5444994796-kcnhk\" (UID: \"83d49252-7752-48bc-86b6-c604984cd533\") " pod="openshift-ingress/router-default-5444994796-kcnhk" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.434749 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c697ae9b-78ed-4463-9a32-6e6bd1593d70-metrics-tls\") pod \"ingress-operator-5b745b69d9-2v6nv\" (UID: \"c697ae9b-78ed-4463-9a32-6e6bd1593d70\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-2v6nv" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.434802 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2012f11e-ed51-4453-8f96-47269823dded-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-znzvh\" (UID: \"2012f11e-ed51-4453-8f96-47269823dded\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-znzvh" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.434825 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k797x\" (UniqueName: \"kubernetes.io/projected/2012f11e-ed51-4453-8f96-47269823dded-kube-api-access-k797x\") pod \"openshift-controller-manager-operator-756b6f6bc6-znzvh\" (UID: \"2012f11e-ed51-4453-8f96-47269823dded\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-znzvh" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.434853 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/ea625b31-78ac-4c2b-8f73-3e5c74894fce-serving-cert\") pod \"route-controller-manager-6576b87f9c-7f29j\" (UID: \"ea625b31-78ac-4c2b-8f73-3e5c74894fce\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7f29j" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.434904 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/aadae5cd-e840-4618-b021-d8ca0e9169bd-config-volume\") pod \"collect-profiles-29550780-48757\" (UID: \"aadae5cd-e840-4618-b021-d8ca0e9169bd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550780-48757" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.434957 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-krplw\" (UniqueName: \"kubernetes.io/projected/894f7c69-0119-4c19-b205-9780fb52b06e-kube-api-access-krplw\") pod \"console-f9d7485db-jh5pl\" (UID: \"894f7c69-0119-4c19-b205-9780fb52b06e\") " pod="openshift-console/console-f9d7485db-jh5pl" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.435002 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/74fbc42d-490e-49dc-9a95-cff2e3b6d1f6-config\") pod \"kube-apiserver-operator-766d6c64bb-dpzj6\" (UID: \"74fbc42d-490e-49dc-9a95-cff2e3b6d1f6\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-dpzj6" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.435030 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p4ftr\" (UniqueName: \"kubernetes.io/projected/5e17c213-e2c1-4625-8487-2bdb28b0224d-kube-api-access-p4ftr\") pod \"multus-admission-controller-857f4d67dd-m7bwm\" (UID: \"5e17c213-e2c1-4625-8487-2bdb28b0224d\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-m7bwm" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 
09:10:29.435053 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/43baf1d3-85f7-4ce5-9650-260f0803cdab-signing-key\") pod \"service-ca-9c57cc56f-7lxp2\" (UID: \"43baf1d3-85f7-4ce5-9650-260f0803cdab\") " pod="openshift-service-ca/service-ca-9c57cc56f-7lxp2" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.435097 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/d3c17466-62af-4a4b-bc43-a4eda5d974dd-available-featuregates\") pod \"openshift-config-operator-7777fb866f-49k84\" (UID: \"d3c17466-62af-4a4b-bc43-a4eda5d974dd\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-49k84" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.435123 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c697ae9b-78ed-4463-9a32-6e6bd1593d70-bound-sa-token\") pod \"ingress-operator-5b745b69d9-2v6nv\" (UID: \"c697ae9b-78ed-4463-9a32-6e6bd1593d70\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-2v6nv" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.435144 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/74fbc42d-490e-49dc-9a95-cff2e3b6d1f6-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-dpzj6\" (UID: \"74fbc42d-490e-49dc-9a95-cff2e3b6d1f6\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-dpzj6" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.435180 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/83d49252-7752-48bc-86b6-c604984cd533-stats-auth\") pod \"router-default-5444994796-kcnhk\" (UID: \"83d49252-7752-48bc-86b6-c604984cd533\") " 
pod="openshift-ingress/router-default-5444994796-kcnhk" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.435204 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/aadae5cd-e840-4618-b021-d8ca0e9169bd-secret-volume\") pod \"collect-profiles-29550780-48757\" (UID: \"aadae5cd-e840-4618-b021-d8ca0e9169bd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550780-48757" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.435226 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-flhdw\" (UniqueName: \"kubernetes.io/projected/d2d7cc79-bb68-4db2-9e5a-edd00436b08e-kube-api-access-flhdw\") pod \"service-ca-operator-777779d784-bcj7g\" (UID: \"d2d7cc79-bb68-4db2-9e5a-edd00436b08e\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-bcj7g" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.435242 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f5fb0f27-68dd-4280-84f5-b46851d6ab96-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-rl7tp\" (UID: \"f5fb0f27-68dd-4280-84f5-b46851d6ab96\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-rl7tp" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.435284 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vnvxt\" (UniqueName: \"kubernetes.io/projected/c952f4ee-a8f1-4c7a-bfd4-0da6e1f0ff7c-kube-api-access-vnvxt\") pod \"machine-config-operator-74547568cd-7p469\" (UID: \"c952f4ee-a8f1-4c7a-bfd4-0da6e1f0ff7c\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7p469" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.435284 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/c697ae9b-78ed-4463-9a32-6e6bd1593d70-trusted-ca\") pod \"ingress-operator-5b745b69d9-2v6nv\" (UID: \"c697ae9b-78ed-4463-9a32-6e6bd1593d70\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-2v6nv" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.435311 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/432c0903-faf2-4367-b378-730898f2dfcc-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-m9545\" (UID: \"432c0903-faf2-4367-b378-730898f2dfcc\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-m9545" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.435339 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f5fb0f27-68dd-4280-84f5-b46851d6ab96-config\") pod \"authentication-operator-69f744f599-rl7tp\" (UID: \"f5fb0f27-68dd-4280-84f5-b46851d6ab96\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-rl7tp" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.435366 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/894f7c69-0119-4c19-b205-9780fb52b06e-service-ca\") pod \"console-f9d7485db-jh5pl\" (UID: \"894f7c69-0119-4c19-b205-9780fb52b06e\") " pod="openshift-console/console-f9d7485db-jh5pl" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.435392 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c952f4ee-a8f1-4c7a-bfd4-0da6e1f0ff7c-auth-proxy-config\") pod \"machine-config-operator-74547568cd-7p469\" (UID: \"c952f4ee-a8f1-4c7a-bfd4-0da6e1f0ff7c\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7p469" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 
09:10:29.435430 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2012f11e-ed51-4453-8f96-47269823dded-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-znzvh\" (UID: \"2012f11e-ed51-4453-8f96-47269823dded\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-znzvh" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.435450 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/894f7c69-0119-4c19-b205-9780fb52b06e-console-serving-cert\") pod \"console-f9d7485db-jh5pl\" (UID: \"894f7c69-0119-4c19-b205-9780fb52b06e\") " pod="openshift-console/console-f9d7485db-jh5pl" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.435474 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/894f7c69-0119-4c19-b205-9780fb52b06e-trusted-ca-bundle\") pod \"console-f9d7485db-jh5pl\" (UID: \"894f7c69-0119-4c19-b205-9780fb52b06e\") " pod="openshift-console/console-f9d7485db-jh5pl" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.435496 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ea625b31-78ac-4c2b-8f73-3e5c74894fce-config\") pod \"route-controller-manager-6576b87f9c-7f29j\" (UID: \"ea625b31-78ac-4c2b-8f73-3e5c74894fce\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7f29j" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.435532 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/83d49252-7752-48bc-86b6-c604984cd533-metrics-certs\") pod \"router-default-5444994796-kcnhk\" (UID: \"83d49252-7752-48bc-86b6-c604984cd533\") " 
pod="openshift-ingress/router-default-5444994796-kcnhk" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.435553 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f5fb0f27-68dd-4280-84f5-b46851d6ab96-serving-cert\") pod \"authentication-operator-69f744f599-rl7tp\" (UID: \"f5fb0f27-68dd-4280-84f5-b46851d6ab96\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-rl7tp" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.435574 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d2d7cc79-bb68-4db2-9e5a-edd00436b08e-serving-cert\") pod \"service-ca-operator-777779d784-bcj7g\" (UID: \"d2d7cc79-bb68-4db2-9e5a-edd00436b08e\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-bcj7g" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.435641 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ea625b31-78ac-4c2b-8f73-3e5c74894fce-client-ca\") pod \"route-controller-manager-6576b87f9c-7f29j\" (UID: \"ea625b31-78ac-4c2b-8f73-3e5c74894fce\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7f29j" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.435765 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f5fb0f27-68dd-4280-84f5-b46851d6ab96-service-ca-bundle\") pod \"authentication-operator-69f744f599-rl7tp\" (UID: \"f5fb0f27-68dd-4280-84f5-b46851d6ab96\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-rl7tp" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.435666 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xd8kq\" (UniqueName: 
\"kubernetes.io/projected/43baf1d3-85f7-4ce5-9650-260f0803cdab-kube-api-access-xd8kq\") pod \"service-ca-9c57cc56f-7lxp2\" (UID: \"43baf1d3-85f7-4ce5-9650-260f0803cdab\") " pod="openshift-service-ca/service-ca-9c57cc56f-7lxp2" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.435823 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k66nz\" (UniqueName: \"kubernetes.io/projected/ea625b31-78ac-4c2b-8f73-3e5c74894fce-kube-api-access-k66nz\") pod \"route-controller-manager-6576b87f9c-7f29j\" (UID: \"ea625b31-78ac-4c2b-8f73-3e5c74894fce\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7f29j" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.435911 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mn88s\" (UniqueName: \"kubernetes.io/projected/d3934c8c-f197-4ef6-ac5c-76560a192e50-kube-api-access-mn88s\") pod \"auto-csr-approver-29550790-q6wbp\" (UID: \"d3934c8c-f197-4ef6-ac5c-76560a192e50\") " pod="openshift-infra/auto-csr-approver-29550790-q6wbp" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.435944 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/16a87e53-5015-4e1b-bcd9-7dd81a4f6456-profile-collector-cert\") pod \"olm-operator-6b444d44fb-ltjn9\" (UID: \"16a87e53-5015-4e1b-bcd9-7dd81a4f6456\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-ltjn9" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.435970 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/432c0903-faf2-4367-b378-730898f2dfcc-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-m9545\" (UID: \"432c0903-faf2-4367-b378-730898f2dfcc\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-m9545" Mar 09 09:10:29 
crc kubenswrapper[4792]: I0309 09:10:29.436004 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-slv6j\" (UniqueName: \"kubernetes.io/projected/aadae5cd-e840-4618-b021-d8ca0e9169bd-kube-api-access-slv6j\") pod \"collect-profiles-29550780-48757\" (UID: \"aadae5cd-e840-4618-b021-d8ca0e9169bd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550780-48757" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.436029 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/894f7c69-0119-4c19-b205-9780fb52b06e-console-config\") pod \"console-f9d7485db-jh5pl\" (UID: \"894f7c69-0119-4c19-b205-9780fb52b06e\") " pod="openshift-console/console-f9d7485db-jh5pl" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.436275 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/894f7c69-0119-4c19-b205-9780fb52b06e-oauth-serving-cert\") pod \"console-f9d7485db-jh5pl\" (UID: \"894f7c69-0119-4c19-b205-9780fb52b06e\") " pod="openshift-console/console-f9d7485db-jh5pl" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.436702 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/d3c17466-62af-4a4b-bc43-a4eda5d974dd-available-featuregates\") pod \"openshift-config-operator-7777fb866f-49k84\" (UID: \"d3c17466-62af-4a4b-bc43-a4eda5d974dd\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-49k84" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.437014 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/894f7c69-0119-4c19-b205-9780fb52b06e-console-config\") pod \"console-f9d7485db-jh5pl\" (UID: \"894f7c69-0119-4c19-b205-9780fb52b06e\") " 
pod="openshift-console/console-f9d7485db-jh5pl" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.439241 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2012f11e-ed51-4453-8f96-47269823dded-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-znzvh\" (UID: \"2012f11e-ed51-4453-8f96-47269823dded\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-znzvh" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.439827 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ea625b31-78ac-4c2b-8f73-3e5c74894fce-client-ca\") pod \"route-controller-manager-6576b87f9c-7f29j\" (UID: \"ea625b31-78ac-4c2b-8f73-3e5c74894fce\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7f29j" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.439867 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f5fb0f27-68dd-4280-84f5-b46851d6ab96-config\") pod \"authentication-operator-69f744f599-rl7tp\" (UID: \"f5fb0f27-68dd-4280-84f5-b46851d6ab96\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-rl7tp" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.440363 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c952f4ee-a8f1-4c7a-bfd4-0da6e1f0ff7c-auth-proxy-config\") pod \"machine-config-operator-74547568cd-7p469\" (UID: \"c952f4ee-a8f1-4c7a-bfd4-0da6e1f0ff7c\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7p469" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.441255 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/894f7c69-0119-4c19-b205-9780fb52b06e-service-ca\") pod \"console-f9d7485db-jh5pl\" (UID: \"894f7c69-0119-4c19-b205-9780fb52b06e\") " pod="openshift-console/console-f9d7485db-jh5pl" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.441817 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ea625b31-78ac-4c2b-8f73-3e5c74894fce-serving-cert\") pod \"route-controller-manager-6576b87f9c-7f29j\" (UID: \"ea625b31-78ac-4c2b-8f73-3e5c74894fce\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7f29j" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.442221 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/894f7c69-0119-4c19-b205-9780fb52b06e-trusted-ca-bundle\") pod \"console-f9d7485db-jh5pl\" (UID: \"894f7c69-0119-4c19-b205-9780fb52b06e\") " pod="openshift-console/console-f9d7485db-jh5pl" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.442473 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ea625b31-78ac-4c2b-8f73-3e5c74894fce-config\") pod \"route-controller-manager-6576b87f9c-7f29j\" (UID: \"ea625b31-78ac-4c2b-8f73-3e5c74894fce\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7f29j" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.442962 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/74fbc42d-490e-49dc-9a95-cff2e3b6d1f6-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-dpzj6\" (UID: \"74fbc42d-490e-49dc-9a95-cff2e3b6d1f6\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-dpzj6" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.443022 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/894f7c69-0119-4c19-b205-9780fb52b06e-console-serving-cert\") pod \"console-f9d7485db-jh5pl\" (UID: \"894f7c69-0119-4c19-b205-9780fb52b06e\") " pod="openshift-console/console-f9d7485db-jh5pl" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.443062 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/74fbc42d-490e-49dc-9a95-cff2e3b6d1f6-config\") pod \"kube-apiserver-operator-766d6c64bb-dpzj6\" (UID: \"74fbc42d-490e-49dc-9a95-cff2e3b6d1f6\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-dpzj6" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.443629 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f5fb0f27-68dd-4280-84f5-b46851d6ab96-serving-cert\") pod \"authentication-operator-69f744f599-rl7tp\" (UID: \"f5fb0f27-68dd-4280-84f5-b46851d6ab96\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-rl7tp" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.444272 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.444641 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c697ae9b-78ed-4463-9a32-6e6bd1593d70-metrics-tls\") pod \"ingress-operator-5b745b69d9-2v6nv\" (UID: \"c697ae9b-78ed-4463-9a32-6e6bd1593d70\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-2v6nv" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.445517 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2012f11e-ed51-4453-8f96-47269823dded-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-znzvh\" (UID: 
\"2012f11e-ed51-4453-8f96-47269823dded\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-znzvh" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.447199 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d3c17466-62af-4a4b-bc43-a4eda5d974dd-serving-cert\") pod \"openshift-config-operator-7777fb866f-49k84\" (UID: \"d3c17466-62af-4a4b-bc43-a4eda5d974dd\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-49k84" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.449045 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/894f7c69-0119-4c19-b205-9780fb52b06e-console-oauth-config\") pod \"console-f9d7485db-jh5pl\" (UID: \"894f7c69-0119-4c19-b205-9780fb52b06e\") " pod="openshift-console/console-f9d7485db-jh5pl" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.464430 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.486354 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.505333 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.524971 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.529485 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/83d49252-7752-48bc-86b6-c604984cd533-default-certificate\") pod \"router-default-5444994796-kcnhk\" (UID: 
\"83d49252-7752-48bc-86b6-c604984cd533\") " pod="openshift-ingress/router-default-5444994796-kcnhk" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.545827 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.553811 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/83d49252-7752-48bc-86b6-c604984cd533-stats-auth\") pod \"router-default-5444994796-kcnhk\" (UID: \"83d49252-7752-48bc-86b6-c604984cd533\") " pod="openshift-ingress/router-default-5444994796-kcnhk" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.565444 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.571794 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/83d49252-7752-48bc-86b6-c604984cd533-metrics-certs\") pod \"router-default-5444994796-kcnhk\" (UID: \"83d49252-7752-48bc-86b6-c604984cd533\") " pod="openshift-ingress/router-default-5444994796-kcnhk" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.585062 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.586844 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/83d49252-7752-48bc-86b6-c604984cd533-service-ca-bundle\") pod \"router-default-5444994796-kcnhk\" (UID: \"83d49252-7752-48bc-86b6-c604984cd533\") " pod="openshift-ingress/router-default-5444994796-kcnhk" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.605632 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Mar 09 09:10:29 crc 
kubenswrapper[4792]: I0309 09:10:29.631955 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.645486 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.665714 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.684151 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.706243 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.724658 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.745737 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.765640 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.785195 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.805323 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.825582 4792 reflector.go:368] Caches populated for *v1.ConfigMap 
from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.844645 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.866168 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.884866 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.925144 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.927395 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/432c0903-faf2-4367-b378-730898f2dfcc-config\") pod \"kube-controller-manager-operator-78b949d7b-m9545\" (UID: \"432c0903-faf2-4367-b378-730898f2dfcc\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-m9545" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.944516 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.964995 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.973713 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/432c0903-faf2-4367-b378-730898f2dfcc-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-m9545\" (UID: \"432c0903-faf2-4367-b378-730898f2dfcc\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-m9545" Mar 09 09:10:29 crc kubenswrapper[4792]: I0309 09:10:29.985802 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Mar 09 09:10:30 crc kubenswrapper[4792]: I0309 09:10:30.005148 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Mar 09 09:10:30 crc kubenswrapper[4792]: I0309 09:10:30.006712 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/c952f4ee-a8f1-4c7a-bfd4-0da6e1f0ff7c-images\") pod \"machine-config-operator-74547568cd-7p469\" (UID: \"c952f4ee-a8f1-4c7a-bfd4-0da6e1f0ff7c\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7p469" Mar 09 09:10:30 crc kubenswrapper[4792]: I0309 09:10:30.024930 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Mar 09 09:10:30 crc kubenswrapper[4792]: I0309 09:10:30.044437 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Mar 09 09:10:30 crc kubenswrapper[4792]: I0309 09:10:30.048224 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c952f4ee-a8f1-4c7a-bfd4-0da6e1f0ff7c-proxy-tls\") pod \"machine-config-operator-74547568cd-7p469\" (UID: \"c952f4ee-a8f1-4c7a-bfd4-0da6e1f0ff7c\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7p469" Mar 09 09:10:30 crc kubenswrapper[4792]: I0309 09:10:30.065049 4792 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 09 09:10:30 crc kubenswrapper[4792]: I0309 09:10:30.085844 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 09 09:10:30 crc kubenswrapper[4792]: I0309 09:10:30.104653 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Mar 09 09:10:30 crc kubenswrapper[4792]: I0309 09:10:30.124834 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Mar 09 09:10:30 crc kubenswrapper[4792]: I0309 09:10:30.145354 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 09 09:10:30 crc kubenswrapper[4792]: I0309 09:10:30.165346 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Mar 09 09:10:30 crc kubenswrapper[4792]: I0309 09:10:30.185353 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 09 09:10:30 crc kubenswrapper[4792]: I0309 09:10:30.204760 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Mar 09 09:10:30 crc kubenswrapper[4792]: I0309 09:10:30.225140 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Mar 09 09:10:30 crc kubenswrapper[4792]: I0309 09:10:30.243408 4792 request.go:700] Waited for 1.018201662s due to client-side throttling, not priority and fairness, request: 
GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-operator-lifecycle-manager/secrets?fieldSelector=metadata.name%3Dpackage-server-manager-serving-cert&limit=500&resourceVersion=0 Mar 09 09:10:30 crc kubenswrapper[4792]: I0309 09:10:30.245572 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Mar 09 09:10:30 crc kubenswrapper[4792]: I0309 09:10:30.265116 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Mar 09 09:10:30 crc kubenswrapper[4792]: I0309 09:10:30.284601 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Mar 09 09:10:30 crc kubenswrapper[4792]: E0309 09:10:30.306962 4792 configmap.go:193] Couldn't get configMap openshift-oauth-apiserver/trusted-ca-bundle: failed to sync configmap cache: timed out waiting for the condition Mar 09 09:10:30 crc kubenswrapper[4792]: E0309 09:10:30.307052 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/7c5109b5-4292-437f-8524-bb7d35147a71-trusted-ca-bundle podName:7c5109b5-4292-437f-8524-bb7d35147a71 nodeName:}" failed. No retries permitted until 2026-03-09 09:10:30.807024574 +0000 UTC m=+195.837225336 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/7c5109b5-4292-437f-8524-bb7d35147a71-trusted-ca-bundle") pod "apiserver-7bbb656c7d-ngdp7" (UID: "7c5109b5-4292-437f-8524-bb7d35147a71") : failed to sync configmap cache: timed out waiting for the condition Mar 09 09:10:30 crc kubenswrapper[4792]: I0309 09:10:30.313509 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Mar 09 09:10:30 crc kubenswrapper[4792]: I0309 09:10:30.325773 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Mar 09 09:10:30 crc kubenswrapper[4792]: I0309 09:10:30.340027 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/16a87e53-5015-4e1b-bcd9-7dd81a4f6456-srv-cert\") pod \"olm-operator-6b444d44fb-ltjn9\" (UID: \"16a87e53-5015-4e1b-bcd9-7dd81a4f6456\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-ltjn9" Mar 09 09:10:30 crc kubenswrapper[4792]: I0309 09:10:30.345844 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Mar 09 09:10:30 crc kubenswrapper[4792]: I0309 09:10:30.357999 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/5e17c213-e2c1-4625-8487-2bdb28b0224d-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-m7bwm\" (UID: \"5e17c213-e2c1-4625-8487-2bdb28b0224d\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-m7bwm" Mar 09 09:10:30 crc kubenswrapper[4792]: I0309 09:10:30.364924 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Mar 09 09:10:30 crc kubenswrapper[4792]: I0309 09:10:30.394246 4792 reflector.go:368] 
Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Mar 09 09:10:30 crc kubenswrapper[4792]: I0309 09:10:30.405032 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Mar 09 09:10:30 crc kubenswrapper[4792]: I0309 09:10:30.425888 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Mar 09 09:10:30 crc kubenswrapper[4792]: E0309 09:10:30.435965 4792 configmap.go:193] Couldn't get configMap openshift-service-ca-operator/service-ca-operator-config: failed to sync configmap cache: timed out waiting for the condition Mar 09 09:10:30 crc kubenswrapper[4792]: E0309 09:10:30.436242 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d2d7cc79-bb68-4db2-9e5a-edd00436b08e-config podName:d2d7cc79-bb68-4db2-9e5a-edd00436b08e nodeName:}" failed. No retries permitted until 2026-03-09 09:10:30.936217856 +0000 UTC m=+195.966418618 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/d2d7cc79-bb68-4db2-9e5a-edd00436b08e-config") pod "service-ca-operator-777779d784-bcj7g" (UID: "d2d7cc79-bb68-4db2-9e5a-edd00436b08e") : failed to sync configmap cache: timed out waiting for the condition Mar 09 09:10:30 crc kubenswrapper[4792]: E0309 09:10:30.436276 4792 secret.go:188] Couldn't get secret openshift-operator-lifecycle-manager/pprof-cert: failed to sync secret cache: timed out waiting for the condition Mar 09 09:10:30 crc kubenswrapper[4792]: E0309 09:10:30.435965 4792 configmap.go:193] Couldn't get configMap openshift-service-ca/signing-cabundle: failed to sync configmap cache: timed out waiting for the condition Mar 09 09:10:30 crc kubenswrapper[4792]: E0309 09:10:30.437049 4792 secret.go:188] Couldn't get secret openshift-service-ca/signing-key: failed to sync secret cache: timed out waiting for the condition Mar 09 09:10:30 crc kubenswrapper[4792]: E0309 09:10:30.437337 4792 configmap.go:193] Couldn't get configMap openshift-operator-lifecycle-manager/collect-profiles-config: failed to sync configmap cache: timed out waiting for the condition Mar 09 09:10:30 crc kubenswrapper[4792]: E0309 09:10:30.437505 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aadae5cd-e840-4618-b021-d8ca0e9169bd-secret-volume podName:aadae5cd-e840-4618-b021-d8ca0e9169bd nodeName:}" failed. No retries permitted until 2026-03-09 09:10:30.936931966 +0000 UTC m=+195.967132728 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "secret-volume" (UniqueName: "kubernetes.io/secret/aadae5cd-e840-4618-b021-d8ca0e9169bd-secret-volume") pod "collect-profiles-29550780-48757" (UID: "aadae5cd-e840-4618-b021-d8ca0e9169bd") : failed to sync secret cache: timed out waiting for the condition Mar 09 09:10:30 crc kubenswrapper[4792]: E0309 09:10:30.437628 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/43baf1d3-85f7-4ce5-9650-260f0803cdab-signing-cabundle podName:43baf1d3-85f7-4ce5-9650-260f0803cdab nodeName:}" failed. No retries permitted until 2026-03-09 09:10:30.937609535 +0000 UTC m=+195.967810297 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "signing-cabundle" (UniqueName: "kubernetes.io/configmap/43baf1d3-85f7-4ce5-9650-260f0803cdab-signing-cabundle") pod "service-ca-9c57cc56f-7lxp2" (UID: "43baf1d3-85f7-4ce5-9650-260f0803cdab") : failed to sync configmap cache: timed out waiting for the condition Mar 09 09:10:30 crc kubenswrapper[4792]: E0309 09:10:30.437746 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/43baf1d3-85f7-4ce5-9650-260f0803cdab-signing-key podName:43baf1d3-85f7-4ce5-9650-260f0803cdab nodeName:}" failed. No retries permitted until 2026-03-09 09:10:30.937732079 +0000 UTC m=+195.967932851 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "signing-key" (UniqueName: "kubernetes.io/secret/43baf1d3-85f7-4ce5-9650-260f0803cdab-signing-key") pod "service-ca-9c57cc56f-7lxp2" (UID: "43baf1d3-85f7-4ce5-9650-260f0803cdab") : failed to sync secret cache: timed out waiting for the condition Mar 09 09:10:30 crc kubenswrapper[4792]: E0309 09:10:30.437874 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/aadae5cd-e840-4618-b021-d8ca0e9169bd-config-volume podName:aadae5cd-e840-4618-b021-d8ca0e9169bd nodeName:}" failed. 
No retries permitted until 2026-03-09 09:10:30.937859792 +0000 UTC m=+195.968060564 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-volume" (UniqueName: "kubernetes.io/configmap/aadae5cd-e840-4618-b021-d8ca0e9169bd-config-volume") pod "collect-profiles-29550780-48757" (UID: "aadae5cd-e840-4618-b021-d8ca0e9169bd") : failed to sync configmap cache: timed out waiting for the condition Mar 09 09:10:30 crc kubenswrapper[4792]: E0309 09:10:30.438225 4792 secret.go:188] Couldn't get secret openshift-service-ca-operator/serving-cert: failed to sync secret cache: timed out waiting for the condition Mar 09 09:10:30 crc kubenswrapper[4792]: E0309 09:10:30.438448 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d2d7cc79-bb68-4db2-9e5a-edd00436b08e-serving-cert podName:d2d7cc79-bb68-4db2-9e5a-edd00436b08e nodeName:}" failed. No retries permitted until 2026-03-09 09:10:30.938371876 +0000 UTC m=+195.968572628 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/d2d7cc79-bb68-4db2-9e5a-edd00436b08e-serving-cert") pod "service-ca-operator-777779d784-bcj7g" (UID: "d2d7cc79-bb68-4db2-9e5a-edd00436b08e") : failed to sync secret cache: timed out waiting for the condition Mar 09 09:10:30 crc kubenswrapper[4792]: E0309 09:10:30.440916 4792 secret.go:188] Couldn't get secret openshift-operator-lifecycle-manager/pprof-cert: failed to sync secret cache: timed out waiting for the condition Mar 09 09:10:30 crc kubenswrapper[4792]: E0309 09:10:30.441182 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/16a87e53-5015-4e1b-bcd9-7dd81a4f6456-profile-collector-cert podName:16a87e53-5015-4e1b-bcd9-7dd81a4f6456 nodeName:}" failed. No retries permitted until 2026-03-09 09:10:30.941137184 +0000 UTC m=+195.971337976 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "profile-collector-cert" (UniqueName: "kubernetes.io/secret/16a87e53-5015-4e1b-bcd9-7dd81a4f6456-profile-collector-cert") pod "olm-operator-6b444d44fb-ltjn9" (UID: "16a87e53-5015-4e1b-bcd9-7dd81a4f6456") : failed to sync secret cache: timed out waiting for the condition Mar 09 09:10:30 crc kubenswrapper[4792]: I0309 09:10:30.445739 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Mar 09 09:10:30 crc kubenswrapper[4792]: I0309 09:10:30.464877 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Mar 09 09:10:30 crc kubenswrapper[4792]: I0309 09:10:30.485563 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Mar 09 09:10:30 crc kubenswrapper[4792]: I0309 09:10:30.505201 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Mar 09 09:10:30 crc kubenswrapper[4792]: I0309 09:10:30.525700 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Mar 09 09:10:30 crc kubenswrapper[4792]: I0309 09:10:30.545243 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 09 09:10:30 crc kubenswrapper[4792]: I0309 09:10:30.564607 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Mar 09 09:10:30 crc kubenswrapper[4792]: I0309 09:10:30.585268 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 09 09:10:30 crc kubenswrapper[4792]: I0309 09:10:30.605190 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" 
Mar 09 09:10:30 crc kubenswrapper[4792]: I0309 09:10:30.626166 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 09 09:10:30 crc kubenswrapper[4792]: I0309 09:10:30.644984 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Mar 09 09:10:30 crc kubenswrapper[4792]: I0309 09:10:30.666144 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Mar 09 09:10:30 crc kubenswrapper[4792]: I0309 09:10:30.685247 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Mar 09 09:10:30 crc kubenswrapper[4792]: I0309 09:10:30.703713 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 09 09:10:30 crc kubenswrapper[4792]: I0309 09:10:30.724774 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 09 09:10:30 crc kubenswrapper[4792]: I0309 09:10:30.744576 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 09:10:30 crc kubenswrapper[4792]: I0309 09:10:30.764381 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 09:10:30 crc kubenswrapper[4792]: I0309 09:10:30.829451 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pj5dk\" (UniqueName: \"kubernetes.io/projected/e4caea97-6238-4185-9e61-5a40aa699205-kube-api-access-pj5dk\") pod \"cluster-samples-operator-665b6dd947-tw44d\" (UID: \"e4caea97-6238-4185-9e61-5a40aa699205\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tw44d" Mar 09 09:10:30 crc kubenswrapper[4792]: I0309 09:10:30.845060 4792 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Mar 09 09:10:30 crc kubenswrapper[4792]: I0309 09:10:30.845260 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zzcrq\" (UniqueName: \"kubernetes.io/projected/7c5109b5-4292-437f-8524-bb7d35147a71-kube-api-access-zzcrq\") pod \"apiserver-7bbb656c7d-ngdp7\" (UID: \"7c5109b5-4292-437f-8524-bb7d35147a71\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ngdp7" Mar 09 09:10:30 crc kubenswrapper[4792]: I0309 09:10:30.859335 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7c5109b5-4292-437f-8524-bb7d35147a71-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-ngdp7\" (UID: \"7c5109b5-4292-437f-8524-bb7d35147a71\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ngdp7" Mar 09 09:10:30 crc kubenswrapper[4792]: I0309 09:10:30.864956 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Mar 09 09:10:30 crc kubenswrapper[4792]: I0309 09:10:30.884291 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Mar 09 09:10:30 crc kubenswrapper[4792]: I0309 09:10:30.904905 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Mar 09 09:10:30 crc kubenswrapper[4792]: I0309 09:10:30.942516 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vrsxh\" (UniqueName: \"kubernetes.io/projected/0f3faf8f-f49a-4e99-93cc-50eecd2d2c3c-kube-api-access-vrsxh\") pod \"controller-manager-879f6c89f-ncpc5\" (UID: \"0f3faf8f-f49a-4e99-93cc-50eecd2d2c3c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-ncpc5" Mar 09 09:10:30 crc kubenswrapper[4792]: I0309 09:10:30.958367 4792 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-sn4bc\" (UniqueName: \"kubernetes.io/projected/f4a19004-24a2-4825-bca2-24d98c3d69cc-kube-api-access-sn4bc\") pod \"console-operator-58897d9998-xllzt\" (UID: \"f4a19004-24a2-4825-bca2-24d98c3d69cc\") " pod="openshift-console-operator/console-operator-58897d9998-xllzt" Mar 09 09:10:30 crc kubenswrapper[4792]: I0309 09:10:30.961332 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d2d7cc79-bb68-4db2-9e5a-edd00436b08e-config\") pod \"service-ca-operator-777779d784-bcj7g\" (UID: \"d2d7cc79-bb68-4db2-9e5a-edd00436b08e\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-bcj7g" Mar 09 09:10:30 crc kubenswrapper[4792]: I0309 09:10:30.961422 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/43baf1d3-85f7-4ce5-9650-260f0803cdab-signing-cabundle\") pod \"service-ca-9c57cc56f-7lxp2\" (UID: \"43baf1d3-85f7-4ce5-9650-260f0803cdab\") " pod="openshift-service-ca/service-ca-9c57cc56f-7lxp2" Mar 09 09:10:30 crc kubenswrapper[4792]: I0309 09:10:30.961501 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/aadae5cd-e840-4618-b021-d8ca0e9169bd-config-volume\") pod \"collect-profiles-29550780-48757\" (UID: \"aadae5cd-e840-4618-b021-d8ca0e9169bd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550780-48757" Mar 09 09:10:30 crc kubenswrapper[4792]: I0309 09:10:30.961537 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/43baf1d3-85f7-4ce5-9650-260f0803cdab-signing-key\") pod \"service-ca-9c57cc56f-7lxp2\" (UID: \"43baf1d3-85f7-4ce5-9650-260f0803cdab\") " pod="openshift-service-ca/service-ca-9c57cc56f-7lxp2" Mar 09 09:10:30 crc kubenswrapper[4792]: I0309 09:10:30.961584 4792 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/aadae5cd-e840-4618-b021-d8ca0e9169bd-secret-volume\") pod \"collect-profiles-29550780-48757\" (UID: \"aadae5cd-e840-4618-b021-d8ca0e9169bd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550780-48757" Mar 09 09:10:30 crc kubenswrapper[4792]: I0309 09:10:30.961649 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d2d7cc79-bb68-4db2-9e5a-edd00436b08e-serving-cert\") pod \"service-ca-operator-777779d784-bcj7g\" (UID: \"d2d7cc79-bb68-4db2-9e5a-edd00436b08e\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-bcj7g" Mar 09 09:10:30 crc kubenswrapper[4792]: I0309 09:10:30.961703 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/16a87e53-5015-4e1b-bcd9-7dd81a4f6456-profile-collector-cert\") pod \"olm-operator-6b444d44fb-ltjn9\" (UID: \"16a87e53-5015-4e1b-bcd9-7dd81a4f6456\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-ltjn9" Mar 09 09:10:30 crc kubenswrapper[4792]: I0309 09:10:30.962036 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d2d7cc79-bb68-4db2-9e5a-edd00436b08e-config\") pod \"service-ca-operator-777779d784-bcj7g\" (UID: \"d2d7cc79-bb68-4db2-9e5a-edd00436b08e\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-bcj7g" Mar 09 09:10:30 crc kubenswrapper[4792]: I0309 09:10:30.963139 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/43baf1d3-85f7-4ce5-9650-260f0803cdab-signing-cabundle\") pod \"service-ca-9c57cc56f-7lxp2\" (UID: \"43baf1d3-85f7-4ce5-9650-260f0803cdab\") " pod="openshift-service-ca/service-ca-9c57cc56f-7lxp2" Mar 09 
09:10:30 crc kubenswrapper[4792]: I0309 09:10:30.964642 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/aadae5cd-e840-4618-b021-d8ca0e9169bd-config-volume\") pod \"collect-profiles-29550780-48757\" (UID: \"aadae5cd-e840-4618-b021-d8ca0e9169bd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550780-48757" Mar 09 09:10:30 crc kubenswrapper[4792]: I0309 09:10:30.966489 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d2d7cc79-bb68-4db2-9e5a-edd00436b08e-serving-cert\") pod \"service-ca-operator-777779d784-bcj7g\" (UID: \"d2d7cc79-bb68-4db2-9e5a-edd00436b08e\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-bcj7g" Mar 09 09:10:30 crc kubenswrapper[4792]: I0309 09:10:30.966601 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/16a87e53-5015-4e1b-bcd9-7dd81a4f6456-profile-collector-cert\") pod \"olm-operator-6b444d44fb-ltjn9\" (UID: \"16a87e53-5015-4e1b-bcd9-7dd81a4f6456\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-ltjn9" Mar 09 09:10:30 crc kubenswrapper[4792]: I0309 09:10:30.966891 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/43baf1d3-85f7-4ce5-9650-260f0803cdab-signing-key\") pod \"service-ca-9c57cc56f-7lxp2\" (UID: \"43baf1d3-85f7-4ce5-9650-260f0803cdab\") " pod="openshift-service-ca/service-ca-9c57cc56f-7lxp2" Mar 09 09:10:30 crc kubenswrapper[4792]: I0309 09:10:30.967865 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/aadae5cd-e840-4618-b021-d8ca0e9169bd-secret-volume\") pod \"collect-profiles-29550780-48757\" (UID: \"aadae5cd-e840-4618-b021-d8ca0e9169bd\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29550780-48757" Mar 09 09:10:30 crc kubenswrapper[4792]: I0309 09:10:30.980679 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zkrdg\" (UniqueName: \"kubernetes.io/projected/fc2b2079-7189-4ca3-b398-2a1146b9c70f-kube-api-access-zkrdg\") pod \"machine-api-operator-5694c8668f-56b7z\" (UID: \"fc2b2079-7189-4ca3-b398-2a1146b9c70f\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-56b7z" Mar 09 09:10:31 crc kubenswrapper[4792]: I0309 09:10:31.000278 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nf5jw\" (UniqueName: \"kubernetes.io/projected/62e49d0b-dfc1-48ab-bed7-7ba7fe8a4475-kube-api-access-nf5jw\") pod \"apiserver-76f77b778f-l8jxj\" (UID: \"62e49d0b-dfc1-48ab-bed7-7ba7fe8a4475\") " pod="openshift-apiserver/apiserver-76f77b778f-l8jxj" Mar 09 09:10:31 crc kubenswrapper[4792]: I0309 09:10:31.004610 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Mar 09 09:10:31 crc kubenswrapper[4792]: I0309 09:10:31.024474 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Mar 09 09:10:31 crc kubenswrapper[4792]: I0309 09:10:31.045750 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Mar 09 09:10:31 crc kubenswrapper[4792]: I0309 09:10:31.079506 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-l8jxj" Mar 09 09:10:31 crc kubenswrapper[4792]: I0309 09:10:31.088017 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tw44d" Mar 09 09:10:31 crc kubenswrapper[4792]: I0309 09:10:31.096992 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/35d5435d-5c5f-4c5c-bca6-d2f5a89c25d8-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-5tppx\" (UID: \"35d5435d-5c5f-4c5c-bca6-d2f5a89c25d8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-5tppx" Mar 09 09:10:31 crc kubenswrapper[4792]: I0309 09:10:31.104885 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xj29r\" (UniqueName: \"kubernetes.io/projected/35d5435d-5c5f-4c5c-bca6-d2f5a89c25d8-kube-api-access-xj29r\") pod \"cluster-image-registry-operator-dc59b4c8b-5tppx\" (UID: \"35d5435d-5c5f-4c5c-bca6-d2f5a89c25d8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-5tppx" Mar 09 09:10:31 crc kubenswrapper[4792]: I0309 09:10:31.117776 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-xllzt" Mar 09 09:10:31 crc kubenswrapper[4792]: I0309 09:10:31.129125 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-56b7z" Mar 09 09:10:31 crc kubenswrapper[4792]: I0309 09:10:31.130318 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wvkrg\" (UniqueName: \"kubernetes.io/projected/adf7cc5f-5027-4382-bac8-ed4f459fe424-kube-api-access-wvkrg\") pod \"oauth-openshift-558db77b4-tl5jf\" (UID: \"adf7cc5f-5027-4382-bac8-ed4f459fe424\") " pod="openshift-authentication/oauth-openshift-558db77b4-tl5jf" Mar 09 09:10:31 crc kubenswrapper[4792]: I0309 09:10:31.141249 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g6hf6\" (UniqueName: \"kubernetes.io/projected/560a23b3-6492-45d0-b273-df4e85bd9787-kube-api-access-g6hf6\") pod \"machine-approver-56656f9798-8nr7h\" (UID: \"560a23b3-6492-45d0-b273-df4e85bd9787\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8nr7h" Mar 09 09:10:31 crc kubenswrapper[4792]: I0309 09:10:31.149443 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-ncpc5" Mar 09 09:10:31 crc kubenswrapper[4792]: I0309 09:10:31.178753 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g45ns\" (UniqueName: \"kubernetes.io/projected/e37d381c-62ae-489d-888d-320a3f7959cc-kube-api-access-g45ns\") pod \"dns-operator-744455d44c-p5kgd\" (UID: \"e37d381c-62ae-489d-888d-320a3f7959cc\") " pod="openshift-dns-operator/dns-operator-744455d44c-p5kgd" Mar 09 09:10:31 crc kubenswrapper[4792]: I0309 09:10:31.185049 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mh6wx\" (UniqueName: \"kubernetes.io/projected/530ed995-d9d1-4aff-93b9-9b2a35194cd2-kube-api-access-mh6wx\") pod \"etcd-operator-b45778765-dgsw8\" (UID: \"530ed995-d9d1-4aff-93b9-9b2a35194cd2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-dgsw8" Mar 09 09:10:31 crc kubenswrapper[4792]: I0309 09:10:31.186018 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Mar 09 09:10:31 crc kubenswrapper[4792]: I0309 09:10:31.186027 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-tl5jf" Mar 09 09:10:31 crc kubenswrapper[4792]: I0309 09:10:31.205760 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Mar 09 09:10:31 crc kubenswrapper[4792]: I0309 09:10:31.210127 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8nr7h" Mar 09 09:10:31 crc kubenswrapper[4792]: I0309 09:10:31.230745 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Mar 09 09:10:31 crc kubenswrapper[4792]: I0309 09:10:31.245198 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Mar 09 09:10:31 crc kubenswrapper[4792]: I0309 09:10:31.263408 4792 request.go:700] Waited for 1.906106741s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/hostpath-provisioner/secrets?fieldSelector=metadata.name%3Dcsi-hostpath-provisioner-sa-dockercfg-qd74k&limit=500&resourceVersion=0 Mar 09 09:10:31 crc kubenswrapper[4792]: I0309 09:10:31.266467 4792 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Mar 09 09:10:31 crc kubenswrapper[4792]: I0309 09:10:31.284642 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Mar 09 09:10:31 crc kubenswrapper[4792]: I0309 09:10:31.342406 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c8j8z\" (UniqueName: \"kubernetes.io/projected/16a87e53-5015-4e1b-bcd9-7dd81a4f6456-kube-api-access-c8j8z\") pod \"olm-operator-6b444d44fb-ltjn9\" (UID: \"16a87e53-5015-4e1b-bcd9-7dd81a4f6456\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-ltjn9" Mar 09 09:10:31 crc kubenswrapper[4792]: I0309 09:10:31.342689 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m4qkv\" (UniqueName: \"kubernetes.io/projected/f5fb0f27-68dd-4280-84f5-b46851d6ab96-kube-api-access-m4qkv\") pod \"authentication-operator-69f744f599-rl7tp\" (UID: \"f5fb0f27-68dd-4280-84f5-b46851d6ab96\") " 
pod="openshift-authentication-operator/authentication-operator-69f744f599-rl7tp" Mar 09 09:10:31 crc kubenswrapper[4792]: I0309 09:10:31.399363 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-5tppx" Mar 09 09:10:31 crc kubenswrapper[4792]: I0309 09:10:31.409701 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-p5kgd" Mar 09 09:10:31 crc kubenswrapper[4792]: I0309 09:10:31.441781 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/74fbc42d-490e-49dc-9a95-cff2e3b6d1f6-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-dpzj6\" (UID: \"74fbc42d-490e-49dc-9a95-cff2e3b6d1f6\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-dpzj6" Mar 09 09:10:31 crc kubenswrapper[4792]: I0309 09:10:31.444716 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h2csp\" (UniqueName: \"kubernetes.io/projected/83d49252-7752-48bc-86b6-c604984cd533-kube-api-access-h2csp\") pod \"router-default-5444994796-kcnhk\" (UID: \"83d49252-7752-48bc-86b6-c604984cd533\") " pod="openshift-ingress/router-default-5444994796-kcnhk" Mar 09 09:10:31 crc kubenswrapper[4792]: I0309 09:10:31.446003 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-krplw\" (UniqueName: \"kubernetes.io/projected/894f7c69-0119-4c19-b205-9780fb52b06e-kube-api-access-krplw\") pod \"console-f9d7485db-jh5pl\" (UID: \"894f7c69-0119-4c19-b205-9780fb52b06e\") " pod="openshift-console/console-f9d7485db-jh5pl" Mar 09 09:10:31 crc kubenswrapper[4792]: I0309 09:10:31.447556 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g5sx2\" (UniqueName: \"kubernetes.io/projected/d3c17466-62af-4a4b-bc43-a4eda5d974dd-kube-api-access-g5sx2\") pod 
\"openshift-config-operator-7777fb866f-49k84\" (UID: \"d3c17466-62af-4a4b-bc43-a4eda5d974dd\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-49k84" Mar 09 09:10:31 crc kubenswrapper[4792]: I0309 09:10:31.448607 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d7p6q\" (UniqueName: \"kubernetes.io/projected/f39d8cd3-a63e-4aeb-9609-fe4c8ed6372b-kube-api-access-d7p6q\") pod \"downloads-7954f5f757-9ch2w\" (UID: \"f39d8cd3-a63e-4aeb-9609-fe4c8ed6372b\") " pod="openshift-console/downloads-7954f5f757-9ch2w" Mar 09 09:10:31 crc kubenswrapper[4792]: I0309 09:10:31.459765 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p4ftr\" (UniqueName: \"kubernetes.io/projected/5e17c213-e2c1-4625-8487-2bdb28b0224d-kube-api-access-p4ftr\") pod \"multus-admission-controller-857f4d67dd-m7bwm\" (UID: \"5e17c213-e2c1-4625-8487-2bdb28b0224d\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-m7bwm" Mar 09 09:10:31 crc kubenswrapper[4792]: I0309 09:10:31.479992 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-dgsw8" Mar 09 09:10:31 crc kubenswrapper[4792]: I0309 09:10:31.488547 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5444994796-kcnhk" Mar 09 09:10:31 crc kubenswrapper[4792]: I0309 09:10:31.492550 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k797x\" (UniqueName: \"kubernetes.io/projected/2012f11e-ed51-4453-8f96-47269823dded-kube-api-access-k797x\") pod \"openshift-controller-manager-operator-756b6f6bc6-znzvh\" (UID: \"2012f11e-ed51-4453-8f96-47269823dded\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-znzvh" Mar 09 09:10:31 crc kubenswrapper[4792]: I0309 09:10:31.508593 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k66nz\" (UniqueName: \"kubernetes.io/projected/ea625b31-78ac-4c2b-8f73-3e5c74894fce-kube-api-access-k66nz\") pod \"route-controller-manager-6576b87f9c-7f29j\" (UID: \"ea625b31-78ac-4c2b-8f73-3e5c74894fce\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7f29j" Mar 09 09:10:31 crc kubenswrapper[4792]: I0309 09:10:31.522243 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-9ch2w" Mar 09 09:10:31 crc kubenswrapper[4792]: I0309 09:10:31.553695 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-flhdw\" (UniqueName: \"kubernetes.io/projected/d2d7cc79-bb68-4db2-9e5a-edd00436b08e-kube-api-access-flhdw\") pod \"service-ca-operator-777779d784-bcj7g\" (UID: \"d2d7cc79-bb68-4db2-9e5a-edd00436b08e\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-bcj7g" Mar 09 09:10:31 crc kubenswrapper[4792]: I0309 09:10:31.554367 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xd8kq\" (UniqueName: \"kubernetes.io/projected/43baf1d3-85f7-4ce5-9650-260f0803cdab-kube-api-access-xd8kq\") pod \"service-ca-9c57cc56f-7lxp2\" (UID: \"43baf1d3-85f7-4ce5-9650-260f0803cdab\") " pod="openshift-service-ca/service-ca-9c57cc56f-7lxp2" Mar 09 09:10:31 crc kubenswrapper[4792]: I0309 09:10:31.582839 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8nr7h" event={"ID":"560a23b3-6492-45d0-b273-df4e85bd9787","Type":"ContainerStarted","Data":"cd031273574f7a90738f921368b8c7fd356cb52d96d80020a24b0a954aeeb178"} Mar 09 09:10:31 crc kubenswrapper[4792]: I0309 09:10:31.582999 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-m7bwm" Mar 09 09:10:31 crc kubenswrapper[4792]: I0309 09:10:31.583206 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c697ae9b-78ed-4463-9a32-6e6bd1593d70-bound-sa-token\") pod \"ingress-operator-5b745b69d9-2v6nv\" (UID: \"c697ae9b-78ed-4463-9a32-6e6bd1593d70\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-2v6nv" Mar 09 09:10:31 crc kubenswrapper[4792]: I0309 09:10:31.587629 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vnvxt\" (UniqueName: \"kubernetes.io/projected/c952f4ee-a8f1-4c7a-bfd4-0da6e1f0ff7c-kube-api-access-vnvxt\") pod \"machine-config-operator-74547568cd-7p469\" (UID: \"c952f4ee-a8f1-4c7a-bfd4-0da6e1f0ff7c\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7p469" Mar 09 09:10:31 crc kubenswrapper[4792]: I0309 09:10:31.601483 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-ltjn9" Mar 09 09:10:31 crc kubenswrapper[4792]: I0309 09:10:31.602634 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-rl7tp" Mar 09 09:10:31 crc kubenswrapper[4792]: I0309 09:10:31.609436 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/432c0903-faf2-4367-b378-730898f2dfcc-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-m9545\" (UID: \"432c0903-faf2-4367-b378-730898f2dfcc\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-m9545" Mar 09 09:10:31 crc kubenswrapper[4792]: I0309 09:10:31.613037 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-bcj7g" Mar 09 09:10:31 crc kubenswrapper[4792]: I0309 09:10:31.619631 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-49k84" Mar 09 09:10:31 crc kubenswrapper[4792]: I0309 09:10:31.620128 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-7lxp2" Mar 09 09:10:31 crc kubenswrapper[4792]: I0309 09:10:31.634403 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j5g9c\" (UniqueName: \"kubernetes.io/projected/c697ae9b-78ed-4463-9a32-6e6bd1593d70-kube-api-access-j5g9c\") pod \"ingress-operator-5b745b69d9-2v6nv\" (UID: \"c697ae9b-78ed-4463-9a32-6e6bd1593d70\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-2v6nv" Mar 09 09:10:31 crc kubenswrapper[4792]: I0309 09:10:31.669279 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7f29j" Mar 09 09:10:31 crc kubenswrapper[4792]: I0309 09:10:31.669428 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-jh5pl" Mar 09 09:10:31 crc kubenswrapper[4792]: I0309 09:10:31.679883 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mn88s\" (UniqueName: \"kubernetes.io/projected/d3934c8c-f197-4ef6-ac5c-76560a192e50-kube-api-access-mn88s\") pod \"auto-csr-approver-29550790-q6wbp\" (UID: \"d3934c8c-f197-4ef6-ac5c-76560a192e50\") " pod="openshift-infra/auto-csr-approver-29550790-q6wbp" Mar 09 09:10:31 crc kubenswrapper[4792]: I0309 09:10:31.690149 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-slv6j\" (UniqueName: \"kubernetes.io/projected/aadae5cd-e840-4618-b021-d8ca0e9169bd-kube-api-access-slv6j\") pod \"collect-profiles-29550780-48757\" (UID: \"aadae5cd-e840-4618-b021-d8ca0e9169bd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550780-48757" Mar 09 09:10:31 crc kubenswrapper[4792]: I0309 09:10:31.690225 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-dpzj6" Mar 09 09:10:31 crc kubenswrapper[4792]: I0309 09:10:31.694822 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-brxgd\" (UniqueName: \"kubernetes.io/projected/824d014f-b04c-4304-8911-091172950873-kube-api-access-brxgd\") pod \"migrator-59844c95c7-nz8fs\" (UID: \"824d014f-b04c-4304-8911-091172950873\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-nz8fs" Mar 09 09:10:31 crc kubenswrapper[4792]: I0309 09:10:31.702133 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-nz8fs" Mar 09 09:10:31 crc kubenswrapper[4792]: I0309 09:10:31.709288 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 09 09:10:31 crc kubenswrapper[4792]: I0309 09:10:31.713790 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7c5109b5-4292-437f-8524-bb7d35147a71-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-ngdp7\" (UID: \"7c5109b5-4292-437f-8524-bb7d35147a71\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ngdp7" Mar 09 09:10:31 crc kubenswrapper[4792]: I0309 09:10:31.771555 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-znzvh" Mar 09 09:10:31 crc kubenswrapper[4792]: I0309 09:10:31.779365 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-2v6nv" Mar 09 09:10:31 crc kubenswrapper[4792]: I0309 09:10:31.812843 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ngdp7" Mar 09 09:10:31 crc kubenswrapper[4792]: I0309 09:10:31.815463 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b6131a74-912d-42f0-82ed-cc2737ef85df-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-5wctw\" (UID: \"b6131a74-912d-42f0-82ed-cc2737ef85df\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-5wctw" Mar 09 09:10:31 crc kubenswrapper[4792]: I0309 09:10:31.815514 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/488c4b65-9dcd-4303-a4b1-4c640fa9e0dd-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-xgrs4\" (UID: \"488c4b65-9dcd-4303-a4b1-4c640fa9e0dd\") " pod="openshift-marketplace/marketplace-operator-79b997595-xgrs4" Mar 09 09:10:31 crc kubenswrapper[4792]: I0309 09:10:31.815571 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e5b8826d-81fe-4d43-9177-33e8e34ca003-registry-certificates\") pod \"image-registry-697d97f7c8-kkrgv\" (UID: \"e5b8826d-81fe-4d43-9177-33e8e34ca003\") " pod="openshift-image-registry/image-registry-697d97f7c8-kkrgv" Mar 09 09:10:31 crc kubenswrapper[4792]: I0309 09:10:31.815620 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/0f5e7206-508d-427a-99ec-b7ec404ec804-srv-cert\") pod \"catalog-operator-68c6474976-k2zfx\" (UID: \"0f5e7206-508d-427a-99ec-b7ec404ec804\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-k2zfx" Mar 09 09:10:31 crc kubenswrapper[4792]: I0309 09:10:31.815765 4792 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/965fec55-0c76-48a4-b5a8-0ec1b7a349d1-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-2mjvb\" (UID: \"965fec55-0c76-48a4-b5a8-0ec1b7a349d1\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-2mjvb" Mar 09 09:10:31 crc kubenswrapper[4792]: I0309 09:10:31.815827 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/16296787-ae4c-4801-aaf3-85c1757f2919-config\") pod \"openshift-apiserver-operator-796bbdcf4f-zsj86\" (UID: \"16296787-ae4c-4801-aaf3-85c1757f2919\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zsj86" Mar 09 09:10:31 crc kubenswrapper[4792]: I0309 09:10:31.815896 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9k5bc\" (UniqueName: \"kubernetes.io/projected/9eea31d9-692f-4cc8-bb63-4c5d926712bd-kube-api-access-9k5bc\") pod \"packageserver-d55dfcdfc-4vkn2\" (UID: \"9eea31d9-692f-4cc8-bb63-4c5d926712bd\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4vkn2" Mar 09 09:10:31 crc kubenswrapper[4792]: I0309 09:10:31.815976 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-flkwq\" (UniqueName: \"kubernetes.io/projected/e5b8826d-81fe-4d43-9177-33e8e34ca003-kube-api-access-flkwq\") pod \"image-registry-697d97f7c8-kkrgv\" (UID: \"e5b8826d-81fe-4d43-9177-33e8e34ca003\") " pod="openshift-image-registry/image-registry-697d97f7c8-kkrgv" Mar 09 09:10:31 crc kubenswrapper[4792]: I0309 09:10:31.816053 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e5b8826d-81fe-4d43-9177-33e8e34ca003-registry-tls\") pod 
\"image-registry-697d97f7c8-kkrgv\" (UID: \"e5b8826d-81fe-4d43-9177-33e8e34ca003\") " pod="openshift-image-registry/image-registry-697d97f7c8-kkrgv" Mar 09 09:10:31 crc kubenswrapper[4792]: I0309 09:10:31.816114 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/965fec55-0c76-48a4-b5a8-0ec1b7a349d1-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-2mjvb\" (UID: \"965fec55-0c76-48a4-b5a8-0ec1b7a349d1\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-2mjvb" Mar 09 09:10:31 crc kubenswrapper[4792]: I0309 09:10:31.816177 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/16296787-ae4c-4801-aaf3-85c1757f2919-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-zsj86\" (UID: \"16296787-ae4c-4801-aaf3-85c1757f2919\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zsj86" Mar 09 09:10:31 crc kubenswrapper[4792]: I0309 09:10:31.816226 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/f24bba0a-6535-4ad8-8aa7-86a71a268334-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-fzxrs\" (UID: \"f24bba0a-6535-4ad8-8aa7-86a71a268334\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-fzxrs" Mar 09 09:10:31 crc kubenswrapper[4792]: I0309 09:10:31.816261 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6131a74-912d-42f0-82ed-cc2737ef85df-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-5wctw\" (UID: \"b6131a74-912d-42f0-82ed-cc2737ef85df\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-5wctw" Mar 09 09:10:31 crc kubenswrapper[4792]: I0309 09:10:31.819489 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kkrgv\" (UID: \"e5b8826d-81fe-4d43-9177-33e8e34ca003\") " pod="openshift-image-registry/image-registry-697d97f7c8-kkrgv" Mar 09 09:10:31 crc kubenswrapper[4792]: I0309 09:10:31.819618 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-77qcg\" (UniqueName: \"kubernetes.io/projected/488c4b65-9dcd-4303-a4b1-4c640fa9e0dd-kube-api-access-77qcg\") pod \"marketplace-operator-79b997595-xgrs4\" (UID: \"488c4b65-9dcd-4303-a4b1-4c640fa9e0dd\") " pod="openshift-marketplace/marketplace-operator-79b997595-xgrs4" Mar 09 09:10:31 crc kubenswrapper[4792]: I0309 09:10:31.819703 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5r58p\" (UniqueName: \"kubernetes.io/projected/b6131a74-912d-42f0-82ed-cc2737ef85df-kube-api-access-5r58p\") pod \"kube-storage-version-migrator-operator-b67b599dd-5wctw\" (UID: \"b6131a74-912d-42f0-82ed-cc2737ef85df\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-5wctw" Mar 09 09:10:31 crc kubenswrapper[4792]: I0309 09:10:31.819732 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e5b8826d-81fe-4d43-9177-33e8e34ca003-installation-pull-secrets\") pod \"image-registry-697d97f7c8-kkrgv\" (UID: \"e5b8826d-81fe-4d43-9177-33e8e34ca003\") " pod="openshift-image-registry/image-registry-697d97f7c8-kkrgv" Mar 09 09:10:31 crc 
kubenswrapper[4792]: I0309 09:10:31.819810 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-548jl\" (UniqueName: \"kubernetes.io/projected/ba27e59f-4cf4-460d-92d1-aa4e522dcfe7-kube-api-access-548jl\") pod \"machine-config-controller-84d6567774-kmnjq\" (UID: \"ba27e59f-4cf4-460d-92d1-aa4e522dcfe7\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-kmnjq" Mar 09 09:10:31 crc kubenswrapper[4792]: I0309 09:10:31.819877 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ba27e59f-4cf4-460d-92d1-aa4e522dcfe7-proxy-tls\") pod \"machine-config-controller-84d6567774-kmnjq\" (UID: \"ba27e59f-4cf4-460d-92d1-aa4e522dcfe7\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-kmnjq" Mar 09 09:10:31 crc kubenswrapper[4792]: I0309 09:10:31.819911 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/488c4b65-9dcd-4303-a4b1-4c640fa9e0dd-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-xgrs4\" (UID: \"488c4b65-9dcd-4303-a4b1-4c640fa9e0dd\") " pod="openshift-marketplace/marketplace-operator-79b997595-xgrs4" Mar 09 09:10:31 crc kubenswrapper[4792]: I0309 09:10:31.819938 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-76hgb\" (UniqueName: \"kubernetes.io/projected/f24bba0a-6535-4ad8-8aa7-86a71a268334-kube-api-access-76hgb\") pod \"control-plane-machine-set-operator-78cbb6b69f-fzxrs\" (UID: \"f24bba0a-6535-4ad8-8aa7-86a71a268334\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-fzxrs" Mar 09 09:10:31 crc kubenswrapper[4792]: I0309 09:10:31.820134 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ba27e59f-4cf4-460d-92d1-aa4e522dcfe7-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-kmnjq\" (UID: \"ba27e59f-4cf4-460d-92d1-aa4e522dcfe7\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-kmnjq" Mar 09 09:10:31 crc kubenswrapper[4792]: I0309 09:10:31.820168 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9eea31d9-692f-4cc8-bb63-4c5d926712bd-webhook-cert\") pod \"packageserver-d55dfcdfc-4vkn2\" (UID: \"9eea31d9-692f-4cc8-bb63-4c5d926712bd\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4vkn2" Mar 09 09:10:31 crc kubenswrapper[4792]: I0309 09:10:31.820203 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e5b8826d-81fe-4d43-9177-33e8e34ca003-bound-sa-token\") pod \"image-registry-697d97f7c8-kkrgv\" (UID: \"e5b8826d-81fe-4d43-9177-33e8e34ca003\") " pod="openshift-image-registry/image-registry-697d97f7c8-kkrgv" Mar 09 09:10:31 crc kubenswrapper[4792]: I0309 09:10:31.820234 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/965fec55-0c76-48a4-b5a8-0ec1b7a349d1-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-2mjvb\" (UID: \"965fec55-0c76-48a4-b5a8-0ec1b7a349d1\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-2mjvb" Mar 09 09:10:31 crc kubenswrapper[4792]: I0309 09:10:31.820278 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/30ab8a3e-ea02-40f9-a91d-5ffef4543f05-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-4ckzv\" 
(UID: \"30ab8a3e-ea02-40f9-a91d-5ffef4543f05\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-4ckzv" Mar 09 09:10:31 crc kubenswrapper[4792]: I0309 09:10:31.820351 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/9eea31d9-692f-4cc8-bb63-4c5d926712bd-tmpfs\") pod \"packageserver-d55dfcdfc-4vkn2\" (UID: \"9eea31d9-692f-4cc8-bb63-4c5d926712bd\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4vkn2" Mar 09 09:10:31 crc kubenswrapper[4792]: I0309 09:10:31.820385 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e5b8826d-81fe-4d43-9177-33e8e34ca003-trusted-ca\") pod \"image-registry-697d97f7c8-kkrgv\" (UID: \"e5b8826d-81fe-4d43-9177-33e8e34ca003\") " pod="openshift-image-registry/image-registry-697d97f7c8-kkrgv" Mar 09 09:10:31 crc kubenswrapper[4792]: I0309 09:10:31.820413 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tzs2z\" (UniqueName: \"kubernetes.io/projected/0f5e7206-508d-427a-99ec-b7ec404ec804-kube-api-access-tzs2z\") pod \"catalog-operator-68c6474976-k2zfx\" (UID: \"0f5e7206-508d-427a-99ec-b7ec404ec804\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-k2zfx" Mar 09 09:10:31 crc kubenswrapper[4792]: I0309 09:10:31.820484 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9d2v8\" (UniqueName: \"kubernetes.io/projected/16296787-ae4c-4801-aaf3-85c1757f2919-kube-api-access-9d2v8\") pod \"openshift-apiserver-operator-796bbdcf4f-zsj86\" (UID: \"16296787-ae4c-4801-aaf3-85c1757f2919\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zsj86" Mar 09 09:10:31 crc kubenswrapper[4792]: I0309 09:10:31.820992 4792 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e5b8826d-81fe-4d43-9177-33e8e34ca003-ca-trust-extracted\") pod \"image-registry-697d97f7c8-kkrgv\" (UID: \"e5b8826d-81fe-4d43-9177-33e8e34ca003\") " pod="openshift-image-registry/image-registry-697d97f7c8-kkrgv" Mar 09 09:10:31 crc kubenswrapper[4792]: I0309 09:10:31.821232 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9eea31d9-692f-4cc8-bb63-4c5d926712bd-apiservice-cert\") pod \"packageserver-d55dfcdfc-4vkn2\" (UID: \"9eea31d9-692f-4cc8-bb63-4c5d926712bd\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4vkn2" Mar 09 09:10:31 crc kubenswrapper[4792]: I0309 09:10:31.821395 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jq4cj\" (UniqueName: \"kubernetes.io/projected/30ab8a3e-ea02-40f9-a91d-5ffef4543f05-kube-api-access-jq4cj\") pod \"package-server-manager-789f6589d5-4ckzv\" (UID: \"30ab8a3e-ea02-40f9-a91d-5ffef4543f05\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-4ckzv" Mar 09 09:10:31 crc kubenswrapper[4792]: I0309 09:10:31.821894 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/0f5e7206-508d-427a-99ec-b7ec404ec804-profile-collector-cert\") pod \"catalog-operator-68c6474976-k2zfx\" (UID: \"0f5e7206-508d-427a-99ec-b7ec404ec804\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-k2zfx" Mar 09 09:10:31 crc kubenswrapper[4792]: E0309 09:10:31.823102 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-09 09:10:32.323060885 +0000 UTC m=+197.353261637 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kkrgv" (UID: "e5b8826d-81fe-4d43-9177-33e8e34ca003") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:10:31 crc kubenswrapper[4792]: I0309 09:10:31.825526 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-m9545" Mar 09 09:10:31 crc kubenswrapper[4792]: I0309 09:10:31.826431 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7p469" Mar 09 09:10:31 crc kubenswrapper[4792]: I0309 09:10:31.829160 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-l8jxj"] Mar 09 09:10:31 crc kubenswrapper[4792]: I0309 09:10:31.882604 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-tl5jf"] Mar 09 09:10:31 crc kubenswrapper[4792]: I0309 09:10:31.923527 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 09:10:31 crc kubenswrapper[4792]: E0309 09:10:31.924110 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 09:10:32.424052823 +0000 UTC m=+197.454253585 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:10:31 crc kubenswrapper[4792]: I0309 09:10:31.924248 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/0f5e7206-508d-427a-99ec-b7ec404ec804-profile-collector-cert\") pod \"catalog-operator-68c6474976-k2zfx\" (UID: \"0f5e7206-508d-427a-99ec-b7ec404ec804\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-k2zfx" Mar 09 09:10:31 crc kubenswrapper[4792]: I0309 09:10:31.924293 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/73189c5a-2649-4a99-9f15-d0e7cddea5ea-socket-dir\") pod \"csi-hostpathplugin-8kksk\" (UID: \"73189c5a-2649-4a99-9f15-d0e7cddea5ea\") " pod="hostpath-provisioner/csi-hostpathplugin-8kksk" Mar 09 09:10:31 crc kubenswrapper[4792]: I0309 09:10:31.924324 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b6131a74-912d-42f0-82ed-cc2737ef85df-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-5wctw\" (UID: \"b6131a74-912d-42f0-82ed-cc2737ef85df\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-5wctw" Mar 09 09:10:31 crc kubenswrapper[4792]: I0309 
09:10:31.924585 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/488c4b65-9dcd-4303-a4b1-4c640fa9e0dd-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-xgrs4\" (UID: \"488c4b65-9dcd-4303-a4b1-4c640fa9e0dd\") " pod="openshift-marketplace/marketplace-operator-79b997595-xgrs4" Mar 09 09:10:31 crc kubenswrapper[4792]: I0309 09:10:31.925896 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/488c4b65-9dcd-4303-a4b1-4c640fa9e0dd-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-xgrs4\" (UID: \"488c4b65-9dcd-4303-a4b1-4c640fa9e0dd\") " pod="openshift-marketplace/marketplace-operator-79b997595-xgrs4" Mar 09 09:10:31 crc kubenswrapper[4792]: I0309 09:10:31.924620 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e5b8826d-81fe-4d43-9177-33e8e34ca003-registry-certificates\") pod \"image-registry-697d97f7c8-kkrgv\" (UID: \"e5b8826d-81fe-4d43-9177-33e8e34ca003\") " pod="openshift-image-registry/image-registry-697d97f7c8-kkrgv" Mar 09 09:10:31 crc kubenswrapper[4792]: I0309 09:10:31.926013 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/0f5e7206-508d-427a-99ec-b7ec404ec804-srv-cert\") pod \"catalog-operator-68c6474976-k2zfx\" (UID: \"0f5e7206-508d-427a-99ec-b7ec404ec804\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-k2zfx" Mar 09 09:10:31 crc kubenswrapper[4792]: I0309 09:10:31.926039 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/73189c5a-2649-4a99-9f15-d0e7cddea5ea-mountpoint-dir\") pod \"csi-hostpathplugin-8kksk\" (UID: 
\"73189c5a-2649-4a99-9f15-d0e7cddea5ea\") " pod="hostpath-provisioner/csi-hostpathplugin-8kksk" Mar 09 09:10:31 crc kubenswrapper[4792]: I0309 09:10:31.926060 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/965fec55-0c76-48a4-b5a8-0ec1b7a349d1-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-2mjvb\" (UID: \"965fec55-0c76-48a4-b5a8-0ec1b7a349d1\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-2mjvb" Mar 09 09:10:31 crc kubenswrapper[4792]: I0309 09:10:31.926118 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/16296787-ae4c-4801-aaf3-85c1757f2919-config\") pod \"openshift-apiserver-operator-796bbdcf4f-zsj86\" (UID: \"16296787-ae4c-4801-aaf3-85c1757f2919\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zsj86" Mar 09 09:10:31 crc kubenswrapper[4792]: I0309 09:10:31.926149 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/c4b8955d-8b1d-4052-b00d-8319fd3bc57d-node-bootstrap-token\") pod \"machine-config-server-g7rwm\" (UID: \"c4b8955d-8b1d-4052-b00d-8319fd3bc57d\") " pod="openshift-machine-config-operator/machine-config-server-g7rwm" Mar 09 09:10:31 crc kubenswrapper[4792]: I0309 09:10:31.926263 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9k5bc\" (UniqueName: \"kubernetes.io/projected/9eea31d9-692f-4cc8-bb63-4c5d926712bd-kube-api-access-9k5bc\") pod \"packageserver-d55dfcdfc-4vkn2\" (UID: \"9eea31d9-692f-4cc8-bb63-4c5d926712bd\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4vkn2" Mar 09 09:10:31 crc kubenswrapper[4792]: I0309 09:10:31.926295 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-7l6gj\" (UniqueName: \"kubernetes.io/projected/8ef9ef9e-4f63-4e33-b0dc-9de707eed9fa-kube-api-access-7l6gj\") pod \"dns-default-4pzr2\" (UID: \"8ef9ef9e-4f63-4e33-b0dc-9de707eed9fa\") " pod="openshift-dns/dns-default-4pzr2" Mar 09 09:10:31 crc kubenswrapper[4792]: I0309 09:10:31.926342 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-flkwq\" (UniqueName: \"kubernetes.io/projected/e5b8826d-81fe-4d43-9177-33e8e34ca003-kube-api-access-flkwq\") pod \"image-registry-697d97f7c8-kkrgv\" (UID: \"e5b8826d-81fe-4d43-9177-33e8e34ca003\") " pod="openshift-image-registry/image-registry-697d97f7c8-kkrgv" Mar 09 09:10:31 crc kubenswrapper[4792]: I0309 09:10:31.926370 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/366949eb-d94a-4af9-b93b-cdb0cd5678dd-cert\") pod \"ingress-canary-zv9nb\" (UID: \"366949eb-d94a-4af9-b93b-cdb0cd5678dd\") " pod="openshift-ingress-canary/ingress-canary-zv9nb" Mar 09 09:10:31 crc kubenswrapper[4792]: I0309 09:10:31.926428 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e5b8826d-81fe-4d43-9177-33e8e34ca003-registry-tls\") pod \"image-registry-697d97f7c8-kkrgv\" (UID: \"e5b8826d-81fe-4d43-9177-33e8e34ca003\") " pod="openshift-image-registry/image-registry-697d97f7c8-kkrgv" Mar 09 09:10:31 crc kubenswrapper[4792]: I0309 09:10:31.926453 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/965fec55-0c76-48a4-b5a8-0ec1b7a349d1-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-2mjvb\" (UID: \"965fec55-0c76-48a4-b5a8-0ec1b7a349d1\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-2mjvb" Mar 09 09:10:31 crc kubenswrapper[4792]: I0309 09:10:31.926764 4792 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/16296787-ae4c-4801-aaf3-85c1757f2919-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-zsj86\" (UID: \"16296787-ae4c-4801-aaf3-85c1757f2919\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zsj86" Mar 09 09:10:31 crc kubenswrapper[4792]: I0309 09:10:31.926865 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/f24bba0a-6535-4ad8-8aa7-86a71a268334-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-fzxrs\" (UID: \"f24bba0a-6535-4ad8-8aa7-86a71a268334\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-fzxrs" Mar 09 09:10:31 crc kubenswrapper[4792]: I0309 09:10:31.926925 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e5b8826d-81fe-4d43-9177-33e8e34ca003-registry-certificates\") pod \"image-registry-697d97f7c8-kkrgv\" (UID: \"e5b8826d-81fe-4d43-9177-33e8e34ca003\") " pod="openshift-image-registry/image-registry-697d97f7c8-kkrgv" Mar 09 09:10:31 crc kubenswrapper[4792]: I0309 09:10:31.926941 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6131a74-912d-42f0-82ed-cc2737ef85df-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-5wctw\" (UID: \"b6131a74-912d-42f0-82ed-cc2737ef85df\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-5wctw" Mar 09 09:10:31 crc kubenswrapper[4792]: I0309 09:10:31.927211 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/8ef9ef9e-4f63-4e33-b0dc-9de707eed9fa-metrics-tls\") pod \"dns-default-4pzr2\" (UID: \"8ef9ef9e-4f63-4e33-b0dc-9de707eed9fa\") " pod="openshift-dns/dns-default-4pzr2" Mar 09 09:10:31 crc kubenswrapper[4792]: I0309 09:10:31.927276 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kkrgv\" (UID: \"e5b8826d-81fe-4d43-9177-33e8e34ca003\") " pod="openshift-image-registry/image-registry-697d97f7c8-kkrgv" Mar 09 09:10:31 crc kubenswrapper[4792]: I0309 09:10:31.927362 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-77qcg\" (UniqueName: \"kubernetes.io/projected/488c4b65-9dcd-4303-a4b1-4c640fa9e0dd-kube-api-access-77qcg\") pod \"marketplace-operator-79b997595-xgrs4\" (UID: \"488c4b65-9dcd-4303-a4b1-4c640fa9e0dd\") " pod="openshift-marketplace/marketplace-operator-79b997595-xgrs4" Mar 09 09:10:31 crc kubenswrapper[4792]: I0309 09:10:31.927396 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vknjp\" (UniqueName: \"kubernetes.io/projected/c4b8955d-8b1d-4052-b00d-8319fd3bc57d-kube-api-access-vknjp\") pod \"machine-config-server-g7rwm\" (UID: \"c4b8955d-8b1d-4052-b00d-8319fd3bc57d\") " pod="openshift-machine-config-operator/machine-config-server-g7rwm" Mar 09 09:10:31 crc kubenswrapper[4792]: I0309 09:10:31.927458 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5r58p\" (UniqueName: \"kubernetes.io/projected/b6131a74-912d-42f0-82ed-cc2737ef85df-kube-api-access-5r58p\") pod \"kube-storage-version-migrator-operator-b67b599dd-5wctw\" (UID: \"b6131a74-912d-42f0-82ed-cc2737ef85df\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-5wctw" Mar 09 09:10:31 crc kubenswrapper[4792]: I0309 09:10:31.927479 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e5b8826d-81fe-4d43-9177-33e8e34ca003-installation-pull-secrets\") pod \"image-registry-697d97f7c8-kkrgv\" (UID: \"e5b8826d-81fe-4d43-9177-33e8e34ca003\") " pod="openshift-image-registry/image-registry-697d97f7c8-kkrgv" Mar 09 09:10:31 crc kubenswrapper[4792]: I0309 09:10:31.927539 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-548jl\" (UniqueName: \"kubernetes.io/projected/ba27e59f-4cf4-460d-92d1-aa4e522dcfe7-kube-api-access-548jl\") pod \"machine-config-controller-84d6567774-kmnjq\" (UID: \"ba27e59f-4cf4-460d-92d1-aa4e522dcfe7\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-kmnjq" Mar 09 09:10:31 crc kubenswrapper[4792]: I0309 09:10:31.927598 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ba27e59f-4cf4-460d-92d1-aa4e522dcfe7-proxy-tls\") pod \"machine-config-controller-84d6567774-kmnjq\" (UID: \"ba27e59f-4cf4-460d-92d1-aa4e522dcfe7\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-kmnjq" Mar 09 09:10:31 crc kubenswrapper[4792]: I0309 09:10:31.927618 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/488c4b65-9dcd-4303-a4b1-4c640fa9e0dd-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-xgrs4\" (UID: \"488c4b65-9dcd-4303-a4b1-4c640fa9e0dd\") " pod="openshift-marketplace/marketplace-operator-79b997595-xgrs4" Mar 09 09:10:31 crc kubenswrapper[4792]: I0309 09:10:31.927639 4792 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-76hgb\" (UniqueName: \"kubernetes.io/projected/f24bba0a-6535-4ad8-8aa7-86a71a268334-kube-api-access-76hgb\") pod \"control-plane-machine-set-operator-78cbb6b69f-fzxrs\" (UID: \"f24bba0a-6535-4ad8-8aa7-86a71a268334\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-fzxrs" Mar 09 09:10:31 crc kubenswrapper[4792]: I0309 09:10:31.927695 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ngckd\" (UniqueName: \"kubernetes.io/projected/366949eb-d94a-4af9-b93b-cdb0cd5678dd-kube-api-access-ngckd\") pod \"ingress-canary-zv9nb\" (UID: \"366949eb-d94a-4af9-b93b-cdb0cd5678dd\") " pod="openshift-ingress-canary/ingress-canary-zv9nb" Mar 09 09:10:31 crc kubenswrapper[4792]: I0309 09:10:31.927712 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/c4b8955d-8b1d-4052-b00d-8319fd3bc57d-certs\") pod \"machine-config-server-g7rwm\" (UID: \"c4b8955d-8b1d-4052-b00d-8319fd3bc57d\") " pod="openshift-machine-config-operator/machine-config-server-g7rwm" Mar 09 09:10:31 crc kubenswrapper[4792]: I0309 09:10:31.927750 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ba27e59f-4cf4-460d-92d1-aa4e522dcfe7-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-kmnjq\" (UID: \"ba27e59f-4cf4-460d-92d1-aa4e522dcfe7\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-kmnjq" Mar 09 09:10:31 crc kubenswrapper[4792]: I0309 09:10:31.927770 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9eea31d9-692f-4cc8-bb63-4c5d926712bd-webhook-cert\") pod \"packageserver-d55dfcdfc-4vkn2\" (UID: \"9eea31d9-692f-4cc8-bb63-4c5d926712bd\") " 
pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4vkn2" Mar 09 09:10:31 crc kubenswrapper[4792]: I0309 09:10:31.927791 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e5b8826d-81fe-4d43-9177-33e8e34ca003-bound-sa-token\") pod \"image-registry-697d97f7c8-kkrgv\" (UID: \"e5b8826d-81fe-4d43-9177-33e8e34ca003\") " pod="openshift-image-registry/image-registry-697d97f7c8-kkrgv" Mar 09 09:10:31 crc kubenswrapper[4792]: I0309 09:10:31.927811 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/965fec55-0c76-48a4-b5a8-0ec1b7a349d1-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-2mjvb\" (UID: \"965fec55-0c76-48a4-b5a8-0ec1b7a349d1\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-2mjvb" Mar 09 09:10:31 crc kubenswrapper[4792]: I0309 09:10:31.927835 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/73189c5a-2649-4a99-9f15-d0e7cddea5ea-csi-data-dir\") pod \"csi-hostpathplugin-8kksk\" (UID: \"73189c5a-2649-4a99-9f15-d0e7cddea5ea\") " pod="hostpath-provisioner/csi-hostpathplugin-8kksk" Mar 09 09:10:31 crc kubenswrapper[4792]: I0309 09:10:31.927860 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/30ab8a3e-ea02-40f9-a91d-5ffef4543f05-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-4ckzv\" (UID: \"30ab8a3e-ea02-40f9-a91d-5ffef4543f05\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-4ckzv" Mar 09 09:10:31 crc kubenswrapper[4792]: I0309 09:10:31.927880 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: 
\"kubernetes.io/host-path/73189c5a-2649-4a99-9f15-d0e7cddea5ea-plugins-dir\") pod \"csi-hostpathplugin-8kksk\" (UID: \"73189c5a-2649-4a99-9f15-d0e7cddea5ea\") " pod="hostpath-provisioner/csi-hostpathplugin-8kksk" Mar 09 09:10:31 crc kubenswrapper[4792]: I0309 09:10:31.927906 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8ef9ef9e-4f63-4e33-b0dc-9de707eed9fa-config-volume\") pod \"dns-default-4pzr2\" (UID: \"8ef9ef9e-4f63-4e33-b0dc-9de707eed9fa\") " pod="openshift-dns/dns-default-4pzr2" Mar 09 09:10:31 crc kubenswrapper[4792]: I0309 09:10:31.927962 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/9eea31d9-692f-4cc8-bb63-4c5d926712bd-tmpfs\") pod \"packageserver-d55dfcdfc-4vkn2\" (UID: \"9eea31d9-692f-4cc8-bb63-4c5d926712bd\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4vkn2" Mar 09 09:10:31 crc kubenswrapper[4792]: I0309 09:10:31.927991 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e5b8826d-81fe-4d43-9177-33e8e34ca003-trusted-ca\") pod \"image-registry-697d97f7c8-kkrgv\" (UID: \"e5b8826d-81fe-4d43-9177-33e8e34ca003\") " pod="openshift-image-registry/image-registry-697d97f7c8-kkrgv" Mar 09 09:10:31 crc kubenswrapper[4792]: I0309 09:10:31.928009 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tzs2z\" (UniqueName: \"kubernetes.io/projected/0f5e7206-508d-427a-99ec-b7ec404ec804-kube-api-access-tzs2z\") pod \"catalog-operator-68c6474976-k2zfx\" (UID: \"0f5e7206-508d-427a-99ec-b7ec404ec804\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-k2zfx" Mar 09 09:10:31 crc kubenswrapper[4792]: I0309 09:10:31.928036 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-9d2v8\" (UniqueName: \"kubernetes.io/projected/16296787-ae4c-4801-aaf3-85c1757f2919-kube-api-access-9d2v8\") pod \"openshift-apiserver-operator-796bbdcf4f-zsj86\" (UID: \"16296787-ae4c-4801-aaf3-85c1757f2919\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zsj86" Mar 09 09:10:31 crc kubenswrapper[4792]: I0309 09:10:31.928055 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/73189c5a-2649-4a99-9f15-d0e7cddea5ea-registration-dir\") pod \"csi-hostpathplugin-8kksk\" (UID: \"73189c5a-2649-4a99-9f15-d0e7cddea5ea\") " pod="hostpath-provisioner/csi-hostpathplugin-8kksk" Mar 09 09:10:31 crc kubenswrapper[4792]: I0309 09:10:31.928117 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qg7bv\" (UniqueName: \"kubernetes.io/projected/73189c5a-2649-4a99-9f15-d0e7cddea5ea-kube-api-access-qg7bv\") pod \"csi-hostpathplugin-8kksk\" (UID: \"73189c5a-2649-4a99-9f15-d0e7cddea5ea\") " pod="hostpath-provisioner/csi-hostpathplugin-8kksk" Mar 09 09:10:31 crc kubenswrapper[4792]: I0309 09:10:31.928137 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e5b8826d-81fe-4d43-9177-33e8e34ca003-ca-trust-extracted\") pod \"image-registry-697d97f7c8-kkrgv\" (UID: \"e5b8826d-81fe-4d43-9177-33e8e34ca003\") " pod="openshift-image-registry/image-registry-697d97f7c8-kkrgv" Mar 09 09:10:31 crc kubenswrapper[4792]: I0309 09:10:31.928153 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9eea31d9-692f-4cc8-bb63-4c5d926712bd-apiservice-cert\") pod \"packageserver-d55dfcdfc-4vkn2\" (UID: \"9eea31d9-692f-4cc8-bb63-4c5d926712bd\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4vkn2" Mar 09 
09:10:31 crc kubenswrapper[4792]: I0309 09:10:31.928171 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jq4cj\" (UniqueName: \"kubernetes.io/projected/30ab8a3e-ea02-40f9-a91d-5ffef4543f05-kube-api-access-jq4cj\") pod \"package-server-manager-789f6589d5-4ckzv\" (UID: \"30ab8a3e-ea02-40f9-a91d-5ffef4543f05\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-4ckzv" Mar 09 09:10:31 crc kubenswrapper[4792]: E0309 09:10:31.934695 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 09:10:32.434625534 +0000 UTC m=+197.464826486 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kkrgv" (UID: "e5b8826d-81fe-4d43-9177-33e8e34ca003") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:10:31 crc kubenswrapper[4792]: I0309 09:10:31.938608 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e5b8826d-81fe-4d43-9177-33e8e34ca003-registry-tls\") pod \"image-registry-697d97f7c8-kkrgv\" (UID: \"e5b8826d-81fe-4d43-9177-33e8e34ca003\") " pod="openshift-image-registry/image-registry-697d97f7c8-kkrgv" Mar 09 09:10:31 crc kubenswrapper[4792]: I0309 09:10:31.949133 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6131a74-912d-42f0-82ed-cc2737ef85df-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-5wctw\" (UID: 
\"b6131a74-912d-42f0-82ed-cc2737ef85df\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-5wctw" Mar 09 09:10:31 crc kubenswrapper[4792]: I0309 09:10:31.952510 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/0f5e7206-508d-427a-99ec-b7ec404ec804-profile-collector-cert\") pod \"catalog-operator-68c6474976-k2zfx\" (UID: \"0f5e7206-508d-427a-99ec-b7ec404ec804\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-k2zfx" Mar 09 09:10:31 crc kubenswrapper[4792]: I0309 09:10:31.954998 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29550780-48757" Mar 09 09:10:31 crc kubenswrapper[4792]: I0309 09:10:31.956405 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/16296787-ae4c-4801-aaf3-85c1757f2919-config\") pod \"openshift-apiserver-operator-796bbdcf4f-zsj86\" (UID: \"16296787-ae4c-4801-aaf3-85c1757f2919\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zsj86" Mar 09 09:10:31 crc kubenswrapper[4792]: I0309 09:10:31.963125 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/f24bba0a-6535-4ad8-8aa7-86a71a268334-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-fzxrs\" (UID: \"f24bba0a-6535-4ad8-8aa7-86a71a268334\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-fzxrs" Mar 09 09:10:31 crc kubenswrapper[4792]: I0309 09:10:31.964529 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ba27e59f-4cf4-460d-92d1-aa4e522dcfe7-mcc-auth-proxy-config\") pod 
\"machine-config-controller-84d6567774-kmnjq\" (UID: \"ba27e59f-4cf4-460d-92d1-aa4e522dcfe7\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-kmnjq" Mar 09 09:10:31 crc kubenswrapper[4792]: I0309 09:10:31.968097 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/30ab8a3e-ea02-40f9-a91d-5ffef4543f05-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-4ckzv\" (UID: \"30ab8a3e-ea02-40f9-a91d-5ffef4543f05\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-4ckzv" Mar 09 09:10:31 crc kubenswrapper[4792]: I0309 09:10:31.968647 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ba27e59f-4cf4-460d-92d1-aa4e522dcfe7-proxy-tls\") pod \"machine-config-controller-84d6567774-kmnjq\" (UID: \"ba27e59f-4cf4-460d-92d1-aa4e522dcfe7\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-kmnjq" Mar 09 09:10:31 crc kubenswrapper[4792]: I0309 09:10:31.972983 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e5b8826d-81fe-4d43-9177-33e8e34ca003-trusted-ca\") pod \"image-registry-697d97f7c8-kkrgv\" (UID: \"e5b8826d-81fe-4d43-9177-33e8e34ca003\") " pod="openshift-image-registry/image-registry-697d97f7c8-kkrgv" Mar 09 09:10:31 crc kubenswrapper[4792]: I0309 09:10:31.973945 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e5b8826d-81fe-4d43-9177-33e8e34ca003-ca-trust-extracted\") pod \"image-registry-697d97f7c8-kkrgv\" (UID: \"e5b8826d-81fe-4d43-9177-33e8e34ca003\") " pod="openshift-image-registry/image-registry-697d97f7c8-kkrgv" Mar 09 09:10:31 crc kubenswrapper[4792]: I0309 09:10:31.974745 4792 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"config\" (UniqueName: \"kubernetes.io/configmap/965fec55-0c76-48a4-b5a8-0ec1b7a349d1-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-2mjvb\" (UID: \"965fec55-0c76-48a4-b5a8-0ec1b7a349d1\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-2mjvb" Mar 09 09:10:31 crc kubenswrapper[4792]: I0309 09:10:31.975793 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-5tppx"] Mar 09 09:10:31 crc kubenswrapper[4792]: I0309 09:10:31.976038 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9eea31d9-692f-4cc8-bb63-4c5d926712bd-webhook-cert\") pod \"packageserver-d55dfcdfc-4vkn2\" (UID: \"9eea31d9-692f-4cc8-bb63-4c5d926712bd\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4vkn2" Mar 09 09:10:31 crc kubenswrapper[4792]: I0309 09:10:31.978785 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550790-q6wbp" Mar 09 09:10:31 crc kubenswrapper[4792]: I0309 09:10:31.982386 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-p5kgd"] Mar 09 09:10:31 crc kubenswrapper[4792]: I0309 09:10:31.992826 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/488c4b65-9dcd-4303-a4b1-4c640fa9e0dd-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-xgrs4\" (UID: \"488c4b65-9dcd-4303-a4b1-4c640fa9e0dd\") " pod="openshift-marketplace/marketplace-operator-79b997595-xgrs4" Mar 09 09:10:31 crc kubenswrapper[4792]: I0309 09:10:31.993052 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e5b8826d-81fe-4d43-9177-33e8e34ca003-installation-pull-secrets\") pod \"image-registry-697d97f7c8-kkrgv\" (UID: \"e5b8826d-81fe-4d43-9177-33e8e34ca003\") " pod="openshift-image-registry/image-registry-697d97f7c8-kkrgv" Mar 09 09:10:31 crc kubenswrapper[4792]: I0309 09:10:31.993279 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/16296787-ae4c-4801-aaf3-85c1757f2919-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-zsj86\" (UID: \"16296787-ae4c-4801-aaf3-85c1757f2919\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zsj86" Mar 09 09:10:31 crc kubenswrapper[4792]: I0309 09:10:31.993324 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-ncpc5"] Mar 09 09:10:31 crc kubenswrapper[4792]: I0309 09:10:31.994599 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9eea31d9-692f-4cc8-bb63-4c5d926712bd-apiservice-cert\") pod 
\"packageserver-d55dfcdfc-4vkn2\" (UID: \"9eea31d9-692f-4cc8-bb63-4c5d926712bd\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4vkn2" Mar 09 09:10:31 crc kubenswrapper[4792]: I0309 09:10:31.994839 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/965fec55-0c76-48a4-b5a8-0ec1b7a349d1-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-2mjvb\" (UID: \"965fec55-0c76-48a4-b5a8-0ec1b7a349d1\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-2mjvb" Mar 09 09:10:32 crc kubenswrapper[4792]: I0309 09:10:31.995922 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tw44d"] Mar 09 09:10:32 crc kubenswrapper[4792]: I0309 09:10:31.996402 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b6131a74-912d-42f0-82ed-cc2737ef85df-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-5wctw\" (UID: \"b6131a74-912d-42f0-82ed-cc2737ef85df\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-5wctw" Mar 09 09:10:32 crc kubenswrapper[4792]: I0309 09:10:31.997228 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/0f5e7206-508d-427a-99ec-b7ec404ec804-srv-cert\") pod \"catalog-operator-68c6474976-k2zfx\" (UID: \"0f5e7206-508d-427a-99ec-b7ec404ec804\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-k2zfx" Mar 09 09:10:32 crc kubenswrapper[4792]: I0309 09:10:31.999483 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-flkwq\" (UniqueName: \"kubernetes.io/projected/e5b8826d-81fe-4d43-9177-33e8e34ca003-kube-api-access-flkwq\") pod \"image-registry-697d97f7c8-kkrgv\" (UID: 
\"e5b8826d-81fe-4d43-9177-33e8e34ca003\") " pod="openshift-image-registry/image-registry-697d97f7c8-kkrgv" Mar 09 09:10:32 crc kubenswrapper[4792]: I0309 09:10:32.002650 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/9eea31d9-692f-4cc8-bb63-4c5d926712bd-tmpfs\") pod \"packageserver-d55dfcdfc-4vkn2\" (UID: \"9eea31d9-692f-4cc8-bb63-4c5d926712bd\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4vkn2" Mar 09 09:10:32 crc kubenswrapper[4792]: I0309 09:10:32.004510 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jq4cj\" (UniqueName: \"kubernetes.io/projected/30ab8a3e-ea02-40f9-a91d-5ffef4543f05-kube-api-access-jq4cj\") pod \"package-server-manager-789f6589d5-4ckzv\" (UID: \"30ab8a3e-ea02-40f9-a91d-5ffef4543f05\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-4ckzv" Mar 09 09:10:32 crc kubenswrapper[4792]: I0309 09:10:32.015913 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-xllzt"] Mar 09 09:10:32 crc kubenswrapper[4792]: I0309 09:10:32.016453 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-56b7z"] Mar 09 09:10:32 crc kubenswrapper[4792]: I0309 09:10:32.021513 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-dgsw8"] Mar 09 09:10:32 crc kubenswrapper[4792]: I0309 09:10:32.030709 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/965fec55-0c76-48a4-b5a8-0ec1b7a349d1-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-2mjvb\" (UID: \"965fec55-0c76-48a4-b5a8-0ec1b7a349d1\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-2mjvb" Mar 09 09:10:32 crc kubenswrapper[4792]: I0309 
09:10:32.035781 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9k5bc\" (UniqueName: \"kubernetes.io/projected/9eea31d9-692f-4cc8-bb63-4c5d926712bd-kube-api-access-9k5bc\") pod \"packageserver-d55dfcdfc-4vkn2\" (UID: \"9eea31d9-692f-4cc8-bb63-4c5d926712bd\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4vkn2" Mar 09 09:10:32 crc kubenswrapper[4792]: I0309 09:10:32.041416 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 09:10:32 crc kubenswrapper[4792]: E0309 09:10:32.044476 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 09:10:32.544437223 +0000 UTC m=+197.574637975 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:10:32 crc kubenswrapper[4792]: I0309 09:10:32.044568 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/73189c5a-2649-4a99-9f15-d0e7cddea5ea-csi-data-dir\") pod \"csi-hostpathplugin-8kksk\" (UID: \"73189c5a-2649-4a99-9f15-d0e7cddea5ea\") " pod="hostpath-provisioner/csi-hostpathplugin-8kksk" Mar 09 09:10:32 crc kubenswrapper[4792]: I0309 09:10:32.044636 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/73189c5a-2649-4a99-9f15-d0e7cddea5ea-plugins-dir\") pod \"csi-hostpathplugin-8kksk\" (UID: \"73189c5a-2649-4a99-9f15-d0e7cddea5ea\") " pod="hostpath-provisioner/csi-hostpathplugin-8kksk" Mar 09 09:10:32 crc kubenswrapper[4792]: I0309 09:10:32.044675 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8ef9ef9e-4f63-4e33-b0dc-9de707eed9fa-config-volume\") pod \"dns-default-4pzr2\" (UID: \"8ef9ef9e-4f63-4e33-b0dc-9de707eed9fa\") " pod="openshift-dns/dns-default-4pzr2" Mar 09 09:10:32 crc kubenswrapper[4792]: I0309 09:10:32.044744 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/73189c5a-2649-4a99-9f15-d0e7cddea5ea-registration-dir\") pod \"csi-hostpathplugin-8kksk\" (UID: \"73189c5a-2649-4a99-9f15-d0e7cddea5ea\") " pod="hostpath-provisioner/csi-hostpathplugin-8kksk" Mar 09 
09:10:32 crc kubenswrapper[4792]: I0309 09:10:32.044779 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qg7bv\" (UniqueName: \"kubernetes.io/projected/73189c5a-2649-4a99-9f15-d0e7cddea5ea-kube-api-access-qg7bv\") pod \"csi-hostpathplugin-8kksk\" (UID: \"73189c5a-2649-4a99-9f15-d0e7cddea5ea\") " pod="hostpath-provisioner/csi-hostpathplugin-8kksk" Mar 09 09:10:32 crc kubenswrapper[4792]: I0309 09:10:32.044813 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/73189c5a-2649-4a99-9f15-d0e7cddea5ea-socket-dir\") pod \"csi-hostpathplugin-8kksk\" (UID: \"73189c5a-2649-4a99-9f15-d0e7cddea5ea\") " pod="hostpath-provisioner/csi-hostpathplugin-8kksk" Mar 09 09:10:32 crc kubenswrapper[4792]: I0309 09:10:32.044893 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/73189c5a-2649-4a99-9f15-d0e7cddea5ea-mountpoint-dir\") pod \"csi-hostpathplugin-8kksk\" (UID: \"73189c5a-2649-4a99-9f15-d0e7cddea5ea\") " pod="hostpath-provisioner/csi-hostpathplugin-8kksk" Mar 09 09:10:32 crc kubenswrapper[4792]: I0309 09:10:32.044919 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/c4b8955d-8b1d-4052-b00d-8319fd3bc57d-node-bootstrap-token\") pod \"machine-config-server-g7rwm\" (UID: \"c4b8955d-8b1d-4052-b00d-8319fd3bc57d\") " pod="openshift-machine-config-operator/machine-config-server-g7rwm" Mar 09 09:10:32 crc kubenswrapper[4792]: I0309 09:10:32.044941 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7l6gj\" (UniqueName: \"kubernetes.io/projected/8ef9ef9e-4f63-4e33-b0dc-9de707eed9fa-kube-api-access-7l6gj\") pod \"dns-default-4pzr2\" (UID: \"8ef9ef9e-4f63-4e33-b0dc-9de707eed9fa\") " pod="openshift-dns/dns-default-4pzr2" Mar 09 09:10:32 crc 
kubenswrapper[4792]: I0309 09:10:32.044989 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/366949eb-d94a-4af9-b93b-cdb0cd5678dd-cert\") pod \"ingress-canary-zv9nb\" (UID: \"366949eb-d94a-4af9-b93b-cdb0cd5678dd\") " pod="openshift-ingress-canary/ingress-canary-zv9nb" Mar 09 09:10:32 crc kubenswrapper[4792]: I0309 09:10:32.045052 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8ef9ef9e-4f63-4e33-b0dc-9de707eed9fa-metrics-tls\") pod \"dns-default-4pzr2\" (UID: \"8ef9ef9e-4f63-4e33-b0dc-9de707eed9fa\") " pod="openshift-dns/dns-default-4pzr2" Mar 09 09:10:32 crc kubenswrapper[4792]: I0309 09:10:32.045095 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kkrgv\" (UID: \"e5b8826d-81fe-4d43-9177-33e8e34ca003\") " pod="openshift-image-registry/image-registry-697d97f7c8-kkrgv" Mar 09 09:10:32 crc kubenswrapper[4792]: I0309 09:10:32.045140 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vknjp\" (UniqueName: \"kubernetes.io/projected/c4b8955d-8b1d-4052-b00d-8319fd3bc57d-kube-api-access-vknjp\") pod \"machine-config-server-g7rwm\" (UID: \"c4b8955d-8b1d-4052-b00d-8319fd3bc57d\") " pod="openshift-machine-config-operator/machine-config-server-g7rwm" Mar 09 09:10:32 crc kubenswrapper[4792]: I0309 09:10:32.045260 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ngckd\" (UniqueName: \"kubernetes.io/projected/366949eb-d94a-4af9-b93b-cdb0cd5678dd-kube-api-access-ngckd\") pod \"ingress-canary-zv9nb\" (UID: \"366949eb-d94a-4af9-b93b-cdb0cd5678dd\") " pod="openshift-ingress-canary/ingress-canary-zv9nb" Mar 09 
09:10:32 crc kubenswrapper[4792]: I0309 09:10:32.045279 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/c4b8955d-8b1d-4052-b00d-8319fd3bc57d-certs\") pod \"machine-config-server-g7rwm\" (UID: \"c4b8955d-8b1d-4052-b00d-8319fd3bc57d\") " pod="openshift-machine-config-operator/machine-config-server-g7rwm" Mar 09 09:10:32 crc kubenswrapper[4792]: I0309 09:10:32.045859 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/73189c5a-2649-4a99-9f15-d0e7cddea5ea-mountpoint-dir\") pod \"csi-hostpathplugin-8kksk\" (UID: \"73189c5a-2649-4a99-9f15-d0e7cddea5ea\") " pod="hostpath-provisioner/csi-hostpathplugin-8kksk" Mar 09 09:10:32 crc kubenswrapper[4792]: I0309 09:10:32.047319 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/73189c5a-2649-4a99-9f15-d0e7cddea5ea-csi-data-dir\") pod \"csi-hostpathplugin-8kksk\" (UID: \"73189c5a-2649-4a99-9f15-d0e7cddea5ea\") " pod="hostpath-provisioner/csi-hostpathplugin-8kksk" Mar 09 09:10:32 crc kubenswrapper[4792]: I0309 09:10:32.047765 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/73189c5a-2649-4a99-9f15-d0e7cddea5ea-plugins-dir\") pod \"csi-hostpathplugin-8kksk\" (UID: \"73189c5a-2649-4a99-9f15-d0e7cddea5ea\") " pod="hostpath-provisioner/csi-hostpathplugin-8kksk" Mar 09 09:10:32 crc kubenswrapper[4792]: I0309 09:10:32.048476 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8ef9ef9e-4f63-4e33-b0dc-9de707eed9fa-config-volume\") pod \"dns-default-4pzr2\" (UID: \"8ef9ef9e-4f63-4e33-b0dc-9de707eed9fa\") " pod="openshift-dns/dns-default-4pzr2" Mar 09 09:10:32 crc kubenswrapper[4792]: I0309 09:10:32.048536 4792 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/73189c5a-2649-4a99-9f15-d0e7cddea5ea-registration-dir\") pod \"csi-hostpathplugin-8kksk\" (UID: \"73189c5a-2649-4a99-9f15-d0e7cddea5ea\") " pod="hostpath-provisioner/csi-hostpathplugin-8kksk" Mar 09 09:10:32 crc kubenswrapper[4792]: I0309 09:10:32.048699 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/73189c5a-2649-4a99-9f15-d0e7cddea5ea-socket-dir\") pod \"csi-hostpathplugin-8kksk\" (UID: \"73189c5a-2649-4a99-9f15-d0e7cddea5ea\") " pod="hostpath-provisioner/csi-hostpathplugin-8kksk" Mar 09 09:10:32 crc kubenswrapper[4792]: I0309 09:10:32.048854 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/c4b8955d-8b1d-4052-b00d-8319fd3bc57d-certs\") pod \"machine-config-server-g7rwm\" (UID: \"c4b8955d-8b1d-4052-b00d-8319fd3bc57d\") " pod="openshift-machine-config-operator/machine-config-server-g7rwm" Mar 09 09:10:32 crc kubenswrapper[4792]: I0309 09:10:32.052523 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/c4b8955d-8b1d-4052-b00d-8319fd3bc57d-node-bootstrap-token\") pod \"machine-config-server-g7rwm\" (UID: \"c4b8955d-8b1d-4052-b00d-8319fd3bc57d\") " pod="openshift-machine-config-operator/machine-config-server-g7rwm" Mar 09 09:10:32 crc kubenswrapper[4792]: E0309 09:10:32.055610 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 09:10:32.555585549 +0000 UTC m=+197.585786291 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kkrgv" (UID: "e5b8826d-81fe-4d43-9177-33e8e34ca003") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:10:32 crc kubenswrapper[4792]: I0309 09:10:32.056363 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/366949eb-d94a-4af9-b93b-cdb0cd5678dd-cert\") pod \"ingress-canary-zv9nb\" (UID: \"366949eb-d94a-4af9-b93b-cdb0cd5678dd\") " pod="openshift-ingress-canary/ingress-canary-zv9nb" Mar 09 09:10:32 crc kubenswrapper[4792]: I0309 09:10:32.065850 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-77qcg\" (UniqueName: \"kubernetes.io/projected/488c4b65-9dcd-4303-a4b1-4c640fa9e0dd-kube-api-access-77qcg\") pod \"marketplace-operator-79b997595-xgrs4\" (UID: \"488c4b65-9dcd-4303-a4b1-4c640fa9e0dd\") " pod="openshift-marketplace/marketplace-operator-79b997595-xgrs4" Mar 09 09:10:32 crc kubenswrapper[4792]: I0309 09:10:32.087171 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8ef9ef9e-4f63-4e33-b0dc-9de707eed9fa-metrics-tls\") pod \"dns-default-4pzr2\" (UID: \"8ef9ef9e-4f63-4e33-b0dc-9de707eed9fa\") " pod="openshift-dns/dns-default-4pzr2" Mar 09 09:10:32 crc kubenswrapper[4792]: I0309 09:10:32.092666 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5r58p\" (UniqueName: \"kubernetes.io/projected/b6131a74-912d-42f0-82ed-cc2737ef85df-kube-api-access-5r58p\") pod \"kube-storage-version-migrator-operator-b67b599dd-5wctw\" (UID: \"b6131a74-912d-42f0-82ed-cc2737ef85df\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-5wctw" Mar 09 09:10:32 crc kubenswrapper[4792]: I0309 09:10:32.096387 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-xgrs4" Mar 09 09:10:32 crc kubenswrapper[4792]: I0309 09:10:32.122433 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-2mjvb" Mar 09 09:10:32 crc kubenswrapper[4792]: I0309 09:10:32.136990 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-76hgb\" (UniqueName: \"kubernetes.io/projected/f24bba0a-6535-4ad8-8aa7-86a71a268334-kube-api-access-76hgb\") pod \"control-plane-machine-set-operator-78cbb6b69f-fzxrs\" (UID: \"f24bba0a-6535-4ad8-8aa7-86a71a268334\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-fzxrs" Mar 09 09:10:32 crc kubenswrapper[4792]: I0309 09:10:32.141903 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-548jl\" (UniqueName: \"kubernetes.io/projected/ba27e59f-4cf4-460d-92d1-aa4e522dcfe7-kube-api-access-548jl\") pod \"machine-config-controller-84d6567774-kmnjq\" (UID: \"ba27e59f-4cf4-460d-92d1-aa4e522dcfe7\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-kmnjq" Mar 09 09:10:32 crc kubenswrapper[4792]: I0309 09:10:32.142431 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-fzxrs" Mar 09 09:10:32 crc kubenswrapper[4792]: I0309 09:10:32.147539 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 09:10:32 crc kubenswrapper[4792]: E0309 09:10:32.148020 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 09:10:32.647999754 +0000 UTC m=+197.678200506 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:10:32 crc kubenswrapper[4792]: I0309 09:10:32.155156 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tzs2z\" (UniqueName: \"kubernetes.io/projected/0f5e7206-508d-427a-99ec-b7ec404ec804-kube-api-access-tzs2z\") pod \"catalog-operator-68c6474976-k2zfx\" (UID: \"0f5e7206-508d-427a-99ec-b7ec404ec804\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-k2zfx" Mar 09 09:10:32 crc kubenswrapper[4792]: I0309 09:10:32.155477 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-5wctw" Mar 09 09:10:32 crc kubenswrapper[4792]: I0309 09:10:32.167057 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-4ckzv" Mar 09 09:10:32 crc kubenswrapper[4792]: I0309 09:10:32.173944 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9d2v8\" (UniqueName: \"kubernetes.io/projected/16296787-ae4c-4801-aaf3-85c1757f2919-kube-api-access-9d2v8\") pod \"openshift-apiserver-operator-796bbdcf4f-zsj86\" (UID: \"16296787-ae4c-4801-aaf3-85c1757f2919\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zsj86" Mar 09 09:10:32 crc kubenswrapper[4792]: I0309 09:10:32.183870 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-k2zfx" Mar 09 09:10:32 crc kubenswrapper[4792]: I0309 09:10:32.194651 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-kmnjq" Mar 09 09:10:32 crc kubenswrapper[4792]: I0309 09:10:32.196616 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e5b8826d-81fe-4d43-9177-33e8e34ca003-bound-sa-token\") pod \"image-registry-697d97f7c8-kkrgv\" (UID: \"e5b8826d-81fe-4d43-9177-33e8e34ca003\") " pod="openshift-image-registry/image-registry-697d97f7c8-kkrgv" Mar 09 09:10:32 crc kubenswrapper[4792]: I0309 09:10:32.208116 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-7lxp2"] Mar 09 09:10:32 crc kubenswrapper[4792]: I0309 09:10:32.212745 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qg7bv\" (UniqueName: \"kubernetes.io/projected/73189c5a-2649-4a99-9f15-d0e7cddea5ea-kube-api-access-qg7bv\") pod \"csi-hostpathplugin-8kksk\" (UID: \"73189c5a-2649-4a99-9f15-d0e7cddea5ea\") " pod="hostpath-provisioner/csi-hostpathplugin-8kksk" Mar 09 09:10:32 crc kubenswrapper[4792]: I0309 09:10:32.236136 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4vkn2" Mar 09 09:10:32 crc kubenswrapper[4792]: I0309 09:10:32.259899 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kkrgv\" (UID: \"e5b8826d-81fe-4d43-9177-33e8e34ca003\") " pod="openshift-image-registry/image-registry-697d97f7c8-kkrgv" Mar 09 09:10:32 crc kubenswrapper[4792]: E0309 09:10:32.260338 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-09 09:10:32.760325396 +0000 UTC m=+197.790526148 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kkrgv" (UID: "e5b8826d-81fe-4d43-9177-33e8e34ca003") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:10:32 crc kubenswrapper[4792]: W0309 09:10:32.260424 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4a19004_24a2_4825_bca2_24d98c3d69cc.slice/crio-76c3850f2bffec6088b91fdbb390f90af1ab3b6623e313d34b43e7d1105518db WatchSource:0}: Error finding container 76c3850f2bffec6088b91fdbb390f90af1ab3b6623e313d34b43e7d1105518db: Status 404 returned error can't find the container with id 76c3850f2bffec6088b91fdbb390f90af1ab3b6623e313d34b43e7d1105518db Mar 09 09:10:32 crc kubenswrapper[4792]: I0309 09:10:32.268303 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ngckd\" (UniqueName: \"kubernetes.io/projected/366949eb-d94a-4af9-b93b-cdb0cd5678dd-kube-api-access-ngckd\") pod \"ingress-canary-zv9nb\" (UID: \"366949eb-d94a-4af9-b93b-cdb0cd5678dd\") " pod="openshift-ingress-canary/ingress-canary-zv9nb" Mar 09 09:10:32 crc kubenswrapper[4792]: I0309 09:10:32.273114 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-zv9nb" Mar 09 09:10:32 crc kubenswrapper[4792]: I0309 09:10:32.281173 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vknjp\" (UniqueName: \"kubernetes.io/projected/c4b8955d-8b1d-4052-b00d-8319fd3bc57d-kube-api-access-vknjp\") pod \"machine-config-server-g7rwm\" (UID: \"c4b8955d-8b1d-4052-b00d-8319fd3bc57d\") " pod="openshift-machine-config-operator/machine-config-server-g7rwm" Mar 09 09:10:32 crc kubenswrapper[4792]: I0309 09:10:32.282381 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-9ch2w"] Mar 09 09:10:32 crc kubenswrapper[4792]: I0309 09:10:32.296680 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-8kksk" Mar 09 09:10:32 crc kubenswrapper[4792]: I0309 09:10:32.297295 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7l6gj\" (UniqueName: \"kubernetes.io/projected/8ef9ef9e-4f63-4e33-b0dc-9de707eed9fa-kube-api-access-7l6gj\") pod \"dns-default-4pzr2\" (UID: \"8ef9ef9e-4f63-4e33-b0dc-9de707eed9fa\") " pod="openshift-dns/dns-default-4pzr2" Mar 09 09:10:32 crc kubenswrapper[4792]: I0309 09:10:32.364235 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 09:10:32 crc kubenswrapper[4792]: E0309 09:10:32.364543 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-09 09:10:32.864527495 +0000 UTC m=+197.894728247 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:10:32 crc kubenswrapper[4792]: I0309 09:10:32.375443 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-bcj7g"] Mar 09 09:10:32 crc kubenswrapper[4792]: I0309 09:10:32.381210 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-rl7tp"] Mar 09 09:10:32 crc kubenswrapper[4792]: I0309 09:10:32.404009 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zsj86" Mar 09 09:10:32 crc kubenswrapper[4792]: I0309 09:10:32.428609 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-ltjn9"] Mar 09 09:10:32 crc kubenswrapper[4792]: W0309 09:10:32.441079 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf39d8cd3_a63e_4aeb_9609_fe4c8ed6372b.slice/crio-55316d4efb7e8b376f35c1b4d14f251fbbd44e51ca0f7ad7124702b093350bc2 WatchSource:0}: Error finding container 55316d4efb7e8b376f35c1b4d14f251fbbd44e51ca0f7ad7124702b093350bc2: Status 404 returned error can't find the container with id 55316d4efb7e8b376f35c1b4d14f251fbbd44e51ca0f7ad7124702b093350bc2 Mar 09 09:10:32 crc kubenswrapper[4792]: I0309 09:10:32.468983 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-m7bwm"] Mar 09 09:10:32 crc kubenswrapper[4792]: I0309 09:10:32.475466 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kkrgv\" (UID: \"e5b8826d-81fe-4d43-9177-33e8e34ca003\") " pod="openshift-image-registry/image-registry-697d97f7c8-kkrgv" Mar 09 09:10:32 crc kubenswrapper[4792]: E0309 09:10:32.478311 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 09:10:32.978290366 +0000 UTC m=+198.008491118 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kkrgv" (UID: "e5b8826d-81fe-4d43-9177-33e8e34ca003") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:10:32 crc kubenswrapper[4792]: I0309 09:10:32.578648 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 09:10:32 crc kubenswrapper[4792]: E0309 09:10:32.579037 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 09:10:33.079000467 +0000 UTC m=+198.109201209 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:10:32 crc kubenswrapper[4792]: I0309 09:10:32.579047 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-g7rwm" Mar 09 09:10:32 crc kubenswrapper[4792]: I0309 09:10:32.581139 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kkrgv\" (UID: \"e5b8826d-81fe-4d43-9177-33e8e34ca003\") " pod="openshift-image-registry/image-registry-697d97f7c8-kkrgv" Mar 09 09:10:32 crc kubenswrapper[4792]: E0309 09:10:32.581723 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 09:10:33.081707224 +0000 UTC m=+198.111907976 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kkrgv" (UID: "e5b8826d-81fe-4d43-9177-33e8e34ca003") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:10:32 crc kubenswrapper[4792]: I0309 09:10:32.585699 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-4pzr2" Mar 09 09:10:32 crc kubenswrapper[4792]: I0309 09:10:32.632395 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-49k84"] Mar 09 09:10:32 crc kubenswrapper[4792]: I0309 09:10:32.688860 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 09:10:32 crc kubenswrapper[4792]: E0309 09:10:32.689119 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 09:10:33.189035983 +0000 UTC m=+198.219236735 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:10:32 crc kubenswrapper[4792]: I0309 09:10:32.689202 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kkrgv\" (UID: \"e5b8826d-81fe-4d43-9177-33e8e34ca003\") " pod="openshift-image-registry/image-registry-697d97f7c8-kkrgv" Mar 09 09:10:32 crc kubenswrapper[4792]: E0309 09:10:32.689553 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 09:10:33.189545707 +0000 UTC m=+198.219746459 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kkrgv" (UID: "e5b8826d-81fe-4d43-9177-33e8e34ca003") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:10:32 crc kubenswrapper[4792]: I0309 09:10:32.724237 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-dgsw8" event={"ID":"530ed995-d9d1-4aff-93b9-9b2a35194cd2","Type":"ContainerStarted","Data":"5900e1272f3e6edc9ddcdc9c820e01d0572808a92199bb3c46ca961748ace66a"} Mar 09 09:10:32 crc kubenswrapper[4792]: I0309 09:10:32.749491 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-9ch2w" event={"ID":"f39d8cd3-a63e-4aeb-9609-fe4c8ed6372b","Type":"ContainerStarted","Data":"55316d4efb7e8b376f35c1b4d14f251fbbd44e51ca0f7ad7124702b093350bc2"} Mar 09 09:10:32 crc kubenswrapper[4792]: I0309 09:10:32.751413 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-nz8fs"] Mar 09 09:10:32 crc kubenswrapper[4792]: I0309 09:10:32.766576 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-tl5jf" event={"ID":"adf7cc5f-5027-4382-bac8-ed4f459fe424","Type":"ContainerStarted","Data":"9405041339e4573a3ac71f68d5a4b4c87a47d2f1689e2c7a912e5e916b4ea2d9"} Mar 09 09:10:32 crc kubenswrapper[4792]: I0309 09:10:32.792621 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-56b7z" event={"ID":"fc2b2079-7189-4ca3-b398-2a1146b9c70f","Type":"ContainerStarted","Data":"95ce405596f05da44d608fcf17620d4831d321f25f952df0f3f3809054809084"} Mar 09 09:10:32 crc 
kubenswrapper[4792]: I0309 09:10:32.793815 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 09:10:32 crc kubenswrapper[4792]: E0309 09:10:32.794233 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 09:10:33.294216321 +0000 UTC m=+198.324417073 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:10:32 crc kubenswrapper[4792]: I0309 09:10:32.797232 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-7f29j"] Mar 09 09:10:32 crc kubenswrapper[4792]: I0309 09:10:32.815481 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-bcj7g" event={"ID":"d2d7cc79-bb68-4db2-9e5a-edd00436b08e","Type":"ContainerStarted","Data":"cd24852f3a3ba5a827c2d08bc28ad2c8a569b17f9cae1dc828f6c77808ec7a81"} Mar 09 09:10:32 crc kubenswrapper[4792]: I0309 09:10:32.819018 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-jh5pl"] Mar 09 09:10:32 crc kubenswrapper[4792]: I0309 09:10:32.828592 
4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-7lxp2" event={"ID":"43baf1d3-85f7-4ce5-9650-260f0803cdab","Type":"ContainerStarted","Data":"182ff7de8d80277bc68e00bbb8476701b9a5ce46848d0dc661c19ab9d42479d7"} Mar 09 09:10:32 crc kubenswrapper[4792]: I0309 09:10:32.830466 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-5tppx" event={"ID":"35d5435d-5c5f-4c5c-bca6-d2f5a89c25d8","Type":"ContainerStarted","Data":"0aa9bc62184069a08b8e0909a165b23db3fa8a041066e1f55153268c8eaf8458"} Mar 09 09:10:32 crc kubenswrapper[4792]: I0309 09:10:32.836479 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-rl7tp" event={"ID":"f5fb0f27-68dd-4280-84f5-b46851d6ab96","Type":"ContainerStarted","Data":"9b34999f0bd6930ffb56a52cf6c90adeebc398c7b9cdd6efb43bbf2e86bcb5d4"} Mar 09 09:10:32 crc kubenswrapper[4792]: I0309 09:10:32.837813 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-xllzt" event={"ID":"f4a19004-24a2-4825-bca2-24d98c3d69cc","Type":"ContainerStarted","Data":"76c3850f2bffec6088b91fdbb390f90af1ab3b6623e313d34b43e7d1105518db"} Mar 09 09:10:32 crc kubenswrapper[4792]: I0309 09:10:32.838973 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8nr7h" event={"ID":"560a23b3-6492-45d0-b273-df4e85bd9787","Type":"ContainerStarted","Data":"f29622fe2e179afac6152042fb66fd7f1a0fdee159e40cc38bae151cff6ad2d0"} Mar 09 09:10:32 crc kubenswrapper[4792]: I0309 09:10:32.851704 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-l8jxj" event={"ID":"62e49d0b-dfc1-48ab-bed7-7ba7fe8a4475","Type":"ContainerStarted","Data":"13524d7d243e10bdb6dc8f5e89600a19f68144551ed74b459df5c1a5f3c84c5d"} Mar 09 09:10:32 crc 
kubenswrapper[4792]: I0309 09:10:32.865954 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-ncpc5" event={"ID":"0f3faf8f-f49a-4e99-93cc-50eecd2d2c3c","Type":"ContainerStarted","Data":"1e726d8f8ccf856eab5b946e783e8cb96a090b12291d43ce0ec6719be3bfc59f"} Mar 09 09:10:32 crc kubenswrapper[4792]: I0309 09:10:32.896919 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kkrgv\" (UID: \"e5b8826d-81fe-4d43-9177-33e8e34ca003\") " pod="openshift-image-registry/image-registry-697d97f7c8-kkrgv" Mar 09 09:10:32 crc kubenswrapper[4792]: E0309 09:10:32.897322 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 09:10:33.397308628 +0000 UTC m=+198.427509380 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kkrgv" (UID: "e5b8826d-81fe-4d43-9177-33e8e34ca003") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:10:32 crc kubenswrapper[4792]: I0309 09:10:32.904554 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-znzvh"] Mar 09 09:10:32 crc kubenswrapper[4792]: I0309 09:10:32.911479 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tw44d" event={"ID":"e4caea97-6238-4185-9e61-5a40aa699205","Type":"ContainerStarted","Data":"7eb9ce56cdbcce46790c31dd34e7efb939169367e7d64bc10e658bba0599f4ce"} Mar 09 09:10:32 crc kubenswrapper[4792]: I0309 09:10:32.946688 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-p5kgd" event={"ID":"e37d381c-62ae-489d-888d-320a3f7959cc","Type":"ContainerStarted","Data":"1e97b6b5192afb12b2df1d52a09f5524214eb852010938d1302d36a7761164da"} Mar 09 09:10:32 crc kubenswrapper[4792]: I0309 09:10:32.971565 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-kcnhk" event={"ID":"83d49252-7752-48bc-86b6-c604984cd533","Type":"ContainerStarted","Data":"6c87a9e111477d9660fdeef5f8cd7863e08b970b0f54a8e3872613a9389ec685"} Mar 09 09:10:32 crc kubenswrapper[4792]: I0309 09:10:32.971621 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-kcnhk" 
event={"ID":"83d49252-7752-48bc-86b6-c604984cd533","Type":"ContainerStarted","Data":"a272a9258ed4a263a132d2ab8cf96a551e0c6c294c2551458151c2dbc5139474"} Mar 09 09:10:32 crc kubenswrapper[4792]: I0309 09:10:32.983657 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-2v6nv"] Mar 09 09:10:32 crc kubenswrapper[4792]: I0309 09:10:32.998846 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 09:10:33 crc kubenswrapper[4792]: E0309 09:10:32.999841 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 09:10:33.49979819 +0000 UTC m=+198.529998942 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:10:33 crc kubenswrapper[4792]: W0309 09:10:33.043080 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd3c17466_62af_4a4b_bc43_a4eda5d974dd.slice/crio-0a9dc965d7e41f0caa10bc22a2444136b362fb53109fc851a9f611d66c182554 WatchSource:0}: Error finding container 0a9dc965d7e41f0caa10bc22a2444136b362fb53109fc851a9f611d66c182554: Status 404 returned error can't find the container with id 0a9dc965d7e41f0caa10bc22a2444136b362fb53109fc851a9f611d66c182554 Mar 09 09:10:33 crc kubenswrapper[4792]: I0309 09:10:33.073793 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-xgrs4"] Mar 09 09:10:33 crc kubenswrapper[4792]: I0309 09:10:33.074663 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29550780-48757"] Mar 09 09:10:33 crc kubenswrapper[4792]: I0309 09:10:33.101011 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kkrgv\" (UID: \"e5b8826d-81fe-4d43-9177-33e8e34ca003\") " pod="openshift-image-registry/image-registry-697d97f7c8-kkrgv" Mar 09 09:10:33 crc kubenswrapper[4792]: E0309 09:10:33.101407 4792 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 09:10:33.601391895 +0000 UTC m=+198.631592647 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kkrgv" (UID: "e5b8826d-81fe-4d43-9177-33e8e34ca003") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:10:33 crc kubenswrapper[4792]: W0309 09:10:33.207309 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod824d014f_b04c_4304_8911_091172950873.slice/crio-e1fe56a14ff0e1e4360e734d5f45473a7ce74695dff63a918731da49f06c9825 WatchSource:0}: Error finding container e1fe56a14ff0e1e4360e734d5f45473a7ce74695dff63a918731da49f06c9825: Status 404 returned error can't find the container with id e1fe56a14ff0e1e4360e734d5f45473a7ce74695dff63a918731da49f06c9825 Mar 09 09:10:33 crc kubenswrapper[4792]: I0309 09:10:33.207912 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 09:10:33 crc kubenswrapper[4792]: E0309 09:10:33.208547 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-09 09:10:33.708527309 +0000 UTC m=+198.738728061 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:10:33 crc kubenswrapper[4792]: I0309 09:10:33.238513 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4vkn2"] Mar 09 09:10:33 crc kubenswrapper[4792]: I0309 09:10:33.331203 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kkrgv\" (UID: \"e5b8826d-81fe-4d43-9177-33e8e34ca003\") " pod="openshift-image-registry/image-registry-697d97f7c8-kkrgv" Mar 09 09:10:33 crc kubenswrapper[4792]: E0309 09:10:33.332571 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 09:10:33.832553781 +0000 UTC m=+198.862754543 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kkrgv" (UID: "e5b8826d-81fe-4d43-9177-33e8e34ca003") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:10:33 crc kubenswrapper[4792]: I0309 09:10:33.357338 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550790-q6wbp"] Mar 09 09:10:33 crc kubenswrapper[4792]: I0309 09:10:33.357883 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-7p469"] Mar 09 09:10:33 crc kubenswrapper[4792]: W0309 09:10:33.389876 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaadae5cd_e840_4618_b021_d8ca0e9169bd.slice/crio-5a6a741ce6a435148ea4ae418febdf572a7a2abb621ec146ab9ae5f1704b94a7 WatchSource:0}: Error finding container 5a6a741ce6a435148ea4ae418febdf572a7a2abb621ec146ab9ae5f1704b94a7: Status 404 returned error can't find the container with id 5a6a741ce6a435148ea4ae418febdf572a7a2abb621ec146ab9ae5f1704b94a7 Mar 09 09:10:33 crc kubenswrapper[4792]: I0309 09:10:33.428362 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-dpzj6"] Mar 09 09:10:33 crc kubenswrapper[4792]: I0309 09:10:33.434272 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 09:10:33 crc 
kubenswrapper[4792]: E0309 09:10:33.437705 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 09:10:33.937680507 +0000 UTC m=+198.967881259 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:10:33 crc kubenswrapper[4792]: I0309 09:10:33.444014 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-fzxrs"] Mar 09 09:10:33 crc kubenswrapper[4792]: I0309 09:10:33.466762 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-kmnjq"] Mar 09 09:10:33 crc kubenswrapper[4792]: I0309 09:10:33.496428 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-kcnhk" Mar 09 09:10:33 crc kubenswrapper[4792]: I0309 09:10:33.497006 4792 patch_prober.go:28] interesting pod/router-default-5444994796-kcnhk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 09 09:10:33 crc kubenswrapper[4792]: [-]has-synced failed: reason withheld Mar 09 09:10:33 crc kubenswrapper[4792]: [+]process-running ok Mar 09 09:10:33 crc kubenswrapper[4792]: healthz check failed Mar 09 09:10:33 crc kubenswrapper[4792]: I0309 
09:10:33.497247 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-kcnhk" podUID="83d49252-7752-48bc-86b6-c604984cd533" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 09 09:10:33 crc kubenswrapper[4792]: I0309 09:10:33.538683 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-ngdp7"] Mar 09 09:10:33 crc kubenswrapper[4792]: I0309 09:10:33.540357 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kkrgv\" (UID: \"e5b8826d-81fe-4d43-9177-33e8e34ca003\") " pod="openshift-image-registry/image-registry-697d97f7c8-kkrgv" Mar 09 09:10:33 crc kubenswrapper[4792]: E0309 09:10:33.540721 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 09:10:34.040705284 +0000 UTC m=+199.070906046 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kkrgv" (UID: "e5b8826d-81fe-4d43-9177-33e8e34ca003") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:10:33 crc kubenswrapper[4792]: I0309 09:10:33.552110 4792 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 09 09:10:33 crc kubenswrapper[4792]: I0309 09:10:33.642203 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 09:10:33 crc kubenswrapper[4792]: E0309 09:10:33.642895 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 09:10:34.142877056 +0000 UTC m=+199.173077808 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:10:33 crc kubenswrapper[4792]: I0309 09:10:33.659499 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-k2zfx"] Mar 09 09:10:33 crc kubenswrapper[4792]: I0309 09:10:33.695831 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-5wctw"] Mar 09 09:10:33 crc kubenswrapper[4792]: I0309 09:10:33.746685 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kkrgv\" (UID: \"e5b8826d-81fe-4d43-9177-33e8e34ca003\") " pod="openshift-image-registry/image-registry-697d97f7c8-kkrgv" Mar 09 09:10:33 crc kubenswrapper[4792]: E0309 09:10:33.747105 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 09:10:34.247086386 +0000 UTC m=+199.277287138 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kkrgv" (UID: "e5b8826d-81fe-4d43-9177-33e8e34ca003") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:10:33 crc kubenswrapper[4792]: I0309 09:10:33.756054 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-m9545"] Mar 09 09:10:33 crc kubenswrapper[4792]: I0309 09:10:33.780969 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-8kksk"] Mar 09 09:10:33 crc kubenswrapper[4792]: I0309 09:10:33.795502 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-zv9nb"] Mar 09 09:10:33 crc kubenswrapper[4792]: I0309 09:10:33.802364 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-2mjvb"] Mar 09 09:10:33 crc kubenswrapper[4792]: I0309 09:10:33.808180 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-4ckzv"] Mar 09 09:10:33 crc kubenswrapper[4792]: I0309 09:10:33.871966 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 09:10:33 crc kubenswrapper[4792]: E0309 09:10:33.872642 4792 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 09:10:34.372614362 +0000 UTC m=+199.402815114 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:10:33 crc kubenswrapper[4792]: I0309 09:10:33.891582 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-kcnhk" podStartSLOduration=155.891556751 podStartE2EDuration="2m35.891556751s" podCreationTimestamp="2026-03-09 09:07:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:10:33.884000395 +0000 UTC m=+198.914201147" watchObservedRunningTime="2026-03-09 09:10:33.891556751 +0000 UTC m=+198.921757503" Mar 09 09:10:33 crc kubenswrapper[4792]: I0309 09:10:33.973670 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kkrgv\" (UID: \"e5b8826d-81fe-4d43-9177-33e8e34ca003\") " pod="openshift-image-registry/image-registry-697d97f7c8-kkrgv" Mar 09 09:10:33 crc kubenswrapper[4792]: E0309 09:10:33.974193 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: 
nodeName:}" failed. No retries permitted until 2026-03-09 09:10:34.474176527 +0000 UTC m=+199.504377279 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kkrgv" (UID: "e5b8826d-81fe-4d43-9177-33e8e34ca003") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:10:33 crc kubenswrapper[4792]: W0309 09:10:33.996674 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod965fec55_0c76_48a4_b5a8_0ec1b7a349d1.slice/crio-b0df703b1ef280530387f86d0ded473a04dc627a8d89706e56652ad85bf19c4b WatchSource:0}: Error finding container b0df703b1ef280530387f86d0ded473a04dc627a8d89706e56652ad85bf19c4b: Status 404 returned error can't find the container with id b0df703b1ef280530387f86d0ded473a04dc627a8d89706e56652ad85bf19c4b Mar 09 09:10:34 crc kubenswrapper[4792]: I0309 09:10:34.023663 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-m7bwm" event={"ID":"5e17c213-e2c1-4625-8487-2bdb28b0224d","Type":"ContainerStarted","Data":"7ec8ab7bc41ca9a53e536980e9643a216d52c29bd70a341208765a45ddd97a7f"} Mar 09 09:10:34 crc kubenswrapper[4792]: I0309 09:10:34.039334 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-k2zfx" event={"ID":"0f5e7206-508d-427a-99ec-b7ec404ec804","Type":"ContainerStarted","Data":"a8fce27a686062cb9a4ef587d66475aa31ca73149acd054ca82fe5d6c21e094d"} Mar 09 09:10:34 crc kubenswrapper[4792]: I0309 09:10:34.076030 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 09:10:34 crc kubenswrapper[4792]: E0309 09:10:34.078199 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 09:10:34.57815104 +0000 UTC m=+199.608351832 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:10:34 crc kubenswrapper[4792]: I0309 09:10:34.084975 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550790-q6wbp" event={"ID":"d3934c8c-f197-4ef6-ac5c-76560a192e50","Type":"ContainerStarted","Data":"3cc6753455e1631f985f3dfdaab258d74d9a79eb9299f20d3e62c1428b0b3989"} Mar 09 09:10:34 crc kubenswrapper[4792]: I0309 09:10:34.102545 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-4pzr2"] Mar 09 09:10:34 crc kubenswrapper[4792]: I0309 09:10:34.119989 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-2v6nv" event={"ID":"c697ae9b-78ed-4463-9a32-6e6bd1593d70","Type":"ContainerStarted","Data":"21d0e58faa6cd5f2506095aba72c2bc576c500a779701b01909dab2b3760de62"} Mar 09 09:10:34 crc kubenswrapper[4792]: I0309 09:10:34.123454 4792 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-znzvh" event={"ID":"2012f11e-ed51-4453-8f96-47269823dded","Type":"ContainerStarted","Data":"23be8b4a6557fb6be26be6ea0f00349c3e096ab50f2743ac13ccf65c3bd04e4a"} Mar 09 09:10:34 crc kubenswrapper[4792]: I0309 09:10:34.125873 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4vkn2" event={"ID":"9eea31d9-692f-4cc8-bb63-4c5d926712bd","Type":"ContainerStarted","Data":"0f3c9033cd519798d2d302c6772b8c47d11d471fa356cb7a9782930cb39d6975"} Mar 09 09:10:34 crc kubenswrapper[4792]: I0309 09:10:34.128136 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-49k84" event={"ID":"d3c17466-62af-4a4b-bc43-a4eda5d974dd","Type":"ContainerStarted","Data":"0a9dc965d7e41f0caa10bc22a2444136b362fb53109fc851a9f611d66c182554"} Mar 09 09:10:34 crc kubenswrapper[4792]: I0309 09:10:34.129499 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-kmnjq" event={"ID":"ba27e59f-4cf4-460d-92d1-aa4e522dcfe7","Type":"ContainerStarted","Data":"e7706a432144852082fa4dfc55c83f46e1ed58e1a22a235ac113572c1de117c7"} Mar 09 09:10:34 crc kubenswrapper[4792]: I0309 09:10:34.132221 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-xllzt" event={"ID":"f4a19004-24a2-4825-bca2-24d98c3d69cc","Type":"ContainerStarted","Data":"c2e8a7a0f0b5731580e8730f07c9cd84c16d7ccff95fa0fd97f46971302b0c9e"} Mar 09 09:10:34 crc kubenswrapper[4792]: I0309 09:10:34.132548 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-xllzt" Mar 09 09:10:34 crc kubenswrapper[4792]: I0309 09:10:34.136643 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-5wctw" event={"ID":"b6131a74-912d-42f0-82ed-cc2737ef85df","Type":"ContainerStarted","Data":"5166bbc9455ebd09b350ed73e58d83551ccf1a80a9735e7bfae0933740cb1e98"} Mar 09 09:10:34 crc kubenswrapper[4792]: I0309 09:10:34.137851 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7p469" event={"ID":"c952f4ee-a8f1-4c7a-bfd4-0da6e1f0ff7c","Type":"ContainerStarted","Data":"fa8c1cbbd885aa6ba4365b791331c93453a0cea3241d70637e64ed55d70195a9"} Mar 09 09:10:34 crc kubenswrapper[4792]: I0309 09:10:34.148807 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-5tppx" event={"ID":"35d5435d-5c5f-4c5c-bca6-d2f5a89c25d8","Type":"ContainerStarted","Data":"27061e2f6e8497dbdf946dd93eb80428489554687fd322df033eaae46f251511"} Mar 09 09:10:34 crc kubenswrapper[4792]: I0309 09:10:34.150266 4792 patch_prober.go:28] interesting pod/console-operator-58897d9998-xllzt container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.15:8443/readyz\": dial tcp 10.217.0.15:8443: connect: connection refused" start-of-body= Mar 09 09:10:34 crc kubenswrapper[4792]: I0309 09:10:34.150323 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-xllzt" podUID="f4a19004-24a2-4825-bca2-24d98c3d69cc" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.15:8443/readyz\": dial tcp 10.217.0.15:8443: connect: connection refused" Mar 09 09:10:34 crc kubenswrapper[4792]: I0309 09:10:34.155339 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-xllzt" podStartSLOduration=156.155317382 podStartE2EDuration="2m36.155317382s" 
podCreationTimestamp="2026-03-09 09:07:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:10:34.153341406 +0000 UTC m=+199.183542158" watchObservedRunningTime="2026-03-09 09:10:34.155317382 +0000 UTC m=+199.185518134" Mar 09 09:10:34 crc kubenswrapper[4792]: I0309 09:10:34.160736 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ngdp7" event={"ID":"7c5109b5-4292-437f-8524-bb7d35147a71","Type":"ContainerStarted","Data":"f05dabd220cd327577f865b20befd1e58f1da8355d0d941c11d0f9e603df8a9a"} Mar 09 09:10:34 crc kubenswrapper[4792]: I0309 09:10:34.165170 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-4ckzv" event={"ID":"30ab8a3e-ea02-40f9-a91d-5ffef4543f05","Type":"ContainerStarted","Data":"f87b84212dcd68fc7aaf2ef43676d4077877f36f847b96e914ae487f6615c0ae"} Mar 09 09:10:34 crc kubenswrapper[4792]: I0309 09:10:34.166099 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7f29j" event={"ID":"ea625b31-78ac-4c2b-8f73-3e5c74894fce","Type":"ContainerStarted","Data":"5a931101a6e4300f456c349bd1a62314a1bb033b841ed1b33234504975c05fb3"} Mar 09 09:10:34 crc kubenswrapper[4792]: I0309 09:10:34.169249 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-jh5pl" event={"ID":"894f7c69-0119-4c19-b205-9780fb52b06e","Type":"ContainerStarted","Data":"c0c53cdb88646265b987d527b693d1063543d83446fd0feeac46711ec26bab27"} Mar 09 09:10:34 crc kubenswrapper[4792]: I0309 09:10:34.171465 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-fzxrs" 
event={"ID":"f24bba0a-6535-4ad8-8aa7-86a71a268334","Type":"ContainerStarted","Data":"595e92b12e385f645e767da20d443b256302d6301792eb57c26294afe7ceddf2"} Mar 09 09:10:34 crc kubenswrapper[4792]: I0309 09:10:34.175273 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-8kksk" event={"ID":"73189c5a-2649-4a99-9f15-d0e7cddea5ea","Type":"ContainerStarted","Data":"0482b9b0b9ae6db12099b503b05087d631bc250b3ffe66f5f75b7c10f9dc52f7"} Mar 09 09:10:34 crc kubenswrapper[4792]: I0309 09:10:34.179923 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kkrgv\" (UID: \"e5b8826d-81fe-4d43-9177-33e8e34ca003\") " pod="openshift-image-registry/image-registry-697d97f7c8-kkrgv" Mar 09 09:10:34 crc kubenswrapper[4792]: E0309 09:10:34.183656 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 09:10:34.683640857 +0000 UTC m=+199.713841609 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kkrgv" (UID: "e5b8826d-81fe-4d43-9177-33e8e34ca003") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:10:34 crc kubenswrapper[4792]: I0309 09:10:34.190906 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-5tppx" podStartSLOduration=156.190881732 podStartE2EDuration="2m36.190881732s" podCreationTimestamp="2026-03-09 09:07:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:10:34.182604937 +0000 UTC m=+199.212805689" watchObservedRunningTime="2026-03-09 09:10:34.190881732 +0000 UTC m=+199.221082484" Mar 09 09:10:34 crc kubenswrapper[4792]: I0309 09:10:34.197047 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-zv9nb" event={"ID":"366949eb-d94a-4af9-b93b-cdb0cd5678dd","Type":"ContainerStarted","Data":"cd7cbd346fa08aa607279e51da6826fcf92c13d8994856c3e1eb9cb31618131a"} Mar 09 09:10:34 crc kubenswrapper[4792]: I0309 09:10:34.216702 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-xgrs4" event={"ID":"488c4b65-9dcd-4303-a4b1-4c640fa9e0dd","Type":"ContainerStarted","Data":"5fea6fd1132bfc00969a6558a6ed8706318829514fdd79249675c0c2c8d0b8dd"} Mar 09 09:10:34 crc kubenswrapper[4792]: I0309 09:10:34.222378 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-ltjn9" 
event={"ID":"16a87e53-5015-4e1b-bcd9-7dd81a4f6456","Type":"ContainerStarted","Data":"5a7e6b5bf3fb05937716f4c6b01198f71cfffea77084c71e02f89bfe563e5f6e"} Mar 09 09:10:34 crc kubenswrapper[4792]: I0309 09:10:34.226943 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-rl7tp" event={"ID":"f5fb0f27-68dd-4280-84f5-b46851d6ab96","Type":"ContainerStarted","Data":"25dcd88d98584de7c979053725876fcae569c1f1f8e089a6b3e496fc61841647"} Mar 09 09:10:34 crc kubenswrapper[4792]: I0309 09:10:34.231044 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-m9545" event={"ID":"432c0903-faf2-4367-b378-730898f2dfcc","Type":"ContainerStarted","Data":"520deb82bcea6d8535c587edf029e87a291eb16b86489475163b6130660ec4bd"} Mar 09 09:10:34 crc kubenswrapper[4792]: I0309 09:10:34.234458 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-7lxp2" podStartSLOduration=156.234445089 podStartE2EDuration="2m36.234445089s" podCreationTimestamp="2026-03-09 09:07:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:10:34.206864177 +0000 UTC m=+199.237064929" watchObservedRunningTime="2026-03-09 09:10:34.234445089 +0000 UTC m=+199.264645841" Mar 09 09:10:34 crc kubenswrapper[4792]: I0309 09:10:34.236434 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-nz8fs" event={"ID":"824d014f-b04c-4304-8911-091172950873","Type":"ContainerStarted","Data":"e1fe56a14ff0e1e4360e734d5f45473a7ce74695dff63a918731da49f06c9825"} Mar 09 09:10:34 crc kubenswrapper[4792]: I0309 09:10:34.236684 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zsj86"] Mar 
09 09:10:34 crc kubenswrapper[4792]: I0309 09:10:34.241169 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29550780-48757" event={"ID":"aadae5cd-e840-4618-b021-d8ca0e9169bd","Type":"ContainerStarted","Data":"5a6a741ce6a435148ea4ae418febdf572a7a2abb621ec146ab9ae5f1704b94a7"} Mar 09 09:10:34 crc kubenswrapper[4792]: I0309 09:10:34.243044 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-dpzj6" event={"ID":"74fbc42d-490e-49dc-9a95-cff2e3b6d1f6","Type":"ContainerStarted","Data":"067095a070bfdbce7fd93434e1ff6d79e15737d68866621cd33260c873079784"} Mar 09 09:10:34 crc kubenswrapper[4792]: I0309 09:10:34.244714 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-g7rwm" event={"ID":"c4b8955d-8b1d-4052-b00d-8319fd3bc57d","Type":"ContainerStarted","Data":"2b14b1fdd80aa76183b2e83b309cba4676a6a8e29a3074647c5bfbf9d8087c64"} Mar 09 09:10:34 crc kubenswrapper[4792]: I0309 09:10:34.251125 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-rl7tp" podStartSLOduration=157.251111013 podStartE2EDuration="2m37.251111013s" podCreationTimestamp="2026-03-09 09:07:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:10:34.248261902 +0000 UTC m=+199.278462654" watchObservedRunningTime="2026-03-09 09:10:34.251111013 +0000 UTC m=+199.281311765" Mar 09 09:10:34 crc kubenswrapper[4792]: I0309 09:10:34.281204 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 09:10:34 crc kubenswrapper[4792]: E0309 09:10:34.283057 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 09:10:34.782993109 +0000 UTC m=+199.813193861 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:10:34 crc kubenswrapper[4792]: W0309 09:10:34.290491 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8ef9ef9e_4f63_4e33_b0dc_9de707eed9fa.slice/crio-71059f0bcbdbeefae38803ca5493573e73ac649c1bc4839e8c74914aff561308 WatchSource:0}: Error finding container 71059f0bcbdbeefae38803ca5493573e73ac649c1bc4839e8c74914aff561308: Status 404 returned error can't find the container with id 71059f0bcbdbeefae38803ca5493573e73ac649c1bc4839e8c74914aff561308 Mar 09 09:10:34 crc kubenswrapper[4792]: I0309 09:10:34.382183 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kkrgv\" (UID: \"e5b8826d-81fe-4d43-9177-33e8e34ca003\") " pod="openshift-image-registry/image-registry-697d97f7c8-kkrgv" Mar 09 09:10:34 crc kubenswrapper[4792]: E0309 09:10:34.382624 4792 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 09:10:34.882610458 +0000 UTC m=+199.912811210 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kkrgv" (UID: "e5b8826d-81fe-4d43-9177-33e8e34ca003") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:10:34 crc kubenswrapper[4792]: I0309 09:10:34.483283 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 09:10:34 crc kubenswrapper[4792]: E0309 09:10:34.483604 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 09:10:34.983571506 +0000 UTC m=+200.013772258 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:10:34 crc kubenswrapper[4792]: I0309 09:10:34.485111 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kkrgv\" (UID: \"e5b8826d-81fe-4d43-9177-33e8e34ca003\") " pod="openshift-image-registry/image-registry-697d97f7c8-kkrgv" Mar 09 09:10:34 crc kubenswrapper[4792]: E0309 09:10:34.485657 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 09:10:34.985617524 +0000 UTC m=+200.015818276 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kkrgv" (UID: "e5b8826d-81fe-4d43-9177-33e8e34ca003") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:10:34 crc kubenswrapper[4792]: I0309 09:10:34.505395 4792 patch_prober.go:28] interesting pod/router-default-5444994796-kcnhk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 09 09:10:34 crc kubenswrapper[4792]: [-]has-synced failed: reason withheld Mar 09 09:10:34 crc kubenswrapper[4792]: [+]process-running ok Mar 09 09:10:34 crc kubenswrapper[4792]: healthz check failed Mar 09 09:10:34 crc kubenswrapper[4792]: I0309 09:10:34.505513 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-kcnhk" podUID="83d49252-7752-48bc-86b6-c604984cd533" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 09 09:10:34 crc kubenswrapper[4792]: I0309 09:10:34.586732 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 09:10:34 crc kubenswrapper[4792]: E0309 09:10:34.586938 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-09 09:10:35.086900531 +0000 UTC m=+200.117101283 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:10:34 crc kubenswrapper[4792]: I0309 09:10:34.587006 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kkrgv\" (UID: \"e5b8826d-81fe-4d43-9177-33e8e34ca003\") " pod="openshift-image-registry/image-registry-697d97f7c8-kkrgv" Mar 09 09:10:34 crc kubenswrapper[4792]: E0309 09:10:34.587594 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 09:10:35.087578041 +0000 UTC m=+200.117778793 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kkrgv" (UID: "e5b8826d-81fe-4d43-9177-33e8e34ca003") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:10:34 crc kubenswrapper[4792]: I0309 09:10:34.687814 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 09:10:34 crc kubenswrapper[4792]: E0309 09:10:34.688272 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 09:10:35.188247 +0000 UTC m=+200.218447752 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:10:34 crc kubenswrapper[4792]: I0309 09:10:34.790742 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kkrgv\" (UID: \"e5b8826d-81fe-4d43-9177-33e8e34ca003\") " pod="openshift-image-registry/image-registry-697d97f7c8-kkrgv" Mar 09 09:10:34 crc kubenswrapper[4792]: E0309 09:10:34.791250 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 09:10:35.291233075 +0000 UTC m=+200.321433827 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kkrgv" (UID: "e5b8826d-81fe-4d43-9177-33e8e34ca003") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:10:34 crc kubenswrapper[4792]: I0309 09:10:34.896316 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 09:10:34 crc kubenswrapper[4792]: E0309 09:10:34.897306 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 09:10:35.397271707 +0000 UTC m=+200.427472459 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:10:35 crc kubenswrapper[4792]: E0309 09:10:35.000711 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-09 09:10:35.500690135 +0000 UTC m=+200.530890897 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kkrgv" (UID: "e5b8826d-81fe-4d43-9177-33e8e34ca003") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:10:35 crc kubenswrapper[4792]: I0309 09:10:35.001599 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kkrgv\" (UID: \"e5b8826d-81fe-4d43-9177-33e8e34ca003\") " pod="openshift-image-registry/image-registry-697d97f7c8-kkrgv" Mar 09 09:10:35 crc kubenswrapper[4792]: I0309 09:10:35.102860 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 09:10:35 crc kubenswrapper[4792]: E0309 09:10:35.104105 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 09:10:35.604085962 +0000 UTC m=+200.634286704 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:10:35 crc kubenswrapper[4792]: I0309 09:10:35.206005 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kkrgv\" (UID: \"e5b8826d-81fe-4d43-9177-33e8e34ca003\") " pod="openshift-image-registry/image-registry-697d97f7c8-kkrgv" Mar 09 09:10:35 crc kubenswrapper[4792]: E0309 09:10:35.206564 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 09:10:35.706546462 +0000 UTC m=+200.736747224 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kkrgv" (UID: "e5b8826d-81fe-4d43-9177-33e8e34ca003") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:10:35 crc kubenswrapper[4792]: I0309 09:10:35.286327 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8nr7h" event={"ID":"560a23b3-6492-45d0-b273-df4e85bd9787","Type":"ContainerStarted","Data":"b4a4089fee79f1cc7387bf26aac2dbcfd728bb01269fc4bbe12d29f6c1601a40"} Mar 09 09:10:35 crc kubenswrapper[4792]: I0309 09:10:35.288599 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7p469" event={"ID":"c952f4ee-a8f1-4c7a-bfd4-0da6e1f0ff7c","Type":"ContainerStarted","Data":"8cc3d0b79b97547f3cb7be6705c399b698dd0fbe9c97c04f0f0aace53bcdc02e"} Mar 09 09:10:35 crc kubenswrapper[4792]: I0309 09:10:35.295723 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-tl5jf" event={"ID":"adf7cc5f-5027-4382-bac8-ed4f459fe424","Type":"ContainerStarted","Data":"df541305cc4f686f0e6f186740638f6e30455f4640fd7094d291d655a91ef57a"} Mar 09 09:10:35 crc kubenswrapper[4792]: I0309 09:10:35.296210 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-tl5jf" Mar 09 09:10:35 crc kubenswrapper[4792]: I0309 09:10:35.307837 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 09:10:35 crc kubenswrapper[4792]: E0309 09:10:35.308723 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 09:10:35.808691794 +0000 UTC m=+200.838892546 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:10:35 crc kubenswrapper[4792]: I0309 09:10:35.313584 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-dgsw8" event={"ID":"530ed995-d9d1-4aff-93b9-9b2a35194cd2","Type":"ContainerStarted","Data":"ef6119e3ed1b9954458d4fa3264a1a2e6368ee2a11c64e2de043096dd9bd4c43"} Mar 09 09:10:35 crc kubenswrapper[4792]: I0309 09:10:35.321523 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-fzxrs" event={"ID":"f24bba0a-6535-4ad8-8aa7-86a71a268334","Type":"ContainerStarted","Data":"cbea620753f92b727d2560040355b94805c43184fa870397a5eb1fc4a2883e52"} Mar 09 09:10:35 crc kubenswrapper[4792]: I0309 09:10:35.327349 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4vkn2" Mar 09 09:10:35 crc kubenswrapper[4792]: I0309 09:10:35.340806 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8nr7h" podStartSLOduration=158.340780575 podStartE2EDuration="2m38.340780575s" podCreationTimestamp="2026-03-09 09:07:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:10:35.334541158 +0000 UTC m=+200.364741910" watchObservedRunningTime="2026-03-09 09:10:35.340780575 +0000 UTC m=+200.370981327" Mar 09 09:10:35 crc kubenswrapper[4792]: I0309 09:10:35.347199 4792 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-4vkn2 container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.40:5443/healthz\": dial tcp 10.217.0.40:5443: connect: connection refused" start-of-body= Mar 09 09:10:35 crc kubenswrapper[4792]: I0309 09:10:35.347254 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4vkn2" podUID="9eea31d9-692f-4cc8-bb63-4c5d926712bd" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.40:5443/healthz\": dial tcp 10.217.0.40:5443: connect: connection refused" Mar 09 09:10:35 crc kubenswrapper[4792]: I0309 09:10:35.409940 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kkrgv\" (UID: \"e5b8826d-81fe-4d43-9177-33e8e34ca003\") " pod="openshift-image-registry/image-registry-697d97f7c8-kkrgv" Mar 09 09:10:35 crc kubenswrapper[4792]: I0309 09:10:35.412412 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-9ch2w" event={"ID":"f39d8cd3-a63e-4aeb-9609-fe4c8ed6372b","Type":"ContainerStarted","Data":"7659989e34eb3821dc68064b76f43920341f57f36e74ef7b7b6ac672b321d065"} Mar 
09 09:10:35 crc kubenswrapper[4792]: I0309 09:10:35.413660 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-9ch2w" Mar 09 09:10:35 crc kubenswrapper[4792]: E0309 09:10:35.414334 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 09:10:35.914315064 +0000 UTC m=+200.944515816 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kkrgv" (UID: "e5b8826d-81fe-4d43-9177-33e8e34ca003") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:10:35 crc kubenswrapper[4792]: I0309 09:10:35.414469 4792 patch_prober.go:28] interesting pod/downloads-7954f5f757-9ch2w container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body= Mar 09 09:10:35 crc kubenswrapper[4792]: I0309 09:10:35.414506 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-9ch2w" podUID="f39d8cd3-a63e-4aeb-9609-fe4c8ed6372b" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" Mar 09 09:10:35 crc kubenswrapper[4792]: I0309 09:10:35.433973 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-tl5jf" podStartSLOduration=158.433954921 podStartE2EDuration="2m38.433954921s" podCreationTimestamp="2026-03-09 
09:07:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:10:35.390101826 +0000 UTC m=+200.420302578" watchObservedRunningTime="2026-03-09 09:10:35.433954921 +0000 UTC m=+200.464155673" Mar 09 09:10:35 crc kubenswrapper[4792]: I0309 09:10:35.466568 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7f29j" event={"ID":"ea625b31-78ac-4c2b-8f73-3e5c74894fce","Type":"ContainerStarted","Data":"36134bcc4edd8cefaf2ef68607c266ba37a2d30ce05b79f1ac713d688301ec95"} Mar 09 09:10:35 crc kubenswrapper[4792]: I0309 09:10:35.468059 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7f29j" Mar 09 09:10:35 crc kubenswrapper[4792]: I0309 09:10:35.497564 4792 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-7f29j container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.16:8443/healthz\": dial tcp 10.217.0.16:8443: connect: connection refused" start-of-body= Mar 09 09:10:35 crc kubenswrapper[4792]: I0309 09:10:35.497645 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7f29j" podUID="ea625b31-78ac-4c2b-8f73-3e5c74894fce" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.16:8443/healthz\": dial tcp 10.217.0.16:8443: connect: connection refused" Mar 09 09:10:35 crc kubenswrapper[4792]: I0309 09:10:35.505093 4792 patch_prober.go:28] interesting pod/router-default-5444994796-kcnhk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 09 09:10:35 crc 
kubenswrapper[4792]: [-]has-synced failed: reason withheld Mar 09 09:10:35 crc kubenswrapper[4792]: [+]process-running ok Mar 09 09:10:35 crc kubenswrapper[4792]: healthz check failed Mar 09 09:10:35 crc kubenswrapper[4792]: I0309 09:10:35.505149 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-kcnhk" podUID="83d49252-7752-48bc-86b6-c604984cd533" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 09 09:10:35 crc kubenswrapper[4792]: I0309 09:10:35.508235 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-fzxrs" podStartSLOduration=157.5082061 podStartE2EDuration="2m37.5082061s" podCreationTimestamp="2026-03-09 09:07:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:10:35.434143067 +0000 UTC m=+200.464343819" watchObservedRunningTime="2026-03-09 09:10:35.5082061 +0000 UTC m=+200.538406852" Mar 09 09:10:35 crc kubenswrapper[4792]: I0309 09:10:35.508377 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-dgsw8" podStartSLOduration=157.508372115 podStartE2EDuration="2m37.508372115s" podCreationTimestamp="2026-03-09 09:07:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:10:35.505508124 +0000 UTC m=+200.535708876" watchObservedRunningTime="2026-03-09 09:10:35.508372115 +0000 UTC m=+200.538572857" Mar 09 09:10:35 crc kubenswrapper[4792]: I0309 09:10:35.526487 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 09:10:35 crc kubenswrapper[4792]: E0309 09:10:35.528177 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 09:10:36.028149997 +0000 UTC m=+201.058350749 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:10:35 crc kubenswrapper[4792]: I0309 09:10:35.553379 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-kmnjq" event={"ID":"ba27e59f-4cf4-460d-92d1-aa4e522dcfe7","Type":"ContainerStarted","Data":"c7b867fbc7471afe0fc8f8648ccc8d839a376fcc8e2cc8d1eae6c517ab6b2360"} Mar 09 09:10:35 crc kubenswrapper[4792]: I0309 09:10:35.572626 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4vkn2" podStartSLOduration=157.57260806 podStartE2EDuration="2m37.57260806s" podCreationTimestamp="2026-03-09 09:07:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:10:35.570124939 +0000 UTC m=+200.600325691" watchObservedRunningTime="2026-03-09 09:10:35.57260806 +0000 UTC m=+200.602808812" Mar 09 09:10:35 crc kubenswrapper[4792]: I0309 09:10:35.602193 4792 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-ltjn9" event={"ID":"16a87e53-5015-4e1b-bcd9-7dd81a4f6456","Type":"ContainerStarted","Data":"37e429f0e1bd805f30e12a3fc0684f19041a235e1dbce749212702b26a385a3e"} Mar 09 09:10:35 crc kubenswrapper[4792]: I0309 09:10:35.605050 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-ltjn9" Mar 09 09:10:35 crc kubenswrapper[4792]: I0309 09:10:35.614267 4792 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-ltjn9 container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.28:8443/healthz\": dial tcp 10.217.0.28:8443: connect: connection refused" start-of-body= Mar 09 09:10:35 crc kubenswrapper[4792]: I0309 09:10:35.614310 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-ltjn9" podUID="16a87e53-5015-4e1b-bcd9-7dd81a4f6456" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.28:8443/healthz\": dial tcp 10.217.0.28:8443: connect: connection refused" Mar 09 09:10:35 crc kubenswrapper[4792]: I0309 09:10:35.628219 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kkrgv\" (UID: \"e5b8826d-81fe-4d43-9177-33e8e34ca003\") " pod="openshift-image-registry/image-registry-697d97f7c8-kkrgv" Mar 09 09:10:35 crc kubenswrapper[4792]: E0309 09:10:35.628908 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-09 09:10:36.128873638 +0000 UTC m=+201.159074400 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kkrgv" (UID: "e5b8826d-81fe-4d43-9177-33e8e34ca003") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:10:35 crc kubenswrapper[4792]: I0309 09:10:35.638968 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-9ch2w" podStartSLOduration=157.638951365 podStartE2EDuration="2m37.638951365s" podCreationTimestamp="2026-03-09 09:07:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:10:35.602765747 +0000 UTC m=+200.632966499" watchObservedRunningTime="2026-03-09 09:10:35.638951365 +0000 UTC m=+200.669152107" Mar 09 09:10:35 crc kubenswrapper[4792]: I0309 09:10:35.640188 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7f29j" podStartSLOduration=157.640183909 podStartE2EDuration="2m37.640183909s" podCreationTimestamp="2026-03-09 09:07:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:10:35.639016947 +0000 UTC m=+200.669217699" watchObservedRunningTime="2026-03-09 09:10:35.640183909 +0000 UTC m=+200.670384661" Mar 09 09:10:35 crc kubenswrapper[4792]: I0309 09:10:35.699860 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-ltjn9" 
podStartSLOduration=157.699836494 podStartE2EDuration="2m37.699836494s" podCreationTimestamp="2026-03-09 09:07:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:10:35.691649521 +0000 UTC m=+200.721850273" watchObservedRunningTime="2026-03-09 09:10:35.699836494 +0000 UTC m=+200.730037246" Mar 09 09:10:35 crc kubenswrapper[4792]: I0309 09:10:35.729607 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 09:10:35 crc kubenswrapper[4792]: E0309 09:10:35.731346 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 09:10:36.231327999 +0000 UTC m=+201.261528751 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:10:35 crc kubenswrapper[4792]: I0309 09:10:35.784294 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-g7rwm" event={"ID":"c4b8955d-8b1d-4052-b00d-8319fd3bc57d","Type":"ContainerStarted","Data":"06a2e4bc2cd1745469c6e4d34202453c9fa7f5bb04f45bdecff117f15e530b65"} Mar 09 09:10:35 crc kubenswrapper[4792]: I0309 09:10:35.784337 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tw44d" event={"ID":"e4caea97-6238-4185-9e61-5a40aa699205","Type":"ContainerStarted","Data":"015d207018a9d25307a4cf30bca2b99a882a3a090b7179e872b665fef3bd1ff5"} Mar 09 09:10:35 crc kubenswrapper[4792]: I0309 09:10:35.831225 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kkrgv\" (UID: \"e5b8826d-81fe-4d43-9177-33e8e34ca003\") " pod="openshift-image-registry/image-registry-697d97f7c8-kkrgv" Mar 09 09:10:35 crc kubenswrapper[4792]: E0309 09:10:35.831679 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 09:10:36.331664218 +0000 UTC m=+201.361864970 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kkrgv" (UID: "e5b8826d-81fe-4d43-9177-33e8e34ca003") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:10:35 crc kubenswrapper[4792]: I0309 09:10:35.935892 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 09:10:35 crc kubenswrapper[4792]: E0309 09:10:35.941079 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 09:10:36.441028164 +0000 UTC m=+201.471228916 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:10:35 crc kubenswrapper[4792]: I0309 09:10:35.961122 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zsj86" event={"ID":"16296787-ae4c-4801-aaf3-85c1757f2919","Type":"ContainerStarted","Data":"3e6a95fe6bf6313fa70c966795ea9ce12283969c3c352d35ae8388af493cb09a"} Mar 09 09:10:35 crc kubenswrapper[4792]: I0309 09:10:35.983878 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-p5kgd" event={"ID":"e37d381c-62ae-489d-888d-320a3f7959cc","Type":"ContainerStarted","Data":"3daeccc4f216d23108ff2bf35b10979a7695ba2215d503c0236ac14f7b6473ab"} Mar 09 09:10:36 crc kubenswrapper[4792]: I0309 09:10:36.001488 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-k2zfx" Mar 09 09:10:36 crc kubenswrapper[4792]: I0309 09:10:36.007974 4792 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-k2zfx container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.32:8443/healthz\": dial tcp 10.217.0.32:8443: connect: connection refused" start-of-body= Mar 09 09:10:36 crc kubenswrapper[4792]: I0309 09:10:36.008036 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-k2zfx" podUID="0f5e7206-508d-427a-99ec-b7ec404ec804" containerName="catalog-operator" 
probeResult="failure" output="Get \"https://10.217.0.32:8443/healthz\": dial tcp 10.217.0.32:8443: connect: connection refused" Mar 09 09:10:36 crc kubenswrapper[4792]: I0309 09:10:36.011736 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29550780-48757" event={"ID":"aadae5cd-e840-4618-b021-d8ca0e9169bd","Type":"ContainerStarted","Data":"d104d2dc86f9cb4cd3a61c24abc309fe0728bfe4b26d245e57c4e0a1793f065c"} Mar 09 09:10:36 crc kubenswrapper[4792]: I0309 09:10:36.022477 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-ncpc5" event={"ID":"0f3faf8f-f49a-4e99-93cc-50eecd2d2c3c","Type":"ContainerStarted","Data":"a1b2308ae9ee926601eb462bfc9a895cf553904e21f338d888a164d495f8977d"} Mar 09 09:10:36 crc kubenswrapper[4792]: I0309 09:10:36.023979 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-ncpc5" Mar 09 09:10:36 crc kubenswrapper[4792]: I0309 09:10:36.036950 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kkrgv\" (UID: \"e5b8826d-81fe-4d43-9177-33e8e34ca003\") " pod="openshift-image-registry/image-registry-697d97f7c8-kkrgv" Mar 09 09:10:36 crc kubenswrapper[4792]: E0309 09:10:36.039432 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 09:10:36.53941037 +0000 UTC m=+201.569611122 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kkrgv" (UID: "e5b8826d-81fe-4d43-9177-33e8e34ca003") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:10:36 crc kubenswrapper[4792]: I0309 09:10:36.079972 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-jh5pl" event={"ID":"894f7c69-0119-4c19-b205-9780fb52b06e","Type":"ContainerStarted","Data":"fe99716a26831a4feb1cab8ab0e7ab21cfc6e045d9fc4351aa58551f5f881b40"} Mar 09 09:10:36 crc kubenswrapper[4792]: I0309 09:10:36.091881 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-7lxp2" event={"ID":"43baf1d3-85f7-4ce5-9650-260f0803cdab","Type":"ContainerStarted","Data":"d5fe9d91306b9f8f33c5618905a71e41144adccffb8eb9282bdbda8823ac56ef"} Mar 09 09:10:36 crc kubenswrapper[4792]: I0309 09:10:36.117211 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-56b7z" event={"ID":"fc2b2079-7189-4ca3-b398-2a1146b9c70f","Type":"ContainerStarted","Data":"7a286747d41af4e521ff0a8a792d2b8955240ff7da763a735ec9e786c41086da"} Mar 09 09:10:36 crc kubenswrapper[4792]: I0309 09:10:36.134435 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-49k84" event={"ID":"d3c17466-62af-4a4b-bc43-a4eda5d974dd","Type":"ContainerStarted","Data":"b4dea7a0fef4e479de8b47346564e542f121218fe44b7303535565b526e58586"} Mar 09 09:10:36 crc kubenswrapper[4792]: I0309 09:10:36.138143 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 09:10:36 crc kubenswrapper[4792]: E0309 09:10:36.139344 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 09:10:36.639328548 +0000 UTC m=+201.669529300 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:10:36 crc kubenswrapper[4792]: I0309 09:10:36.150523 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-ncpc5" Mar 09 09:10:36 crc kubenswrapper[4792]: I0309 09:10:36.191659 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-bcj7g" event={"ID":"d2d7cc79-bb68-4db2-9e5a-edd00436b08e","Type":"ContainerStarted","Data":"209d3ef06da3e3ae8a39a9ae25c556c11954722cc33799bbc1037957dc9bead4"} Mar 09 09:10:36 crc kubenswrapper[4792]: I0309 09:10:36.218243 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-4pzr2" event={"ID":"8ef9ef9e-4f63-4e33-b0dc-9de707eed9fa","Type":"ContainerStarted","Data":"71059f0bcbdbeefae38803ca5493573e73ac649c1bc4839e8c74914aff561308"} Mar 09 09:10:36 crc kubenswrapper[4792]: I0309 09:10:36.240296 4792 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kkrgv\" (UID: \"e5b8826d-81fe-4d43-9177-33e8e34ca003\") " pod="openshift-image-registry/image-registry-697d97f7c8-kkrgv" Mar 09 09:10:36 crc kubenswrapper[4792]: E0309 09:10:36.241766 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 09:10:36.741752857 +0000 UTC m=+201.771953609 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kkrgv" (UID: "e5b8826d-81fe-4d43-9177-33e8e34ca003") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:10:36 crc kubenswrapper[4792]: I0309 09:10:36.244930 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-xgrs4" event={"ID":"488c4b65-9dcd-4303-a4b1-4c640fa9e0dd","Type":"ContainerStarted","Data":"356800b3e1ca7e31efdd5336d28b223e15989f4947eff2afd70a93b27f45c61c"} Mar 09 09:10:36 crc kubenswrapper[4792]: I0309 09:10:36.251665 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-xgrs4" Mar 09 09:10:36 crc kubenswrapper[4792]: I0309 09:10:36.258190 4792 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-xgrs4 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.39:8080/healthz\": dial 
tcp 10.217.0.39:8080: connect: connection refused" start-of-body= Mar 09 09:10:36 crc kubenswrapper[4792]: I0309 09:10:36.258248 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-xgrs4" podUID="488c4b65-9dcd-4303-a4b1-4c640fa9e0dd" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.39:8080/healthz\": dial tcp 10.217.0.39:8080: connect: connection refused" Mar 09 09:10:36 crc kubenswrapper[4792]: I0309 09:10:36.263657 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-4ckzv" event={"ID":"30ab8a3e-ea02-40f9-a91d-5ffef4543f05","Type":"ContainerStarted","Data":"5653f55a6411efe0631b1bb1dee7c2bdcc11b48a9b8c0ccd49723a1486fe83df"} Mar 09 09:10:36 crc kubenswrapper[4792]: I0309 09:10:36.308583 4792 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-tl5jf container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.10:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 09:10:36 crc kubenswrapper[4792]: I0309 09:10:36.308631 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-tl5jf" podUID="adf7cc5f-5027-4382-bac8-ed4f459fe424" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.10:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 09 09:10:36 crc kubenswrapper[4792]: I0309 09:10:36.319625 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-2v6nv" event={"ID":"c697ae9b-78ed-4463-9a32-6e6bd1593d70","Type":"ContainerStarted","Data":"446369c3f159fa8d1bb1a4258f034653168042df150caf3d747cb430ad8e4be2"} Mar 09 
09:10:36 crc kubenswrapper[4792]: I0309 09:10:36.343151 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 09:10:36 crc kubenswrapper[4792]: E0309 09:10:36.344647 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 09:10:36.84462836 +0000 UTC m=+201.874829112 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:10:36 crc kubenswrapper[4792]: I0309 09:10:36.348860 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-2mjvb" event={"ID":"965fec55-0c76-48a4-b5a8-0ec1b7a349d1","Type":"ContainerStarted","Data":"b0df703b1ef280530387f86d0ded473a04dc627a8d89706e56652ad85bf19c4b"} Mar 09 09:10:36 crc kubenswrapper[4792]: I0309 09:10:36.373610 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-znzvh" event={"ID":"2012f11e-ed51-4453-8f96-47269823dded","Type":"ContainerStarted","Data":"6addc356795ea8768ff14117878199714cb4ad89343ad34fe2df4ffae8df1b76"} Mar 09 09:10:36 crc 
kubenswrapper[4792]: I0309 09:10:36.378037 4792 generic.go:334] "Generic (PLEG): container finished" podID="62e49d0b-dfc1-48ab-bed7-7ba7fe8a4475" containerID="bea68b4a71daa1cf17e60c2b4a60527bd10ba856587e9b82070a89b6fe6d6a94" exitCode=0 Mar 09 09:10:36 crc kubenswrapper[4792]: I0309 09:10:36.378156 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-l8jxj" event={"ID":"62e49d0b-dfc1-48ab-bed7-7ba7fe8a4475","Type":"ContainerDied","Data":"bea68b4a71daa1cf17e60c2b4a60527bd10ba856587e9b82070a89b6fe6d6a94"} Mar 09 09:10:36 crc kubenswrapper[4792]: I0309 09:10:36.416848 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-nz8fs" event={"ID":"824d014f-b04c-4304-8911-091172950873","Type":"ContainerStarted","Data":"342ace951161a85285402b6e6c8c1829f33087daf038a9d21a43cd17ea375a12"} Mar 09 09:10:36 crc kubenswrapper[4792]: I0309 09:10:36.444858 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kkrgv\" (UID: \"e5b8826d-81fe-4d43-9177-33e8e34ca003\") " pod="openshift-image-registry/image-registry-697d97f7c8-kkrgv" Mar 09 09:10:36 crc kubenswrapper[4792]: E0309 09:10:36.445259 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 09:10:36.945244777 +0000 UTC m=+201.975445529 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kkrgv" (UID: "e5b8826d-81fe-4d43-9177-33e8e34ca003") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:10:36 crc kubenswrapper[4792]: I0309 09:10:36.500830 4792 patch_prober.go:28] interesting pod/router-default-5444994796-kcnhk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 09 09:10:36 crc kubenswrapper[4792]: [-]has-synced failed: reason withheld Mar 09 09:10:36 crc kubenswrapper[4792]: [+]process-running ok Mar 09 09:10:36 crc kubenswrapper[4792]: healthz check failed Mar 09 09:10:36 crc kubenswrapper[4792]: I0309 09:10:36.500882 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-kcnhk" podUID="83d49252-7752-48bc-86b6-c604984cd533" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 09 09:10:36 crc kubenswrapper[4792]: I0309 09:10:36.553171 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 09:10:36 crc kubenswrapper[4792]: E0309 09:10:36.553961 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-09 09:10:37.053939434 +0000 UTC m=+202.084140186 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:10:36 crc kubenswrapper[4792]: I0309 09:10:36.554416 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kkrgv\" (UID: \"e5b8826d-81fe-4d43-9177-33e8e34ca003\") " pod="openshift-image-registry/image-registry-697d97f7c8-kkrgv" Mar 09 09:10:36 crc kubenswrapper[4792]: E0309 09:10:36.556035 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 09:10:37.056027574 +0000 UTC m=+202.086228326 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kkrgv" (UID: "e5b8826d-81fe-4d43-9177-33e8e34ca003") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:10:36 crc kubenswrapper[4792]: I0309 09:10:36.648843 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-g7rwm" podStartSLOduration=8.64881857 podStartE2EDuration="8.64881857s" podCreationTimestamp="2026-03-09 09:10:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:10:36.648595124 +0000 UTC m=+201.678795876" watchObservedRunningTime="2026-03-09 09:10:36.64881857 +0000 UTC m=+201.679019322" Mar 09 09:10:36 crc kubenswrapper[4792]: I0309 09:10:36.671748 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 09:10:36 crc kubenswrapper[4792]: E0309 09:10:36.672317 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 09:10:37.172297737 +0000 UTC m=+202.202498489 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:10:36 crc kubenswrapper[4792]: I0309 09:10:36.689363 4792 ???:1] "http: TLS handshake error from 192.168.126.11:44806: no serving certificate available for the kubelet" Mar 09 09:10:36 crc kubenswrapper[4792]: I0309 09:10:36.783041 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kkrgv\" (UID: \"e5b8826d-81fe-4d43-9177-33e8e34ca003\") " pod="openshift-image-registry/image-registry-697d97f7c8-kkrgv" Mar 09 09:10:36 crc kubenswrapper[4792]: E0309 09:10:36.783510 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 09:10:37.283485435 +0000 UTC m=+202.313686187 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kkrgv" (UID: "e5b8826d-81fe-4d43-9177-33e8e34ca003") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:10:36 crc kubenswrapper[4792]: I0309 09:10:36.896378 4792 ???:1] "http: TLS handshake error from 192.168.126.11:44812: no serving certificate available for the kubelet" Mar 09 09:10:36 crc kubenswrapper[4792]: I0309 09:10:36.898118 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 09:10:36 crc kubenswrapper[4792]: E0309 09:10:36.898659 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 09:10:37.398637395 +0000 UTC m=+202.428838147 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:10:36 crc kubenswrapper[4792]: I0309 09:10:36.909286 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-xllzt" Mar 09 09:10:37 crc kubenswrapper[4792]: I0309 09:10:36.993609 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-bcj7g" podStartSLOduration=158.993583382 podStartE2EDuration="2m38.993583382s" podCreationTimestamp="2026-03-09 09:07:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:10:36.989869657 +0000 UTC m=+202.020070409" watchObservedRunningTime="2026-03-09 09:10:36.993583382 +0000 UTC m=+202.023784134" Mar 09 09:10:37 crc kubenswrapper[4792]: I0309 09:10:37.000230 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kkrgv\" (UID: \"e5b8826d-81fe-4d43-9177-33e8e34ca003\") " pod="openshift-image-registry/image-registry-697d97f7c8-kkrgv" Mar 09 09:10:37 crc kubenswrapper[4792]: E0309 09:10:37.000740 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-09 09:10:37.500714655 +0000 UTC m=+202.530915417 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kkrgv" (UID: "e5b8826d-81fe-4d43-9177-33e8e34ca003") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:10:37 crc kubenswrapper[4792]: I0309 09:10:37.101936 4792 ???:1] "http: TLS handshake error from 192.168.126.11:44822: no serving certificate available for the kubelet" Mar 09 09:10:37 crc kubenswrapper[4792]: I0309 09:10:37.102315 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 09:10:37 crc kubenswrapper[4792]: E0309 09:10:37.102503 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 09:10:37.602462565 +0000 UTC m=+202.632663317 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:10:37 crc kubenswrapper[4792]: I0309 09:10:37.102607 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kkrgv\" (UID: \"e5b8826d-81fe-4d43-9177-33e8e34ca003\") " pod="openshift-image-registry/image-registry-697d97f7c8-kkrgv" Mar 09 09:10:37 crc kubenswrapper[4792]: E0309 09:10:37.103060 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 09:10:37.603051692 +0000 UTC m=+202.633252444 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kkrgv" (UID: "e5b8826d-81fe-4d43-9177-33e8e34ca003") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:10:37 crc kubenswrapper[4792]: I0309 09:10:37.205670 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 09:10:37 crc kubenswrapper[4792]: E0309 09:10:37.206180 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 09:10:37.706154621 +0000 UTC m=+202.736355373 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:10:37 crc kubenswrapper[4792]: I0309 09:10:37.307641 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kkrgv\" (UID: \"e5b8826d-81fe-4d43-9177-33e8e34ca003\") " pod="openshift-image-registry/image-registry-697d97f7c8-kkrgv" Mar 09 09:10:37 crc kubenswrapper[4792]: E0309 09:10:37.308167 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 09:10:37.808142878 +0000 UTC m=+202.838343630 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kkrgv" (UID: "e5b8826d-81fe-4d43-9177-33e8e34ca003") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:10:37 crc kubenswrapper[4792]: I0309 09:10:37.410047 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 09:10:37 crc kubenswrapper[4792]: E0309 09:10:37.410469 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 09:10:37.910451584 +0000 UTC m=+202.940652336 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:10:37 crc kubenswrapper[4792]: I0309 09:10:37.464556 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-kmnjq" event={"ID":"ba27e59f-4cf4-460d-92d1-aa4e522dcfe7","Type":"ContainerStarted","Data":"dd3c33de1f16a86cd147f7469d956a55662ee4417250d05b997533443848a21e"} Mar 09 09:10:37 crc kubenswrapper[4792]: I0309 09:10:37.480671 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-m7bwm" event={"ID":"5e17c213-e2c1-4625-8487-2bdb28b0224d","Type":"ContainerStarted","Data":"b023516aaf31b0b937275c5561ff18f5c686a68da1991e4d82db3001e35011dc"} Mar 09 09:10:37 crc kubenswrapper[4792]: I0309 09:10:37.480725 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-m7bwm" event={"ID":"5e17c213-e2c1-4625-8487-2bdb28b0224d","Type":"ContainerStarted","Data":"3a51fb5d0d0e37fad38ac04a594dcde1609b197bacfe5ab7732dd9d4cb5a958f"} Mar 09 09:10:37 crc kubenswrapper[4792]: I0309 09:10:37.499753 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-m9545" event={"ID":"432c0903-faf2-4367-b378-730898f2dfcc","Type":"ContainerStarted","Data":"843545f1ac71da585859de8ff71a4177cd4a3f837272dc4b5a61b510ddb9198d"} Mar 09 09:10:37 crc kubenswrapper[4792]: I0309 09:10:37.500450 4792 patch_prober.go:28] interesting pod/router-default-5444994796-kcnhk container/router 
namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 09 09:10:37 crc kubenswrapper[4792]: [-]has-synced failed: reason withheld Mar 09 09:10:37 crc kubenswrapper[4792]: [+]process-running ok Mar 09 09:10:37 crc kubenswrapper[4792]: healthz check failed Mar 09 09:10:37 crc kubenswrapper[4792]: I0309 09:10:37.500491 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-kcnhk" podUID="83d49252-7752-48bc-86b6-c604984cd533" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 09 09:10:37 crc kubenswrapper[4792]: I0309 09:10:37.511509 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kkrgv\" (UID: \"e5b8826d-81fe-4d43-9177-33e8e34ca003\") " pod="openshift-image-registry/image-registry-697d97f7c8-kkrgv" Mar 09 09:10:37 crc kubenswrapper[4792]: E0309 09:10:37.513882 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 09:10:38.013857781 +0000 UTC m=+203.044058523 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kkrgv" (UID: "e5b8826d-81fe-4d43-9177-33e8e34ca003") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:10:37 crc kubenswrapper[4792]: I0309 09:10:37.518928 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7p469" event={"ID":"c952f4ee-a8f1-4c7a-bfd4-0da6e1f0ff7c","Type":"ContainerStarted","Data":"6339d0261ade941eb917eabc496f8afd934c92b2dd5946139687a2e83f5e80d3"} Mar 09 09:10:37 crc kubenswrapper[4792]: I0309 09:10:37.539932 4792 generic.go:334] "Generic (PLEG): container finished" podID="d3c17466-62af-4a4b-bc43-a4eda5d974dd" containerID="b4dea7a0fef4e479de8b47346564e542f121218fe44b7303535565b526e58586" exitCode=0 Mar 09 09:10:37 crc kubenswrapper[4792]: I0309 09:10:37.540014 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-49k84" event={"ID":"d3c17466-62af-4a4b-bc43-a4eda5d974dd","Type":"ContainerDied","Data":"b4dea7a0fef4e479de8b47346564e542f121218fe44b7303535565b526e58586"} Mar 09 09:10:37 crc kubenswrapper[4792]: I0309 09:10:37.540048 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-49k84" event={"ID":"d3c17466-62af-4a4b-bc43-a4eda5d974dd","Type":"ContainerStarted","Data":"83b17b634f041ab9040d6365e89161cabacdd75ad8b33fc24b8fbbcbe57a2864"} Mar 09 09:10:37 crc kubenswrapper[4792]: I0309 09:10:37.540764 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-49k84" Mar 09 09:10:37 crc kubenswrapper[4792]: 
I0309 09:10:37.549197 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4vkn2" event={"ID":"9eea31d9-692f-4cc8-bb63-4c5d926712bd","Type":"ContainerStarted","Data":"46047d941539a3f83b3c8092c3d48f27781e0c66c7c23b5029051dab3a97667e"} Mar 09 09:10:37 crc kubenswrapper[4792]: I0309 09:10:37.564956 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-nz8fs" event={"ID":"824d014f-b04c-4304-8911-091172950873","Type":"ContainerStarted","Data":"2dddc761618e5c3b31ed479481302a9db9a16b3c336774951b85ad2cbc60882f"} Mar 09 09:10:37 crc kubenswrapper[4792]: I0309 09:10:37.592511 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-5wctw" event={"ID":"b6131a74-912d-42f0-82ed-cc2737ef85df","Type":"ContainerStarted","Data":"cd694439ab811dca1a979cc405845d168d2f47683eaf4a7e168485290c1e6815"} Mar 09 09:10:37 crc kubenswrapper[4792]: I0309 09:10:37.596438 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-k2zfx" event={"ID":"0f5e7206-508d-427a-99ec-b7ec404ec804","Type":"ContainerStarted","Data":"6bc89c9e5ebf1e5e083b6d3880ba7ed9b27857e25c8544d5603530fd1d02b218"} Mar 09 09:10:37 crc kubenswrapper[4792]: I0309 09:10:37.611423 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tw44d" event={"ID":"e4caea97-6238-4185-9e61-5a40aa699205","Type":"ContainerStarted","Data":"0efad6b66b52a60ba59cf64d69331b682d44000d2b7ffcbccc0e8f82d643700b"} Mar 09 09:10:37 crc kubenswrapper[4792]: I0309 09:10:37.612770 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 09:10:37 crc kubenswrapper[4792]: E0309 09:10:37.614419 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 09:10:38.114398746 +0000 UTC m=+203.144599498 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:10:37 crc kubenswrapper[4792]: I0309 09:10:37.620455 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-4pzr2" event={"ID":"8ef9ef9e-4f63-4e33-b0dc-9de707eed9fa","Type":"ContainerStarted","Data":"e03f79575b5f00e1663a691315e02e9ff9877aa98d872d1d5d3d69804e989f46"} Mar 09 09:10:37 crc kubenswrapper[4792]: I0309 09:10:37.620519 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-4pzr2" event={"ID":"8ef9ef9e-4f63-4e33-b0dc-9de707eed9fa","Type":"ContainerStarted","Data":"aec324e1afce7495ae757f5735de6ffc91520c8d5e03ab94a6242d17cf2f4ae8"} Mar 09 09:10:37 crc kubenswrapper[4792]: I0309 09:10:37.621325 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-4pzr2" Mar 09 09:10:37 crc kubenswrapper[4792]: I0309 09:10:37.634446 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-k2zfx" Mar 09 09:10:37 crc kubenswrapper[4792]: I0309 09:10:37.637865 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-dpzj6" event={"ID":"74fbc42d-490e-49dc-9a95-cff2e3b6d1f6","Type":"ContainerStarted","Data":"e407be87e77691f4ecd0c15a4418ec5c0c0b9bda304a7d4244b2c399b575194c"} Mar 09 09:10:37 crc kubenswrapper[4792]: I0309 09:10:37.648280 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zsj86" event={"ID":"16296787-ae4c-4801-aaf3-85c1757f2919","Type":"ContainerStarted","Data":"91fc82492c0bc588e137eed32c9440afd6550e0d4e05e2893b958313eb329c34"} Mar 09 09:10:37 crc kubenswrapper[4792]: I0309 09:10:37.690185 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-p5kgd" event={"ID":"e37d381c-62ae-489d-888d-320a3f7959cc","Type":"ContainerStarted","Data":"210df1d347d6cf2991bcbf489d70c2a176b80d0409bbda0a0c534bdab39e19b9"} Mar 09 09:10:37 crc kubenswrapper[4792]: I0309 09:10:37.715236 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kkrgv\" (UID: \"e5b8826d-81fe-4d43-9177-33e8e34ca003\") " pod="openshift-image-registry/image-registry-697d97f7c8-kkrgv" Mar 09 09:10:37 crc kubenswrapper[4792]: I0309 09:10:37.720590 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-2v6nv" event={"ID":"c697ae9b-78ed-4463-9a32-6e6bd1593d70","Type":"ContainerStarted","Data":"9223271014a3a4a8128e5177e7575a50a8589d6f484b42a3279e42e2b881ee86"} Mar 09 09:10:37 crc kubenswrapper[4792]: E0309 09:10:37.721008 4792 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 09:10:38.220983195 +0000 UTC m=+203.251183947 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kkrgv" (UID: "e5b8826d-81fe-4d43-9177-33e8e34ca003") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:10:37 crc kubenswrapper[4792]: I0309 09:10:37.746274 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-ncpc5" podStartSLOduration=159.746253222 podStartE2EDuration="2m39.746253222s" podCreationTimestamp="2026-03-09 09:07:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:10:37.4029599 +0000 UTC m=+202.433160642" watchObservedRunningTime="2026-03-09 09:10:37.746253222 +0000 UTC m=+202.776453974" Mar 09 09:10:37 crc kubenswrapper[4792]: I0309 09:10:37.755796 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-8kksk" event={"ID":"73189c5a-2649-4a99-9f15-d0e7cddea5ea","Type":"ContainerStarted","Data":"5cc9e7fe6b7760a3cf35fd6606df7a29211bc377684469646761ba9d78a07afa"} Mar 09 09:10:37 crc kubenswrapper[4792]: I0309 09:10:37.816841 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 09:10:37 crc kubenswrapper[4792]: E0309 09:10:37.818457 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 09:10:38.318434532 +0000 UTC m=+203.348635284 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:10:37 crc kubenswrapper[4792]: I0309 09:10:37.823014 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-56b7z" event={"ID":"fc2b2079-7189-4ca3-b398-2a1146b9c70f","Type":"ContainerStarted","Data":"483c8caed51384406509b7f18f5e03fa61bef2f07330aa4c643e4f5eecdd683e"} Mar 09 09:10:37 crc kubenswrapper[4792]: I0309 09:10:37.825424 4792 generic.go:334] "Generic (PLEG): container finished" podID="7c5109b5-4292-437f-8524-bb7d35147a71" containerID="e4c68b079ab15cbe0caa95d701a9174837ffe30b42482746d789910cd840253d" exitCode=0 Mar 09 09:10:37 crc kubenswrapper[4792]: I0309 09:10:37.826318 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ngdp7" event={"ID":"7c5109b5-4292-437f-8524-bb7d35147a71","Type":"ContainerDied","Data":"e4c68b079ab15cbe0caa95d701a9174837ffe30b42482746d789910cd840253d"} Mar 09 09:10:37 crc kubenswrapper[4792]: I0309 09:10:37.841018 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-4ckzv" event={"ID":"30ab8a3e-ea02-40f9-a91d-5ffef4543f05","Type":"ContainerStarted","Data":"42531aef34f608a234d4b4671e8b7e590e93d10303632a164cd40f636faaa963"} Mar 09 09:10:37 crc kubenswrapper[4792]: I0309 09:10:37.841061 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-4ckzv" Mar 09 09:10:37 crc kubenswrapper[4792]: I0309 09:10:37.867791 4792 ???:1] "http: TLS handshake error from 192.168.126.11:44832: no serving certificate available for the kubelet" Mar 09 09:10:37 crc kubenswrapper[4792]: I0309 09:10:37.883170 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-zv9nb" event={"ID":"366949eb-d94a-4af9-b93b-cdb0cd5678dd","Type":"ContainerStarted","Data":"b950ce6e3eb6229a86c17a0b8e4138cf04a76c4d99b3307d8cc1b435d1f8454f"} Mar 09 09:10:37 crc kubenswrapper[4792]: I0309 09:10:37.925915 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kkrgv\" (UID: \"e5b8826d-81fe-4d43-9177-33e8e34ca003\") " pod="openshift-image-registry/image-registry-697d97f7c8-kkrgv" Mar 09 09:10:37 crc kubenswrapper[4792]: E0309 09:10:37.931507 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 09:10:38.431480973 +0000 UTC m=+203.461681725 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kkrgv" (UID: "e5b8826d-81fe-4d43-9177-33e8e34ca003") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:10:37 crc kubenswrapper[4792]: I0309 09:10:37.960737 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-2mjvb" event={"ID":"965fec55-0c76-48a4-b5a8-0ec1b7a349d1","Type":"ContainerStarted","Data":"3efe4117b3155827e015e8e191d8176b7e8ec89e94e1b4508e84167c3391e9d4"} Mar 09 09:10:37 crc kubenswrapper[4792]: I0309 09:10:37.960990 4792 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-xgrs4 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.39:8080/healthz\": dial tcp 10.217.0.39:8080: connect: connection refused" start-of-body= Mar 09 09:10:37 crc kubenswrapper[4792]: I0309 09:10:37.961042 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-xgrs4" podUID="488c4b65-9dcd-4303-a4b1-4c640fa9e0dd" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.39:8080/healthz\": dial tcp 10.217.0.39:8080: connect: connection refused" Mar 09 09:10:37 crc kubenswrapper[4792]: I0309 09:10:37.962704 4792 patch_prober.go:28] interesting pod/downloads-7954f5f757-9ch2w container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body= Mar 09 09:10:37 crc kubenswrapper[4792]: I0309 09:10:37.962738 4792 prober.go:107] "Probe failed" 
probeType="Readiness" pod="openshift-console/downloads-7954f5f757-9ch2w" podUID="f39d8cd3-a63e-4aeb-9609-fe4c8ed6372b" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" Mar 09 09:10:37 crc kubenswrapper[4792]: I0309 09:10:37.989344 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7f29j" Mar 09 09:10:38 crc kubenswrapper[4792]: I0309 09:10:38.014602 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-tl5jf" Mar 09 09:10:38 crc kubenswrapper[4792]: I0309 09:10:38.028551 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 09:10:38 crc kubenswrapper[4792]: E0309 09:10:38.030827 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 09:10:38.530807115 +0000 UTC m=+203.561007867 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:10:38 crc kubenswrapper[4792]: I0309 09:10:38.039501 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-5wctw" podStartSLOduration=160.039477991 podStartE2EDuration="2m40.039477991s" podCreationTimestamp="2026-03-09 09:07:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:10:37.782002157 +0000 UTC m=+202.812202909" watchObservedRunningTime="2026-03-09 09:10:38.039477991 +0000 UTC m=+203.069678743" Mar 09 09:10:38 crc kubenswrapper[4792]: I0309 09:10:38.042128 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-ltjn9" Mar 09 09:10:38 crc kubenswrapper[4792]: I0309 09:10:38.049639 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-jh5pl" podStartSLOduration=160.049610078 podStartE2EDuration="2m40.049610078s" podCreationTimestamp="2026-03-09 09:07:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:10:38.037512075 +0000 UTC m=+203.067712817" watchObservedRunningTime="2026-03-09 09:10:38.049610078 +0000 UTC m=+203.079810830" Mar 09 09:10:38 crc kubenswrapper[4792]: I0309 09:10:38.141472 4792 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kkrgv\" (UID: \"e5b8826d-81fe-4d43-9177-33e8e34ca003\") " pod="openshift-image-registry/image-registry-697d97f7c8-kkrgv" Mar 09 09:10:38 crc kubenswrapper[4792]: E0309 09:10:38.141834 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 09:10:38.641822908 +0000 UTC m=+203.672023660 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kkrgv" (UID: "e5b8826d-81fe-4d43-9177-33e8e34ca003") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:10:38 crc kubenswrapper[4792]: I0309 09:10:38.242175 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 09:10:38 crc kubenswrapper[4792]: E0309 09:10:38.242633 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 09:10:38.742614141 +0000 UTC m=+203.772814893 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:10:38 crc kubenswrapper[4792]: I0309 09:10:38.284879 4792 ???:1] "http: TLS handshake error from 192.168.126.11:44836: no serving certificate available for the kubelet" Mar 09 09:10:38 crc kubenswrapper[4792]: I0309 09:10:38.343697 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kkrgv\" (UID: \"e5b8826d-81fe-4d43-9177-33e8e34ca003\") " pod="openshift-image-registry/image-registry-697d97f7c8-kkrgv" Mar 09 09:10:38 crc kubenswrapper[4792]: E0309 09:10:38.344166 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 09:10:38.844152425 +0000 UTC m=+203.874353177 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kkrgv" (UID: "e5b8826d-81fe-4d43-9177-33e8e34ca003") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 09:10:38 crc kubenswrapper[4792]: I0309 09:10:38.362513 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-2v6nv" podStartSLOduration=160.362492426 podStartE2EDuration="2m40.362492426s" podCreationTimestamp="2026-03-09 09:07:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:10:38.214338968 +0000 UTC m=+203.244539720" watchObservedRunningTime="2026-03-09 09:10:38.362492426 +0000 UTC m=+203.392693178"
Mar 09 09:10:38 crc kubenswrapper[4792]: I0309 09:10:38.446017 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 09 09:10:38 crc kubenswrapper[4792]: E0309 09:10:38.446207 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 09:10:38.946169003 +0000 UTC m=+203.976369755 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 09:10:38 crc kubenswrapper[4792]: I0309 09:10:38.446337 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kkrgv\" (UID: \"e5b8826d-81fe-4d43-9177-33e8e34ca003\") " pod="openshift-image-registry/image-registry-697d97f7c8-kkrgv"
Mar 09 09:10:38 crc kubenswrapper[4792]: E0309 09:10:38.446695 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 09:10:38.946681057 +0000 UTC m=+203.976881809 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kkrgv" (UID: "e5b8826d-81fe-4d43-9177-33e8e34ca003") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 09:10:38 crc kubenswrapper[4792]: I0309 09:10:38.450049 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-56b7z" podStartSLOduration=160.450021452 podStartE2EDuration="2m40.450021452s" podCreationTimestamp="2026-03-09 09:07:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:10:38.374807066 +0000 UTC m=+203.405007818" watchObservedRunningTime="2026-03-09 09:10:38.450021452 +0000 UTC m=+203.480222194"
Mar 09 09:10:38 crc kubenswrapper[4792]: I0309 09:10:38.500653 4792 patch_prober.go:28] interesting pod/router-default-5444994796-kcnhk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 09 09:10:38 crc kubenswrapper[4792]: [-]has-synced failed: reason withheld
Mar 09 09:10:38 crc kubenswrapper[4792]: [+]process-running ok
Mar 09 09:10:38 crc kubenswrapper[4792]: healthz check failed
Mar 09 09:10:38 crc kubenswrapper[4792]: I0309 09:10:38.501103 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-kcnhk" podUID="83d49252-7752-48bc-86b6-c604984cd533" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 09 09:10:38 crc kubenswrapper[4792]: I0309 09:10:38.547135 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 09 09:10:38 crc kubenswrapper[4792]: E0309 09:10:38.547524 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 09:10:39.047506721 +0000 UTC m=+204.077707473 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 09:10:38 crc kubenswrapper[4792]: I0309 09:10:38.550805 4792 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-4vkn2 container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.40:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 09 09:10:38 crc kubenswrapper[4792]: I0309 09:10:38.550861 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4vkn2" podUID="9eea31d9-692f-4cc8-bb63-4c5d926712bd" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.40:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 09 09:10:38 crc kubenswrapper[4792]: I0309 09:10:38.609644 4792 ???:1] "http: TLS handshake error from 192.168.126.11:44850: no serving certificate available for the kubelet"
Mar 09 09:10:38 crc kubenswrapper[4792]: I0309 09:10:38.649841 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kkrgv\" (UID: \"e5b8826d-81fe-4d43-9177-33e8e34ca003\") " pod="openshift-image-registry/image-registry-697d97f7c8-kkrgv"
Mar 09 09:10:38 crc kubenswrapper[4792]: E0309 09:10:38.650308 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 09:10:39.150295051 +0000 UTC m=+204.180495803 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kkrgv" (UID: "e5b8826d-81fe-4d43-9177-33e8e34ca003") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 09:10:38 crc kubenswrapper[4792]: I0309 09:10:38.671822 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-znzvh" podStartSLOduration=160.671799092 podStartE2EDuration="2m40.671799092s" podCreationTimestamp="2026-03-09 09:07:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:10:38.514580826 +0000 UTC m=+203.544781578" watchObservedRunningTime="2026-03-09 09:10:38.671799092 +0000 UTC m=+203.701999854"
Mar 09 09:10:38 crc kubenswrapper[4792]: I0309 09:10:38.671977 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29550780-48757" podStartSLOduration=161.671973357 podStartE2EDuration="2m41.671973357s" podCreationTimestamp="2026-03-09 09:07:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:10:38.645455874 +0000 UTC m=+203.675656626" watchObservedRunningTime="2026-03-09 09:10:38.671973357 +0000 UTC m=+203.702174109"
Mar 09 09:10:38 crc kubenswrapper[4792]: I0309 09:10:38.727273 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-k2zfx" podStartSLOduration=160.727247767 podStartE2EDuration="2m40.727247767s" podCreationTimestamp="2026-03-09 09:07:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:10:38.721873004 +0000 UTC m=+203.752073756" watchObservedRunningTime="2026-03-09 09:10:38.727247767 +0000 UTC m=+203.757448519"
Mar 09 09:10:38 crc kubenswrapper[4792]: I0309 09:10:38.751589 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 09 09:10:38 crc kubenswrapper[4792]: E0309 09:10:38.752013 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 09:10:39.251974509 +0000 UTC m=+204.282175261 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 09:10:38 crc kubenswrapper[4792]: I0309 09:10:38.822931 4792 ???:1] "http: TLS handshake error from 192.168.126.11:44858: no serving certificate available for the kubelet"
Mar 09 09:10:38 crc kubenswrapper[4792]: I0309 09:10:38.847270 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-zv9nb" podStartSLOduration=10.847241215 podStartE2EDuration="10.847241215s" podCreationTimestamp="2026-03-09 09:10:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:10:38.774451258 +0000 UTC m=+203.804652010" watchObservedRunningTime="2026-03-09 09:10:38.847241215 +0000 UTC m=+203.877441967"
Mar 09 09:10:38 crc kubenswrapper[4792]: I0309 09:10:38.854572 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kkrgv\" (UID: \"e5b8826d-81fe-4d43-9177-33e8e34ca003\") " pod="openshift-image-registry/image-registry-697d97f7c8-kkrgv"
Mar 09 09:10:38 crc kubenswrapper[4792]: E0309 09:10:38.854965 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 09:10:39.354951805 +0000 UTC m=+204.385152567 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kkrgv" (UID: "e5b8826d-81fe-4d43-9177-33e8e34ca003") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 09:10:38 crc kubenswrapper[4792]: I0309 09:10:38.955305 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 09 09:10:38 crc kubenswrapper[4792]: E0309 09:10:38.955765 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 09:10:39.455747127 +0000 UTC m=+204.485947879 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 09:10:38 crc kubenswrapper[4792]: I0309 09:10:38.969110 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-xgrs4" podStartSLOduration=160.969086316 podStartE2EDuration="2m40.969086316s" podCreationTimestamp="2026-03-09 09:07:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:10:38.955984074 +0000 UTC m=+203.986184826" watchObservedRunningTime="2026-03-09 09:10:38.969086316 +0000 UTC m=+203.999287068"
Mar 09 09:10:38 crc kubenswrapper[4792]: I0309 09:10:38.970128 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-dpzj6" podStartSLOduration=160.970123006 podStartE2EDuration="2m40.970123006s" podCreationTimestamp="2026-03-09 09:07:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:10:38.849593542 +0000 UTC m=+203.879794294" watchObservedRunningTime="2026-03-09 09:10:38.970123006 +0000 UTC m=+204.000323758"
Mar 09 09:10:38 crc kubenswrapper[4792]: I0309 09:10:38.992872 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-l8jxj" event={"ID":"62e49d0b-dfc1-48ab-bed7-7ba7fe8a4475","Type":"ContainerStarted","Data":"d7a910bf99f4a271c943b2df6cb20df6c1cf139e1a3285ceee26c539ca0ab7f9"}
Mar 09 09:10:38 crc kubenswrapper[4792]: I0309 09:10:38.992919 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-l8jxj" event={"ID":"62e49d0b-dfc1-48ab-bed7-7ba7fe8a4475","Type":"ContainerStarted","Data":"57cdfe01ee5fae3bd2f35ef133ff0c8cf056db4f9215ad1a5fe3247e0bf31d0b"}
Mar 09 09:10:39 crc kubenswrapper[4792]: I0309 09:10:39.002806 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ngdp7" event={"ID":"7c5109b5-4292-437f-8524-bb7d35147a71","Type":"ContainerStarted","Data":"38a750276b581a265fd3d932702732df26dedbd2b91272a406722fac58878ef5"}
Mar 09 09:10:39 crc kubenswrapper[4792]: I0309 09:10:39.006762 4792 patch_prober.go:28] interesting pod/downloads-7954f5f757-9ch2w container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body=
Mar 09 09:10:39 crc kubenswrapper[4792]: I0309 09:10:39.006811 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-9ch2w" podUID="f39d8cd3-a63e-4aeb-9609-fe4c8ed6372b" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused"
Mar 09 09:10:39 crc kubenswrapper[4792]: I0309 09:10:39.057850 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kkrgv\" (UID: \"e5b8826d-81fe-4d43-9177-33e8e34ca003\") " pod="openshift-image-registry/image-registry-697d97f7c8-kkrgv"
Mar 09 09:10:39 crc kubenswrapper[4792]: I0309 09:10:39.070295 4792 ???:1] "http: TLS handshake error from 192.168.126.11:44860: no serving certificate available for the kubelet"
Mar 09 09:10:39 crc kubenswrapper[4792]: E0309 09:10:39.072735 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 09:10:39.57271493 +0000 UTC m=+204.602915682 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kkrgv" (UID: "e5b8826d-81fe-4d43-9177-33e8e34ca003") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 09:10:39 crc kubenswrapper[4792]: I0309 09:10:39.159215 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 09 09:10:39 crc kubenswrapper[4792]: E0309 09:10:39.159605 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 09:10:39.659583807 +0000 UTC m=+204.689784559 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 09:10:39 crc kubenswrapper[4792]: I0309 09:10:39.260450 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kkrgv\" (UID: \"e5b8826d-81fe-4d43-9177-33e8e34ca003\") " pod="openshift-image-registry/image-registry-697d97f7c8-kkrgv"
Mar 09 09:10:39 crc kubenswrapper[4792]: E0309 09:10:39.260823 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 09:10:39.760810742 +0000 UTC m=+204.791011494 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kkrgv" (UID: "e5b8826d-81fe-4d43-9177-33e8e34ca003") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 09:10:39 crc kubenswrapper[4792]: I0309 09:10:39.369861 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 09 09:10:39 crc kubenswrapper[4792]: E0309 09:10:39.373024 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 09:10:39.872982398 +0000 UTC m=+204.903183150 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 09:10:39 crc kubenswrapper[4792]: I0309 09:10:39.375490 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kkrgv\" (UID: \"e5b8826d-81fe-4d43-9177-33e8e34ca003\") " pod="openshift-image-registry/image-registry-697d97f7c8-kkrgv"
Mar 09 09:10:39 crc kubenswrapper[4792]: E0309 09:10:39.376140 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 09:10:39.876124008 +0000 UTC m=+204.906324760 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kkrgv" (UID: "e5b8826d-81fe-4d43-9177-33e8e34ca003") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 09:10:39 crc kubenswrapper[4792]: I0309 09:10:39.381446 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4vkn2"
Mar 09 09:10:39 crc kubenswrapper[4792]: I0309 09:10:39.480763 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 09 09:10:39 crc kubenswrapper[4792]: E0309 09:10:39.481013 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 09:10:39.980969386 +0000 UTC m=+205.011170148 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 09:10:39 crc kubenswrapper[4792]: I0309 09:10:39.481170 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kkrgv\" (UID: \"e5b8826d-81fe-4d43-9177-33e8e34ca003\") " pod="openshift-image-registry/image-registry-697d97f7c8-kkrgv"
Mar 09 09:10:39 crc kubenswrapper[4792]: E0309 09:10:39.481553 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 09:10:39.981536152 +0000 UTC m=+205.011736904 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kkrgv" (UID: "e5b8826d-81fe-4d43-9177-33e8e34ca003") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 09:10:39 crc kubenswrapper[4792]: I0309 09:10:39.500668 4792 patch_prober.go:28] interesting pod/router-default-5444994796-kcnhk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 09 09:10:39 crc kubenswrapper[4792]: [-]has-synced failed: reason withheld
Mar 09 09:10:39 crc kubenswrapper[4792]: [+]process-running ok
Mar 09 09:10:39 crc kubenswrapper[4792]: healthz check failed
Mar 09 09:10:39 crc kubenswrapper[4792]: I0309 09:10:39.500752 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-kcnhk" podUID="83d49252-7752-48bc-86b6-c604984cd533" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 09 09:10:39 crc kubenswrapper[4792]: I0309 09:10:39.521473 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tw44d" podStartSLOduration=161.521453586 podStartE2EDuration="2m41.521453586s" podCreationTimestamp="2026-03-09 09:07:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:10:39.167550354 +0000 UTC m=+204.197751106" watchObservedRunningTime="2026-03-09 09:10:39.521453586 +0000 UTC m=+204.551654328"
Mar 09 09:10:39 crc kubenswrapper[4792]: I0309 09:10:39.582998 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 09 09:10:39 crc kubenswrapper[4792]: E0309 09:10:39.583386 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 09:10:40.083364785 +0000 UTC m=+205.113565537 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 09:10:39 crc kubenswrapper[4792]: I0309 09:10:39.600595 4792 ???:1] "http: TLS handshake error from 192.168.126.11:44870: no serving certificate available for the kubelet"
Mar 09 09:10:39 crc kubenswrapper[4792]: I0309 09:10:39.617930 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-m9545" podStartSLOduration=161.617904055 podStartE2EDuration="2m41.617904055s" podCreationTimestamp="2026-03-09 09:07:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:10:39.530413531 +0000 UTC m=+204.560614283" watchObservedRunningTime="2026-03-09 09:10:39.617904055 +0000 UTC m=+204.648104807"
Mar 09 09:10:39 crc kubenswrapper[4792]: I0309 09:10:39.685049 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kkrgv\" (UID: \"e5b8826d-81fe-4d43-9177-33e8e34ca003\") " pod="openshift-image-registry/image-registry-697d97f7c8-kkrgv"
Mar 09 09:10:39 crc kubenswrapper[4792]: E0309 09:10:39.685474 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 09:10:40.185457165 +0000 UTC m=+205.215657917 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kkrgv" (UID: "e5b8826d-81fe-4d43-9177-33e8e34ca003") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 09:10:39 crc kubenswrapper[4792]: I0309 09:10:39.762295 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-p5kgd" podStartSLOduration=161.762270287 podStartE2EDuration="2m41.762270287s" podCreationTimestamp="2026-03-09 09:07:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:10:39.760544937 +0000 UTC m=+204.790745689" watchObservedRunningTime="2026-03-09 09:10:39.762270287 +0000 UTC m=+204.792471039"
Mar 09 09:10:39 crc kubenswrapper[4792]: I0309 09:10:39.786411 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 09 09:10:39 crc kubenswrapper[4792]: E0309 09:10:39.786651 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 09:10:40.286611368 +0000 UTC m=+205.316812120 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 09:10:39 crc kubenswrapper[4792]: I0309 09:10:39.786715 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kkrgv\" (UID: \"e5b8826d-81fe-4d43-9177-33e8e34ca003\") " pod="openshift-image-registry/image-registry-697d97f7c8-kkrgv"
Mar 09 09:10:39 crc kubenswrapper[4792]: E0309 09:10:39.787170 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 09:10:40.287154263 +0000 UTC m=+205.317355015 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kkrgv" (UID: "e5b8826d-81fe-4d43-9177-33e8e34ca003") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 09:10:39 crc kubenswrapper[4792]: I0309 09:10:39.887595 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 09 09:10:39 crc kubenswrapper[4792]: E0309 09:10:39.887845 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 09:10:40.387809782 +0000 UTC m=+205.418010534 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 09:10:39 crc kubenswrapper[4792]: I0309 09:10:39.887991 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kkrgv\" (UID: \"e5b8826d-81fe-4d43-9177-33e8e34ca003\") " pod="openshift-image-registry/image-registry-697d97f7c8-kkrgv"
Mar 09 09:10:39 crc kubenswrapper[4792]: E0309 09:10:39.888363 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 09:10:40.388345008 +0000 UTC m=+205.418545760 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kkrgv" (UID: "e5b8826d-81fe-4d43-9177-33e8e34ca003") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 09:10:39 crc kubenswrapper[4792]: I0309 09:10:39.976712 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-8hkmv"]
Mar 09 09:10:39 crc kubenswrapper[4792]: I0309 09:10:39.978152 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8hkmv"
Mar 09 09:10:39 crc kubenswrapper[4792]: I0309 09:10:39.988809 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 09 09:10:39 crc kubenswrapper[4792]: E0309 09:10:39.989230 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 09:10:40.489211933 +0000 UTC m=+205.519412685 (durationBeforeRetry 500ms).
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:10:40 crc kubenswrapper[4792]: W0309 09:10:40.000688 4792 reflector.go:561] object-"openshift-marketplace"/"community-operators-dockercfg-dmngl": failed to list *v1.Secret: secrets "community-operators-dockercfg-dmngl" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-marketplace": no relationship found between node 'crc' and this object Mar 09 09:10:40 crc kubenswrapper[4792]: E0309 09:10:40.000751 4792 reflector.go:158] "Unhandled Error" err="object-\"openshift-marketplace\"/\"community-operators-dockercfg-dmngl\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"community-operators-dockercfg-dmngl\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-marketplace\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 09 09:10:40 crc kubenswrapper[4792]: I0309 09:10:40.004021 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-nz8fs" podStartSLOduration=162.004001973 podStartE2EDuration="2m42.004001973s" podCreationTimestamp="2026-03-09 09:07:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:10:39.944890094 +0000 UTC m=+204.975090846" watchObservedRunningTime="2026-03-09 09:10:40.004001973 +0000 UTC m=+205.034202725" Mar 09 09:10:40 crc 
kubenswrapper[4792]: I0309 09:10:40.049540 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-8kksk" event={"ID":"73189c5a-2649-4a99-9f15-d0e7cddea5ea","Type":"ContainerStarted","Data":"44f835eda46bd5c363a3b995b9dfd5d8013a716be151bb7c71016bca59c45b26"} Mar 09 09:10:40 crc kubenswrapper[4792]: I0309 09:10:40.053838 4792 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-49k84 container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.217.0.14:8443/healthz\": dial tcp 10.217.0.14:8443: connect: connection refused" start-of-body= Mar 09 09:10:40 crc kubenswrapper[4792]: I0309 09:10:40.053908 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-49k84" podUID="d3c17466-62af-4a4b-bc43-a4eda5d974dd" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.14:8443/healthz\": dial tcp 10.217.0.14:8443: connect: connection refused" Mar 09 09:10:40 crc kubenswrapper[4792]: I0309 09:10:40.078401 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8hkmv"] Mar 09 09:10:40 crc kubenswrapper[4792]: I0309 09:10:40.095005 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kkrgv\" (UID: \"e5b8826d-81fe-4d43-9177-33e8e34ca003\") " pod="openshift-image-registry/image-registry-697d97f7c8-kkrgv" Mar 09 09:10:40 crc kubenswrapper[4792]: I0309 09:10:40.095196 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bsfpd\" (UniqueName: \"kubernetes.io/projected/047810b2-277c-4d4c-822a-98b6d2a91fcc-kube-api-access-bsfpd\") 
pod \"community-operators-8hkmv\" (UID: \"047810b2-277c-4d4c-822a-98b6d2a91fcc\") " pod="openshift-marketplace/community-operators-8hkmv" Mar 09 09:10:40 crc kubenswrapper[4792]: I0309 09:10:40.095293 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/047810b2-277c-4d4c-822a-98b6d2a91fcc-catalog-content\") pod \"community-operators-8hkmv\" (UID: \"047810b2-277c-4d4c-822a-98b6d2a91fcc\") " pod="openshift-marketplace/community-operators-8hkmv" Mar 09 09:10:40 crc kubenswrapper[4792]: I0309 09:10:40.095673 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/047810b2-277c-4d4c-822a-98b6d2a91fcc-utilities\") pod \"community-operators-8hkmv\" (UID: \"047810b2-277c-4d4c-822a-98b6d2a91fcc\") " pod="openshift-marketplace/community-operators-8hkmv" Mar 09 09:10:40 crc kubenswrapper[4792]: E0309 09:10:40.097246 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 09:10:40.597225981 +0000 UTC m=+205.627426733 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kkrgv" (UID: "e5b8826d-81fe-4d43-9177-33e8e34ca003") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:10:40 crc kubenswrapper[4792]: I0309 09:10:40.099097 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-t2jhd"] Mar 09 09:10:40 crc kubenswrapper[4792]: I0309 09:10:40.100380 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-t2jhd" Mar 09 09:10:40 crc kubenswrapper[4792]: I0309 09:10:40.119125 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 09 09:10:40 crc kubenswrapper[4792]: I0309 09:10:40.129091 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-4pzr2" podStartSLOduration=11.129044634 podStartE2EDuration="11.129044634s" podCreationTimestamp="2026-03-09 09:10:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:10:40.128226181 +0000 UTC m=+205.158426933" watchObservedRunningTime="2026-03-09 09:10:40.129044634 +0000 UTC m=+205.159245376" Mar 09 09:10:40 crc kubenswrapper[4792]: I0309 09:10:40.143132 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-t2jhd"] Mar 09 09:10:40 crc kubenswrapper[4792]: I0309 09:10:40.179936 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-wtkcv"] Mar 09 09:10:40 crc kubenswrapper[4792]: I0309 09:10:40.181083 4792 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wtkcv" Mar 09 09:10:40 crc kubenswrapper[4792]: I0309 09:10:40.196709 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 09:10:40 crc kubenswrapper[4792]: I0309 09:10:40.197127 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/047810b2-277c-4d4c-822a-98b6d2a91fcc-utilities\") pod \"community-operators-8hkmv\" (UID: \"047810b2-277c-4d4c-822a-98b6d2a91fcc\") " pod="openshift-marketplace/community-operators-8hkmv" Mar 09 09:10:40 crc kubenswrapper[4792]: I0309 09:10:40.197169 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/69535a14-c11d-442a-837d-f1d6744cb530-utilities\") pod \"certified-operators-t2jhd\" (UID: \"69535a14-c11d-442a-837d-f1d6744cb530\") " pod="openshift-marketplace/certified-operators-t2jhd" Mar 09 09:10:40 crc kubenswrapper[4792]: E0309 09:10:40.197220 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 09:10:40.69718946 +0000 UTC m=+205.727390212 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:10:40 crc kubenswrapper[4792]: I0309 09:10:40.197269 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4lq57\" (UniqueName: \"kubernetes.io/projected/69535a14-c11d-442a-837d-f1d6744cb530-kube-api-access-4lq57\") pod \"certified-operators-t2jhd\" (UID: \"69535a14-c11d-442a-837d-f1d6744cb530\") " pod="openshift-marketplace/certified-operators-t2jhd" Mar 09 09:10:40 crc kubenswrapper[4792]: I0309 09:10:40.197342 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kkrgv\" (UID: \"e5b8826d-81fe-4d43-9177-33e8e34ca003\") " pod="openshift-image-registry/image-registry-697d97f7c8-kkrgv" Mar 09 09:10:40 crc kubenswrapper[4792]: I0309 09:10:40.197687 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bsfpd\" (UniqueName: \"kubernetes.io/projected/047810b2-277c-4d4c-822a-98b6d2a91fcc-kube-api-access-bsfpd\") pod \"community-operators-8hkmv\" (UID: \"047810b2-277c-4d4c-822a-98b6d2a91fcc\") " pod="openshift-marketplace/community-operators-8hkmv" Mar 09 09:10:40 crc kubenswrapper[4792]: I0309 09:10:40.197768 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/047810b2-277c-4d4c-822a-98b6d2a91fcc-utilities\") pod 
\"community-operators-8hkmv\" (UID: \"047810b2-277c-4d4c-822a-98b6d2a91fcc\") " pod="openshift-marketplace/community-operators-8hkmv" Mar 09 09:10:40 crc kubenswrapper[4792]: I0309 09:10:40.197785 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/047810b2-277c-4d4c-822a-98b6d2a91fcc-catalog-content\") pod \"community-operators-8hkmv\" (UID: \"047810b2-277c-4d4c-822a-98b6d2a91fcc\") " pod="openshift-marketplace/community-operators-8hkmv" Mar 09 09:10:40 crc kubenswrapper[4792]: I0309 09:10:40.197829 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/69535a14-c11d-442a-837d-f1d6744cb530-catalog-content\") pod \"certified-operators-t2jhd\" (UID: \"69535a14-c11d-442a-837d-f1d6744cb530\") " pod="openshift-marketplace/certified-operators-t2jhd" Mar 09 09:10:40 crc kubenswrapper[4792]: E0309 09:10:40.198101 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 09:10:40.698061015 +0000 UTC m=+205.728261767 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kkrgv" (UID: "e5b8826d-81fe-4d43-9177-33e8e34ca003") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:10:40 crc kubenswrapper[4792]: I0309 09:10:40.198490 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/047810b2-277c-4d4c-822a-98b6d2a91fcc-catalog-content\") pod \"community-operators-8hkmv\" (UID: \"047810b2-277c-4d4c-822a-98b6d2a91fcc\") " pod="openshift-marketplace/community-operators-8hkmv" Mar 09 09:10:40 crc kubenswrapper[4792]: I0309 09:10:40.235936 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wtkcv"] Mar 09 09:10:40 crc kubenswrapper[4792]: I0309 09:10:40.289435 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bsfpd\" (UniqueName: \"kubernetes.io/projected/047810b2-277c-4d4c-822a-98b6d2a91fcc-kube-api-access-bsfpd\") pod \"community-operators-8hkmv\" (UID: \"047810b2-277c-4d4c-822a-98b6d2a91fcc\") " pod="openshift-marketplace/community-operators-8hkmv" Mar 09 09:10:40 crc kubenswrapper[4792]: I0309 09:10:40.299985 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 09:10:40 crc kubenswrapper[4792]: I0309 09:10:40.300259 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-rhlfl\" (UniqueName: \"kubernetes.io/projected/c0500b46-411e-4371-83ae-1b148bf65ba9-kube-api-access-rhlfl\") pod \"community-operators-wtkcv\" (UID: \"c0500b46-411e-4371-83ae-1b148bf65ba9\") " pod="openshift-marketplace/community-operators-wtkcv" Mar 09 09:10:40 crc kubenswrapper[4792]: I0309 09:10:40.300296 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/69535a14-c11d-442a-837d-f1d6744cb530-utilities\") pod \"certified-operators-t2jhd\" (UID: \"69535a14-c11d-442a-837d-f1d6744cb530\") " pod="openshift-marketplace/certified-operators-t2jhd" Mar 09 09:10:40 crc kubenswrapper[4792]: I0309 09:10:40.300319 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4lq57\" (UniqueName: \"kubernetes.io/projected/69535a14-c11d-442a-837d-f1d6744cb530-kube-api-access-4lq57\") pod \"certified-operators-t2jhd\" (UID: \"69535a14-c11d-442a-837d-f1d6744cb530\") " pod="openshift-marketplace/certified-operators-t2jhd" Mar 09 09:10:40 crc kubenswrapper[4792]: I0309 09:10:40.300378 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c0500b46-411e-4371-83ae-1b148bf65ba9-utilities\") pod \"community-operators-wtkcv\" (UID: \"c0500b46-411e-4371-83ae-1b148bf65ba9\") " pod="openshift-marketplace/community-operators-wtkcv" Mar 09 09:10:40 crc kubenswrapper[4792]: I0309 09:10:40.300416 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c0500b46-411e-4371-83ae-1b148bf65ba9-catalog-content\") pod \"community-operators-wtkcv\" (UID: \"c0500b46-411e-4371-83ae-1b148bf65ba9\") " pod="openshift-marketplace/community-operators-wtkcv" Mar 09 09:10:40 crc kubenswrapper[4792]: I0309 09:10:40.300437 4792 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/69535a14-c11d-442a-837d-f1d6744cb530-catalog-content\") pod \"certified-operators-t2jhd\" (UID: \"69535a14-c11d-442a-837d-f1d6744cb530\") " pod="openshift-marketplace/certified-operators-t2jhd" Mar 09 09:10:40 crc kubenswrapper[4792]: I0309 09:10:40.300834 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/69535a14-c11d-442a-837d-f1d6744cb530-catalog-content\") pod \"certified-operators-t2jhd\" (UID: \"69535a14-c11d-442a-837d-f1d6744cb530\") " pod="openshift-marketplace/certified-operators-t2jhd" Mar 09 09:10:40 crc kubenswrapper[4792]: E0309 09:10:40.300914 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 09:10:40.800896466 +0000 UTC m=+205.831097208 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:10:40 crc kubenswrapper[4792]: I0309 09:10:40.301233 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/69535a14-c11d-442a-837d-f1d6744cb530-utilities\") pod \"certified-operators-t2jhd\" (UID: \"69535a14-c11d-442a-837d-f1d6744cb530\") " pod="openshift-marketplace/certified-operators-t2jhd" Mar 09 09:10:40 crc kubenswrapper[4792]: I0309 09:10:40.305000 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7p469" podStartSLOduration=162.304964701 podStartE2EDuration="2m42.304964701s" podCreationTimestamp="2026-03-09 09:07:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:10:40.286696863 +0000 UTC m=+205.316897615" watchObservedRunningTime="2026-03-09 09:10:40.304964701 +0000 UTC m=+205.335165453" Mar 09 09:10:40 crc kubenswrapper[4792]: I0309 09:10:40.391467 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-2d7nf"] Mar 09 09:10:40 crc kubenswrapper[4792]: I0309 09:10:40.393620 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2d7nf" Mar 09 09:10:40 crc kubenswrapper[4792]: I0309 09:10:40.409639 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-twjdp\" (UniqueName: \"kubernetes.io/projected/50c5e5dd-62cf-470c-a626-27cca12c69fb-kube-api-access-twjdp\") pod \"certified-operators-2d7nf\" (UID: \"50c5e5dd-62cf-470c-a626-27cca12c69fb\") " pod="openshift-marketplace/certified-operators-2d7nf" Mar 09 09:10:40 crc kubenswrapper[4792]: I0309 09:10:40.409685 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/50c5e5dd-62cf-470c-a626-27cca12c69fb-utilities\") pod \"certified-operators-2d7nf\" (UID: \"50c5e5dd-62cf-470c-a626-27cca12c69fb\") " pod="openshift-marketplace/certified-operators-2d7nf" Mar 09 09:10:40 crc kubenswrapper[4792]: I0309 09:10:40.409719 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rhlfl\" (UniqueName: \"kubernetes.io/projected/c0500b46-411e-4371-83ae-1b148bf65ba9-kube-api-access-rhlfl\") pod \"community-operators-wtkcv\" (UID: \"c0500b46-411e-4371-83ae-1b148bf65ba9\") " pod="openshift-marketplace/community-operators-wtkcv" Mar 09 09:10:40 crc kubenswrapper[4792]: I0309 09:10:40.409770 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kkrgv\" (UID: \"e5b8826d-81fe-4d43-9177-33e8e34ca003\") " pod="openshift-image-registry/image-registry-697d97f7c8-kkrgv" Mar 09 09:10:40 crc kubenswrapper[4792]: I0309 09:10:40.409797 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/50c5e5dd-62cf-470c-a626-27cca12c69fb-catalog-content\") pod \"certified-operators-2d7nf\" (UID: \"50c5e5dd-62cf-470c-a626-27cca12c69fb\") " pod="openshift-marketplace/certified-operators-2d7nf" Mar 09 09:10:40 crc kubenswrapper[4792]: I0309 09:10:40.409855 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c0500b46-411e-4371-83ae-1b148bf65ba9-utilities\") pod \"community-operators-wtkcv\" (UID: \"c0500b46-411e-4371-83ae-1b148bf65ba9\") " pod="openshift-marketplace/community-operators-wtkcv" Mar 09 09:10:40 crc kubenswrapper[4792]: I0309 09:10:40.409896 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c0500b46-411e-4371-83ae-1b148bf65ba9-catalog-content\") pod \"community-operators-wtkcv\" (UID: \"c0500b46-411e-4371-83ae-1b148bf65ba9\") " pod="openshift-marketplace/community-operators-wtkcv" Mar 09 09:10:40 crc kubenswrapper[4792]: I0309 09:10:40.410441 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c0500b46-411e-4371-83ae-1b148bf65ba9-catalog-content\") pod \"community-operators-wtkcv\" (UID: \"c0500b46-411e-4371-83ae-1b148bf65ba9\") " pod="openshift-marketplace/community-operators-wtkcv" Mar 09 09:10:40 crc kubenswrapper[4792]: E0309 09:10:40.411089 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 09:10:40.911059965 +0000 UTC m=+205.941260717 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kkrgv" (UID: "e5b8826d-81fe-4d43-9177-33e8e34ca003") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:10:40 crc kubenswrapper[4792]: I0309 09:10:40.411361 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c0500b46-411e-4371-83ae-1b148bf65ba9-utilities\") pod \"community-operators-wtkcv\" (UID: \"c0500b46-411e-4371-83ae-1b148bf65ba9\") " pod="openshift-marketplace/community-operators-wtkcv" Mar 09 09:10:40 crc kubenswrapper[4792]: I0309 09:10:40.424312 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-4ckzv" podStartSLOduration=162.424286791 podStartE2EDuration="2m42.424286791s" podCreationTimestamp="2026-03-09 09:07:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:10:40.419431153 +0000 UTC m=+205.449631905" watchObservedRunningTime="2026-03-09 09:10:40.424286791 +0000 UTC m=+205.454487543" Mar 09 09:10:40 crc kubenswrapper[4792]: I0309 09:10:40.494902 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2d7nf"] Mar 09 09:10:40 crc kubenswrapper[4792]: I0309 09:10:40.496189 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rhlfl\" (UniqueName: \"kubernetes.io/projected/c0500b46-411e-4371-83ae-1b148bf65ba9-kube-api-access-rhlfl\") pod \"community-operators-wtkcv\" (UID: \"c0500b46-411e-4371-83ae-1b148bf65ba9\") " 
pod="openshift-marketplace/community-operators-wtkcv" Mar 09 09:10:40 crc kubenswrapper[4792]: I0309 09:10:40.516960 4792 patch_prober.go:28] interesting pod/router-default-5444994796-kcnhk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 09 09:10:40 crc kubenswrapper[4792]: [-]has-synced failed: reason withheld Mar 09 09:10:40 crc kubenswrapper[4792]: [+]process-running ok Mar 09 09:10:40 crc kubenswrapper[4792]: healthz check failed Mar 09 09:10:40 crc kubenswrapper[4792]: I0309 09:10:40.517033 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-kcnhk" podUID="83d49252-7752-48bc-86b6-c604984cd533" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 09 09:10:40 crc kubenswrapper[4792]: I0309 09:10:40.518282 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 09:10:40 crc kubenswrapper[4792]: I0309 09:10:40.519154 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/50c5e5dd-62cf-470c-a626-27cca12c69fb-catalog-content\") pod \"certified-operators-2d7nf\" (UID: \"50c5e5dd-62cf-470c-a626-27cca12c69fb\") " pod="openshift-marketplace/certified-operators-2d7nf" Mar 09 09:10:40 crc kubenswrapper[4792]: I0309 09:10:40.519272 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-twjdp\" (UniqueName: \"kubernetes.io/projected/50c5e5dd-62cf-470c-a626-27cca12c69fb-kube-api-access-twjdp\") pod \"certified-operators-2d7nf\" (UID: 
\"50c5e5dd-62cf-470c-a626-27cca12c69fb\") " pod="openshift-marketplace/certified-operators-2d7nf" Mar 09 09:10:40 crc kubenswrapper[4792]: I0309 09:10:40.519302 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/50c5e5dd-62cf-470c-a626-27cca12c69fb-utilities\") pod \"certified-operators-2d7nf\" (UID: \"50c5e5dd-62cf-470c-a626-27cca12c69fb\") " pod="openshift-marketplace/certified-operators-2d7nf" Mar 09 09:10:40 crc kubenswrapper[4792]: I0309 09:10:40.520373 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/50c5e5dd-62cf-470c-a626-27cca12c69fb-utilities\") pod \"certified-operators-2d7nf\" (UID: \"50c5e5dd-62cf-470c-a626-27cca12c69fb\") " pod="openshift-marketplace/certified-operators-2d7nf" Mar 09 09:10:40 crc kubenswrapper[4792]: E0309 09:10:40.520473 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 09:10:41.020453622 +0000 UTC m=+206.050654374 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:10:40 crc kubenswrapper[4792]: I0309 09:10:40.520694 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/50c5e5dd-62cf-470c-a626-27cca12c69fb-catalog-content\") pod \"certified-operators-2d7nf\" (UID: \"50c5e5dd-62cf-470c-a626-27cca12c69fb\") " pod="openshift-marketplace/certified-operators-2d7nf" Mar 09 09:10:40 crc kubenswrapper[4792]: I0309 09:10:40.522830 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4lq57\" (UniqueName: \"kubernetes.io/projected/69535a14-c11d-442a-837d-f1d6744cb530-kube-api-access-4lq57\") pod \"certified-operators-t2jhd\" (UID: \"69535a14-c11d-442a-837d-f1d6744cb530\") " pod="openshift-marketplace/certified-operators-t2jhd" Mar 09 09:10:40 crc kubenswrapper[4792]: I0309 09:10:40.596031 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-twjdp\" (UniqueName: \"kubernetes.io/projected/50c5e5dd-62cf-470c-a626-27cca12c69fb-kube-api-access-twjdp\") pod \"certified-operators-2d7nf\" (UID: \"50c5e5dd-62cf-470c-a626-27cca12c69fb\") " pod="openshift-marketplace/certified-operators-2d7nf" Mar 09 09:10:40 crc kubenswrapper[4792]: I0309 09:10:40.606203 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-m7bwm" podStartSLOduration=162.606175057 podStartE2EDuration="2m42.606175057s" podCreationTimestamp="2026-03-09 09:07:58 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:10:40.516609853 +0000 UTC m=+205.546810605" watchObservedRunningTime="2026-03-09 09:10:40.606175057 +0000 UTC m=+205.636375809" Mar 09 09:10:40 crc kubenswrapper[4792]: I0309 09:10:40.626031 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kkrgv\" (UID: \"e5b8826d-81fe-4d43-9177-33e8e34ca003\") " pod="openshift-image-registry/image-registry-697d97f7c8-kkrgv" Mar 09 09:10:40 crc kubenswrapper[4792]: E0309 09:10:40.626436 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 09:10:41.126420543 +0000 UTC m=+206.156621295 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kkrgv" (UID: "e5b8826d-81fe-4d43-9177-33e8e34ca003") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:10:40 crc kubenswrapper[4792]: I0309 09:10:40.700141 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-49k84" podStartSLOduration=162.700119616 podStartE2EDuration="2m42.700119616s" podCreationTimestamp="2026-03-09 09:07:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:10:40.62741649 +0000 UTC m=+205.657617252" watchObservedRunningTime="2026-03-09 09:10:40.700119616 +0000 UTC m=+205.730320368" Mar 09 09:10:40 crc kubenswrapper[4792]: I0309 09:10:40.721349 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-t2jhd" Mar 09 09:10:40 crc kubenswrapper[4792]: I0309 09:10:40.728015 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 09:10:40 crc kubenswrapper[4792]: E0309 09:10:40.729026 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-09 09:10:41.229000906 +0000 UTC m=+206.259201658 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:10:40 crc kubenswrapper[4792]: I0309 09:10:40.832945 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kkrgv\" (UID: \"e5b8826d-81fe-4d43-9177-33e8e34ca003\") " pod="openshift-image-registry/image-registry-697d97f7c8-kkrgv" Mar 09 09:10:40 crc kubenswrapper[4792]: E0309 09:10:40.833360 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 09:10:41.333347231 +0000 UTC m=+206.363547983 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kkrgv" (UID: "e5b8826d-81fe-4d43-9177-33e8e34ca003") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:10:40 crc kubenswrapper[4792]: I0309 09:10:40.851750 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2d7nf" Mar 09 09:10:40 crc kubenswrapper[4792]: I0309 09:10:40.937112 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 09 09:10:40 crc kubenswrapper[4792]: I0309 09:10:40.938158 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 09:10:40 crc kubenswrapper[4792]: E0309 09:10:40.938445 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 09:10:41.438432495 +0000 UTC m=+206.468633247 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:10:40 crc kubenswrapper[4792]: I0309 09:10:40.938525 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8hkmv" Mar 09 09:10:40 crc kubenswrapper[4792]: I0309 09:10:40.943156 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-wtkcv" Mar 09 09:10:41 crc kubenswrapper[4792]: I0309 09:10:41.039167 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kkrgv\" (UID: \"e5b8826d-81fe-4d43-9177-33e8e34ca003\") " pod="openshift-image-registry/image-registry-697d97f7c8-kkrgv" Mar 09 09:10:41 crc kubenswrapper[4792]: E0309 09:10:41.039623 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 09:10:41.539608739 +0000 UTC m=+206.569809481 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kkrgv" (UID: "e5b8826d-81fe-4d43-9177-33e8e34ca003") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:10:41 crc kubenswrapper[4792]: I0309 09:10:41.078993 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-kmnjq" podStartSLOduration=163.078974517 podStartE2EDuration="2m43.078974517s" podCreationTimestamp="2026-03-09 09:07:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:10:40.890458352 +0000 UTC m=+205.920659104" watchObservedRunningTime="2026-03-09 09:10:41.078974517 +0000 UTC m=+206.109175259" Mar 09 09:10:41 crc kubenswrapper[4792]: 
I0309 09:10:41.080144 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-l8jxj" Mar 09 09:10:41 crc kubenswrapper[4792]: I0309 09:10:41.080217 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-l8jxj" Mar 09 09:10:41 crc kubenswrapper[4792]: I0309 09:10:41.094054 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-8kksk" event={"ID":"73189c5a-2649-4a99-9f15-d0e7cddea5ea","Type":"ContainerStarted","Data":"01f0704dbc17d245694efab2f0fed39a864cdecd12858ea5f4db363182bdb25e"} Mar 09 09:10:41 crc kubenswrapper[4792]: I0309 09:10:41.142645 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 09:10:41 crc kubenswrapper[4792]: E0309 09:10:41.144167 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 09:10:41.644135488 +0000 UTC m=+206.674336240 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:10:41 crc kubenswrapper[4792]: I0309 09:10:41.160894 4792 ???:1] "http: TLS handshake error from 192.168.126.11:44874: no serving certificate available for the kubelet" Mar 09 09:10:41 crc kubenswrapper[4792]: I0309 09:10:41.244659 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kkrgv\" (UID: \"e5b8826d-81fe-4d43-9177-33e8e34ca003\") " pod="openshift-image-registry/image-registry-697d97f7c8-kkrgv" Mar 09 09:10:41 crc kubenswrapper[4792]: E0309 09:10:41.245048 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 09:10:41.745032845 +0000 UTC m=+206.775233597 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kkrgv" (UID: "e5b8826d-81fe-4d43-9177-33e8e34ca003") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:10:41 crc kubenswrapper[4792]: I0309 09:10:41.288958 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-2mjvb" podStartSLOduration=163.288938101 podStartE2EDuration="2m43.288938101s" podCreationTimestamp="2026-03-09 09:07:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:10:41.286647726 +0000 UTC m=+206.316848478" watchObservedRunningTime="2026-03-09 09:10:41.288938101 +0000 UTC m=+206.319138853" Mar 09 09:10:41 crc kubenswrapper[4792]: I0309 09:10:41.290319 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zsj86" podStartSLOduration=164.29031289 podStartE2EDuration="2m44.29031289s" podCreationTimestamp="2026-03-09 09:07:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:10:41.199021287 +0000 UTC m=+206.229222039" watchObservedRunningTime="2026-03-09 09:10:41.29031289 +0000 UTC m=+206.320513642" Mar 09 09:10:41 crc kubenswrapper[4792]: I0309 09:10:41.346321 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 09:10:41 crc kubenswrapper[4792]: E0309 09:10:41.346619 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 09:10:41.846600369 +0000 UTC m=+206.876801121 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:10:41 crc kubenswrapper[4792]: I0309 09:10:41.449916 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kkrgv\" (UID: \"e5b8826d-81fe-4d43-9177-33e8e34ca003\") " pod="openshift-image-registry/image-registry-697d97f7c8-kkrgv" Mar 09 09:10:41 crc kubenswrapper[4792]: E0309 09:10:41.450406 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 09:10:41.950387237 +0000 UTC m=+206.980587989 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kkrgv" (UID: "e5b8826d-81fe-4d43-9177-33e8e34ca003") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:10:41 crc kubenswrapper[4792]: I0309 09:10:41.495363 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-kcnhk" Mar 09 09:10:41 crc kubenswrapper[4792]: I0309 09:10:41.501224 4792 patch_prober.go:28] interesting pod/router-default-5444994796-kcnhk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 09 09:10:41 crc kubenswrapper[4792]: [-]has-synced failed: reason withheld Mar 09 09:10:41 crc kubenswrapper[4792]: [+]process-running ok Mar 09 09:10:41 crc kubenswrapper[4792]: healthz check failed Mar 09 09:10:41 crc kubenswrapper[4792]: I0309 09:10:41.501269 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-kcnhk" podUID="83d49252-7752-48bc-86b6-c604984cd533" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 09 09:10:41 crc kubenswrapper[4792]: I0309 09:10:41.527170 4792 patch_prober.go:28] interesting pod/downloads-7954f5f757-9ch2w container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body= Mar 09 09:10:41 crc kubenswrapper[4792]: I0309 09:10:41.527230 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-9ch2w" 
podUID="f39d8cd3-a63e-4aeb-9609-fe4c8ed6372b" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" Mar 09 09:10:41 crc kubenswrapper[4792]: I0309 09:10:41.527165 4792 patch_prober.go:28] interesting pod/downloads-7954f5f757-9ch2w container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body= Mar 09 09:10:41 crc kubenswrapper[4792]: I0309 09:10:41.527294 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-9ch2w" podUID="f39d8cd3-a63e-4aeb-9609-fe4c8ed6372b" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" Mar 09 09:10:41 crc kubenswrapper[4792]: I0309 09:10:41.528376 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-l8jxj" podStartSLOduration=164.528363222 podStartE2EDuration="2m44.528363222s" podCreationTimestamp="2026-03-09 09:07:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:10:41.524714318 +0000 UTC m=+206.554915060" watchObservedRunningTime="2026-03-09 09:10:41.528363222 +0000 UTC m=+206.558563974" Mar 09 09:10:41 crc kubenswrapper[4792]: I0309 09:10:41.551727 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 09:10:41 crc kubenswrapper[4792]: E0309 09:10:41.552103 4792 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 09:10:42.052086526 +0000 UTC m=+207.082287278 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:10:41 crc kubenswrapper[4792]: I0309 09:10:41.629090 4792 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-49k84 container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.217.0.14:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 09:10:41 crc kubenswrapper[4792]: I0309 09:10:41.629151 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-49k84" podUID="d3c17466-62af-4a4b-bc43-a4eda5d974dd" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.14:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 09 09:10:41 crc kubenswrapper[4792]: I0309 09:10:41.629345 4792 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-49k84 container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.217.0.14:8443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" 
start-of-body= Mar 09 09:10:41 crc kubenswrapper[4792]: I0309 09:10:41.629375 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-49k84" podUID="d3c17466-62af-4a4b-bc43-a4eda5d974dd" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.14:8443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 09:10:41 crc kubenswrapper[4792]: I0309 09:10:41.653968 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kkrgv\" (UID: \"e5b8826d-81fe-4d43-9177-33e8e34ca003\") " pod="openshift-image-registry/image-registry-697d97f7c8-kkrgv" Mar 09 09:10:41 crc kubenswrapper[4792]: E0309 09:10:41.655659 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 09:10:42.155637787 +0000 UTC m=+207.185838539 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kkrgv" (UID: "e5b8826d-81fe-4d43-9177-33e8e34ca003") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:10:41 crc kubenswrapper[4792]: I0309 09:10:41.679023 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-jh5pl" Mar 09 09:10:41 crc kubenswrapper[4792]: I0309 09:10:41.679103 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-jh5pl" Mar 09 09:10:41 crc kubenswrapper[4792]: I0309 09:10:41.685907 4792 patch_prober.go:28] interesting pod/console-f9d7485db-jh5pl container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.30:8443/health\": dial tcp 10.217.0.30:8443: connect: connection refused" start-of-body= Mar 09 09:10:41 crc kubenswrapper[4792]: I0309 09:10:41.685971 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-jh5pl" podUID="894f7c69-0119-4c19-b205-9780fb52b06e" containerName="console" probeResult="failure" output="Get \"https://10.217.0.30:8443/health\": dial tcp 10.217.0.30:8443: connect: connection refused" Mar 09 09:10:41 crc kubenswrapper[4792]: I0309 09:10:41.743792 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ngdp7" podStartSLOduration=163.743771781 podStartE2EDuration="2m43.743771781s" podCreationTimestamp="2026-03-09 09:07:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:10:41.743568895 +0000 UTC 
m=+206.773769647" watchObservedRunningTime="2026-03-09 09:10:41.743771781 +0000 UTC m=+206.773972533" Mar 09 09:10:41 crc kubenswrapper[4792]: I0309 09:10:41.756104 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 09:10:41 crc kubenswrapper[4792]: E0309 09:10:41.756703 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 09:10:42.256677667 +0000 UTC m=+207.286878429 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:10:41 crc kubenswrapper[4792]: I0309 09:10:41.826727 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ngdp7" Mar 09 09:10:41 crc kubenswrapper[4792]: I0309 09:10:41.827462 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ngdp7" Mar 09 09:10:41 crc kubenswrapper[4792]: I0309 09:10:41.858938 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kkrgv\" (UID: \"e5b8826d-81fe-4d43-9177-33e8e34ca003\") " pod="openshift-image-registry/image-registry-697d97f7c8-kkrgv" Mar 09 09:10:41 crc kubenswrapper[4792]: E0309 09:10:41.862544 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 09:10:42.362514194 +0000 UTC m=+207.392714946 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kkrgv" (UID: "e5b8826d-81fe-4d43-9177-33e8e34ca003") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:10:41 crc kubenswrapper[4792]: I0309 09:10:41.884804 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ngdp7" Mar 09 09:10:41 crc kubenswrapper[4792]: I0309 09:10:41.923861 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-8wm7x"] Mar 09 09:10:41 crc kubenswrapper[4792]: I0309 09:10:41.925106 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8wm7x" Mar 09 09:10:41 crc kubenswrapper[4792]: I0309 09:10:41.952757 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 09 09:10:41 crc kubenswrapper[4792]: I0309 09:10:41.954475 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8wm7x"] Mar 09 09:10:41 crc kubenswrapper[4792]: I0309 09:10:41.960670 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 09:10:41 crc kubenswrapper[4792]: E0309 09:10:41.961470 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 09:10:42.461437173 +0000 UTC m=+207.491637925 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 09:10:42 crc kubenswrapper[4792]: I0309 09:10:42.062050 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8606aa7f-7b07-40df-b9b8-f415a5e68b47-catalog-content\") pod \"redhat-marketplace-8wm7x\" (UID: \"8606aa7f-7b07-40df-b9b8-f415a5e68b47\") " pod="openshift-marketplace/redhat-marketplace-8wm7x"
Mar 09 09:10:42 crc kubenswrapper[4792]: I0309 09:10:42.062113 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8606aa7f-7b07-40df-b9b8-f415a5e68b47-utilities\") pod \"redhat-marketplace-8wm7x\" (UID: \"8606aa7f-7b07-40df-b9b8-f415a5e68b47\") " pod="openshift-marketplace/redhat-marketplace-8wm7x"
Mar 09 09:10:42 crc kubenswrapper[4792]: I0309 09:10:42.062181 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kkrgv\" (UID: \"e5b8826d-81fe-4d43-9177-33e8e34ca003\") " pod="openshift-image-registry/image-registry-697d97f7c8-kkrgv"
Mar 09 09:10:42 crc kubenswrapper[4792]: I0309 09:10:42.062260 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mdjsz\" (UniqueName: \"kubernetes.io/projected/8606aa7f-7b07-40df-b9b8-f415a5e68b47-kube-api-access-mdjsz\") pod \"redhat-marketplace-8wm7x\" (UID: \"8606aa7f-7b07-40df-b9b8-f415a5e68b47\") " pod="openshift-marketplace/redhat-marketplace-8wm7x"
Mar 09 09:10:42 crc kubenswrapper[4792]: E0309 09:10:42.062657 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 09:10:42.562641938 +0000 UTC m=+207.592842690 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kkrgv" (UID: "e5b8826d-81fe-4d43-9177-33e8e34ca003") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 09:10:42 crc kubenswrapper[4792]: I0309 09:10:42.120214 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-xgrs4"
Mar 09 09:10:42 crc kubenswrapper[4792]: I0309 09:10:42.152338 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-8kksk" event={"ID":"73189c5a-2649-4a99-9f15-d0e7cddea5ea","Type":"ContainerStarted","Data":"dba536d172944bad4ebf070f2dc6f8ca5f590f09ef7ec55f4a9da4f28c2ab1cb"}
Mar 09 09:10:42 crc kubenswrapper[4792]: I0309 09:10:42.162961 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 09 09:10:42 crc kubenswrapper[4792]: I0309 09:10:42.163328 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mdjsz\" (UniqueName: \"kubernetes.io/projected/8606aa7f-7b07-40df-b9b8-f415a5e68b47-kube-api-access-mdjsz\") pod \"redhat-marketplace-8wm7x\" (UID: \"8606aa7f-7b07-40df-b9b8-f415a5e68b47\") " pod="openshift-marketplace/redhat-marketplace-8wm7x"
Mar 09 09:10:42 crc kubenswrapper[4792]: I0309 09:10:42.163383 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8606aa7f-7b07-40df-b9b8-f415a5e68b47-catalog-content\") pod \"redhat-marketplace-8wm7x\" (UID: \"8606aa7f-7b07-40df-b9b8-f415a5e68b47\") " pod="openshift-marketplace/redhat-marketplace-8wm7x"
Mar 09 09:10:42 crc kubenswrapper[4792]: I0309 09:10:42.163405 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8606aa7f-7b07-40df-b9b8-f415a5e68b47-utilities\") pod \"redhat-marketplace-8wm7x\" (UID: \"8606aa7f-7b07-40df-b9b8-f415a5e68b47\") " pod="openshift-marketplace/redhat-marketplace-8wm7x"
Mar 09 09:10:42 crc kubenswrapper[4792]: I0309 09:10:42.163987 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8606aa7f-7b07-40df-b9b8-f415a5e68b47-utilities\") pod \"redhat-marketplace-8wm7x\" (UID: \"8606aa7f-7b07-40df-b9b8-f415a5e68b47\") " pod="openshift-marketplace/redhat-marketplace-8wm7x"
Mar 09 09:10:42 crc kubenswrapper[4792]: E0309 09:10:42.164114 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 09:10:42.6640972 +0000 UTC m=+207.694297952 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 09:10:42 crc kubenswrapper[4792]: I0309 09:10:42.164619 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8606aa7f-7b07-40df-b9b8-f415a5e68b47-catalog-content\") pod \"redhat-marketplace-8wm7x\" (UID: \"8606aa7f-7b07-40df-b9b8-f415a5e68b47\") " pod="openshift-marketplace/redhat-marketplace-8wm7x"
Mar 09 09:10:42 crc kubenswrapper[4792]: I0309 09:10:42.180083 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ngdp7"
Mar 09 09:10:42 crc kubenswrapper[4792]: I0309 09:10:42.264421 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kkrgv\" (UID: \"e5b8826d-81fe-4d43-9177-33e8e34ca003\") " pod="openshift-image-registry/image-registry-697d97f7c8-kkrgv"
Mar 09 09:10:42 crc kubenswrapper[4792]: E0309 09:10:42.266651 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 09:10:42.766639142 +0000 UTC m=+207.796839894 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kkrgv" (UID: "e5b8826d-81fe-4d43-9177-33e8e34ca003") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 09:10:42 crc kubenswrapper[4792]: I0309 09:10:42.318320 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-8kksk" podStartSLOduration=13.31829509 podStartE2EDuration="13.31829509s" podCreationTimestamp="2026-03-09 09:10:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:10:42.314637695 +0000 UTC m=+207.344838447" watchObservedRunningTime="2026-03-09 09:10:42.31829509 +0000 UTC m=+207.348495842"
Mar 09 09:10:42 crc kubenswrapper[4792]: I0309 09:10:42.365859 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 09 09:10:42 crc kubenswrapper[4792]: E0309 09:10:42.366246 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 09:10:42.866230461 +0000 UTC m=+207.896431213 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 09:10:42 crc kubenswrapper[4792]: I0309 09:10:42.366300 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-mj542"]
Mar 09 09:10:42 crc kubenswrapper[4792]: I0309 09:10:42.367865 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mj542"
Mar 09 09:10:42 crc kubenswrapper[4792]: I0309 09:10:42.469625 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a19c063e-191a-487c-a491-0af8c6fc1e3f-utilities\") pod \"redhat-marketplace-mj542\" (UID: \"a19c063e-191a-487c-a491-0af8c6fc1e3f\") " pod="openshift-marketplace/redhat-marketplace-mj542"
Mar 09 09:10:42 crc kubenswrapper[4792]: I0309 09:10:42.469716 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a19c063e-191a-487c-a491-0af8c6fc1e3f-catalog-content\") pod \"redhat-marketplace-mj542\" (UID: \"a19c063e-191a-487c-a491-0af8c6fc1e3f\") " pod="openshift-marketplace/redhat-marketplace-mj542"
Mar 09 09:10:42 crc kubenswrapper[4792]: I0309 09:10:42.469757 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2brw5\" (UniqueName: \"kubernetes.io/projected/a19c063e-191a-487c-a491-0af8c6fc1e3f-kube-api-access-2brw5\") pod \"redhat-marketplace-mj542\" (UID: \"a19c063e-191a-487c-a491-0af8c6fc1e3f\") " pod="openshift-marketplace/redhat-marketplace-mj542"
Mar 09 09:10:42 crc kubenswrapper[4792]: I0309 09:10:42.469796 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kkrgv\" (UID: \"e5b8826d-81fe-4d43-9177-33e8e34ca003\") " pod="openshift-image-registry/image-registry-697d97f7c8-kkrgv"
Mar 09 09:10:42 crc kubenswrapper[4792]: E0309 09:10:42.470178 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 09:10:42.970164864 +0000 UTC m=+208.000365616 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kkrgv" (UID: "e5b8826d-81fe-4d43-9177-33e8e34ca003") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 09:10:42 crc kubenswrapper[4792]: I0309 09:10:42.478212 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mj542"]
Mar 09 09:10:42 crc kubenswrapper[4792]: I0309 09:10:42.492988 4792 patch_prober.go:28] interesting pod/router-default-5444994796-kcnhk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 09 09:10:42 crc kubenswrapper[4792]: [-]has-synced failed: reason withheld
Mar 09 09:10:42 crc kubenswrapper[4792]: [+]process-running ok
Mar 09 09:10:42 crc kubenswrapper[4792]: healthz check failed
Mar 09 09:10:42 crc kubenswrapper[4792]: I0309 09:10:42.493048 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-kcnhk" podUID="83d49252-7752-48bc-86b6-c604984cd533" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 09 09:10:42 crc kubenswrapper[4792]: I0309 09:10:42.570924 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 09 09:10:42 crc kubenswrapper[4792]: I0309 09:10:42.571315 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a19c063e-191a-487c-a491-0af8c6fc1e3f-catalog-content\") pod \"redhat-marketplace-mj542\" (UID: \"a19c063e-191a-487c-a491-0af8c6fc1e3f\") " pod="openshift-marketplace/redhat-marketplace-mj542"
Mar 09 09:10:42 crc kubenswrapper[4792]: I0309 09:10:42.571378 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2brw5\" (UniqueName: \"kubernetes.io/projected/a19c063e-191a-487c-a491-0af8c6fc1e3f-kube-api-access-2brw5\") pod \"redhat-marketplace-mj542\" (UID: \"a19c063e-191a-487c-a491-0af8c6fc1e3f\") " pod="openshift-marketplace/redhat-marketplace-mj542"
Mar 09 09:10:42 crc kubenswrapper[4792]: I0309 09:10:42.571665 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a19c063e-191a-487c-a491-0af8c6fc1e3f-utilities\") pod \"redhat-marketplace-mj542\" (UID: \"a19c063e-191a-487c-a491-0af8c6fc1e3f\") " pod="openshift-marketplace/redhat-marketplace-mj542"
Mar 09 09:10:42 crc kubenswrapper[4792]: I0309 09:10:42.572896 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a19c063e-191a-487c-a491-0af8c6fc1e3f-utilities\") pod \"redhat-marketplace-mj542\" (UID: \"a19c063e-191a-487c-a491-0af8c6fc1e3f\") " pod="openshift-marketplace/redhat-marketplace-mj542"
Mar 09 09:10:42 crc kubenswrapper[4792]: E0309 09:10:42.573097 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 09:10:43.073057446 +0000 UTC m=+208.103258198 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 09:10:42 crc kubenswrapper[4792]: I0309 09:10:42.573377 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a19c063e-191a-487c-a491-0af8c6fc1e3f-catalog-content\") pod \"redhat-marketplace-mj542\" (UID: \"a19c063e-191a-487c-a491-0af8c6fc1e3f\") " pod="openshift-marketplace/redhat-marketplace-mj542"
Mar 09 09:10:42 crc kubenswrapper[4792]: I0309 09:10:42.642497 4792 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock"
Mar 09 09:10:42 crc kubenswrapper[4792]: I0309 09:10:42.658428 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2brw5\" (UniqueName: \"kubernetes.io/projected/a19c063e-191a-487c-a491-0af8c6fc1e3f-kube-api-access-2brw5\") pod \"redhat-marketplace-mj542\" (UID: \"a19c063e-191a-487c-a491-0af8c6fc1e3f\") " pod="openshift-marketplace/redhat-marketplace-mj542"
Mar 09 09:10:42 crc kubenswrapper[4792]: I0309 09:10:42.673238 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kkrgv\" (UID: \"e5b8826d-81fe-4d43-9177-33e8e34ca003\") " pod="openshift-image-registry/image-registry-697d97f7c8-kkrgv"
Mar 09 09:10:42 crc kubenswrapper[4792]: E0309 09:10:42.673687 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 09:10:43.173669454 +0000 UTC m=+208.203870206 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kkrgv" (UID: "e5b8826d-81fe-4d43-9177-33e8e34ca003") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 09:10:42 crc kubenswrapper[4792]: I0309 09:10:42.716802 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mj542"
Mar 09 09:10:42 crc kubenswrapper[4792]: I0309 09:10:42.774296 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 09 09:10:42 crc kubenswrapper[4792]: E0309 09:10:42.774626 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 09:10:43.274609751 +0000 UTC m=+208.304810503 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 09:10:42 crc kubenswrapper[4792]: I0309 09:10:42.774884 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kkrgv\" (UID: \"e5b8826d-81fe-4d43-9177-33e8e34ca003\") " pod="openshift-image-registry/image-registry-697d97f7c8-kkrgv"
Mar 09 09:10:42 crc kubenswrapper[4792]: E0309 09:10:42.775229 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 09:10:43.275222608 +0000 UTC m=+208.305423360 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kkrgv" (UID: "e5b8826d-81fe-4d43-9177-33e8e34ca003") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 09:10:42 crc kubenswrapper[4792]: I0309 09:10:42.852086 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8hkmv"]
Mar 09 09:10:42 crc kubenswrapper[4792]: I0309 09:10:42.875798 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 09 09:10:42 crc kubenswrapper[4792]: E0309 09:10:42.876420 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 09:10:43.376399373 +0000 UTC m=+208.406600125 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 09:10:42 crc kubenswrapper[4792]: I0309 09:10:42.902489 4792 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-03-09T09:10:42.64253884Z","Handler":null,"Name":""}
Mar 09 09:10:42 crc kubenswrapper[4792]: I0309 09:10:42.945414 4792 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0
Mar 09 09:10:42 crc kubenswrapper[4792]: I0309 09:10:42.945470 4792 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock
Mar 09 09:10:42 crc kubenswrapper[4792]: I0309 09:10:42.978906 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kkrgv\" (UID: \"e5b8826d-81fe-4d43-9177-33e8e34ca003\") " pod="openshift-image-registry/image-registry-697d97f7c8-kkrgv"
Mar 09 09:10:43 crc kubenswrapper[4792]: I0309 09:10:43.000141 4792 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Mar 09 09:10:43 crc kubenswrapper[4792]: I0309 09:10:43.000185 4792 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kkrgv\" (UID: \"e5b8826d-81fe-4d43-9177-33e8e34ca003\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-kkrgv"
Mar 09 09:10:43 crc kubenswrapper[4792]: I0309 09:10:43.003836 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-jsnbn"]
Mar 09 09:10:43 crc kubenswrapper[4792]: I0309 09:10:43.004949 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jsnbn"
Mar 09 09:10:43 crc kubenswrapper[4792]: I0309 09:10:43.033625 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-ncpc5"]
Mar 09 09:10:43 crc kubenswrapper[4792]: I0309 09:10:43.034191 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-ncpc5" podUID="0f3faf8f-f49a-4e99-93cc-50eecd2d2c3c" containerName="controller-manager" containerID="cri-o://a1b2308ae9ee926601eb462bfc9a895cf553904e21f338d888a164d495f8977d" gracePeriod=30
Mar 09 09:10:43 crc kubenswrapper[4792]: I0309 09:10:43.055109 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh"
Mar 09 09:10:43 crc kubenswrapper[4792]: I0309 09:10:43.084527 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0667075c-38b7-4fb6-ad69-a31987eae3cc-catalog-content\") pod \"redhat-operators-jsnbn\" (UID: \"0667075c-38b7-4fb6-ad69-a31987eae3cc\") " pod="openshift-marketplace/redhat-operators-jsnbn"
Mar 09 09:10:43 crc kubenswrapper[4792]: I0309 09:10:43.084619 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m2j2c\" (UniqueName: \"kubernetes.io/projected/0667075c-38b7-4fb6-ad69-a31987eae3cc-kube-api-access-m2j2c\") pod \"redhat-operators-jsnbn\" (UID: \"0667075c-38b7-4fb6-ad69-a31987eae3cc\") " pod="openshift-marketplace/redhat-operators-jsnbn"
Mar 09 09:10:43 crc kubenswrapper[4792]: I0309 09:10:43.084650 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0667075c-38b7-4fb6-ad69-a31987eae3cc-utilities\") pod \"redhat-operators-jsnbn\" (UID: \"0667075c-38b7-4fb6-ad69-a31987eae3cc\") " pod="openshift-marketplace/redhat-operators-jsnbn"
Mar 09 09:10:43 crc kubenswrapper[4792]: I0309 09:10:43.084778 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-t2jhd"]
Mar 09 09:10:43 crc kubenswrapper[4792]: I0309 09:10:43.172318 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jsnbn"]
Mar 09 09:10:43 crc kubenswrapper[4792]: I0309 09:10:43.191834 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0667075c-38b7-4fb6-ad69-a31987eae3cc-catalog-content\") pod \"redhat-operators-jsnbn\" (UID: \"0667075c-38b7-4fb6-ad69-a31987eae3cc\") " pod="openshift-marketplace/redhat-operators-jsnbn"
Mar 09 09:10:43 crc kubenswrapper[4792]: I0309 09:10:43.191950 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m2j2c\" (UniqueName: \"kubernetes.io/projected/0667075c-38b7-4fb6-ad69-a31987eae3cc-kube-api-access-m2j2c\") pod \"redhat-operators-jsnbn\" (UID: \"0667075c-38b7-4fb6-ad69-a31987eae3cc\") " pod="openshift-marketplace/redhat-operators-jsnbn"
Mar 09 09:10:43 crc kubenswrapper[4792]: I0309 09:10:43.191983 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0667075c-38b7-4fb6-ad69-a31987eae3cc-utilities\") pod \"redhat-operators-jsnbn\" (UID: \"0667075c-38b7-4fb6-ad69-a31987eae3cc\") " pod="openshift-marketplace/redhat-operators-jsnbn"
Mar 09 09:10:43 crc kubenswrapper[4792]: I0309 09:10:43.192445 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0667075c-38b7-4fb6-ad69-a31987eae3cc-utilities\") pod \"redhat-operators-jsnbn\" (UID: \"0667075c-38b7-4fb6-ad69-a31987eae3cc\") " pod="openshift-marketplace/redhat-operators-jsnbn"
Mar 09 09:10:43 crc kubenswrapper[4792]: I0309 09:10:43.192673 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0667075c-38b7-4fb6-ad69-a31987eae3cc-catalog-content\") pod \"redhat-operators-jsnbn\" (UID: \"0667075c-38b7-4fb6-ad69-a31987eae3cc\") " pod="openshift-marketplace/redhat-operators-jsnbn"
Mar 09 09:10:43 crc kubenswrapper[4792]: I0309 09:10:43.205823 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mdjsz\" (UniqueName: \"kubernetes.io/projected/8606aa7f-7b07-40df-b9b8-f415a5e68b47-kube-api-access-mdjsz\") pod \"redhat-marketplace-8wm7x\" (UID: \"8606aa7f-7b07-40df-b9b8-f415a5e68b47\") " pod="openshift-marketplace/redhat-marketplace-8wm7x"
Mar 09 09:10:43 crc kubenswrapper[4792]: I0309 09:10:43.214824 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-7f29j"]
Mar 09 09:10:43 crc kubenswrapper[4792]: I0309 09:10:43.215103 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7f29j" podUID="ea625b31-78ac-4c2b-8f73-3e5c74894fce" containerName="route-controller-manager" containerID="cri-o://36134bcc4edd8cefaf2ef68607c266ba37a2d30ce05b79f1ac713d688301ec95" gracePeriod=30
Mar 09 09:10:43 crc kubenswrapper[4792]: I0309 09:10:43.217552 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t2jhd" event={"ID":"69535a14-c11d-442a-837d-f1d6744cb530","Type":"ContainerStarted","Data":"c1f9edc92ab1748cf13549f8902d5a0ea8ba2420f2858e6080f6aafff25ab893"}
Mar 09 09:10:43 crc kubenswrapper[4792]: I0309 09:10:43.217704 4792 patch_prober.go:28] interesting pod/machine-config-daemon-97tth container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 09 09:10:43 crc kubenswrapper[4792]: I0309 09:10:43.217749 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 09 09:10:43 crc kubenswrapper[4792]: I0309 09:10:43.244438 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8hkmv" event={"ID":"047810b2-277c-4d4c-822a-98b6d2a91fcc","Type":"ContainerStarted","Data":"8bd2b905a42ac2a7914349acf7483e6a96b627bc84262d46412438104a33559b"}
Mar 09 09:10:43 crc kubenswrapper[4792]: I0309 09:10:43.244489 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2d7nf"]
Mar 09 09:10:43 crc kubenswrapper[4792]: I0309 09:10:43.247700 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-c4nwc"]
Mar 09 09:10:43 crc kubenswrapper[4792]: I0309 09:10:43.255375 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-c4nwc"
Mar 09 09:10:43 crc kubenswrapper[4792]: I0309 09:10:43.275249 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-c4nwc"]
Mar 09 09:10:43 crc kubenswrapper[4792]: I0309 09:10:43.339659 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m2j2c\" (UniqueName: \"kubernetes.io/projected/0667075c-38b7-4fb6-ad69-a31987eae3cc-kube-api-access-m2j2c\") pod \"redhat-operators-jsnbn\" (UID: \"0667075c-38b7-4fb6-ad69-a31987eae3cc\") " pod="openshift-marketplace/redhat-operators-jsnbn"
Mar 09 09:10:43 crc kubenswrapper[4792]: I0309 09:10:43.371344 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jsnbn"
Mar 09 09:10:43 crc kubenswrapper[4792]: I0309 09:10:43.399574 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-lfm2j"
Mar 09 09:10:43 crc kubenswrapper[4792]: I0309 09:10:43.401284 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c0aa86a9-9ed9-492f-ac14-43e14abf1f2c-catalog-content\") pod \"redhat-operators-c4nwc\" (UID: \"c0aa86a9-9ed9-492f-ac14-43e14abf1f2c\") " pod="openshift-marketplace/redhat-operators-c4nwc"
Mar 09 09:10:43 crc kubenswrapper[4792]: I0309 09:10:43.401429 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-64msg\" (UniqueName: \"kubernetes.io/projected/c0aa86a9-9ed9-492f-ac14-43e14abf1f2c-kube-api-access-64msg\") pod \"redhat-operators-c4nwc\" (UID: \"c0aa86a9-9ed9-492f-ac14-43e14abf1f2c\") " pod="openshift-marketplace/redhat-operators-c4nwc"
Mar 09 09:10:43 crc kubenswrapper[4792]: I0309 09:10:43.401484 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c0aa86a9-9ed9-492f-ac14-43e14abf1f2c-utilities\") pod \"redhat-operators-c4nwc\" (UID: \"c0aa86a9-9ed9-492f-ac14-43e14abf1f2c\") " pod="openshift-marketplace/redhat-operators-c4nwc"
Mar 09 09:10:43 crc kubenswrapper[4792]: I0309 09:10:43.413141 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wtkcv"]
Mar 09 09:10:43 crc kubenswrapper[4792]: I0309 09:10:43.461181 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8wm7x"
Mar 09 09:10:43 crc kubenswrapper[4792]: I0309 09:10:43.497809 4792 patch_prober.go:28] interesting pod/router-default-5444994796-kcnhk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 09 09:10:43 crc kubenswrapper[4792]: [-]has-synced failed: reason withheld
Mar 09 09:10:43 crc kubenswrapper[4792]: [+]process-running ok
Mar 09 09:10:43 crc kubenswrapper[4792]: healthz check failed
Mar 09 09:10:43 crc kubenswrapper[4792]: I0309 09:10:43.497866 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-kcnhk" podUID="83d49252-7752-48bc-86b6-c604984cd533" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 09 09:10:43 crc kubenswrapper[4792]: I0309 09:10:43.512728 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-64msg\" (UniqueName: \"kubernetes.io/projected/c0aa86a9-9ed9-492f-ac14-43e14abf1f2c-kube-api-access-64msg\") pod \"redhat-operators-c4nwc\" (UID: \"c0aa86a9-9ed9-492f-ac14-43e14abf1f2c\") " pod="openshift-marketplace/redhat-operators-c4nwc"
Mar 09 09:10:43 crc kubenswrapper[4792]: I0309 09:10:43.512793 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c0aa86a9-9ed9-492f-ac14-43e14abf1f2c-utilities\") pod \"redhat-operators-c4nwc\" (UID: \"c0aa86a9-9ed9-492f-ac14-43e14abf1f2c\") " pod="openshift-marketplace/redhat-operators-c4nwc"
Mar 09 09:10:43 crc kubenswrapper[4792]: I0309 09:10:43.512984 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c0aa86a9-9ed9-492f-ac14-43e14abf1f2c-catalog-content\") pod \"redhat-operators-c4nwc\" (UID: \"c0aa86a9-9ed9-492f-ac14-43e14abf1f2c\") " pod="openshift-marketplace/redhat-operators-c4nwc"
Mar 09 09:10:43 crc kubenswrapper[4792]: I0309 09:10:43.514255 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c0aa86a9-9ed9-492f-ac14-43e14abf1f2c-utilities\") pod \"redhat-operators-c4nwc\" (UID: \"c0aa86a9-9ed9-492f-ac14-43e14abf1f2c\") " pod="openshift-marketplace/redhat-operators-c4nwc"
Mar 09 09:10:43 crc kubenswrapper[4792]: I0309 09:10:43.515187 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c0aa86a9-9ed9-492f-ac14-43e14abf1f2c-catalog-content\") pod \"redhat-operators-c4nwc\" (UID: \"c0aa86a9-9ed9-492f-ac14-43e14abf1f2c\") " pod="openshift-marketplace/redhat-operators-c4nwc"
Mar 09 09:10:43 crc kubenswrapper[4792]: I0309 09:10:43.556904 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kkrgv\" (UID: \"e5b8826d-81fe-4d43-9177-33e8e34ca003\") "
pod="openshift-image-registry/image-registry-697d97f7c8-kkrgv" Mar 09 09:10:43 crc kubenswrapper[4792]: I0309 09:10:43.575351 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-64msg\" (UniqueName: \"kubernetes.io/projected/c0aa86a9-9ed9-492f-ac14-43e14abf1f2c-kube-api-access-64msg\") pod \"redhat-operators-c4nwc\" (UID: \"c0aa86a9-9ed9-492f-ac14-43e14abf1f2c\") " pod="openshift-marketplace/redhat-operators-c4nwc" Mar 09 09:10:43 crc kubenswrapper[4792]: W0309 09:10:43.602429 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc0500b46_411e_4371_83ae_1b148bf65ba9.slice/crio-12ddd27962373accb9db70cd2705c95b3bf27cb10beabc533c155430679fbeaa WatchSource:0}: Error finding container 12ddd27962373accb9db70cd2705c95b3bf27cb10beabc533c155430679fbeaa: Status 404 returned error can't find the container with id 12ddd27962373accb9db70cd2705c95b3bf27cb10beabc533c155430679fbeaa Mar 09 09:10:43 crc kubenswrapper[4792]: I0309 09:10:43.614792 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 09:10:43 crc kubenswrapper[4792]: I0309 09:10:43.638617 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-c4nwc" Mar 09 09:10:43 crc kubenswrapper[4792]: I0309 09:10:43.642036 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-49k84" Mar 09 09:10:43 crc kubenswrapper[4792]: I0309 09:10:43.672460 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 09 09:10:43 crc kubenswrapper[4792]: I0309 09:10:43.691000 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Mar 09 09:10:43 crc kubenswrapper[4792]: I0309 09:10:43.716104 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 09 09:10:43 crc kubenswrapper[4792]: I0309 09:10:43.716851 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 09 09:10:43 crc kubenswrapper[4792]: I0309 09:10:43.726418 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Mar 09 09:10:43 crc kubenswrapper[4792]: I0309 09:10:43.767506 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Mar 09 09:10:43 crc kubenswrapper[4792]: I0309 09:10:43.784244 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 09 09:10:43 crc kubenswrapper[4792]: I0309 09:10:43.818481 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/df3f43f0-0799-480e-9d12-e4132e44cb11-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"df3f43f0-0799-480e-9d12-e4132e44cb11\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 09 09:10:43 crc kubenswrapper[4792]: I0309 09:10:43.818703 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/df3f43f0-0799-480e-9d12-e4132e44cb11-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"df3f43f0-0799-480e-9d12-e4132e44cb11\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 09 09:10:43 crc kubenswrapper[4792]: I0309 09:10:43.857057 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-kkrgv" Mar 09 09:10:43 crc kubenswrapper[4792]: I0309 09:10:43.921782 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/df3f43f0-0799-480e-9d12-e4132e44cb11-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"df3f43f0-0799-480e-9d12-e4132e44cb11\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 09 09:10:43 crc kubenswrapper[4792]: I0309 09:10:43.921860 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/df3f43f0-0799-480e-9d12-e4132e44cb11-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"df3f43f0-0799-480e-9d12-e4132e44cb11\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 09 09:10:43 crc kubenswrapper[4792]: I0309 09:10:43.922356 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/df3f43f0-0799-480e-9d12-e4132e44cb11-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"df3f43f0-0799-480e-9d12-e4132e44cb11\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 09 09:10:43 crc kubenswrapper[4792]: I0309 09:10:43.923534 4792 ???:1] "http: TLS handshake error from 192.168.126.11:55610: no serving certificate available for the kubelet" Mar 09 09:10:44 crc kubenswrapper[4792]: I0309 09:10:44.060277 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/df3f43f0-0799-480e-9d12-e4132e44cb11-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"df3f43f0-0799-480e-9d12-e4132e44cb11\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 09 09:10:44 crc kubenswrapper[4792]: I0309 09:10:44.102384 4792 patch_prober.go:28] interesting pod/apiserver-76f77b778f-l8jxj container/openshift-apiserver 
namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Mar 09 09:10:44 crc kubenswrapper[4792]: [+]log ok Mar 09 09:10:44 crc kubenswrapper[4792]: [+]etcd ok Mar 09 09:10:44 crc kubenswrapper[4792]: [+]poststarthook/start-apiserver-admission-initializer ok Mar 09 09:10:44 crc kubenswrapper[4792]: [+]poststarthook/generic-apiserver-start-informers ok Mar 09 09:10:44 crc kubenswrapper[4792]: [+]poststarthook/max-in-flight-filter ok Mar 09 09:10:44 crc kubenswrapper[4792]: [+]poststarthook/storage-object-count-tracker-hook ok Mar 09 09:10:44 crc kubenswrapper[4792]: [+]poststarthook/image.openshift.io-apiserver-caches ok Mar 09 09:10:44 crc kubenswrapper[4792]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Mar 09 09:10:44 crc kubenswrapper[4792]: [-]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa failed: reason withheld Mar 09 09:10:44 crc kubenswrapper[4792]: [+]poststarthook/project.openshift.io-projectcache ok Mar 09 09:10:44 crc kubenswrapper[4792]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Mar 09 09:10:44 crc kubenswrapper[4792]: [-]poststarthook/openshift.io-startinformers failed: reason withheld Mar 09 09:10:44 crc kubenswrapper[4792]: [-]poststarthook/openshift.io-restmapperupdater failed: reason withheld Mar 09 09:10:44 crc kubenswrapper[4792]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Mar 09 09:10:44 crc kubenswrapper[4792]: livez check failed Mar 09 09:10:44 crc kubenswrapper[4792]: I0309 09:10:44.102842 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-l8jxj" podUID="62e49d0b-dfc1-48ab-bed7-7ba7fe8a4475" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 09 09:10:44 crc kubenswrapper[4792]: I0309 09:10:44.155416 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-mj542"] Mar 09 09:10:44 crc kubenswrapper[4792]: I0309 09:10:44.167538 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 09 09:10:44 crc kubenswrapper[4792]: I0309 09:10:44.291967 4792 generic.go:334] "Generic (PLEG): container finished" podID="0f3faf8f-f49a-4e99-93cc-50eecd2d2c3c" containerID="a1b2308ae9ee926601eb462bfc9a895cf553904e21f338d888a164d495f8977d" exitCode=0 Mar 09 09:10:44 crc kubenswrapper[4792]: I0309 09:10:44.292044 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-ncpc5" event={"ID":"0f3faf8f-f49a-4e99-93cc-50eecd2d2c3c","Type":"ContainerDied","Data":"a1b2308ae9ee926601eb462bfc9a895cf553904e21f338d888a164d495f8977d"} Mar 09 09:10:44 crc kubenswrapper[4792]: I0309 09:10:44.302413 4792 generic.go:334] "Generic (PLEG): container finished" podID="ea625b31-78ac-4c2b-8f73-3e5c74894fce" containerID="36134bcc4edd8cefaf2ef68607c266ba37a2d30ce05b79f1ac713d688301ec95" exitCode=0 Mar 09 09:10:44 crc kubenswrapper[4792]: I0309 09:10:44.302864 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7f29j" event={"ID":"ea625b31-78ac-4c2b-8f73-3e5c74894fce","Type":"ContainerDied","Data":"36134bcc4edd8cefaf2ef68607c266ba37a2d30ce05b79f1ac713d688301ec95"} Mar 09 09:10:44 crc kubenswrapper[4792]: I0309 09:10:44.327427 4792 generic.go:334] "Generic (PLEG): container finished" podID="50c5e5dd-62cf-470c-a626-27cca12c69fb" containerID="9fdca09e9f110fb0bf660bc1be638e39f2a36e2043b74d947175a185f7cdd292" exitCode=0 Mar 09 09:10:44 crc kubenswrapper[4792]: I0309 09:10:44.328323 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2d7nf" 
event={"ID":"50c5e5dd-62cf-470c-a626-27cca12c69fb","Type":"ContainerDied","Data":"9fdca09e9f110fb0bf660bc1be638e39f2a36e2043b74d947175a185f7cdd292"} Mar 09 09:10:44 crc kubenswrapper[4792]: I0309 09:10:44.328356 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2d7nf" event={"ID":"50c5e5dd-62cf-470c-a626-27cca12c69fb","Type":"ContainerStarted","Data":"25e168d0cb6ce00f938ad43401b9b7045e4c307aba5fff2ba079da25d5de36e8"} Mar 09 09:10:44 crc kubenswrapper[4792]: I0309 09:10:44.359741 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wtkcv" event={"ID":"c0500b46-411e-4371-83ae-1b148bf65ba9","Type":"ContainerStarted","Data":"12ddd27962373accb9db70cd2705c95b3bf27cb10beabc533c155430679fbeaa"} Mar 09 09:10:44 crc kubenswrapper[4792]: I0309 09:10:44.372886 4792 generic.go:334] "Generic (PLEG): container finished" podID="69535a14-c11d-442a-837d-f1d6744cb530" containerID="5e9a4dcf83e7458f8f8bbefb6ed7b2354b23971b3aac7969732fc50d8fa4d880" exitCode=0 Mar 09 09:10:44 crc kubenswrapper[4792]: I0309 09:10:44.372975 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t2jhd" event={"ID":"69535a14-c11d-442a-837d-f1d6744cb530","Type":"ContainerDied","Data":"5e9a4dcf83e7458f8f8bbefb6ed7b2354b23971b3aac7969732fc50d8fa4d880"} Mar 09 09:10:44 crc kubenswrapper[4792]: I0309 09:10:44.405601 4792 generic.go:334] "Generic (PLEG): container finished" podID="047810b2-277c-4d4c-822a-98b6d2a91fcc" containerID="673e39c1476b6f6837498d128df6d61369642de88ed9d4adf2f1d891294db1d0" exitCode=0 Mar 09 09:10:44 crc kubenswrapper[4792]: I0309 09:10:44.405708 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8hkmv" event={"ID":"047810b2-277c-4d4c-822a-98b6d2a91fcc","Type":"ContainerDied","Data":"673e39c1476b6f6837498d128df6d61369642de88ed9d4adf2f1d891294db1d0"} Mar 09 09:10:44 crc kubenswrapper[4792]: I0309 
09:10:44.430313 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mj542" event={"ID":"a19c063e-191a-487c-a491-0af8c6fc1e3f","Type":"ContainerStarted","Data":"acaf5150deb5ad7326c3f17812ce48891e87a8ee247bcf7520ff1399033388d6"} Mar 09 09:10:44 crc kubenswrapper[4792]: I0309 09:10:44.442668 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 09 09:10:44 crc kubenswrapper[4792]: I0309 09:10:44.443494 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 09 09:10:44 crc kubenswrapper[4792]: I0309 09:10:44.457793 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 09 09:10:44 crc kubenswrapper[4792]: I0309 09:10:44.459594 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 09 09:10:44 crc kubenswrapper[4792]: I0309 09:10:44.506189 4792 patch_prober.go:28] interesting pod/router-default-5444994796-kcnhk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 09 09:10:44 crc kubenswrapper[4792]: [-]has-synced failed: reason withheld Mar 09 09:10:44 crc kubenswrapper[4792]: [+]process-running ok Mar 09 09:10:44 crc kubenswrapper[4792]: healthz check failed Mar 09 09:10:44 crc kubenswrapper[4792]: I0309 09:10:44.506263 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-kcnhk" podUID="83d49252-7752-48bc-86b6-c604984cd533" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 09 09:10:44 crc kubenswrapper[4792]: I0309 09:10:44.531403 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 09 09:10:44 crc kubenswrapper[4792]: I0309 09:10:44.551062 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/53f92eb7-0267-41c3-aead-e2600d79a1b3-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"53f92eb7-0267-41c3-aead-e2600d79a1b3\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 09 09:10:44 crc kubenswrapper[4792]: I0309 09:10:44.551559 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/53f92eb7-0267-41c3-aead-e2600d79a1b3-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"53f92eb7-0267-41c3-aead-e2600d79a1b3\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 09 09:10:44 crc kubenswrapper[4792]: I0309 09:10:44.653376 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/53f92eb7-0267-41c3-aead-e2600d79a1b3-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"53f92eb7-0267-41c3-aead-e2600d79a1b3\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 09 09:10:44 crc kubenswrapper[4792]: I0309 09:10:44.653464 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/53f92eb7-0267-41c3-aead-e2600d79a1b3-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"53f92eb7-0267-41c3-aead-e2600d79a1b3\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 09 09:10:44 crc kubenswrapper[4792]: I0309 09:10:44.653571 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/53f92eb7-0267-41c3-aead-e2600d79a1b3-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"53f92eb7-0267-41c3-aead-e2600d79a1b3\") " 
pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 09 09:10:44 crc kubenswrapper[4792]: I0309 09:10:44.722582 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/53f92eb7-0267-41c3-aead-e2600d79a1b3-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"53f92eb7-0267-41c3-aead-e2600d79a1b3\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 09 09:10:44 crc kubenswrapper[4792]: I0309 09:10:44.780627 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-ncpc5" Mar 09 09:10:44 crc kubenswrapper[4792]: I0309 09:10:44.812233 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 09 09:10:44 crc kubenswrapper[4792]: I0309 09:10:44.929045 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8wm7x"] Mar 09 09:10:44 crc kubenswrapper[4792]: I0309 09:10:44.968661 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0f3faf8f-f49a-4e99-93cc-50eecd2d2c3c-serving-cert\") pod \"0f3faf8f-f49a-4e99-93cc-50eecd2d2c3c\" (UID: \"0f3faf8f-f49a-4e99-93cc-50eecd2d2c3c\") " Mar 09 09:10:44 crc kubenswrapper[4792]: I0309 09:10:44.968753 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0f3faf8f-f49a-4e99-93cc-50eecd2d2c3c-proxy-ca-bundles\") pod \"0f3faf8f-f49a-4e99-93cc-50eecd2d2c3c\" (UID: \"0f3faf8f-f49a-4e99-93cc-50eecd2d2c3c\") " Mar 09 09:10:44 crc kubenswrapper[4792]: I0309 09:10:44.968791 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vrsxh\" (UniqueName: \"kubernetes.io/projected/0f3faf8f-f49a-4e99-93cc-50eecd2d2c3c-kube-api-access-vrsxh\") pod 
\"0f3faf8f-f49a-4e99-93cc-50eecd2d2c3c\" (UID: \"0f3faf8f-f49a-4e99-93cc-50eecd2d2c3c\") " Mar 09 09:10:44 crc kubenswrapper[4792]: I0309 09:10:44.968844 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f3faf8f-f49a-4e99-93cc-50eecd2d2c3c-config\") pod \"0f3faf8f-f49a-4e99-93cc-50eecd2d2c3c\" (UID: \"0f3faf8f-f49a-4e99-93cc-50eecd2d2c3c\") " Mar 09 09:10:44 crc kubenswrapper[4792]: I0309 09:10:44.968993 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0f3faf8f-f49a-4e99-93cc-50eecd2d2c3c-client-ca\") pod \"0f3faf8f-f49a-4e99-93cc-50eecd2d2c3c\" (UID: \"0f3faf8f-f49a-4e99-93cc-50eecd2d2c3c\") " Mar 09 09:10:44 crc kubenswrapper[4792]: I0309 09:10:44.970522 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0f3faf8f-f49a-4e99-93cc-50eecd2d2c3c-client-ca" (OuterVolumeSpecName: "client-ca") pod "0f3faf8f-f49a-4e99-93cc-50eecd2d2c3c" (UID: "0f3faf8f-f49a-4e99-93cc-50eecd2d2c3c"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:10:44 crc kubenswrapper[4792]: I0309 09:10:44.971626 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0f3faf8f-f49a-4e99-93cc-50eecd2d2c3c-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "0f3faf8f-f49a-4e99-93cc-50eecd2d2c3c" (UID: "0f3faf8f-f49a-4e99-93cc-50eecd2d2c3c"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:10:44 crc kubenswrapper[4792]: I0309 09:10:44.972275 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0f3faf8f-f49a-4e99-93cc-50eecd2d2c3c-config" (OuterVolumeSpecName: "config") pod "0f3faf8f-f49a-4e99-93cc-50eecd2d2c3c" (UID: "0f3faf8f-f49a-4e99-93cc-50eecd2d2c3c"). 
InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:10:44 crc kubenswrapper[4792]: I0309 09:10:44.978571 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f3faf8f-f49a-4e99-93cc-50eecd2d2c3c-kube-api-access-vrsxh" (OuterVolumeSpecName: "kube-api-access-vrsxh") pod "0f3faf8f-f49a-4e99-93cc-50eecd2d2c3c" (UID: "0f3faf8f-f49a-4e99-93cc-50eecd2d2c3c"). InnerVolumeSpecName "kube-api-access-vrsxh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:10:44 crc kubenswrapper[4792]: I0309 09:10:44.979158 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f3faf8f-f49a-4e99-93cc-50eecd2d2c3c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0f3faf8f-f49a-4e99-93cc-50eecd2d2c3c" (UID: "0f3faf8f-f49a-4e99-93cc-50eecd2d2c3c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:10:45 crc kubenswrapper[4792]: I0309 09:10:45.071995 4792 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0f3faf8f-f49a-4e99-93cc-50eecd2d2c3c-client-ca\") on node \"crc\" DevicePath \"\"" Mar 09 09:10:45 crc kubenswrapper[4792]: I0309 09:10:45.072036 4792 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0f3faf8f-f49a-4e99-93cc-50eecd2d2c3c-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 09:10:45 crc kubenswrapper[4792]: I0309 09:10:45.072052 4792 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0f3faf8f-f49a-4e99-93cc-50eecd2d2c3c-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 09 09:10:45 crc kubenswrapper[4792]: I0309 09:10:45.072085 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vrsxh\" (UniqueName: 
\"kubernetes.io/projected/0f3faf8f-f49a-4e99-93cc-50eecd2d2c3c-kube-api-access-vrsxh\") on node \"crc\" DevicePath \"\"" Mar 09 09:10:45 crc kubenswrapper[4792]: I0309 09:10:45.072098 4792 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f3faf8f-f49a-4e99-93cc-50eecd2d2c3c-config\") on node \"crc\" DevicePath \"\"" Mar 09 09:10:45 crc kubenswrapper[4792]: I0309 09:10:45.093107 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jsnbn"] Mar 09 09:10:45 crc kubenswrapper[4792]: I0309 09:10:45.104774 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7f29j" Mar 09 09:10:45 crc kubenswrapper[4792]: I0309 09:10:45.207166 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 09 09:10:45 crc kubenswrapper[4792]: I0309 09:10:45.210699 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-kkrgv"] Mar 09 09:10:45 crc kubenswrapper[4792]: W0309 09:10:45.275884 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-poddf3f43f0_0799_480e_9d12_e4132e44cb11.slice/crio-3d93d90d9fb41e3d0bfc2b90a06d1441d9e1d93b02b25f389d5569184264d4cf WatchSource:0}: Error finding container 3d93d90d9fb41e3d0bfc2b90a06d1441d9e1d93b02b25f389d5569184264d4cf: Status 404 returned error can't find the container with id 3d93d90d9fb41e3d0bfc2b90a06d1441d9e1d93b02b25f389d5569184264d4cf Mar 09 09:10:45 crc kubenswrapper[4792]: I0309 09:10:45.279717 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ea625b31-78ac-4c2b-8f73-3e5c74894fce-config\") pod \"ea625b31-78ac-4c2b-8f73-3e5c74894fce\" (UID: \"ea625b31-78ac-4c2b-8f73-3e5c74894fce\") " Mar 09 09:10:45 crc 
kubenswrapper[4792]: I0309 09:10:45.279772 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ea625b31-78ac-4c2b-8f73-3e5c74894fce-serving-cert\") pod \"ea625b31-78ac-4c2b-8f73-3e5c74894fce\" (UID: \"ea625b31-78ac-4c2b-8f73-3e5c74894fce\") " Mar 09 09:10:45 crc kubenswrapper[4792]: I0309 09:10:45.279859 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k66nz\" (UniqueName: \"kubernetes.io/projected/ea625b31-78ac-4c2b-8f73-3e5c74894fce-kube-api-access-k66nz\") pod \"ea625b31-78ac-4c2b-8f73-3e5c74894fce\" (UID: \"ea625b31-78ac-4c2b-8f73-3e5c74894fce\") " Mar 09 09:10:45 crc kubenswrapper[4792]: I0309 09:10:45.279880 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ea625b31-78ac-4c2b-8f73-3e5c74894fce-client-ca\") pod \"ea625b31-78ac-4c2b-8f73-3e5c74894fce\" (UID: \"ea625b31-78ac-4c2b-8f73-3e5c74894fce\") " Mar 09 09:10:45 crc kubenswrapper[4792]: I0309 09:10:45.283560 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ea625b31-78ac-4c2b-8f73-3e5c74894fce-client-ca" (OuterVolumeSpecName: "client-ca") pod "ea625b31-78ac-4c2b-8f73-3e5c74894fce" (UID: "ea625b31-78ac-4c2b-8f73-3e5c74894fce"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:10:45 crc kubenswrapper[4792]: I0309 09:10:45.285989 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-c4nwc"] Mar 09 09:10:45 crc kubenswrapper[4792]: I0309 09:10:45.288244 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ea625b31-78ac-4c2b-8f73-3e5c74894fce-config" (OuterVolumeSpecName: "config") pod "ea625b31-78ac-4c2b-8f73-3e5c74894fce" (UID: "ea625b31-78ac-4c2b-8f73-3e5c74894fce"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:10:45 crc kubenswrapper[4792]: I0309 09:10:45.304497 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea625b31-78ac-4c2b-8f73-3e5c74894fce-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "ea625b31-78ac-4c2b-8f73-3e5c74894fce" (UID: "ea625b31-78ac-4c2b-8f73-3e5c74894fce"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:10:45 crc kubenswrapper[4792]: I0309 09:10:45.349182 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea625b31-78ac-4c2b-8f73-3e5c74894fce-kube-api-access-k66nz" (OuterVolumeSpecName: "kube-api-access-k66nz") pod "ea625b31-78ac-4c2b-8f73-3e5c74894fce" (UID: "ea625b31-78ac-4c2b-8f73-3e5c74894fce"). InnerVolumeSpecName "kube-api-access-k66nz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:10:45 crc kubenswrapper[4792]: I0309 09:10:45.361644 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 09 09:10:45 crc kubenswrapper[4792]: W0309 09:10:45.369467 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc0aa86a9_9ed9_492f_ac14_43e14abf1f2c.slice/crio-f02596659545e20ef80e22b683a017f816b7ed2966e0fa65f832fca1d2d8b19a WatchSource:0}: Error finding container f02596659545e20ef80e22b683a017f816b7ed2966e0fa65f832fca1d2d8b19a: Status 404 returned error can't find the container with id f02596659545e20ef80e22b683a017f816b7ed2966e0fa65f832fca1d2d8b19a Mar 09 09:10:45 crc kubenswrapper[4792]: I0309 09:10:45.384720 4792 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ea625b31-78ac-4c2b-8f73-3e5c74894fce-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 09:10:45 crc kubenswrapper[4792]: I0309 09:10:45.384746 4792 
reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ea625b31-78ac-4c2b-8f73-3e5c74894fce-client-ca\") on node \"crc\" DevicePath \"\"" Mar 09 09:10:45 crc kubenswrapper[4792]: I0309 09:10:45.384756 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k66nz\" (UniqueName: \"kubernetes.io/projected/ea625b31-78ac-4c2b-8f73-3e5c74894fce-kube-api-access-k66nz\") on node \"crc\" DevicePath \"\"" Mar 09 09:10:45 crc kubenswrapper[4792]: I0309 09:10:45.384767 4792 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ea625b31-78ac-4c2b-8f73-3e5c74894fce-config\") on node \"crc\" DevicePath \"\"" Mar 09 09:10:45 crc kubenswrapper[4792]: I0309 09:10:45.490746 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7f29j" event={"ID":"ea625b31-78ac-4c2b-8f73-3e5c74894fce","Type":"ContainerDied","Data":"5a931101a6e4300f456c349bd1a62314a1bb033b841ed1b33234504975c05fb3"} Mar 09 09:10:45 crc kubenswrapper[4792]: I0309 09:10:45.491323 4792 scope.go:117] "RemoveContainer" containerID="36134bcc4edd8cefaf2ef68607c266ba37a2d30ce05b79f1ac713d688301ec95" Mar 09 09:10:45 crc kubenswrapper[4792]: I0309 09:10:45.491524 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-7f29j" Mar 09 09:10:45 crc kubenswrapper[4792]: I0309 09:10:45.496447 4792 patch_prober.go:28] interesting pod/router-default-5444994796-kcnhk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 09 09:10:45 crc kubenswrapper[4792]: [-]has-synced failed: reason withheld Mar 09 09:10:45 crc kubenswrapper[4792]: [+]process-running ok Mar 09 09:10:45 crc kubenswrapper[4792]: healthz check failed Mar 09 09:10:45 crc kubenswrapper[4792]: I0309 09:10:45.496509 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-kcnhk" podUID="83d49252-7752-48bc-86b6-c604984cd533" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 09 09:10:45 crc kubenswrapper[4792]: I0309 09:10:45.529933 4792 generic.go:334] "Generic (PLEG): container finished" podID="c0500b46-411e-4371-83ae-1b148bf65ba9" containerID="af22a1a4ae5b3367befd44d8a3b4d571775717fa86727e9398acdd0a48a10cde" exitCode=0 Mar 09 09:10:45 crc kubenswrapper[4792]: I0309 09:10:45.530008 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wtkcv" event={"ID":"c0500b46-411e-4371-83ae-1b148bf65ba9","Type":"ContainerDied","Data":"af22a1a4ae5b3367befd44d8a3b4d571775717fa86727e9398acdd0a48a10cde"} Mar 09 09:10:45 crc kubenswrapper[4792]: I0309 09:10:45.543007 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jsnbn" event={"ID":"0667075c-38b7-4fb6-ad69-a31987eae3cc","Type":"ContainerStarted","Data":"db3e667d578beaad49f5dd59c243f8ec8d7845f66db32a821bed09a6c5fc133c"} Mar 09 09:10:45 crc kubenswrapper[4792]: I0309 09:10:45.546292 4792 generic.go:334] "Generic (PLEG): container finished" 
podID="a19c063e-191a-487c-a491-0af8c6fc1e3f" containerID="0b4bbfbcd3b525fe37edb82d16955bdb0246e5c49e433c4db4ea0b7dd859cdfa" exitCode=0 Mar 09 09:10:45 crc kubenswrapper[4792]: I0309 09:10:45.546375 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mj542" event={"ID":"a19c063e-191a-487c-a491-0af8c6fc1e3f","Type":"ContainerDied","Data":"0b4bbfbcd3b525fe37edb82d16955bdb0246e5c49e433c4db4ea0b7dd859cdfa"} Mar 09 09:10:45 crc kubenswrapper[4792]: I0309 09:10:45.548673 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c4nwc" event={"ID":"c0aa86a9-9ed9-492f-ac14-43e14abf1f2c","Type":"ContainerStarted","Data":"f02596659545e20ef80e22b683a017f816b7ed2966e0fa65f832fca1d2d8b19a"} Mar 09 09:10:45 crc kubenswrapper[4792]: I0309 09:10:45.556608 4792 generic.go:334] "Generic (PLEG): container finished" podID="8606aa7f-7b07-40df-b9b8-f415a5e68b47" containerID="a1391a99fc45bb492441a99ff7814d38f3c9f1abb528e937ebeffaa002859319" exitCode=0 Mar 09 09:10:45 crc kubenswrapper[4792]: I0309 09:10:45.557873 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8wm7x" event={"ID":"8606aa7f-7b07-40df-b9b8-f415a5e68b47","Type":"ContainerDied","Data":"a1391a99fc45bb492441a99ff7814d38f3c9f1abb528e937ebeffaa002859319"} Mar 09 09:10:45 crc kubenswrapper[4792]: I0309 09:10:45.557912 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8wm7x" event={"ID":"8606aa7f-7b07-40df-b9b8-f415a5e68b47","Type":"ContainerStarted","Data":"1e448a8e07592f1f8b2e44bb7c61b0e7f69e72bcc9f3209b1c84e0e8f9f03744"} Mar 09 09:10:45 crc kubenswrapper[4792]: I0309 09:10:45.569246 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-ncpc5" Mar 09 09:10:45 crc kubenswrapper[4792]: I0309 09:10:45.569401 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-ncpc5" event={"ID":"0f3faf8f-f49a-4e99-93cc-50eecd2d2c3c","Type":"ContainerDied","Data":"1e726d8f8ccf856eab5b946e783e8cb96a090b12291d43ce0ec6719be3bfc59f"} Mar 09 09:10:45 crc kubenswrapper[4792]: I0309 09:10:45.570911 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"df3f43f0-0799-480e-9d12-e4132e44cb11","Type":"ContainerStarted","Data":"3d93d90d9fb41e3d0bfc2b90a06d1441d9e1d93b02b25f389d5569184264d4cf"} Mar 09 09:10:45 crc kubenswrapper[4792]: I0309 09:10:45.587140 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-7f29j"] Mar 09 09:10:45 crc kubenswrapper[4792]: I0309 09:10:45.588909 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-kkrgv" event={"ID":"e5b8826d-81fe-4d43-9177-33e8e34ca003","Type":"ContainerStarted","Data":"c05bac22bf2dc182e07d6c819987702fb7310ad04e1b8ed60575c583241eb583"} Mar 09 09:10:45 crc kubenswrapper[4792]: I0309 09:10:45.591264 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-7f29j"] Mar 09 09:10:45 crc kubenswrapper[4792]: I0309 09:10:45.598559 4792 scope.go:117] "RemoveContainer" containerID="a1b2308ae9ee926601eb462bfc9a895cf553904e21f338d888a164d495f8977d" Mar 09 09:10:45 crc kubenswrapper[4792]: I0309 09:10:45.617291 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-956cf76db-zx46z"] Mar 09 09:10:45 crc kubenswrapper[4792]: E0309 09:10:45.618432 4792 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="0f3faf8f-f49a-4e99-93cc-50eecd2d2c3c" containerName="controller-manager" Mar 09 09:10:45 crc kubenswrapper[4792]: I0309 09:10:45.618620 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f3faf8f-f49a-4e99-93cc-50eecd2d2c3c" containerName="controller-manager" Mar 09 09:10:45 crc kubenswrapper[4792]: E0309 09:10:45.618806 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea625b31-78ac-4c2b-8f73-3e5c74894fce" containerName="route-controller-manager" Mar 09 09:10:45 crc kubenswrapper[4792]: I0309 09:10:45.619325 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea625b31-78ac-4c2b-8f73-3e5c74894fce" containerName="route-controller-manager" Mar 09 09:10:45 crc kubenswrapper[4792]: I0309 09:10:45.619557 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f3faf8f-f49a-4e99-93cc-50eecd2d2c3c" containerName="controller-manager" Mar 09 09:10:45 crc kubenswrapper[4792]: I0309 09:10:45.619722 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea625b31-78ac-4c2b-8f73-3e5c74894fce" containerName="route-controller-manager" Mar 09 09:10:45 crc kubenswrapper[4792]: I0309 09:10:45.620406 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-6d8f6f6dd5-trsw8"] Mar 09 09:10:45 crc kubenswrapper[4792]: I0309 09:10:45.621174 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-956cf76db-zx46z" Mar 09 09:10:45 crc kubenswrapper[4792]: I0309 09:10:45.621426 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6d8f6f6dd5-trsw8" Mar 09 09:10:45 crc kubenswrapper[4792]: I0309 09:10:45.623749 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 09 09:10:45 crc kubenswrapper[4792]: I0309 09:10:45.623947 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 09 09:10:45 crc kubenswrapper[4792]: I0309 09:10:45.630899 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-956cf76db-zx46z"] Mar 09 09:10:45 crc kubenswrapper[4792]: I0309 09:10:45.645659 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6d8f6f6dd5-trsw8"] Mar 09 09:10:45 crc kubenswrapper[4792]: I0309 09:10:45.629588 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 09 09:10:45 crc kubenswrapper[4792]: I0309 09:10:45.629629 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 09 09:10:45 crc kubenswrapper[4792]: I0309 09:10:45.629797 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 09 09:10:45 crc kubenswrapper[4792]: I0309 09:10:45.629894 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 09 09:10:45 crc kubenswrapper[4792]: I0309 09:10:45.633707 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 09 09:10:45 crc kubenswrapper[4792]: I0309 09:10:45.638805 4792 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 09 09:10:45 crc kubenswrapper[4792]: I0309 09:10:45.638942 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 09 09:10:45 crc kubenswrapper[4792]: I0309 09:10:45.647826 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 09 09:10:45 crc kubenswrapper[4792]: I0309 09:10:45.648086 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 09 09:10:45 crc kubenswrapper[4792]: I0309 09:10:45.648209 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 09 09:10:45 crc kubenswrapper[4792]: I0309 09:10:45.662818 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 09 09:10:45 crc kubenswrapper[4792]: I0309 09:10:45.756409 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ea625b31-78ac-4c2b-8f73-3e5c74894fce" path="/var/lib/kubelet/pods/ea625b31-78ac-4c2b-8f73-3e5c74894fce/volumes" Mar 09 09:10:45 crc kubenswrapper[4792]: I0309 09:10:45.791733 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dd4efc19-20ad-44d0-a9e6-8115fe5cefd1-serving-cert\") pod \"controller-manager-6d8f6f6dd5-trsw8\" (UID: \"dd4efc19-20ad-44d0-a9e6-8115fe5cefd1\") " pod="openshift-controller-manager/controller-manager-6d8f6f6dd5-trsw8" Mar 09 09:10:45 crc kubenswrapper[4792]: I0309 09:10:45.791789 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/26e96c8d-a866-49ce-82d7-106acf1c5f60-config\") pod \"route-controller-manager-956cf76db-zx46z\" (UID: 
\"26e96c8d-a866-49ce-82d7-106acf1c5f60\") " pod="openshift-route-controller-manager/route-controller-manager-956cf76db-zx46z" Mar 09 09:10:45 crc kubenswrapper[4792]: I0309 09:10:45.793607 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bhbjm\" (UniqueName: \"kubernetes.io/projected/dd4efc19-20ad-44d0-a9e6-8115fe5cefd1-kube-api-access-bhbjm\") pod \"controller-manager-6d8f6f6dd5-trsw8\" (UID: \"dd4efc19-20ad-44d0-a9e6-8115fe5cefd1\") " pod="openshift-controller-manager/controller-manager-6d8f6f6dd5-trsw8" Mar 09 09:10:45 crc kubenswrapper[4792]: I0309 09:10:45.793750 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/26e96c8d-a866-49ce-82d7-106acf1c5f60-serving-cert\") pod \"route-controller-manager-956cf76db-zx46z\" (UID: \"26e96c8d-a866-49ce-82d7-106acf1c5f60\") " pod="openshift-route-controller-manager/route-controller-manager-956cf76db-zx46z" Mar 09 09:10:45 crc kubenswrapper[4792]: I0309 09:10:45.793772 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fl8mw\" (UniqueName: \"kubernetes.io/projected/26e96c8d-a866-49ce-82d7-106acf1c5f60-kube-api-access-fl8mw\") pod \"route-controller-manager-956cf76db-zx46z\" (UID: \"26e96c8d-a866-49ce-82d7-106acf1c5f60\") " pod="openshift-route-controller-manager/route-controller-manager-956cf76db-zx46z" Mar 09 09:10:45 crc kubenswrapper[4792]: I0309 09:10:45.793801 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd4efc19-20ad-44d0-a9e6-8115fe5cefd1-config\") pod \"controller-manager-6d8f6f6dd5-trsw8\" (UID: \"dd4efc19-20ad-44d0-a9e6-8115fe5cefd1\") " pod="openshift-controller-manager/controller-manager-6d8f6f6dd5-trsw8" Mar 09 09:10:45 crc kubenswrapper[4792]: I0309 09:10:45.793828 4792 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/dd4efc19-20ad-44d0-a9e6-8115fe5cefd1-client-ca\") pod \"controller-manager-6d8f6f6dd5-trsw8\" (UID: \"dd4efc19-20ad-44d0-a9e6-8115fe5cefd1\") " pod="openshift-controller-manager/controller-manager-6d8f6f6dd5-trsw8" Mar 09 09:10:45 crc kubenswrapper[4792]: I0309 09:10:45.793865 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/dd4efc19-20ad-44d0-a9e6-8115fe5cefd1-proxy-ca-bundles\") pod \"controller-manager-6d8f6f6dd5-trsw8\" (UID: \"dd4efc19-20ad-44d0-a9e6-8115fe5cefd1\") " pod="openshift-controller-manager/controller-manager-6d8f6f6dd5-trsw8" Mar 09 09:10:45 crc kubenswrapper[4792]: I0309 09:10:45.793895 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/26e96c8d-a866-49ce-82d7-106acf1c5f60-client-ca\") pod \"route-controller-manager-956cf76db-zx46z\" (UID: \"26e96c8d-a866-49ce-82d7-106acf1c5f60\") " pod="openshift-route-controller-manager/route-controller-manager-956cf76db-zx46z" Mar 09 09:10:45 crc kubenswrapper[4792]: I0309 09:10:45.895431 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/26e96c8d-a866-49ce-82d7-106acf1c5f60-client-ca\") pod \"route-controller-manager-956cf76db-zx46z\" (UID: \"26e96c8d-a866-49ce-82d7-106acf1c5f60\") " pod="openshift-route-controller-manager/route-controller-manager-956cf76db-zx46z" Mar 09 09:10:45 crc kubenswrapper[4792]: I0309 09:10:45.896546 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/26e96c8d-a866-49ce-82d7-106acf1c5f60-client-ca\") pod \"route-controller-manager-956cf76db-zx46z\" (UID: 
\"26e96c8d-a866-49ce-82d7-106acf1c5f60\") " pod="openshift-route-controller-manager/route-controller-manager-956cf76db-zx46z" Mar 09 09:10:45 crc kubenswrapper[4792]: I0309 09:10:45.897815 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dd4efc19-20ad-44d0-a9e6-8115fe5cefd1-serving-cert\") pod \"controller-manager-6d8f6f6dd5-trsw8\" (UID: \"dd4efc19-20ad-44d0-a9e6-8115fe5cefd1\") " pod="openshift-controller-manager/controller-manager-6d8f6f6dd5-trsw8" Mar 09 09:10:45 crc kubenswrapper[4792]: I0309 09:10:45.898085 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/26e96c8d-a866-49ce-82d7-106acf1c5f60-config\") pod \"route-controller-manager-956cf76db-zx46z\" (UID: \"26e96c8d-a866-49ce-82d7-106acf1c5f60\") " pod="openshift-route-controller-manager/route-controller-manager-956cf76db-zx46z" Mar 09 09:10:45 crc kubenswrapper[4792]: I0309 09:10:45.898413 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bhbjm\" (UniqueName: \"kubernetes.io/projected/dd4efc19-20ad-44d0-a9e6-8115fe5cefd1-kube-api-access-bhbjm\") pod \"controller-manager-6d8f6f6dd5-trsw8\" (UID: \"dd4efc19-20ad-44d0-a9e6-8115fe5cefd1\") " pod="openshift-controller-manager/controller-manager-6d8f6f6dd5-trsw8" Mar 09 09:10:45 crc kubenswrapper[4792]: I0309 09:10:45.898573 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fl8mw\" (UniqueName: \"kubernetes.io/projected/26e96c8d-a866-49ce-82d7-106acf1c5f60-kube-api-access-fl8mw\") pod \"route-controller-manager-956cf76db-zx46z\" (UID: \"26e96c8d-a866-49ce-82d7-106acf1c5f60\") " pod="openshift-route-controller-manager/route-controller-manager-956cf76db-zx46z" Mar 09 09:10:45 crc kubenswrapper[4792]: I0309 09:10:45.918939 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/26e96c8d-a866-49ce-82d7-106acf1c5f60-serving-cert\") pod \"route-controller-manager-956cf76db-zx46z\" (UID: \"26e96c8d-a866-49ce-82d7-106acf1c5f60\") " pod="openshift-route-controller-manager/route-controller-manager-956cf76db-zx46z" Mar 09 09:10:45 crc kubenswrapper[4792]: I0309 09:10:45.919007 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd4efc19-20ad-44d0-a9e6-8115fe5cefd1-config\") pod \"controller-manager-6d8f6f6dd5-trsw8\" (UID: \"dd4efc19-20ad-44d0-a9e6-8115fe5cefd1\") " pod="openshift-controller-manager/controller-manager-6d8f6f6dd5-trsw8" Mar 09 09:10:45 crc kubenswrapper[4792]: I0309 09:10:45.919042 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/dd4efc19-20ad-44d0-a9e6-8115fe5cefd1-client-ca\") pod \"controller-manager-6d8f6f6dd5-trsw8\" (UID: \"dd4efc19-20ad-44d0-a9e6-8115fe5cefd1\") " pod="openshift-controller-manager/controller-manager-6d8f6f6dd5-trsw8" Mar 09 09:10:45 crc kubenswrapper[4792]: I0309 09:10:45.919103 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/dd4efc19-20ad-44d0-a9e6-8115fe5cefd1-proxy-ca-bundles\") pod \"controller-manager-6d8f6f6dd5-trsw8\" (UID: \"dd4efc19-20ad-44d0-a9e6-8115fe5cefd1\") " pod="openshift-controller-manager/controller-manager-6d8f6f6dd5-trsw8" Mar 09 09:10:45 crc kubenswrapper[4792]: I0309 09:10:45.925967 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/dd4efc19-20ad-44d0-a9e6-8115fe5cefd1-proxy-ca-bundles\") pod \"controller-manager-6d8f6f6dd5-trsw8\" (UID: \"dd4efc19-20ad-44d0-a9e6-8115fe5cefd1\") " pod="openshift-controller-manager/controller-manager-6d8f6f6dd5-trsw8" Mar 09 09:10:45 crc kubenswrapper[4792]: I0309 
09:10:45.904981 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/26e96c8d-a866-49ce-82d7-106acf1c5f60-config\") pod \"route-controller-manager-956cf76db-zx46z\" (UID: \"26e96c8d-a866-49ce-82d7-106acf1c5f60\") " pod="openshift-route-controller-manager/route-controller-manager-956cf76db-zx46z" Mar 09 09:10:45 crc kubenswrapper[4792]: I0309 09:10:45.930530 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dd4efc19-20ad-44d0-a9e6-8115fe5cefd1-serving-cert\") pod \"controller-manager-6d8f6f6dd5-trsw8\" (UID: \"dd4efc19-20ad-44d0-a9e6-8115fe5cefd1\") " pod="openshift-controller-manager/controller-manager-6d8f6f6dd5-trsw8" Mar 09 09:10:45 crc kubenswrapper[4792]: I0309 09:10:45.930594 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/26e96c8d-a866-49ce-82d7-106acf1c5f60-serving-cert\") pod \"route-controller-manager-956cf76db-zx46z\" (UID: \"26e96c8d-a866-49ce-82d7-106acf1c5f60\") " pod="openshift-route-controller-manager/route-controller-manager-956cf76db-zx46z" Mar 09 09:10:45 crc kubenswrapper[4792]: I0309 09:10:45.930603 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-ncpc5"] Mar 09 09:10:45 crc kubenswrapper[4792]: I0309 09:10:45.931276 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fl8mw\" (UniqueName: \"kubernetes.io/projected/26e96c8d-a866-49ce-82d7-106acf1c5f60-kube-api-access-fl8mw\") pod \"route-controller-manager-956cf76db-zx46z\" (UID: \"26e96c8d-a866-49ce-82d7-106acf1c5f60\") " pod="openshift-route-controller-manager/route-controller-manager-956cf76db-zx46z" Mar 09 09:10:45 crc kubenswrapper[4792]: I0309 09:10:45.942331 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/dd4efc19-20ad-44d0-a9e6-8115fe5cefd1-client-ca\") pod \"controller-manager-6d8f6f6dd5-trsw8\" (UID: \"dd4efc19-20ad-44d0-a9e6-8115fe5cefd1\") " pod="openshift-controller-manager/controller-manager-6d8f6f6dd5-trsw8" Mar 09 09:10:45 crc kubenswrapper[4792]: I0309 09:10:45.942604 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd4efc19-20ad-44d0-a9e6-8115fe5cefd1-config\") pod \"controller-manager-6d8f6f6dd5-trsw8\" (UID: \"dd4efc19-20ad-44d0-a9e6-8115fe5cefd1\") " pod="openshift-controller-manager/controller-manager-6d8f6f6dd5-trsw8" Mar 09 09:10:45 crc kubenswrapper[4792]: I0309 09:10:45.951306 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-ncpc5"] Mar 09 09:10:45 crc kubenswrapper[4792]: I0309 09:10:45.953297 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bhbjm\" (UniqueName: \"kubernetes.io/projected/dd4efc19-20ad-44d0-a9e6-8115fe5cefd1-kube-api-access-bhbjm\") pod \"controller-manager-6d8f6f6dd5-trsw8\" (UID: \"dd4efc19-20ad-44d0-a9e6-8115fe5cefd1\") " pod="openshift-controller-manager/controller-manager-6d8f6f6dd5-trsw8" Mar 09 09:10:46 crc kubenswrapper[4792]: I0309 09:10:46.086484 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-l8jxj" Mar 09 09:10:46 crc kubenswrapper[4792]: I0309 09:10:46.116713 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-l8jxj" Mar 09 09:10:46 crc kubenswrapper[4792]: I0309 09:10:46.125325 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6d8f6f6dd5-trsw8" Mar 09 09:10:46 crc kubenswrapper[4792]: I0309 09:10:46.193670 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-956cf76db-zx46z" Mar 09 09:10:46 crc kubenswrapper[4792]: I0309 09:10:46.498887 4792 patch_prober.go:28] interesting pod/router-default-5444994796-kcnhk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 09 09:10:46 crc kubenswrapper[4792]: [-]has-synced failed: reason withheld Mar 09 09:10:46 crc kubenswrapper[4792]: [+]process-running ok Mar 09 09:10:46 crc kubenswrapper[4792]: healthz check failed Mar 09 09:10:46 crc kubenswrapper[4792]: I0309 09:10:46.499398 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-kcnhk" podUID="83d49252-7752-48bc-86b6-c604984cd533" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 09 09:10:46 crc kubenswrapper[4792]: I0309 09:10:46.644958 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-kkrgv" event={"ID":"e5b8826d-81fe-4d43-9177-33e8e34ca003","Type":"ContainerStarted","Data":"0cf5c49c1646fbd8845a08b2fbd7f0c2d61d3d205b399b69e8d659bbb23ed144"} Mar 09 09:10:46 crc kubenswrapper[4792]: I0309 09:10:46.645904 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-kkrgv" Mar 09 09:10:46 crc kubenswrapper[4792]: I0309 09:10:46.677296 4792 generic.go:334] "Generic (PLEG): container finished" podID="0667075c-38b7-4fb6-ad69-a31987eae3cc" containerID="347d9ce1a7155a77f88a083b877b6c70fe58fb8197d05530eaf63170c0c44239" exitCode=0 Mar 09 09:10:46 crc kubenswrapper[4792]: I0309 09:10:46.677436 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jsnbn" 
event={"ID":"0667075c-38b7-4fb6-ad69-a31987eae3cc","Type":"ContainerDied","Data":"347d9ce1a7155a77f88a083b877b6c70fe58fb8197d05530eaf63170c0c44239"} Mar 09 09:10:46 crc kubenswrapper[4792]: I0309 09:10:46.707900 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"53f92eb7-0267-41c3-aead-e2600d79a1b3","Type":"ContainerStarted","Data":"21e3b2ab7ec2456f29788a81b4a8eb648a426314f9da80ae3f079232d06c4e86"} Mar 09 09:10:46 crc kubenswrapper[4792]: I0309 09:10:46.719651 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-kkrgv" podStartSLOduration=168.719625649 podStartE2EDuration="2m48.719625649s" podCreationTimestamp="2026-03-09 09:07:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:10:46.712013932 +0000 UTC m=+211.742214694" watchObservedRunningTime="2026-03-09 09:10:46.719625649 +0000 UTC m=+211.749826401" Mar 09 09:10:46 crc kubenswrapper[4792]: I0309 09:10:46.731855 4792 generic.go:334] "Generic (PLEG): container finished" podID="c0aa86a9-9ed9-492f-ac14-43e14abf1f2c" containerID="0052bf51352202144c3072e221ebc38403c2456047744c527662d8c2efa0ebb6" exitCode=0 Mar 09 09:10:46 crc kubenswrapper[4792]: I0309 09:10:46.731926 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c4nwc" event={"ID":"c0aa86a9-9ed9-492f-ac14-43e14abf1f2c","Type":"ContainerDied","Data":"0052bf51352202144c3072e221ebc38403c2456047744c527662d8c2efa0ebb6"} Mar 09 09:10:46 crc kubenswrapper[4792]: I0309 09:10:46.870137 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"df3f43f0-0799-480e-9d12-e4132e44cb11","Type":"ContainerStarted","Data":"4ab2a4b4555080d58cf89d60c0c8f331e962316e4b27d9437a1135036b1c816b"} Mar 09 09:10:46 crc 
kubenswrapper[4792]: I0309 09:10:46.947368 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=3.947350857 podStartE2EDuration="3.947350857s" podCreationTimestamp="2026-03-09 09:10:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:10:46.945169406 +0000 UTC m=+211.975370158" watchObservedRunningTime="2026-03-09 09:10:46.947350857 +0000 UTC m=+211.977551609" Mar 09 09:10:47 crc kubenswrapper[4792]: E0309 09:10:47.153383 4792 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-poddf3f43f0_0799_480e_9d12_e4132e44cb11.slice/crio-conmon-4ab2a4b4555080d58cf89d60c0c8f331e962316e4b27d9437a1135036b1c816b.scope\": RecentStats: unable to find data in memory cache]" Mar 09 09:10:47 crc kubenswrapper[4792]: I0309 09:10:47.193208 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-956cf76db-zx46z"] Mar 09 09:10:47 crc kubenswrapper[4792]: I0309 09:10:47.496155 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6d8f6f6dd5-trsw8"] Mar 09 09:10:47 crc kubenswrapper[4792]: I0309 09:10:47.505796 4792 patch_prober.go:28] interesting pod/router-default-5444994796-kcnhk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 09 09:10:47 crc kubenswrapper[4792]: [-]has-synced failed: reason withheld Mar 09 09:10:47 crc kubenswrapper[4792]: [+]process-running ok Mar 09 09:10:47 crc kubenswrapper[4792]: healthz check failed Mar 09 09:10:47 crc kubenswrapper[4792]: I0309 09:10:47.505858 4792 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-ingress/router-default-5444994796-kcnhk" podUID="83d49252-7752-48bc-86b6-c604984cd533" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 09 09:10:47 crc kubenswrapper[4792]: I0309 09:10:47.602767 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-4pzr2" Mar 09 09:10:47 crc kubenswrapper[4792]: I0309 09:10:47.678887 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0f3faf8f-f49a-4e99-93cc-50eecd2d2c3c" path="/var/lib/kubelet/pods/0f3faf8f-f49a-4e99-93cc-50eecd2d2c3c/volumes" Mar 09 09:10:47 crc kubenswrapper[4792]: I0309 09:10:47.956098 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"53f92eb7-0267-41c3-aead-e2600d79a1b3","Type":"ContainerStarted","Data":"ddeb7f6eb36dc94a22ebf4eb0862dce4df547894d315d9b9c6ea2cc271e51345"} Mar 09 09:10:47 crc kubenswrapper[4792]: I0309 09:10:47.960134 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6d8f6f6dd5-trsw8" event={"ID":"dd4efc19-20ad-44d0-a9e6-8115fe5cefd1","Type":"ContainerStarted","Data":"9564ad7736263c0d21d7133ccae444320eeb23be1b7f6c5e3da8f2a688e004f1"} Mar 09 09:10:47 crc kubenswrapper[4792]: I0309 09:10:47.981334 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-956cf76db-zx46z" event={"ID":"26e96c8d-a866-49ce-82d7-106acf1c5f60","Type":"ContainerStarted","Data":"5acbd277116699298135b81f243d51c08ca6463049a48a041b09cf51070b10a4"} Mar 09 09:10:47 crc kubenswrapper[4792]: I0309 09:10:47.981401 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-956cf76db-zx46z" event={"ID":"26e96c8d-a866-49ce-82d7-106acf1c5f60","Type":"ContainerStarted","Data":"672258330de52a856e7b66c7558c96ec954bbfc81e52ea917085c5de4391c978"} Mar 09 
09:10:47 crc kubenswrapper[4792]: I0309 09:10:47.982498 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-956cf76db-zx46z"
Mar 09 09:10:47 crc kubenswrapper[4792]: I0309 09:10:47.997222 4792 generic.go:334] "Generic (PLEG): container finished" podID="df3f43f0-0799-480e-9d12-e4132e44cb11" containerID="4ab2a4b4555080d58cf89d60c0c8f331e962316e4b27d9437a1135036b1c816b" exitCode=0
Mar 09 09:10:47 crc kubenswrapper[4792]: I0309 09:10:47.997996 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"df3f43f0-0799-480e-9d12-e4132e44cb11","Type":"ContainerDied","Data":"4ab2a4b4555080d58cf89d60c0c8f331e962316e4b27d9437a1135036b1c816b"}
Mar 09 09:10:48 crc kubenswrapper[4792]: I0309 09:10:48.036665 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-956cf76db-zx46z" podStartSLOduration=4.035776594 podStartE2EDuration="4.035776594s" podCreationTimestamp="2026-03-09 09:10:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:10:48.025683666 +0000 UTC m=+213.055884438" watchObservedRunningTime="2026-03-09 09:10:48.035776594 +0000 UTC m=+213.065977336"
Mar 09 09:10:48 crc kubenswrapper[4792]: I0309 09:10:48.367149 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-956cf76db-zx46z"
Mar 09 09:10:48 crc kubenswrapper[4792]: I0309 09:10:48.505118 4792 patch_prober.go:28] interesting pod/router-default-5444994796-kcnhk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 09 09:10:48 crc kubenswrapper[4792]: [-]has-synced failed: reason withheld
Mar 09 09:10:48 crc kubenswrapper[4792]: [+]process-running ok
Mar 09 09:10:48 crc kubenswrapper[4792]: healthz check failed
Mar 09 09:10:48 crc kubenswrapper[4792]: I0309 09:10:48.505222 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-kcnhk" podUID="83d49252-7752-48bc-86b6-c604984cd533" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 09 09:10:49 crc kubenswrapper[4792]: I0309 09:10:49.091841 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6d8f6f6dd5-trsw8" event={"ID":"dd4efc19-20ad-44d0-a9e6-8115fe5cefd1","Type":"ContainerStarted","Data":"284ec79b8a22554ce0393d69ebe2d482624a3b24cbb15aaccc6c1305ff5a5a84"}
Mar 09 09:10:49 crc kubenswrapper[4792]: I0309 09:10:49.092272 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-6d8f6f6dd5-trsw8"
Mar 09 09:10:49 crc kubenswrapper[4792]: I0309 09:10:49.113348 4792 ???:1] "http: TLS handshake error from 192.168.126.11:55614: no serving certificate available for the kubelet"
Mar 09 09:10:49 crc kubenswrapper[4792]: I0309 09:10:49.125978 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-6d8f6f6dd5-trsw8" podStartSLOduration=6.12596292 podStartE2EDuration="6.12596292s" podCreationTimestamp="2026-03-09 09:10:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:10:49.121504573 +0000 UTC m=+214.151705335" watchObservedRunningTime="2026-03-09 09:10:49.12596292 +0000 UTC m=+214.156163672"
Mar 09 09:10:49 crc kubenswrapper[4792]: I0309 09:10:49.134500 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-6d8f6f6dd5-trsw8"
Mar 09 09:10:49 crc kubenswrapper[4792]: I0309 09:10:49.154681 4792 generic.go:334] "Generic (PLEG): container finished" podID="aadae5cd-e840-4618-b021-d8ca0e9169bd" containerID="d104d2dc86f9cb4cd3a61c24abc309fe0728bfe4b26d245e57c4e0a1793f065c" exitCode=0
Mar 09 09:10:49 crc kubenswrapper[4792]: I0309 09:10:49.154787 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29550780-48757" event={"ID":"aadae5cd-e840-4618-b021-d8ca0e9169bd","Type":"ContainerDied","Data":"d104d2dc86f9cb4cd3a61c24abc309fe0728bfe4b26d245e57c4e0a1793f065c"}
Mar 09 09:10:49 crc kubenswrapper[4792]: I0309 09:10:49.180268 4792 generic.go:334] "Generic (PLEG): container finished" podID="53f92eb7-0267-41c3-aead-e2600d79a1b3" containerID="ddeb7f6eb36dc94a22ebf4eb0862dce4df547894d315d9b9c6ea2cc271e51345" exitCode=0
Mar 09 09:10:49 crc kubenswrapper[4792]: I0309 09:10:49.180587 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"53f92eb7-0267-41c3-aead-e2600d79a1b3","Type":"ContainerDied","Data":"ddeb7f6eb36dc94a22ebf4eb0862dce4df547894d315d9b9c6ea2cc271e51345"}
Mar 09 09:10:49 crc kubenswrapper[4792]: I0309 09:10:49.495032 4792 patch_prober.go:28] interesting pod/router-default-5444994796-kcnhk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 09 09:10:49 crc kubenswrapper[4792]: [-]has-synced failed: reason withheld
Mar 09 09:10:49 crc kubenswrapper[4792]: [+]process-running ok
Mar 09 09:10:49 crc kubenswrapper[4792]: healthz check failed
Mar 09 09:10:49 crc kubenswrapper[4792]: I0309 09:10:49.495120 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-kcnhk" podUID="83d49252-7752-48bc-86b6-c604984cd533" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 09 09:10:50 crc kubenswrapper[4792]: I0309 09:10:50.366105 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Mar 09 09:10:50 crc kubenswrapper[4792]: I0309 09:10:50.460882 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc"
Mar 09 09:10:50 crc kubenswrapper[4792]: I0309 09:10:50.499207 4792 patch_prober.go:28] interesting pod/router-default-5444994796-kcnhk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 09 09:10:50 crc kubenswrapper[4792]: [-]has-synced failed: reason withheld
Mar 09 09:10:50 crc kubenswrapper[4792]: [+]process-running ok
Mar 09 09:10:50 crc kubenswrapper[4792]: healthz check failed
Mar 09 09:10:50 crc kubenswrapper[4792]: I0309 09:10:50.499281 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-kcnhk" podUID="83d49252-7752-48bc-86b6-c604984cd533" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 09 09:10:50 crc kubenswrapper[4792]: I0309 09:10:50.518924 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/df3f43f0-0799-480e-9d12-e4132e44cb11-kubelet-dir\") pod \"df3f43f0-0799-480e-9d12-e4132e44cb11\" (UID: \"df3f43f0-0799-480e-9d12-e4132e44cb11\") "
Mar 09 09:10:50 crc kubenswrapper[4792]: I0309 09:10:50.519040 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/df3f43f0-0799-480e-9d12-e4132e44cb11-kube-api-access\") pod \"df3f43f0-0799-480e-9d12-e4132e44cb11\" (UID: \"df3f43f0-0799-480e-9d12-e4132e44cb11\") "
Mar 09 09:10:50 crc kubenswrapper[4792]: I0309 09:10:50.520358 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/df3f43f0-0799-480e-9d12-e4132e44cb11-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "df3f43f0-0799-480e-9d12-e4132e44cb11" (UID: "df3f43f0-0799-480e-9d12-e4132e44cb11"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 09 09:10:50 crc kubenswrapper[4792]: I0309 09:10:50.544187 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df3f43f0-0799-480e-9d12-e4132e44cb11-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "df3f43f0-0799-480e-9d12-e4132e44cb11" (UID: "df3f43f0-0799-480e-9d12-e4132e44cb11"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:10:50 crc kubenswrapper[4792]: I0309 09:10:50.626510 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/53f92eb7-0267-41c3-aead-e2600d79a1b3-kubelet-dir\") pod \"53f92eb7-0267-41c3-aead-e2600d79a1b3\" (UID: \"53f92eb7-0267-41c3-aead-e2600d79a1b3\") "
Mar 09 09:10:50 crc kubenswrapper[4792]: I0309 09:10:50.626609 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/53f92eb7-0267-41c3-aead-e2600d79a1b3-kube-api-access\") pod \"53f92eb7-0267-41c3-aead-e2600d79a1b3\" (UID: \"53f92eb7-0267-41c3-aead-e2600d79a1b3\") "
Mar 09 09:10:50 crc kubenswrapper[4792]: I0309 09:10:50.626834 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/53f92eb7-0267-41c3-aead-e2600d79a1b3-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "53f92eb7-0267-41c3-aead-e2600d79a1b3" (UID: "53f92eb7-0267-41c3-aead-e2600d79a1b3"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 09 09:10:50 crc kubenswrapper[4792]: I0309 09:10:50.626887 4792 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/df3f43f0-0799-480e-9d12-e4132e44cb11-kubelet-dir\") on node \"crc\" DevicePath \"\""
Mar 09 09:10:50 crc kubenswrapper[4792]: I0309 09:10:50.626906 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/df3f43f0-0799-480e-9d12-e4132e44cb11-kube-api-access\") on node \"crc\" DevicePath \"\""
Mar 09 09:10:50 crc kubenswrapper[4792]: I0309 09:10:50.636886 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53f92eb7-0267-41c3-aead-e2600d79a1b3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "53f92eb7-0267-41c3-aead-e2600d79a1b3" (UID: "53f92eb7-0267-41c3-aead-e2600d79a1b3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:10:50 crc kubenswrapper[4792]: I0309 09:10:50.673673 4792 ???:1] "http: TLS handshake error from 192.168.126.11:55620: no serving certificate available for the kubelet"
Mar 09 09:10:50 crc kubenswrapper[4792]: I0309 09:10:50.728652 4792 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/53f92eb7-0267-41c3-aead-e2600d79a1b3-kubelet-dir\") on node \"crc\" DevicePath \"\""
Mar 09 09:10:50 crc kubenswrapper[4792]: I0309 09:10:50.728684 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/53f92eb7-0267-41c3-aead-e2600d79a1b3-kube-api-access\") on node \"crc\" DevicePath \"\""
Mar 09 09:10:50 crc kubenswrapper[4792]: I0309 09:10:50.829944 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 09 09:10:50 crc kubenswrapper[4792]: I0309 09:10:50.830033 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 09 09:10:50 crc kubenswrapper[4792]: I0309 09:10:50.830096 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 09 09:10:50 crc kubenswrapper[4792]: I0309 09:10:50.831974 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 09 09:10:50 crc kubenswrapper[4792]: I0309 09:10:50.836361 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 09 09:10:50 crc kubenswrapper[4792]: I0309 09:10:50.836886 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 09 09:10:50 crc kubenswrapper[4792]: I0309 09:10:50.893935 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 09 09:10:50 crc kubenswrapper[4792]: I0309 09:10:50.901581 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 09 09:10:50 crc kubenswrapper[4792]: I0309 09:10:50.931281 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 09 09:10:50 crc kubenswrapper[4792]: I0309 09:10:50.931332 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4711cce5-88a9-48c4-8e2e-522062e34a03-metrics-certs\") pod \"network-metrics-daemon-fttpc\" (UID: \"4711cce5-88a9-48c4-8e2e-522062e34a03\") " pod="openshift-multus/network-metrics-daemon-fttpc"
Mar 09 09:10:50 crc kubenswrapper[4792]: I0309 09:10:50.936883 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 09 09:10:50 crc kubenswrapper[4792]: I0309 09:10:50.938231 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4711cce5-88a9-48c4-8e2e-522062e34a03-metrics-certs\") pod \"network-metrics-daemon-fttpc\" (UID: \"4711cce5-88a9-48c4-8e2e-522062e34a03\") " pod="openshift-multus/network-metrics-daemon-fttpc"
Mar 09 09:10:51 crc kubenswrapper[4792]: I0309 09:10:51.002301 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29550780-48757"
Mar 09 09:10:51 crc kubenswrapper[4792]: I0309 09:10:51.135562 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-slv6j\" (UniqueName: \"kubernetes.io/projected/aadae5cd-e840-4618-b021-d8ca0e9169bd-kube-api-access-slv6j\") pod \"aadae5cd-e840-4618-b021-d8ca0e9169bd\" (UID: \"aadae5cd-e840-4618-b021-d8ca0e9169bd\") "
Mar 09 09:10:51 crc kubenswrapper[4792]: I0309 09:10:51.135743 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/aadae5cd-e840-4618-b021-d8ca0e9169bd-secret-volume\") pod \"aadae5cd-e840-4618-b021-d8ca0e9169bd\" (UID: \"aadae5cd-e840-4618-b021-d8ca0e9169bd\") "
Mar 09 09:10:51 crc kubenswrapper[4792]: I0309 09:10:51.135862 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/aadae5cd-e840-4618-b021-d8ca0e9169bd-config-volume\") pod \"aadae5cd-e840-4618-b021-d8ca0e9169bd\" (UID: \"aadae5cd-e840-4618-b021-d8ca0e9169bd\") "
Mar 09 09:10:51 crc kubenswrapper[4792]: I0309 09:10:51.137238 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aadae5cd-e840-4618-b021-d8ca0e9169bd-config-volume" (OuterVolumeSpecName: "config-volume") pod "aadae5cd-e840-4618-b021-d8ca0e9169bd" (UID: "aadae5cd-e840-4618-b021-d8ca0e9169bd"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:10:51 crc kubenswrapper[4792]: I0309 09:10:51.147792 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aadae5cd-e840-4618-b021-d8ca0e9169bd-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "aadae5cd-e840-4618-b021-d8ca0e9169bd" (UID: "aadae5cd-e840-4618-b021-d8ca0e9169bd"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:10:51 crc kubenswrapper[4792]: I0309 09:10:51.148867 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aadae5cd-e840-4618-b021-d8ca0e9169bd-kube-api-access-slv6j" (OuterVolumeSpecName: "kube-api-access-slv6j") pod "aadae5cd-e840-4618-b021-d8ca0e9169bd" (UID: "aadae5cd-e840-4618-b021-d8ca0e9169bd"). InnerVolumeSpecName "kube-api-access-slv6j". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:10:51 crc kubenswrapper[4792]: I0309 09:10:51.179626 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fttpc"
Mar 09 09:10:51 crc kubenswrapper[4792]: I0309 09:10:51.195910 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 09 09:10:51 crc kubenswrapper[4792]: I0309 09:10:51.238420 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-slv6j\" (UniqueName: \"kubernetes.io/projected/aadae5cd-e840-4618-b021-d8ca0e9169bd-kube-api-access-slv6j\") on node \"crc\" DevicePath \"\""
Mar 09 09:10:51 crc kubenswrapper[4792]: I0309 09:10:51.238464 4792 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/aadae5cd-e840-4618-b021-d8ca0e9169bd-secret-volume\") on node \"crc\" DevicePath \"\""
Mar 09 09:10:51 crc kubenswrapper[4792]: I0309 09:10:51.238474 4792 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/aadae5cd-e840-4618-b021-d8ca0e9169bd-config-volume\") on node \"crc\" DevicePath \"\""
Mar 09 09:10:51 crc kubenswrapper[4792]: I0309 09:10:51.503825 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"df3f43f0-0799-480e-9d12-e4132e44cb11","Type":"ContainerDied","Data":"3d93d90d9fb41e3d0bfc2b90a06d1441d9e1d93b02b25f389d5569184264d4cf"}
Mar 09 09:10:51 crc kubenswrapper[4792]: I0309 09:10:51.504161 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3d93d90d9fb41e3d0bfc2b90a06d1441d9e1d93b02b25f389d5569184264d4cf"
Mar 09 09:10:51 crc kubenswrapper[4792]: I0309 09:10:51.504280 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Mar 09 09:10:51 crc kubenswrapper[4792]: I0309 09:10:51.512233 4792 patch_prober.go:28] interesting pod/router-default-5444994796-kcnhk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 09 09:10:51 crc kubenswrapper[4792]: [-]has-synced failed: reason withheld
Mar 09 09:10:51 crc kubenswrapper[4792]: [+]process-running ok
Mar 09 09:10:51 crc kubenswrapper[4792]: healthz check failed
Mar 09 09:10:51 crc kubenswrapper[4792]: I0309 09:10:51.512301 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-kcnhk" podUID="83d49252-7752-48bc-86b6-c604984cd533" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 09 09:10:51 crc kubenswrapper[4792]: I0309 09:10:51.518861 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"53f92eb7-0267-41c3-aead-e2600d79a1b3","Type":"ContainerDied","Data":"21e3b2ab7ec2456f29788a81b4a8eb648a426314f9da80ae3f079232d06c4e86"}
Mar 09 09:10:51 crc kubenswrapper[4792]: I0309 09:10:51.518928 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="21e3b2ab7ec2456f29788a81b4a8eb648a426314f9da80ae3f079232d06c4e86"
Mar 09 09:10:51 crc kubenswrapper[4792]: I0309 09:10:51.519115 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc"
Mar 09 09:10:51 crc kubenswrapper[4792]: I0309 09:10:51.523780 4792 patch_prober.go:28] interesting pod/downloads-7954f5f757-9ch2w container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body=
Mar 09 09:10:51 crc kubenswrapper[4792]: I0309 09:10:51.523820 4792 patch_prober.go:28] interesting pod/downloads-7954f5f757-9ch2w container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body=
Mar 09 09:10:51 crc kubenswrapper[4792]: I0309 09:10:51.523837 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-9ch2w" podUID="f39d8cd3-a63e-4aeb-9609-fe4c8ed6372b" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused"
Mar 09 09:10:51 crc kubenswrapper[4792]: I0309 09:10:51.523865 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-9ch2w" podUID="f39d8cd3-a63e-4aeb-9609-fe4c8ed6372b" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused"
Mar 09 09:10:51 crc kubenswrapper[4792]: I0309 09:10:51.530535 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29550780-48757"
Mar 09 09:10:51 crc kubenswrapper[4792]: I0309 09:10:51.532002 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29550780-48757" event={"ID":"aadae5cd-e840-4618-b021-d8ca0e9169bd","Type":"ContainerDied","Data":"5a6a741ce6a435148ea4ae418febdf572a7a2abb621ec146ab9ae5f1704b94a7"}
Mar 09 09:10:51 crc kubenswrapper[4792]: I0309 09:10:51.532098 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5a6a741ce6a435148ea4ae418febdf572a7a2abb621ec146ab9ae5f1704b94a7"
Mar 09 09:10:51 crc kubenswrapper[4792]: I0309 09:10:51.670822 4792 patch_prober.go:28] interesting pod/console-f9d7485db-jh5pl container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.30:8443/health\": dial tcp 10.217.0.30:8443: connect: connection refused" start-of-body=
Mar 09 09:10:51 crc kubenswrapper[4792]: I0309 09:10:51.670874 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-jh5pl" podUID="894f7c69-0119-4c19-b205-9780fb52b06e" containerName="console" probeResult="failure" output="Get \"https://10.217.0.30:8443/health\": dial tcp 10.217.0.30:8443: connect: connection refused"
Mar 09 09:10:52 crc kubenswrapper[4792]: W0309 09:10:52.048957 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fe485a1_e14f_4c09_b5b9_f252bc42b7e8.slice/crio-7e92d3a38b5b581f2f556cd005843cc15dc8b55c06107f17ba32755ee878f412 WatchSource:0}: Error finding container 7e92d3a38b5b581f2f556cd005843cc15dc8b55c06107f17ba32755ee878f412: Status 404 returned error can't find the container with id 7e92d3a38b5b581f2f556cd005843cc15dc8b55c06107f17ba32755ee878f412
Mar 09 09:10:52 crc kubenswrapper[4792]: I0309 09:10:52.229681 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-fttpc"]
Mar 09 09:10:52 crc kubenswrapper[4792]: W0309 09:10:52.301849 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b6479f0_333b_4a96_9adf_2099afdc2447.slice/crio-94a82c1d75d53728b33a089d06b73e3e53be92fb2be1f886f163d689d7f98976 WatchSource:0}: Error finding container 94a82c1d75d53728b33a089d06b73e3e53be92fb2be1f886f163d689d7f98976: Status 404 returned error can't find the container with id 94a82c1d75d53728b33a089d06b73e3e53be92fb2be1f886f163d689d7f98976
Mar 09 09:10:52 crc kubenswrapper[4792]: W0309 09:10:52.334430 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4711cce5_88a9_48c4_8e2e_522062e34a03.slice/crio-c0128fe9f4634d4da41a8d0183ed39dd5f2cc86d9054ecc2f7d556178d81734f WatchSource:0}: Error finding container c0128fe9f4634d4da41a8d0183ed39dd5f2cc86d9054ecc2f7d556178d81734f: Status 404 returned error can't find the container with id c0128fe9f4634d4da41a8d0183ed39dd5f2cc86d9054ecc2f7d556178d81734f
Mar 09 09:10:52 crc kubenswrapper[4792]: I0309 09:10:52.492168 4792 patch_prober.go:28] interesting pod/router-default-5444994796-kcnhk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 09 09:10:52 crc kubenswrapper[4792]: [-]has-synced failed: reason withheld
Mar 09 09:10:52 crc kubenswrapper[4792]: [+]process-running ok
Mar 09 09:10:52 crc kubenswrapper[4792]: healthz check failed
Mar 09 09:10:52 crc kubenswrapper[4792]: I0309 09:10:52.492229 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-kcnhk" podUID="83d49252-7752-48bc-86b6-c604984cd533" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 09 09:10:52 crc kubenswrapper[4792]: I0309 09:10:52.594187 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"cf3f3b31e0b076e2bf195cd38a010f5d69ea6ee8d33ffc8d9fd143880d7e914f"}
Mar 09 09:10:52 crc kubenswrapper[4792]: I0309 09:10:52.599097 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-fttpc" event={"ID":"4711cce5-88a9-48c4-8e2e-522062e34a03","Type":"ContainerStarted","Data":"c0128fe9f4634d4da41a8d0183ed39dd5f2cc86d9054ecc2f7d556178d81734f"}
Mar 09 09:10:52 crc kubenswrapper[4792]: I0309 09:10:52.602739 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"94a82c1d75d53728b33a089d06b73e3e53be92fb2be1f886f163d689d7f98976"}
Mar 09 09:10:52 crc kubenswrapper[4792]: I0309 09:10:52.623612 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"7e92d3a38b5b581f2f556cd005843cc15dc8b55c06107f17ba32755ee878f412"}
Mar 09 09:10:53 crc kubenswrapper[4792]: I0309 09:10:53.492966 4792 patch_prober.go:28] interesting pod/router-default-5444994796-kcnhk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 09 09:10:53 crc kubenswrapper[4792]: [-]has-synced failed: reason withheld
Mar 09 09:10:53 crc kubenswrapper[4792]: [+]process-running ok
Mar 09 09:10:53 crc kubenswrapper[4792]: healthz check failed
Mar 09 09:10:53 crc kubenswrapper[4792]: I0309 09:10:53.493081 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-kcnhk" podUID="83d49252-7752-48bc-86b6-c604984cd533" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 09 09:10:53 crc kubenswrapper[4792]: I0309 09:10:53.753478 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"01d5383072b98899f895b65d22d7b3dccb58c7a152f97ad6facaa3a448490f03"}
Mar 09 09:10:53 crc kubenswrapper[4792]: I0309 09:10:53.773556 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"446f0b50982ba365fd0e7ea66727291922642c1228d998e75522cbd5aa42e066"}
Mar 09 09:10:53 crc kubenswrapper[4792]: I0309 09:10:53.828178 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"1332e0003a236a43a499f1ec2c30b9212a38b086063b73129d92c88fe7bb97cd"}
Mar 09 09:10:53 crc kubenswrapper[4792]: I0309 09:10:53.828696 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 09 09:10:54 crc kubenswrapper[4792]: I0309 09:10:54.496995 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-kcnhk"
Mar 09 09:10:54 crc kubenswrapper[4792]: I0309 09:10:54.504156 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-kcnhk"
Mar 09 09:10:54 crc kubenswrapper[4792]: I0309 09:10:54.900641 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-fttpc" event={"ID":"4711cce5-88a9-48c4-8e2e-522062e34a03","Type":"ContainerStarted","Data":"478de102ee1043ed3415ca9ba11f38139ac9b9f80a6aeff22ebe63e7da9f5877"}
Mar 09 09:10:55 crc kubenswrapper[4792]: I0309 09:10:55.928401 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-fttpc" event={"ID":"4711cce5-88a9-48c4-8e2e-522062e34a03","Type":"ContainerStarted","Data":"1b4e27e2fca7193cf3067419a08135dda87dc690140105c9ef2b43c03f38ab84"}
Mar 09 09:10:56 crc kubenswrapper[4792]: I0309 09:10:55.993295 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-fttpc" podStartSLOduration=177.993268586 podStartE2EDuration="2m57.993268586s" podCreationTimestamp="2026-03-09 09:07:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:10:55.94939472 +0000 UTC m=+220.979595492" watchObservedRunningTime="2026-03-09 09:10:55.993268586 +0000 UTC m=+221.023469338"
Mar 09 09:10:59 crc kubenswrapper[4792]: I0309 09:10:59.386332 4792 ???:1] "http: TLS handshake error from 192.168.126.11:39600: no serving certificate available for the kubelet"
Mar 09 09:11:00 crc kubenswrapper[4792]: I0309 09:11:00.945624 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6d8f6f6dd5-trsw8"]
Mar 09 09:11:00 crc kubenswrapper[4792]: I0309 09:11:00.945851 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-6d8f6f6dd5-trsw8" podUID="dd4efc19-20ad-44d0-a9e6-8115fe5cefd1" containerName="controller-manager" containerID="cri-o://284ec79b8a22554ce0393d69ebe2d482624a3b24cbb15aaccc6c1305ff5a5a84" gracePeriod=30
Mar 09 09:11:00 crc kubenswrapper[4792]: I0309 09:11:00.981718 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-956cf76db-zx46z"]
Mar 09 09:11:00 crc kubenswrapper[4792]: I0309 09:11:00.982049 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-956cf76db-zx46z" podUID="26e96c8d-a866-49ce-82d7-106acf1c5f60" containerName="route-controller-manager" containerID="cri-o://5acbd277116699298135b81f243d51c08ca6463049a48a041b09cf51070b10a4" gracePeriod=30
Mar 09 09:11:01 crc kubenswrapper[4792]: I0309 09:11:01.529840 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-9ch2w"
Mar 09 09:11:01 crc kubenswrapper[4792]: I0309 09:11:01.674926 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-jh5pl"
Mar 09 09:11:01 crc kubenswrapper[4792]: I0309 09:11:01.679802 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-jh5pl"
Mar 09 09:11:02 crc kubenswrapper[4792]: I0309 09:11:02.102514 4792 generic.go:334] "Generic (PLEG): container finished" podID="dd4efc19-20ad-44d0-a9e6-8115fe5cefd1" containerID="284ec79b8a22554ce0393d69ebe2d482624a3b24cbb15aaccc6c1305ff5a5a84" exitCode=0
Mar 09 09:11:02 crc kubenswrapper[4792]: I0309 09:11:02.102635 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6d8f6f6dd5-trsw8" event={"ID":"dd4efc19-20ad-44d0-a9e6-8115fe5cefd1","Type":"ContainerDied","Data":"284ec79b8a22554ce0393d69ebe2d482624a3b24cbb15aaccc6c1305ff5a5a84"}
Mar 09 09:11:02 crc kubenswrapper[4792]: I0309 09:11:02.110281 4792 generic.go:334] "Generic (PLEG): container finished" podID="26e96c8d-a866-49ce-82d7-106acf1c5f60" containerID="5acbd277116699298135b81f243d51c08ca6463049a48a041b09cf51070b10a4" exitCode=0
Mar 09 09:11:02 crc kubenswrapper[4792]: I0309 09:11:02.110367 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-956cf76db-zx46z" event={"ID":"26e96c8d-a866-49ce-82d7-106acf1c5f60","Type":"ContainerDied","Data":"5acbd277116699298135b81f243d51c08ca6463049a48a041b09cf51070b10a4"}
Mar 09 09:11:03 crc kubenswrapper[4792]: I0309 09:11:03.864260 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-kkrgv"
Mar 09 09:11:06 crc kubenswrapper[4792]: I0309 09:11:06.126459 4792 patch_prober.go:28] interesting pod/controller-manager-6d8f6f6dd5-trsw8 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.56:8443/healthz\": dial tcp 10.217.0.56:8443: connect: connection refused" start-of-body=
Mar 09 09:11:06 crc kubenswrapper[4792]: I0309 09:11:06.126827 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-6d8f6f6dd5-trsw8" podUID="dd4efc19-20ad-44d0-a9e6-8115fe5cefd1" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.56:8443/healthz\": dial tcp 10.217.0.56:8443: connect: connection refused"
Mar 09 09:11:06 crc kubenswrapper[4792]: I0309 09:11:06.194429 4792 patch_prober.go:28] interesting pod/route-controller-manager-956cf76db-zx46z container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.55:8443/healthz\": dial tcp 10.217.0.55:8443: connect: connection refused" start-of-body=
Mar 09 09:11:06 crc kubenswrapper[4792]: I0309 09:11:06.194516 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-956cf76db-zx46z" podUID="26e96c8d-a866-49ce-82d7-106acf1c5f60" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.55:8443/healthz\": dial tcp 10.217.0.55:8443: connect: connection refused"
Mar 09 09:11:11 crc kubenswrapper[4792]: E0309 09:11:11.338521 4792 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/openshift4/ose-cli:latest"
Mar 09 09:11:11 crc kubenswrapper[4792]: E0309 09:11:11.338978 4792 kuberuntime_manager.go:1274] "Unhandled Error" err=<
Mar 09 09:11:11 crc kubenswrapper[4792]: container &Container{Name:oc,Image:registry.redhat.io/openshift4/ose-cli:latest,Command:[/bin/bash -c oc get csr -o go-template='{{range .items}}{{if not .status}}{{.metadata.name}}{{"\n"}}{{end}}{{end}}' | xargs --no-run-if-empty oc adm certificate approve
Mar 09 09:11:11 crc kubenswrapper[4792]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-mn88s,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod auto-csr-approver-29550790-q6wbp_openshift-infra(d3934c8c-f197-4ef6-ac5c-76560a192e50): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled
Mar 09 09:11:11 crc kubenswrapper[4792]: > logger="UnhandledError"
Mar 09 09:11:11 crc kubenswrapper[4792]: E0309 09:11:11.340176 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-infra/auto-csr-approver-29550790-q6wbp" podUID="d3934c8c-f197-4ef6-ac5c-76560a192e50"
Mar 09 09:11:11 crc kubenswrapper[4792]: I0309 09:11:11.352172 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-956cf76db-zx46z"
Mar 09 09:11:11 crc kubenswrapper[4792]: I0309 09:11:11.358816 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6d8f6f6dd5-trsw8"
Mar 09 09:11:11 crc kubenswrapper[4792]: I0309 09:11:11.380597 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-778bf95d9-pz88p"]
Mar 09 09:11:11 crc kubenswrapper[4792]: E0309 09:11:11.380839 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53f92eb7-0267-41c3-aead-e2600d79a1b3" containerName="pruner"
Mar 09 09:11:11 crc kubenswrapper[4792]: I0309 09:11:11.380851 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="53f92eb7-0267-41c3-aead-e2600d79a1b3" containerName="pruner"
Mar 09 09:11:11 crc kubenswrapper[4792]: E0309 09:11:11.380863 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26e96c8d-a866-49ce-82d7-106acf1c5f60" containerName="route-controller-manager"
Mar 09 09:11:11 crc kubenswrapper[4792]: I0309 09:11:11.380869 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="26e96c8d-a866-49ce-82d7-106acf1c5f60" containerName="route-controller-manager"
Mar 09 09:11:11 crc kubenswrapper[4792]: E0309 09:11:11.380879 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd4efc19-20ad-44d0-a9e6-8115fe5cefd1" containerName="controller-manager"
Mar 09 09:11:11 crc kubenswrapper[4792]: I0309 09:11:11.380885 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd4efc19-20ad-44d0-a9e6-8115fe5cefd1"
containerName="controller-manager" Mar 09 09:11:11 crc kubenswrapper[4792]: E0309 09:11:11.380894 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aadae5cd-e840-4618-b021-d8ca0e9169bd" containerName="collect-profiles" Mar 09 09:11:11 crc kubenswrapper[4792]: I0309 09:11:11.380900 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="aadae5cd-e840-4618-b021-d8ca0e9169bd" containerName="collect-profiles" Mar 09 09:11:11 crc kubenswrapper[4792]: E0309 09:11:11.380909 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df3f43f0-0799-480e-9d12-e4132e44cb11" containerName="pruner" Mar 09 09:11:11 crc kubenswrapper[4792]: I0309 09:11:11.380914 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="df3f43f0-0799-480e-9d12-e4132e44cb11" containerName="pruner" Mar 09 09:11:11 crc kubenswrapper[4792]: I0309 09:11:11.380993 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="df3f43f0-0799-480e-9d12-e4132e44cb11" containerName="pruner" Mar 09 09:11:11 crc kubenswrapper[4792]: I0309 09:11:11.381006 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="26e96c8d-a866-49ce-82d7-106acf1c5f60" containerName="route-controller-manager" Mar 09 09:11:11 crc kubenswrapper[4792]: I0309 09:11:11.381013 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="53f92eb7-0267-41c3-aead-e2600d79a1b3" containerName="pruner" Mar 09 09:11:11 crc kubenswrapper[4792]: I0309 09:11:11.381022 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="aadae5cd-e840-4618-b021-d8ca0e9169bd" containerName="collect-profiles" Mar 09 09:11:11 crc kubenswrapper[4792]: I0309 09:11:11.381033 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd4efc19-20ad-44d0-a9e6-8115fe5cefd1" containerName="controller-manager" Mar 09 09:11:11 crc kubenswrapper[4792]: I0309 09:11:11.381461 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-778bf95d9-pz88p" Mar 09 09:11:11 crc kubenswrapper[4792]: I0309 09:11:11.390554 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-778bf95d9-pz88p"] Mar 09 09:11:11 crc kubenswrapper[4792]: I0309 09:11:11.512463 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/26e96c8d-a866-49ce-82d7-106acf1c5f60-client-ca\") pod \"26e96c8d-a866-49ce-82d7-106acf1c5f60\" (UID: \"26e96c8d-a866-49ce-82d7-106acf1c5f60\") " Mar 09 09:11:11 crc kubenswrapper[4792]: I0309 09:11:11.512511 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dd4efc19-20ad-44d0-a9e6-8115fe5cefd1-serving-cert\") pod \"dd4efc19-20ad-44d0-a9e6-8115fe5cefd1\" (UID: \"dd4efc19-20ad-44d0-a9e6-8115fe5cefd1\") " Mar 09 09:11:11 crc kubenswrapper[4792]: I0309 09:11:11.512531 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fl8mw\" (UniqueName: \"kubernetes.io/projected/26e96c8d-a866-49ce-82d7-106acf1c5f60-kube-api-access-fl8mw\") pod \"26e96c8d-a866-49ce-82d7-106acf1c5f60\" (UID: \"26e96c8d-a866-49ce-82d7-106acf1c5f60\") " Mar 09 09:11:11 crc kubenswrapper[4792]: I0309 09:11:11.512557 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/26e96c8d-a866-49ce-82d7-106acf1c5f60-config\") pod \"26e96c8d-a866-49ce-82d7-106acf1c5f60\" (UID: \"26e96c8d-a866-49ce-82d7-106acf1c5f60\") " Mar 09 09:11:11 crc kubenswrapper[4792]: I0309 09:11:11.512588 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bhbjm\" (UniqueName: \"kubernetes.io/projected/dd4efc19-20ad-44d0-a9e6-8115fe5cefd1-kube-api-access-bhbjm\") pod 
\"dd4efc19-20ad-44d0-a9e6-8115fe5cefd1\" (UID: \"dd4efc19-20ad-44d0-a9e6-8115fe5cefd1\") " Mar 09 09:11:11 crc kubenswrapper[4792]: I0309 09:11:11.512635 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/dd4efc19-20ad-44d0-a9e6-8115fe5cefd1-client-ca\") pod \"dd4efc19-20ad-44d0-a9e6-8115fe5cefd1\" (UID: \"dd4efc19-20ad-44d0-a9e6-8115fe5cefd1\") " Mar 09 09:11:11 crc kubenswrapper[4792]: I0309 09:11:11.512658 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/26e96c8d-a866-49ce-82d7-106acf1c5f60-serving-cert\") pod \"26e96c8d-a866-49ce-82d7-106acf1c5f60\" (UID: \"26e96c8d-a866-49ce-82d7-106acf1c5f60\") " Mar 09 09:11:11 crc kubenswrapper[4792]: I0309 09:11:11.512677 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/dd4efc19-20ad-44d0-a9e6-8115fe5cefd1-proxy-ca-bundles\") pod \"dd4efc19-20ad-44d0-a9e6-8115fe5cefd1\" (UID: \"dd4efc19-20ad-44d0-a9e6-8115fe5cefd1\") " Mar 09 09:11:11 crc kubenswrapper[4792]: I0309 09:11:11.512737 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd4efc19-20ad-44d0-a9e6-8115fe5cefd1-config\") pod \"dd4efc19-20ad-44d0-a9e6-8115fe5cefd1\" (UID: \"dd4efc19-20ad-44d0-a9e6-8115fe5cefd1\") " Mar 09 09:11:11 crc kubenswrapper[4792]: I0309 09:11:11.512935 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f2ed2521-f948-40d4-a374-792c139a50b8-serving-cert\") pod \"route-controller-manager-778bf95d9-pz88p\" (UID: \"f2ed2521-f948-40d4-a374-792c139a50b8\") " pod="openshift-route-controller-manager/route-controller-manager-778bf95d9-pz88p" Mar 09 09:11:11 crc kubenswrapper[4792]: I0309 09:11:11.512972 4792 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f2ed2521-f948-40d4-a374-792c139a50b8-client-ca\") pod \"route-controller-manager-778bf95d9-pz88p\" (UID: \"f2ed2521-f948-40d4-a374-792c139a50b8\") " pod="openshift-route-controller-manager/route-controller-manager-778bf95d9-pz88p" Mar 09 09:11:11 crc kubenswrapper[4792]: I0309 09:11:11.513005 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f2ed2521-f948-40d4-a374-792c139a50b8-config\") pod \"route-controller-manager-778bf95d9-pz88p\" (UID: \"f2ed2521-f948-40d4-a374-792c139a50b8\") " pod="openshift-route-controller-manager/route-controller-manager-778bf95d9-pz88p" Mar 09 09:11:11 crc kubenswrapper[4792]: I0309 09:11:11.513036 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lc2pz\" (UniqueName: \"kubernetes.io/projected/f2ed2521-f948-40d4-a374-792c139a50b8-kube-api-access-lc2pz\") pod \"route-controller-manager-778bf95d9-pz88p\" (UID: \"f2ed2521-f948-40d4-a374-792c139a50b8\") " pod="openshift-route-controller-manager/route-controller-manager-778bf95d9-pz88p" Mar 09 09:11:11 crc kubenswrapper[4792]: I0309 09:11:11.513467 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dd4efc19-20ad-44d0-a9e6-8115fe5cefd1-client-ca" (OuterVolumeSpecName: "client-ca") pod "dd4efc19-20ad-44d0-a9e6-8115fe5cefd1" (UID: "dd4efc19-20ad-44d0-a9e6-8115fe5cefd1"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:11:11 crc kubenswrapper[4792]: I0309 09:11:11.513551 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/26e96c8d-a866-49ce-82d7-106acf1c5f60-config" (OuterVolumeSpecName: "config") pod "26e96c8d-a866-49ce-82d7-106acf1c5f60" (UID: "26e96c8d-a866-49ce-82d7-106acf1c5f60"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:11:11 crc kubenswrapper[4792]: I0309 09:11:11.513564 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dd4efc19-20ad-44d0-a9e6-8115fe5cefd1-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "dd4efc19-20ad-44d0-a9e6-8115fe5cefd1" (UID: "dd4efc19-20ad-44d0-a9e6-8115fe5cefd1"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:11:11 crc kubenswrapper[4792]: I0309 09:11:11.513932 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dd4efc19-20ad-44d0-a9e6-8115fe5cefd1-config" (OuterVolumeSpecName: "config") pod "dd4efc19-20ad-44d0-a9e6-8115fe5cefd1" (UID: "dd4efc19-20ad-44d0-a9e6-8115fe5cefd1"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:11:11 crc kubenswrapper[4792]: I0309 09:11:11.514461 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/26e96c8d-a866-49ce-82d7-106acf1c5f60-client-ca" (OuterVolumeSpecName: "client-ca") pod "26e96c8d-a866-49ce-82d7-106acf1c5f60" (UID: "26e96c8d-a866-49ce-82d7-106acf1c5f60"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:11:11 crc kubenswrapper[4792]: I0309 09:11:11.517290 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/26e96c8d-a866-49ce-82d7-106acf1c5f60-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "26e96c8d-a866-49ce-82d7-106acf1c5f60" (UID: "26e96c8d-a866-49ce-82d7-106acf1c5f60"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:11:11 crc kubenswrapper[4792]: I0309 09:11:11.517406 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd4efc19-20ad-44d0-a9e6-8115fe5cefd1-kube-api-access-bhbjm" (OuterVolumeSpecName: "kube-api-access-bhbjm") pod "dd4efc19-20ad-44d0-a9e6-8115fe5cefd1" (UID: "dd4efc19-20ad-44d0-a9e6-8115fe5cefd1"). InnerVolumeSpecName "kube-api-access-bhbjm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:11:11 crc kubenswrapper[4792]: I0309 09:11:11.519585 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd4efc19-20ad-44d0-a9e6-8115fe5cefd1-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "dd4efc19-20ad-44d0-a9e6-8115fe5cefd1" (UID: "dd4efc19-20ad-44d0-a9e6-8115fe5cefd1"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:11:11 crc kubenswrapper[4792]: I0309 09:11:11.527407 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/26e96c8d-a866-49ce-82d7-106acf1c5f60-kube-api-access-fl8mw" (OuterVolumeSpecName: "kube-api-access-fl8mw") pod "26e96c8d-a866-49ce-82d7-106acf1c5f60" (UID: "26e96c8d-a866-49ce-82d7-106acf1c5f60"). InnerVolumeSpecName "kube-api-access-fl8mw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:11:11 crc kubenswrapper[4792]: I0309 09:11:11.614240 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f2ed2521-f948-40d4-a374-792c139a50b8-serving-cert\") pod \"route-controller-manager-778bf95d9-pz88p\" (UID: \"f2ed2521-f948-40d4-a374-792c139a50b8\") " pod="openshift-route-controller-manager/route-controller-manager-778bf95d9-pz88p" Mar 09 09:11:11 crc kubenswrapper[4792]: I0309 09:11:11.614325 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f2ed2521-f948-40d4-a374-792c139a50b8-client-ca\") pod \"route-controller-manager-778bf95d9-pz88p\" (UID: \"f2ed2521-f948-40d4-a374-792c139a50b8\") " pod="openshift-route-controller-manager/route-controller-manager-778bf95d9-pz88p" Mar 09 09:11:11 crc kubenswrapper[4792]: I0309 09:11:11.614381 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f2ed2521-f948-40d4-a374-792c139a50b8-config\") pod \"route-controller-manager-778bf95d9-pz88p\" (UID: \"f2ed2521-f948-40d4-a374-792c139a50b8\") " pod="openshift-route-controller-manager/route-controller-manager-778bf95d9-pz88p" Mar 09 09:11:11 crc kubenswrapper[4792]: I0309 09:11:11.614431 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lc2pz\" (UniqueName: \"kubernetes.io/projected/f2ed2521-f948-40d4-a374-792c139a50b8-kube-api-access-lc2pz\") pod \"route-controller-manager-778bf95d9-pz88p\" (UID: \"f2ed2521-f948-40d4-a374-792c139a50b8\") " pod="openshift-route-controller-manager/route-controller-manager-778bf95d9-pz88p" Mar 09 09:11:11 crc kubenswrapper[4792]: I0309 09:11:11.614537 4792 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/dd4efc19-20ad-44d0-a9e6-8115fe5cefd1-config\") on node \"crc\" DevicePath \"\"" Mar 09 09:11:11 crc kubenswrapper[4792]: I0309 09:11:11.614579 4792 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/26e96c8d-a866-49ce-82d7-106acf1c5f60-client-ca\") on node \"crc\" DevicePath \"\"" Mar 09 09:11:11 crc kubenswrapper[4792]: I0309 09:11:11.614595 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fl8mw\" (UniqueName: \"kubernetes.io/projected/26e96c8d-a866-49ce-82d7-106acf1c5f60-kube-api-access-fl8mw\") on node \"crc\" DevicePath \"\"" Mar 09 09:11:11 crc kubenswrapper[4792]: I0309 09:11:11.614610 4792 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dd4efc19-20ad-44d0-a9e6-8115fe5cefd1-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 09:11:11 crc kubenswrapper[4792]: I0309 09:11:11.614623 4792 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/26e96c8d-a866-49ce-82d7-106acf1c5f60-config\") on node \"crc\" DevicePath \"\"" Mar 09 09:11:11 crc kubenswrapper[4792]: I0309 09:11:11.614662 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bhbjm\" (UniqueName: \"kubernetes.io/projected/dd4efc19-20ad-44d0-a9e6-8115fe5cefd1-kube-api-access-bhbjm\") on node \"crc\" DevicePath \"\"" Mar 09 09:11:11 crc kubenswrapper[4792]: I0309 09:11:11.614676 4792 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/dd4efc19-20ad-44d0-a9e6-8115fe5cefd1-client-ca\") on node \"crc\" DevicePath \"\"" Mar 09 09:11:11 crc kubenswrapper[4792]: I0309 09:11:11.614688 4792 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/26e96c8d-a866-49ce-82d7-106acf1c5f60-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 09:11:11 crc 
kubenswrapper[4792]: I0309 09:11:11.614700 4792 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/dd4efc19-20ad-44d0-a9e6-8115fe5cefd1-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 09 09:11:11 crc kubenswrapper[4792]: I0309 09:11:11.617309 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f2ed2521-f948-40d4-a374-792c139a50b8-client-ca\") pod \"route-controller-manager-778bf95d9-pz88p\" (UID: \"f2ed2521-f948-40d4-a374-792c139a50b8\") " pod="openshift-route-controller-manager/route-controller-manager-778bf95d9-pz88p" Mar 09 09:11:11 crc kubenswrapper[4792]: I0309 09:11:11.618458 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f2ed2521-f948-40d4-a374-792c139a50b8-config\") pod \"route-controller-manager-778bf95d9-pz88p\" (UID: \"f2ed2521-f948-40d4-a374-792c139a50b8\") " pod="openshift-route-controller-manager/route-controller-manager-778bf95d9-pz88p" Mar 09 09:11:11 crc kubenswrapper[4792]: I0309 09:11:11.619529 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f2ed2521-f948-40d4-a374-792c139a50b8-serving-cert\") pod \"route-controller-manager-778bf95d9-pz88p\" (UID: \"f2ed2521-f948-40d4-a374-792c139a50b8\") " pod="openshift-route-controller-manager/route-controller-manager-778bf95d9-pz88p" Mar 09 09:11:11 crc kubenswrapper[4792]: I0309 09:11:11.631904 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lc2pz\" (UniqueName: \"kubernetes.io/projected/f2ed2521-f948-40d4-a374-792c139a50b8-kube-api-access-lc2pz\") pod \"route-controller-manager-778bf95d9-pz88p\" (UID: \"f2ed2521-f948-40d4-a374-792c139a50b8\") " pod="openshift-route-controller-manager/route-controller-manager-778bf95d9-pz88p" Mar 09 09:11:11 crc kubenswrapper[4792]: I0309 
09:11:11.717494 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-778bf95d9-pz88p" Mar 09 09:11:12 crc kubenswrapper[4792]: I0309 09:11:12.171811 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-4ckzv" Mar 09 09:11:12 crc kubenswrapper[4792]: I0309 09:11:12.216172 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6d8f6f6dd5-trsw8" event={"ID":"dd4efc19-20ad-44d0-a9e6-8115fe5cefd1","Type":"ContainerDied","Data":"9564ad7736263c0d21d7133ccae444320eeb23be1b7f6c5e3da8f2a688e004f1"} Mar 09 09:11:12 crc kubenswrapper[4792]: I0309 09:11:12.216243 4792 scope.go:117] "RemoveContainer" containerID="284ec79b8a22554ce0393d69ebe2d482624a3b24cbb15aaccc6c1305ff5a5a84" Mar 09 09:11:12 crc kubenswrapper[4792]: I0309 09:11:12.216296 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6d8f6f6dd5-trsw8" Mar 09 09:11:12 crc kubenswrapper[4792]: I0309 09:11:12.221520 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-956cf76db-zx46z" Mar 09 09:11:12 crc kubenswrapper[4792]: I0309 09:11:12.221981 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-956cf76db-zx46z" event={"ID":"26e96c8d-a866-49ce-82d7-106acf1c5f60","Type":"ContainerDied","Data":"672258330de52a856e7b66c7558c96ec954bbfc81e52ea917085c5de4391c978"} Mar 09 09:11:12 crc kubenswrapper[4792]: E0309 09:11:12.235832 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-cli:latest\\\"\"" pod="openshift-infra/auto-csr-approver-29550790-q6wbp" podUID="d3934c8c-f197-4ef6-ac5c-76560a192e50" Mar 09 09:11:12 crc kubenswrapper[4792]: I0309 09:11:12.236025 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6d8f6f6dd5-trsw8"] Mar 09 09:11:12 crc kubenswrapper[4792]: I0309 09:11:12.241812 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-6d8f6f6dd5-trsw8"] Mar 09 09:11:12 crc kubenswrapper[4792]: I0309 09:11:12.265274 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-956cf76db-zx46z"] Mar 09 09:11:12 crc kubenswrapper[4792]: I0309 09:11:12.268759 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-956cf76db-zx46z"] Mar 09 09:11:13 crc kubenswrapper[4792]: I0309 09:11:13.214462 4792 patch_prober.go:28] interesting pod/machine-config-daemon-97tth container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 09:11:13 crc 
kubenswrapper[4792]: I0309 09:11:13.214597 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 09:11:13 crc kubenswrapper[4792]: I0309 09:11:13.638623 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-cf7f4798c-sgc64"] Mar 09 09:11:13 crc kubenswrapper[4792]: I0309 09:11:13.639523 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-cf7f4798c-sgc64" Mar 09 09:11:13 crc kubenswrapper[4792]: I0309 09:11:13.646545 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 09 09:11:13 crc kubenswrapper[4792]: I0309 09:11:13.646681 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 09 09:11:13 crc kubenswrapper[4792]: I0309 09:11:13.646543 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 09 09:11:13 crc kubenswrapper[4792]: I0309 09:11:13.648819 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 09 09:11:13 crc kubenswrapper[4792]: I0309 09:11:13.649098 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 09 09:11:13 crc kubenswrapper[4792]: I0309 09:11:13.649417 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 09 09:11:13 crc kubenswrapper[4792]: I0309 09:11:13.668628 4792 reflector.go:368] Caches populated for *v1.ConfigMap 
from object-"openshift-controller-manager"/"openshift-global-ca" Mar 09 09:11:13 crc kubenswrapper[4792]: I0309 09:11:13.676651 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="26e96c8d-a866-49ce-82d7-106acf1c5f60" path="/var/lib/kubelet/pods/26e96c8d-a866-49ce-82d7-106acf1c5f60/volumes" Mar 09 09:11:13 crc kubenswrapper[4792]: I0309 09:11:13.677481 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd4efc19-20ad-44d0-a9e6-8115fe5cefd1" path="/var/lib/kubelet/pods/dd4efc19-20ad-44d0-a9e6-8115fe5cefd1/volumes" Mar 09 09:11:13 crc kubenswrapper[4792]: I0309 09:11:13.677892 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-cf7f4798c-sgc64"] Mar 09 09:11:13 crc kubenswrapper[4792]: I0309 09:11:13.755398 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/09758db8-4f8c-4a5a-9ed6-c5afe73c5c25-proxy-ca-bundles\") pod \"controller-manager-cf7f4798c-sgc64\" (UID: \"09758db8-4f8c-4a5a-9ed6-c5afe73c5c25\") " pod="openshift-controller-manager/controller-manager-cf7f4798c-sgc64" Mar 09 09:11:13 crc kubenswrapper[4792]: I0309 09:11:13.755483 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09758db8-4f8c-4a5a-9ed6-c5afe73c5c25-serving-cert\") pod \"controller-manager-cf7f4798c-sgc64\" (UID: \"09758db8-4f8c-4a5a-9ed6-c5afe73c5c25\") " pod="openshift-controller-manager/controller-manager-cf7f4798c-sgc64" Mar 09 09:11:13 crc kubenswrapper[4792]: I0309 09:11:13.755509 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/09758db8-4f8c-4a5a-9ed6-c5afe73c5c25-client-ca\") pod \"controller-manager-cf7f4798c-sgc64\" (UID: \"09758db8-4f8c-4a5a-9ed6-c5afe73c5c25\") " 
pod="openshift-controller-manager/controller-manager-cf7f4798c-sgc64" Mar 09 09:11:13 crc kubenswrapper[4792]: I0309 09:11:13.755528 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7fhp4\" (UniqueName: \"kubernetes.io/projected/09758db8-4f8c-4a5a-9ed6-c5afe73c5c25-kube-api-access-7fhp4\") pod \"controller-manager-cf7f4798c-sgc64\" (UID: \"09758db8-4f8c-4a5a-9ed6-c5afe73c5c25\") " pod="openshift-controller-manager/controller-manager-cf7f4798c-sgc64" Mar 09 09:11:13 crc kubenswrapper[4792]: I0309 09:11:13.755563 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09758db8-4f8c-4a5a-9ed6-c5afe73c5c25-config\") pod \"controller-manager-cf7f4798c-sgc64\" (UID: \"09758db8-4f8c-4a5a-9ed6-c5afe73c5c25\") " pod="openshift-controller-manager/controller-manager-cf7f4798c-sgc64" Mar 09 09:11:13 crc kubenswrapper[4792]: I0309 09:11:13.857087 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09758db8-4f8c-4a5a-9ed6-c5afe73c5c25-config\") pod \"controller-manager-cf7f4798c-sgc64\" (UID: \"09758db8-4f8c-4a5a-9ed6-c5afe73c5c25\") " pod="openshift-controller-manager/controller-manager-cf7f4798c-sgc64" Mar 09 09:11:13 crc kubenswrapper[4792]: I0309 09:11:13.857169 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/09758db8-4f8c-4a5a-9ed6-c5afe73c5c25-proxy-ca-bundles\") pod \"controller-manager-cf7f4798c-sgc64\" (UID: \"09758db8-4f8c-4a5a-9ed6-c5afe73c5c25\") " pod="openshift-controller-manager/controller-manager-cf7f4798c-sgc64" Mar 09 09:11:13 crc kubenswrapper[4792]: I0309 09:11:13.857226 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/09758db8-4f8c-4a5a-9ed6-c5afe73c5c25-serving-cert\") pod \"controller-manager-cf7f4798c-sgc64\" (UID: \"09758db8-4f8c-4a5a-9ed6-c5afe73c5c25\") " pod="openshift-controller-manager/controller-manager-cf7f4798c-sgc64" Mar 09 09:11:13 crc kubenswrapper[4792]: I0309 09:11:13.857259 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/09758db8-4f8c-4a5a-9ed6-c5afe73c5c25-client-ca\") pod \"controller-manager-cf7f4798c-sgc64\" (UID: \"09758db8-4f8c-4a5a-9ed6-c5afe73c5c25\") " pod="openshift-controller-manager/controller-manager-cf7f4798c-sgc64" Mar 09 09:11:13 crc kubenswrapper[4792]: I0309 09:11:13.857280 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7fhp4\" (UniqueName: \"kubernetes.io/projected/09758db8-4f8c-4a5a-9ed6-c5afe73c5c25-kube-api-access-7fhp4\") pod \"controller-manager-cf7f4798c-sgc64\" (UID: \"09758db8-4f8c-4a5a-9ed6-c5afe73c5c25\") " pod="openshift-controller-manager/controller-manager-cf7f4798c-sgc64" Mar 09 09:11:13 crc kubenswrapper[4792]: I0309 09:11:13.858722 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/09758db8-4f8c-4a5a-9ed6-c5afe73c5c25-proxy-ca-bundles\") pod \"controller-manager-cf7f4798c-sgc64\" (UID: \"09758db8-4f8c-4a5a-9ed6-c5afe73c5c25\") " pod="openshift-controller-manager/controller-manager-cf7f4798c-sgc64" Mar 09 09:11:13 crc kubenswrapper[4792]: I0309 09:11:13.858869 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09758db8-4f8c-4a5a-9ed6-c5afe73c5c25-config\") pod \"controller-manager-cf7f4798c-sgc64\" (UID: \"09758db8-4f8c-4a5a-9ed6-c5afe73c5c25\") " pod="openshift-controller-manager/controller-manager-cf7f4798c-sgc64" Mar 09 09:11:13 crc kubenswrapper[4792]: I0309 09:11:13.862036 4792 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/09758db8-4f8c-4a5a-9ed6-c5afe73c5c25-client-ca\") pod \"controller-manager-cf7f4798c-sgc64\" (UID: \"09758db8-4f8c-4a5a-9ed6-c5afe73c5c25\") " pod="openshift-controller-manager/controller-manager-cf7f4798c-sgc64" Mar 09 09:11:13 crc kubenswrapper[4792]: I0309 09:11:13.866381 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09758db8-4f8c-4a5a-9ed6-c5afe73c5c25-serving-cert\") pod \"controller-manager-cf7f4798c-sgc64\" (UID: \"09758db8-4f8c-4a5a-9ed6-c5afe73c5c25\") " pod="openshift-controller-manager/controller-manager-cf7f4798c-sgc64" Mar 09 09:11:13 crc kubenswrapper[4792]: I0309 09:11:13.872150 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7fhp4\" (UniqueName: \"kubernetes.io/projected/09758db8-4f8c-4a5a-9ed6-c5afe73c5c25-kube-api-access-7fhp4\") pod \"controller-manager-cf7f4798c-sgc64\" (UID: \"09758db8-4f8c-4a5a-9ed6-c5afe73c5c25\") " pod="openshift-controller-manager/controller-manager-cf7f4798c-sgc64" Mar 09 09:11:13 crc kubenswrapper[4792]: I0309 09:11:13.973891 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-cf7f4798c-sgc64" Mar 09 09:11:18 crc kubenswrapper[4792]: I0309 09:11:18.612561 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 09 09:11:18 crc kubenswrapper[4792]: I0309 09:11:18.619725 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 09 09:11:18 crc kubenswrapper[4792]: I0309 09:11:18.623676 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 09 09:11:18 crc kubenswrapper[4792]: I0309 09:11:18.623721 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 09 09:11:18 crc kubenswrapper[4792]: I0309 09:11:18.625851 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 09 09:11:18 crc kubenswrapper[4792]: I0309 09:11:18.725444 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0da7b453-4d9f-4803-91d7-d4df282ed1d1-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"0da7b453-4d9f-4803-91d7-d4df282ed1d1\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 09 09:11:18 crc kubenswrapper[4792]: I0309 09:11:18.725542 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0da7b453-4d9f-4803-91d7-d4df282ed1d1-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"0da7b453-4d9f-4803-91d7-d4df282ed1d1\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 09 09:11:18 crc kubenswrapper[4792]: I0309 09:11:18.826898 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0da7b453-4d9f-4803-91d7-d4df282ed1d1-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"0da7b453-4d9f-4803-91d7-d4df282ed1d1\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 09 09:11:18 crc kubenswrapper[4792]: I0309 09:11:18.826985 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/0da7b453-4d9f-4803-91d7-d4df282ed1d1-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"0da7b453-4d9f-4803-91d7-d4df282ed1d1\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 09 09:11:18 crc kubenswrapper[4792]: I0309 09:11:18.827084 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0da7b453-4d9f-4803-91d7-d4df282ed1d1-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"0da7b453-4d9f-4803-91d7-d4df282ed1d1\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 09 09:11:18 crc kubenswrapper[4792]: I0309 09:11:18.850093 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0da7b453-4d9f-4803-91d7-d4df282ed1d1-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"0da7b453-4d9f-4803-91d7-d4df282ed1d1\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 09 09:11:18 crc kubenswrapper[4792]: I0309 09:11:18.946986 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 09 09:11:20 crc kubenswrapper[4792]: I0309 09:11:20.915133 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-cf7f4798c-sgc64"] Mar 09 09:11:21 crc kubenswrapper[4792]: E0309 09:11:21.036913 4792 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Mar 09 09:11:21 crc kubenswrapper[4792]: E0309 09:11:21.037220 4792 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-m2j2c,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile
:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-jsnbn_openshift-marketplace(0667075c-38b7-4fb6-ad69-a31987eae3cc): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 09 09:11:21 crc kubenswrapper[4792]: E0309 09:11:21.038467 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-jsnbn" podUID="0667075c-38b7-4fb6-ad69-a31987eae3cc" Mar 09 09:11:21 crc kubenswrapper[4792]: I0309 09:11:21.048781 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-778bf95d9-pz88p"] Mar 09 09:11:21 crc kubenswrapper[4792]: E0309 09:11:21.093161 4792 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Mar 09 09:11:21 crc kubenswrapper[4792]: E0309 09:11:21.093383 4792 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-64msg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-c4nwc_openshift-marketplace(c0aa86a9-9ed9-492f-ac14-43e14abf1f2c): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 09 09:11:21 crc kubenswrapper[4792]: E0309 09:11:21.094586 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-c4nwc" podUID="c0aa86a9-9ed9-492f-ac14-43e14abf1f2c" Mar 09 09:11:21 crc 
kubenswrapper[4792]: E0309 09:11:21.133477 4792 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Mar 09 09:11:21 crc kubenswrapper[4792]: E0309 09:11:21.133758 4792 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2brw5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
redhat-marketplace-mj542_openshift-marketplace(a19c063e-191a-487c-a491-0af8c6fc1e3f): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 09 09:11:21 crc kubenswrapper[4792]: E0309 09:11:21.134964 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-mj542" podUID="a19c063e-191a-487c-a491-0af8c6fc1e3f" Mar 09 09:11:22 crc kubenswrapper[4792]: E0309 09:11:22.936508 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-c4nwc" podUID="c0aa86a9-9ed9-492f-ac14-43e14abf1f2c" Mar 09 09:11:22 crc kubenswrapper[4792]: E0309 09:11:22.936508 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-mj542" podUID="a19c063e-191a-487c-a491-0af8c6fc1e3f" Mar 09 09:11:22 crc kubenswrapper[4792]: E0309 09:11:22.936616 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-jsnbn" podUID="0667075c-38b7-4fb6-ad69-a31987eae3cc" Mar 09 09:11:23 crc kubenswrapper[4792]: E0309 09:11:23.008283 4792 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest 
list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Mar 09 09:11:23 crc kubenswrapper[4792]: E0309 09:11:23.008499 4792 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-bsfpd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-8hkmv_openshift-marketplace(047810b2-277c-4d4c-822a-98b6d2a91fcc): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context 
canceled" logger="UnhandledError" Mar 09 09:11:23 crc kubenswrapper[4792]: E0309 09:11:23.009699 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-8hkmv" podUID="047810b2-277c-4d4c-822a-98b6d2a91fcc" Mar 09 09:11:24 crc kubenswrapper[4792]: I0309 09:11:24.217014 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 09 09:11:24 crc kubenswrapper[4792]: I0309 09:11:24.219688 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 09 09:11:24 crc kubenswrapper[4792]: I0309 09:11:24.223711 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 09 09:11:24 crc kubenswrapper[4792]: I0309 09:11:24.313188 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8a30a83e-e71a-41e6-8946-34b1a6100a67-kube-api-access\") pod \"installer-9-crc\" (UID: \"8a30a83e-e71a-41e6-8946-34b1a6100a67\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 09 09:11:24 crc kubenswrapper[4792]: I0309 09:11:24.314472 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8a30a83e-e71a-41e6-8946-34b1a6100a67-kubelet-dir\") pod \"installer-9-crc\" (UID: \"8a30a83e-e71a-41e6-8946-34b1a6100a67\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 09 09:11:24 crc kubenswrapper[4792]: I0309 09:11:24.314565 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: 
\"kubernetes.io/host-path/8a30a83e-e71a-41e6-8946-34b1a6100a67-var-lock\") pod \"installer-9-crc\" (UID: \"8a30a83e-e71a-41e6-8946-34b1a6100a67\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 09 09:11:24 crc kubenswrapper[4792]: I0309 09:11:24.415985 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8a30a83e-e71a-41e6-8946-34b1a6100a67-kube-api-access\") pod \"installer-9-crc\" (UID: \"8a30a83e-e71a-41e6-8946-34b1a6100a67\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 09 09:11:24 crc kubenswrapper[4792]: I0309 09:11:24.416093 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8a30a83e-e71a-41e6-8946-34b1a6100a67-kubelet-dir\") pod \"installer-9-crc\" (UID: \"8a30a83e-e71a-41e6-8946-34b1a6100a67\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 09 09:11:24 crc kubenswrapper[4792]: I0309 09:11:24.416167 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/8a30a83e-e71a-41e6-8946-34b1a6100a67-var-lock\") pod \"installer-9-crc\" (UID: \"8a30a83e-e71a-41e6-8946-34b1a6100a67\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 09 09:11:24 crc kubenswrapper[4792]: I0309 09:11:24.416257 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/8a30a83e-e71a-41e6-8946-34b1a6100a67-var-lock\") pod \"installer-9-crc\" (UID: \"8a30a83e-e71a-41e6-8946-34b1a6100a67\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 09 09:11:24 crc kubenswrapper[4792]: I0309 09:11:24.416259 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8a30a83e-e71a-41e6-8946-34b1a6100a67-kubelet-dir\") pod \"installer-9-crc\" (UID: \"8a30a83e-e71a-41e6-8946-34b1a6100a67\") " 
pod="openshift-kube-apiserver/installer-9-crc" Mar 09 09:11:24 crc kubenswrapper[4792]: I0309 09:11:24.440940 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8a30a83e-e71a-41e6-8946-34b1a6100a67-kube-api-access\") pod \"installer-9-crc\" (UID: \"8a30a83e-e71a-41e6-8946-34b1a6100a67\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 09 09:11:24 crc kubenswrapper[4792]: I0309 09:11:24.551441 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 09 09:11:26 crc kubenswrapper[4792]: E0309 09:11:26.106217 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-8hkmv" podUID="047810b2-277c-4d4c-822a-98b6d2a91fcc" Mar 09 09:11:26 crc kubenswrapper[4792]: I0309 09:11:26.121863 4792 scope.go:117] "RemoveContainer" containerID="5acbd277116699298135b81f243d51c08ca6463049a48a041b09cf51070b10a4" Mar 09 09:11:26 crc kubenswrapper[4792]: I0309 09:11:26.611438 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-778bf95d9-pz88p"] Mar 09 09:11:26 crc kubenswrapper[4792]: E0309 09:11:26.728767 4792 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Mar 09 09:11:26 crc kubenswrapper[4792]: E0309 09:11:26.730410 4792 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs 
--catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4lq57,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-t2jhd_openshift-marketplace(69535a14-c11d-442a-837d-f1d6744cb530): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 09 09:11:26 crc kubenswrapper[4792]: E0309 09:11:26.732489 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-t2jhd" 
podUID="69535a14-c11d-442a-837d-f1d6744cb530" Mar 09 09:11:26 crc kubenswrapper[4792]: E0309 09:11:26.762886 4792 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Mar 09 09:11:26 crc kubenswrapper[4792]: E0309 09:11:26.763126 4792 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-twjdp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPol
icy:nil,} start failed in pod certified-operators-2d7nf_openshift-marketplace(50c5e5dd-62cf-470c-a626-27cca12c69fb): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 09 09:11:26 crc kubenswrapper[4792]: E0309 09:11:26.764245 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-2d7nf" podUID="50c5e5dd-62cf-470c-a626-27cca12c69fb" Mar 09 09:11:26 crc kubenswrapper[4792]: E0309 09:11:26.766827 4792 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Mar 09 09:11:26 crc kubenswrapper[4792]: E0309 09:11:26.767003 4792 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rhlfl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-wtkcv_openshift-marketplace(c0500b46-411e-4371-83ae-1b148bf65ba9): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 09 09:11:26 crc kubenswrapper[4792]: E0309 09:11:26.770816 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-wtkcv" podUID="c0500b46-411e-4371-83ae-1b148bf65ba9" Mar 09 09:11:26 crc 
kubenswrapper[4792]: I0309 09:11:26.887309 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 09 09:11:26 crc kubenswrapper[4792]: I0309 09:11:26.912983 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 09 09:11:26 crc kubenswrapper[4792]: W0309 09:11:26.918736 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod0da7b453_4d9f_4803_91d7_d4df282ed1d1.slice/crio-4d65305a25f3834525306d34b22e424da2bf195e42f7c4573f8cf17c41906a34 WatchSource:0}: Error finding container 4d65305a25f3834525306d34b22e424da2bf195e42f7c4573f8cf17c41906a34: Status 404 returned error can't find the container with id 4d65305a25f3834525306d34b22e424da2bf195e42f7c4573f8cf17c41906a34 Mar 09 09:11:27 crc kubenswrapper[4792]: I0309 09:11:27.001544 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-cf7f4798c-sgc64"] Mar 09 09:11:27 crc kubenswrapper[4792]: W0309 09:11:27.048000 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod09758db8_4f8c_4a5a_9ed6_c5afe73c5c25.slice/crio-9ec78992823be334adddab63b744af48952be3e247871522eb49719ace99946f WatchSource:0}: Error finding container 9ec78992823be334adddab63b744af48952be3e247871522eb49719ace99946f: Status 404 returned error can't find the container with id 9ec78992823be334adddab63b744af48952be3e247871522eb49719ace99946f Mar 09 09:11:27 crc kubenswrapper[4792]: I0309 09:11:27.329137 4792 generic.go:334] "Generic (PLEG): container finished" podID="8606aa7f-7b07-40df-b9b8-f415a5e68b47" containerID="01e44686b530731d6555ea659a282e8642532fd8317bb2537db520507c2a5389" exitCode=0 Mar 09 09:11:27 crc kubenswrapper[4792]: I0309 09:11:27.329549 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8wm7x" 
event={"ID":"8606aa7f-7b07-40df-b9b8-f415a5e68b47","Type":"ContainerDied","Data":"01e44686b530731d6555ea659a282e8642532fd8317bb2537db520507c2a5389"} Mar 09 09:11:27 crc kubenswrapper[4792]: I0309 09:11:27.331934 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-cf7f4798c-sgc64" event={"ID":"09758db8-4f8c-4a5a-9ed6-c5afe73c5c25","Type":"ContainerStarted","Data":"9ec78992823be334adddab63b744af48952be3e247871522eb49719ace99946f"} Mar 09 09:11:27 crc kubenswrapper[4792]: I0309 09:11:27.333761 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"8a30a83e-e71a-41e6-8946-34b1a6100a67","Type":"ContainerStarted","Data":"4a75621ef30a115b54564406f512218bc54adfca3c19749b53ae1808ed40f32e"} Mar 09 09:11:27 crc kubenswrapper[4792]: I0309 09:11:27.337033 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"0da7b453-4d9f-4803-91d7-d4df282ed1d1","Type":"ContainerStarted","Data":"4d65305a25f3834525306d34b22e424da2bf195e42f7c4573f8cf17c41906a34"} Mar 09 09:11:27 crc kubenswrapper[4792]: I0309 09:11:27.370977 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-778bf95d9-pz88p" event={"ID":"f2ed2521-f948-40d4-a374-792c139a50b8","Type":"ContainerStarted","Data":"f0b2dc7283cec3f1bdec479a367758eb4685ccf539436b603d504036b0015418"} Mar 09 09:11:27 crc kubenswrapper[4792]: I0309 09:11:27.371043 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-778bf95d9-pz88p" event={"ID":"f2ed2521-f948-40d4-a374-792c139a50b8","Type":"ContainerStarted","Data":"6066ece0ff6eb70208dc363b2ecc4bbcfd113110528455f4182aee1e14361a94"} Mar 09 09:11:27 crc kubenswrapper[4792]: I0309 09:11:27.371267 4792 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-route-controller-manager/route-controller-manager-778bf95d9-pz88p" podUID="f2ed2521-f948-40d4-a374-792c139a50b8" containerName="route-controller-manager" containerID="cri-o://f0b2dc7283cec3f1bdec479a367758eb4685ccf539436b603d504036b0015418" gracePeriod=30 Mar 09 09:11:27 crc kubenswrapper[4792]: I0309 09:11:27.372055 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-778bf95d9-pz88p" Mar 09 09:11:27 crc kubenswrapper[4792]: E0309 09:11:27.393278 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-wtkcv" podUID="c0500b46-411e-4371-83ae-1b148bf65ba9" Mar 09 09:11:27 crc kubenswrapper[4792]: E0309 09:11:27.393359 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-t2jhd" podUID="69535a14-c11d-442a-837d-f1d6744cb530" Mar 09 09:11:27 crc kubenswrapper[4792]: E0309 09:11:27.393402 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-2d7nf" podUID="50c5e5dd-62cf-470c-a626-27cca12c69fb" Mar 09 09:11:27 crc kubenswrapper[4792]: I0309 09:11:27.435045 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-778bf95d9-pz88p" podStartSLOduration=26.435023663 podStartE2EDuration="26.435023663s" podCreationTimestamp="2026-03-09 09:11:01 +0000 
UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:11:27.403290601 +0000 UTC m=+252.433491353" watchObservedRunningTime="2026-03-09 09:11:27.435023663 +0000 UTC m=+252.465224415" Mar 09 09:11:27 crc kubenswrapper[4792]: I0309 09:11:27.987969 4792 patch_prober.go:28] interesting pod/route-controller-manager-778bf95d9-pz88p container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.57:8443/healthz\": read tcp 10.217.0.2:33260->10.217.0.57:8443: read: connection reset by peer" start-of-body= Mar 09 09:11:27 crc kubenswrapper[4792]: I0309 09:11:27.988959 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-778bf95d9-pz88p" podUID="f2ed2521-f948-40d4-a374-792c139a50b8" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.57:8443/healthz\": read tcp 10.217.0.2:33260->10.217.0.57:8443: read: connection reset by peer" Mar 09 09:11:28 crc kubenswrapper[4792]: I0309 09:11:28.341692 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-route-controller-manager_route-controller-manager-778bf95d9-pz88p_f2ed2521-f948-40d4-a374-792c139a50b8/route-controller-manager/0.log" Mar 09 09:11:28 crc kubenswrapper[4792]: I0309 09:11:28.343533 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-778bf95d9-pz88p" Mar 09 09:11:28 crc kubenswrapper[4792]: I0309 09:11:28.391097 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6f9c9656d9-rkhjb"] Mar 09 09:11:28 crc kubenswrapper[4792]: E0309 09:11:28.391362 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2ed2521-f948-40d4-a374-792c139a50b8" containerName="route-controller-manager" Mar 09 09:11:28 crc kubenswrapper[4792]: I0309 09:11:28.391376 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2ed2521-f948-40d4-a374-792c139a50b8" containerName="route-controller-manager" Mar 09 09:11:28 crc kubenswrapper[4792]: I0309 09:11:28.391470 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="f2ed2521-f948-40d4-a374-792c139a50b8" containerName="route-controller-manager" Mar 09 09:11:28 crc kubenswrapper[4792]: I0309 09:11:28.391911 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6f9c9656d9-rkhjb" Mar 09 09:11:28 crc kubenswrapper[4792]: I0309 09:11:28.408270 4792 csr.go:261] certificate signing request csr-zd9m7 is approved, waiting to be issued Mar 09 09:11:28 crc kubenswrapper[4792]: I0309 09:11:28.408583 4792 csr.go:257] certificate signing request csr-zd9m7 is issued Mar 09 09:11:28 crc kubenswrapper[4792]: I0309 09:11:28.421090 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"8a30a83e-e71a-41e6-8946-34b1a6100a67","Type":"ContainerStarted","Data":"db1b80fa0eb69522fef8d9cb45d09c0c9fcf92dd3f0839f56ab32457397e3d50"} Mar 09 09:11:28 crc kubenswrapper[4792]: I0309 09:11:28.422435 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6f9c9656d9-rkhjb"] Mar 09 09:11:28 crc kubenswrapper[4792]: I0309 09:11:28.425207 4792 generic.go:334] "Generic (PLEG): container finished" podID="0da7b453-4d9f-4803-91d7-d4df282ed1d1" containerID="2ccdb718f0f55d340eb8f10dbfff1b16cf3cc6e2992637e8d9c841345e8ff928" exitCode=0 Mar 09 09:11:28 crc kubenswrapper[4792]: I0309 09:11:28.425444 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"0da7b453-4d9f-4803-91d7-d4df282ed1d1","Type":"ContainerDied","Data":"2ccdb718f0f55d340eb8f10dbfff1b16cf3cc6e2992637e8d9c841345e8ff928"} Mar 09 09:11:28 crc kubenswrapper[4792]: I0309 09:11:28.434389 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550790-q6wbp" event={"ID":"d3934c8c-f197-4ef6-ac5c-76560a192e50","Type":"ContainerStarted","Data":"ab562f4fb50dcd9d9211681341a256c3d39b392c0b54c9c2086d028f18d90ba4"} Mar 09 09:11:28 crc kubenswrapper[4792]: I0309 09:11:28.436985 4792 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-route-controller-manager_route-controller-manager-778bf95d9-pz88p_f2ed2521-f948-40d4-a374-792c139a50b8/route-controller-manager/0.log" Mar 09 09:11:28 crc kubenswrapper[4792]: I0309 09:11:28.437026 4792 generic.go:334] "Generic (PLEG): container finished" podID="f2ed2521-f948-40d4-a374-792c139a50b8" containerID="f0b2dc7283cec3f1bdec479a367758eb4685ccf539436b603d504036b0015418" exitCode=255 Mar 09 09:11:28 crc kubenswrapper[4792]: I0309 09:11:28.437130 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-778bf95d9-pz88p" event={"ID":"f2ed2521-f948-40d4-a374-792c139a50b8","Type":"ContainerDied","Data":"f0b2dc7283cec3f1bdec479a367758eb4685ccf539436b603d504036b0015418"} Mar 09 09:11:28 crc kubenswrapper[4792]: I0309 09:11:28.437156 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-778bf95d9-pz88p" event={"ID":"f2ed2521-f948-40d4-a374-792c139a50b8","Type":"ContainerDied","Data":"6066ece0ff6eb70208dc363b2ecc4bbcfd113110528455f4182aee1e14361a94"} Mar 09 09:11:28 crc kubenswrapper[4792]: I0309 09:11:28.437176 4792 scope.go:117] "RemoveContainer" containerID="f0b2dc7283cec3f1bdec479a367758eb4685ccf539436b603d504036b0015418" Mar 09 09:11:28 crc kubenswrapper[4792]: I0309 09:11:28.437340 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-778bf95d9-pz88p" Mar 09 09:11:28 crc kubenswrapper[4792]: I0309 09:11:28.452824 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=4.452804943 podStartE2EDuration="4.452804943s" podCreationTimestamp="2026-03-09 09:11:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:11:28.451789473 +0000 UTC m=+253.481990225" watchObservedRunningTime="2026-03-09 09:11:28.452804943 +0000 UTC m=+253.483005705" Mar 09 09:11:28 crc kubenswrapper[4792]: I0309 09:11:28.462330 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8wm7x" event={"ID":"8606aa7f-7b07-40df-b9b8-f415a5e68b47","Type":"ContainerStarted","Data":"8278f4c556a3c361ee76de1ec2559c92d8351c0452d5c28590575a683e28f53f"} Mar 09 09:11:28 crc kubenswrapper[4792]: I0309 09:11:28.471147 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-cf7f4798c-sgc64" event={"ID":"09758db8-4f8c-4a5a-9ed6-c5afe73c5c25","Type":"ContainerStarted","Data":"02ec6e5ae9072be566a49366e9cb1edc75daa28b5e2b21c083a0f61f6ca96c43"} Mar 09 09:11:28 crc kubenswrapper[4792]: I0309 09:11:28.471363 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-cf7f4798c-sgc64" podUID="09758db8-4f8c-4a5a-9ed6-c5afe73c5c25" containerName="controller-manager" containerID="cri-o://02ec6e5ae9072be566a49366e9cb1edc75daa28b5e2b21c083a0f61f6ca96c43" gracePeriod=30 Mar 09 09:11:28 crc kubenswrapper[4792]: I0309 09:11:28.471980 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-cf7f4798c-sgc64" Mar 09 09:11:28 crc kubenswrapper[4792]: I0309 09:11:28.478792 
4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-cf7f4798c-sgc64" Mar 09 09:11:28 crc kubenswrapper[4792]: I0309 09:11:28.503778 4792 scope.go:117] "RemoveContainer" containerID="f0b2dc7283cec3f1bdec479a367758eb4685ccf539436b603d504036b0015418" Mar 09 09:11:28 crc kubenswrapper[4792]: I0309 09:11:28.506091 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-8wm7x" podStartSLOduration=5.22782014 podStartE2EDuration="47.506040625s" podCreationTimestamp="2026-03-09 09:10:41 +0000 UTC" firstStartedPulling="2026-03-09 09:10:45.601334344 +0000 UTC m=+210.631535096" lastFinishedPulling="2026-03-09 09:11:27.879554829 +0000 UTC m=+252.909755581" observedRunningTime="2026-03-09 09:11:28.492397247 +0000 UTC m=+253.522598009" watchObservedRunningTime="2026-03-09 09:11:28.506040625 +0000 UTC m=+253.536241377" Mar 09 09:11:28 crc kubenswrapper[4792]: E0309 09:11:28.508280 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f0b2dc7283cec3f1bdec479a367758eb4685ccf539436b603d504036b0015418\": container with ID starting with f0b2dc7283cec3f1bdec479a367758eb4685ccf539436b603d504036b0015418 not found: ID does not exist" containerID="f0b2dc7283cec3f1bdec479a367758eb4685ccf539436b603d504036b0015418" Mar 09 09:11:28 crc kubenswrapper[4792]: I0309 09:11:28.508339 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f0b2dc7283cec3f1bdec479a367758eb4685ccf539436b603d504036b0015418"} err="failed to get container status \"f0b2dc7283cec3f1bdec479a367758eb4685ccf539436b603d504036b0015418\": rpc error: code = NotFound desc = could not find container \"f0b2dc7283cec3f1bdec479a367758eb4685ccf539436b603d504036b0015418\": container with ID starting with f0b2dc7283cec3f1bdec479a367758eb4685ccf539436b603d504036b0015418 not found: ID does 
not exist" Mar 09 09:11:28 crc kubenswrapper[4792]: I0309 09:11:28.508979 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f2ed2521-f948-40d4-a374-792c139a50b8-config\") pod \"f2ed2521-f948-40d4-a374-792c139a50b8\" (UID: \"f2ed2521-f948-40d4-a374-792c139a50b8\") " Mar 09 09:11:28 crc kubenswrapper[4792]: I0309 09:11:28.509259 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lc2pz\" (UniqueName: \"kubernetes.io/projected/f2ed2521-f948-40d4-a374-792c139a50b8-kube-api-access-lc2pz\") pod \"f2ed2521-f948-40d4-a374-792c139a50b8\" (UID: \"f2ed2521-f948-40d4-a374-792c139a50b8\") " Mar 09 09:11:28 crc kubenswrapper[4792]: I0309 09:11:28.509375 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f2ed2521-f948-40d4-a374-792c139a50b8-client-ca\") pod \"f2ed2521-f948-40d4-a374-792c139a50b8\" (UID: \"f2ed2521-f948-40d4-a374-792c139a50b8\") " Mar 09 09:11:28 crc kubenswrapper[4792]: I0309 09:11:28.509464 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f2ed2521-f948-40d4-a374-792c139a50b8-serving-cert\") pod \"f2ed2521-f948-40d4-a374-792c139a50b8\" (UID: \"f2ed2521-f948-40d4-a374-792c139a50b8\") " Mar 09 09:11:28 crc kubenswrapper[4792]: I0309 09:11:28.509813 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fa9edf1d-0ef2-4b57-8cf0-2cf8b3cd1eb9-serving-cert\") pod \"route-controller-manager-6f9c9656d9-rkhjb\" (UID: \"fa9edf1d-0ef2-4b57-8cf0-2cf8b3cd1eb9\") " pod="openshift-route-controller-manager/route-controller-manager-6f9c9656d9-rkhjb" Mar 09 09:11:28 crc kubenswrapper[4792]: I0309 09:11:28.510115 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/f2ed2521-f948-40d4-a374-792c139a50b8-config" (OuterVolumeSpecName: "config") pod "f2ed2521-f948-40d4-a374-792c139a50b8" (UID: "f2ed2521-f948-40d4-a374-792c139a50b8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:11:28 crc kubenswrapper[4792]: I0309 09:11:28.510405 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f2ed2521-f948-40d4-a374-792c139a50b8-client-ca" (OuterVolumeSpecName: "client-ca") pod "f2ed2521-f948-40d4-a374-792c139a50b8" (UID: "f2ed2521-f948-40d4-a374-792c139a50b8"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:11:28 crc kubenswrapper[4792]: I0309 09:11:28.511678 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29550790-q6wbp" podStartSLOduration=35.386242546 podStartE2EDuration="1m28.511663764s" podCreationTimestamp="2026-03-09 09:10:00 +0000 UTC" firstStartedPulling="2026-03-09 09:10:33.552029206 +0000 UTC m=+198.582229958" lastFinishedPulling="2026-03-09 09:11:26.677450424 +0000 UTC m=+251.707651176" observedRunningTime="2026-03-09 09:11:28.510744928 +0000 UTC m=+253.540945680" watchObservedRunningTime="2026-03-09 09:11:28.511663764 +0000 UTC m=+253.541864516" Mar 09 09:11:28 crc kubenswrapper[4792]: I0309 09:11:28.511973 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l8mck\" (UniqueName: \"kubernetes.io/projected/fa9edf1d-0ef2-4b57-8cf0-2cf8b3cd1eb9-kube-api-access-l8mck\") pod \"route-controller-manager-6f9c9656d9-rkhjb\" (UID: \"fa9edf1d-0ef2-4b57-8cf0-2cf8b3cd1eb9\") " pod="openshift-route-controller-manager/route-controller-manager-6f9c9656d9-rkhjb" Mar 09 09:11:28 crc kubenswrapper[4792]: I0309 09:11:28.512038 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" 
(UniqueName: \"kubernetes.io/configmap/fa9edf1d-0ef2-4b57-8cf0-2cf8b3cd1eb9-client-ca\") pod \"route-controller-manager-6f9c9656d9-rkhjb\" (UID: \"fa9edf1d-0ef2-4b57-8cf0-2cf8b3cd1eb9\") " pod="openshift-route-controller-manager/route-controller-manager-6f9c9656d9-rkhjb" Mar 09 09:11:28 crc kubenswrapper[4792]: I0309 09:11:28.512223 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fa9edf1d-0ef2-4b57-8cf0-2cf8b3cd1eb9-config\") pod \"route-controller-manager-6f9c9656d9-rkhjb\" (UID: \"fa9edf1d-0ef2-4b57-8cf0-2cf8b3cd1eb9\") " pod="openshift-route-controller-manager/route-controller-manager-6f9c9656d9-rkhjb" Mar 09 09:11:28 crc kubenswrapper[4792]: I0309 09:11:28.516566 4792 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f2ed2521-f948-40d4-a374-792c139a50b8-config\") on node \"crc\" DevicePath \"\"" Mar 09 09:11:28 crc kubenswrapper[4792]: I0309 09:11:28.522253 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f2ed2521-f948-40d4-a374-792c139a50b8-kube-api-access-lc2pz" (OuterVolumeSpecName: "kube-api-access-lc2pz") pod "f2ed2521-f948-40d4-a374-792c139a50b8" (UID: "f2ed2521-f948-40d4-a374-792c139a50b8"). InnerVolumeSpecName "kube-api-access-lc2pz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:11:28 crc kubenswrapper[4792]: I0309 09:11:28.522250 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2ed2521-f948-40d4-a374-792c139a50b8-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "f2ed2521-f948-40d4-a374-792c139a50b8" (UID: "f2ed2521-f948-40d4-a374-792c139a50b8"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:11:28 crc kubenswrapper[4792]: I0309 09:11:28.580239 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-cf7f4798c-sgc64" podStartSLOduration=28.580213672 podStartE2EDuration="28.580213672s" podCreationTimestamp="2026-03-09 09:11:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:11:28.55937138 +0000 UTC m=+253.589572162" watchObservedRunningTime="2026-03-09 09:11:28.580213672 +0000 UTC m=+253.610414424" Mar 09 09:11:28 crc kubenswrapper[4792]: I0309 09:11:28.619167 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fa9edf1d-0ef2-4b57-8cf0-2cf8b3cd1eb9-serving-cert\") pod \"route-controller-manager-6f9c9656d9-rkhjb\" (UID: \"fa9edf1d-0ef2-4b57-8cf0-2cf8b3cd1eb9\") " pod="openshift-route-controller-manager/route-controller-manager-6f9c9656d9-rkhjb" Mar 09 09:11:28 crc kubenswrapper[4792]: I0309 09:11:28.620053 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l8mck\" (UniqueName: \"kubernetes.io/projected/fa9edf1d-0ef2-4b57-8cf0-2cf8b3cd1eb9-kube-api-access-l8mck\") pod \"route-controller-manager-6f9c9656d9-rkhjb\" (UID: \"fa9edf1d-0ef2-4b57-8cf0-2cf8b3cd1eb9\") " pod="openshift-route-controller-manager/route-controller-manager-6f9c9656d9-rkhjb" Mar 09 09:11:28 crc kubenswrapper[4792]: I0309 09:11:28.620186 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fa9edf1d-0ef2-4b57-8cf0-2cf8b3cd1eb9-client-ca\") pod \"route-controller-manager-6f9c9656d9-rkhjb\" (UID: \"fa9edf1d-0ef2-4b57-8cf0-2cf8b3cd1eb9\") " pod="openshift-route-controller-manager/route-controller-manager-6f9c9656d9-rkhjb" Mar 09 09:11:28 crc 
kubenswrapper[4792]: I0309 09:11:28.624799 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fa9edf1d-0ef2-4b57-8cf0-2cf8b3cd1eb9-config\") pod \"route-controller-manager-6f9c9656d9-rkhjb\" (UID: \"fa9edf1d-0ef2-4b57-8cf0-2cf8b3cd1eb9\") " pod="openshift-route-controller-manager/route-controller-manager-6f9c9656d9-rkhjb" Mar 09 09:11:28 crc kubenswrapper[4792]: I0309 09:11:28.625251 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lc2pz\" (UniqueName: \"kubernetes.io/projected/f2ed2521-f948-40d4-a374-792c139a50b8-kube-api-access-lc2pz\") on node \"crc\" DevicePath \"\"" Mar 09 09:11:28 crc kubenswrapper[4792]: I0309 09:11:28.630145 4792 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f2ed2521-f948-40d4-a374-792c139a50b8-client-ca\") on node \"crc\" DevicePath \"\"" Mar 09 09:11:28 crc kubenswrapper[4792]: I0309 09:11:28.630166 4792 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f2ed2521-f948-40d4-a374-792c139a50b8-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 09:11:28 crc kubenswrapper[4792]: I0309 09:11:28.621703 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fa9edf1d-0ef2-4b57-8cf0-2cf8b3cd1eb9-client-ca\") pod \"route-controller-manager-6f9c9656d9-rkhjb\" (UID: \"fa9edf1d-0ef2-4b57-8cf0-2cf8b3cd1eb9\") " pod="openshift-route-controller-manager/route-controller-manager-6f9c9656d9-rkhjb" Mar 09 09:11:28 crc kubenswrapper[4792]: I0309 09:11:28.630246 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fa9edf1d-0ef2-4b57-8cf0-2cf8b3cd1eb9-config\") pod \"route-controller-manager-6f9c9656d9-rkhjb\" (UID: \"fa9edf1d-0ef2-4b57-8cf0-2cf8b3cd1eb9\") " 
pod="openshift-route-controller-manager/route-controller-manager-6f9c9656d9-rkhjb" Mar 09 09:11:28 crc kubenswrapper[4792]: I0309 09:11:28.629362 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fa9edf1d-0ef2-4b57-8cf0-2cf8b3cd1eb9-serving-cert\") pod \"route-controller-manager-6f9c9656d9-rkhjb\" (UID: \"fa9edf1d-0ef2-4b57-8cf0-2cf8b3cd1eb9\") " pod="openshift-route-controller-manager/route-controller-manager-6f9c9656d9-rkhjb" Mar 09 09:11:28 crc kubenswrapper[4792]: I0309 09:11:28.646418 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l8mck\" (UniqueName: \"kubernetes.io/projected/fa9edf1d-0ef2-4b57-8cf0-2cf8b3cd1eb9-kube-api-access-l8mck\") pod \"route-controller-manager-6f9c9656d9-rkhjb\" (UID: \"fa9edf1d-0ef2-4b57-8cf0-2cf8b3cd1eb9\") " pod="openshift-route-controller-manager/route-controller-manager-6f9c9656d9-rkhjb" Mar 09 09:11:28 crc kubenswrapper[4792]: I0309 09:11:28.734921 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6f9c9656d9-rkhjb" Mar 09 09:11:28 crc kubenswrapper[4792]: I0309 09:11:28.790252 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-778bf95d9-pz88p"] Mar 09 09:11:28 crc kubenswrapper[4792]: I0309 09:11:28.802088 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-778bf95d9-pz88p"] Mar 09 09:11:28 crc kubenswrapper[4792]: I0309 09:11:28.822367 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-cf7f4798c-sgc64" Mar 09 09:11:28 crc kubenswrapper[4792]: I0309 09:11:28.935860 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09758db8-4f8c-4a5a-9ed6-c5afe73c5c25-config\") pod \"09758db8-4f8c-4a5a-9ed6-c5afe73c5c25\" (UID: \"09758db8-4f8c-4a5a-9ed6-c5afe73c5c25\") " Mar 09 09:11:28 crc kubenswrapper[4792]: I0309 09:11:28.936423 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/09758db8-4f8c-4a5a-9ed6-c5afe73c5c25-proxy-ca-bundles\") pod \"09758db8-4f8c-4a5a-9ed6-c5afe73c5c25\" (UID: \"09758db8-4f8c-4a5a-9ed6-c5afe73c5c25\") " Mar 09 09:11:28 crc kubenswrapper[4792]: I0309 09:11:28.936474 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09758db8-4f8c-4a5a-9ed6-c5afe73c5c25-serving-cert\") pod \"09758db8-4f8c-4a5a-9ed6-c5afe73c5c25\" (UID: \"09758db8-4f8c-4a5a-9ed6-c5afe73c5c25\") " Mar 09 09:11:28 crc kubenswrapper[4792]: I0309 09:11:28.936511 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7fhp4\" (UniqueName: \"kubernetes.io/projected/09758db8-4f8c-4a5a-9ed6-c5afe73c5c25-kube-api-access-7fhp4\") pod \"09758db8-4f8c-4a5a-9ed6-c5afe73c5c25\" (UID: \"09758db8-4f8c-4a5a-9ed6-c5afe73c5c25\") " Mar 09 09:11:28 crc kubenswrapper[4792]: I0309 09:11:28.936643 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/09758db8-4f8c-4a5a-9ed6-c5afe73c5c25-client-ca\") pod \"09758db8-4f8c-4a5a-9ed6-c5afe73c5c25\" (UID: \"09758db8-4f8c-4a5a-9ed6-c5afe73c5c25\") " Mar 09 09:11:28 crc kubenswrapper[4792]: I0309 09:11:28.937162 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/09758db8-4f8c-4a5a-9ed6-c5afe73c5c25-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "09758db8-4f8c-4a5a-9ed6-c5afe73c5c25" (UID: "09758db8-4f8c-4a5a-9ed6-c5afe73c5c25"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:11:28 crc kubenswrapper[4792]: I0309 09:11:28.937548 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09758db8-4f8c-4a5a-9ed6-c5afe73c5c25-client-ca" (OuterVolumeSpecName: "client-ca") pod "09758db8-4f8c-4a5a-9ed6-c5afe73c5c25" (UID: "09758db8-4f8c-4a5a-9ed6-c5afe73c5c25"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:11:28 crc kubenswrapper[4792]: I0309 09:11:28.937943 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09758db8-4f8c-4a5a-9ed6-c5afe73c5c25-config" (OuterVolumeSpecName: "config") pod "09758db8-4f8c-4a5a-9ed6-c5afe73c5c25" (UID: "09758db8-4f8c-4a5a-9ed6-c5afe73c5c25"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:11:28 crc kubenswrapper[4792]: I0309 09:11:28.947612 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09758db8-4f8c-4a5a-9ed6-c5afe73c5c25-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09758db8-4f8c-4a5a-9ed6-c5afe73c5c25" (UID: "09758db8-4f8c-4a5a-9ed6-c5afe73c5c25"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:11:28 crc kubenswrapper[4792]: I0309 09:11:28.948576 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09758db8-4f8c-4a5a-9ed6-c5afe73c5c25-kube-api-access-7fhp4" (OuterVolumeSpecName: "kube-api-access-7fhp4") pod "09758db8-4f8c-4a5a-9ed6-c5afe73c5c25" (UID: "09758db8-4f8c-4a5a-9ed6-c5afe73c5c25"). InnerVolumeSpecName "kube-api-access-7fhp4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:11:29 crc kubenswrapper[4792]: I0309 09:11:29.001708 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6f9c9656d9-rkhjb"] Mar 09 09:11:29 crc kubenswrapper[4792]: W0309 09:11:29.015561 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfa9edf1d_0ef2_4b57_8cf0_2cf8b3cd1eb9.slice/crio-409b9a0f236ae1aa00ef49f6652a06c7ee1f22450d3bd9238856d29766672194 WatchSource:0}: Error finding container 409b9a0f236ae1aa00ef49f6652a06c7ee1f22450d3bd9238856d29766672194: Status 404 returned error can't find the container with id 409b9a0f236ae1aa00ef49f6652a06c7ee1f22450d3bd9238856d29766672194 Mar 09 09:11:29 crc kubenswrapper[4792]: I0309 09:11:29.037953 4792 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09758db8-4f8c-4a5a-9ed6-c5afe73c5c25-config\") on node \"crc\" DevicePath \"\"" Mar 09 09:11:29 crc kubenswrapper[4792]: I0309 09:11:29.037988 4792 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/09758db8-4f8c-4a5a-9ed6-c5afe73c5c25-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 09 09:11:29 crc kubenswrapper[4792]: I0309 09:11:29.038004 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7fhp4\" (UniqueName: \"kubernetes.io/projected/09758db8-4f8c-4a5a-9ed6-c5afe73c5c25-kube-api-access-7fhp4\") on node \"crc\" DevicePath \"\"" Mar 09 09:11:29 crc kubenswrapper[4792]: I0309 09:11:29.038014 4792 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09758db8-4f8c-4a5a-9ed6-c5afe73c5c25-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 09:11:29 crc kubenswrapper[4792]: I0309 09:11:29.038023 4792 reconciler_common.go:293] "Volume detached for volume \"client-ca\" 
(UniqueName: \"kubernetes.io/configmap/09758db8-4f8c-4a5a-9ed6-c5afe73c5c25-client-ca\") on node \"crc\" DevicePath \"\"" Mar 09 09:11:29 crc kubenswrapper[4792]: I0309 09:11:29.409590 4792 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2027-01-08 20:33:09.106243901 +0000 UTC Mar 09 09:11:29 crc kubenswrapper[4792]: I0309 09:11:29.410111 4792 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 7331h21m39.696137166s for next certificate rotation Mar 09 09:11:29 crc kubenswrapper[4792]: I0309 09:11:29.483744 4792 generic.go:334] "Generic (PLEG): container finished" podID="d3934c8c-f197-4ef6-ac5c-76560a192e50" containerID="ab562f4fb50dcd9d9211681341a256c3d39b392c0b54c9c2086d028f18d90ba4" exitCode=0 Mar 09 09:11:29 crc kubenswrapper[4792]: I0309 09:11:29.483828 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550790-q6wbp" event={"ID":"d3934c8c-f197-4ef6-ac5c-76560a192e50","Type":"ContainerDied","Data":"ab562f4fb50dcd9d9211681341a256c3d39b392c0b54c9c2086d028f18d90ba4"} Mar 09 09:11:29 crc kubenswrapper[4792]: I0309 09:11:29.488512 4792 generic.go:334] "Generic (PLEG): container finished" podID="09758db8-4f8c-4a5a-9ed6-c5afe73c5c25" containerID="02ec6e5ae9072be566a49366e9cb1edc75daa28b5e2b21c083a0f61f6ca96c43" exitCode=0 Mar 09 09:11:29 crc kubenswrapper[4792]: I0309 09:11:29.488634 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-cf7f4798c-sgc64" Mar 09 09:11:29 crc kubenswrapper[4792]: I0309 09:11:29.489511 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-cf7f4798c-sgc64" event={"ID":"09758db8-4f8c-4a5a-9ed6-c5afe73c5c25","Type":"ContainerDied","Data":"02ec6e5ae9072be566a49366e9cb1edc75daa28b5e2b21c083a0f61f6ca96c43"} Mar 09 09:11:29 crc kubenswrapper[4792]: I0309 09:11:29.489541 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-cf7f4798c-sgc64" event={"ID":"09758db8-4f8c-4a5a-9ed6-c5afe73c5c25","Type":"ContainerDied","Data":"9ec78992823be334adddab63b744af48952be3e247871522eb49719ace99946f"} Mar 09 09:11:29 crc kubenswrapper[4792]: I0309 09:11:29.489563 4792 scope.go:117] "RemoveContainer" containerID="02ec6e5ae9072be566a49366e9cb1edc75daa28b5e2b21c083a0f61f6ca96c43" Mar 09 09:11:29 crc kubenswrapper[4792]: I0309 09:11:29.492182 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6f9c9656d9-rkhjb" event={"ID":"fa9edf1d-0ef2-4b57-8cf0-2cf8b3cd1eb9","Type":"ContainerStarted","Data":"8ab6e582992e9ebd34d0f2f59b5bd3c222d0eb245f7ea94443772bdd69da5e69"} Mar 09 09:11:29 crc kubenswrapper[4792]: I0309 09:11:29.492202 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6f9c9656d9-rkhjb" event={"ID":"fa9edf1d-0ef2-4b57-8cf0-2cf8b3cd1eb9","Type":"ContainerStarted","Data":"409b9a0f236ae1aa00ef49f6652a06c7ee1f22450d3bd9238856d29766672194"} Mar 09 09:11:29 crc kubenswrapper[4792]: I0309 09:11:29.495464 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6f9c9656d9-rkhjb" Mar 09 09:11:29 crc kubenswrapper[4792]: I0309 09:11:29.524086 4792 scope.go:117] "RemoveContainer" 
containerID="02ec6e5ae9072be566a49366e9cb1edc75daa28b5e2b21c083a0f61f6ca96c43" Mar 09 09:11:29 crc kubenswrapper[4792]: E0309 09:11:29.524640 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"02ec6e5ae9072be566a49366e9cb1edc75daa28b5e2b21c083a0f61f6ca96c43\": container with ID starting with 02ec6e5ae9072be566a49366e9cb1edc75daa28b5e2b21c083a0f61f6ca96c43 not found: ID does not exist" containerID="02ec6e5ae9072be566a49366e9cb1edc75daa28b5e2b21c083a0f61f6ca96c43" Mar 09 09:11:29 crc kubenswrapper[4792]: I0309 09:11:29.524672 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02ec6e5ae9072be566a49366e9cb1edc75daa28b5e2b21c083a0f61f6ca96c43"} err="failed to get container status \"02ec6e5ae9072be566a49366e9cb1edc75daa28b5e2b21c083a0f61f6ca96c43\": rpc error: code = NotFound desc = could not find container \"02ec6e5ae9072be566a49366e9cb1edc75daa28b5e2b21c083a0f61f6ca96c43\": container with ID starting with 02ec6e5ae9072be566a49366e9cb1edc75daa28b5e2b21c083a0f61f6ca96c43 not found: ID does not exist" Mar 09 09:11:29 crc kubenswrapper[4792]: I0309 09:11:29.555013 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6f9c9656d9-rkhjb" podStartSLOduration=8.55499478 podStartE2EDuration="8.55499478s" podCreationTimestamp="2026-03-09 09:11:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:11:29.533678045 +0000 UTC m=+254.563878797" watchObservedRunningTime="2026-03-09 09:11:29.55499478 +0000 UTC m=+254.585195532" Mar 09 09:11:29 crc kubenswrapper[4792]: I0309 09:11:29.556128 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-cf7f4798c-sgc64"] Mar 09 09:11:29 crc kubenswrapper[4792]: I0309 09:11:29.558381 4792 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-cf7f4798c-sgc64"] Mar 09 09:11:29 crc kubenswrapper[4792]: I0309 09:11:29.581465 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6f9c9656d9-rkhjb" Mar 09 09:11:29 crc kubenswrapper[4792]: I0309 09:11:29.671736 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09758db8-4f8c-4a5a-9ed6-c5afe73c5c25" path="/var/lib/kubelet/pods/09758db8-4f8c-4a5a-9ed6-c5afe73c5c25/volumes" Mar 09 09:11:29 crc kubenswrapper[4792]: I0309 09:11:29.672305 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f2ed2521-f948-40d4-a374-792c139a50b8" path="/var/lib/kubelet/pods/f2ed2521-f948-40d4-a374-792c139a50b8/volumes" Mar 09 09:11:29 crc kubenswrapper[4792]: I0309 09:11:29.840645 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 09 09:11:29 crc kubenswrapper[4792]: I0309 09:11:29.953026 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0da7b453-4d9f-4803-91d7-d4df282ed1d1-kubelet-dir\") pod \"0da7b453-4d9f-4803-91d7-d4df282ed1d1\" (UID: \"0da7b453-4d9f-4803-91d7-d4df282ed1d1\") " Mar 09 09:11:29 crc kubenswrapper[4792]: I0309 09:11:29.953173 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0da7b453-4d9f-4803-91d7-d4df282ed1d1-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "0da7b453-4d9f-4803-91d7-d4df282ed1d1" (UID: "0da7b453-4d9f-4803-91d7-d4df282ed1d1"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 09:11:29 crc kubenswrapper[4792]: I0309 09:11:29.953250 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0da7b453-4d9f-4803-91d7-d4df282ed1d1-kube-api-access\") pod \"0da7b453-4d9f-4803-91d7-d4df282ed1d1\" (UID: \"0da7b453-4d9f-4803-91d7-d4df282ed1d1\") " Mar 09 09:11:29 crc kubenswrapper[4792]: I0309 09:11:29.953506 4792 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0da7b453-4d9f-4803-91d7-d4df282ed1d1-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 09 09:11:29 crc kubenswrapper[4792]: I0309 09:11:29.959712 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0da7b453-4d9f-4803-91d7-d4df282ed1d1-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0da7b453-4d9f-4803-91d7-d4df282ed1d1" (UID: "0da7b453-4d9f-4803-91d7-d4df282ed1d1"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:11:30 crc kubenswrapper[4792]: I0309 09:11:30.054551 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0da7b453-4d9f-4803-91d7-d4df282ed1d1-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 09 09:11:30 crc kubenswrapper[4792]: I0309 09:11:30.503249 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 09 09:11:30 crc kubenswrapper[4792]: I0309 09:11:30.503278 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"0da7b453-4d9f-4803-91d7-d4df282ed1d1","Type":"ContainerDied","Data":"4d65305a25f3834525306d34b22e424da2bf195e42f7c4573f8cf17c41906a34"} Mar 09 09:11:30 crc kubenswrapper[4792]: I0309 09:11:30.503541 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4d65305a25f3834525306d34b22e424da2bf195e42f7c4573f8cf17c41906a34" Mar 09 09:11:30 crc kubenswrapper[4792]: I0309 09:11:30.660561 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-8d69864c9-v4m9r"] Mar 09 09:11:30 crc kubenswrapper[4792]: E0309 09:11:30.660833 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09758db8-4f8c-4a5a-9ed6-c5afe73c5c25" containerName="controller-manager" Mar 09 09:11:30 crc kubenswrapper[4792]: I0309 09:11:30.660846 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="09758db8-4f8c-4a5a-9ed6-c5afe73c5c25" containerName="controller-manager" Mar 09 09:11:30 crc kubenswrapper[4792]: E0309 09:11:30.660863 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0da7b453-4d9f-4803-91d7-d4df282ed1d1" containerName="pruner" Mar 09 09:11:30 crc kubenswrapper[4792]: I0309 09:11:30.660872 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="0da7b453-4d9f-4803-91d7-d4df282ed1d1" containerName="pruner" Mar 09 09:11:30 crc kubenswrapper[4792]: I0309 09:11:30.661008 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="09758db8-4f8c-4a5a-9ed6-c5afe73c5c25" containerName="controller-manager" Mar 09 09:11:30 crc kubenswrapper[4792]: I0309 09:11:30.661021 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="0da7b453-4d9f-4803-91d7-d4df282ed1d1" containerName="pruner" Mar 09 09:11:30 
crc kubenswrapper[4792]: I0309 09:11:30.661467 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-8d69864c9-v4m9r" Mar 09 09:11:30 crc kubenswrapper[4792]: I0309 09:11:30.665685 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 09 09:11:30 crc kubenswrapper[4792]: I0309 09:11:30.665890 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 09 09:11:30 crc kubenswrapper[4792]: I0309 09:11:30.666502 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 09 09:11:30 crc kubenswrapper[4792]: I0309 09:11:30.675244 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 09 09:11:30 crc kubenswrapper[4792]: I0309 09:11:30.675505 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 09 09:11:30 crc kubenswrapper[4792]: I0309 09:11:30.675605 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 09 09:11:30 crc kubenswrapper[4792]: I0309 09:11:30.680781 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-8d69864c9-v4m9r"] Mar 09 09:11:30 crc kubenswrapper[4792]: I0309 09:11:30.680973 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 09 09:11:30 crc kubenswrapper[4792]: I0309 09:11:30.765225 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5c1159c9-ec48-4f11-98d0-893333a427e2-proxy-ca-bundles\") pod 
\"controller-manager-8d69864c9-v4m9r\" (UID: \"5c1159c9-ec48-4f11-98d0-893333a427e2\") " pod="openshift-controller-manager/controller-manager-8d69864c9-v4m9r" Mar 09 09:11:30 crc kubenswrapper[4792]: I0309 09:11:30.765285 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-npvzc\" (UniqueName: \"kubernetes.io/projected/5c1159c9-ec48-4f11-98d0-893333a427e2-kube-api-access-npvzc\") pod \"controller-manager-8d69864c9-v4m9r\" (UID: \"5c1159c9-ec48-4f11-98d0-893333a427e2\") " pod="openshift-controller-manager/controller-manager-8d69864c9-v4m9r" Mar 09 09:11:30 crc kubenswrapper[4792]: I0309 09:11:30.765351 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5c1159c9-ec48-4f11-98d0-893333a427e2-serving-cert\") pod \"controller-manager-8d69864c9-v4m9r\" (UID: \"5c1159c9-ec48-4f11-98d0-893333a427e2\") " pod="openshift-controller-manager/controller-manager-8d69864c9-v4m9r" Mar 09 09:11:30 crc kubenswrapper[4792]: I0309 09:11:30.765383 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5c1159c9-ec48-4f11-98d0-893333a427e2-client-ca\") pod \"controller-manager-8d69864c9-v4m9r\" (UID: \"5c1159c9-ec48-4f11-98d0-893333a427e2\") " pod="openshift-controller-manager/controller-manager-8d69864c9-v4m9r" Mar 09 09:11:30 crc kubenswrapper[4792]: I0309 09:11:30.765417 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5c1159c9-ec48-4f11-98d0-893333a427e2-config\") pod \"controller-manager-8d69864c9-v4m9r\" (UID: \"5c1159c9-ec48-4f11-98d0-893333a427e2\") " pod="openshift-controller-manager/controller-manager-8d69864c9-v4m9r" Mar 09 09:11:30 crc kubenswrapper[4792]: I0309 09:11:30.791466 4792 util.go:48] "No ready sandbox for 
pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550790-q6wbp" Mar 09 09:11:30 crc kubenswrapper[4792]: I0309 09:11:30.869753 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5c1159c9-ec48-4f11-98d0-893333a427e2-proxy-ca-bundles\") pod \"controller-manager-8d69864c9-v4m9r\" (UID: \"5c1159c9-ec48-4f11-98d0-893333a427e2\") " pod="openshift-controller-manager/controller-manager-8d69864c9-v4m9r" Mar 09 09:11:30 crc kubenswrapper[4792]: I0309 09:11:30.869816 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-npvzc\" (UniqueName: \"kubernetes.io/projected/5c1159c9-ec48-4f11-98d0-893333a427e2-kube-api-access-npvzc\") pod \"controller-manager-8d69864c9-v4m9r\" (UID: \"5c1159c9-ec48-4f11-98d0-893333a427e2\") " pod="openshift-controller-manager/controller-manager-8d69864c9-v4m9r" Mar 09 09:11:30 crc kubenswrapper[4792]: I0309 09:11:30.869919 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5c1159c9-ec48-4f11-98d0-893333a427e2-serving-cert\") pod \"controller-manager-8d69864c9-v4m9r\" (UID: \"5c1159c9-ec48-4f11-98d0-893333a427e2\") " pod="openshift-controller-manager/controller-manager-8d69864c9-v4m9r" Mar 09 09:11:30 crc kubenswrapper[4792]: I0309 09:11:30.869958 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5c1159c9-ec48-4f11-98d0-893333a427e2-client-ca\") pod \"controller-manager-8d69864c9-v4m9r\" (UID: \"5c1159c9-ec48-4f11-98d0-893333a427e2\") " pod="openshift-controller-manager/controller-manager-8d69864c9-v4m9r" Mar 09 09:11:30 crc kubenswrapper[4792]: I0309 09:11:30.869993 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/5c1159c9-ec48-4f11-98d0-893333a427e2-config\") pod \"controller-manager-8d69864c9-v4m9r\" (UID: \"5c1159c9-ec48-4f11-98d0-893333a427e2\") " pod="openshift-controller-manager/controller-manager-8d69864c9-v4m9r" Mar 09 09:11:30 crc kubenswrapper[4792]: I0309 09:11:30.871038 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5c1159c9-ec48-4f11-98d0-893333a427e2-proxy-ca-bundles\") pod \"controller-manager-8d69864c9-v4m9r\" (UID: \"5c1159c9-ec48-4f11-98d0-893333a427e2\") " pod="openshift-controller-manager/controller-manager-8d69864c9-v4m9r" Mar 09 09:11:30 crc kubenswrapper[4792]: I0309 09:11:30.871188 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5c1159c9-ec48-4f11-98d0-893333a427e2-config\") pod \"controller-manager-8d69864c9-v4m9r\" (UID: \"5c1159c9-ec48-4f11-98d0-893333a427e2\") " pod="openshift-controller-manager/controller-manager-8d69864c9-v4m9r" Mar 09 09:11:30 crc kubenswrapper[4792]: I0309 09:11:30.871755 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5c1159c9-ec48-4f11-98d0-893333a427e2-client-ca\") pod \"controller-manager-8d69864c9-v4m9r\" (UID: \"5c1159c9-ec48-4f11-98d0-893333a427e2\") " pod="openshift-controller-manager/controller-manager-8d69864c9-v4m9r" Mar 09 09:11:30 crc kubenswrapper[4792]: I0309 09:11:30.880154 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5c1159c9-ec48-4f11-98d0-893333a427e2-serving-cert\") pod \"controller-manager-8d69864c9-v4m9r\" (UID: \"5c1159c9-ec48-4f11-98d0-893333a427e2\") " pod="openshift-controller-manager/controller-manager-8d69864c9-v4m9r" Mar 09 09:11:30 crc kubenswrapper[4792]: I0309 09:11:30.891602 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-npvzc\" (UniqueName: \"kubernetes.io/projected/5c1159c9-ec48-4f11-98d0-893333a427e2-kube-api-access-npvzc\") pod \"controller-manager-8d69864c9-v4m9r\" (UID: \"5c1159c9-ec48-4f11-98d0-893333a427e2\") " pod="openshift-controller-manager/controller-manager-8d69864c9-v4m9r" Mar 09 09:11:30 crc kubenswrapper[4792]: I0309 09:11:30.971344 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mn88s\" (UniqueName: \"kubernetes.io/projected/d3934c8c-f197-4ef6-ac5c-76560a192e50-kube-api-access-mn88s\") pod \"d3934c8c-f197-4ef6-ac5c-76560a192e50\" (UID: \"d3934c8c-f197-4ef6-ac5c-76560a192e50\") " Mar 09 09:11:30 crc kubenswrapper[4792]: I0309 09:11:30.975043 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3934c8c-f197-4ef6-ac5c-76560a192e50-kube-api-access-mn88s" (OuterVolumeSpecName: "kube-api-access-mn88s") pod "d3934c8c-f197-4ef6-ac5c-76560a192e50" (UID: "d3934c8c-f197-4ef6-ac5c-76560a192e50"). InnerVolumeSpecName "kube-api-access-mn88s". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:11:31 crc kubenswrapper[4792]: I0309 09:11:31.007993 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-8d69864c9-v4m9r" Mar 09 09:11:31 crc kubenswrapper[4792]: I0309 09:11:31.073319 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mn88s\" (UniqueName: \"kubernetes.io/projected/d3934c8c-f197-4ef6-ac5c-76560a192e50-kube-api-access-mn88s\") on node \"crc\" DevicePath \"\"" Mar 09 09:11:31 crc kubenswrapper[4792]: I0309 09:11:31.207222 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 09:11:31 crc kubenswrapper[4792]: I0309 09:11:31.424479 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-8d69864c9-v4m9r"] Mar 09 09:11:31 crc kubenswrapper[4792]: W0309 09:11:31.433291 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5c1159c9_ec48_4f11_98d0_893333a427e2.slice/crio-88341f235d6cfe0546506b247611e31f278714b2fcfa0c0f35d070d7a484588a WatchSource:0}: Error finding container 88341f235d6cfe0546506b247611e31f278714b2fcfa0c0f35d070d7a484588a: Status 404 returned error can't find the container with id 88341f235d6cfe0546506b247611e31f278714b2fcfa0c0f35d070d7a484588a Mar 09 09:11:31 crc kubenswrapper[4792]: I0309 09:11:31.522767 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-8d69864c9-v4m9r" event={"ID":"5c1159c9-ec48-4f11-98d0-893333a427e2","Type":"ContainerStarted","Data":"88341f235d6cfe0546506b247611e31f278714b2fcfa0c0f35d070d7a484588a"} Mar 09 09:11:31 crc kubenswrapper[4792]: I0309 09:11:31.526184 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550790-q6wbp" Mar 09 09:11:31 crc kubenswrapper[4792]: I0309 09:11:31.526193 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550790-q6wbp" event={"ID":"d3934c8c-f197-4ef6-ac5c-76560a192e50","Type":"ContainerDied","Data":"3cc6753455e1631f985f3dfdaab258d74d9a79eb9299f20d3e62c1428b0b3989"} Mar 09 09:11:31 crc kubenswrapper[4792]: I0309 09:11:31.526259 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3cc6753455e1631f985f3dfdaab258d74d9a79eb9299f20d3e62c1428b0b3989" Mar 09 09:11:32 crc kubenswrapper[4792]: I0309 09:11:32.535875 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-8d69864c9-v4m9r" event={"ID":"5c1159c9-ec48-4f11-98d0-893333a427e2","Type":"ContainerStarted","Data":"d61930d6cace9c81129f65a2fd8d91e2097317ce237cb1d0b81a22602ad9273a"} Mar 09 09:11:32 crc kubenswrapper[4792]: I0309 09:11:32.537668 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-8d69864c9-v4m9r" Mar 09 09:11:32 crc kubenswrapper[4792]: I0309 09:11:32.548376 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-8d69864c9-v4m9r" Mar 09 09:11:32 crc kubenswrapper[4792]: I0309 09:11:32.558723 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-8d69864c9-v4m9r" podStartSLOduration=12.55870111 podStartE2EDuration="12.55870111s" podCreationTimestamp="2026-03-09 09:11:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:11:32.554626675 +0000 UTC m=+257.584827427" watchObservedRunningTime="2026-03-09 09:11:32.55870111 +0000 UTC m=+257.588901922" Mar 09 09:11:33 crc 
kubenswrapper[4792]: I0309 09:11:33.462803 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-8wm7x" Mar 09 09:11:33 crc kubenswrapper[4792]: I0309 09:11:33.463294 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-8wm7x" Mar 09 09:11:33 crc kubenswrapper[4792]: I0309 09:11:33.648274 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-8wm7x" Mar 09 09:11:33 crc kubenswrapper[4792]: I0309 09:11:33.734384 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-8wm7x" Mar 09 09:11:35 crc kubenswrapper[4792]: I0309 09:11:35.558912 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mj542" event={"ID":"a19c063e-191a-487c-a491-0af8c6fc1e3f","Type":"ContainerStarted","Data":"35117fdc604b3ef1af1034456a1f7bb3d85d4e5dc802e188d8b19425889da322"} Mar 09 09:11:35 crc kubenswrapper[4792]: I0309 09:11:35.570354 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c4nwc" event={"ID":"c0aa86a9-9ed9-492f-ac14-43e14abf1f2c","Type":"ContainerStarted","Data":"c251e1816aa2ae69481d1f29b55021e0a8ebd3eef2d59cafbf8af6292f4cfd00"} Mar 09 09:11:36 crc kubenswrapper[4792]: I0309 09:11:36.579301 4792 generic.go:334] "Generic (PLEG): container finished" podID="0667075c-38b7-4fb6-ad69-a31987eae3cc" containerID="0a2450ea3aee9ffbc6ba511e5ba06e3f63e426a3d5a7691de28f7994af64228c" exitCode=0 Mar 09 09:11:36 crc kubenswrapper[4792]: I0309 09:11:36.579364 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jsnbn" event={"ID":"0667075c-38b7-4fb6-ad69-a31987eae3cc","Type":"ContainerDied","Data":"0a2450ea3aee9ffbc6ba511e5ba06e3f63e426a3d5a7691de28f7994af64228c"} Mar 09 09:11:36 crc kubenswrapper[4792]: 
I0309 09:11:36.584235 4792 generic.go:334] "Generic (PLEG): container finished" podID="a19c063e-191a-487c-a491-0af8c6fc1e3f" containerID="35117fdc604b3ef1af1034456a1f7bb3d85d4e5dc802e188d8b19425889da322" exitCode=0 Mar 09 09:11:36 crc kubenswrapper[4792]: I0309 09:11:36.584291 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mj542" event={"ID":"a19c063e-191a-487c-a491-0af8c6fc1e3f","Type":"ContainerDied","Data":"35117fdc604b3ef1af1034456a1f7bb3d85d4e5dc802e188d8b19425889da322"} Mar 09 09:11:36 crc kubenswrapper[4792]: I0309 09:11:36.589412 4792 generic.go:334] "Generic (PLEG): container finished" podID="c0aa86a9-9ed9-492f-ac14-43e14abf1f2c" containerID="c251e1816aa2ae69481d1f29b55021e0a8ebd3eef2d59cafbf8af6292f4cfd00" exitCode=0 Mar 09 09:11:36 crc kubenswrapper[4792]: I0309 09:11:36.589449 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c4nwc" event={"ID":"c0aa86a9-9ed9-492f-ac14-43e14abf1f2c","Type":"ContainerDied","Data":"c251e1816aa2ae69481d1f29b55021e0a8ebd3eef2d59cafbf8af6292f4cfd00"} Mar 09 09:11:38 crc kubenswrapper[4792]: I0309 09:11:38.611883 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mj542" event={"ID":"a19c063e-191a-487c-a491-0af8c6fc1e3f","Type":"ContainerStarted","Data":"7523a1cacb1393a013e37df62e1c59310b854237ad47937a8ff32cd51c6af927"} Mar 09 09:11:39 crc kubenswrapper[4792]: I0309 09:11:39.624927 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c4nwc" event={"ID":"c0aa86a9-9ed9-492f-ac14-43e14abf1f2c","Type":"ContainerStarted","Data":"9b21ea77a7f25dff02c6e7dafdc5b8e63883444513fad376003818c6cfcbfd8a"} Mar 09 09:11:39 crc kubenswrapper[4792]: I0309 09:11:39.626975 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wtkcv" 
event={"ID":"c0500b46-411e-4371-83ae-1b148bf65ba9","Type":"ContainerStarted","Data":"2f4dd7cea731820ef28d8f419593689c6c007509f1874234069c84d810af9e79"} Mar 09 09:11:39 crc kubenswrapper[4792]: I0309 09:11:39.629812 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jsnbn" event={"ID":"0667075c-38b7-4fb6-ad69-a31987eae3cc","Type":"ContainerStarted","Data":"3a9bd1731a5918976ad5f77b3bae28fac98086487f7a1cbcc0de527b1f267e90"} Mar 09 09:11:39 crc kubenswrapper[4792]: I0309 09:11:39.671213 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-c4nwc" podStartSLOduration=5.008151017 podStartE2EDuration="56.671197866s" podCreationTimestamp="2026-03-09 09:10:43 +0000 UTC" firstStartedPulling="2026-03-09 09:10:46.74359246 +0000 UTC m=+211.773793212" lastFinishedPulling="2026-03-09 09:11:38.406639309 +0000 UTC m=+263.436840061" observedRunningTime="2026-03-09 09:11:39.652890867 +0000 UTC m=+264.683091609" watchObservedRunningTime="2026-03-09 09:11:39.671197866 +0000 UTC m=+264.701398618" Mar 09 09:11:39 crc kubenswrapper[4792]: I0309 09:11:39.674277 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-mj542" podStartSLOduration=4.864147533 podStartE2EDuration="57.674269066s" podCreationTimestamp="2026-03-09 09:10:42 +0000 UTC" firstStartedPulling="2026-03-09 09:10:45.587752148 +0000 UTC m=+210.617952900" lastFinishedPulling="2026-03-09 09:11:38.397873681 +0000 UTC m=+263.428074433" observedRunningTime="2026-03-09 09:11:39.669034623 +0000 UTC m=+264.699235385" watchObservedRunningTime="2026-03-09 09:11:39.674269066 +0000 UTC m=+264.704469818" Mar 09 09:11:39 crc kubenswrapper[4792]: I0309 09:11:39.690471 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-jsnbn" podStartSLOduration=4.797211735 podStartE2EDuration="57.690449582s" 
podCreationTimestamp="2026-03-09 09:10:42 +0000 UTC" firstStartedPulling="2026-03-09 09:10:45.587353347 +0000 UTC m=+210.617554099" lastFinishedPulling="2026-03-09 09:11:38.480591194 +0000 UTC m=+263.510791946" observedRunningTime="2026-03-09 09:11:39.690427172 +0000 UTC m=+264.720627924" watchObservedRunningTime="2026-03-09 09:11:39.690449582 +0000 UTC m=+264.720650334" Mar 09 09:11:40 crc kubenswrapper[4792]: I0309 09:11:40.637020 4792 generic.go:334] "Generic (PLEG): container finished" podID="c0500b46-411e-4371-83ae-1b148bf65ba9" containerID="2f4dd7cea731820ef28d8f419593689c6c007509f1874234069c84d810af9e79" exitCode=0 Mar 09 09:11:40 crc kubenswrapper[4792]: I0309 09:11:40.637147 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wtkcv" event={"ID":"c0500b46-411e-4371-83ae-1b148bf65ba9","Type":"ContainerDied","Data":"2f4dd7cea731820ef28d8f419593689c6c007509f1874234069c84d810af9e79"} Mar 09 09:11:40 crc kubenswrapper[4792]: I0309 09:11:40.640713 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t2jhd" event={"ID":"69535a14-c11d-442a-837d-f1d6744cb530","Type":"ContainerStarted","Data":"d2b10ee92a9779d15f6d383d469f9c54a2a5c72d8b8e4181b2214aa0a5a6ce91"} Mar 09 09:11:40 crc kubenswrapper[4792]: I0309 09:11:40.904871 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-8d69864c9-v4m9r"] Mar 09 09:11:40 crc kubenswrapper[4792]: I0309 09:11:40.905436 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-8d69864c9-v4m9r" podUID="5c1159c9-ec48-4f11-98d0-893333a427e2" containerName="controller-manager" containerID="cri-o://d61930d6cace9c81129f65a2fd8d91e2097317ce237cb1d0b81a22602ad9273a" gracePeriod=30 Mar 09 09:11:40 crc kubenswrapper[4792]: I0309 09:11:40.917160 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-route-controller-manager/route-controller-manager-6f9c9656d9-rkhjb"] Mar 09 09:11:40 crc kubenswrapper[4792]: I0309 09:11:40.917436 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6f9c9656d9-rkhjb" podUID="fa9edf1d-0ef2-4b57-8cf0-2cf8b3cd1eb9" containerName="route-controller-manager" containerID="cri-o://8ab6e582992e9ebd34d0f2f59b5bd3c222d0eb245f7ea94443772bdd69da5e69" gracePeriod=30 Mar 09 09:11:41 crc kubenswrapper[4792]: I0309 09:11:41.009699 4792 patch_prober.go:28] interesting pod/controller-manager-8d69864c9-v4m9r container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.63:8443/healthz\": dial tcp 10.217.0.63:8443: connect: connection refused" start-of-body= Mar 09 09:11:41 crc kubenswrapper[4792]: I0309 09:11:41.009786 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-8d69864c9-v4m9r" podUID="5c1159c9-ec48-4f11-98d0-893333a427e2" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.63:8443/healthz\": dial tcp 10.217.0.63:8443: connect: connection refused" Mar 09 09:11:41 crc kubenswrapper[4792]: I0309 09:11:41.648286 4792 generic.go:334] "Generic (PLEG): container finished" podID="69535a14-c11d-442a-837d-f1d6744cb530" containerID="d2b10ee92a9779d15f6d383d469f9c54a2a5c72d8b8e4181b2214aa0a5a6ce91" exitCode=0 Mar 09 09:11:41 crc kubenswrapper[4792]: I0309 09:11:41.648371 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t2jhd" event={"ID":"69535a14-c11d-442a-837d-f1d6744cb530","Type":"ContainerDied","Data":"d2b10ee92a9779d15f6d383d469f9c54a2a5c72d8b8e4181b2214aa0a5a6ce91"} Mar 09 09:11:41 crc kubenswrapper[4792]: I0309 09:11:41.649646 4792 generic.go:334] "Generic (PLEG): container finished" podID="fa9edf1d-0ef2-4b57-8cf0-2cf8b3cd1eb9" 
containerID="8ab6e582992e9ebd34d0f2f59b5bd3c222d0eb245f7ea94443772bdd69da5e69" exitCode=0 Mar 09 09:11:41 crc kubenswrapper[4792]: I0309 09:11:41.649704 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6f9c9656d9-rkhjb" event={"ID":"fa9edf1d-0ef2-4b57-8cf0-2cf8b3cd1eb9","Type":"ContainerDied","Data":"8ab6e582992e9ebd34d0f2f59b5bd3c222d0eb245f7ea94443772bdd69da5e69"} Mar 09 09:11:41 crc kubenswrapper[4792]: I0309 09:11:41.651854 4792 generic.go:334] "Generic (PLEG): container finished" podID="5c1159c9-ec48-4f11-98d0-893333a427e2" containerID="d61930d6cace9c81129f65a2fd8d91e2097317ce237cb1d0b81a22602ad9273a" exitCode=0 Mar 09 09:11:41 crc kubenswrapper[4792]: I0309 09:11:41.651889 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-8d69864c9-v4m9r" event={"ID":"5c1159c9-ec48-4f11-98d0-893333a427e2","Type":"ContainerDied","Data":"d61930d6cace9c81129f65a2fd8d91e2097317ce237cb1d0b81a22602ad9273a"} Mar 09 09:11:42 crc kubenswrapper[4792]: I0309 09:11:42.064092 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6f9c9656d9-rkhjb" Mar 09 09:11:42 crc kubenswrapper[4792]: I0309 09:11:42.097887 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7844b84bbd-6dsp6"] Mar 09 09:11:42 crc kubenswrapper[4792]: E0309 09:11:42.098198 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3934c8c-f197-4ef6-ac5c-76560a192e50" containerName="oc" Mar 09 09:11:42 crc kubenswrapper[4792]: I0309 09:11:42.098217 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3934c8c-f197-4ef6-ac5c-76560a192e50" containerName="oc" Mar 09 09:11:42 crc kubenswrapper[4792]: E0309 09:11:42.098239 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa9edf1d-0ef2-4b57-8cf0-2cf8b3cd1eb9" containerName="route-controller-manager" Mar 09 09:11:42 crc kubenswrapper[4792]: I0309 09:11:42.098246 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa9edf1d-0ef2-4b57-8cf0-2cf8b3cd1eb9" containerName="route-controller-manager" Mar 09 09:11:42 crc kubenswrapper[4792]: I0309 09:11:42.098351 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3934c8c-f197-4ef6-ac5c-76560a192e50" containerName="oc" Mar 09 09:11:42 crc kubenswrapper[4792]: I0309 09:11:42.098370 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa9edf1d-0ef2-4b57-8cf0-2cf8b3cd1eb9" containerName="route-controller-manager" Mar 09 09:11:42 crc kubenswrapper[4792]: I0309 09:11:42.098769 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7844b84bbd-6dsp6" Mar 09 09:11:42 crc kubenswrapper[4792]: I0309 09:11:42.123334 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7844b84bbd-6dsp6"] Mar 09 09:11:42 crc kubenswrapper[4792]: I0309 09:11:42.198349 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fa9edf1d-0ef2-4b57-8cf0-2cf8b3cd1eb9-client-ca\") pod \"fa9edf1d-0ef2-4b57-8cf0-2cf8b3cd1eb9\" (UID: \"fa9edf1d-0ef2-4b57-8cf0-2cf8b3cd1eb9\") " Mar 09 09:11:42 crc kubenswrapper[4792]: I0309 09:11:42.198416 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fa9edf1d-0ef2-4b57-8cf0-2cf8b3cd1eb9-serving-cert\") pod \"fa9edf1d-0ef2-4b57-8cf0-2cf8b3cd1eb9\" (UID: \"fa9edf1d-0ef2-4b57-8cf0-2cf8b3cd1eb9\") " Mar 09 09:11:42 crc kubenswrapper[4792]: I0309 09:11:42.198439 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l8mck\" (UniqueName: \"kubernetes.io/projected/fa9edf1d-0ef2-4b57-8cf0-2cf8b3cd1eb9-kube-api-access-l8mck\") pod \"fa9edf1d-0ef2-4b57-8cf0-2cf8b3cd1eb9\" (UID: \"fa9edf1d-0ef2-4b57-8cf0-2cf8b3cd1eb9\") " Mar 09 09:11:42 crc kubenswrapper[4792]: I0309 09:11:42.198540 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fa9edf1d-0ef2-4b57-8cf0-2cf8b3cd1eb9-config\") pod \"fa9edf1d-0ef2-4b57-8cf0-2cf8b3cd1eb9\" (UID: \"fa9edf1d-0ef2-4b57-8cf0-2cf8b3cd1eb9\") " Mar 09 09:11:42 crc kubenswrapper[4792]: I0309 09:11:42.198724 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pgrkt\" (UniqueName: \"kubernetes.io/projected/2366dd4a-7f9d-4c9e-b7fb-aef776932faf-kube-api-access-pgrkt\") 
pod \"route-controller-manager-7844b84bbd-6dsp6\" (UID: \"2366dd4a-7f9d-4c9e-b7fb-aef776932faf\") " pod="openshift-route-controller-manager/route-controller-manager-7844b84bbd-6dsp6" Mar 09 09:11:42 crc kubenswrapper[4792]: I0309 09:11:42.198775 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2366dd4a-7f9d-4c9e-b7fb-aef776932faf-client-ca\") pod \"route-controller-manager-7844b84bbd-6dsp6\" (UID: \"2366dd4a-7f9d-4c9e-b7fb-aef776932faf\") " pod="openshift-route-controller-manager/route-controller-manager-7844b84bbd-6dsp6" Mar 09 09:11:42 crc kubenswrapper[4792]: I0309 09:11:42.198810 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2366dd4a-7f9d-4c9e-b7fb-aef776932faf-config\") pod \"route-controller-manager-7844b84bbd-6dsp6\" (UID: \"2366dd4a-7f9d-4c9e-b7fb-aef776932faf\") " pod="openshift-route-controller-manager/route-controller-manager-7844b84bbd-6dsp6" Mar 09 09:11:42 crc kubenswrapper[4792]: I0309 09:11:42.198849 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2366dd4a-7f9d-4c9e-b7fb-aef776932faf-serving-cert\") pod \"route-controller-manager-7844b84bbd-6dsp6\" (UID: \"2366dd4a-7f9d-4c9e-b7fb-aef776932faf\") " pod="openshift-route-controller-manager/route-controller-manager-7844b84bbd-6dsp6" Mar 09 09:11:42 crc kubenswrapper[4792]: I0309 09:11:42.199668 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fa9edf1d-0ef2-4b57-8cf0-2cf8b3cd1eb9-client-ca" (OuterVolumeSpecName: "client-ca") pod "fa9edf1d-0ef2-4b57-8cf0-2cf8b3cd1eb9" (UID: "fa9edf1d-0ef2-4b57-8cf0-2cf8b3cd1eb9"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:11:42 crc kubenswrapper[4792]: I0309 09:11:42.199758 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fa9edf1d-0ef2-4b57-8cf0-2cf8b3cd1eb9-config" (OuterVolumeSpecName: "config") pod "fa9edf1d-0ef2-4b57-8cf0-2cf8b3cd1eb9" (UID: "fa9edf1d-0ef2-4b57-8cf0-2cf8b3cd1eb9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:11:42 crc kubenswrapper[4792]: I0309 09:11:42.205163 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa9edf1d-0ef2-4b57-8cf0-2cf8b3cd1eb9-kube-api-access-l8mck" (OuterVolumeSpecName: "kube-api-access-l8mck") pod "fa9edf1d-0ef2-4b57-8cf0-2cf8b3cd1eb9" (UID: "fa9edf1d-0ef2-4b57-8cf0-2cf8b3cd1eb9"). InnerVolumeSpecName "kube-api-access-l8mck". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:11:42 crc kubenswrapper[4792]: I0309 09:11:42.205453 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa9edf1d-0ef2-4b57-8cf0-2cf8b3cd1eb9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "fa9edf1d-0ef2-4b57-8cf0-2cf8b3cd1eb9" (UID: "fa9edf1d-0ef2-4b57-8cf0-2cf8b3cd1eb9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:11:42 crc kubenswrapper[4792]: I0309 09:11:42.258316 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-8d69864c9-v4m9r" Mar 09 09:11:42 crc kubenswrapper[4792]: I0309 09:11:42.299368 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-npvzc\" (UniqueName: \"kubernetes.io/projected/5c1159c9-ec48-4f11-98d0-893333a427e2-kube-api-access-npvzc\") pod \"5c1159c9-ec48-4f11-98d0-893333a427e2\" (UID: \"5c1159c9-ec48-4f11-98d0-893333a427e2\") " Mar 09 09:11:42 crc kubenswrapper[4792]: I0309 09:11:42.299408 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5c1159c9-ec48-4f11-98d0-893333a427e2-proxy-ca-bundles\") pod \"5c1159c9-ec48-4f11-98d0-893333a427e2\" (UID: \"5c1159c9-ec48-4f11-98d0-893333a427e2\") " Mar 09 09:11:42 crc kubenswrapper[4792]: I0309 09:11:42.299521 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5c1159c9-ec48-4f11-98d0-893333a427e2-serving-cert\") pod \"5c1159c9-ec48-4f11-98d0-893333a427e2\" (UID: \"5c1159c9-ec48-4f11-98d0-893333a427e2\") " Mar 09 09:11:42 crc kubenswrapper[4792]: I0309 09:11:42.299572 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5c1159c9-ec48-4f11-98d0-893333a427e2-client-ca\") pod \"5c1159c9-ec48-4f11-98d0-893333a427e2\" (UID: \"5c1159c9-ec48-4f11-98d0-893333a427e2\") " Mar 09 09:11:42 crc kubenswrapper[4792]: I0309 09:11:42.299597 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5c1159c9-ec48-4f11-98d0-893333a427e2-config\") pod \"5c1159c9-ec48-4f11-98d0-893333a427e2\" (UID: \"5c1159c9-ec48-4f11-98d0-893333a427e2\") " Mar 09 09:11:42 crc kubenswrapper[4792]: I0309 09:11:42.299847 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/2366dd4a-7f9d-4c9e-b7fb-aef776932faf-serving-cert\") pod \"route-controller-manager-7844b84bbd-6dsp6\" (UID: \"2366dd4a-7f9d-4c9e-b7fb-aef776932faf\") " pod="openshift-route-controller-manager/route-controller-manager-7844b84bbd-6dsp6" Mar 09 09:11:42 crc kubenswrapper[4792]: I0309 09:11:42.299887 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pgrkt\" (UniqueName: \"kubernetes.io/projected/2366dd4a-7f9d-4c9e-b7fb-aef776932faf-kube-api-access-pgrkt\") pod \"route-controller-manager-7844b84bbd-6dsp6\" (UID: \"2366dd4a-7f9d-4c9e-b7fb-aef776932faf\") " pod="openshift-route-controller-manager/route-controller-manager-7844b84bbd-6dsp6" Mar 09 09:11:42 crc kubenswrapper[4792]: I0309 09:11:42.299923 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2366dd4a-7f9d-4c9e-b7fb-aef776932faf-client-ca\") pod \"route-controller-manager-7844b84bbd-6dsp6\" (UID: \"2366dd4a-7f9d-4c9e-b7fb-aef776932faf\") " pod="openshift-route-controller-manager/route-controller-manager-7844b84bbd-6dsp6" Mar 09 09:11:42 crc kubenswrapper[4792]: I0309 09:11:42.299986 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2366dd4a-7f9d-4c9e-b7fb-aef776932faf-config\") pod \"route-controller-manager-7844b84bbd-6dsp6\" (UID: \"2366dd4a-7f9d-4c9e-b7fb-aef776932faf\") " pod="openshift-route-controller-manager/route-controller-manager-7844b84bbd-6dsp6" Mar 09 09:11:42 crc kubenswrapper[4792]: I0309 09:11:42.300040 4792 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fa9edf1d-0ef2-4b57-8cf0-2cf8b3cd1eb9-config\") on node \"crc\" DevicePath \"\"" Mar 09 09:11:42 crc kubenswrapper[4792]: I0309 09:11:42.300113 4792 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/fa9edf1d-0ef2-4b57-8cf0-2cf8b3cd1eb9-client-ca\") on node \"crc\" DevicePath \"\"" Mar 09 09:11:42 crc kubenswrapper[4792]: I0309 09:11:42.300128 4792 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fa9edf1d-0ef2-4b57-8cf0-2cf8b3cd1eb9-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 09:11:42 crc kubenswrapper[4792]: I0309 09:11:42.300141 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l8mck\" (UniqueName: \"kubernetes.io/projected/fa9edf1d-0ef2-4b57-8cf0-2cf8b3cd1eb9-kube-api-access-l8mck\") on node \"crc\" DevicePath \"\"" Mar 09 09:11:42 crc kubenswrapper[4792]: I0309 09:11:42.300822 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5c1159c9-ec48-4f11-98d0-893333a427e2-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "5c1159c9-ec48-4f11-98d0-893333a427e2" (UID: "5c1159c9-ec48-4f11-98d0-893333a427e2"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:11:42 crc kubenswrapper[4792]: I0309 09:11:42.301330 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2366dd4a-7f9d-4c9e-b7fb-aef776932faf-config\") pod \"route-controller-manager-7844b84bbd-6dsp6\" (UID: \"2366dd4a-7f9d-4c9e-b7fb-aef776932faf\") " pod="openshift-route-controller-manager/route-controller-manager-7844b84bbd-6dsp6" Mar 09 09:11:42 crc kubenswrapper[4792]: I0309 09:11:42.301456 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5c1159c9-ec48-4f11-98d0-893333a427e2-config" (OuterVolumeSpecName: "config") pod "5c1159c9-ec48-4f11-98d0-893333a427e2" (UID: "5c1159c9-ec48-4f11-98d0-893333a427e2"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:11:42 crc kubenswrapper[4792]: I0309 09:11:42.302099 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5c1159c9-ec48-4f11-98d0-893333a427e2-client-ca" (OuterVolumeSpecName: "client-ca") pod "5c1159c9-ec48-4f11-98d0-893333a427e2" (UID: "5c1159c9-ec48-4f11-98d0-893333a427e2"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:11:42 crc kubenswrapper[4792]: I0309 09:11:42.302394 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2366dd4a-7f9d-4c9e-b7fb-aef776932faf-client-ca\") pod \"route-controller-manager-7844b84bbd-6dsp6\" (UID: \"2366dd4a-7f9d-4c9e-b7fb-aef776932faf\") " pod="openshift-route-controller-manager/route-controller-manager-7844b84bbd-6dsp6" Mar 09 09:11:42 crc kubenswrapper[4792]: I0309 09:11:42.305754 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c1159c9-ec48-4f11-98d0-893333a427e2-kube-api-access-npvzc" (OuterVolumeSpecName: "kube-api-access-npvzc") pod "5c1159c9-ec48-4f11-98d0-893333a427e2" (UID: "5c1159c9-ec48-4f11-98d0-893333a427e2"). InnerVolumeSpecName "kube-api-access-npvzc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:11:42 crc kubenswrapper[4792]: I0309 09:11:42.306191 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2366dd4a-7f9d-4c9e-b7fb-aef776932faf-serving-cert\") pod \"route-controller-manager-7844b84bbd-6dsp6\" (UID: \"2366dd4a-7f9d-4c9e-b7fb-aef776932faf\") " pod="openshift-route-controller-manager/route-controller-manager-7844b84bbd-6dsp6" Mar 09 09:11:42 crc kubenswrapper[4792]: I0309 09:11:42.306378 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c1159c9-ec48-4f11-98d0-893333a427e2-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5c1159c9-ec48-4f11-98d0-893333a427e2" (UID: "5c1159c9-ec48-4f11-98d0-893333a427e2"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:11:42 crc kubenswrapper[4792]: I0309 09:11:42.329861 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pgrkt\" (UniqueName: \"kubernetes.io/projected/2366dd4a-7f9d-4c9e-b7fb-aef776932faf-kube-api-access-pgrkt\") pod \"route-controller-manager-7844b84bbd-6dsp6\" (UID: \"2366dd4a-7f9d-4c9e-b7fb-aef776932faf\") " pod="openshift-route-controller-manager/route-controller-manager-7844b84bbd-6dsp6" Mar 09 09:11:42 crc kubenswrapper[4792]: I0309 09:11:42.401747 4792 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5c1159c9-ec48-4f11-98d0-893333a427e2-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 09:11:42 crc kubenswrapper[4792]: I0309 09:11:42.402121 4792 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5c1159c9-ec48-4f11-98d0-893333a427e2-client-ca\") on node \"crc\" DevicePath \"\"" Mar 09 09:11:42 crc kubenswrapper[4792]: I0309 09:11:42.402135 4792 reconciler_common.go:293] "Volume detached for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/5c1159c9-ec48-4f11-98d0-893333a427e2-config\") on node \"crc\" DevicePath \"\"" Mar 09 09:11:42 crc kubenswrapper[4792]: I0309 09:11:42.402147 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-npvzc\" (UniqueName: \"kubernetes.io/projected/5c1159c9-ec48-4f11-98d0-893333a427e2-kube-api-access-npvzc\") on node \"crc\" DevicePath \"\"" Mar 09 09:11:42 crc kubenswrapper[4792]: I0309 09:11:42.402156 4792 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5c1159c9-ec48-4f11-98d0-893333a427e2-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 09 09:11:42 crc kubenswrapper[4792]: I0309 09:11:42.441507 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7844b84bbd-6dsp6" Mar 09 09:11:42 crc kubenswrapper[4792]: I0309 09:11:42.675992 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t2jhd" event={"ID":"69535a14-c11d-442a-837d-f1d6744cb530","Type":"ContainerStarted","Data":"c9e6280fb058b35d9209ff0bf6cb923f78f0837a9f2659084987aa1c5476dac8"} Mar 09 09:11:42 crc kubenswrapper[4792]: I0309 09:11:42.686100 4792 generic.go:334] "Generic (PLEG): container finished" podID="047810b2-277c-4d4c-822a-98b6d2a91fcc" containerID="815164b353e5081b24d2413cf5c92d3a572812d2d8925e1e5317af231c0d02f7" exitCode=0 Mar 09 09:11:42 crc kubenswrapper[4792]: I0309 09:11:42.686165 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8hkmv" event={"ID":"047810b2-277c-4d4c-822a-98b6d2a91fcc","Type":"ContainerDied","Data":"815164b353e5081b24d2413cf5c92d3a572812d2d8925e1e5317af231c0d02f7"} Mar 09 09:11:42 crc kubenswrapper[4792]: I0309 09:11:42.701062 4792 generic.go:334] "Generic (PLEG): container finished" podID="50c5e5dd-62cf-470c-a626-27cca12c69fb" 
containerID="46943d8495d95b424e01e9c9770ab664bb81bcf005af9baf9109c88314ca54c7" exitCode=0 Mar 09 09:11:42 crc kubenswrapper[4792]: I0309 09:11:42.701221 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2d7nf" event={"ID":"50c5e5dd-62cf-470c-a626-27cca12c69fb","Type":"ContainerDied","Data":"46943d8495d95b424e01e9c9770ab664bb81bcf005af9baf9109c88314ca54c7"} Mar 09 09:11:42 crc kubenswrapper[4792]: I0309 09:11:42.704199 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-t2jhd" podStartSLOduration=5.918363285 podStartE2EDuration="1m3.704185711s" podCreationTimestamp="2026-03-09 09:10:39 +0000 UTC" firstStartedPulling="2026-03-09 09:10:44.383293736 +0000 UTC m=+209.413494488" lastFinishedPulling="2026-03-09 09:11:42.169116162 +0000 UTC m=+267.199316914" observedRunningTime="2026-03-09 09:11:42.702398259 +0000 UTC m=+267.732599011" watchObservedRunningTime="2026-03-09 09:11:42.704185711 +0000 UTC m=+267.734386463" Mar 09 09:11:42 crc kubenswrapper[4792]: I0309 09:11:42.714992 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wtkcv" event={"ID":"c0500b46-411e-4371-83ae-1b148bf65ba9","Type":"ContainerStarted","Data":"287c47b44be20cd8d4219506710d16d385938a92bfdb51b9f77c5d44c8d6190f"} Mar 09 09:11:42 crc kubenswrapper[4792]: I0309 09:11:42.717511 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-mj542" Mar 09 09:11:42 crc kubenswrapper[4792]: I0309 09:11:42.719122 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-mj542" Mar 09 09:11:42 crc kubenswrapper[4792]: I0309 09:11:42.721764 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6f9c9656d9-rkhjb" 
event={"ID":"fa9edf1d-0ef2-4b57-8cf0-2cf8b3cd1eb9","Type":"ContainerDied","Data":"409b9a0f236ae1aa00ef49f6652a06c7ee1f22450d3bd9238856d29766672194"} Mar 09 09:11:42 crc kubenswrapper[4792]: I0309 09:11:42.721821 4792 scope.go:117] "RemoveContainer" containerID="8ab6e582992e9ebd34d0f2f59b5bd3c222d0eb245f7ea94443772bdd69da5e69" Mar 09 09:11:42 crc kubenswrapper[4792]: I0309 09:11:42.721971 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6f9c9656d9-rkhjb" Mar 09 09:11:42 crc kubenswrapper[4792]: I0309 09:11:42.728060 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-8d69864c9-v4m9r" event={"ID":"5c1159c9-ec48-4f11-98d0-893333a427e2","Type":"ContainerDied","Data":"88341f235d6cfe0546506b247611e31f278714b2fcfa0c0f35d070d7a484588a"} Mar 09 09:11:42 crc kubenswrapper[4792]: I0309 09:11:42.728488 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-8d69864c9-v4m9r" Mar 09 09:11:42 crc kubenswrapper[4792]: I0309 09:11:42.775357 4792 scope.go:117] "RemoveContainer" containerID="d61930d6cace9c81129f65a2fd8d91e2097317ce237cb1d0b81a22602ad9273a" Mar 09 09:11:42 crc kubenswrapper[4792]: I0309 09:11:42.796715 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-wtkcv" podStartSLOduration=6.728362015 podStartE2EDuration="1m2.796688002s" podCreationTimestamp="2026-03-09 09:10:40 +0000 UTC" firstStartedPulling="2026-03-09 09:10:45.53220431 +0000 UTC m=+210.562405062" lastFinishedPulling="2026-03-09 09:11:41.600530297 +0000 UTC m=+266.630731049" observedRunningTime="2026-03-09 09:11:42.77179167 +0000 UTC m=+267.801992422" watchObservedRunningTime="2026-03-09 09:11:42.796688002 +0000 UTC m=+267.826888754" Mar 09 09:11:42 crc kubenswrapper[4792]: I0309 09:11:42.798457 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-8d69864c9-v4m9r"] Mar 09 09:11:42 crc kubenswrapper[4792]: I0309 09:11:42.808795 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-mj542" Mar 09 09:11:42 crc kubenswrapper[4792]: I0309 09:11:42.808936 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-8d69864c9-v4m9r"] Mar 09 09:11:42 crc kubenswrapper[4792]: I0309 09:11:42.816570 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6f9c9656d9-rkhjb"] Mar 09 09:11:42 crc kubenswrapper[4792]: I0309 09:11:42.820292 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6f9c9656d9-rkhjb"] Mar 09 09:11:42 crc kubenswrapper[4792]: I0309 09:11:42.878470 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-route-controller-manager/route-controller-manager-7844b84bbd-6dsp6"] Mar 09 09:11:43 crc kubenswrapper[4792]: I0309 09:11:43.213844 4792 patch_prober.go:28] interesting pod/machine-config-daemon-97tth container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 09:11:43 crc kubenswrapper[4792]: I0309 09:11:43.213896 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 09:11:43 crc kubenswrapper[4792]: I0309 09:11:43.213939 4792 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-97tth" Mar 09 09:11:43 crc kubenswrapper[4792]: I0309 09:11:43.214615 4792 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d060627a577507a2b0030b6aea753d50e0c6766ac4876d95ac5d9d3401f9b818"} pod="openshift-machine-config-operator/machine-config-daemon-97tth" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 09 09:11:43 crc kubenswrapper[4792]: I0309 09:11:43.214677 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" containerName="machine-config-daemon" containerID="cri-o://d060627a577507a2b0030b6aea753d50e0c6766ac4876d95ac5d9d3401f9b818" gracePeriod=600 Mar 09 09:11:43 crc kubenswrapper[4792]: I0309 09:11:43.372500 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/redhat-operators-jsnbn" Mar 09 09:11:43 crc kubenswrapper[4792]: I0309 09:11:43.372557 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-jsnbn" Mar 09 09:11:43 crc kubenswrapper[4792]: I0309 09:11:43.639662 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-c4nwc" Mar 09 09:11:43 crc kubenswrapper[4792]: I0309 09:11:43.639712 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-c4nwc" Mar 09 09:11:43 crc kubenswrapper[4792]: I0309 09:11:43.675724 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5c1159c9-ec48-4f11-98d0-893333a427e2" path="/var/lib/kubelet/pods/5c1159c9-ec48-4f11-98d0-893333a427e2/volumes" Mar 09 09:11:43 crc kubenswrapper[4792]: I0309 09:11:43.676608 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fa9edf1d-0ef2-4b57-8cf0-2cf8b3cd1eb9" path="/var/lib/kubelet/pods/fa9edf1d-0ef2-4b57-8cf0-2cf8b3cd1eb9/volumes" Mar 09 09:11:43 crc kubenswrapper[4792]: I0309 09:11:43.736560 4792 generic.go:334] "Generic (PLEG): container finished" podID="bd11045a-d746-4b42-872c-8b8d1dd2d515" containerID="d060627a577507a2b0030b6aea753d50e0c6766ac4876d95ac5d9d3401f9b818" exitCode=0 Mar 09 09:11:43 crc kubenswrapper[4792]: I0309 09:11:43.736636 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-97tth" event={"ID":"bd11045a-d746-4b42-872c-8b8d1dd2d515","Type":"ContainerDied","Data":"d060627a577507a2b0030b6aea753d50e0c6766ac4876d95ac5d9d3401f9b818"} Mar 09 09:11:43 crc kubenswrapper[4792]: I0309 09:11:43.737882 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7844b84bbd-6dsp6" 
event={"ID":"2366dd4a-7f9d-4c9e-b7fb-aef776932faf","Type":"ContainerStarted","Data":"628508feccc32c2dea25cb0c1c57acda6b760bc9f7463b04e3c1bd7fa3f2cf45"} Mar 09 09:11:43 crc kubenswrapper[4792]: I0309 09:11:43.737907 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7844b84bbd-6dsp6" event={"ID":"2366dd4a-7f9d-4c9e-b7fb-aef776932faf","Type":"ContainerStarted","Data":"07be461632f10ba883775ed5aa7c3cf92fbd0dabf57132ac29985aa8064e62a6"} Mar 09 09:11:43 crc kubenswrapper[4792]: I0309 09:11:43.738252 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-7844b84bbd-6dsp6" Mar 09 09:11:43 crc kubenswrapper[4792]: I0309 09:11:43.743743 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-7844b84bbd-6dsp6" Mar 09 09:11:43 crc kubenswrapper[4792]: I0309 09:11:43.766102 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-7844b84bbd-6dsp6" podStartSLOduration=3.766085608 podStartE2EDuration="3.766085608s" podCreationTimestamp="2026-03-09 09:11:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:11:43.764418079 +0000 UTC m=+268.794618831" watchObservedRunningTime="2026-03-09 09:11:43.766085608 +0000 UTC m=+268.796286360" Mar 09 09:11:43 crc kubenswrapper[4792]: I0309 09:11:43.834646 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-mj542" Mar 09 09:11:44 crc kubenswrapper[4792]: I0309 09:11:44.282486 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-7766fd6d68-cgzlz"] Mar 09 09:11:44 crc kubenswrapper[4792]: E0309 09:11:44.283124 4792 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c1159c9-ec48-4f11-98d0-893333a427e2" containerName="controller-manager" Mar 09 09:11:44 crc kubenswrapper[4792]: I0309 09:11:44.283138 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c1159c9-ec48-4f11-98d0-893333a427e2" containerName="controller-manager" Mar 09 09:11:44 crc kubenswrapper[4792]: I0309 09:11:44.283260 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c1159c9-ec48-4f11-98d0-893333a427e2" containerName="controller-manager" Mar 09 09:11:44 crc kubenswrapper[4792]: I0309 09:11:44.283774 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7766fd6d68-cgzlz" Mar 09 09:11:44 crc kubenswrapper[4792]: I0309 09:11:44.289654 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 09 09:11:44 crc kubenswrapper[4792]: I0309 09:11:44.289739 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 09 09:11:44 crc kubenswrapper[4792]: I0309 09:11:44.290142 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 09 09:11:44 crc kubenswrapper[4792]: I0309 09:11:44.290461 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 09 09:11:44 crc kubenswrapper[4792]: I0309 09:11:44.290815 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 09 09:11:44 crc kubenswrapper[4792]: I0309 09:11:44.297369 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 09 09:11:44 crc kubenswrapper[4792]: I0309 09:11:44.298384 4792 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-controller-manager"/"openshift-global-ca"
Mar 09 09:11:44 crc kubenswrapper[4792]: I0309 09:11:44.311416 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7766fd6d68-cgzlz"]
Mar 09 09:11:44 crc kubenswrapper[4792]: I0309 09:11:44.352431 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/00ec8adb-4eec-4ea0-b843-628124d00d5c-client-ca\") pod \"controller-manager-7766fd6d68-cgzlz\" (UID: \"00ec8adb-4eec-4ea0-b843-628124d00d5c\") " pod="openshift-controller-manager/controller-manager-7766fd6d68-cgzlz"
Mar 09 09:11:44 crc kubenswrapper[4792]: I0309 09:11:44.352506 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/00ec8adb-4eec-4ea0-b843-628124d00d5c-proxy-ca-bundles\") pod \"controller-manager-7766fd6d68-cgzlz\" (UID: \"00ec8adb-4eec-4ea0-b843-628124d00d5c\") " pod="openshift-controller-manager/controller-manager-7766fd6d68-cgzlz"
Mar 09 09:11:44 crc kubenswrapper[4792]: I0309 09:11:44.352536 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/00ec8adb-4eec-4ea0-b843-628124d00d5c-serving-cert\") pod \"controller-manager-7766fd6d68-cgzlz\" (UID: \"00ec8adb-4eec-4ea0-b843-628124d00d5c\") " pod="openshift-controller-manager/controller-manager-7766fd6d68-cgzlz"
Mar 09 09:11:44 crc kubenswrapper[4792]: I0309 09:11:44.352570 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00ec8adb-4eec-4ea0-b843-628124d00d5c-config\") pod \"controller-manager-7766fd6d68-cgzlz\" (UID: \"00ec8adb-4eec-4ea0-b843-628124d00d5c\") " pod="openshift-controller-manager/controller-manager-7766fd6d68-cgzlz"
Mar 09 09:11:44 crc kubenswrapper[4792]: I0309 09:11:44.352618 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lkc5z\" (UniqueName: \"kubernetes.io/projected/00ec8adb-4eec-4ea0-b843-628124d00d5c-kube-api-access-lkc5z\") pod \"controller-manager-7766fd6d68-cgzlz\" (UID: \"00ec8adb-4eec-4ea0-b843-628124d00d5c\") " pod="openshift-controller-manager/controller-manager-7766fd6d68-cgzlz"
Mar 09 09:11:44 crc kubenswrapper[4792]: I0309 09:11:44.410217 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-jsnbn" podUID="0667075c-38b7-4fb6-ad69-a31987eae3cc" containerName="registry-server" probeResult="failure" output=<
Mar 09 09:11:44 crc kubenswrapper[4792]: timeout: failed to connect service ":50051" within 1s
Mar 09 09:11:44 crc kubenswrapper[4792]: >
Mar 09 09:11:44 crc kubenswrapper[4792]: I0309 09:11:44.453899 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00ec8adb-4eec-4ea0-b843-628124d00d5c-config\") pod \"controller-manager-7766fd6d68-cgzlz\" (UID: \"00ec8adb-4eec-4ea0-b843-628124d00d5c\") " pod="openshift-controller-manager/controller-manager-7766fd6d68-cgzlz"
Mar 09 09:11:44 crc kubenswrapper[4792]: I0309 09:11:44.453992 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lkc5z\" (UniqueName: \"kubernetes.io/projected/00ec8adb-4eec-4ea0-b843-628124d00d5c-kube-api-access-lkc5z\") pod \"controller-manager-7766fd6d68-cgzlz\" (UID: \"00ec8adb-4eec-4ea0-b843-628124d00d5c\") " pod="openshift-controller-manager/controller-manager-7766fd6d68-cgzlz"
Mar 09 09:11:44 crc kubenswrapper[4792]: I0309 09:11:44.454033 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/00ec8adb-4eec-4ea0-b843-628124d00d5c-client-ca\") pod \"controller-manager-7766fd6d68-cgzlz\" (UID: \"00ec8adb-4eec-4ea0-b843-628124d00d5c\") " pod="openshift-controller-manager/controller-manager-7766fd6d68-cgzlz"
Mar 09 09:11:44 crc kubenswrapper[4792]: I0309 09:11:44.454061 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/00ec8adb-4eec-4ea0-b843-628124d00d5c-proxy-ca-bundles\") pod \"controller-manager-7766fd6d68-cgzlz\" (UID: \"00ec8adb-4eec-4ea0-b843-628124d00d5c\") " pod="openshift-controller-manager/controller-manager-7766fd6d68-cgzlz"
Mar 09 09:11:44 crc kubenswrapper[4792]: I0309 09:11:44.454106 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/00ec8adb-4eec-4ea0-b843-628124d00d5c-serving-cert\") pod \"controller-manager-7766fd6d68-cgzlz\" (UID: \"00ec8adb-4eec-4ea0-b843-628124d00d5c\") " pod="openshift-controller-manager/controller-manager-7766fd6d68-cgzlz"
Mar 09 09:11:44 crc kubenswrapper[4792]: I0309 09:11:44.455114 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/00ec8adb-4eec-4ea0-b843-628124d00d5c-client-ca\") pod \"controller-manager-7766fd6d68-cgzlz\" (UID: \"00ec8adb-4eec-4ea0-b843-628124d00d5c\") " pod="openshift-controller-manager/controller-manager-7766fd6d68-cgzlz"
Mar 09 09:11:44 crc kubenswrapper[4792]: I0309 09:11:44.455293 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00ec8adb-4eec-4ea0-b843-628124d00d5c-config\") pod \"controller-manager-7766fd6d68-cgzlz\" (UID: \"00ec8adb-4eec-4ea0-b843-628124d00d5c\") " pod="openshift-controller-manager/controller-manager-7766fd6d68-cgzlz"
Mar 09 09:11:44 crc kubenswrapper[4792]: I0309 09:11:44.456437 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/00ec8adb-4eec-4ea0-b843-628124d00d5c-proxy-ca-bundles\") pod \"controller-manager-7766fd6d68-cgzlz\" (UID: \"00ec8adb-4eec-4ea0-b843-628124d00d5c\") " pod="openshift-controller-manager/controller-manager-7766fd6d68-cgzlz"
Mar 09 09:11:44 crc kubenswrapper[4792]: I0309 09:11:44.466451 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/00ec8adb-4eec-4ea0-b843-628124d00d5c-serving-cert\") pod \"controller-manager-7766fd6d68-cgzlz\" (UID: \"00ec8adb-4eec-4ea0-b843-628124d00d5c\") " pod="openshift-controller-manager/controller-manager-7766fd6d68-cgzlz"
Mar 09 09:11:44 crc kubenswrapper[4792]: I0309 09:11:44.479734 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lkc5z\" (UniqueName: \"kubernetes.io/projected/00ec8adb-4eec-4ea0-b843-628124d00d5c-kube-api-access-lkc5z\") pod \"controller-manager-7766fd6d68-cgzlz\" (UID: \"00ec8adb-4eec-4ea0-b843-628124d00d5c\") " pod="openshift-controller-manager/controller-manager-7766fd6d68-cgzlz"
Mar 09 09:11:44 crc kubenswrapper[4792]: I0309 09:11:44.598794 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7766fd6d68-cgzlz"
Mar 09 09:11:44 crc kubenswrapper[4792]: I0309 09:11:44.688664 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-c4nwc" podUID="c0aa86a9-9ed9-492f-ac14-43e14abf1f2c" containerName="registry-server" probeResult="failure" output=<
Mar 09 09:11:44 crc kubenswrapper[4792]: timeout: failed to connect service ":50051" within 1s
Mar 09 09:11:44 crc kubenswrapper[4792]: >
Mar 09 09:11:44 crc kubenswrapper[4792]: I0309 09:11:44.749412 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-97tth" event={"ID":"bd11045a-d746-4b42-872c-8b8d1dd2d515","Type":"ContainerStarted","Data":"72d2b48cb291f76d23e0eb58c24e11749ce55457c1a4bd60f9c4519440ee7870"}
Mar 09 09:11:45 crc kubenswrapper[4792]: I0309 09:11:45.436186 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7766fd6d68-cgzlz"]
Mar 09 09:11:45 crc kubenswrapper[4792]: W0309 09:11:45.450604 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod00ec8adb_4eec_4ea0_b843_628124d00d5c.slice/crio-a7289a697718f73f244bc2bd539c7b13cdf606e4735d07f7c9fc804036dd9605 WatchSource:0}: Error finding container a7289a697718f73f244bc2bd539c7b13cdf606e4735d07f7c9fc804036dd9605: Status 404 returned error can't find the container with id a7289a697718f73f244bc2bd539c7b13cdf606e4735d07f7c9fc804036dd9605
Mar 09 09:11:45 crc kubenswrapper[4792]: I0309 09:11:45.761800 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7766fd6d68-cgzlz" event={"ID":"00ec8adb-4eec-4ea0-b843-628124d00d5c","Type":"ContainerStarted","Data":"3ae008b4ec8ccc8a76475bb45bb2b50a3cd71ac70472cdb0a4e1a104bb4d6ec9"}
Mar 09 09:11:45 crc kubenswrapper[4792]: I0309 09:11:45.762153 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-7766fd6d68-cgzlz"
Mar 09 09:11:45 crc kubenswrapper[4792]: I0309 09:11:45.762164 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7766fd6d68-cgzlz" event={"ID":"00ec8adb-4eec-4ea0-b843-628124d00d5c","Type":"ContainerStarted","Data":"a7289a697718f73f244bc2bd539c7b13cdf606e4735d07f7c9fc804036dd9605"}
Mar 09 09:11:45 crc kubenswrapper[4792]: I0309 09:11:45.763964 4792 patch_prober.go:28] interesting pod/controller-manager-7766fd6d68-cgzlz container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.65:8443/healthz\": dial tcp 10.217.0.65:8443: connect: connection refused" start-of-body=
Mar 09 09:11:45 crc kubenswrapper[4792]: I0309 09:11:45.764014 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-7766fd6d68-cgzlz" podUID="00ec8adb-4eec-4ea0-b843-628124d00d5c" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.65:8443/healthz\": dial tcp 10.217.0.65:8443: connect: connection refused"
Mar 09 09:11:45 crc kubenswrapper[4792]: I0309 09:11:45.765234 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2d7nf" event={"ID":"50c5e5dd-62cf-470c-a626-27cca12c69fb","Type":"ContainerStarted","Data":"e9a9097e4025841cb70ce7ed01eeeb94af5a9d20a2ce07efc77d0825d8efcd3c"}
Mar 09 09:11:45 crc kubenswrapper[4792]: I0309 09:11:45.803975 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-7766fd6d68-cgzlz" podStartSLOduration=5.803953501 podStartE2EDuration="5.803953501s" podCreationTimestamp="2026-03-09 09:11:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:11:45.78348664 +0000 UTC m=+270.813687392" watchObservedRunningTime="2026-03-09 09:11:45.803953501 +0000 UTC m=+270.834154243"
Mar 09 09:11:46 crc kubenswrapper[4792]: I0309 09:11:46.103144 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-2d7nf" podStartSLOduration=5.238215957 podStartE2EDuration="1m6.103119141s" podCreationTimestamp="2026-03-09 09:10:40 +0000 UTC" firstStartedPulling="2026-03-09 09:10:44.329896889 +0000 UTC m=+209.360097641" lastFinishedPulling="2026-03-09 09:11:45.194800073 +0000 UTC m=+270.225000825" observedRunningTime="2026-03-09 09:11:45.805758215 +0000 UTC m=+270.835958977" watchObservedRunningTime="2026-03-09 09:11:46.103119141 +0000 UTC m=+271.133319903"
Mar 09 09:11:46 crc kubenswrapper[4792]: I0309 09:11:46.107659 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-mj542"]
Mar 09 09:11:46 crc kubenswrapper[4792]: I0309 09:11:46.782850 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8hkmv" event={"ID":"047810b2-277c-4d4c-822a-98b6d2a91fcc","Type":"ContainerStarted","Data":"29997cf552b1be261417eea2595d97e67a0cea55f8ee28656b602e5aee87155c"}
Mar 09 09:11:46 crc kubenswrapper[4792]: I0309 09:11:46.783129 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-mj542" podUID="a19c063e-191a-487c-a491-0af8c6fc1e3f" containerName="registry-server" containerID="cri-o://7523a1cacb1393a013e37df62e1c59310b854237ad47937a8ff32cd51c6af927" gracePeriod=2
Mar 09 09:11:46 crc kubenswrapper[4792]: I0309 09:11:46.788816 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-7766fd6d68-cgzlz"
Mar 09 09:11:46 crc kubenswrapper[4792]: I0309 09:11:46.806781 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-8hkmv" podStartSLOduration=6.504429425 podStartE2EDuration="1m7.806753448s" podCreationTimestamp="2026-03-09 09:10:39 +0000 UTC" firstStartedPulling="2026-03-09 09:10:44.411044614 +0000 UTC m=+209.441245366" lastFinishedPulling="2026-03-09 09:11:45.713368647 +0000 UTC m=+270.743569389" observedRunningTime="2026-03-09 09:11:46.80204167 +0000 UTC m=+271.832242442" watchObservedRunningTime="2026-03-09 09:11:46.806753448 +0000 UTC m=+271.836954210"
Mar 09 09:11:47 crc kubenswrapper[4792]: I0309 09:11:47.201947 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mj542"
Mar 09 09:11:47 crc kubenswrapper[4792]: I0309 09:11:47.295822 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a19c063e-191a-487c-a491-0af8c6fc1e3f-catalog-content\") pod \"a19c063e-191a-487c-a491-0af8c6fc1e3f\" (UID: \"a19c063e-191a-487c-a491-0af8c6fc1e3f\") "
Mar 09 09:11:47 crc kubenswrapper[4792]: I0309 09:11:47.295988 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a19c063e-191a-487c-a491-0af8c6fc1e3f-utilities\") pod \"a19c063e-191a-487c-a491-0af8c6fc1e3f\" (UID: \"a19c063e-191a-487c-a491-0af8c6fc1e3f\") "
Mar 09 09:11:47 crc kubenswrapper[4792]: I0309 09:11:47.296035 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2brw5\" (UniqueName: \"kubernetes.io/projected/a19c063e-191a-487c-a491-0af8c6fc1e3f-kube-api-access-2brw5\") pod \"a19c063e-191a-487c-a491-0af8c6fc1e3f\" (UID: \"a19c063e-191a-487c-a491-0af8c6fc1e3f\") "
Mar 09 09:11:47 crc kubenswrapper[4792]: I0309 09:11:47.296735 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a19c063e-191a-487c-a491-0af8c6fc1e3f-utilities" (OuterVolumeSpecName: "utilities") pod "a19c063e-191a-487c-a491-0af8c6fc1e3f" (UID: "a19c063e-191a-487c-a491-0af8c6fc1e3f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 09:11:47 crc kubenswrapper[4792]: I0309 09:11:47.303212 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a19c063e-191a-487c-a491-0af8c6fc1e3f-kube-api-access-2brw5" (OuterVolumeSpecName: "kube-api-access-2brw5") pod "a19c063e-191a-487c-a491-0af8c6fc1e3f" (UID: "a19c063e-191a-487c-a491-0af8c6fc1e3f"). InnerVolumeSpecName "kube-api-access-2brw5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:11:47 crc kubenswrapper[4792]: I0309 09:11:47.326934 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a19c063e-191a-487c-a491-0af8c6fc1e3f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a19c063e-191a-487c-a491-0af8c6fc1e3f" (UID: "a19c063e-191a-487c-a491-0af8c6fc1e3f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 09:11:47 crc kubenswrapper[4792]: I0309 09:11:47.397878 4792 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a19c063e-191a-487c-a491-0af8c6fc1e3f-utilities\") on node \"crc\" DevicePath \"\""
Mar 09 09:11:47 crc kubenswrapper[4792]: I0309 09:11:47.398209 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2brw5\" (UniqueName: \"kubernetes.io/projected/a19c063e-191a-487c-a491-0af8c6fc1e3f-kube-api-access-2brw5\") on node \"crc\" DevicePath \"\""
Mar 09 09:11:47 crc kubenswrapper[4792]: I0309 09:11:47.398278 4792 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a19c063e-191a-487c-a491-0af8c6fc1e3f-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 09 09:11:47 crc kubenswrapper[4792]: I0309 09:11:47.790154 4792 generic.go:334] "Generic (PLEG): container finished" podID="a19c063e-191a-487c-a491-0af8c6fc1e3f" containerID="7523a1cacb1393a013e37df62e1c59310b854237ad47937a8ff32cd51c6af927" exitCode=0
Mar 09 09:11:47 crc kubenswrapper[4792]: I0309 09:11:47.790836 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mj542"
Mar 09 09:11:47 crc kubenswrapper[4792]: I0309 09:11:47.791227 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mj542" event={"ID":"a19c063e-191a-487c-a491-0af8c6fc1e3f","Type":"ContainerDied","Data":"7523a1cacb1393a013e37df62e1c59310b854237ad47937a8ff32cd51c6af927"}
Mar 09 09:11:47 crc kubenswrapper[4792]: I0309 09:11:47.791261 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mj542" event={"ID":"a19c063e-191a-487c-a491-0af8c6fc1e3f","Type":"ContainerDied","Data":"acaf5150deb5ad7326c3f17812ce48891e87a8ee247bcf7520ff1399033388d6"}
Mar 09 09:11:47 crc kubenswrapper[4792]: I0309 09:11:47.791279 4792 scope.go:117] "RemoveContainer" containerID="7523a1cacb1393a013e37df62e1c59310b854237ad47937a8ff32cd51c6af927"
Mar 09 09:11:47 crc kubenswrapper[4792]: I0309 09:11:47.807156 4792 scope.go:117] "RemoveContainer" containerID="35117fdc604b3ef1af1034456a1f7bb3d85d4e5dc802e188d8b19425889da322"
Mar 09 09:11:47 crc kubenswrapper[4792]: I0309 09:11:47.813420 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-mj542"]
Mar 09 09:11:47 crc kubenswrapper[4792]: I0309 09:11:47.818591 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-mj542"]
Mar 09 09:11:47 crc kubenswrapper[4792]: I0309 09:11:47.820369 4792 scope.go:117] "RemoveContainer" containerID="0b4bbfbcd3b525fe37edb82d16955bdb0246e5c49e433c4db4ea0b7dd859cdfa"
Mar 09 09:11:47 crc kubenswrapper[4792]: I0309 09:11:47.841327 4792 scope.go:117] "RemoveContainer" containerID="7523a1cacb1393a013e37df62e1c59310b854237ad47937a8ff32cd51c6af927"
Mar 09 09:11:47 crc kubenswrapper[4792]: E0309 09:11:47.842143 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7523a1cacb1393a013e37df62e1c59310b854237ad47937a8ff32cd51c6af927\": container with ID starting with 7523a1cacb1393a013e37df62e1c59310b854237ad47937a8ff32cd51c6af927 not found: ID does not exist" containerID="7523a1cacb1393a013e37df62e1c59310b854237ad47937a8ff32cd51c6af927"
Mar 09 09:11:47 crc kubenswrapper[4792]: I0309 09:11:47.842189 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7523a1cacb1393a013e37df62e1c59310b854237ad47937a8ff32cd51c6af927"} err="failed to get container status \"7523a1cacb1393a013e37df62e1c59310b854237ad47937a8ff32cd51c6af927\": rpc error: code = NotFound desc = could not find container \"7523a1cacb1393a013e37df62e1c59310b854237ad47937a8ff32cd51c6af927\": container with ID starting with 7523a1cacb1393a013e37df62e1c59310b854237ad47937a8ff32cd51c6af927 not found: ID does not exist"
Mar 09 09:11:47 crc kubenswrapper[4792]: I0309 09:11:47.842220 4792 scope.go:117] "RemoveContainer" containerID="35117fdc604b3ef1af1034456a1f7bb3d85d4e5dc802e188d8b19425889da322"
Mar 09 09:11:47 crc kubenswrapper[4792]: E0309 09:11:47.842777 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"35117fdc604b3ef1af1034456a1f7bb3d85d4e5dc802e188d8b19425889da322\": container with ID starting with 35117fdc604b3ef1af1034456a1f7bb3d85d4e5dc802e188d8b19425889da322 not found: ID does not exist" containerID="35117fdc604b3ef1af1034456a1f7bb3d85d4e5dc802e188d8b19425889da322"
Mar 09 09:11:47 crc kubenswrapper[4792]: I0309 09:11:47.842813 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"35117fdc604b3ef1af1034456a1f7bb3d85d4e5dc802e188d8b19425889da322"} err="failed to get container status \"35117fdc604b3ef1af1034456a1f7bb3d85d4e5dc802e188d8b19425889da322\": rpc error: code = NotFound desc = could not find container \"35117fdc604b3ef1af1034456a1f7bb3d85d4e5dc802e188d8b19425889da322\": container with ID starting with 35117fdc604b3ef1af1034456a1f7bb3d85d4e5dc802e188d8b19425889da322 not found: ID does not exist"
Mar 09 09:11:47 crc kubenswrapper[4792]: I0309 09:11:47.842837 4792 scope.go:117] "RemoveContainer" containerID="0b4bbfbcd3b525fe37edb82d16955bdb0246e5c49e433c4db4ea0b7dd859cdfa"
Mar 09 09:11:47 crc kubenswrapper[4792]: E0309 09:11:47.843435 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0b4bbfbcd3b525fe37edb82d16955bdb0246e5c49e433c4db4ea0b7dd859cdfa\": container with ID starting with 0b4bbfbcd3b525fe37edb82d16955bdb0246e5c49e433c4db4ea0b7dd859cdfa not found: ID does not exist" containerID="0b4bbfbcd3b525fe37edb82d16955bdb0246e5c49e433c4db4ea0b7dd859cdfa"
Mar 09 09:11:47 crc kubenswrapper[4792]: I0309 09:11:47.843459 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b4bbfbcd3b525fe37edb82d16955bdb0246e5c49e433c4db4ea0b7dd859cdfa"} err="failed to get container status \"0b4bbfbcd3b525fe37edb82d16955bdb0246e5c49e433c4db4ea0b7dd859cdfa\": rpc error: code = NotFound desc = could not find container \"0b4bbfbcd3b525fe37edb82d16955bdb0246e5c49e433c4db4ea0b7dd859cdfa\": container with ID starting with 0b4bbfbcd3b525fe37edb82d16955bdb0246e5c49e433c4db4ea0b7dd859cdfa not found: ID does not exist"
Mar 09 09:11:49 crc kubenswrapper[4792]: I0309 09:11:49.671624 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a19c063e-191a-487c-a491-0af8c6fc1e3f" path="/var/lib/kubelet/pods/a19c063e-191a-487c-a491-0af8c6fc1e3f/volumes"
Mar 09 09:11:50 crc kubenswrapper[4792]: I0309 09:11:50.722342 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-t2jhd"
Mar 09 09:11:50 crc kubenswrapper[4792]: I0309 09:11:50.722954 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-t2jhd"
Mar 09 09:11:50 crc kubenswrapper[4792]: I0309 09:11:50.765466 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-t2jhd"
Mar 09 09:11:50 crc kubenswrapper[4792]: I0309 09:11:50.850034 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-t2jhd"
Mar 09 09:11:50 crc kubenswrapper[4792]: I0309 09:11:50.852718 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-2d7nf"
Mar 09 09:11:50 crc kubenswrapper[4792]: I0309 09:11:50.852822 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-2d7nf"
Mar 09 09:11:50 crc kubenswrapper[4792]: I0309 09:11:50.908553 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-2d7nf"
Mar 09 09:11:50 crc kubenswrapper[4792]: I0309 09:11:50.939367 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-8hkmv"
Mar 09 09:11:50 crc kubenswrapper[4792]: I0309 09:11:50.939429 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-8hkmv"
Mar 09 09:11:50 crc kubenswrapper[4792]: I0309 09:11:50.944155 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-wtkcv"
Mar 09 09:11:50 crc kubenswrapper[4792]: I0309 09:11:50.944448 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-wtkcv"
Mar 09 09:11:50 crc kubenswrapper[4792]: I0309 09:11:50.977948 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-8hkmv"
Mar 09 09:11:50 crc kubenswrapper[4792]: I0309 09:11:50.985878 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-wtkcv"
Mar 09 09:11:51 crc kubenswrapper[4792]: I0309 09:11:51.866655 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-2d7nf"
Mar 09 09:11:51 crc kubenswrapper[4792]: I0309 09:11:51.867523 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-8hkmv"
Mar 09 09:11:51 crc kubenswrapper[4792]: I0309 09:11:51.872620 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-wtkcv"
Mar 09 09:11:53 crc kubenswrapper[4792]: I0309 09:11:53.092430 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2d7nf"]
Mar 09 09:11:53 crc kubenswrapper[4792]: I0309 09:11:53.298573 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wtkcv"]
Mar 09 09:11:53 crc kubenswrapper[4792]: I0309 09:11:53.419841 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-jsnbn"
Mar 09 09:11:53 crc kubenswrapper[4792]: I0309 09:11:53.485139 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-jsnbn"
Mar 09 09:11:53 crc kubenswrapper[4792]: I0309 09:11:53.691483 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-c4nwc"
Mar 09 09:11:53 crc kubenswrapper[4792]: I0309 09:11:53.752411 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-c4nwc"
Mar 09 09:11:54 crc kubenswrapper[4792]: I0309 09:11:54.833938 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-2d7nf" podUID="50c5e5dd-62cf-470c-a626-27cca12c69fb" containerName="registry-server" containerID="cri-o://e9a9097e4025841cb70ce7ed01eeeb94af5a9d20a2ce07efc77d0825d8efcd3c" gracePeriod=2
Mar 09 09:11:54 crc kubenswrapper[4792]: I0309 09:11:54.834289 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-wtkcv" podUID="c0500b46-411e-4371-83ae-1b148bf65ba9" containerName="registry-server" containerID="cri-o://287c47b44be20cd8d4219506710d16d385938a92bfdb51b9f77c5d44c8d6190f" gracePeriod=2
Mar 09 09:11:55 crc kubenswrapper[4792]: I0309 09:11:55.343763 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wtkcv"
Mar 09 09:11:55 crc kubenswrapper[4792]: I0309 09:11:55.417850 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rhlfl\" (UniqueName: \"kubernetes.io/projected/c0500b46-411e-4371-83ae-1b148bf65ba9-kube-api-access-rhlfl\") pod \"c0500b46-411e-4371-83ae-1b148bf65ba9\" (UID: \"c0500b46-411e-4371-83ae-1b148bf65ba9\") "
Mar 09 09:11:55 crc kubenswrapper[4792]: I0309 09:11:55.418436 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c0500b46-411e-4371-83ae-1b148bf65ba9-catalog-content\") pod \"c0500b46-411e-4371-83ae-1b148bf65ba9\" (UID: \"c0500b46-411e-4371-83ae-1b148bf65ba9\") "
Mar 09 09:11:55 crc kubenswrapper[4792]: I0309 09:11:55.418476 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c0500b46-411e-4371-83ae-1b148bf65ba9-utilities\") pod \"c0500b46-411e-4371-83ae-1b148bf65ba9\" (UID: \"c0500b46-411e-4371-83ae-1b148bf65ba9\") "
Mar 09 09:11:55 crc kubenswrapper[4792]: I0309 09:11:55.419306 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c0500b46-411e-4371-83ae-1b148bf65ba9-utilities" (OuterVolumeSpecName: "utilities") pod "c0500b46-411e-4371-83ae-1b148bf65ba9" (UID: "c0500b46-411e-4371-83ae-1b148bf65ba9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 09:11:55 crc kubenswrapper[4792]: I0309 09:11:55.423058 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c0500b46-411e-4371-83ae-1b148bf65ba9-kube-api-access-rhlfl" (OuterVolumeSpecName: "kube-api-access-rhlfl") pod "c0500b46-411e-4371-83ae-1b148bf65ba9" (UID: "c0500b46-411e-4371-83ae-1b148bf65ba9"). InnerVolumeSpecName "kube-api-access-rhlfl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:11:55 crc kubenswrapper[4792]: I0309 09:11:55.431725 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2d7nf"
Mar 09 09:11:55 crc kubenswrapper[4792]: I0309 09:11:55.476395 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c0500b46-411e-4371-83ae-1b148bf65ba9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c0500b46-411e-4371-83ae-1b148bf65ba9" (UID: "c0500b46-411e-4371-83ae-1b148bf65ba9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 09:11:55 crc kubenswrapper[4792]: I0309 09:11:55.519361 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-twjdp\" (UniqueName: \"kubernetes.io/projected/50c5e5dd-62cf-470c-a626-27cca12c69fb-kube-api-access-twjdp\") pod \"50c5e5dd-62cf-470c-a626-27cca12c69fb\" (UID: \"50c5e5dd-62cf-470c-a626-27cca12c69fb\") "
Mar 09 09:11:55 crc kubenswrapper[4792]: I0309 09:11:55.519965 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/50c5e5dd-62cf-470c-a626-27cca12c69fb-catalog-content\") pod \"50c5e5dd-62cf-470c-a626-27cca12c69fb\" (UID: \"50c5e5dd-62cf-470c-a626-27cca12c69fb\") "
Mar 09 09:11:55 crc kubenswrapper[4792]: I0309 09:11:55.520112 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/50c5e5dd-62cf-470c-a626-27cca12c69fb-utilities\") pod \"50c5e5dd-62cf-470c-a626-27cca12c69fb\" (UID: \"50c5e5dd-62cf-470c-a626-27cca12c69fb\") "
Mar 09 09:11:55 crc kubenswrapper[4792]: I0309 09:11:55.520824 4792 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c0500b46-411e-4371-83ae-1b148bf65ba9-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 09 09:11:55 crc kubenswrapper[4792]: I0309 09:11:55.520847 4792 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c0500b46-411e-4371-83ae-1b148bf65ba9-utilities\") on node \"crc\" DevicePath \"\""
Mar 09 09:11:55 crc kubenswrapper[4792]: I0309 09:11:55.520859 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rhlfl\" (UniqueName: \"kubernetes.io/projected/c0500b46-411e-4371-83ae-1b148bf65ba9-kube-api-access-rhlfl\") on node \"crc\" DevicePath \"\""
Mar 09 09:11:55 crc kubenswrapper[4792]: I0309 09:11:55.521181 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/50c5e5dd-62cf-470c-a626-27cca12c69fb-utilities" (OuterVolumeSpecName: "utilities") pod "50c5e5dd-62cf-470c-a626-27cca12c69fb" (UID: "50c5e5dd-62cf-470c-a626-27cca12c69fb"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 09:11:55 crc kubenswrapper[4792]: I0309 09:11:55.522727 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/50c5e5dd-62cf-470c-a626-27cca12c69fb-kube-api-access-twjdp" (OuterVolumeSpecName: "kube-api-access-twjdp") pod "50c5e5dd-62cf-470c-a626-27cca12c69fb" (UID: "50c5e5dd-62cf-470c-a626-27cca12c69fb"). InnerVolumeSpecName "kube-api-access-twjdp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:11:55 crc kubenswrapper[4792]: I0309 09:11:55.574293 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/50c5e5dd-62cf-470c-a626-27cca12c69fb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "50c5e5dd-62cf-470c-a626-27cca12c69fb" (UID: "50c5e5dd-62cf-470c-a626-27cca12c69fb"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 09:11:55 crc kubenswrapper[4792]: I0309 09:11:55.621963 4792 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/50c5e5dd-62cf-470c-a626-27cca12c69fb-utilities\") on node \"crc\" DevicePath \"\""
Mar 09 09:11:55 crc kubenswrapper[4792]: I0309 09:11:55.622003 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-twjdp\" (UniqueName: \"kubernetes.io/projected/50c5e5dd-62cf-470c-a626-27cca12c69fb-kube-api-access-twjdp\") on node \"crc\" DevicePath \"\""
Mar 09 09:11:55 crc kubenswrapper[4792]: I0309 09:11:55.622015 4792 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/50c5e5dd-62cf-470c-a626-27cca12c69fb-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 09 09:11:55 crc kubenswrapper[4792]: I0309 09:11:55.841146 4792 generic.go:334] "Generic (PLEG): container finished" podID="c0500b46-411e-4371-83ae-1b148bf65ba9" containerID="287c47b44be20cd8d4219506710d16d385938a92bfdb51b9f77c5d44c8d6190f" exitCode=0
Mar 09 09:11:55 crc kubenswrapper[4792]: I0309 09:11:55.841180 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wtkcv"
Mar 09 09:11:55 crc kubenswrapper[4792]: I0309 09:11:55.841203 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wtkcv" event={"ID":"c0500b46-411e-4371-83ae-1b148bf65ba9","Type":"ContainerDied","Data":"287c47b44be20cd8d4219506710d16d385938a92bfdb51b9f77c5d44c8d6190f"}
Mar 09 09:11:55 crc kubenswrapper[4792]: I0309 09:11:55.841244 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wtkcv" event={"ID":"c0500b46-411e-4371-83ae-1b148bf65ba9","Type":"ContainerDied","Data":"12ddd27962373accb9db70cd2705c95b3bf27cb10beabc533c155430679fbeaa"}
Mar 09 09:11:55 crc kubenswrapper[4792]: I0309 09:11:55.841279 4792 scope.go:117] "RemoveContainer" containerID="287c47b44be20cd8d4219506710d16d385938a92bfdb51b9f77c5d44c8d6190f"
Mar 09 09:11:55 crc kubenswrapper[4792]: I0309 09:11:55.845246 4792 generic.go:334] "Generic (PLEG): container finished" podID="50c5e5dd-62cf-470c-a626-27cca12c69fb" containerID="e9a9097e4025841cb70ce7ed01eeeb94af5a9d20a2ce07efc77d0825d8efcd3c" exitCode=0
Mar 09 09:11:55 crc kubenswrapper[4792]: I0309 09:11:55.845295 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2d7nf" event={"ID":"50c5e5dd-62cf-470c-a626-27cca12c69fb","Type":"ContainerDied","Data":"e9a9097e4025841cb70ce7ed01eeeb94af5a9d20a2ce07efc77d0825d8efcd3c"}
Mar 09 09:11:55 crc kubenswrapper[4792]: I0309 09:11:55.845328 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2d7nf" event={"ID":"50c5e5dd-62cf-470c-a626-27cca12c69fb","Type":"ContainerDied","Data":"25e168d0cb6ce00f938ad43401b9b7045e4c307aba5fff2ba079da25d5de36e8"}
Mar 09 09:11:55 crc kubenswrapper[4792]: I0309 09:11:55.845403 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2d7nf"
Mar 09 09:11:55 crc kubenswrapper[4792]: I0309 09:11:55.860801 4792 scope.go:117] "RemoveContainer" containerID="2f4dd7cea731820ef28d8f419593689c6c007509f1874234069c84d810af9e79"
Mar 09 09:11:55 crc kubenswrapper[4792]: I0309 09:11:55.866178 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wtkcv"]
Mar 09 09:11:55 crc kubenswrapper[4792]: I0309 09:11:55.870460 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-wtkcv"]
Mar 09 09:11:55 crc kubenswrapper[4792]: I0309 09:11:55.884585 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2d7nf"]
Mar 09 09:11:55 crc kubenswrapper[4792]: I0309 09:11:55.888444 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-2d7nf"]
Mar 09 09:11:55 crc kubenswrapper[4792]: I0309 09:11:55.890329 4792 scope.go:117] "RemoveContainer" containerID="af22a1a4ae5b3367befd44d8a3b4d571775717fa86727e9398acdd0a48a10cde"
Mar 09 09:11:55 crc kubenswrapper[4792]: I0309 09:11:55.905606 4792 scope.go:117] "RemoveContainer" containerID="287c47b44be20cd8d4219506710d16d385938a92bfdb51b9f77c5d44c8d6190f"
Mar 09 09:11:55 crc kubenswrapper[4792]: E0309 09:11:55.906126 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"287c47b44be20cd8d4219506710d16d385938a92bfdb51b9f77c5d44c8d6190f\": container with ID starting with 287c47b44be20cd8d4219506710d16d385938a92bfdb51b9f77c5d44c8d6190f not found: ID does not exist" containerID="287c47b44be20cd8d4219506710d16d385938a92bfdb51b9f77c5d44c8d6190f"
Mar 09 09:11:55 crc kubenswrapper[4792]: I0309 09:11:55.906167 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"287c47b44be20cd8d4219506710d16d385938a92bfdb51b9f77c5d44c8d6190f"} err="failed to get container status \"287c47b44be20cd8d4219506710d16d385938a92bfdb51b9f77c5d44c8d6190f\": rpc error: code = NotFound desc = could not find container \"287c47b44be20cd8d4219506710d16d385938a92bfdb51b9f77c5d44c8d6190f\": container with ID starting with 287c47b44be20cd8d4219506710d16d385938a92bfdb51b9f77c5d44c8d6190f not found: ID does not exist"
Mar 09 09:11:55 crc kubenswrapper[4792]: I0309 09:11:55.906195 4792 scope.go:117] "RemoveContainer" containerID="2f4dd7cea731820ef28d8f419593689c6c007509f1874234069c84d810af9e79"
Mar 09 09:11:55 crc kubenswrapper[4792]: E0309 09:11:55.906439 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2f4dd7cea731820ef28d8f419593689c6c007509f1874234069c84d810af9e79\": container with ID starting with 2f4dd7cea731820ef28d8f419593689c6c007509f1874234069c84d810af9e79 not found: ID does not exist" containerID="2f4dd7cea731820ef28d8f419593689c6c007509f1874234069c84d810af9e79"
Mar 09 09:11:55 crc kubenswrapper[4792]: I0309 09:11:55.906466 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f4dd7cea731820ef28d8f419593689c6c007509f1874234069c84d810af9e79"} err="failed to get container status \"2f4dd7cea731820ef28d8f419593689c6c007509f1874234069c84d810af9e79\": rpc error: code = NotFound desc = could not find container \"2f4dd7cea731820ef28d8f419593689c6c007509f1874234069c84d810af9e79\": container with ID starting with 2f4dd7cea731820ef28d8f419593689c6c007509f1874234069c84d810af9e79 not found: ID does not exist"
Mar 09 09:11:55 crc kubenswrapper[4792]: I0309 09:11:55.906483 4792 scope.go:117] "RemoveContainer" containerID="af22a1a4ae5b3367befd44d8a3b4d571775717fa86727e9398acdd0a48a10cde"
Mar 09 09:11:55 crc kubenswrapper[4792]: E0309 09:11:55.906740 4792 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"af22a1a4ae5b3367befd44d8a3b4d571775717fa86727e9398acdd0a48a10cde\": container with ID starting with af22a1a4ae5b3367befd44d8a3b4d571775717fa86727e9398acdd0a48a10cde not found: ID does not exist" containerID="af22a1a4ae5b3367befd44d8a3b4d571775717fa86727e9398acdd0a48a10cde" Mar 09 09:11:55 crc kubenswrapper[4792]: I0309 09:11:55.906768 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"af22a1a4ae5b3367befd44d8a3b4d571775717fa86727e9398acdd0a48a10cde"} err="failed to get container status \"af22a1a4ae5b3367befd44d8a3b4d571775717fa86727e9398acdd0a48a10cde\": rpc error: code = NotFound desc = could not find container \"af22a1a4ae5b3367befd44d8a3b4d571775717fa86727e9398acdd0a48a10cde\": container with ID starting with af22a1a4ae5b3367befd44d8a3b4d571775717fa86727e9398acdd0a48a10cde not found: ID does not exist" Mar 09 09:11:55 crc kubenswrapper[4792]: I0309 09:11:55.906785 4792 scope.go:117] "RemoveContainer" containerID="e9a9097e4025841cb70ce7ed01eeeb94af5a9d20a2ce07efc77d0825d8efcd3c" Mar 09 09:11:55 crc kubenswrapper[4792]: I0309 09:11:55.923617 4792 scope.go:117] "RemoveContainer" containerID="46943d8495d95b424e01e9c9770ab664bb81bcf005af9baf9109c88314ca54c7" Mar 09 09:11:55 crc kubenswrapper[4792]: I0309 09:11:55.947316 4792 scope.go:117] "RemoveContainer" containerID="9fdca09e9f110fb0bf660bc1be638e39f2a36e2043b74d947175a185f7cdd292" Mar 09 09:11:55 crc kubenswrapper[4792]: I0309 09:11:55.963725 4792 scope.go:117] "RemoveContainer" containerID="e9a9097e4025841cb70ce7ed01eeeb94af5a9d20a2ce07efc77d0825d8efcd3c" Mar 09 09:11:55 crc kubenswrapper[4792]: E0309 09:11:55.965893 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e9a9097e4025841cb70ce7ed01eeeb94af5a9d20a2ce07efc77d0825d8efcd3c\": container with ID starting with 
e9a9097e4025841cb70ce7ed01eeeb94af5a9d20a2ce07efc77d0825d8efcd3c not found: ID does not exist" containerID="e9a9097e4025841cb70ce7ed01eeeb94af5a9d20a2ce07efc77d0825d8efcd3c" Mar 09 09:11:55 crc kubenswrapper[4792]: I0309 09:11:55.965953 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9a9097e4025841cb70ce7ed01eeeb94af5a9d20a2ce07efc77d0825d8efcd3c"} err="failed to get container status \"e9a9097e4025841cb70ce7ed01eeeb94af5a9d20a2ce07efc77d0825d8efcd3c\": rpc error: code = NotFound desc = could not find container \"e9a9097e4025841cb70ce7ed01eeeb94af5a9d20a2ce07efc77d0825d8efcd3c\": container with ID starting with e9a9097e4025841cb70ce7ed01eeeb94af5a9d20a2ce07efc77d0825d8efcd3c not found: ID does not exist" Mar 09 09:11:55 crc kubenswrapper[4792]: I0309 09:11:55.965985 4792 scope.go:117] "RemoveContainer" containerID="46943d8495d95b424e01e9c9770ab664bb81bcf005af9baf9109c88314ca54c7" Mar 09 09:11:55 crc kubenswrapper[4792]: E0309 09:11:55.967384 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"46943d8495d95b424e01e9c9770ab664bb81bcf005af9baf9109c88314ca54c7\": container with ID starting with 46943d8495d95b424e01e9c9770ab664bb81bcf005af9baf9109c88314ca54c7 not found: ID does not exist" containerID="46943d8495d95b424e01e9c9770ab664bb81bcf005af9baf9109c88314ca54c7" Mar 09 09:11:55 crc kubenswrapper[4792]: I0309 09:11:55.967468 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"46943d8495d95b424e01e9c9770ab664bb81bcf005af9baf9109c88314ca54c7"} err="failed to get container status \"46943d8495d95b424e01e9c9770ab664bb81bcf005af9baf9109c88314ca54c7\": rpc error: code = NotFound desc = could not find container \"46943d8495d95b424e01e9c9770ab664bb81bcf005af9baf9109c88314ca54c7\": container with ID starting with 46943d8495d95b424e01e9c9770ab664bb81bcf005af9baf9109c88314ca54c7 not found: ID does not 
exist" Mar 09 09:11:55 crc kubenswrapper[4792]: I0309 09:11:55.967515 4792 scope.go:117] "RemoveContainer" containerID="9fdca09e9f110fb0bf660bc1be638e39f2a36e2043b74d947175a185f7cdd292" Mar 09 09:11:55 crc kubenswrapper[4792]: E0309 09:11:55.967925 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9fdca09e9f110fb0bf660bc1be638e39f2a36e2043b74d947175a185f7cdd292\": container with ID starting with 9fdca09e9f110fb0bf660bc1be638e39f2a36e2043b74d947175a185f7cdd292 not found: ID does not exist" containerID="9fdca09e9f110fb0bf660bc1be638e39f2a36e2043b74d947175a185f7cdd292" Mar 09 09:11:55 crc kubenswrapper[4792]: I0309 09:11:55.967992 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9fdca09e9f110fb0bf660bc1be638e39f2a36e2043b74d947175a185f7cdd292"} err="failed to get container status \"9fdca09e9f110fb0bf660bc1be638e39f2a36e2043b74d947175a185f7cdd292\": rpc error: code = NotFound desc = could not find container \"9fdca09e9f110fb0bf660bc1be638e39f2a36e2043b74d947175a185f7cdd292\": container with ID starting with 9fdca09e9f110fb0bf660bc1be638e39f2a36e2043b74d947175a185f7cdd292 not found: ID does not exist" Mar 09 09:11:57 crc kubenswrapper[4792]: I0309 09:11:57.672197 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="50c5e5dd-62cf-470c-a626-27cca12c69fb" path="/var/lib/kubelet/pods/50c5e5dd-62cf-470c-a626-27cca12c69fb/volumes" Mar 09 09:11:57 crc kubenswrapper[4792]: I0309 09:11:57.672842 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c0500b46-411e-4371-83ae-1b148bf65ba9" path="/var/lib/kubelet/pods/c0500b46-411e-4371-83ae-1b148bf65ba9/volumes" Mar 09 09:11:57 crc kubenswrapper[4792]: I0309 09:11:57.891573 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-c4nwc"] Mar 09 09:11:57 crc kubenswrapper[4792]: I0309 09:11:57.892194 4792 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-c4nwc" podUID="c0aa86a9-9ed9-492f-ac14-43e14abf1f2c" containerName="registry-server" containerID="cri-o://9b21ea77a7f25dff02c6e7dafdc5b8e63883444513fad376003818c6cfcbfd8a" gracePeriod=2 Mar 09 09:11:58 crc kubenswrapper[4792]: I0309 09:11:58.395186 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-c4nwc" Mar 09 09:11:58 crc kubenswrapper[4792]: I0309 09:11:58.459596 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c0aa86a9-9ed9-492f-ac14-43e14abf1f2c-catalog-content\") pod \"c0aa86a9-9ed9-492f-ac14-43e14abf1f2c\" (UID: \"c0aa86a9-9ed9-492f-ac14-43e14abf1f2c\") " Mar 09 09:11:58 crc kubenswrapper[4792]: I0309 09:11:58.459651 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c0aa86a9-9ed9-492f-ac14-43e14abf1f2c-utilities\") pod \"c0aa86a9-9ed9-492f-ac14-43e14abf1f2c\" (UID: \"c0aa86a9-9ed9-492f-ac14-43e14abf1f2c\") " Mar 09 09:11:58 crc kubenswrapper[4792]: I0309 09:11:58.459675 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-64msg\" (UniqueName: \"kubernetes.io/projected/c0aa86a9-9ed9-492f-ac14-43e14abf1f2c-kube-api-access-64msg\") pod \"c0aa86a9-9ed9-492f-ac14-43e14abf1f2c\" (UID: \"c0aa86a9-9ed9-492f-ac14-43e14abf1f2c\") " Mar 09 09:11:58 crc kubenswrapper[4792]: I0309 09:11:58.460824 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c0aa86a9-9ed9-492f-ac14-43e14abf1f2c-utilities" (OuterVolumeSpecName: "utilities") pod "c0aa86a9-9ed9-492f-ac14-43e14abf1f2c" (UID: "c0aa86a9-9ed9-492f-ac14-43e14abf1f2c"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:11:58 crc kubenswrapper[4792]: I0309 09:11:58.529139 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c0aa86a9-9ed9-492f-ac14-43e14abf1f2c-kube-api-access-64msg" (OuterVolumeSpecName: "kube-api-access-64msg") pod "c0aa86a9-9ed9-492f-ac14-43e14abf1f2c" (UID: "c0aa86a9-9ed9-492f-ac14-43e14abf1f2c"). InnerVolumeSpecName "kube-api-access-64msg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:11:58 crc kubenswrapper[4792]: I0309 09:11:58.561897 4792 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c0aa86a9-9ed9-492f-ac14-43e14abf1f2c-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 09:11:58 crc kubenswrapper[4792]: I0309 09:11:58.562148 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-64msg\" (UniqueName: \"kubernetes.io/projected/c0aa86a9-9ed9-492f-ac14-43e14abf1f2c-kube-api-access-64msg\") on node \"crc\" DevicePath \"\"" Mar 09 09:11:58 crc kubenswrapper[4792]: I0309 09:11:58.595047 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c0aa86a9-9ed9-492f-ac14-43e14abf1f2c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c0aa86a9-9ed9-492f-ac14-43e14abf1f2c" (UID: "c0aa86a9-9ed9-492f-ac14-43e14abf1f2c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:11:58 crc kubenswrapper[4792]: I0309 09:11:58.663601 4792 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c0aa86a9-9ed9-492f-ac14-43e14abf1f2c-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 09:11:58 crc kubenswrapper[4792]: I0309 09:11:58.867807 4792 generic.go:334] "Generic (PLEG): container finished" podID="c0aa86a9-9ed9-492f-ac14-43e14abf1f2c" containerID="9b21ea77a7f25dff02c6e7dafdc5b8e63883444513fad376003818c6cfcbfd8a" exitCode=0 Mar 09 09:11:58 crc kubenswrapper[4792]: I0309 09:11:58.867867 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c4nwc" event={"ID":"c0aa86a9-9ed9-492f-ac14-43e14abf1f2c","Type":"ContainerDied","Data":"9b21ea77a7f25dff02c6e7dafdc5b8e63883444513fad376003818c6cfcbfd8a"} Mar 09 09:11:58 crc kubenswrapper[4792]: I0309 09:11:58.867911 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c4nwc" event={"ID":"c0aa86a9-9ed9-492f-ac14-43e14abf1f2c","Type":"ContainerDied","Data":"f02596659545e20ef80e22b683a017f816b7ed2966e0fa65f832fca1d2d8b19a"} Mar 09 09:11:58 crc kubenswrapper[4792]: I0309 09:11:58.867933 4792 scope.go:117] "RemoveContainer" containerID="9b21ea77a7f25dff02c6e7dafdc5b8e63883444513fad376003818c6cfcbfd8a" Mar 09 09:11:58 crc kubenswrapper[4792]: I0309 09:11:58.869242 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-c4nwc" Mar 09 09:11:58 crc kubenswrapper[4792]: I0309 09:11:58.886047 4792 scope.go:117] "RemoveContainer" containerID="c251e1816aa2ae69481d1f29b55021e0a8ebd3eef2d59cafbf8af6292f4cfd00" Mar 09 09:11:58 crc kubenswrapper[4792]: I0309 09:11:58.906860 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-c4nwc"] Mar 09 09:11:58 crc kubenswrapper[4792]: I0309 09:11:58.915704 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-c4nwc"] Mar 09 09:11:58 crc kubenswrapper[4792]: I0309 09:11:58.923480 4792 scope.go:117] "RemoveContainer" containerID="0052bf51352202144c3072e221ebc38403c2456047744c527662d8c2efa0ebb6" Mar 09 09:11:58 crc kubenswrapper[4792]: I0309 09:11:58.944138 4792 scope.go:117] "RemoveContainer" containerID="9b21ea77a7f25dff02c6e7dafdc5b8e63883444513fad376003818c6cfcbfd8a" Mar 09 09:11:58 crc kubenswrapper[4792]: E0309 09:11:58.944512 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9b21ea77a7f25dff02c6e7dafdc5b8e63883444513fad376003818c6cfcbfd8a\": container with ID starting with 9b21ea77a7f25dff02c6e7dafdc5b8e63883444513fad376003818c6cfcbfd8a not found: ID does not exist" containerID="9b21ea77a7f25dff02c6e7dafdc5b8e63883444513fad376003818c6cfcbfd8a" Mar 09 09:11:58 crc kubenswrapper[4792]: I0309 09:11:58.944554 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b21ea77a7f25dff02c6e7dafdc5b8e63883444513fad376003818c6cfcbfd8a"} err="failed to get container status \"9b21ea77a7f25dff02c6e7dafdc5b8e63883444513fad376003818c6cfcbfd8a\": rpc error: code = NotFound desc = could not find container \"9b21ea77a7f25dff02c6e7dafdc5b8e63883444513fad376003818c6cfcbfd8a\": container with ID starting with 9b21ea77a7f25dff02c6e7dafdc5b8e63883444513fad376003818c6cfcbfd8a not found: ID does 
not exist" Mar 09 09:11:58 crc kubenswrapper[4792]: I0309 09:11:58.944589 4792 scope.go:117] "RemoveContainer" containerID="c251e1816aa2ae69481d1f29b55021e0a8ebd3eef2d59cafbf8af6292f4cfd00" Mar 09 09:11:58 crc kubenswrapper[4792]: E0309 09:11:58.944886 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c251e1816aa2ae69481d1f29b55021e0a8ebd3eef2d59cafbf8af6292f4cfd00\": container with ID starting with c251e1816aa2ae69481d1f29b55021e0a8ebd3eef2d59cafbf8af6292f4cfd00 not found: ID does not exist" containerID="c251e1816aa2ae69481d1f29b55021e0a8ebd3eef2d59cafbf8af6292f4cfd00" Mar 09 09:11:58 crc kubenswrapper[4792]: I0309 09:11:58.944912 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c251e1816aa2ae69481d1f29b55021e0a8ebd3eef2d59cafbf8af6292f4cfd00"} err="failed to get container status \"c251e1816aa2ae69481d1f29b55021e0a8ebd3eef2d59cafbf8af6292f4cfd00\": rpc error: code = NotFound desc = could not find container \"c251e1816aa2ae69481d1f29b55021e0a8ebd3eef2d59cafbf8af6292f4cfd00\": container with ID starting with c251e1816aa2ae69481d1f29b55021e0a8ebd3eef2d59cafbf8af6292f4cfd00 not found: ID does not exist" Mar 09 09:11:58 crc kubenswrapper[4792]: I0309 09:11:58.944928 4792 scope.go:117] "RemoveContainer" containerID="0052bf51352202144c3072e221ebc38403c2456047744c527662d8c2efa0ebb6" Mar 09 09:11:58 crc kubenswrapper[4792]: E0309 09:11:58.945730 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0052bf51352202144c3072e221ebc38403c2456047744c527662d8c2efa0ebb6\": container with ID starting with 0052bf51352202144c3072e221ebc38403c2456047744c527662d8c2efa0ebb6 not found: ID does not exist" containerID="0052bf51352202144c3072e221ebc38403c2456047744c527662d8c2efa0ebb6" Mar 09 09:11:58 crc kubenswrapper[4792]: I0309 09:11:58.945774 4792 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0052bf51352202144c3072e221ebc38403c2456047744c527662d8c2efa0ebb6"} err="failed to get container status \"0052bf51352202144c3072e221ebc38403c2456047744c527662d8c2efa0ebb6\": rpc error: code = NotFound desc = could not find container \"0052bf51352202144c3072e221ebc38403c2456047744c527662d8c2efa0ebb6\": container with ID starting with 0052bf51352202144c3072e221ebc38403c2456047744c527662d8c2efa0ebb6 not found: ID does not exist" Mar 09 09:11:59 crc kubenswrapper[4792]: I0309 09:11:59.675481 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c0aa86a9-9ed9-492f-ac14-43e14abf1f2c" path="/var/lib/kubelet/pods/c0aa86a9-9ed9-492f-ac14-43e14abf1f2c/volumes" Mar 09 09:12:00 crc kubenswrapper[4792]: I0309 09:12:00.459210 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550792-cq6k6"] Mar 09 09:12:00 crc kubenswrapper[4792]: E0309 09:12:00.459475 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50c5e5dd-62cf-470c-a626-27cca12c69fb" containerName="registry-server" Mar 09 09:12:00 crc kubenswrapper[4792]: I0309 09:12:00.459491 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="50c5e5dd-62cf-470c-a626-27cca12c69fb" containerName="registry-server" Mar 09 09:12:00 crc kubenswrapper[4792]: E0309 09:12:00.459503 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0500b46-411e-4371-83ae-1b148bf65ba9" containerName="registry-server" Mar 09 09:12:00 crc kubenswrapper[4792]: I0309 09:12:00.459510 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0500b46-411e-4371-83ae-1b148bf65ba9" containerName="registry-server" Mar 09 09:12:00 crc kubenswrapper[4792]: E0309 09:12:00.459524 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0500b46-411e-4371-83ae-1b148bf65ba9" containerName="extract-utilities" Mar 09 09:12:00 crc kubenswrapper[4792]: I0309 09:12:00.459531 4792 
state_mem.go:107] "Deleted CPUSet assignment" podUID="c0500b46-411e-4371-83ae-1b148bf65ba9" containerName="extract-utilities" Mar 09 09:12:00 crc kubenswrapper[4792]: E0309 09:12:00.459539 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a19c063e-191a-487c-a491-0af8c6fc1e3f" containerName="extract-utilities" Mar 09 09:12:00 crc kubenswrapper[4792]: I0309 09:12:00.459545 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="a19c063e-191a-487c-a491-0af8c6fc1e3f" containerName="extract-utilities" Mar 09 09:12:00 crc kubenswrapper[4792]: E0309 09:12:00.459552 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0aa86a9-9ed9-492f-ac14-43e14abf1f2c" containerName="registry-server" Mar 09 09:12:00 crc kubenswrapper[4792]: I0309 09:12:00.459557 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0aa86a9-9ed9-492f-ac14-43e14abf1f2c" containerName="registry-server" Mar 09 09:12:00 crc kubenswrapper[4792]: E0309 09:12:00.459566 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a19c063e-191a-487c-a491-0af8c6fc1e3f" containerName="extract-content" Mar 09 09:12:00 crc kubenswrapper[4792]: I0309 09:12:00.459572 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="a19c063e-191a-487c-a491-0af8c6fc1e3f" containerName="extract-content" Mar 09 09:12:00 crc kubenswrapper[4792]: E0309 09:12:00.459579 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0aa86a9-9ed9-492f-ac14-43e14abf1f2c" containerName="extract-utilities" Mar 09 09:12:00 crc kubenswrapper[4792]: I0309 09:12:00.459584 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0aa86a9-9ed9-492f-ac14-43e14abf1f2c" containerName="extract-utilities" Mar 09 09:12:00 crc kubenswrapper[4792]: E0309 09:12:00.459594 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50c5e5dd-62cf-470c-a626-27cca12c69fb" containerName="extract-utilities" Mar 09 09:12:00 crc kubenswrapper[4792]: I0309 09:12:00.459600 4792 
state_mem.go:107] "Deleted CPUSet assignment" podUID="50c5e5dd-62cf-470c-a626-27cca12c69fb" containerName="extract-utilities" Mar 09 09:12:00 crc kubenswrapper[4792]: E0309 09:12:00.459609 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0aa86a9-9ed9-492f-ac14-43e14abf1f2c" containerName="extract-content" Mar 09 09:12:00 crc kubenswrapper[4792]: I0309 09:12:00.459616 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0aa86a9-9ed9-492f-ac14-43e14abf1f2c" containerName="extract-content" Mar 09 09:12:00 crc kubenswrapper[4792]: E0309 09:12:00.459624 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0500b46-411e-4371-83ae-1b148bf65ba9" containerName="extract-content" Mar 09 09:12:00 crc kubenswrapper[4792]: I0309 09:12:00.459630 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0500b46-411e-4371-83ae-1b148bf65ba9" containerName="extract-content" Mar 09 09:12:00 crc kubenswrapper[4792]: E0309 09:12:00.459638 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a19c063e-191a-487c-a491-0af8c6fc1e3f" containerName="registry-server" Mar 09 09:12:00 crc kubenswrapper[4792]: I0309 09:12:00.459643 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="a19c063e-191a-487c-a491-0af8c6fc1e3f" containerName="registry-server" Mar 09 09:12:00 crc kubenswrapper[4792]: E0309 09:12:00.459656 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50c5e5dd-62cf-470c-a626-27cca12c69fb" containerName="extract-content" Mar 09 09:12:00 crc kubenswrapper[4792]: I0309 09:12:00.459663 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="50c5e5dd-62cf-470c-a626-27cca12c69fb" containerName="extract-content" Mar 09 09:12:00 crc kubenswrapper[4792]: I0309 09:12:00.459750 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="a19c063e-191a-487c-a491-0af8c6fc1e3f" containerName="registry-server" Mar 09 09:12:00 crc kubenswrapper[4792]: I0309 09:12:00.459761 4792 
memory_manager.go:354] "RemoveStaleState removing state" podUID="50c5e5dd-62cf-470c-a626-27cca12c69fb" containerName="registry-server" Mar 09 09:12:00 crc kubenswrapper[4792]: I0309 09:12:00.459771 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="c0500b46-411e-4371-83ae-1b148bf65ba9" containerName="registry-server" Mar 09 09:12:00 crc kubenswrapper[4792]: I0309 09:12:00.459777 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="c0aa86a9-9ed9-492f-ac14-43e14abf1f2c" containerName="registry-server" Mar 09 09:12:00 crc kubenswrapper[4792]: I0309 09:12:00.460218 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550792-cq6k6" Mar 09 09:12:00 crc kubenswrapper[4792]: I0309 09:12:00.463382 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fwclj" Mar 09 09:12:00 crc kubenswrapper[4792]: I0309 09:12:00.463800 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 09:12:00 crc kubenswrapper[4792]: I0309 09:12:00.463925 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 09:12:00 crc kubenswrapper[4792]: I0309 09:12:00.481705 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550792-cq6k6"] Mar 09 09:12:00 crc kubenswrapper[4792]: I0309 09:12:00.590819 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qhbmj\" (UniqueName: \"kubernetes.io/projected/5fc195fc-d546-4720-9790-ddadfb09282b-kube-api-access-qhbmj\") pod \"auto-csr-approver-29550792-cq6k6\" (UID: \"5fc195fc-d546-4720-9790-ddadfb09282b\") " pod="openshift-infra/auto-csr-approver-29550792-cq6k6" Mar 09 09:12:00 crc kubenswrapper[4792]: I0309 09:12:00.691751 4792 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-qhbmj\" (UniqueName: \"kubernetes.io/projected/5fc195fc-d546-4720-9790-ddadfb09282b-kube-api-access-qhbmj\") pod \"auto-csr-approver-29550792-cq6k6\" (UID: \"5fc195fc-d546-4720-9790-ddadfb09282b\") " pod="openshift-infra/auto-csr-approver-29550792-cq6k6" Mar 09 09:12:00 crc kubenswrapper[4792]: I0309 09:12:00.710886 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qhbmj\" (UniqueName: \"kubernetes.io/projected/5fc195fc-d546-4720-9790-ddadfb09282b-kube-api-access-qhbmj\") pod \"auto-csr-approver-29550792-cq6k6\" (UID: \"5fc195fc-d546-4720-9790-ddadfb09282b\") " pod="openshift-infra/auto-csr-approver-29550792-cq6k6" Mar 09 09:12:00 crc kubenswrapper[4792]: I0309 09:12:00.774114 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550792-cq6k6" Mar 09 09:12:00 crc kubenswrapper[4792]: I0309 09:12:00.929317 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7766fd6d68-cgzlz"] Mar 09 09:12:00 crc kubenswrapper[4792]: I0309 09:12:00.929797 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-7766fd6d68-cgzlz" podUID="00ec8adb-4eec-4ea0-b843-628124d00d5c" containerName="controller-manager" containerID="cri-o://3ae008b4ec8ccc8a76475bb45bb2b50a3cd71ac70472cdb0a4e1a104bb4d6ec9" gracePeriod=30 Mar 09 09:12:01 crc kubenswrapper[4792]: I0309 09:12:01.065129 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7844b84bbd-6dsp6"] Mar 09 09:12:01 crc kubenswrapper[4792]: I0309 09:12:01.065373 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-7844b84bbd-6dsp6" podUID="2366dd4a-7f9d-4c9e-b7fb-aef776932faf" containerName="route-controller-manager" 
containerID="cri-o://628508feccc32c2dea25cb0c1c57acda6b760bc9f7463b04e3c1bd7fa3f2cf45" gracePeriod=30 Mar 09 09:12:01 crc kubenswrapper[4792]: I0309 09:12:01.150454 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-tl5jf"] Mar 09 09:12:01 crc kubenswrapper[4792]: I0309 09:12:01.555747 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550792-cq6k6"] Mar 09 09:12:01 crc kubenswrapper[4792]: I0309 09:12:01.867577 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7844b84bbd-6dsp6" Mar 09 09:12:01 crc kubenswrapper[4792]: I0309 09:12:01.871939 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7766fd6d68-cgzlz" Mar 09 09:12:01 crc kubenswrapper[4792]: I0309 09:12:01.887551 4792 generic.go:334] "Generic (PLEG): container finished" podID="00ec8adb-4eec-4ea0-b843-628124d00d5c" containerID="3ae008b4ec8ccc8a76475bb45bb2b50a3cd71ac70472cdb0a4e1a104bb4d6ec9" exitCode=0 Mar 09 09:12:01 crc kubenswrapper[4792]: I0309 09:12:01.887628 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7766fd6d68-cgzlz" Mar 09 09:12:01 crc kubenswrapper[4792]: I0309 09:12:01.887647 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7766fd6d68-cgzlz" event={"ID":"00ec8adb-4eec-4ea0-b843-628124d00d5c","Type":"ContainerDied","Data":"3ae008b4ec8ccc8a76475bb45bb2b50a3cd71ac70472cdb0a4e1a104bb4d6ec9"} Mar 09 09:12:01 crc kubenswrapper[4792]: I0309 09:12:01.887683 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7766fd6d68-cgzlz" event={"ID":"00ec8adb-4eec-4ea0-b843-628124d00d5c","Type":"ContainerDied","Data":"a7289a697718f73f244bc2bd539c7b13cdf606e4735d07f7c9fc804036dd9605"} Mar 09 09:12:01 crc kubenswrapper[4792]: I0309 09:12:01.887701 4792 scope.go:117] "RemoveContainer" containerID="3ae008b4ec8ccc8a76475bb45bb2b50a3cd71ac70472cdb0a4e1a104bb4d6ec9" Mar 09 09:12:01 crc kubenswrapper[4792]: I0309 09:12:01.888591 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550792-cq6k6" event={"ID":"5fc195fc-d546-4720-9790-ddadfb09282b","Type":"ContainerStarted","Data":"44a08052ed392d1cc5aa2c34f02c9a7c2d62e6bd57624feda8c210d03ba4e316"} Mar 09 09:12:01 crc kubenswrapper[4792]: I0309 09:12:01.892411 4792 generic.go:334] "Generic (PLEG): container finished" podID="2366dd4a-7f9d-4c9e-b7fb-aef776932faf" containerID="628508feccc32c2dea25cb0c1c57acda6b760bc9f7463b04e3c1bd7fa3f2cf45" exitCode=0 Mar 09 09:12:01 crc kubenswrapper[4792]: I0309 09:12:01.892458 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7844b84bbd-6dsp6" event={"ID":"2366dd4a-7f9d-4c9e-b7fb-aef776932faf","Type":"ContainerDied","Data":"628508feccc32c2dea25cb0c1c57acda6b760bc9f7463b04e3c1bd7fa3f2cf45"} Mar 09 09:12:01 crc kubenswrapper[4792]: I0309 09:12:01.892467 4792 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7844b84bbd-6dsp6" Mar 09 09:12:01 crc kubenswrapper[4792]: I0309 09:12:01.892483 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7844b84bbd-6dsp6" event={"ID":"2366dd4a-7f9d-4c9e-b7fb-aef776932faf","Type":"ContainerDied","Data":"07be461632f10ba883775ed5aa7c3cf92fbd0dabf57132ac29985aa8064e62a6"} Mar 09 09:12:01 crc kubenswrapper[4792]: I0309 09:12:01.904779 4792 scope.go:117] "RemoveContainer" containerID="3ae008b4ec8ccc8a76475bb45bb2b50a3cd71ac70472cdb0a4e1a104bb4d6ec9" Mar 09 09:12:01 crc kubenswrapper[4792]: E0309 09:12:01.905330 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3ae008b4ec8ccc8a76475bb45bb2b50a3cd71ac70472cdb0a4e1a104bb4d6ec9\": container with ID starting with 3ae008b4ec8ccc8a76475bb45bb2b50a3cd71ac70472cdb0a4e1a104bb4d6ec9 not found: ID does not exist" containerID="3ae008b4ec8ccc8a76475bb45bb2b50a3cd71ac70472cdb0a4e1a104bb4d6ec9" Mar 09 09:12:01 crc kubenswrapper[4792]: I0309 09:12:01.905382 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ae008b4ec8ccc8a76475bb45bb2b50a3cd71ac70472cdb0a4e1a104bb4d6ec9"} err="failed to get container status \"3ae008b4ec8ccc8a76475bb45bb2b50a3cd71ac70472cdb0a4e1a104bb4d6ec9\": rpc error: code = NotFound desc = could not find container \"3ae008b4ec8ccc8a76475bb45bb2b50a3cd71ac70472cdb0a4e1a104bb4d6ec9\": container with ID starting with 3ae008b4ec8ccc8a76475bb45bb2b50a3cd71ac70472cdb0a4e1a104bb4d6ec9 not found: ID does not exist" Mar 09 09:12:01 crc kubenswrapper[4792]: I0309 09:12:01.905414 4792 scope.go:117] "RemoveContainer" containerID="628508feccc32c2dea25cb0c1c57acda6b760bc9f7463b04e3c1bd7fa3f2cf45" Mar 09 09:12:01 crc kubenswrapper[4792]: I0309 09:12:01.915539 4792 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/00ec8adb-4eec-4ea0-b843-628124d00d5c-proxy-ca-bundles\") pod \"00ec8adb-4eec-4ea0-b843-628124d00d5c\" (UID: \"00ec8adb-4eec-4ea0-b843-628124d00d5c\") " Mar 09 09:12:01 crc kubenswrapper[4792]: I0309 09:12:01.915611 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/00ec8adb-4eec-4ea0-b843-628124d00d5c-serving-cert\") pod \"00ec8adb-4eec-4ea0-b843-628124d00d5c\" (UID: \"00ec8adb-4eec-4ea0-b843-628124d00d5c\") " Mar 09 09:12:01 crc kubenswrapper[4792]: I0309 09:12:01.915670 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00ec8adb-4eec-4ea0-b843-628124d00d5c-config\") pod \"00ec8adb-4eec-4ea0-b843-628124d00d5c\" (UID: \"00ec8adb-4eec-4ea0-b843-628124d00d5c\") " Mar 09 09:12:01 crc kubenswrapper[4792]: I0309 09:12:01.915740 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2366dd4a-7f9d-4c9e-b7fb-aef776932faf-serving-cert\") pod \"2366dd4a-7f9d-4c9e-b7fb-aef776932faf\" (UID: \"2366dd4a-7f9d-4c9e-b7fb-aef776932faf\") " Mar 09 09:12:01 crc kubenswrapper[4792]: I0309 09:12:01.915761 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lkc5z\" (UniqueName: \"kubernetes.io/projected/00ec8adb-4eec-4ea0-b843-628124d00d5c-kube-api-access-lkc5z\") pod \"00ec8adb-4eec-4ea0-b843-628124d00d5c\" (UID: \"00ec8adb-4eec-4ea0-b843-628124d00d5c\") " Mar 09 09:12:01 crc kubenswrapper[4792]: I0309 09:12:01.915782 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pgrkt\" (UniqueName: \"kubernetes.io/projected/2366dd4a-7f9d-4c9e-b7fb-aef776932faf-kube-api-access-pgrkt\") pod \"2366dd4a-7f9d-4c9e-b7fb-aef776932faf\" (UID: 
\"2366dd4a-7f9d-4c9e-b7fb-aef776932faf\") " Mar 09 09:12:01 crc kubenswrapper[4792]: I0309 09:12:01.915801 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2366dd4a-7f9d-4c9e-b7fb-aef776932faf-client-ca\") pod \"2366dd4a-7f9d-4c9e-b7fb-aef776932faf\" (UID: \"2366dd4a-7f9d-4c9e-b7fb-aef776932faf\") " Mar 09 09:12:01 crc kubenswrapper[4792]: I0309 09:12:01.915820 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/00ec8adb-4eec-4ea0-b843-628124d00d5c-client-ca\") pod \"00ec8adb-4eec-4ea0-b843-628124d00d5c\" (UID: \"00ec8adb-4eec-4ea0-b843-628124d00d5c\") " Mar 09 09:12:01 crc kubenswrapper[4792]: I0309 09:12:01.915837 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2366dd4a-7f9d-4c9e-b7fb-aef776932faf-config\") pod \"2366dd4a-7f9d-4c9e-b7fb-aef776932faf\" (UID: \"2366dd4a-7f9d-4c9e-b7fb-aef776932faf\") " Mar 09 09:12:01 crc kubenswrapper[4792]: I0309 09:12:01.917058 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2366dd4a-7f9d-4c9e-b7fb-aef776932faf-config" (OuterVolumeSpecName: "config") pod "2366dd4a-7f9d-4c9e-b7fb-aef776932faf" (UID: "2366dd4a-7f9d-4c9e-b7fb-aef776932faf"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:12:01 crc kubenswrapper[4792]: I0309 09:12:01.920902 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/00ec8adb-4eec-4ea0-b843-628124d00d5c-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "00ec8adb-4eec-4ea0-b843-628124d00d5c" (UID: "00ec8adb-4eec-4ea0-b843-628124d00d5c"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:12:01 crc kubenswrapper[4792]: I0309 09:12:01.921205 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2366dd4a-7f9d-4c9e-b7fb-aef776932faf-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "2366dd4a-7f9d-4c9e-b7fb-aef776932faf" (UID: "2366dd4a-7f9d-4c9e-b7fb-aef776932faf"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:12:01 crc kubenswrapper[4792]: I0309 09:12:01.921518 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/00ec8adb-4eec-4ea0-b843-628124d00d5c-config" (OuterVolumeSpecName: "config") pod "00ec8adb-4eec-4ea0-b843-628124d00d5c" (UID: "00ec8adb-4eec-4ea0-b843-628124d00d5c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:12:01 crc kubenswrapper[4792]: I0309 09:12:01.921827 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/00ec8adb-4eec-4ea0-b843-628124d00d5c-client-ca" (OuterVolumeSpecName: "client-ca") pod "00ec8adb-4eec-4ea0-b843-628124d00d5c" (UID: "00ec8adb-4eec-4ea0-b843-628124d00d5c"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:12:01 crc kubenswrapper[4792]: I0309 09:12:01.922018 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2366dd4a-7f9d-4c9e-b7fb-aef776932faf-client-ca" (OuterVolumeSpecName: "client-ca") pod "2366dd4a-7f9d-4c9e-b7fb-aef776932faf" (UID: "2366dd4a-7f9d-4c9e-b7fb-aef776932faf"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:12:01 crc kubenswrapper[4792]: I0309 09:12:01.924382 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/00ec8adb-4eec-4ea0-b843-628124d00d5c-kube-api-access-lkc5z" (OuterVolumeSpecName: "kube-api-access-lkc5z") pod "00ec8adb-4eec-4ea0-b843-628124d00d5c" (UID: "00ec8adb-4eec-4ea0-b843-628124d00d5c"). InnerVolumeSpecName "kube-api-access-lkc5z". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:12:01 crc kubenswrapper[4792]: I0309 09:12:01.924730 4792 scope.go:117] "RemoveContainer" containerID="628508feccc32c2dea25cb0c1c57acda6b760bc9f7463b04e3c1bd7fa3f2cf45" Mar 09 09:12:01 crc kubenswrapper[4792]: I0309 09:12:01.924745 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00ec8adb-4eec-4ea0-b843-628124d00d5c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "00ec8adb-4eec-4ea0-b843-628124d00d5c" (UID: "00ec8adb-4eec-4ea0-b843-628124d00d5c"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:12:01 crc kubenswrapper[4792]: E0309 09:12:01.925434 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"628508feccc32c2dea25cb0c1c57acda6b760bc9f7463b04e3c1bd7fa3f2cf45\": container with ID starting with 628508feccc32c2dea25cb0c1c57acda6b760bc9f7463b04e3c1bd7fa3f2cf45 not found: ID does not exist" containerID="628508feccc32c2dea25cb0c1c57acda6b760bc9f7463b04e3c1bd7fa3f2cf45" Mar 09 09:12:01 crc kubenswrapper[4792]: I0309 09:12:01.925494 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"628508feccc32c2dea25cb0c1c57acda6b760bc9f7463b04e3c1bd7fa3f2cf45"} err="failed to get container status \"628508feccc32c2dea25cb0c1c57acda6b760bc9f7463b04e3c1bd7fa3f2cf45\": rpc error: code = NotFound desc = could not find container \"628508feccc32c2dea25cb0c1c57acda6b760bc9f7463b04e3c1bd7fa3f2cf45\": container with ID starting with 628508feccc32c2dea25cb0c1c57acda6b760bc9f7463b04e3c1bd7fa3f2cf45 not found: ID does not exist" Mar 09 09:12:01 crc kubenswrapper[4792]: I0309 09:12:01.932379 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2366dd4a-7f9d-4c9e-b7fb-aef776932faf-kube-api-access-pgrkt" (OuterVolumeSpecName: "kube-api-access-pgrkt") pod "2366dd4a-7f9d-4c9e-b7fb-aef776932faf" (UID: "2366dd4a-7f9d-4c9e-b7fb-aef776932faf"). InnerVolumeSpecName "kube-api-access-pgrkt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:12:02 crc kubenswrapper[4792]: I0309 09:12:02.017619 4792 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00ec8adb-4eec-4ea0-b843-628124d00d5c-config\") on node \"crc\" DevicePath \"\"" Mar 09 09:12:02 crc kubenswrapper[4792]: I0309 09:12:02.017658 4792 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2366dd4a-7f9d-4c9e-b7fb-aef776932faf-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 09:12:02 crc kubenswrapper[4792]: I0309 09:12:02.017673 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lkc5z\" (UniqueName: \"kubernetes.io/projected/00ec8adb-4eec-4ea0-b843-628124d00d5c-kube-api-access-lkc5z\") on node \"crc\" DevicePath \"\"" Mar 09 09:12:02 crc kubenswrapper[4792]: I0309 09:12:02.017688 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pgrkt\" (UniqueName: \"kubernetes.io/projected/2366dd4a-7f9d-4c9e-b7fb-aef776932faf-kube-api-access-pgrkt\") on node \"crc\" DevicePath \"\"" Mar 09 09:12:02 crc kubenswrapper[4792]: I0309 09:12:02.017701 4792 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2366dd4a-7f9d-4c9e-b7fb-aef776932faf-client-ca\") on node \"crc\" DevicePath \"\"" Mar 09 09:12:02 crc kubenswrapper[4792]: I0309 09:12:02.017710 4792 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/00ec8adb-4eec-4ea0-b843-628124d00d5c-client-ca\") on node \"crc\" DevicePath \"\"" Mar 09 09:12:02 crc kubenswrapper[4792]: I0309 09:12:02.017720 4792 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2366dd4a-7f9d-4c9e-b7fb-aef776932faf-config\") on node \"crc\" DevicePath \"\"" Mar 09 09:12:02 crc kubenswrapper[4792]: I0309 09:12:02.017728 4792 reconciler_common.go:293] 
"Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/00ec8adb-4eec-4ea0-b843-628124d00d5c-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 09 09:12:02 crc kubenswrapper[4792]: I0309 09:12:02.017736 4792 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/00ec8adb-4eec-4ea0-b843-628124d00d5c-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 09:12:02 crc kubenswrapper[4792]: I0309 09:12:02.220288 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7766fd6d68-cgzlz"] Mar 09 09:12:02 crc kubenswrapper[4792]: I0309 09:12:02.224558 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-7766fd6d68-cgzlz"] Mar 09 09:12:02 crc kubenswrapper[4792]: I0309 09:12:02.234552 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7844b84bbd-6dsp6"] Mar 09 09:12:02 crc kubenswrapper[4792]: I0309 09:12:02.238166 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7844b84bbd-6dsp6"] Mar 09 09:12:02 crc kubenswrapper[4792]: I0309 09:12:02.308644 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-85bb5f6cbc-nkxhb"] Mar 09 09:12:02 crc kubenswrapper[4792]: E0309 09:12:02.309032 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2366dd4a-7f9d-4c9e-b7fb-aef776932faf" containerName="route-controller-manager" Mar 09 09:12:02 crc kubenswrapper[4792]: I0309 09:12:02.309050 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="2366dd4a-7f9d-4c9e-b7fb-aef776932faf" containerName="route-controller-manager" Mar 09 09:12:02 crc kubenswrapper[4792]: E0309 09:12:02.309087 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00ec8adb-4eec-4ea0-b843-628124d00d5c" 
containerName="controller-manager" Mar 09 09:12:02 crc kubenswrapper[4792]: I0309 09:12:02.309097 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="00ec8adb-4eec-4ea0-b843-628124d00d5c" containerName="controller-manager" Mar 09 09:12:02 crc kubenswrapper[4792]: I0309 09:12:02.309264 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="2366dd4a-7f9d-4c9e-b7fb-aef776932faf" containerName="route-controller-manager" Mar 09 09:12:02 crc kubenswrapper[4792]: I0309 09:12:02.309280 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="00ec8adb-4eec-4ea0-b843-628124d00d5c" containerName="controller-manager" Mar 09 09:12:02 crc kubenswrapper[4792]: I0309 09:12:02.309830 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-85bb5f6cbc-nkxhb" Mar 09 09:12:02 crc kubenswrapper[4792]: I0309 09:12:02.311953 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-d55785845-bn65l"] Mar 09 09:12:02 crc kubenswrapper[4792]: I0309 09:12:02.312674 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-d55785845-bn65l" Mar 09 09:12:02 crc kubenswrapper[4792]: I0309 09:12:02.313572 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 09 09:12:02 crc kubenswrapper[4792]: I0309 09:12:02.313820 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 09 09:12:02 crc kubenswrapper[4792]: I0309 09:12:02.313868 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 09 09:12:02 crc kubenswrapper[4792]: I0309 09:12:02.314007 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 09 09:12:02 crc kubenswrapper[4792]: I0309 09:12:02.314327 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 09 09:12:02 crc kubenswrapper[4792]: I0309 09:12:02.315002 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 09 09:12:02 crc kubenswrapper[4792]: I0309 09:12:02.328411 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 09 09:12:02 crc kubenswrapper[4792]: I0309 09:12:02.329374 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 09 09:12:02 crc kubenswrapper[4792]: I0309 09:12:02.329459 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 09 09:12:02 crc kubenswrapper[4792]: I0309 09:12:02.329513 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 09 09:12:02 crc 
kubenswrapper[4792]: I0309 09:12:02.329586 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 09 09:12:02 crc kubenswrapper[4792]: I0309 09:12:02.329728 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 09 09:12:02 crc kubenswrapper[4792]: I0309 09:12:02.333092 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-85bb5f6cbc-nkxhb"] Mar 09 09:12:02 crc kubenswrapper[4792]: I0309 09:12:02.340994 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 09 09:12:02 crc kubenswrapper[4792]: I0309 09:12:02.341297 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-d55785845-bn65l"] Mar 09 09:12:02 crc kubenswrapper[4792]: I0309 09:12:02.423556 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e9c97b47-bf23-43ee-abd3-7fadc26be356-serving-cert\") pod \"controller-manager-85bb5f6cbc-nkxhb\" (UID: \"e9c97b47-bf23-43ee-abd3-7fadc26be356\") " pod="openshift-controller-manager/controller-manager-85bb5f6cbc-nkxhb" Mar 09 09:12:02 crc kubenswrapper[4792]: I0309 09:12:02.423686 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e9c97b47-bf23-43ee-abd3-7fadc26be356-proxy-ca-bundles\") pod \"controller-manager-85bb5f6cbc-nkxhb\" (UID: \"e9c97b47-bf23-43ee-abd3-7fadc26be356\") " pod="openshift-controller-manager/controller-manager-85bb5f6cbc-nkxhb" Mar 09 09:12:02 crc kubenswrapper[4792]: I0309 09:12:02.423774 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/e9c97b47-bf23-43ee-abd3-7fadc26be356-config\") pod \"controller-manager-85bb5f6cbc-nkxhb\" (UID: \"e9c97b47-bf23-43ee-abd3-7fadc26be356\") " pod="openshift-controller-manager/controller-manager-85bb5f6cbc-nkxhb" Mar 09 09:12:02 crc kubenswrapper[4792]: I0309 09:12:02.423799 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e9c97b47-bf23-43ee-abd3-7fadc26be356-client-ca\") pod \"controller-manager-85bb5f6cbc-nkxhb\" (UID: \"e9c97b47-bf23-43ee-abd3-7fadc26be356\") " pod="openshift-controller-manager/controller-manager-85bb5f6cbc-nkxhb" Mar 09 09:12:02 crc kubenswrapper[4792]: I0309 09:12:02.423867 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-scd88\" (UniqueName: \"kubernetes.io/projected/9ac91a7a-5cbe-4d06-99a8-5e446e5b20a6-kube-api-access-scd88\") pod \"route-controller-manager-d55785845-bn65l\" (UID: \"9ac91a7a-5cbe-4d06-99a8-5e446e5b20a6\") " pod="openshift-route-controller-manager/route-controller-manager-d55785845-bn65l" Mar 09 09:12:02 crc kubenswrapper[4792]: I0309 09:12:02.423897 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9ac91a7a-5cbe-4d06-99a8-5e446e5b20a6-config\") pod \"route-controller-manager-d55785845-bn65l\" (UID: \"9ac91a7a-5cbe-4d06-99a8-5e446e5b20a6\") " pod="openshift-route-controller-manager/route-controller-manager-d55785845-bn65l" Mar 09 09:12:02 crc kubenswrapper[4792]: I0309 09:12:02.423935 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9ac91a7a-5cbe-4d06-99a8-5e446e5b20a6-client-ca\") pod \"route-controller-manager-d55785845-bn65l\" (UID: \"9ac91a7a-5cbe-4d06-99a8-5e446e5b20a6\") " 
pod="openshift-route-controller-manager/route-controller-manager-d55785845-bn65l" Mar 09 09:12:02 crc kubenswrapper[4792]: I0309 09:12:02.423974 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9ac91a7a-5cbe-4d06-99a8-5e446e5b20a6-serving-cert\") pod \"route-controller-manager-d55785845-bn65l\" (UID: \"9ac91a7a-5cbe-4d06-99a8-5e446e5b20a6\") " pod="openshift-route-controller-manager/route-controller-manager-d55785845-bn65l" Mar 09 09:12:02 crc kubenswrapper[4792]: I0309 09:12:02.424039 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vgc55\" (UniqueName: \"kubernetes.io/projected/e9c97b47-bf23-43ee-abd3-7fadc26be356-kube-api-access-vgc55\") pod \"controller-manager-85bb5f6cbc-nkxhb\" (UID: \"e9c97b47-bf23-43ee-abd3-7fadc26be356\") " pod="openshift-controller-manager/controller-manager-85bb5f6cbc-nkxhb" Mar 09 09:12:02 crc kubenswrapper[4792]: I0309 09:12:02.525504 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e9c97b47-bf23-43ee-abd3-7fadc26be356-serving-cert\") pod \"controller-manager-85bb5f6cbc-nkxhb\" (UID: \"e9c97b47-bf23-43ee-abd3-7fadc26be356\") " pod="openshift-controller-manager/controller-manager-85bb5f6cbc-nkxhb" Mar 09 09:12:02 crc kubenswrapper[4792]: I0309 09:12:02.526145 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e9c97b47-bf23-43ee-abd3-7fadc26be356-proxy-ca-bundles\") pod \"controller-manager-85bb5f6cbc-nkxhb\" (UID: \"e9c97b47-bf23-43ee-abd3-7fadc26be356\") " pod="openshift-controller-manager/controller-manager-85bb5f6cbc-nkxhb" Mar 09 09:12:02 crc kubenswrapper[4792]: I0309 09:12:02.526175 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/e9c97b47-bf23-43ee-abd3-7fadc26be356-client-ca\") pod \"controller-manager-85bb5f6cbc-nkxhb\" (UID: \"e9c97b47-bf23-43ee-abd3-7fadc26be356\") " pod="openshift-controller-manager/controller-manager-85bb5f6cbc-nkxhb" Mar 09 09:12:02 crc kubenswrapper[4792]: I0309 09:12:02.526195 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e9c97b47-bf23-43ee-abd3-7fadc26be356-config\") pod \"controller-manager-85bb5f6cbc-nkxhb\" (UID: \"e9c97b47-bf23-43ee-abd3-7fadc26be356\") " pod="openshift-controller-manager/controller-manager-85bb5f6cbc-nkxhb" Mar 09 09:12:02 crc kubenswrapper[4792]: I0309 09:12:02.526229 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-scd88\" (UniqueName: \"kubernetes.io/projected/9ac91a7a-5cbe-4d06-99a8-5e446e5b20a6-kube-api-access-scd88\") pod \"route-controller-manager-d55785845-bn65l\" (UID: \"9ac91a7a-5cbe-4d06-99a8-5e446e5b20a6\") " pod="openshift-route-controller-manager/route-controller-manager-d55785845-bn65l" Mar 09 09:12:02 crc kubenswrapper[4792]: I0309 09:12:02.527154 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e9c97b47-bf23-43ee-abd3-7fadc26be356-client-ca\") pod \"controller-manager-85bb5f6cbc-nkxhb\" (UID: \"e9c97b47-bf23-43ee-abd3-7fadc26be356\") " pod="openshift-controller-manager/controller-manager-85bb5f6cbc-nkxhb" Mar 09 09:12:02 crc kubenswrapper[4792]: I0309 09:12:02.527334 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e9c97b47-bf23-43ee-abd3-7fadc26be356-config\") pod \"controller-manager-85bb5f6cbc-nkxhb\" (UID: \"e9c97b47-bf23-43ee-abd3-7fadc26be356\") " pod="openshift-controller-manager/controller-manager-85bb5f6cbc-nkxhb" Mar 09 09:12:02 crc kubenswrapper[4792]: I0309 09:12:02.527445 4792 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9ac91a7a-5cbe-4d06-99a8-5e446e5b20a6-config\") pod \"route-controller-manager-d55785845-bn65l\" (UID: \"9ac91a7a-5cbe-4d06-99a8-5e446e5b20a6\") " pod="openshift-route-controller-manager/route-controller-manager-d55785845-bn65l" Mar 09 09:12:02 crc kubenswrapper[4792]: I0309 09:12:02.527506 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9ac91a7a-5cbe-4d06-99a8-5e446e5b20a6-client-ca\") pod \"route-controller-manager-d55785845-bn65l\" (UID: \"9ac91a7a-5cbe-4d06-99a8-5e446e5b20a6\") " pod="openshift-route-controller-manager/route-controller-manager-d55785845-bn65l" Mar 09 09:12:02 crc kubenswrapper[4792]: I0309 09:12:02.527536 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9ac91a7a-5cbe-4d06-99a8-5e446e5b20a6-serving-cert\") pod \"route-controller-manager-d55785845-bn65l\" (UID: \"9ac91a7a-5cbe-4d06-99a8-5e446e5b20a6\") " pod="openshift-route-controller-manager/route-controller-manager-d55785845-bn65l" Mar 09 09:12:02 crc kubenswrapper[4792]: I0309 09:12:02.527565 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e9c97b47-bf23-43ee-abd3-7fadc26be356-proxy-ca-bundles\") pod \"controller-manager-85bb5f6cbc-nkxhb\" (UID: \"e9c97b47-bf23-43ee-abd3-7fadc26be356\") " pod="openshift-controller-manager/controller-manager-85bb5f6cbc-nkxhb" Mar 09 09:12:02 crc kubenswrapper[4792]: I0309 09:12:02.528254 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9ac91a7a-5cbe-4d06-99a8-5e446e5b20a6-client-ca\") pod \"route-controller-manager-d55785845-bn65l\" (UID: \"9ac91a7a-5cbe-4d06-99a8-5e446e5b20a6\") " 
pod="openshift-route-controller-manager/route-controller-manager-d55785845-bn65l" Mar 09 09:12:02 crc kubenswrapper[4792]: I0309 09:12:02.528450 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9ac91a7a-5cbe-4d06-99a8-5e446e5b20a6-config\") pod \"route-controller-manager-d55785845-bn65l\" (UID: \"9ac91a7a-5cbe-4d06-99a8-5e446e5b20a6\") " pod="openshift-route-controller-manager/route-controller-manager-d55785845-bn65l" Mar 09 09:12:02 crc kubenswrapper[4792]: I0309 09:12:02.528627 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vgc55\" (UniqueName: \"kubernetes.io/projected/e9c97b47-bf23-43ee-abd3-7fadc26be356-kube-api-access-vgc55\") pod \"controller-manager-85bb5f6cbc-nkxhb\" (UID: \"e9c97b47-bf23-43ee-abd3-7fadc26be356\") " pod="openshift-controller-manager/controller-manager-85bb5f6cbc-nkxhb" Mar 09 09:12:02 crc kubenswrapper[4792]: I0309 09:12:02.531275 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e9c97b47-bf23-43ee-abd3-7fadc26be356-serving-cert\") pod \"controller-manager-85bb5f6cbc-nkxhb\" (UID: \"e9c97b47-bf23-43ee-abd3-7fadc26be356\") " pod="openshift-controller-manager/controller-manager-85bb5f6cbc-nkxhb" Mar 09 09:12:02 crc kubenswrapper[4792]: I0309 09:12:02.531761 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9ac91a7a-5cbe-4d06-99a8-5e446e5b20a6-serving-cert\") pod \"route-controller-manager-d55785845-bn65l\" (UID: \"9ac91a7a-5cbe-4d06-99a8-5e446e5b20a6\") " pod="openshift-route-controller-manager/route-controller-manager-d55785845-bn65l" Mar 09 09:12:02 crc kubenswrapper[4792]: I0309 09:12:02.546568 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-scd88\" (UniqueName: 
\"kubernetes.io/projected/9ac91a7a-5cbe-4d06-99a8-5e446e5b20a6-kube-api-access-scd88\") pod \"route-controller-manager-d55785845-bn65l\" (UID: \"9ac91a7a-5cbe-4d06-99a8-5e446e5b20a6\") " pod="openshift-route-controller-manager/route-controller-manager-d55785845-bn65l" Mar 09 09:12:02 crc kubenswrapper[4792]: I0309 09:12:02.547175 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vgc55\" (UniqueName: \"kubernetes.io/projected/e9c97b47-bf23-43ee-abd3-7fadc26be356-kube-api-access-vgc55\") pod \"controller-manager-85bb5f6cbc-nkxhb\" (UID: \"e9c97b47-bf23-43ee-abd3-7fadc26be356\") " pod="openshift-controller-manager/controller-manager-85bb5f6cbc-nkxhb" Mar 09 09:12:02 crc kubenswrapper[4792]: I0309 09:12:02.624041 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-85bb5f6cbc-nkxhb" Mar 09 09:12:02 crc kubenswrapper[4792]: I0309 09:12:02.630799 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-d55785845-bn65l" Mar 09 09:12:02 crc kubenswrapper[4792]: I0309 09:12:02.905919 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550792-cq6k6" event={"ID":"5fc195fc-d546-4720-9790-ddadfb09282b","Type":"ContainerStarted","Data":"5f3d93339ac5aafeedc73bb985022ff918ac4be8b6414b466117da7a298b837b"} Mar 09 09:12:02 crc kubenswrapper[4792]: I0309 09:12:02.925961 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29550792-cq6k6" podStartSLOduration=2.166526518 podStartE2EDuration="2.925945787s" podCreationTimestamp="2026-03-09 09:12:00 +0000 UTC" firstStartedPulling="2026-03-09 09:12:01.587811515 +0000 UTC m=+286.618012267" lastFinishedPulling="2026-03-09 09:12:02.347230784 +0000 UTC m=+287.377431536" observedRunningTime="2026-03-09 09:12:02.922551636 +0000 UTC m=+287.952752388" watchObservedRunningTime="2026-03-09 09:12:02.925945787 +0000 UTC m=+287.956146539" Mar 09 09:12:03 crc kubenswrapper[4792]: I0309 09:12:03.013176 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-85bb5f6cbc-nkxhb"] Mar 09 09:12:03 crc kubenswrapper[4792]: I0309 09:12:03.143852 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-d55785845-bn65l"] Mar 09 09:12:03 crc kubenswrapper[4792]: W0309 09:12:03.148291 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9ac91a7a_5cbe_4d06_99a8_5e446e5b20a6.slice/crio-51f87896a421f7304fc44a84537b96124a9b593690309e35bdc7f4b068127a13 WatchSource:0}: Error finding container 51f87896a421f7304fc44a84537b96124a9b593690309e35bdc7f4b068127a13: Status 404 returned error can't find the container with id 51f87896a421f7304fc44a84537b96124a9b593690309e35bdc7f4b068127a13 Mar 
09 09:12:03 crc kubenswrapper[4792]: I0309 09:12:03.669541 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="00ec8adb-4eec-4ea0-b843-628124d00d5c" path="/var/lib/kubelet/pods/00ec8adb-4eec-4ea0-b843-628124d00d5c/volumes" Mar 09 09:12:03 crc kubenswrapper[4792]: I0309 09:12:03.671709 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2366dd4a-7f9d-4c9e-b7fb-aef776932faf" path="/var/lib/kubelet/pods/2366dd4a-7f9d-4c9e-b7fb-aef776932faf/volumes" Mar 09 09:12:03 crc kubenswrapper[4792]: I0309 09:12:03.914335 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-85bb5f6cbc-nkxhb" event={"ID":"e9c97b47-bf23-43ee-abd3-7fadc26be356","Type":"ContainerStarted","Data":"b1b53490595af7f0e6983cf1296ac8b15c1241f55d7063722b73b0806be7fe00"} Mar 09 09:12:03 crc kubenswrapper[4792]: I0309 09:12:03.914398 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-85bb5f6cbc-nkxhb" event={"ID":"e9c97b47-bf23-43ee-abd3-7fadc26be356","Type":"ContainerStarted","Data":"c3d3d0c862e0c3d2f95cb08b83a010ca4643cebb3bbd32743099a05cffdb45ac"} Mar 09 09:12:03 crc kubenswrapper[4792]: I0309 09:12:03.914864 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-85bb5f6cbc-nkxhb" Mar 09 09:12:03 crc kubenswrapper[4792]: I0309 09:12:03.918544 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-d55785845-bn65l" event={"ID":"9ac91a7a-5cbe-4d06-99a8-5e446e5b20a6","Type":"ContainerStarted","Data":"d999ab18c53a867774ff6c745f2e0b1f1c2ea11e8e4b90e260ef06778c3dd0bf"} Mar 09 09:12:03 crc kubenswrapper[4792]: I0309 09:12:03.918584 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-d55785845-bn65l" 
event={"ID":"9ac91a7a-5cbe-4d06-99a8-5e446e5b20a6","Type":"ContainerStarted","Data":"51f87896a421f7304fc44a84537b96124a9b593690309e35bdc7f4b068127a13"} Mar 09 09:12:03 crc kubenswrapper[4792]: I0309 09:12:03.918729 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-d55785845-bn65l" Mar 09 09:12:03 crc kubenswrapper[4792]: I0309 09:12:03.920611 4792 generic.go:334] "Generic (PLEG): container finished" podID="5fc195fc-d546-4720-9790-ddadfb09282b" containerID="5f3d93339ac5aafeedc73bb985022ff918ac4be8b6414b466117da7a298b837b" exitCode=0 Mar 09 09:12:03 crc kubenswrapper[4792]: I0309 09:12:03.920656 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550792-cq6k6" event={"ID":"5fc195fc-d546-4720-9790-ddadfb09282b","Type":"ContainerDied","Data":"5f3d93339ac5aafeedc73bb985022ff918ac4be8b6414b466117da7a298b837b"} Mar 09 09:12:03 crc kubenswrapper[4792]: I0309 09:12:03.923555 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-85bb5f6cbc-nkxhb" Mar 09 09:12:03 crc kubenswrapper[4792]: I0309 09:12:03.966275 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-85bb5f6cbc-nkxhb" podStartSLOduration=3.9662452569999997 podStartE2EDuration="3.966245257s" podCreationTimestamp="2026-03-09 09:12:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:12:03.942735675 +0000 UTC m=+288.972936417" watchObservedRunningTime="2026-03-09 09:12:03.966245257 +0000 UTC m=+288.996446009" Mar 09 09:12:04 crc kubenswrapper[4792]: I0309 09:12:04.023183 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-d55785845-bn65l" 
podStartSLOduration=3.023160391 podStartE2EDuration="3.023160391s" podCreationTimestamp="2026-03-09 09:12:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:12:04.000469243 +0000 UTC m=+289.030670005" watchObservedRunningTime="2026-03-09 09:12:04.023160391 +0000 UTC m=+289.053361143" Mar 09 09:12:04 crc kubenswrapper[4792]: I0309 09:12:04.166020 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-d55785845-bn65l" Mar 09 09:12:05 crc kubenswrapper[4792]: I0309 09:12:05.196133 4792 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 09 09:12:05 crc kubenswrapper[4792]: I0309 09:12:05.197007 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 09 09:12:05 crc kubenswrapper[4792]: I0309 09:12:05.197722 4792 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 09 09:12:05 crc kubenswrapper[4792]: I0309 09:12:05.197942 4792 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 09 09:12:05 crc kubenswrapper[4792]: I0309 09:12:05.197994 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://c1070465f72d99ed22e913112259837db7d789c0a072b40956088f4a70162c41" gracePeriod=15 Mar 09 09:12:05 crc kubenswrapper[4792]: I0309 09:12:05.198193 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" 
containerID="cri-o://53c53d788666585dd6b1fb214b1f79df5bffadc17ff401b1bed44c93e41258dc" gracePeriod=15 Mar 09 09:12:05 crc kubenswrapper[4792]: I0309 09:12:05.198234 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://9cf484f9a832b0e147a17ec53e64cfcda5e37f8bf1f764ddc35215a079994b71" gracePeriod=15 Mar 09 09:12:05 crc kubenswrapper[4792]: I0309 09:12:05.198273 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://314838f53bc19a9f3eb7fd9d3f5473b23a177f2a3068d91f1b0420c27910d409" gracePeriod=15 Mar 09 09:12:05 crc kubenswrapper[4792]: E0309 09:12:05.198295 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 09 09:12:05 crc kubenswrapper[4792]: I0309 09:12:05.198303 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://226eecaea6fec5a3ae93063702c719edf3908636a2862b9f50a874f494a19ccf" gracePeriod=15 Mar 09 09:12:05 crc kubenswrapper[4792]: I0309 09:12:05.198310 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 09 09:12:05 crc kubenswrapper[4792]: E0309 09:12:05.198325 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 09 09:12:05 crc kubenswrapper[4792]: I0309 09:12:05.198331 4792 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 09 09:12:05 crc kubenswrapper[4792]: E0309 09:12:05.198349 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 09 09:12:05 crc kubenswrapper[4792]: I0309 09:12:05.198359 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 09 09:12:05 crc kubenswrapper[4792]: E0309 09:12:05.198378 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 09 09:12:05 crc kubenswrapper[4792]: I0309 09:12:05.198388 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 09 09:12:05 crc kubenswrapper[4792]: E0309 09:12:05.198407 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 09 09:12:05 crc kubenswrapper[4792]: I0309 09:12:05.198416 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 09 09:12:05 crc kubenswrapper[4792]: E0309 09:12:05.198429 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 09 09:12:05 crc kubenswrapper[4792]: I0309 09:12:05.198437 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 09 09:12:05 crc kubenswrapper[4792]: E0309 09:12:05.198448 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Mar 09 09:12:05 crc kubenswrapper[4792]: I0309 09:12:05.198455 4792 
state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Mar 09 09:12:05 crc kubenswrapper[4792]: E0309 09:12:05.198462 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 09 09:12:05 crc kubenswrapper[4792]: I0309 09:12:05.198473 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 09 09:12:05 crc kubenswrapper[4792]: I0309 09:12:05.198585 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 09 09:12:05 crc kubenswrapper[4792]: I0309 09:12:05.198598 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 09 09:12:05 crc kubenswrapper[4792]: I0309 09:12:05.198607 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 09 09:12:05 crc kubenswrapper[4792]: I0309 09:12:05.198614 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 09 09:12:05 crc kubenswrapper[4792]: I0309 09:12:05.198622 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 09 09:12:05 crc kubenswrapper[4792]: I0309 09:12:05.198632 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 09 09:12:05 crc kubenswrapper[4792]: I0309 09:12:05.198639 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" 
containerName="kube-apiserver-cert-regeneration-controller" Mar 09 09:12:05 crc kubenswrapper[4792]: E0309 09:12:05.198733 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 09 09:12:05 crc kubenswrapper[4792]: I0309 09:12:05.198739 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 09 09:12:05 crc kubenswrapper[4792]: E0309 09:12:05.198748 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 09 09:12:05 crc kubenswrapper[4792]: I0309 09:12:05.198755 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 09 09:12:05 crc kubenswrapper[4792]: I0309 09:12:05.198840 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 09 09:12:05 crc kubenswrapper[4792]: I0309 09:12:05.199106 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 09 09:12:05 crc kubenswrapper[4792]: I0309 09:12:05.258566 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 09 09:12:05 crc kubenswrapper[4792]: I0309 09:12:05.271111 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 09 09:12:05 crc kubenswrapper[4792]: I0309 09:12:05.271181 4792 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 09:12:05 crc kubenswrapper[4792]: I0309 09:12:05.271204 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 09 09:12:05 crc kubenswrapper[4792]: I0309 09:12:05.271231 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 09 09:12:05 crc kubenswrapper[4792]: I0309 09:12:05.271256 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 09:12:05 crc kubenswrapper[4792]: I0309 09:12:05.271270 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 09 09:12:05 crc 
kubenswrapper[4792]: I0309 09:12:05.271287 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 09:12:05 crc kubenswrapper[4792]: I0309 09:12:05.271307 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 09 09:12:05 crc kubenswrapper[4792]: I0309 09:12:05.295806 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550792-cq6k6" Mar 09 09:12:05 crc kubenswrapper[4792]: I0309 09:12:05.296592 4792 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.200:6443: connect: connection refused" Mar 09 09:12:05 crc kubenswrapper[4792]: I0309 09:12:05.296911 4792 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.200:6443: connect: connection refused" Mar 09 09:12:05 crc kubenswrapper[4792]: I0309 09:12:05.297307 4792 status_manager.go:851] "Failed to get status for pod" podUID="5fc195fc-d546-4720-9790-ddadfb09282b" 
pod="openshift-infra/auto-csr-approver-29550792-cq6k6" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29550792-cq6k6\": dial tcp 38.102.83.200:6443: connect: connection refused" Mar 09 09:12:05 crc kubenswrapper[4792]: I0309 09:12:05.372948 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qhbmj\" (UniqueName: \"kubernetes.io/projected/5fc195fc-d546-4720-9790-ddadfb09282b-kube-api-access-qhbmj\") pod \"5fc195fc-d546-4720-9790-ddadfb09282b\" (UID: \"5fc195fc-d546-4720-9790-ddadfb09282b\") " Mar 09 09:12:05 crc kubenswrapper[4792]: I0309 09:12:05.373473 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 09 09:12:05 crc kubenswrapper[4792]: I0309 09:12:05.373523 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 09 09:12:05 crc kubenswrapper[4792]: I0309 09:12:05.373565 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 09 09:12:05 crc kubenswrapper[4792]: I0309 09:12:05.373578 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 09:12:05 crc kubenswrapper[4792]: I0309 09:12:05.373629 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 09 09:12:05 crc kubenswrapper[4792]: I0309 09:12:05.373662 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 09 09:12:05 crc kubenswrapper[4792]: I0309 09:12:05.373688 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 09 09:12:05 crc kubenswrapper[4792]: I0309 09:12:05.373765 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 09:12:05 crc kubenswrapper[4792]: I0309 09:12:05.373768 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 09 09:12:05 crc kubenswrapper[4792]: I0309 09:12:05.373798 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 09 09:12:05 crc kubenswrapper[4792]: I0309 09:12:05.373710 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 09:12:05 crc kubenswrapper[4792]: I0309 09:12:05.373780 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 09 09:12:05 crc kubenswrapper[4792]: I0309 09:12:05.373829 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 09:12:05 crc kubenswrapper[4792]: I0309 09:12:05.373733 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod 
\"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 09 09:12:05 crc kubenswrapper[4792]: I0309 09:12:05.373899 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 09:12:05 crc kubenswrapper[4792]: I0309 09:12:05.374040 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 09:12:05 crc kubenswrapper[4792]: I0309 09:12:05.379608 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fc195fc-d546-4720-9790-ddadfb09282b-kube-api-access-qhbmj" (OuterVolumeSpecName: "kube-api-access-qhbmj") pod "5fc195fc-d546-4720-9790-ddadfb09282b" (UID: "5fc195fc-d546-4720-9790-ddadfb09282b"). InnerVolumeSpecName "kube-api-access-qhbmj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:12:05 crc kubenswrapper[4792]: I0309 09:12:05.476047 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qhbmj\" (UniqueName: \"kubernetes.io/projected/5fc195fc-d546-4720-9790-ddadfb09282b-kube-api-access-qhbmj\") on node \"crc\" DevicePath \"\"" Mar 09 09:12:05 crc kubenswrapper[4792]: I0309 09:12:05.562472 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 09 09:12:05 crc kubenswrapper[4792]: E0309 09:12:05.596241 4792 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.200:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189b21549d0021b6 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:12:05.595423158 +0000 UTC m=+290.625623910,LastTimestamp:2026-03-09 09:12:05.595423158 +0000 UTC m=+290.625623910,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:12:05 crc kubenswrapper[4792]: I0309 09:12:05.665409 4792 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.200:6443: connect: connection refused" Mar 09 09:12:05 crc kubenswrapper[4792]: I0309 09:12:05.665727 4792 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.200:6443: connect: connection refused" Mar 09 09:12:05 crc kubenswrapper[4792]: I0309 09:12:05.665929 4792 status_manager.go:851] "Failed to get status for pod" podUID="5fc195fc-d546-4720-9790-ddadfb09282b" pod="openshift-infra/auto-csr-approver-29550792-cq6k6" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29550792-cq6k6\": dial tcp 38.102.83.200:6443: connect: connection refused" Mar 09 09:12:05 crc kubenswrapper[4792]: E0309 09:12:05.915514 4792 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.200:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189b21549d0021b6 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:12:05.595423158 +0000 UTC m=+290.625623910,LastTimestamp:2026-03-09 09:12:05.595423158 +0000 UTC m=+290.625623910,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:12:05 crc kubenswrapper[4792]: I0309 09:12:05.935650 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" 
event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"3e73881ba97f1cf0bce321ba3597901e71ca01a914dc944fd4d20800266bad17"} Mar 09 09:12:05 crc kubenswrapper[4792]: I0309 09:12:05.935703 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"188e1a8c19b1b7cd561b5e178dde71d6e4b79cd8471c411888e66d434475ded5"} Mar 09 09:12:05 crc kubenswrapper[4792]: I0309 09:12:05.936489 4792 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.200:6443: connect: connection refused" Mar 09 09:12:05 crc kubenswrapper[4792]: I0309 09:12:05.937195 4792 status_manager.go:851] "Failed to get status for pod" podUID="5fc195fc-d546-4720-9790-ddadfb09282b" pod="openshift-infra/auto-csr-approver-29550792-cq6k6" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29550792-cq6k6\": dial tcp 38.102.83.200:6443: connect: connection refused" Mar 09 09:12:05 crc kubenswrapper[4792]: I0309 09:12:05.939520 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550792-cq6k6" event={"ID":"5fc195fc-d546-4720-9790-ddadfb09282b","Type":"ContainerDied","Data":"44a08052ed392d1cc5aa2c34f02c9a7c2d62e6bd57624feda8c210d03ba4e316"} Mar 09 09:12:05 crc kubenswrapper[4792]: I0309 09:12:05.939567 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="44a08052ed392d1cc5aa2c34f02c9a7c2d62e6bd57624feda8c210d03ba4e316" Mar 09 09:12:05 crc kubenswrapper[4792]: I0309 09:12:05.939640 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550792-cq6k6" Mar 09 09:12:05 crc kubenswrapper[4792]: I0309 09:12:05.942561 4792 generic.go:334] "Generic (PLEG): container finished" podID="8a30a83e-e71a-41e6-8946-34b1a6100a67" containerID="db1b80fa0eb69522fef8d9cb45d09c0c9fcf92dd3f0839f56ab32457397e3d50" exitCode=0 Mar 09 09:12:05 crc kubenswrapper[4792]: I0309 09:12:05.942664 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"8a30a83e-e71a-41e6-8946-34b1a6100a67","Type":"ContainerDied","Data":"db1b80fa0eb69522fef8d9cb45d09c0c9fcf92dd3f0839f56ab32457397e3d50"} Mar 09 09:12:05 crc kubenswrapper[4792]: I0309 09:12:05.943493 4792 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.200:6443: connect: connection refused" Mar 09 09:12:05 crc kubenswrapper[4792]: I0309 09:12:05.943847 4792 status_manager.go:851] "Failed to get status for pod" podUID="5fc195fc-d546-4720-9790-ddadfb09282b" pod="openshift-infra/auto-csr-approver-29550792-cq6k6" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29550792-cq6k6\": dial tcp 38.102.83.200:6443: connect: connection refused" Mar 09 09:12:05 crc kubenswrapper[4792]: I0309 09:12:05.944212 4792 status_manager.go:851] "Failed to get status for pod" podUID="8a30a83e-e71a-41e6-8946-34b1a6100a67" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.200:6443: connect: connection refused" Mar 09 09:12:05 crc kubenswrapper[4792]: I0309 09:12:05.944546 4792 status_manager.go:851] "Failed to get status for pod" 
podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.200:6443: connect: connection refused" Mar 09 09:12:05 crc kubenswrapper[4792]: I0309 09:12:05.944820 4792 status_manager.go:851] "Failed to get status for pod" podUID="5fc195fc-d546-4720-9790-ddadfb09282b" pod="openshift-infra/auto-csr-approver-29550792-cq6k6" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29550792-cq6k6\": dial tcp 38.102.83.200:6443: connect: connection refused" Mar 09 09:12:05 crc kubenswrapper[4792]: I0309 09:12:05.945216 4792 status_manager.go:851] "Failed to get status for pod" podUID="8a30a83e-e71a-41e6-8946-34b1a6100a67" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.200:6443: connect: connection refused" Mar 09 09:12:05 crc kubenswrapper[4792]: I0309 09:12:05.946555 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 09 09:12:05 crc kubenswrapper[4792]: I0309 09:12:05.948218 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 09 09:12:05 crc kubenswrapper[4792]: I0309 09:12:05.949213 4792 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="53c53d788666585dd6b1fb214b1f79df5bffadc17ff401b1bed44c93e41258dc" exitCode=0 Mar 09 09:12:05 crc kubenswrapper[4792]: I0309 09:12:05.949239 4792 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" 
containerID="9cf484f9a832b0e147a17ec53e64cfcda5e37f8bf1f764ddc35215a079994b71" exitCode=0 Mar 09 09:12:05 crc kubenswrapper[4792]: I0309 09:12:05.949246 4792 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="314838f53bc19a9f3eb7fd9d3f5473b23a177f2a3068d91f1b0420c27910d409" exitCode=0 Mar 09 09:12:05 crc kubenswrapper[4792]: I0309 09:12:05.949254 4792 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="226eecaea6fec5a3ae93063702c719edf3908636a2862b9f50a874f494a19ccf" exitCode=2 Mar 09 09:12:05 crc kubenswrapper[4792]: I0309 09:12:05.949307 4792 scope.go:117] "RemoveContainer" containerID="420959a5229bf4ed5e6b94cf4a6685b96e4b50d13055cbccf66b6188e47bc770" Mar 09 09:12:06 crc kubenswrapper[4792]: I0309 09:12:06.963000 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 09 09:12:07 crc kubenswrapper[4792]: I0309 09:12:07.387831 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 09 09:12:07 crc kubenswrapper[4792]: I0309 09:12:07.389089 4792 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.200:6443: connect: connection refused" Mar 09 09:12:07 crc kubenswrapper[4792]: I0309 09:12:07.389475 4792 status_manager.go:851] "Failed to get status for pod" podUID="5fc195fc-d546-4720-9790-ddadfb09282b" pod="openshift-infra/auto-csr-approver-29550792-cq6k6" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29550792-cq6k6\": dial tcp 38.102.83.200:6443: connect: connection refused" Mar 09 09:12:07 crc kubenswrapper[4792]: I0309 09:12:07.393756 4792 status_manager.go:851] "Failed to get status for pod" podUID="8a30a83e-e71a-41e6-8946-34b1a6100a67" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.200:6443: connect: connection refused" Mar 09 09:12:07 crc kubenswrapper[4792]: I0309 09:12:07.510783 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8a30a83e-e71a-41e6-8946-34b1a6100a67-kube-api-access\") pod \"8a30a83e-e71a-41e6-8946-34b1a6100a67\" (UID: \"8a30a83e-e71a-41e6-8946-34b1a6100a67\") " Mar 09 09:12:07 crc kubenswrapper[4792]: I0309 09:12:07.510847 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8a30a83e-e71a-41e6-8946-34b1a6100a67-kubelet-dir\") pod \"8a30a83e-e71a-41e6-8946-34b1a6100a67\" (UID: \"8a30a83e-e71a-41e6-8946-34b1a6100a67\") " Mar 09 09:12:07 crc 
kubenswrapper[4792]: I0309 09:12:07.510885 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/8a30a83e-e71a-41e6-8946-34b1a6100a67-var-lock\") pod \"8a30a83e-e71a-41e6-8946-34b1a6100a67\" (UID: \"8a30a83e-e71a-41e6-8946-34b1a6100a67\") " Mar 09 09:12:07 crc kubenswrapper[4792]: I0309 09:12:07.510971 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8a30a83e-e71a-41e6-8946-34b1a6100a67-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "8a30a83e-e71a-41e6-8946-34b1a6100a67" (UID: "8a30a83e-e71a-41e6-8946-34b1a6100a67"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 09:12:07 crc kubenswrapper[4792]: I0309 09:12:07.511033 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8a30a83e-e71a-41e6-8946-34b1a6100a67-var-lock" (OuterVolumeSpecName: "var-lock") pod "8a30a83e-e71a-41e6-8946-34b1a6100a67" (UID: "8a30a83e-e71a-41e6-8946-34b1a6100a67"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 09:12:07 crc kubenswrapper[4792]: I0309 09:12:07.511287 4792 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/8a30a83e-e71a-41e6-8946-34b1a6100a67-var-lock\") on node \"crc\" DevicePath \"\"" Mar 09 09:12:07 crc kubenswrapper[4792]: I0309 09:12:07.511308 4792 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8a30a83e-e71a-41e6-8946-34b1a6100a67-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 09 09:12:07 crc kubenswrapper[4792]: I0309 09:12:07.519999 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a30a83e-e71a-41e6-8946-34b1a6100a67-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "8a30a83e-e71a-41e6-8946-34b1a6100a67" (UID: "8a30a83e-e71a-41e6-8946-34b1a6100a67"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:12:07 crc kubenswrapper[4792]: I0309 09:12:07.613295 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8a30a83e-e71a-41e6-8946-34b1a6100a67-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 09 09:12:07 crc kubenswrapper[4792]: I0309 09:12:07.671602 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 09 09:12:07 crc kubenswrapper[4792]: I0309 09:12:07.672756 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 09:12:07 crc kubenswrapper[4792]: I0309 09:12:07.674023 4792 status_manager.go:851] "Failed to get status for pod" podUID="8a30a83e-e71a-41e6-8946-34b1a6100a67" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.200:6443: connect: connection refused" Mar 09 09:12:07 crc kubenswrapper[4792]: I0309 09:12:07.674875 4792 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.200:6443: connect: connection refused" Mar 09 09:12:07 crc kubenswrapper[4792]: I0309 09:12:07.675229 4792 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.200:6443: connect: connection refused" Mar 09 09:12:07 crc kubenswrapper[4792]: I0309 09:12:07.676052 4792 status_manager.go:851] "Failed to get status for pod" podUID="5fc195fc-d546-4720-9790-ddadfb09282b" pod="openshift-infra/auto-csr-approver-29550792-cq6k6" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29550792-cq6k6\": dial tcp 38.102.83.200:6443: connect: connection refused" Mar 09 09:12:07 crc kubenswrapper[4792]: I0309 09:12:07.714027 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 09 
09:12:07 crc kubenswrapper[4792]: I0309 09:12:07.714146 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 09:12:07 crc kubenswrapper[4792]: I0309 09:12:07.714275 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 09 09:12:07 crc kubenswrapper[4792]: I0309 09:12:07.714297 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 09 09:12:07 crc kubenswrapper[4792]: I0309 09:12:07.714362 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 09:12:07 crc kubenswrapper[4792]: I0309 09:12:07.714430 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 09:12:07 crc kubenswrapper[4792]: I0309 09:12:07.714764 4792 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Mar 09 09:12:07 crc kubenswrapper[4792]: I0309 09:12:07.714784 4792 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 09 09:12:07 crc kubenswrapper[4792]: I0309 09:12:07.714793 4792 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Mar 09 09:12:07 crc kubenswrapper[4792]: I0309 09:12:07.972864 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 09 09:12:07 crc kubenswrapper[4792]: I0309 09:12:07.973581 4792 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="c1070465f72d99ed22e913112259837db7d789c0a072b40956088f4a70162c41" exitCode=0 Mar 09 09:12:07 crc kubenswrapper[4792]: I0309 09:12:07.973734 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 09:12:07 crc kubenswrapper[4792]: I0309 09:12:07.973736 4792 scope.go:117] "RemoveContainer" containerID="53c53d788666585dd6b1fb214b1f79df5bffadc17ff401b1bed44c93e41258dc" Mar 09 09:12:07 crc kubenswrapper[4792]: I0309 09:12:07.975271 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"8a30a83e-e71a-41e6-8946-34b1a6100a67","Type":"ContainerDied","Data":"4a75621ef30a115b54564406f512218bc54adfca3c19749b53ae1808ed40f32e"} Mar 09 09:12:07 crc kubenswrapper[4792]: I0309 09:12:07.975301 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4a75621ef30a115b54564406f512218bc54adfca3c19749b53ae1808ed40f32e" Mar 09 09:12:07 crc kubenswrapper[4792]: I0309 09:12:07.975379 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 09 09:12:07 crc kubenswrapper[4792]: I0309 09:12:07.979015 4792 status_manager.go:851] "Failed to get status for pod" podUID="8a30a83e-e71a-41e6-8946-34b1a6100a67" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.200:6443: connect: connection refused" Mar 09 09:12:07 crc kubenswrapper[4792]: I0309 09:12:07.979230 4792 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.200:6443: connect: connection refused" Mar 09 09:12:07 crc kubenswrapper[4792]: I0309 09:12:07.979447 4792 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.200:6443: connect: connection refused" Mar 09 09:12:07 crc kubenswrapper[4792]: I0309 09:12:07.979950 4792 status_manager.go:851] "Failed to get status for pod" podUID="5fc195fc-d546-4720-9790-ddadfb09282b" pod="openshift-infra/auto-csr-approver-29550792-cq6k6" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29550792-cq6k6\": dial tcp 38.102.83.200:6443: connect: connection refused" Mar 09 09:12:07 crc kubenswrapper[4792]: I0309 09:12:07.988785 4792 scope.go:117] "RemoveContainer" containerID="9cf484f9a832b0e147a17ec53e64cfcda5e37f8bf1f764ddc35215a079994b71" Mar 09 09:12:08 crc kubenswrapper[4792]: I0309 09:12:08.000100 4792 status_manager.go:851] "Failed to get status for pod" podUID="8a30a83e-e71a-41e6-8946-34b1a6100a67" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.200:6443: connect: connection refused" Mar 09 09:12:08 crc kubenswrapper[4792]: I0309 09:12:08.000434 4792 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.200:6443: connect: connection refused" Mar 09 09:12:08 crc kubenswrapper[4792]: I0309 09:12:08.000686 4792 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.200:6443: connect: connection refused" Mar 09 09:12:08 crc kubenswrapper[4792]: I0309 
09:12:08.000890 4792 status_manager.go:851] "Failed to get status for pod" podUID="5fc195fc-d546-4720-9790-ddadfb09282b" pod="openshift-infra/auto-csr-approver-29550792-cq6k6" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29550792-cq6k6\": dial tcp 38.102.83.200:6443: connect: connection refused" Mar 09 09:12:08 crc kubenswrapper[4792]: I0309 09:12:08.001888 4792 scope.go:117] "RemoveContainer" containerID="314838f53bc19a9f3eb7fd9d3f5473b23a177f2a3068d91f1b0420c27910d409" Mar 09 09:12:08 crc kubenswrapper[4792]: I0309 09:12:08.015499 4792 scope.go:117] "RemoveContainer" containerID="226eecaea6fec5a3ae93063702c719edf3908636a2862b9f50a874f494a19ccf" Mar 09 09:12:08 crc kubenswrapper[4792]: I0309 09:12:08.031723 4792 scope.go:117] "RemoveContainer" containerID="c1070465f72d99ed22e913112259837db7d789c0a072b40956088f4a70162c41" Mar 09 09:12:08 crc kubenswrapper[4792]: I0309 09:12:08.043303 4792 scope.go:117] "RemoveContainer" containerID="1e250b106997f151ae9c435aca2ab3d3d821f40e826afa9ff744443a6b808571" Mar 09 09:12:08 crc kubenswrapper[4792]: I0309 09:12:08.059396 4792 scope.go:117] "RemoveContainer" containerID="53c53d788666585dd6b1fb214b1f79df5bffadc17ff401b1bed44c93e41258dc" Mar 09 09:12:08 crc kubenswrapper[4792]: E0309 09:12:08.059811 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"53c53d788666585dd6b1fb214b1f79df5bffadc17ff401b1bed44c93e41258dc\": container with ID starting with 53c53d788666585dd6b1fb214b1f79df5bffadc17ff401b1bed44c93e41258dc not found: ID does not exist" containerID="53c53d788666585dd6b1fb214b1f79df5bffadc17ff401b1bed44c93e41258dc" Mar 09 09:12:08 crc kubenswrapper[4792]: I0309 09:12:08.059843 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"53c53d788666585dd6b1fb214b1f79df5bffadc17ff401b1bed44c93e41258dc"} err="failed to get container status 
\"53c53d788666585dd6b1fb214b1f79df5bffadc17ff401b1bed44c93e41258dc\": rpc error: code = NotFound desc = could not find container \"53c53d788666585dd6b1fb214b1f79df5bffadc17ff401b1bed44c93e41258dc\": container with ID starting with 53c53d788666585dd6b1fb214b1f79df5bffadc17ff401b1bed44c93e41258dc not found: ID does not exist" Mar 09 09:12:08 crc kubenswrapper[4792]: I0309 09:12:08.059865 4792 scope.go:117] "RemoveContainer" containerID="9cf484f9a832b0e147a17ec53e64cfcda5e37f8bf1f764ddc35215a079994b71" Mar 09 09:12:08 crc kubenswrapper[4792]: E0309 09:12:08.060308 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9cf484f9a832b0e147a17ec53e64cfcda5e37f8bf1f764ddc35215a079994b71\": container with ID starting with 9cf484f9a832b0e147a17ec53e64cfcda5e37f8bf1f764ddc35215a079994b71 not found: ID does not exist" containerID="9cf484f9a832b0e147a17ec53e64cfcda5e37f8bf1f764ddc35215a079994b71" Mar 09 09:12:08 crc kubenswrapper[4792]: I0309 09:12:08.060363 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9cf484f9a832b0e147a17ec53e64cfcda5e37f8bf1f764ddc35215a079994b71"} err="failed to get container status \"9cf484f9a832b0e147a17ec53e64cfcda5e37f8bf1f764ddc35215a079994b71\": rpc error: code = NotFound desc = could not find container \"9cf484f9a832b0e147a17ec53e64cfcda5e37f8bf1f764ddc35215a079994b71\": container with ID starting with 9cf484f9a832b0e147a17ec53e64cfcda5e37f8bf1f764ddc35215a079994b71 not found: ID does not exist" Mar 09 09:12:08 crc kubenswrapper[4792]: I0309 09:12:08.060403 4792 scope.go:117] "RemoveContainer" containerID="314838f53bc19a9f3eb7fd9d3f5473b23a177f2a3068d91f1b0420c27910d409" Mar 09 09:12:08 crc kubenswrapper[4792]: E0309 09:12:08.060674 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"314838f53bc19a9f3eb7fd9d3f5473b23a177f2a3068d91f1b0420c27910d409\": container with ID starting with 314838f53bc19a9f3eb7fd9d3f5473b23a177f2a3068d91f1b0420c27910d409 not found: ID does not exist" containerID="314838f53bc19a9f3eb7fd9d3f5473b23a177f2a3068d91f1b0420c27910d409" Mar 09 09:12:08 crc kubenswrapper[4792]: I0309 09:12:08.060705 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"314838f53bc19a9f3eb7fd9d3f5473b23a177f2a3068d91f1b0420c27910d409"} err="failed to get container status \"314838f53bc19a9f3eb7fd9d3f5473b23a177f2a3068d91f1b0420c27910d409\": rpc error: code = NotFound desc = could not find container \"314838f53bc19a9f3eb7fd9d3f5473b23a177f2a3068d91f1b0420c27910d409\": container with ID starting with 314838f53bc19a9f3eb7fd9d3f5473b23a177f2a3068d91f1b0420c27910d409 not found: ID does not exist" Mar 09 09:12:08 crc kubenswrapper[4792]: I0309 09:12:08.060724 4792 scope.go:117] "RemoveContainer" containerID="226eecaea6fec5a3ae93063702c719edf3908636a2862b9f50a874f494a19ccf" Mar 09 09:12:08 crc kubenswrapper[4792]: E0309 09:12:08.060965 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"226eecaea6fec5a3ae93063702c719edf3908636a2862b9f50a874f494a19ccf\": container with ID starting with 226eecaea6fec5a3ae93063702c719edf3908636a2862b9f50a874f494a19ccf not found: ID does not exist" containerID="226eecaea6fec5a3ae93063702c719edf3908636a2862b9f50a874f494a19ccf" Mar 09 09:12:08 crc kubenswrapper[4792]: I0309 09:12:08.060989 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"226eecaea6fec5a3ae93063702c719edf3908636a2862b9f50a874f494a19ccf"} err="failed to get container status \"226eecaea6fec5a3ae93063702c719edf3908636a2862b9f50a874f494a19ccf\": rpc error: code = NotFound desc = could not find container \"226eecaea6fec5a3ae93063702c719edf3908636a2862b9f50a874f494a19ccf\": container with ID 
starting with 226eecaea6fec5a3ae93063702c719edf3908636a2862b9f50a874f494a19ccf not found: ID does not exist" Mar 09 09:12:08 crc kubenswrapper[4792]: I0309 09:12:08.061008 4792 scope.go:117] "RemoveContainer" containerID="c1070465f72d99ed22e913112259837db7d789c0a072b40956088f4a70162c41" Mar 09 09:12:08 crc kubenswrapper[4792]: E0309 09:12:08.061277 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c1070465f72d99ed22e913112259837db7d789c0a072b40956088f4a70162c41\": container with ID starting with c1070465f72d99ed22e913112259837db7d789c0a072b40956088f4a70162c41 not found: ID does not exist" containerID="c1070465f72d99ed22e913112259837db7d789c0a072b40956088f4a70162c41" Mar 09 09:12:08 crc kubenswrapper[4792]: I0309 09:12:08.061301 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c1070465f72d99ed22e913112259837db7d789c0a072b40956088f4a70162c41"} err="failed to get container status \"c1070465f72d99ed22e913112259837db7d789c0a072b40956088f4a70162c41\": rpc error: code = NotFound desc = could not find container \"c1070465f72d99ed22e913112259837db7d789c0a072b40956088f4a70162c41\": container with ID starting with c1070465f72d99ed22e913112259837db7d789c0a072b40956088f4a70162c41 not found: ID does not exist" Mar 09 09:12:08 crc kubenswrapper[4792]: I0309 09:12:08.061314 4792 scope.go:117] "RemoveContainer" containerID="1e250b106997f151ae9c435aca2ab3d3d821f40e826afa9ff744443a6b808571" Mar 09 09:12:08 crc kubenswrapper[4792]: E0309 09:12:08.061573 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1e250b106997f151ae9c435aca2ab3d3d821f40e826afa9ff744443a6b808571\": container with ID starting with 1e250b106997f151ae9c435aca2ab3d3d821f40e826afa9ff744443a6b808571 not found: ID does not exist" containerID="1e250b106997f151ae9c435aca2ab3d3d821f40e826afa9ff744443a6b808571" Mar 09 
09:12:08 crc kubenswrapper[4792]: I0309 09:12:08.061594 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e250b106997f151ae9c435aca2ab3d3d821f40e826afa9ff744443a6b808571"} err="failed to get container status \"1e250b106997f151ae9c435aca2ab3d3d821f40e826afa9ff744443a6b808571\": rpc error: code = NotFound desc = could not find container \"1e250b106997f151ae9c435aca2ab3d3d821f40e826afa9ff744443a6b808571\": container with ID starting with 1e250b106997f151ae9c435aca2ab3d3d821f40e826afa9ff744443a6b808571 not found: ID does not exist" Mar 09 09:12:09 crc kubenswrapper[4792]: I0309 09:12:09.669531 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Mar 09 09:12:10 crc kubenswrapper[4792]: E0309 09:12:10.819056 4792 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.200:6443: connect: connection refused" Mar 09 09:12:10 crc kubenswrapper[4792]: E0309 09:12:10.819921 4792 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.200:6443: connect: connection refused" Mar 09 09:12:10 crc kubenswrapper[4792]: E0309 09:12:10.820477 4792 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.200:6443: connect: connection refused" Mar 09 09:12:10 crc kubenswrapper[4792]: E0309 09:12:10.820707 4792 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.200:6443: connect: connection 
refused" Mar 09 09:12:10 crc kubenswrapper[4792]: E0309 09:12:10.820873 4792 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.200:6443: connect: connection refused" Mar 09 09:12:10 crc kubenswrapper[4792]: I0309 09:12:10.820900 4792 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Mar 09 09:12:10 crc kubenswrapper[4792]: E0309 09:12:10.821081 4792 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.200:6443: connect: connection refused" interval="200ms" Mar 09 09:12:11 crc kubenswrapper[4792]: E0309 09:12:11.021903 4792 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.200:6443: connect: connection refused" interval="400ms" Mar 09 09:12:11 crc kubenswrapper[4792]: E0309 09:12:11.422541 4792 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.200:6443: connect: connection refused" interval="800ms" Mar 09 09:12:12 crc kubenswrapper[4792]: E0309 09:12:12.224552 4792 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.200:6443: connect: connection refused" interval="1.6s" Mar 09 09:12:13 crc kubenswrapper[4792]: E0309 09:12:13.825609 4792 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.200:6443: connect: connection refused" interval="3.2s" Mar 09 09:12:15 crc kubenswrapper[4792]: I0309 09:12:15.672762 4792 status_manager.go:851] "Failed to get status for pod" podUID="8a30a83e-e71a-41e6-8946-34b1a6100a67" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.200:6443: connect: connection refused" Mar 09 09:12:15 crc kubenswrapper[4792]: I0309 09:12:15.680505 4792 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.200:6443: connect: connection refused" Mar 09 09:12:15 crc kubenswrapper[4792]: I0309 09:12:15.681144 4792 status_manager.go:851] "Failed to get status for pod" podUID="5fc195fc-d546-4720-9790-ddadfb09282b" pod="openshift-infra/auto-csr-approver-29550792-cq6k6" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29550792-cq6k6\": dial tcp 38.102.83.200:6443: connect: connection refused" Mar 09 09:12:15 crc kubenswrapper[4792]: E0309 09:12:15.917241 4792 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.200:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189b21549d0021b6 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:12:05.595423158 +0000 UTC m=+290.625623910,LastTimestamp:2026-03-09 09:12:05.595423158 +0000 UTC m=+290.625623910,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:12:17 crc kubenswrapper[4792]: E0309 09:12:17.026241 4792 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.200:6443: connect: connection refused" interval="6.4s" Mar 09 09:12:18 crc kubenswrapper[4792]: I0309 09:12:18.662140 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 09:12:18 crc kubenswrapper[4792]: I0309 09:12:18.663484 4792 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.200:6443: connect: connection refused" Mar 09 09:12:18 crc kubenswrapper[4792]: I0309 09:12:18.663957 4792 status_manager.go:851] "Failed to get status for pod" podUID="5fc195fc-d546-4720-9790-ddadfb09282b" pod="openshift-infra/auto-csr-approver-29550792-cq6k6" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29550792-cq6k6\": dial tcp 38.102.83.200:6443: connect: connection refused" Mar 09 09:12:18 crc kubenswrapper[4792]: I0309 09:12:18.664337 4792 status_manager.go:851] "Failed to get status for pod" podUID="8a30a83e-e71a-41e6-8946-34b1a6100a67" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.200:6443: connect: connection refused" Mar 09 09:12:18 crc kubenswrapper[4792]: I0309 09:12:18.674680 4792 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e7e46817-10cf-448c-8a2a-154f1c322ce6" Mar 09 09:12:18 crc kubenswrapper[4792]: I0309 09:12:18.674710 4792 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e7e46817-10cf-448c-8a2a-154f1c322ce6" Mar 09 09:12:18 crc kubenswrapper[4792]: E0309 09:12:18.675147 4792 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.200:6443: connect: connection refused" 
pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 09:12:18 crc kubenswrapper[4792]: I0309 09:12:18.675571 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 09:12:18 crc kubenswrapper[4792]: W0309 09:12:18.697941 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71bb4a3aecc4ba5b26c4b7318770ce13.slice/crio-0d8e6e5f8cb5e5afe7c64997a527141ef4b5bc705a8bd224351d383041701089 WatchSource:0}: Error finding container 0d8e6e5f8cb5e5afe7c64997a527141ef4b5bc705a8bd224351d383041701089: Status 404 returned error can't find the container with id 0d8e6e5f8cb5e5afe7c64997a527141ef4b5bc705a8bd224351d383041701089 Mar 09 09:12:19 crc kubenswrapper[4792]: I0309 09:12:19.054385 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 09 09:12:19 crc kubenswrapper[4792]: I0309 09:12:19.055011 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 09 09:12:19 crc kubenswrapper[4792]: I0309 09:12:19.055065 4792 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="82feb47e68b8db8323ed2c02d83e92016fa30e024581d7e361ba07a08919e2ae" exitCode=1 Mar 09 09:12:19 crc kubenswrapper[4792]: I0309 09:12:19.055170 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"82feb47e68b8db8323ed2c02d83e92016fa30e024581d7e361ba07a08919e2ae"} Mar 09 09:12:19 crc kubenswrapper[4792]: I0309 09:12:19.055793 4792 scope.go:117] "RemoveContainer" 
containerID="82feb47e68b8db8323ed2c02d83e92016fa30e024581d7e361ba07a08919e2ae" Mar 09 09:12:19 crc kubenswrapper[4792]: I0309 09:12:19.056851 4792 status_manager.go:851] "Failed to get status for pod" podUID="8a30a83e-e71a-41e6-8946-34b1a6100a67" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.200:6443: connect: connection refused" Mar 09 09:12:19 crc kubenswrapper[4792]: I0309 09:12:19.057344 4792 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.200:6443: connect: connection refused" Mar 09 09:12:19 crc kubenswrapper[4792]: I0309 09:12:19.057676 4792 status_manager.go:851] "Failed to get status for pod" podUID="5fc195fc-d546-4720-9790-ddadfb09282b" pod="openshift-infra/auto-csr-approver-29550792-cq6k6" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29550792-cq6k6\": dial tcp 38.102.83.200:6443: connect: connection refused" Mar 09 09:12:19 crc kubenswrapper[4792]: I0309 09:12:19.058190 4792 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.200:6443: connect: connection refused" Mar 09 09:12:19 crc kubenswrapper[4792]: I0309 09:12:19.058286 4792 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="0b6f118d34bc76e380b54a38bf6d9aa0296e7b07803a5685df65383302f05e0f" exitCode=0 Mar 09 09:12:19 crc kubenswrapper[4792]: 
I0309 09:12:19.058369 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"0b6f118d34bc76e380b54a38bf6d9aa0296e7b07803a5685df65383302f05e0f"} Mar 09 09:12:19 crc kubenswrapper[4792]: I0309 09:12:19.058415 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"0d8e6e5f8cb5e5afe7c64997a527141ef4b5bc705a8bd224351d383041701089"} Mar 09 09:12:19 crc kubenswrapper[4792]: I0309 09:12:19.058842 4792 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e7e46817-10cf-448c-8a2a-154f1c322ce6" Mar 09 09:12:19 crc kubenswrapper[4792]: I0309 09:12:19.058907 4792 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e7e46817-10cf-448c-8a2a-154f1c322ce6" Mar 09 09:12:19 crc kubenswrapper[4792]: I0309 09:12:19.059427 4792 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.200:6443: connect: connection refused" Mar 09 09:12:19 crc kubenswrapper[4792]: E0309 09:12:19.059494 4792 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.200:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 09:12:19 crc kubenswrapper[4792]: I0309 09:12:19.059735 4792 status_manager.go:851] "Failed to get status for pod" podUID="5fc195fc-d546-4720-9790-ddadfb09282b" pod="openshift-infra/auto-csr-approver-29550792-cq6k6" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29550792-cq6k6\": dial tcp 38.102.83.200:6443: connect: connection refused" Mar 09 09:12:19 crc kubenswrapper[4792]: I0309 09:12:19.060214 4792 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.200:6443: connect: connection refused" Mar 09 09:12:19 crc kubenswrapper[4792]: I0309 09:12:19.060649 4792 status_manager.go:851] "Failed to get status for pod" podUID="8a30a83e-e71a-41e6-8946-34b1a6100a67" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.200:6443: connect: connection refused" Mar 09 09:12:20 crc kubenswrapper[4792]: I0309 09:12:20.068921 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 09 09:12:20 crc kubenswrapper[4792]: I0309 09:12:20.071377 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 09 09:12:20 crc kubenswrapper[4792]: I0309 09:12:20.071441 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"6e22f57faf2f2edb4087434aed61650b0abe488fb0353135db83abe4a685b353"} Mar 09 09:12:20 crc kubenswrapper[4792]: I0309 09:12:20.075658 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"1e116d6e363fa7982545a03a7889529596d5d81df8804d5972a052aff816f202"} Mar 09 09:12:20 crc kubenswrapper[4792]: I0309 09:12:20.075702 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"65057111fa824ee61ab4f11eb1c8e21723447227ab4c57889cd2ce0ee9b84831"} Mar 09 09:12:20 crc kubenswrapper[4792]: I0309 09:12:20.075716 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"a651d2ca9051405b28be3f99ec93a363a804a2995f26b2c22b8262bc551b39b3"} Mar 09 09:12:20 crc kubenswrapper[4792]: I0309 09:12:20.075728 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"aa434f9de83bbc23a6e7abd7786d0d5890aa21fad52324db60ba4a44a8628425"} Mar 09 09:12:21 crc kubenswrapper[4792]: I0309 09:12:21.085423 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"bb46d55c6dd14dfd124a24d093a4a2661ff2634d69a6cabd70de1d49d433149a"} Mar 09 09:12:21 crc kubenswrapper[4792]: I0309 09:12:21.085638 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 09:12:21 crc kubenswrapper[4792]: I0309 09:12:21.085717 4792 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e7e46817-10cf-448c-8a2a-154f1c322ce6" Mar 09 09:12:21 crc kubenswrapper[4792]: I0309 09:12:21.085742 4792 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
podUID="e7e46817-10cf-448c-8a2a-154f1c322ce6" Mar 09 09:12:23 crc kubenswrapper[4792]: I0309 09:12:23.676203 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 09:12:23 crc kubenswrapper[4792]: I0309 09:12:23.676551 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 09:12:23 crc kubenswrapper[4792]: I0309 09:12:23.682652 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 09:12:25 crc kubenswrapper[4792]: I0309 09:12:25.375181 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 09 09:12:25 crc kubenswrapper[4792]: I0309 09:12:25.379209 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 09 09:12:25 crc kubenswrapper[4792]: I0309 09:12:25.968606 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 09 09:12:26 crc kubenswrapper[4792]: I0309 09:12:26.094240 4792 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 09:12:26 crc kubenswrapper[4792]: I0309 09:12:26.112405 4792 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e7e46817-10cf-448c-8a2a-154f1c322ce6" Mar 09 09:12:26 crc kubenswrapper[4792]: I0309 09:12:26.112438 4792 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e7e46817-10cf-448c-8a2a-154f1c322ce6" Mar 09 09:12:26 crc kubenswrapper[4792]: I0309 09:12:26.125842 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 
09 09:12:26 crc kubenswrapper[4792]: I0309 09:12:26.129353 4792 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="a182a6f5-dc77-47a5-838b-af497ad11af6" Mar 09 09:12:26 crc kubenswrapper[4792]: I0309 09:12:26.200268 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-tl5jf" podUID="adf7cc5f-5027-4382-bac8-ed4f459fe424" containerName="oauth-openshift" containerID="cri-o://df541305cc4f686f0e6f186740638f6e30455f4640fd7094d291d655a91ef57a" gracePeriod=15 Mar 09 09:12:26 crc kubenswrapper[4792]: I0309 09:12:26.710483 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-tl5jf" Mar 09 09:12:26 crc kubenswrapper[4792]: I0309 09:12:26.783788 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/adf7cc5f-5027-4382-bac8-ed4f459fe424-v4-0-config-user-idp-0-file-data\") pod \"adf7cc5f-5027-4382-bac8-ed4f459fe424\" (UID: \"adf7cc5f-5027-4382-bac8-ed4f459fe424\") " Mar 09 09:12:26 crc kubenswrapper[4792]: I0309 09:12:26.783843 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/adf7cc5f-5027-4382-bac8-ed4f459fe424-v4-0-config-system-trusted-ca-bundle\") pod \"adf7cc5f-5027-4382-bac8-ed4f459fe424\" (UID: \"adf7cc5f-5027-4382-bac8-ed4f459fe424\") " Mar 09 09:12:26 crc kubenswrapper[4792]: I0309 09:12:26.783868 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/adf7cc5f-5027-4382-bac8-ed4f459fe424-v4-0-config-system-session\") pod \"adf7cc5f-5027-4382-bac8-ed4f459fe424\" (UID: 
\"adf7cc5f-5027-4382-bac8-ed4f459fe424\") " Mar 09 09:12:26 crc kubenswrapper[4792]: I0309 09:12:26.783918 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/adf7cc5f-5027-4382-bac8-ed4f459fe424-v4-0-config-user-template-error\") pod \"adf7cc5f-5027-4382-bac8-ed4f459fe424\" (UID: \"adf7cc5f-5027-4382-bac8-ed4f459fe424\") " Mar 09 09:12:26 crc kubenswrapper[4792]: I0309 09:12:26.783945 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/adf7cc5f-5027-4382-bac8-ed4f459fe424-v4-0-config-user-template-login\") pod \"adf7cc5f-5027-4382-bac8-ed4f459fe424\" (UID: \"adf7cc5f-5027-4382-bac8-ed4f459fe424\") " Mar 09 09:12:26 crc kubenswrapper[4792]: I0309 09:12:26.783965 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/adf7cc5f-5027-4382-bac8-ed4f459fe424-v4-0-config-system-router-certs\") pod \"adf7cc5f-5027-4382-bac8-ed4f459fe424\" (UID: \"adf7cc5f-5027-4382-bac8-ed4f459fe424\") " Mar 09 09:12:26 crc kubenswrapper[4792]: I0309 09:12:26.783990 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wvkrg\" (UniqueName: \"kubernetes.io/projected/adf7cc5f-5027-4382-bac8-ed4f459fe424-kube-api-access-wvkrg\") pod \"adf7cc5f-5027-4382-bac8-ed4f459fe424\" (UID: \"adf7cc5f-5027-4382-bac8-ed4f459fe424\") " Mar 09 09:12:26 crc kubenswrapper[4792]: I0309 09:12:26.784015 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/adf7cc5f-5027-4382-bac8-ed4f459fe424-audit-policies\") pod \"adf7cc5f-5027-4382-bac8-ed4f459fe424\" (UID: \"adf7cc5f-5027-4382-bac8-ed4f459fe424\") " Mar 09 09:12:26 crc kubenswrapper[4792]: I0309 09:12:26.784037 4792 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/adf7cc5f-5027-4382-bac8-ed4f459fe424-v4-0-config-system-service-ca\") pod \"adf7cc5f-5027-4382-bac8-ed4f459fe424\" (UID: \"adf7cc5f-5027-4382-bac8-ed4f459fe424\") " Mar 09 09:12:26 crc kubenswrapper[4792]: I0309 09:12:26.784084 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/adf7cc5f-5027-4382-bac8-ed4f459fe424-v4-0-config-user-template-provider-selection\") pod \"adf7cc5f-5027-4382-bac8-ed4f459fe424\" (UID: \"adf7cc5f-5027-4382-bac8-ed4f459fe424\") " Mar 09 09:12:26 crc kubenswrapper[4792]: I0309 09:12:26.784110 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/adf7cc5f-5027-4382-bac8-ed4f459fe424-v4-0-config-system-cliconfig\") pod \"adf7cc5f-5027-4382-bac8-ed4f459fe424\" (UID: \"adf7cc5f-5027-4382-bac8-ed4f459fe424\") " Mar 09 09:12:26 crc kubenswrapper[4792]: I0309 09:12:26.784126 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/adf7cc5f-5027-4382-bac8-ed4f459fe424-audit-dir\") pod \"adf7cc5f-5027-4382-bac8-ed4f459fe424\" (UID: \"adf7cc5f-5027-4382-bac8-ed4f459fe424\") " Mar 09 09:12:26 crc kubenswrapper[4792]: I0309 09:12:26.784175 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/adf7cc5f-5027-4382-bac8-ed4f459fe424-v4-0-config-system-ocp-branding-template\") pod \"adf7cc5f-5027-4382-bac8-ed4f459fe424\" (UID: \"adf7cc5f-5027-4382-bac8-ed4f459fe424\") " Mar 09 09:12:26 crc kubenswrapper[4792]: I0309 09:12:26.784202 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/adf7cc5f-5027-4382-bac8-ed4f459fe424-v4-0-config-system-serving-cert\") pod \"adf7cc5f-5027-4382-bac8-ed4f459fe424\" (UID: \"adf7cc5f-5027-4382-bac8-ed4f459fe424\") " Mar 09 09:12:26 crc kubenswrapper[4792]: I0309 09:12:26.785463 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/adf7cc5f-5027-4382-bac8-ed4f459fe424-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "adf7cc5f-5027-4382-bac8-ed4f459fe424" (UID: "adf7cc5f-5027-4382-bac8-ed4f459fe424"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 09:12:26 crc kubenswrapper[4792]: I0309 09:12:26.786596 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/adf7cc5f-5027-4382-bac8-ed4f459fe424-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "adf7cc5f-5027-4382-bac8-ed4f459fe424" (UID: "adf7cc5f-5027-4382-bac8-ed4f459fe424"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:12:26 crc kubenswrapper[4792]: I0309 09:12:26.786626 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/adf7cc5f-5027-4382-bac8-ed4f459fe424-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "adf7cc5f-5027-4382-bac8-ed4f459fe424" (UID: "adf7cc5f-5027-4382-bac8-ed4f459fe424"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:12:26 crc kubenswrapper[4792]: I0309 09:12:26.786667 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/adf7cc5f-5027-4382-bac8-ed4f459fe424-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "adf7cc5f-5027-4382-bac8-ed4f459fe424" (UID: "adf7cc5f-5027-4382-bac8-ed4f459fe424"). 
InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:12:26 crc kubenswrapper[4792]: I0309 09:12:26.787196 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/adf7cc5f-5027-4382-bac8-ed4f459fe424-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "adf7cc5f-5027-4382-bac8-ed4f459fe424" (UID: "adf7cc5f-5027-4382-bac8-ed4f459fe424"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:12:26 crc kubenswrapper[4792]: I0309 09:12:26.792655 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/adf7cc5f-5027-4382-bac8-ed4f459fe424-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "adf7cc5f-5027-4382-bac8-ed4f459fe424" (UID: "adf7cc5f-5027-4382-bac8-ed4f459fe424"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:12:26 crc kubenswrapper[4792]: I0309 09:12:26.793043 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/adf7cc5f-5027-4382-bac8-ed4f459fe424-kube-api-access-wvkrg" (OuterVolumeSpecName: "kube-api-access-wvkrg") pod "adf7cc5f-5027-4382-bac8-ed4f459fe424" (UID: "adf7cc5f-5027-4382-bac8-ed4f459fe424"). InnerVolumeSpecName "kube-api-access-wvkrg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:12:26 crc kubenswrapper[4792]: I0309 09:12:26.794623 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/adf7cc5f-5027-4382-bac8-ed4f459fe424-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "adf7cc5f-5027-4382-bac8-ed4f459fe424" (UID: "adf7cc5f-5027-4382-bac8-ed4f459fe424"). 
InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:12:26 crc kubenswrapper[4792]: I0309 09:12:26.794696 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/adf7cc5f-5027-4382-bac8-ed4f459fe424-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "adf7cc5f-5027-4382-bac8-ed4f459fe424" (UID: "adf7cc5f-5027-4382-bac8-ed4f459fe424"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:12:26 crc kubenswrapper[4792]: I0309 09:12:26.795051 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/adf7cc5f-5027-4382-bac8-ed4f459fe424-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "adf7cc5f-5027-4382-bac8-ed4f459fe424" (UID: "adf7cc5f-5027-4382-bac8-ed4f459fe424"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:12:26 crc kubenswrapper[4792]: I0309 09:12:26.796109 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/adf7cc5f-5027-4382-bac8-ed4f459fe424-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "adf7cc5f-5027-4382-bac8-ed4f459fe424" (UID: "adf7cc5f-5027-4382-bac8-ed4f459fe424"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:12:26 crc kubenswrapper[4792]: I0309 09:12:26.796736 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/adf7cc5f-5027-4382-bac8-ed4f459fe424-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "adf7cc5f-5027-4382-bac8-ed4f459fe424" (UID: "adf7cc5f-5027-4382-bac8-ed4f459fe424"). 
InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:12:26 crc kubenswrapper[4792]: I0309 09:12:26.797043 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/adf7cc5f-5027-4382-bac8-ed4f459fe424-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "adf7cc5f-5027-4382-bac8-ed4f459fe424" (UID: "adf7cc5f-5027-4382-bac8-ed4f459fe424"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:12:26 crc kubenswrapper[4792]: I0309 09:12:26.800004 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/adf7cc5f-5027-4382-bac8-ed4f459fe424-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "adf7cc5f-5027-4382-bac8-ed4f459fe424" (UID: "adf7cc5f-5027-4382-bac8-ed4f459fe424"). InnerVolumeSpecName "v4-0-config-user-template-error". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:12:26 crc kubenswrapper[4792]: I0309 09:12:26.887040 4792 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/adf7cc5f-5027-4382-bac8-ed4f459fe424-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 09 09:12:26 crc kubenswrapper[4792]: I0309 09:12:26.887088 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wvkrg\" (UniqueName: \"kubernetes.io/projected/adf7cc5f-5027-4382-bac8-ed4f459fe424-kube-api-access-wvkrg\") on node \"crc\" DevicePath \"\"" Mar 09 09:12:26 crc kubenswrapper[4792]: I0309 09:12:26.887099 4792 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/adf7cc5f-5027-4382-bac8-ed4f459fe424-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 09 09:12:26 crc kubenswrapper[4792]: I0309 09:12:26.887111 4792 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/adf7cc5f-5027-4382-bac8-ed4f459fe424-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 09 09:12:26 crc kubenswrapper[4792]: I0309 09:12:26.887122 4792 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/adf7cc5f-5027-4382-bac8-ed4f459fe424-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Mar 09 09:12:26 crc kubenswrapper[4792]: I0309 09:12:26.887132 4792 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/adf7cc5f-5027-4382-bac8-ed4f459fe424-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Mar 09 09:12:26 crc kubenswrapper[4792]: I0309 09:12:26.887143 4792 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/adf7cc5f-5027-4382-bac8-ed4f459fe424-audit-dir\") on node \"crc\" DevicePath \"\"" Mar 09 09:12:26 crc kubenswrapper[4792]: I0309 09:12:26.887152 4792 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/adf7cc5f-5027-4382-bac8-ed4f459fe424-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Mar 09 09:12:26 crc kubenswrapper[4792]: I0309 09:12:26.887161 4792 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/adf7cc5f-5027-4382-bac8-ed4f459fe424-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 09:12:26 crc kubenswrapper[4792]: I0309 09:12:26.887172 4792 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/adf7cc5f-5027-4382-bac8-ed4f459fe424-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 09 09:12:26 crc kubenswrapper[4792]: I0309 09:12:26.887180 4792 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/adf7cc5f-5027-4382-bac8-ed4f459fe424-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 09:12:26 crc kubenswrapper[4792]: I0309 09:12:26.887189 4792 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/adf7cc5f-5027-4382-bac8-ed4f459fe424-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 09 09:12:26 crc kubenswrapper[4792]: I0309 09:12:26.887198 4792 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/adf7cc5f-5027-4382-bac8-ed4f459fe424-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Mar 09 09:12:26 crc kubenswrapper[4792]: I0309 09:12:26.887205 4792 
reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/adf7cc5f-5027-4382-bac8-ed4f459fe424-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 09 09:12:27 crc kubenswrapper[4792]: I0309 09:12:27.118900 4792 generic.go:334] "Generic (PLEG): container finished" podID="adf7cc5f-5027-4382-bac8-ed4f459fe424" containerID="df541305cc4f686f0e6f186740638f6e30455f4640fd7094d291d655a91ef57a" exitCode=0 Mar 09 09:12:27 crc kubenswrapper[4792]: I0309 09:12:27.118978 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-tl5jf" Mar 09 09:12:27 crc kubenswrapper[4792]: I0309 09:12:27.118997 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-tl5jf" event={"ID":"adf7cc5f-5027-4382-bac8-ed4f459fe424","Type":"ContainerDied","Data":"df541305cc4f686f0e6f186740638f6e30455f4640fd7094d291d655a91ef57a"} Mar 09 09:12:27 crc kubenswrapper[4792]: I0309 09:12:27.119052 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-tl5jf" event={"ID":"adf7cc5f-5027-4382-bac8-ed4f459fe424","Type":"ContainerDied","Data":"9405041339e4573a3ac71f68d5a4b4c87a47d2f1689e2c7a912e5e916b4ea2d9"} Mar 09 09:12:27 crc kubenswrapper[4792]: I0309 09:12:27.119109 4792 scope.go:117] "RemoveContainer" containerID="df541305cc4f686f0e6f186740638f6e30455f4640fd7094d291d655a91ef57a" Mar 09 09:12:27 crc kubenswrapper[4792]: I0309 09:12:27.119529 4792 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e7e46817-10cf-448c-8a2a-154f1c322ce6" Mar 09 09:12:27 crc kubenswrapper[4792]: I0309 09:12:27.119553 4792 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e7e46817-10cf-448c-8a2a-154f1c322ce6" Mar 09 09:12:27 crc kubenswrapper[4792]: I0309 
09:12:27.144505 4792 scope.go:117] "RemoveContainer" containerID="df541305cc4f686f0e6f186740638f6e30455f4640fd7094d291d655a91ef57a" Mar 09 09:12:27 crc kubenswrapper[4792]: E0309 09:12:27.145051 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"df541305cc4f686f0e6f186740638f6e30455f4640fd7094d291d655a91ef57a\": container with ID starting with df541305cc4f686f0e6f186740638f6e30455f4640fd7094d291d655a91ef57a not found: ID does not exist" containerID="df541305cc4f686f0e6f186740638f6e30455f4640fd7094d291d655a91ef57a" Mar 09 09:12:27 crc kubenswrapper[4792]: I0309 09:12:27.145116 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df541305cc4f686f0e6f186740638f6e30455f4640fd7094d291d655a91ef57a"} err="failed to get container status \"df541305cc4f686f0e6f186740638f6e30455f4640fd7094d291d655a91ef57a\": rpc error: code = NotFound desc = could not find container \"df541305cc4f686f0e6f186740638f6e30455f4640fd7094d291d655a91ef57a\": container with ID starting with df541305cc4f686f0e6f186740638f6e30455f4640fd7094d291d655a91ef57a not found: ID does not exist" Mar 09 09:12:35 crc kubenswrapper[4792]: I0309 09:12:35.711048 4792 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="a182a6f5-dc77-47a5-838b-af497ad11af6" Mar 09 09:12:35 crc kubenswrapper[4792]: I0309 09:12:35.974327 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 09 09:12:36 crc kubenswrapper[4792]: I0309 09:12:36.115875 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Mar 09 09:12:36 crc kubenswrapper[4792]: I0309 09:12:36.383218 4792 reflector.go:368] Caches populated for *v1.ConfigMap 
from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 09 09:12:36 crc kubenswrapper[4792]: I0309 09:12:36.904807 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Mar 09 09:12:37 crc kubenswrapper[4792]: I0309 09:12:37.015640 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Mar 09 09:12:37 crc kubenswrapper[4792]: I0309 09:12:37.042371 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 09 09:12:37 crc kubenswrapper[4792]: I0309 09:12:37.059611 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Mar 09 09:12:37 crc kubenswrapper[4792]: I0309 09:12:37.977520 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Mar 09 09:12:38 crc kubenswrapper[4792]: I0309 09:12:38.159274 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Mar 09 09:12:38 crc kubenswrapper[4792]: I0309 09:12:38.342791 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Mar 09 09:12:38 crc kubenswrapper[4792]: I0309 09:12:38.624621 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Mar 09 09:12:38 crc kubenswrapper[4792]: I0309 09:12:38.787368 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 09 09:12:38 crc kubenswrapper[4792]: I0309 09:12:38.857814 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" 
Mar 09 09:12:39 crc kubenswrapper[4792]: I0309 09:12:39.147485 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 09 09:12:39 crc kubenswrapper[4792]: I0309 09:12:39.148579 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Mar 09 09:12:39 crc kubenswrapper[4792]: I0309 09:12:39.198304 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Mar 09 09:12:39 crc kubenswrapper[4792]: I0309 09:12:39.314943 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Mar 09 09:12:39 crc kubenswrapper[4792]: I0309 09:12:39.407957 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Mar 09 09:12:39 crc kubenswrapper[4792]: I0309 09:12:39.485926 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 09 09:12:39 crc kubenswrapper[4792]: I0309 09:12:39.555973 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 09 09:12:39 crc kubenswrapper[4792]: I0309 09:12:39.730168 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 09 09:12:39 crc kubenswrapper[4792]: I0309 09:12:39.773903 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Mar 09 09:12:39 crc kubenswrapper[4792]: I0309 09:12:39.781119 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Mar 09 09:12:39 crc kubenswrapper[4792]: I0309 09:12:39.815513 4792 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Mar 09 09:12:39 crc kubenswrapper[4792]: I0309 09:12:39.914299 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 09 09:12:39 crc kubenswrapper[4792]: I0309 09:12:39.925671 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Mar 09 09:12:39 crc kubenswrapper[4792]: I0309 09:12:39.997426 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 09 09:12:40 crc kubenswrapper[4792]: I0309 09:12:40.028314 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Mar 09 09:12:40 crc kubenswrapper[4792]: I0309 09:12:40.172451 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Mar 09 09:12:40 crc kubenswrapper[4792]: I0309 09:12:40.397579 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 09 09:12:40 crc kubenswrapper[4792]: I0309 09:12:40.431461 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 09 09:12:40 crc kubenswrapper[4792]: I0309 09:12:40.488767 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Mar 09 09:12:40 crc kubenswrapper[4792]: I0309 09:12:40.502531 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 09 09:12:40 crc kubenswrapper[4792]: I0309 09:12:40.656394 4792 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Mar 09 09:12:40 crc kubenswrapper[4792]: I0309 09:12:40.775900 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 09 09:12:40 crc kubenswrapper[4792]: I0309 09:12:40.779396 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Mar 09 09:12:40 crc kubenswrapper[4792]: I0309 09:12:40.800816 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Mar 09 09:12:40 crc kubenswrapper[4792]: I0309 09:12:40.829774 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Mar 09 09:12:40 crc kubenswrapper[4792]: I0309 09:12:40.891226 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 09 09:12:40 crc kubenswrapper[4792]: I0309 09:12:40.961773 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 09 09:12:40 crc kubenswrapper[4792]: I0309 09:12:40.996961 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Mar 09 09:12:41 crc kubenswrapper[4792]: I0309 09:12:41.004002 4792 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 09 09:12:41 crc kubenswrapper[4792]: I0309 09:12:41.045882 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 09 09:12:41 crc kubenswrapper[4792]: I0309 09:12:41.161456 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Mar 09 09:12:41 crc kubenswrapper[4792]: I0309 09:12:41.202205 4792 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Mar 09 09:12:41 crc kubenswrapper[4792]: I0309 09:12:41.268398 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 09 09:12:41 crc kubenswrapper[4792]: I0309 09:12:41.295686 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Mar 09 09:12:41 crc kubenswrapper[4792]: I0309 09:12:41.296278 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 09 09:12:41 crc kubenswrapper[4792]: I0309 09:12:41.320839 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Mar 09 09:12:41 crc kubenswrapper[4792]: I0309 09:12:41.332216 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Mar 09 09:12:41 crc kubenswrapper[4792]: I0309 09:12:41.414653 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Mar 09 09:12:41 crc kubenswrapper[4792]: I0309 09:12:41.564427 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 09 09:12:41 crc kubenswrapper[4792]: I0309 09:12:41.606481 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Mar 09 09:12:41 crc kubenswrapper[4792]: I0309 09:12:41.625322 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Mar 09 09:12:41 crc kubenswrapper[4792]: I0309 09:12:41.672534 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Mar 09 09:12:41 crc kubenswrapper[4792]: I0309 
09:12:41.725601 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 09 09:12:41 crc kubenswrapper[4792]: I0309 09:12:41.756837 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Mar 09 09:12:41 crc kubenswrapper[4792]: I0309 09:12:41.804440 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 09 09:12:41 crc kubenswrapper[4792]: I0309 09:12:41.820934 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Mar 09 09:12:41 crc kubenswrapper[4792]: I0309 09:12:41.821328 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Mar 09 09:12:41 crc kubenswrapper[4792]: I0309 09:12:41.955826 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Mar 09 09:12:42 crc kubenswrapper[4792]: I0309 09:12:42.068596 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Mar 09 09:12:42 crc kubenswrapper[4792]: I0309 09:12:42.103692 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Mar 09 09:12:42 crc kubenswrapper[4792]: I0309 09:12:42.152331 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Mar 09 09:12:42 crc kubenswrapper[4792]: I0309 09:12:42.309187 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Mar 09 09:12:42 crc kubenswrapper[4792]: I0309 09:12:42.441598 4792 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-image-registry"/"image-registry-certificates" Mar 09 09:12:42 crc kubenswrapper[4792]: I0309 09:12:42.486531 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 09 09:12:42 crc kubenswrapper[4792]: I0309 09:12:42.526531 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Mar 09 09:12:42 crc kubenswrapper[4792]: I0309 09:12:42.616090 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Mar 09 09:12:42 crc kubenswrapper[4792]: I0309 09:12:42.619210 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 09 09:12:42 crc kubenswrapper[4792]: I0309 09:12:42.637623 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Mar 09 09:12:42 crc kubenswrapper[4792]: I0309 09:12:42.645715 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Mar 09 09:12:42 crc kubenswrapper[4792]: I0309 09:12:42.659450 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Mar 09 09:12:42 crc kubenswrapper[4792]: I0309 09:12:42.685049 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Mar 09 09:12:42 crc kubenswrapper[4792]: I0309 09:12:42.720668 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 09 09:12:42 crc kubenswrapper[4792]: I0309 09:12:42.722099 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Mar 09 09:12:42 
crc kubenswrapper[4792]: I0309 09:12:42.860745 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 09 09:12:42 crc kubenswrapper[4792]: I0309 09:12:42.924280 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Mar 09 09:12:42 crc kubenswrapper[4792]: I0309 09:12:42.996216 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Mar 09 09:12:43 crc kubenswrapper[4792]: I0309 09:12:43.008004 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Mar 09 09:12:43 crc kubenswrapper[4792]: I0309 09:12:43.017953 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Mar 09 09:12:43 crc kubenswrapper[4792]: I0309 09:12:43.051499 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 09 09:12:43 crc kubenswrapper[4792]: I0309 09:12:43.116249 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Mar 09 09:12:43 crc kubenswrapper[4792]: I0309 09:12:43.122578 4792 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 09 09:12:43 crc kubenswrapper[4792]: I0309 09:12:43.159108 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Mar 09 09:12:43 crc kubenswrapper[4792]: I0309 09:12:43.170227 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 09 09:12:43 crc kubenswrapper[4792]: I0309 09:12:43.198059 4792 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 09 09:12:43 crc kubenswrapper[4792]: I0309 09:12:43.330342 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Mar 09 09:12:43 crc kubenswrapper[4792]: I0309 09:12:43.463506 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Mar 09 09:12:43 crc kubenswrapper[4792]: I0309 09:12:43.485353 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Mar 09 09:12:43 crc kubenswrapper[4792]: I0309 09:12:43.602517 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Mar 09 09:12:43 crc kubenswrapper[4792]: I0309 09:12:43.615099 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Mar 09 09:12:43 crc kubenswrapper[4792]: I0309 09:12:43.654369 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Mar 09 09:12:43 crc kubenswrapper[4792]: I0309 09:12:43.694542 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 09 09:12:43 crc kubenswrapper[4792]: I0309 09:12:43.721031 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 09 09:12:43 crc kubenswrapper[4792]: I0309 09:12:43.785290 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Mar 09 09:12:43 crc kubenswrapper[4792]: I0309 09:12:43.844325 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Mar 09 09:12:43 crc kubenswrapper[4792]: I0309 09:12:43.857784 4792 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Mar 09 09:12:43 crc kubenswrapper[4792]: I0309 09:12:43.864501 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Mar 09 09:12:43 crc kubenswrapper[4792]: I0309 09:12:43.947502 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Mar 09 09:12:43 crc kubenswrapper[4792]: I0309 09:12:43.977765 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Mar 09 09:12:44 crc kubenswrapper[4792]: I0309 09:12:44.009995 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 09 09:12:44 crc kubenswrapper[4792]: I0309 09:12:44.199499 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Mar 09 09:12:44 crc kubenswrapper[4792]: I0309 09:12:44.219732 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 09 09:12:44 crc kubenswrapper[4792]: I0309 09:12:44.390436 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Mar 09 09:12:44 crc kubenswrapper[4792]: I0309 09:12:44.410710 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Mar 09 09:12:44 crc kubenswrapper[4792]: I0309 09:12:44.444801 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Mar 09 09:12:44 crc kubenswrapper[4792]: I0309 09:12:44.496808 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 09 
09:12:44 crc kubenswrapper[4792]: I0309 09:12:44.646535 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Mar 09 09:12:44 crc kubenswrapper[4792]: I0309 09:12:44.764877 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Mar 09 09:12:44 crc kubenswrapper[4792]: I0309 09:12:44.788215 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Mar 09 09:12:44 crc kubenswrapper[4792]: I0309 09:12:44.869397 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 09 09:12:44 crc kubenswrapper[4792]: I0309 09:12:44.898738 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 09 09:12:44 crc kubenswrapper[4792]: I0309 09:12:44.899394 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Mar 09 09:12:45 crc kubenswrapper[4792]: I0309 09:12:45.061291 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Mar 09 09:12:45 crc kubenswrapper[4792]: I0309 09:12:45.063186 4792 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Mar 09 09:12:45 crc kubenswrapper[4792]: I0309 09:12:45.064306 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podStartSLOduration=40.064276949 podStartE2EDuration="40.064276949s" podCreationTimestamp="2026-03-09 09:12:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:12:25.94941977 +0000 UTC m=+310.979620522" 
watchObservedRunningTime="2026-03-09 09:12:45.064276949 +0000 UTC m=+330.094477741" Mar 09 09:12:45 crc kubenswrapper[4792]: I0309 09:12:45.074609 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-tl5jf","openshift-kube-apiserver/kube-apiserver-crc"] Mar 09 09:12:45 crc kubenswrapper[4792]: I0309 09:12:45.074718 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc","openshift-authentication/oauth-openshift-f6744b585-q7wcd"] Mar 09 09:12:45 crc kubenswrapper[4792]: E0309 09:12:45.075106 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="adf7cc5f-5027-4382-bac8-ed4f459fe424" containerName="oauth-openshift" Mar 09 09:12:45 crc kubenswrapper[4792]: I0309 09:12:45.075192 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="adf7cc5f-5027-4382-bac8-ed4f459fe424" containerName="oauth-openshift" Mar 09 09:12:45 crc kubenswrapper[4792]: E0309 09:12:45.075228 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5fc195fc-d546-4720-9790-ddadfb09282b" containerName="oc" Mar 09 09:12:45 crc kubenswrapper[4792]: I0309 09:12:45.075246 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="5fc195fc-d546-4720-9790-ddadfb09282b" containerName="oc" Mar 09 09:12:45 crc kubenswrapper[4792]: E0309 09:12:45.075280 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a30a83e-e71a-41e6-8946-34b1a6100a67" containerName="installer" Mar 09 09:12:45 crc kubenswrapper[4792]: I0309 09:12:45.075299 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a30a83e-e71a-41e6-8946-34b1a6100a67" containerName="installer" Mar 09 09:12:45 crc kubenswrapper[4792]: I0309 09:12:45.075309 4792 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e7e46817-10cf-448c-8a2a-154f1c322ce6" Mar 09 09:12:45 crc kubenswrapper[4792]: I0309 09:12:45.075342 4792 mirror_client.go:130] "Deleting a 
mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e7e46817-10cf-448c-8a2a-154f1c322ce6" Mar 09 09:12:45 crc kubenswrapper[4792]: I0309 09:12:45.075565 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="adf7cc5f-5027-4382-bac8-ed4f459fe424" containerName="oauth-openshift" Mar 09 09:12:45 crc kubenswrapper[4792]: I0309 09:12:45.075594 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="5fc195fc-d546-4720-9790-ddadfb09282b" containerName="oc" Mar 09 09:12:45 crc kubenswrapper[4792]: I0309 09:12:45.075627 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a30a83e-e71a-41e6-8946-34b1a6100a67" containerName="installer" Mar 09 09:12:45 crc kubenswrapper[4792]: I0309 09:12:45.076410 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-f6744b585-q7wcd" Mar 09 09:12:45 crc kubenswrapper[4792]: I0309 09:12:45.082702 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 09 09:12:45 crc kubenswrapper[4792]: I0309 09:12:45.089530 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 09 09:12:45 crc kubenswrapper[4792]: I0309 09:12:45.089675 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 09 09:12:45 crc kubenswrapper[4792]: I0309 09:12:45.089765 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 09 09:12:45 crc kubenswrapper[4792]: I0309 09:12:45.090838 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 09 09:12:45 crc kubenswrapper[4792]: I0309 09:12:45.090934 4792 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Mar 09 09:12:45 crc kubenswrapper[4792]: I0309 09:12:45.091022 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 09 09:12:45 crc kubenswrapper[4792]: I0309 09:12:45.091195 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 09 09:12:45 crc kubenswrapper[4792]: I0309 09:12:45.091328 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 09 09:12:45 crc kubenswrapper[4792]: I0309 09:12:45.093374 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Mar 09 09:12:45 crc kubenswrapper[4792]: I0309 09:12:45.100004 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 09 09:12:45 crc kubenswrapper[4792]: I0309 09:12:45.102040 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 09 09:12:45 crc kubenswrapper[4792]: I0309 09:12:45.103822 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 09:12:45 crc kubenswrapper[4792]: I0309 09:12:45.110627 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 09 09:12:45 crc kubenswrapper[4792]: I0309 09:12:45.112813 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 09 09:12:45 crc kubenswrapper[4792]: I0309 09:12:45.126421 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Mar 09 09:12:45 crc 
kubenswrapper[4792]: I0309 09:12:45.134712 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 09 09:12:45 crc kubenswrapper[4792]: I0309 09:12:45.137155 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 09 09:12:45 crc kubenswrapper[4792]: I0309 09:12:45.165759 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/c704bb94-c986-4eaa-b570-e5d6c3b6295b-v4-0-config-system-cliconfig\") pod \"oauth-openshift-f6744b585-q7wcd\" (UID: \"c704bb94-c986-4eaa-b570-e5d6c3b6295b\") " pod="openshift-authentication/oauth-openshift-f6744b585-q7wcd" Mar 09 09:12:45 crc kubenswrapper[4792]: I0309 09:12:45.165816 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/c704bb94-c986-4eaa-b570-e5d6c3b6295b-v4-0-config-system-session\") pod \"oauth-openshift-f6744b585-q7wcd\" (UID: \"c704bb94-c986-4eaa-b570-e5d6c3b6295b\") " pod="openshift-authentication/oauth-openshift-f6744b585-q7wcd" Mar 09 09:12:45 crc kubenswrapper[4792]: I0309 09:12:45.165842 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/c704bb94-c986-4eaa-b570-e5d6c3b6295b-v4-0-config-system-service-ca\") pod \"oauth-openshift-f6744b585-q7wcd\" (UID: \"c704bb94-c986-4eaa-b570-e5d6c3b6295b\") " pod="openshift-authentication/oauth-openshift-f6744b585-q7wcd" Mar 09 09:12:45 crc kubenswrapper[4792]: I0309 09:12:45.165885 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: 
\"kubernetes.io/secret/c704bb94-c986-4eaa-b570-e5d6c3b6295b-v4-0-config-user-template-login\") pod \"oauth-openshift-f6744b585-q7wcd\" (UID: \"c704bb94-c986-4eaa-b570-e5d6c3b6295b\") " pod="openshift-authentication/oauth-openshift-f6744b585-q7wcd" Mar 09 09:12:45 crc kubenswrapper[4792]: I0309 09:12:45.165915 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c704bb94-c986-4eaa-b570-e5d6c3b6295b-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-f6744b585-q7wcd\" (UID: \"c704bb94-c986-4eaa-b570-e5d6c3b6295b\") " pod="openshift-authentication/oauth-openshift-f6744b585-q7wcd" Mar 09 09:12:45 crc kubenswrapper[4792]: I0309 09:12:45.165943 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qfjgx\" (UniqueName: \"kubernetes.io/projected/c704bb94-c986-4eaa-b570-e5d6c3b6295b-kube-api-access-qfjgx\") pod \"oauth-openshift-f6744b585-q7wcd\" (UID: \"c704bb94-c986-4eaa-b570-e5d6c3b6295b\") " pod="openshift-authentication/oauth-openshift-f6744b585-q7wcd" Mar 09 09:12:45 crc kubenswrapper[4792]: I0309 09:12:45.165982 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/c704bb94-c986-4eaa-b570-e5d6c3b6295b-v4-0-config-user-template-error\") pod \"oauth-openshift-f6744b585-q7wcd\" (UID: \"c704bb94-c986-4eaa-b570-e5d6c3b6295b\") " pod="openshift-authentication/oauth-openshift-f6744b585-q7wcd" Mar 09 09:12:45 crc kubenswrapper[4792]: I0309 09:12:45.166007 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/c704bb94-c986-4eaa-b570-e5d6c3b6295b-audit-policies\") pod \"oauth-openshift-f6744b585-q7wcd\" (UID: \"c704bb94-c986-4eaa-b570-e5d6c3b6295b\") " 
pod="openshift-authentication/oauth-openshift-f6744b585-q7wcd" Mar 09 09:12:45 crc kubenswrapper[4792]: I0309 09:12:45.166033 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/c704bb94-c986-4eaa-b570-e5d6c3b6295b-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-f6744b585-q7wcd\" (UID: \"c704bb94-c986-4eaa-b570-e5d6c3b6295b\") " pod="openshift-authentication/oauth-openshift-f6744b585-q7wcd" Mar 09 09:12:45 crc kubenswrapper[4792]: I0309 09:12:45.166061 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/c704bb94-c986-4eaa-b570-e5d6c3b6295b-v4-0-config-system-router-certs\") pod \"oauth-openshift-f6744b585-q7wcd\" (UID: \"c704bb94-c986-4eaa-b570-e5d6c3b6295b\") " pod="openshift-authentication/oauth-openshift-f6744b585-q7wcd" Mar 09 09:12:45 crc kubenswrapper[4792]: I0309 09:12:45.166107 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/c704bb94-c986-4eaa-b570-e5d6c3b6295b-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-f6744b585-q7wcd\" (UID: \"c704bb94-c986-4eaa-b570-e5d6c3b6295b\") " pod="openshift-authentication/oauth-openshift-f6744b585-q7wcd" Mar 09 09:12:45 crc kubenswrapper[4792]: I0309 09:12:45.166140 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/c704bb94-c986-4eaa-b570-e5d6c3b6295b-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-f6744b585-q7wcd\" (UID: \"c704bb94-c986-4eaa-b570-e5d6c3b6295b\") " pod="openshift-authentication/oauth-openshift-f6744b585-q7wcd" Mar 09 09:12:45 crc kubenswrapper[4792]: I0309 
09:12:45.166164 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c704bb94-c986-4eaa-b570-e5d6c3b6295b-audit-dir\") pod \"oauth-openshift-f6744b585-q7wcd\" (UID: \"c704bb94-c986-4eaa-b570-e5d6c3b6295b\") " pod="openshift-authentication/oauth-openshift-f6744b585-q7wcd" Mar 09 09:12:45 crc kubenswrapper[4792]: I0309 09:12:45.166193 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/c704bb94-c986-4eaa-b570-e5d6c3b6295b-v4-0-config-system-serving-cert\") pod \"oauth-openshift-f6744b585-q7wcd\" (UID: \"c704bb94-c986-4eaa-b570-e5d6c3b6295b\") " pod="openshift-authentication/oauth-openshift-f6744b585-q7wcd" Mar 09 09:12:45 crc kubenswrapper[4792]: I0309 09:12:45.178904 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Mar 09 09:12:45 crc kubenswrapper[4792]: I0309 09:12:45.186025 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=19.186008357 podStartE2EDuration="19.186008357s" podCreationTimestamp="2026-03-09 09:12:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:12:45.157901876 +0000 UTC m=+330.188102658" watchObservedRunningTime="2026-03-09 09:12:45.186008357 +0000 UTC m=+330.216209109" Mar 09 09:12:45 crc kubenswrapper[4792]: I0309 09:12:45.210354 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 09 09:12:45 crc kubenswrapper[4792]: I0309 09:12:45.267567 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/c704bb94-c986-4eaa-b570-e5d6c3b6295b-v4-0-config-system-service-ca\") pod \"oauth-openshift-f6744b585-q7wcd\" (UID: \"c704bb94-c986-4eaa-b570-e5d6c3b6295b\") " pod="openshift-authentication/oauth-openshift-f6744b585-q7wcd" Mar 09 09:12:45 crc kubenswrapper[4792]: I0309 09:12:45.267611 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/c704bb94-c986-4eaa-b570-e5d6c3b6295b-v4-0-config-system-session\") pod \"oauth-openshift-f6744b585-q7wcd\" (UID: \"c704bb94-c986-4eaa-b570-e5d6c3b6295b\") " pod="openshift-authentication/oauth-openshift-f6744b585-q7wcd" Mar 09 09:12:45 crc kubenswrapper[4792]: I0309 09:12:45.267647 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/c704bb94-c986-4eaa-b570-e5d6c3b6295b-v4-0-config-user-template-login\") pod \"oauth-openshift-f6744b585-q7wcd\" (UID: \"c704bb94-c986-4eaa-b570-e5d6c3b6295b\") " pod="openshift-authentication/oauth-openshift-f6744b585-q7wcd" Mar 09 09:12:45 crc kubenswrapper[4792]: I0309 09:12:45.267674 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c704bb94-c986-4eaa-b570-e5d6c3b6295b-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-f6744b585-q7wcd\" (UID: \"c704bb94-c986-4eaa-b570-e5d6c3b6295b\") " pod="openshift-authentication/oauth-openshift-f6744b585-q7wcd" Mar 09 09:12:45 crc kubenswrapper[4792]: I0309 09:12:45.267702 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qfjgx\" (UniqueName: \"kubernetes.io/projected/c704bb94-c986-4eaa-b570-e5d6c3b6295b-kube-api-access-qfjgx\") pod \"oauth-openshift-f6744b585-q7wcd\" (UID: \"c704bb94-c986-4eaa-b570-e5d6c3b6295b\") " pod="openshift-authentication/oauth-openshift-f6744b585-q7wcd" 
Mar 09 09:12:45 crc kubenswrapper[4792]: I0309 09:12:45.267740 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/c704bb94-c986-4eaa-b570-e5d6c3b6295b-v4-0-config-user-template-error\") pod \"oauth-openshift-f6744b585-q7wcd\" (UID: \"c704bb94-c986-4eaa-b570-e5d6c3b6295b\") " pod="openshift-authentication/oauth-openshift-f6744b585-q7wcd" Mar 09 09:12:45 crc kubenswrapper[4792]: I0309 09:12:45.267764 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/c704bb94-c986-4eaa-b570-e5d6c3b6295b-audit-policies\") pod \"oauth-openshift-f6744b585-q7wcd\" (UID: \"c704bb94-c986-4eaa-b570-e5d6c3b6295b\") " pod="openshift-authentication/oauth-openshift-f6744b585-q7wcd" Mar 09 09:12:45 crc kubenswrapper[4792]: I0309 09:12:45.267788 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/c704bb94-c986-4eaa-b570-e5d6c3b6295b-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-f6744b585-q7wcd\" (UID: \"c704bb94-c986-4eaa-b570-e5d6c3b6295b\") " pod="openshift-authentication/oauth-openshift-f6744b585-q7wcd" Mar 09 09:12:45 crc kubenswrapper[4792]: I0309 09:12:45.267812 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/c704bb94-c986-4eaa-b570-e5d6c3b6295b-v4-0-config-system-router-certs\") pod \"oauth-openshift-f6744b585-q7wcd\" (UID: \"c704bb94-c986-4eaa-b570-e5d6c3b6295b\") " pod="openshift-authentication/oauth-openshift-f6744b585-q7wcd" Mar 09 09:12:45 crc kubenswrapper[4792]: I0309 09:12:45.267840 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: 
\"kubernetes.io/secret/c704bb94-c986-4eaa-b570-e5d6c3b6295b-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-f6744b585-q7wcd\" (UID: \"c704bb94-c986-4eaa-b570-e5d6c3b6295b\") " pod="openshift-authentication/oauth-openshift-f6744b585-q7wcd" Mar 09 09:12:45 crc kubenswrapper[4792]: I0309 09:12:45.267861 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/c704bb94-c986-4eaa-b570-e5d6c3b6295b-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-f6744b585-q7wcd\" (UID: \"c704bb94-c986-4eaa-b570-e5d6c3b6295b\") " pod="openshift-authentication/oauth-openshift-f6744b585-q7wcd" Mar 09 09:12:45 crc kubenswrapper[4792]: I0309 09:12:45.267890 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c704bb94-c986-4eaa-b570-e5d6c3b6295b-audit-dir\") pod \"oauth-openshift-f6744b585-q7wcd\" (UID: \"c704bb94-c986-4eaa-b570-e5d6c3b6295b\") " pod="openshift-authentication/oauth-openshift-f6744b585-q7wcd" Mar 09 09:12:45 crc kubenswrapper[4792]: I0309 09:12:45.267919 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/c704bb94-c986-4eaa-b570-e5d6c3b6295b-v4-0-config-system-serving-cert\") pod \"oauth-openshift-f6744b585-q7wcd\" (UID: \"c704bb94-c986-4eaa-b570-e5d6c3b6295b\") " pod="openshift-authentication/oauth-openshift-f6744b585-q7wcd" Mar 09 09:12:45 crc kubenswrapper[4792]: I0309 09:12:45.267951 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/c704bb94-c986-4eaa-b570-e5d6c3b6295b-v4-0-config-system-cliconfig\") pod \"oauth-openshift-f6744b585-q7wcd\" (UID: \"c704bb94-c986-4eaa-b570-e5d6c3b6295b\") " pod="openshift-authentication/oauth-openshift-f6744b585-q7wcd" Mar 09 09:12:45 
crc kubenswrapper[4792]: I0309 09:12:45.268768 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/c704bb94-c986-4eaa-b570-e5d6c3b6295b-v4-0-config-system-cliconfig\") pod \"oauth-openshift-f6744b585-q7wcd\" (UID: \"c704bb94-c986-4eaa-b570-e5d6c3b6295b\") " pod="openshift-authentication/oauth-openshift-f6744b585-q7wcd" Mar 09 09:12:45 crc kubenswrapper[4792]: I0309 09:12:45.268990 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c704bb94-c986-4eaa-b570-e5d6c3b6295b-audit-dir\") pod \"oauth-openshift-f6744b585-q7wcd\" (UID: \"c704bb94-c986-4eaa-b570-e5d6c3b6295b\") " pod="openshift-authentication/oauth-openshift-f6744b585-q7wcd" Mar 09 09:12:45 crc kubenswrapper[4792]: I0309 09:12:45.269293 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c704bb94-c986-4eaa-b570-e5d6c3b6295b-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-f6744b585-q7wcd\" (UID: \"c704bb94-c986-4eaa-b570-e5d6c3b6295b\") " pod="openshift-authentication/oauth-openshift-f6744b585-q7wcd" Mar 09 09:12:45 crc kubenswrapper[4792]: I0309 09:12:45.269399 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/c704bb94-c986-4eaa-b570-e5d6c3b6295b-audit-policies\") pod \"oauth-openshift-f6744b585-q7wcd\" (UID: \"c704bb94-c986-4eaa-b570-e5d6c3b6295b\") " pod="openshift-authentication/oauth-openshift-f6744b585-q7wcd" Mar 09 09:12:45 crc kubenswrapper[4792]: I0309 09:12:45.269748 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/c704bb94-c986-4eaa-b570-e5d6c3b6295b-v4-0-config-system-service-ca\") pod \"oauth-openshift-f6744b585-q7wcd\" (UID: 
\"c704bb94-c986-4eaa-b570-e5d6c3b6295b\") " pod="openshift-authentication/oauth-openshift-f6744b585-q7wcd" Mar 09 09:12:45 crc kubenswrapper[4792]: I0309 09:12:45.275946 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/c704bb94-c986-4eaa-b570-e5d6c3b6295b-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-f6744b585-q7wcd\" (UID: \"c704bb94-c986-4eaa-b570-e5d6c3b6295b\") " pod="openshift-authentication/oauth-openshift-f6744b585-q7wcd" Mar 09 09:12:45 crc kubenswrapper[4792]: I0309 09:12:45.277348 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/c704bb94-c986-4eaa-b570-e5d6c3b6295b-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-f6744b585-q7wcd\" (UID: \"c704bb94-c986-4eaa-b570-e5d6c3b6295b\") " pod="openshift-authentication/oauth-openshift-f6744b585-q7wcd" Mar 09 09:12:45 crc kubenswrapper[4792]: I0309 09:12:45.278030 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/c704bb94-c986-4eaa-b570-e5d6c3b6295b-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-f6744b585-q7wcd\" (UID: \"c704bb94-c986-4eaa-b570-e5d6c3b6295b\") " pod="openshift-authentication/oauth-openshift-f6744b585-q7wcd" Mar 09 09:12:45 crc kubenswrapper[4792]: I0309 09:12:45.278395 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/c704bb94-c986-4eaa-b570-e5d6c3b6295b-v4-0-config-system-router-certs\") pod \"oauth-openshift-f6744b585-q7wcd\" (UID: \"c704bb94-c986-4eaa-b570-e5d6c3b6295b\") " pod="openshift-authentication/oauth-openshift-f6744b585-q7wcd" Mar 09 09:12:45 crc kubenswrapper[4792]: I0309 09:12:45.278452 4792 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/c704bb94-c986-4eaa-b570-e5d6c3b6295b-v4-0-config-system-session\") pod \"oauth-openshift-f6744b585-q7wcd\" (UID: \"c704bb94-c986-4eaa-b570-e5d6c3b6295b\") " pod="openshift-authentication/oauth-openshift-f6744b585-q7wcd" Mar 09 09:12:45 crc kubenswrapper[4792]: I0309 09:12:45.282395 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/c704bb94-c986-4eaa-b570-e5d6c3b6295b-v4-0-config-user-template-error\") pod \"oauth-openshift-f6744b585-q7wcd\" (UID: \"c704bb94-c986-4eaa-b570-e5d6c3b6295b\") " pod="openshift-authentication/oauth-openshift-f6744b585-q7wcd" Mar 09 09:12:45 crc kubenswrapper[4792]: I0309 09:12:45.282870 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/c704bb94-c986-4eaa-b570-e5d6c3b6295b-v4-0-config-system-serving-cert\") pod \"oauth-openshift-f6744b585-q7wcd\" (UID: \"c704bb94-c986-4eaa-b570-e5d6c3b6295b\") " pod="openshift-authentication/oauth-openshift-f6744b585-q7wcd" Mar 09 09:12:45 crc kubenswrapper[4792]: I0309 09:12:45.282914 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/c704bb94-c986-4eaa-b570-e5d6c3b6295b-v4-0-config-user-template-login\") pod \"oauth-openshift-f6744b585-q7wcd\" (UID: \"c704bb94-c986-4eaa-b570-e5d6c3b6295b\") " pod="openshift-authentication/oauth-openshift-f6744b585-q7wcd" Mar 09 09:12:45 crc kubenswrapper[4792]: I0309 09:12:45.298495 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qfjgx\" (UniqueName: \"kubernetes.io/projected/c704bb94-c986-4eaa-b570-e5d6c3b6295b-kube-api-access-qfjgx\") pod \"oauth-openshift-f6744b585-q7wcd\" (UID: \"c704bb94-c986-4eaa-b570-e5d6c3b6295b\") " 
pod="openshift-authentication/oauth-openshift-f6744b585-q7wcd" Mar 09 09:12:45 crc kubenswrapper[4792]: I0309 09:12:45.318575 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Mar 09 09:12:45 crc kubenswrapper[4792]: I0309 09:12:45.355419 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 09 09:12:45 crc kubenswrapper[4792]: I0309 09:12:45.403365 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Mar 09 09:12:45 crc kubenswrapper[4792]: I0309 09:12:45.409146 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-f6744b585-q7wcd" Mar 09 09:12:45 crc kubenswrapper[4792]: I0309 09:12:45.493811 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 09 09:12:45 crc kubenswrapper[4792]: I0309 09:12:45.522438 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Mar 09 09:12:45 crc kubenswrapper[4792]: I0309 09:12:45.629866 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Mar 09 09:12:45 crc kubenswrapper[4792]: I0309 09:12:45.669742 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="adf7cc5f-5027-4382-bac8-ed4f459fe424" path="/var/lib/kubelet/pods/adf7cc5f-5027-4382-bac8-ed4f459fe424/volumes" Mar 09 09:12:45 crc kubenswrapper[4792]: I0309 09:12:45.689935 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Mar 09 09:12:45 crc kubenswrapper[4792]: I0309 09:12:45.705226 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Mar 09 09:12:45 
crc kubenswrapper[4792]: I0309 09:12:45.709661 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 09 09:12:45 crc kubenswrapper[4792]: I0309 09:12:45.728937 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Mar 09 09:12:45 crc kubenswrapper[4792]: I0309 09:12:45.751910 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Mar 09 09:12:45 crc kubenswrapper[4792]: I0309 09:12:45.773300 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 09 09:12:45 crc kubenswrapper[4792]: I0309 09:12:45.880725 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 09 09:12:45 crc kubenswrapper[4792]: I0309 09:12:45.918604 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Mar 09 09:12:45 crc kubenswrapper[4792]: I0309 09:12:45.993859 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Mar 09 09:12:46 crc kubenswrapper[4792]: I0309 09:12:46.037998 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Mar 09 09:12:46 crc kubenswrapper[4792]: I0309 09:12:46.114551 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 09 09:12:46 crc kubenswrapper[4792]: I0309 09:12:46.118643 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 09 09:12:46 crc kubenswrapper[4792]: I0309 09:12:46.138450 4792 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 09 09:12:46 crc kubenswrapper[4792]: I0309 09:12:46.240842 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Mar 09 09:12:46 crc kubenswrapper[4792]: I0309 09:12:46.294731 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 09 09:12:46 crc kubenswrapper[4792]: I0309 09:12:46.302027 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Mar 09 09:12:46 crc kubenswrapper[4792]: I0309 09:12:46.312970 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 09 09:12:46 crc kubenswrapper[4792]: I0309 09:12:46.350536 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Mar 09 09:12:46 crc kubenswrapper[4792]: I0309 09:12:46.358053 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 09 09:12:46 crc kubenswrapper[4792]: I0309 09:12:46.417605 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Mar 09 09:12:46 crc kubenswrapper[4792]: I0309 09:12:46.453089 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 09 09:12:46 crc kubenswrapper[4792]: I0309 09:12:46.473622 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Mar 09 09:12:46 crc kubenswrapper[4792]: I0309 09:12:46.486738 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Mar 09 09:12:46 crc kubenswrapper[4792]: I0309 09:12:46.490770 4792 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 09 09:12:46 crc kubenswrapper[4792]: I0309 09:12:46.519312 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 09 09:12:46 crc kubenswrapper[4792]: I0309 09:12:46.531437 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Mar 09 09:12:46 crc kubenswrapper[4792]: I0309 09:12:46.550331 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 09 09:12:46 crc kubenswrapper[4792]: I0309 09:12:46.575117 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Mar 09 09:12:46 crc kubenswrapper[4792]: I0309 09:12:46.720508 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 09 09:12:46 crc kubenswrapper[4792]: I0309 09:12:46.890405 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 09 09:12:46 crc kubenswrapper[4792]: I0309 09:12:46.925911 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Mar 09 09:12:47 crc kubenswrapper[4792]: I0309 09:12:47.005794 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Mar 09 09:12:47 crc kubenswrapper[4792]: I0309 09:12:47.214080 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Mar 09 09:12:47 crc kubenswrapper[4792]: I0309 09:12:47.237974 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 09 09:12:47 crc 
kubenswrapper[4792]: I0309 09:12:47.254630 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 09 09:12:47 crc kubenswrapper[4792]: I0309 09:12:47.291639 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 09 09:12:47 crc kubenswrapper[4792]: I0309 09:12:47.319920 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Mar 09 09:12:47 crc kubenswrapper[4792]: I0309 09:12:47.359752 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 09 09:12:47 crc kubenswrapper[4792]: I0309 09:12:47.398325 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 09 09:12:47 crc kubenswrapper[4792]: I0309 09:12:47.422147 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-f6744b585-q7wcd"] Mar 09 09:12:47 crc kubenswrapper[4792]: I0309 09:12:47.477253 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 09 09:12:47 crc kubenswrapper[4792]: I0309 09:12:47.489385 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 09 09:12:47 crc kubenswrapper[4792]: I0309 09:12:47.538634 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Mar 09 09:12:47 crc kubenswrapper[4792]: I0309 09:12:47.559877 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Mar 09 09:12:47 crc kubenswrapper[4792]: I0309 09:12:47.572112 4792 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-config-operator"/"openshift-service-ca.crt" Mar 09 09:12:47 crc kubenswrapper[4792]: I0309 09:12:47.596546 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 09 09:12:47 crc kubenswrapper[4792]: I0309 09:12:47.645862 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 09 09:12:47 crc kubenswrapper[4792]: I0309 09:12:47.696006 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Mar 09 09:12:47 crc kubenswrapper[4792]: I0309 09:12:47.710729 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Mar 09 09:12:47 crc kubenswrapper[4792]: I0309 09:12:47.719854 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Mar 09 09:12:47 crc kubenswrapper[4792]: I0309 09:12:47.836348 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 09 09:12:47 crc kubenswrapper[4792]: I0309 09:12:47.900681 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Mar 09 09:12:47 crc kubenswrapper[4792]: I0309 09:12:47.948229 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Mar 09 09:12:48 crc kubenswrapper[4792]: I0309 09:12:48.161949 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Mar 09 09:12:48 crc kubenswrapper[4792]: I0309 09:12:48.225405 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Mar 09 09:12:48 crc kubenswrapper[4792]: I0309 09:12:48.256156 4792 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-authentication_oauth-openshift-f6744b585-q7wcd_c704bb94-c986-4eaa-b570-e5d6c3b6295b/oauth-openshift/0.log" Mar 09 09:12:48 crc kubenswrapper[4792]: I0309 09:12:48.256211 4792 generic.go:334] "Generic (PLEG): container finished" podID="c704bb94-c986-4eaa-b570-e5d6c3b6295b" containerID="80d5c2b45bbefb87eb55834a1d7d562db9245b381ba7cc4fe3154de31c670eb6" exitCode=255 Mar 09 09:12:48 crc kubenswrapper[4792]: I0309 09:12:48.256294 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-f6744b585-q7wcd" event={"ID":"c704bb94-c986-4eaa-b570-e5d6c3b6295b","Type":"ContainerDied","Data":"80d5c2b45bbefb87eb55834a1d7d562db9245b381ba7cc4fe3154de31c670eb6"} Mar 09 09:12:48 crc kubenswrapper[4792]: I0309 09:12:48.256323 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-f6744b585-q7wcd" event={"ID":"c704bb94-c986-4eaa-b570-e5d6c3b6295b","Type":"ContainerStarted","Data":"c51ba93186772b12bcaab04e84235bea2098a16ea73964cab743dd7b60164593"} Mar 09 09:12:48 crc kubenswrapper[4792]: I0309 09:12:48.257046 4792 scope.go:117] "RemoveContainer" containerID="80d5c2b45bbefb87eb55834a1d7d562db9245b381ba7cc4fe3154de31c670eb6" Mar 09 09:12:48 crc kubenswrapper[4792]: I0309 09:12:48.263365 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 09 09:12:48 crc kubenswrapper[4792]: I0309 09:12:48.271826 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 09 09:12:48 crc kubenswrapper[4792]: I0309 09:12:48.328961 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 09 09:12:48 crc kubenswrapper[4792]: I0309 09:12:48.359046 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 09 09:12:48 crc 
kubenswrapper[4792]: I0309 09:12:48.503485 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 09 09:12:48 crc kubenswrapper[4792]: I0309 09:12:48.522401 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 09 09:12:48 crc kubenswrapper[4792]: I0309 09:12:48.545950 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Mar 09 09:12:48 crc kubenswrapper[4792]: I0309 09:12:48.555333 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Mar 09 09:12:48 crc kubenswrapper[4792]: I0309 09:12:48.602586 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 09 09:12:48 crc kubenswrapper[4792]: I0309 09:12:48.605194 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 09 09:12:48 crc kubenswrapper[4792]: I0309 09:12:48.629866 4792 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 09 09:12:48 crc kubenswrapper[4792]: I0309 09:12:48.630130 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://3e73881ba97f1cf0bce321ba3597901e71ca01a914dc944fd4d20800266bad17" gracePeriod=5 Mar 09 09:12:48 crc kubenswrapper[4792]: I0309 09:12:48.678406 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Mar 09 09:12:48 crc kubenswrapper[4792]: I0309 09:12:48.774927 4792 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 09 09:12:48 crc kubenswrapper[4792]: I0309 09:12:48.796422 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 09 09:12:48 crc kubenswrapper[4792]: I0309 09:12:48.910521 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Mar 09 09:12:48 crc kubenswrapper[4792]: I0309 09:12:48.934304 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Mar 09 09:12:48 crc kubenswrapper[4792]: I0309 09:12:48.948561 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Mar 09 09:12:48 crc kubenswrapper[4792]: I0309 09:12:48.999277 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Mar 09 09:12:49 crc kubenswrapper[4792]: I0309 09:12:49.005988 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 09 09:12:49 crc kubenswrapper[4792]: I0309 09:12:49.029313 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Mar 09 09:12:49 crc kubenswrapper[4792]: I0309 09:12:49.130887 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Mar 09 09:12:49 crc kubenswrapper[4792]: I0309 09:12:49.143630 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Mar 09 09:12:49 crc kubenswrapper[4792]: I0309 09:12:49.169975 4792 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Mar 09 09:12:49 crc kubenswrapper[4792]: I0309 09:12:49.221369 4792 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Mar 09 09:12:49 crc kubenswrapper[4792]: I0309 09:12:49.226700 4792 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 09 09:12:49 crc kubenswrapper[4792]: I0309 09:12:49.240048 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Mar 09 09:12:49 crc kubenswrapper[4792]: I0309 09:12:49.261444 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication_oauth-openshift-f6744b585-q7wcd_c704bb94-c986-4eaa-b570-e5d6c3b6295b/oauth-openshift/1.log" Mar 09 09:12:49 crc kubenswrapper[4792]: I0309 09:12:49.261773 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication_oauth-openshift-f6744b585-q7wcd_c704bb94-c986-4eaa-b570-e5d6c3b6295b/oauth-openshift/0.log" Mar 09 09:12:49 crc kubenswrapper[4792]: I0309 09:12:49.261804 4792 generic.go:334] "Generic (PLEG): container finished" podID="c704bb94-c986-4eaa-b570-e5d6c3b6295b" containerID="088d1cc10ca67a2549f78a6f0a58e6b839eeaa6ebfa5d5ef6aa9bf85578dfbf9" exitCode=255 Mar 09 09:12:49 crc kubenswrapper[4792]: I0309 09:12:49.261864 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-f6744b585-q7wcd" event={"ID":"c704bb94-c986-4eaa-b570-e5d6c3b6295b","Type":"ContainerDied","Data":"088d1cc10ca67a2549f78a6f0a58e6b839eeaa6ebfa5d5ef6aa9bf85578dfbf9"} Mar 09 09:12:49 crc kubenswrapper[4792]: I0309 09:12:49.261896 4792 scope.go:117] "RemoveContainer" containerID="80d5c2b45bbefb87eb55834a1d7d562db9245b381ba7cc4fe3154de31c670eb6" Mar 09 09:12:49 crc kubenswrapper[4792]: I0309 09:12:49.262359 4792 scope.go:117] "RemoveContainer" containerID="088d1cc10ca67a2549f78a6f0a58e6b839eeaa6ebfa5d5ef6aa9bf85578dfbf9" Mar 09 09:12:49 crc kubenswrapper[4792]: E0309 09:12:49.262606 4792 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"oauth-openshift\" with CrashLoopBackOff: \"back-off 10s restarting failed container=oauth-openshift pod=oauth-openshift-f6744b585-q7wcd_openshift-authentication(c704bb94-c986-4eaa-b570-e5d6c3b6295b)\"" pod="openshift-authentication/oauth-openshift-f6744b585-q7wcd" podUID="c704bb94-c986-4eaa-b570-e5d6c3b6295b" Mar 09 09:12:49 crc kubenswrapper[4792]: I0309 09:12:49.487102 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Mar 09 09:12:49 crc kubenswrapper[4792]: I0309 09:12:49.492709 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 09 09:12:49 crc kubenswrapper[4792]: I0309 09:12:49.554833 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Mar 09 09:12:49 crc kubenswrapper[4792]: I0309 09:12:49.673265 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 09 09:12:49 crc kubenswrapper[4792]: I0309 09:12:49.684998 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 09 09:12:49 crc kubenswrapper[4792]: I0309 09:12:49.704433 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Mar 09 09:12:49 crc kubenswrapper[4792]: I0309 09:12:49.967632 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Mar 09 09:12:49 crc kubenswrapper[4792]: I0309 09:12:49.967632 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Mar 09 09:12:49 crc kubenswrapper[4792]: I0309 09:12:49.971678 4792 reflector.go:368] Caches populated 
for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Mar 09 09:12:50 crc kubenswrapper[4792]: I0309 09:12:50.080546 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Mar 09 09:12:50 crc kubenswrapper[4792]: I0309 09:12:50.126391 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Mar 09 09:12:50 crc kubenswrapper[4792]: I0309 09:12:50.180628 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Mar 09 09:12:50 crc kubenswrapper[4792]: I0309 09:12:50.231042 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Mar 09 09:12:50 crc kubenswrapper[4792]: I0309 09:12:50.269552 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication_oauth-openshift-f6744b585-q7wcd_c704bb94-c986-4eaa-b570-e5d6c3b6295b/oauth-openshift/1.log" Mar 09 09:12:50 crc kubenswrapper[4792]: I0309 09:12:50.270258 4792 scope.go:117] "RemoveContainer" containerID="088d1cc10ca67a2549f78a6f0a58e6b839eeaa6ebfa5d5ef6aa9bf85578dfbf9" Mar 09 09:12:50 crc kubenswrapper[4792]: E0309 09:12:50.270436 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oauth-openshift\" with CrashLoopBackOff: \"back-off 10s restarting failed container=oauth-openshift pod=oauth-openshift-f6744b585-q7wcd_openshift-authentication(c704bb94-c986-4eaa-b570-e5d6c3b6295b)\"" pod="openshift-authentication/oauth-openshift-f6744b585-q7wcd" podUID="c704bb94-c986-4eaa-b570-e5d6c3b6295b" Mar 09 09:12:50 crc kubenswrapper[4792]: I0309 09:12:50.271817 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 09 09:12:50 crc kubenswrapper[4792]: I0309 09:12:50.323399 4792 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 09 09:12:50 crc kubenswrapper[4792]: I0309 09:12:50.467262 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 09 09:12:50 crc kubenswrapper[4792]: I0309 09:12:50.493790 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Mar 09 09:12:50 crc kubenswrapper[4792]: I0309 09:12:50.510967 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 09 09:12:50 crc kubenswrapper[4792]: I0309 09:12:50.739671 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Mar 09 09:12:50 crc kubenswrapper[4792]: I0309 09:12:50.750467 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 09 09:12:50 crc kubenswrapper[4792]: I0309 09:12:50.953208 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Mar 09 09:12:51 crc kubenswrapper[4792]: I0309 09:12:51.081638 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 09 09:12:51 crc kubenswrapper[4792]: I0309 09:12:51.224625 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Mar 09 09:12:51 crc kubenswrapper[4792]: I0309 09:12:51.597720 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Mar 09 09:12:51 crc kubenswrapper[4792]: I0309 09:12:51.686831 4792 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Mar 09 09:12:51 crc kubenswrapper[4792]: I0309 09:12:51.815227 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 09 09:12:52 crc kubenswrapper[4792]: I0309 09:12:52.028273 4792 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Mar 09 09:12:52 crc kubenswrapper[4792]: I0309 09:12:52.115844 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 09 09:12:52 crc kubenswrapper[4792]: I0309 09:12:52.549328 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Mar 09 09:12:52 crc kubenswrapper[4792]: I0309 09:12:52.642870 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 09 09:12:53 crc kubenswrapper[4792]: I0309 09:12:53.116491 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Mar 09 09:12:53 crc kubenswrapper[4792]: I0309 09:12:53.413121 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 09 09:12:53 crc kubenswrapper[4792]: I0309 09:12:53.738162 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 09 09:12:53 crc kubenswrapper[4792]: I0309 09:12:53.738272 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 09 09:12:53 crc kubenswrapper[4792]: I0309 09:12:53.790335 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 09 09:12:53 crc kubenswrapper[4792]: I0309 09:12:53.790418 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 09 09:12:53 crc kubenswrapper[4792]: I0309 09:12:53.790445 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 09:12:53 crc kubenswrapper[4792]: I0309 09:12:53.790469 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 09 09:12:53 crc kubenswrapper[4792]: I0309 09:12:53.790541 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 09 09:12:53 crc kubenswrapper[4792]: I0309 09:12:53.790502 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 09:12:53 crc kubenswrapper[4792]: I0309 09:12:53.790609 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 09:12:53 crc kubenswrapper[4792]: I0309 09:12:53.790682 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 09 09:12:53 crc kubenswrapper[4792]: I0309 09:12:53.790794 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 09:12:53 crc kubenswrapper[4792]: I0309 09:12:53.791042 4792 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Mar 09 09:12:53 crc kubenswrapper[4792]: I0309 09:12:53.791079 4792 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 09 09:12:53 crc kubenswrapper[4792]: I0309 09:12:53.791092 4792 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Mar 09 09:12:53 crc kubenswrapper[4792]: I0309 09:12:53.791103 4792 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Mar 09 09:12:53 crc kubenswrapper[4792]: I0309 09:12:53.800456 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 09:12:53 crc kubenswrapper[4792]: I0309 09:12:53.892525 4792 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 09 09:12:54 crc kubenswrapper[4792]: I0309 09:12:54.293519 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 09 09:12:54 crc kubenswrapper[4792]: I0309 09:12:54.293569 4792 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="3e73881ba97f1cf0bce321ba3597901e71ca01a914dc944fd4d20800266bad17" exitCode=137 Mar 09 09:12:54 crc kubenswrapper[4792]: I0309 09:12:54.293613 4792 scope.go:117] "RemoveContainer" containerID="3e73881ba97f1cf0bce321ba3597901e71ca01a914dc944fd4d20800266bad17" Mar 09 09:12:54 crc kubenswrapper[4792]: I0309 09:12:54.294161 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 09 09:12:54 crc kubenswrapper[4792]: I0309 09:12:54.327129 4792 scope.go:117] "RemoveContainer" containerID="3e73881ba97f1cf0bce321ba3597901e71ca01a914dc944fd4d20800266bad17" Mar 09 09:12:54 crc kubenswrapper[4792]: E0309 09:12:54.327805 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3e73881ba97f1cf0bce321ba3597901e71ca01a914dc944fd4d20800266bad17\": container with ID starting with 3e73881ba97f1cf0bce321ba3597901e71ca01a914dc944fd4d20800266bad17 not found: ID does not exist" containerID="3e73881ba97f1cf0bce321ba3597901e71ca01a914dc944fd4d20800266bad17" Mar 09 09:12:54 crc kubenswrapper[4792]: I0309 09:12:54.327871 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e73881ba97f1cf0bce321ba3597901e71ca01a914dc944fd4d20800266bad17"} err="failed to get container status \"3e73881ba97f1cf0bce321ba3597901e71ca01a914dc944fd4d20800266bad17\": rpc error: code = NotFound desc = could not find container \"3e73881ba97f1cf0bce321ba3597901e71ca01a914dc944fd4d20800266bad17\": container with ID starting with 3e73881ba97f1cf0bce321ba3597901e71ca01a914dc944fd4d20800266bad17 not found: ID does not exist" Mar 09 09:12:54 crc kubenswrapper[4792]: I0309 09:12:54.425512 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Mar 09 09:12:55 crc kubenswrapper[4792]: I0309 09:12:55.410878 4792 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-authentication/oauth-openshift-f6744b585-q7wcd" Mar 09 09:12:55 crc kubenswrapper[4792]: I0309 09:12:55.411244 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-f6744b585-q7wcd" Mar 09 09:12:55 crc kubenswrapper[4792]: I0309 09:12:55.411908 4792 scope.go:117] 
"RemoveContainer" containerID="088d1cc10ca67a2549f78a6f0a58e6b839eeaa6ebfa5d5ef6aa9bf85578dfbf9" Mar 09 09:12:55 crc kubenswrapper[4792]: E0309 09:12:55.412123 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oauth-openshift\" with CrashLoopBackOff: \"back-off 10s restarting failed container=oauth-openshift pod=oauth-openshift-f6744b585-q7wcd_openshift-authentication(c704bb94-c986-4eaa-b570-e5d6c3b6295b)\"" pod="openshift-authentication/oauth-openshift-f6744b585-q7wcd" podUID="c704bb94-c986-4eaa-b570-e5d6c3b6295b" Mar 09 09:12:55 crc kubenswrapper[4792]: I0309 09:12:55.669843 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Mar 09 09:12:55 crc kubenswrapper[4792]: I0309 09:12:55.670384 4792 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="" Mar 09 09:12:55 crc kubenswrapper[4792]: I0309 09:12:55.684955 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 09 09:12:55 crc kubenswrapper[4792]: I0309 09:12:55.685006 4792 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="18ea06d0-fcdc-4150-bba4-c2e8bb69d4c6" Mar 09 09:12:55 crc kubenswrapper[4792]: I0309 09:12:55.691699 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 09 09:12:55 crc kubenswrapper[4792]: I0309 09:12:55.691799 4792 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="18ea06d0-fcdc-4150-bba4-c2e8bb69d4c6" Mar 09 09:13:07 crc kubenswrapper[4792]: I0309 09:13:07.663484 4792 scope.go:117] "RemoveContainer" 
containerID="088d1cc10ca67a2549f78a6f0a58e6b839eeaa6ebfa5d5ef6aa9bf85578dfbf9" Mar 09 09:13:08 crc kubenswrapper[4792]: I0309 09:13:08.653776 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication_oauth-openshift-f6744b585-q7wcd_c704bb94-c986-4eaa-b570-e5d6c3b6295b/oauth-openshift/1.log" Mar 09 09:13:08 crc kubenswrapper[4792]: I0309 09:13:08.653830 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-f6744b585-q7wcd" event={"ID":"c704bb94-c986-4eaa-b570-e5d6c3b6295b","Type":"ContainerStarted","Data":"73a37d84e577f5cc50c324c42d90acc3c8b6d71e1392e770860a504b2f25801a"} Mar 09 09:13:08 crc kubenswrapper[4792]: I0309 09:13:08.654110 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-f6744b585-q7wcd" Mar 09 09:13:08 crc kubenswrapper[4792]: I0309 09:13:08.660702 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-f6744b585-q7wcd" Mar 09 09:13:08 crc kubenswrapper[4792]: I0309 09:13:08.711581 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-f6744b585-q7wcd" podStartSLOduration=67.711557567 podStartE2EDuration="1m7.711557567s" podCreationTimestamp="2026-03-09 09:12:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:13:08.687357386 +0000 UTC m=+353.717558158" watchObservedRunningTime="2026-03-09 09:13:08.711557567 +0000 UTC m=+353.741758319" Mar 09 09:13:09 crc kubenswrapper[4792]: I0309 09:13:09.662491 4792 generic.go:334] "Generic (PLEG): container finished" podID="488c4b65-9dcd-4303-a4b1-4c640fa9e0dd" containerID="356800b3e1ca7e31efdd5336d28b223e15989f4947eff2afd70a93b27f45c61c" exitCode=0 Mar 09 09:13:09 crc kubenswrapper[4792]: I0309 09:13:09.670464 4792 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-xgrs4" event={"ID":"488c4b65-9dcd-4303-a4b1-4c640fa9e0dd","Type":"ContainerDied","Data":"356800b3e1ca7e31efdd5336d28b223e15989f4947eff2afd70a93b27f45c61c"} Mar 09 09:13:09 crc kubenswrapper[4792]: I0309 09:13:09.671247 4792 scope.go:117] "RemoveContainer" containerID="356800b3e1ca7e31efdd5336d28b223e15989f4947eff2afd70a93b27f45c61c" Mar 09 09:13:10 crc kubenswrapper[4792]: I0309 09:13:10.670813 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-xgrs4" event={"ID":"488c4b65-9dcd-4303-a4b1-4c640fa9e0dd","Type":"ContainerStarted","Data":"2879877d0bda1b71228696a57f9ec7b1e55bbb5d2e1079338571a2cffc01949a"} Mar 09 09:13:10 crc kubenswrapper[4792]: I0309 09:13:10.671736 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-xgrs4" Mar 09 09:13:10 crc kubenswrapper[4792]: I0309 09:13:10.676735 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-xgrs4" Mar 09 09:14:00 crc kubenswrapper[4792]: I0309 09:14:00.132838 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550794-wqfmq"] Mar 09 09:14:00 crc kubenswrapper[4792]: E0309 09:14:00.134043 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 09 09:14:00 crc kubenswrapper[4792]: I0309 09:14:00.134059 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 09 09:14:00 crc kubenswrapper[4792]: I0309 09:14:00.134208 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 09 09:14:00 crc kubenswrapper[4792]: I0309 09:14:00.134769 4792 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550794-wqfmq" Mar 09 09:14:00 crc kubenswrapper[4792]: I0309 09:14:00.137542 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 09:14:00 crc kubenswrapper[4792]: I0309 09:14:00.139949 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fwclj" Mar 09 09:14:00 crc kubenswrapper[4792]: I0309 09:14:00.141868 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 09:14:00 crc kubenswrapper[4792]: I0309 09:14:00.144550 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550794-wqfmq"] Mar 09 09:14:00 crc kubenswrapper[4792]: I0309 09:14:00.199822 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c6z4z\" (UniqueName: \"kubernetes.io/projected/4b589961-be09-4888-9ee2-49bf55db091a-kube-api-access-c6z4z\") pod \"auto-csr-approver-29550794-wqfmq\" (UID: \"4b589961-be09-4888-9ee2-49bf55db091a\") " pod="openshift-infra/auto-csr-approver-29550794-wqfmq" Mar 09 09:14:00 crc kubenswrapper[4792]: I0309 09:14:00.301163 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c6z4z\" (UniqueName: \"kubernetes.io/projected/4b589961-be09-4888-9ee2-49bf55db091a-kube-api-access-c6z4z\") pod \"auto-csr-approver-29550794-wqfmq\" (UID: \"4b589961-be09-4888-9ee2-49bf55db091a\") " pod="openshift-infra/auto-csr-approver-29550794-wqfmq" Mar 09 09:14:00 crc kubenswrapper[4792]: I0309 09:14:00.323656 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c6z4z\" (UniqueName: \"kubernetes.io/projected/4b589961-be09-4888-9ee2-49bf55db091a-kube-api-access-c6z4z\") pod \"auto-csr-approver-29550794-wqfmq\" (UID: 
\"4b589961-be09-4888-9ee2-49bf55db091a\") " pod="openshift-infra/auto-csr-approver-29550794-wqfmq" Mar 09 09:14:00 crc kubenswrapper[4792]: I0309 09:14:00.452832 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550794-wqfmq" Mar 09 09:14:00 crc kubenswrapper[4792]: I0309 09:14:00.905342 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550794-wqfmq"] Mar 09 09:14:00 crc kubenswrapper[4792]: W0309 09:14:00.916835 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4b589961_be09_4888_9ee2_49bf55db091a.slice/crio-b39ce7f509754e52f547bfdc57cc2533834e63283c7b4ed630955523d47d4ab6 WatchSource:0}: Error finding container b39ce7f509754e52f547bfdc57cc2533834e63283c7b4ed630955523d47d4ab6: Status 404 returned error can't find the container with id b39ce7f509754e52f547bfdc57cc2533834e63283c7b4ed630955523d47d4ab6 Mar 09 09:14:00 crc kubenswrapper[4792]: I0309 09:14:00.988356 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550794-wqfmq" event={"ID":"4b589961-be09-4888-9ee2-49bf55db091a","Type":"ContainerStarted","Data":"b39ce7f509754e52f547bfdc57cc2533834e63283c7b4ed630955523d47d4ab6"} Mar 09 09:14:03 crc kubenswrapper[4792]: I0309 09:14:03.003648 4792 generic.go:334] "Generic (PLEG): container finished" podID="4b589961-be09-4888-9ee2-49bf55db091a" containerID="f3ad56061bf5596cb1f3785983e5f3c56a9884125309ab977bf20b656b7ae980" exitCode=0 Mar 09 09:14:03 crc kubenswrapper[4792]: I0309 09:14:03.004107 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550794-wqfmq" event={"ID":"4b589961-be09-4888-9ee2-49bf55db091a","Type":"ContainerDied","Data":"f3ad56061bf5596cb1f3785983e5f3c56a9884125309ab977bf20b656b7ae980"} Mar 09 09:14:04 crc kubenswrapper[4792]: I0309 09:14:04.331096 4792 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550794-wqfmq" Mar 09 09:14:04 crc kubenswrapper[4792]: I0309 09:14:04.462686 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c6z4z\" (UniqueName: \"kubernetes.io/projected/4b589961-be09-4888-9ee2-49bf55db091a-kube-api-access-c6z4z\") pod \"4b589961-be09-4888-9ee2-49bf55db091a\" (UID: \"4b589961-be09-4888-9ee2-49bf55db091a\") " Mar 09 09:14:04 crc kubenswrapper[4792]: I0309 09:14:04.469506 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b589961-be09-4888-9ee2-49bf55db091a-kube-api-access-c6z4z" (OuterVolumeSpecName: "kube-api-access-c6z4z") pod "4b589961-be09-4888-9ee2-49bf55db091a" (UID: "4b589961-be09-4888-9ee2-49bf55db091a"). InnerVolumeSpecName "kube-api-access-c6z4z". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:14:04 crc kubenswrapper[4792]: I0309 09:14:04.564646 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c6z4z\" (UniqueName: \"kubernetes.io/projected/4b589961-be09-4888-9ee2-49bf55db091a-kube-api-access-c6z4z\") on node \"crc\" DevicePath \"\"" Mar 09 09:14:05 crc kubenswrapper[4792]: I0309 09:14:05.021835 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550794-wqfmq" event={"ID":"4b589961-be09-4888-9ee2-49bf55db091a","Type":"ContainerDied","Data":"b39ce7f509754e52f547bfdc57cc2533834e63283c7b4ed630955523d47d4ab6"} Mar 09 09:14:05 crc kubenswrapper[4792]: I0309 09:14:05.021904 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b39ce7f509754e52f547bfdc57cc2533834e63283c7b4ed630955523d47d4ab6" Mar 09 09:14:05 crc kubenswrapper[4792]: I0309 09:14:05.021934 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550794-wqfmq" Mar 09 09:14:13 crc kubenswrapper[4792]: I0309 09:14:13.214163 4792 patch_prober.go:28] interesting pod/machine-config-daemon-97tth container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 09:14:13 crc kubenswrapper[4792]: I0309 09:14:13.215212 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 09:14:29 crc kubenswrapper[4792]: I0309 09:14:29.716905 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-t2jhd"] Mar 09 09:14:29 crc kubenswrapper[4792]: I0309 09:14:29.717929 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-t2jhd" podUID="69535a14-c11d-442a-837d-f1d6744cb530" containerName="registry-server" containerID="cri-o://c9e6280fb058b35d9209ff0bf6cb923f78f0837a9f2659084987aa1c5476dac8" gracePeriod=30 Mar 09 09:14:29 crc kubenswrapper[4792]: I0309 09:14:29.730671 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8hkmv"] Mar 09 09:14:29 crc kubenswrapper[4792]: I0309 09:14:29.730997 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-8hkmv" podUID="047810b2-277c-4d4c-822a-98b6d2a91fcc" containerName="registry-server" containerID="cri-o://29997cf552b1be261417eea2595d97e67a0cea55f8ee28656b602e5aee87155c" gracePeriod=30 Mar 09 09:14:29 crc kubenswrapper[4792]: I0309 09:14:29.743082 4792 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-xgrs4"] Mar 09 09:14:29 crc kubenswrapper[4792]: I0309 09:14:29.743346 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-xgrs4" podUID="488c4b65-9dcd-4303-a4b1-4c640fa9e0dd" containerName="marketplace-operator" containerID="cri-o://2879877d0bda1b71228696a57f9ec7b1e55bbb5d2e1079338571a2cffc01949a" gracePeriod=30 Mar 09 09:14:29 crc kubenswrapper[4792]: I0309 09:14:29.751716 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-8wm7x"] Mar 09 09:14:29 crc kubenswrapper[4792]: I0309 09:14:29.752110 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-8wm7x" podUID="8606aa7f-7b07-40df-b9b8-f415a5e68b47" containerName="registry-server" containerID="cri-o://8278f4c556a3c361ee76de1ec2559c92d8351c0452d5c28590575a683e28f53f" gracePeriod=30 Mar 09 09:14:29 crc kubenswrapper[4792]: I0309 09:14:29.769881 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-m64ct"] Mar 09 09:14:29 crc kubenswrapper[4792]: E0309 09:14:29.770167 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b589961-be09-4888-9ee2-49bf55db091a" containerName="oc" Mar 09 09:14:29 crc kubenswrapper[4792]: I0309 09:14:29.770189 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b589961-be09-4888-9ee2-49bf55db091a" containerName="oc" Mar 09 09:14:29 crc kubenswrapper[4792]: I0309 09:14:29.770316 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b589961-be09-4888-9ee2-49bf55db091a" containerName="oc" Mar 09 09:14:29 crc kubenswrapper[4792]: I0309 09:14:29.770762 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-m64ct" Mar 09 09:14:29 crc kubenswrapper[4792]: I0309 09:14:29.777799 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-jsnbn"] Mar 09 09:14:29 crc kubenswrapper[4792]: I0309 09:14:29.778018 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-jsnbn" podUID="0667075c-38b7-4fb6-ad69-a31987eae3cc" containerName="registry-server" containerID="cri-o://3a9bd1731a5918976ad5f77b3bae28fac98086487f7a1cbcc0de527b1f267e90" gracePeriod=30 Mar 09 09:14:29 crc kubenswrapper[4792]: I0309 09:14:29.796979 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-m64ct"] Mar 09 09:14:29 crc kubenswrapper[4792]: I0309 09:14:29.856698 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0589998d-961b-4184-9884-0ad5eee48348-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-m64ct\" (UID: \"0589998d-961b-4184-9884-0ad5eee48348\") " pod="openshift-marketplace/marketplace-operator-79b997595-m64ct" Mar 09 09:14:29 crc kubenswrapper[4792]: I0309 09:14:29.856764 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sw554\" (UniqueName: \"kubernetes.io/projected/0589998d-961b-4184-9884-0ad5eee48348-kube-api-access-sw554\") pod \"marketplace-operator-79b997595-m64ct\" (UID: \"0589998d-961b-4184-9884-0ad5eee48348\") " pod="openshift-marketplace/marketplace-operator-79b997595-m64ct" Mar 09 09:14:29 crc kubenswrapper[4792]: I0309 09:14:29.856787 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: 
\"kubernetes.io/secret/0589998d-961b-4184-9884-0ad5eee48348-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-m64ct\" (UID: \"0589998d-961b-4184-9884-0ad5eee48348\") " pod="openshift-marketplace/marketplace-operator-79b997595-m64ct" Mar 09 09:14:29 crc kubenswrapper[4792]: I0309 09:14:29.957467 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0589998d-961b-4184-9884-0ad5eee48348-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-m64ct\" (UID: \"0589998d-961b-4184-9884-0ad5eee48348\") " pod="openshift-marketplace/marketplace-operator-79b997595-m64ct" Mar 09 09:14:29 crc kubenswrapper[4792]: I0309 09:14:29.957546 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sw554\" (UniqueName: \"kubernetes.io/projected/0589998d-961b-4184-9884-0ad5eee48348-kube-api-access-sw554\") pod \"marketplace-operator-79b997595-m64ct\" (UID: \"0589998d-961b-4184-9884-0ad5eee48348\") " pod="openshift-marketplace/marketplace-operator-79b997595-m64ct" Mar 09 09:14:29 crc kubenswrapper[4792]: I0309 09:14:29.957567 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/0589998d-961b-4184-9884-0ad5eee48348-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-m64ct\" (UID: \"0589998d-961b-4184-9884-0ad5eee48348\") " pod="openshift-marketplace/marketplace-operator-79b997595-m64ct" Mar 09 09:14:29 crc kubenswrapper[4792]: I0309 09:14:29.959230 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0589998d-961b-4184-9884-0ad5eee48348-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-m64ct\" (UID: \"0589998d-961b-4184-9884-0ad5eee48348\") " pod="openshift-marketplace/marketplace-operator-79b997595-m64ct" Mar 09 09:14:29 
crc kubenswrapper[4792]: I0309 09:14:29.968005 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/0589998d-961b-4184-9884-0ad5eee48348-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-m64ct\" (UID: \"0589998d-961b-4184-9884-0ad5eee48348\") " pod="openshift-marketplace/marketplace-operator-79b997595-m64ct" Mar 09 09:14:29 crc kubenswrapper[4792]: I0309 09:14:29.993239 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sw554\" (UniqueName: \"kubernetes.io/projected/0589998d-961b-4184-9884-0ad5eee48348-kube-api-access-sw554\") pod \"marketplace-operator-79b997595-m64ct\" (UID: \"0589998d-961b-4184-9884-0ad5eee48348\") " pod="openshift-marketplace/marketplace-operator-79b997595-m64ct" Mar 09 09:14:30 crc kubenswrapper[4792]: I0309 09:14:30.094332 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-m64ct" Mar 09 09:14:30 crc kubenswrapper[4792]: I0309 09:14:30.140801 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-t2jhd" Mar 09 09:14:30 crc kubenswrapper[4792]: I0309 09:14:30.188997 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8wm7x" Mar 09 09:14:30 crc kubenswrapper[4792]: I0309 09:14:30.191756 4792 generic.go:334] "Generic (PLEG): container finished" podID="0667075c-38b7-4fb6-ad69-a31987eae3cc" containerID="3a9bd1731a5918976ad5f77b3bae28fac98086487f7a1cbcc0de527b1f267e90" exitCode=0 Mar 09 09:14:30 crc kubenswrapper[4792]: I0309 09:14:30.191803 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jsnbn" event={"ID":"0667075c-38b7-4fb6-ad69-a31987eae3cc","Type":"ContainerDied","Data":"3a9bd1731a5918976ad5f77b3bae28fac98086487f7a1cbcc0de527b1f267e90"} Mar 09 09:14:30 crc kubenswrapper[4792]: E0309 09:14:30.193174 4792 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod69535a14_c11d_442a_837d_f1d6744cb530.slice/crio-conmon-c9e6280fb058b35d9209ff0bf6cb923f78f0837a9f2659084987aa1c5476dac8.scope\": RecentStats: unable to find data in memory cache]" Mar 09 09:14:30 crc kubenswrapper[4792]: I0309 09:14:30.199652 4792 generic.go:334] "Generic (PLEG): container finished" podID="69535a14-c11d-442a-837d-f1d6744cb530" containerID="c9e6280fb058b35d9209ff0bf6cb923f78f0837a9f2659084987aa1c5476dac8" exitCode=0 Mar 09 09:14:30 crc kubenswrapper[4792]: I0309 09:14:30.200085 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t2jhd" event={"ID":"69535a14-c11d-442a-837d-f1d6744cb530","Type":"ContainerDied","Data":"c9e6280fb058b35d9209ff0bf6cb923f78f0837a9f2659084987aa1c5476dac8"} Mar 09 09:14:30 crc kubenswrapper[4792]: I0309 09:14:30.201855 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t2jhd" event={"ID":"69535a14-c11d-442a-837d-f1d6744cb530","Type":"ContainerDied","Data":"c1f9edc92ab1748cf13549f8902d5a0ea8ba2420f2858e6080f6aafff25ab893"} Mar 09 09:14:30 crc 
kubenswrapper[4792]: I0309 09:14:30.201888 4792 scope.go:117] "RemoveContainer" containerID="c9e6280fb058b35d9209ff0bf6cb923f78f0837a9f2659084987aa1c5476dac8" Mar 09 09:14:30 crc kubenswrapper[4792]: I0309 09:14:30.201982 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-t2jhd" Mar 09 09:14:30 crc kubenswrapper[4792]: I0309 09:14:30.229914 4792 scope.go:117] "RemoveContainer" containerID="d2b10ee92a9779d15f6d383d469f9c54a2a5c72d8b8e4181b2214aa0a5a6ce91" Mar 09 09:14:30 crc kubenswrapper[4792]: I0309 09:14:30.234888 4792 generic.go:334] "Generic (PLEG): container finished" podID="047810b2-277c-4d4c-822a-98b6d2a91fcc" containerID="29997cf552b1be261417eea2595d97e67a0cea55f8ee28656b602e5aee87155c" exitCode=0 Mar 09 09:14:30 crc kubenswrapper[4792]: I0309 09:14:30.235033 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8hkmv" event={"ID":"047810b2-277c-4d4c-822a-98b6d2a91fcc","Type":"ContainerDied","Data":"29997cf552b1be261417eea2595d97e67a0cea55f8ee28656b602e5aee87155c"} Mar 09 09:14:30 crc kubenswrapper[4792]: I0309 09:14:30.247268 4792 generic.go:334] "Generic (PLEG): container finished" podID="488c4b65-9dcd-4303-a4b1-4c640fa9e0dd" containerID="2879877d0bda1b71228696a57f9ec7b1e55bbb5d2e1079338571a2cffc01949a" exitCode=0 Mar 09 09:14:30 crc kubenswrapper[4792]: I0309 09:14:30.247419 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-xgrs4" event={"ID":"488c4b65-9dcd-4303-a4b1-4c640fa9e0dd","Type":"ContainerDied","Data":"2879877d0bda1b71228696a57f9ec7b1e55bbb5d2e1079338571a2cffc01949a"} Mar 09 09:14:30 crc kubenswrapper[4792]: I0309 09:14:30.249484 4792 generic.go:334] "Generic (PLEG): container finished" podID="8606aa7f-7b07-40df-b9b8-f415a5e68b47" containerID="8278f4c556a3c361ee76de1ec2559c92d8351c0452d5c28590575a683e28f53f" exitCode=0 Mar 09 09:14:30 crc kubenswrapper[4792]: 
I0309 09:14:30.249504 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8wm7x" event={"ID":"8606aa7f-7b07-40df-b9b8-f415a5e68b47","Type":"ContainerDied","Data":"8278f4c556a3c361ee76de1ec2559c92d8351c0452d5c28590575a683e28f53f"} Mar 09 09:14:30 crc kubenswrapper[4792]: I0309 09:14:30.249518 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8wm7x" event={"ID":"8606aa7f-7b07-40df-b9b8-f415a5e68b47","Type":"ContainerDied","Data":"1e448a8e07592f1f8b2e44bb7c61b0e7f69e72bcc9f3209b1c84e0e8f9f03744"} Mar 09 09:14:30 crc kubenswrapper[4792]: I0309 09:14:30.249581 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8wm7x" Mar 09 09:14:30 crc kubenswrapper[4792]: I0309 09:14:30.261260 4792 scope.go:117] "RemoveContainer" containerID="5e9a4dcf83e7458f8f8bbefb6ed7b2354b23971b3aac7969732fc50d8fa4d880" Mar 09 09:14:30 crc kubenswrapper[4792]: I0309 09:14:30.264464 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8606aa7f-7b07-40df-b9b8-f415a5e68b47-catalog-content\") pod \"8606aa7f-7b07-40df-b9b8-f415a5e68b47\" (UID: \"8606aa7f-7b07-40df-b9b8-f415a5e68b47\") " Mar 09 09:14:30 crc kubenswrapper[4792]: I0309 09:14:30.264536 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/69535a14-c11d-442a-837d-f1d6744cb530-catalog-content\") pod \"69535a14-c11d-442a-837d-f1d6744cb530\" (UID: \"69535a14-c11d-442a-837d-f1d6744cb530\") " Mar 09 09:14:30 crc kubenswrapper[4792]: I0309 09:14:30.264563 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mdjsz\" (UniqueName: \"kubernetes.io/projected/8606aa7f-7b07-40df-b9b8-f415a5e68b47-kube-api-access-mdjsz\") pod 
\"8606aa7f-7b07-40df-b9b8-f415a5e68b47\" (UID: \"8606aa7f-7b07-40df-b9b8-f415a5e68b47\") " Mar 09 09:14:30 crc kubenswrapper[4792]: I0309 09:14:30.264630 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/69535a14-c11d-442a-837d-f1d6744cb530-utilities\") pod \"69535a14-c11d-442a-837d-f1d6744cb530\" (UID: \"69535a14-c11d-442a-837d-f1d6744cb530\") " Mar 09 09:14:30 crc kubenswrapper[4792]: I0309 09:14:30.264662 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8606aa7f-7b07-40df-b9b8-f415a5e68b47-utilities\") pod \"8606aa7f-7b07-40df-b9b8-f415a5e68b47\" (UID: \"8606aa7f-7b07-40df-b9b8-f415a5e68b47\") " Mar 09 09:14:30 crc kubenswrapper[4792]: I0309 09:14:30.264691 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4lq57\" (UniqueName: \"kubernetes.io/projected/69535a14-c11d-442a-837d-f1d6744cb530-kube-api-access-4lq57\") pod \"69535a14-c11d-442a-837d-f1d6744cb530\" (UID: \"69535a14-c11d-442a-837d-f1d6744cb530\") " Mar 09 09:14:30 crc kubenswrapper[4792]: I0309 09:14:30.266225 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/69535a14-c11d-442a-837d-f1d6744cb530-utilities" (OuterVolumeSpecName: "utilities") pod "69535a14-c11d-442a-837d-f1d6744cb530" (UID: "69535a14-c11d-442a-837d-f1d6744cb530"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:14:30 crc kubenswrapper[4792]: I0309 09:14:30.266978 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8606aa7f-7b07-40df-b9b8-f415a5e68b47-utilities" (OuterVolumeSpecName: "utilities") pod "8606aa7f-7b07-40df-b9b8-f415a5e68b47" (UID: "8606aa7f-7b07-40df-b9b8-f415a5e68b47"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:14:30 crc kubenswrapper[4792]: I0309 09:14:30.278595 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69535a14-c11d-442a-837d-f1d6744cb530-kube-api-access-4lq57" (OuterVolumeSpecName: "kube-api-access-4lq57") pod "69535a14-c11d-442a-837d-f1d6744cb530" (UID: "69535a14-c11d-442a-837d-f1d6744cb530"). InnerVolumeSpecName "kube-api-access-4lq57". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:14:30 crc kubenswrapper[4792]: I0309 09:14:30.278675 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8606aa7f-7b07-40df-b9b8-f415a5e68b47-kube-api-access-mdjsz" (OuterVolumeSpecName: "kube-api-access-mdjsz") pod "8606aa7f-7b07-40df-b9b8-f415a5e68b47" (UID: "8606aa7f-7b07-40df-b9b8-f415a5e68b47"). InnerVolumeSpecName "kube-api-access-mdjsz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:14:30 crc kubenswrapper[4792]: I0309 09:14:30.283619 4792 scope.go:117] "RemoveContainer" containerID="c9e6280fb058b35d9209ff0bf6cb923f78f0837a9f2659084987aa1c5476dac8" Mar 09 09:14:30 crc kubenswrapper[4792]: E0309 09:14:30.285634 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c9e6280fb058b35d9209ff0bf6cb923f78f0837a9f2659084987aa1c5476dac8\": container with ID starting with c9e6280fb058b35d9209ff0bf6cb923f78f0837a9f2659084987aa1c5476dac8 not found: ID does not exist" containerID="c9e6280fb058b35d9209ff0bf6cb923f78f0837a9f2659084987aa1c5476dac8" Mar 09 09:14:30 crc kubenswrapper[4792]: I0309 09:14:30.285677 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c9e6280fb058b35d9209ff0bf6cb923f78f0837a9f2659084987aa1c5476dac8"} err="failed to get container status \"c9e6280fb058b35d9209ff0bf6cb923f78f0837a9f2659084987aa1c5476dac8\": rpc error: code = NotFound desc = 
could not find container \"c9e6280fb058b35d9209ff0bf6cb923f78f0837a9f2659084987aa1c5476dac8\": container with ID starting with c9e6280fb058b35d9209ff0bf6cb923f78f0837a9f2659084987aa1c5476dac8 not found: ID does not exist" Mar 09 09:14:30 crc kubenswrapper[4792]: I0309 09:14:30.285708 4792 scope.go:117] "RemoveContainer" containerID="d2b10ee92a9779d15f6d383d469f9c54a2a5c72d8b8e4181b2214aa0a5a6ce91" Mar 09 09:14:30 crc kubenswrapper[4792]: E0309 09:14:30.287208 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d2b10ee92a9779d15f6d383d469f9c54a2a5c72d8b8e4181b2214aa0a5a6ce91\": container with ID starting with d2b10ee92a9779d15f6d383d469f9c54a2a5c72d8b8e4181b2214aa0a5a6ce91 not found: ID does not exist" containerID="d2b10ee92a9779d15f6d383d469f9c54a2a5c72d8b8e4181b2214aa0a5a6ce91" Mar 09 09:14:30 crc kubenswrapper[4792]: I0309 09:14:30.287236 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d2b10ee92a9779d15f6d383d469f9c54a2a5c72d8b8e4181b2214aa0a5a6ce91"} err="failed to get container status \"d2b10ee92a9779d15f6d383d469f9c54a2a5c72d8b8e4181b2214aa0a5a6ce91\": rpc error: code = NotFound desc = could not find container \"d2b10ee92a9779d15f6d383d469f9c54a2a5c72d8b8e4181b2214aa0a5a6ce91\": container with ID starting with d2b10ee92a9779d15f6d383d469f9c54a2a5c72d8b8e4181b2214aa0a5a6ce91 not found: ID does not exist" Mar 09 09:14:30 crc kubenswrapper[4792]: I0309 09:14:30.287252 4792 scope.go:117] "RemoveContainer" containerID="5e9a4dcf83e7458f8f8bbefb6ed7b2354b23971b3aac7969732fc50d8fa4d880" Mar 09 09:14:30 crc kubenswrapper[4792]: E0309 09:14:30.287559 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5e9a4dcf83e7458f8f8bbefb6ed7b2354b23971b3aac7969732fc50d8fa4d880\": container with ID starting with 5e9a4dcf83e7458f8f8bbefb6ed7b2354b23971b3aac7969732fc50d8fa4d880 not 
found: ID does not exist" containerID="5e9a4dcf83e7458f8f8bbefb6ed7b2354b23971b3aac7969732fc50d8fa4d880" Mar 09 09:14:30 crc kubenswrapper[4792]: I0309 09:14:30.287577 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e9a4dcf83e7458f8f8bbefb6ed7b2354b23971b3aac7969732fc50d8fa4d880"} err="failed to get container status \"5e9a4dcf83e7458f8f8bbefb6ed7b2354b23971b3aac7969732fc50d8fa4d880\": rpc error: code = NotFound desc = could not find container \"5e9a4dcf83e7458f8f8bbefb6ed7b2354b23971b3aac7969732fc50d8fa4d880\": container with ID starting with 5e9a4dcf83e7458f8f8bbefb6ed7b2354b23971b3aac7969732fc50d8fa4d880 not found: ID does not exist" Mar 09 09:14:30 crc kubenswrapper[4792]: I0309 09:14:30.287593 4792 scope.go:117] "RemoveContainer" containerID="356800b3e1ca7e31efdd5336d28b223e15989f4947eff2afd70a93b27f45c61c" Mar 09 09:14:30 crc kubenswrapper[4792]: I0309 09:14:30.310011 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8606aa7f-7b07-40df-b9b8-f415a5e68b47-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8606aa7f-7b07-40df-b9b8-f415a5e68b47" (UID: "8606aa7f-7b07-40df-b9b8-f415a5e68b47"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:14:30 crc kubenswrapper[4792]: I0309 09:14:30.328769 4792 scope.go:117] "RemoveContainer" containerID="8278f4c556a3c361ee76de1ec2559c92d8351c0452d5c28590575a683e28f53f" Mar 09 09:14:30 crc kubenswrapper[4792]: I0309 09:14:30.345341 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-m64ct"] Mar 09 09:14:30 crc kubenswrapper[4792]: I0309 09:14:30.348518 4792 scope.go:117] "RemoveContainer" containerID="01e44686b530731d6555ea659a282e8642532fd8317bb2537db520507c2a5389" Mar 09 09:14:30 crc kubenswrapper[4792]: I0309 09:14:30.362983 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/69535a14-c11d-442a-837d-f1d6744cb530-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "69535a14-c11d-442a-837d-f1d6744cb530" (UID: "69535a14-c11d-442a-837d-f1d6744cb530"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:14:30 crc kubenswrapper[4792]: I0309 09:14:30.366143 4792 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8606aa7f-7b07-40df-b9b8-f415a5e68b47-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 09:14:30 crc kubenswrapper[4792]: I0309 09:14:30.366192 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4lq57\" (UniqueName: \"kubernetes.io/projected/69535a14-c11d-442a-837d-f1d6744cb530-kube-api-access-4lq57\") on node \"crc\" DevicePath \"\"" Mar 09 09:14:30 crc kubenswrapper[4792]: I0309 09:14:30.366208 4792 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8606aa7f-7b07-40df-b9b8-f415a5e68b47-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 09:14:30 crc kubenswrapper[4792]: I0309 09:14:30.366219 4792 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" 
(UniqueName: \"kubernetes.io/empty-dir/69535a14-c11d-442a-837d-f1d6744cb530-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 09:14:30 crc kubenswrapper[4792]: I0309 09:14:30.366240 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mdjsz\" (UniqueName: \"kubernetes.io/projected/8606aa7f-7b07-40df-b9b8-f415a5e68b47-kube-api-access-mdjsz\") on node \"crc\" DevicePath \"\"" Mar 09 09:14:30 crc kubenswrapper[4792]: I0309 09:14:30.366252 4792 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/69535a14-c11d-442a-837d-f1d6744cb530-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 09:14:30 crc kubenswrapper[4792]: W0309 09:14:30.382104 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0589998d_961b_4184_9884_0ad5eee48348.slice/crio-5d97734c31076ed7a45d593686f973c26242af277552bdaadc4b9355a5b1c19c WatchSource:0}: Error finding container 5d97734c31076ed7a45d593686f973c26242af277552bdaadc4b9355a5b1c19c: Status 404 returned error can't find the container with id 5d97734c31076ed7a45d593686f973c26242af277552bdaadc4b9355a5b1c19c Mar 09 09:14:30 crc kubenswrapper[4792]: I0309 09:14:30.387340 4792 scope.go:117] "RemoveContainer" containerID="a1391a99fc45bb492441a99ff7814d38f3c9f1abb528e937ebeffaa002859319" Mar 09 09:14:30 crc kubenswrapper[4792]: I0309 09:14:30.414423 4792 scope.go:117] "RemoveContainer" containerID="8278f4c556a3c361ee76de1ec2559c92d8351c0452d5c28590575a683e28f53f" Mar 09 09:14:30 crc kubenswrapper[4792]: E0309 09:14:30.417779 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8278f4c556a3c361ee76de1ec2559c92d8351c0452d5c28590575a683e28f53f\": container with ID starting with 8278f4c556a3c361ee76de1ec2559c92d8351c0452d5c28590575a683e28f53f not found: ID does not exist" 
containerID="8278f4c556a3c361ee76de1ec2559c92d8351c0452d5c28590575a683e28f53f" Mar 09 09:14:30 crc kubenswrapper[4792]: I0309 09:14:30.417815 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8278f4c556a3c361ee76de1ec2559c92d8351c0452d5c28590575a683e28f53f"} err="failed to get container status \"8278f4c556a3c361ee76de1ec2559c92d8351c0452d5c28590575a683e28f53f\": rpc error: code = NotFound desc = could not find container \"8278f4c556a3c361ee76de1ec2559c92d8351c0452d5c28590575a683e28f53f\": container with ID starting with 8278f4c556a3c361ee76de1ec2559c92d8351c0452d5c28590575a683e28f53f not found: ID does not exist" Mar 09 09:14:30 crc kubenswrapper[4792]: I0309 09:14:30.417837 4792 scope.go:117] "RemoveContainer" containerID="01e44686b530731d6555ea659a282e8642532fd8317bb2537db520507c2a5389" Mar 09 09:14:30 crc kubenswrapper[4792]: E0309 09:14:30.419360 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"01e44686b530731d6555ea659a282e8642532fd8317bb2537db520507c2a5389\": container with ID starting with 01e44686b530731d6555ea659a282e8642532fd8317bb2537db520507c2a5389 not found: ID does not exist" containerID="01e44686b530731d6555ea659a282e8642532fd8317bb2537db520507c2a5389" Mar 09 09:14:30 crc kubenswrapper[4792]: I0309 09:14:30.419393 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"01e44686b530731d6555ea659a282e8642532fd8317bb2537db520507c2a5389"} err="failed to get container status \"01e44686b530731d6555ea659a282e8642532fd8317bb2537db520507c2a5389\": rpc error: code = NotFound desc = could not find container \"01e44686b530731d6555ea659a282e8642532fd8317bb2537db520507c2a5389\": container with ID starting with 01e44686b530731d6555ea659a282e8642532fd8317bb2537db520507c2a5389 not found: ID does not exist" Mar 09 09:14:30 crc kubenswrapper[4792]: I0309 09:14:30.419414 4792 scope.go:117] 
"RemoveContainer" containerID="a1391a99fc45bb492441a99ff7814d38f3c9f1abb528e937ebeffaa002859319" Mar 09 09:14:30 crc kubenswrapper[4792]: E0309 09:14:30.419989 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a1391a99fc45bb492441a99ff7814d38f3c9f1abb528e937ebeffaa002859319\": container with ID starting with a1391a99fc45bb492441a99ff7814d38f3c9f1abb528e937ebeffaa002859319 not found: ID does not exist" containerID="a1391a99fc45bb492441a99ff7814d38f3c9f1abb528e937ebeffaa002859319" Mar 09 09:14:30 crc kubenswrapper[4792]: I0309 09:14:30.420012 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1391a99fc45bb492441a99ff7814d38f3c9f1abb528e937ebeffaa002859319"} err="failed to get container status \"a1391a99fc45bb492441a99ff7814d38f3c9f1abb528e937ebeffaa002859319\": rpc error: code = NotFound desc = could not find container \"a1391a99fc45bb492441a99ff7814d38f3c9f1abb528e937ebeffaa002859319\": container with ID starting with a1391a99fc45bb492441a99ff7814d38f3c9f1abb528e937ebeffaa002859319 not found: ID does not exist" Mar 09 09:14:30 crc kubenswrapper[4792]: I0309 09:14:30.588017 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-8hkmv" Mar 09 09:14:30 crc kubenswrapper[4792]: I0309 09:14:30.598556 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-t2jhd"] Mar 09 09:14:30 crc kubenswrapper[4792]: I0309 09:14:30.645720 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-t2jhd"] Mar 09 09:14:30 crc kubenswrapper[4792]: I0309 09:14:30.654206 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-8wm7x"] Mar 09 09:14:30 crc kubenswrapper[4792]: I0309 09:14:30.656951 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-8wm7x"] Mar 09 09:14:30 crc kubenswrapper[4792]: I0309 09:14:30.675580 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bsfpd\" (UniqueName: \"kubernetes.io/projected/047810b2-277c-4d4c-822a-98b6d2a91fcc-kube-api-access-bsfpd\") pod \"047810b2-277c-4d4c-822a-98b6d2a91fcc\" (UID: \"047810b2-277c-4d4c-822a-98b6d2a91fcc\") " Mar 09 09:14:30 crc kubenswrapper[4792]: I0309 09:14:30.675656 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/047810b2-277c-4d4c-822a-98b6d2a91fcc-catalog-content\") pod \"047810b2-277c-4d4c-822a-98b6d2a91fcc\" (UID: \"047810b2-277c-4d4c-822a-98b6d2a91fcc\") " Mar 09 09:14:30 crc kubenswrapper[4792]: I0309 09:14:30.675680 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/047810b2-277c-4d4c-822a-98b6d2a91fcc-utilities\") pod \"047810b2-277c-4d4c-822a-98b6d2a91fcc\" (UID: \"047810b2-277c-4d4c-822a-98b6d2a91fcc\") " Mar 09 09:14:30 crc kubenswrapper[4792]: I0309 09:14:30.683425 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/047810b2-277c-4d4c-822a-98b6d2a91fcc-utilities" (OuterVolumeSpecName: "utilities") pod "047810b2-277c-4d4c-822a-98b6d2a91fcc" (UID: "047810b2-277c-4d4c-822a-98b6d2a91fcc"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:14:30 crc kubenswrapper[4792]: I0309 09:14:30.684725 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/047810b2-277c-4d4c-822a-98b6d2a91fcc-kube-api-access-bsfpd" (OuterVolumeSpecName: "kube-api-access-bsfpd") pod "047810b2-277c-4d4c-822a-98b6d2a91fcc" (UID: "047810b2-277c-4d4c-822a-98b6d2a91fcc"). InnerVolumeSpecName "kube-api-access-bsfpd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:14:30 crc kubenswrapper[4792]: I0309 09:14:30.715711 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jsnbn" Mar 09 09:14:30 crc kubenswrapper[4792]: I0309 09:14:30.732058 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-xgrs4" Mar 09 09:14:30 crc kubenswrapper[4792]: I0309 09:14:30.750251 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/047810b2-277c-4d4c-822a-98b6d2a91fcc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "047810b2-277c-4d4c-822a-98b6d2a91fcc" (UID: "047810b2-277c-4d4c-822a-98b6d2a91fcc"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:14:30 crc kubenswrapper[4792]: I0309 09:14:30.776964 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bsfpd\" (UniqueName: \"kubernetes.io/projected/047810b2-277c-4d4c-822a-98b6d2a91fcc-kube-api-access-bsfpd\") on node \"crc\" DevicePath \"\"" Mar 09 09:14:30 crc kubenswrapper[4792]: I0309 09:14:30.776994 4792 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/047810b2-277c-4d4c-822a-98b6d2a91fcc-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 09:14:30 crc kubenswrapper[4792]: I0309 09:14:30.777004 4792 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/047810b2-277c-4d4c-822a-98b6d2a91fcc-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 09:14:30 crc kubenswrapper[4792]: I0309 09:14:30.877575 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0667075c-38b7-4fb6-ad69-a31987eae3cc-catalog-content\") pod \"0667075c-38b7-4fb6-ad69-a31987eae3cc\" (UID: \"0667075c-38b7-4fb6-ad69-a31987eae3cc\") " Mar 09 09:14:30 crc kubenswrapper[4792]: I0309 09:14:30.878089 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m2j2c\" (UniqueName: \"kubernetes.io/projected/0667075c-38b7-4fb6-ad69-a31987eae3cc-kube-api-access-m2j2c\") pod \"0667075c-38b7-4fb6-ad69-a31987eae3cc\" (UID: \"0667075c-38b7-4fb6-ad69-a31987eae3cc\") " Mar 09 09:14:30 crc kubenswrapper[4792]: I0309 09:14:30.878168 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/488c4b65-9dcd-4303-a4b1-4c640fa9e0dd-marketplace-operator-metrics\") pod \"488c4b65-9dcd-4303-a4b1-4c640fa9e0dd\" (UID: \"488c4b65-9dcd-4303-a4b1-4c640fa9e0dd\") " Mar 09 09:14:30 crc 
kubenswrapper[4792]: I0309 09:14:30.878190 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-77qcg\" (UniqueName: \"kubernetes.io/projected/488c4b65-9dcd-4303-a4b1-4c640fa9e0dd-kube-api-access-77qcg\") pod \"488c4b65-9dcd-4303-a4b1-4c640fa9e0dd\" (UID: \"488c4b65-9dcd-4303-a4b1-4c640fa9e0dd\") "
Mar 09 09:14:30 crc kubenswrapper[4792]: I0309 09:14:30.878229 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0667075c-38b7-4fb6-ad69-a31987eae3cc-utilities\") pod \"0667075c-38b7-4fb6-ad69-a31987eae3cc\" (UID: \"0667075c-38b7-4fb6-ad69-a31987eae3cc\") "
Mar 09 09:14:30 crc kubenswrapper[4792]: I0309 09:14:30.878250 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/488c4b65-9dcd-4303-a4b1-4c640fa9e0dd-marketplace-trusted-ca\") pod \"488c4b65-9dcd-4303-a4b1-4c640fa9e0dd\" (UID: \"488c4b65-9dcd-4303-a4b1-4c640fa9e0dd\") "
Mar 09 09:14:30 crc kubenswrapper[4792]: I0309 09:14:30.879020 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/488c4b65-9dcd-4303-a4b1-4c640fa9e0dd-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "488c4b65-9dcd-4303-a4b1-4c640fa9e0dd" (UID: "488c4b65-9dcd-4303-a4b1-4c640fa9e0dd"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:14:30 crc kubenswrapper[4792]: I0309 09:14:30.880917 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0667075c-38b7-4fb6-ad69-a31987eae3cc-utilities" (OuterVolumeSpecName: "utilities") pod "0667075c-38b7-4fb6-ad69-a31987eae3cc" (UID: "0667075c-38b7-4fb6-ad69-a31987eae3cc"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 09:14:30 crc kubenswrapper[4792]: I0309 09:14:30.887231 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0667075c-38b7-4fb6-ad69-a31987eae3cc-kube-api-access-m2j2c" (OuterVolumeSpecName: "kube-api-access-m2j2c") pod "0667075c-38b7-4fb6-ad69-a31987eae3cc" (UID: "0667075c-38b7-4fb6-ad69-a31987eae3cc"). InnerVolumeSpecName "kube-api-access-m2j2c". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:14:30 crc kubenswrapper[4792]: I0309 09:14:30.887540 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/488c4b65-9dcd-4303-a4b1-4c640fa9e0dd-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "488c4b65-9dcd-4303-a4b1-4c640fa9e0dd" (UID: "488c4b65-9dcd-4303-a4b1-4c640fa9e0dd"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:14:30 crc kubenswrapper[4792]: I0309 09:14:30.889429 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/488c4b65-9dcd-4303-a4b1-4c640fa9e0dd-kube-api-access-77qcg" (OuterVolumeSpecName: "kube-api-access-77qcg") pod "488c4b65-9dcd-4303-a4b1-4c640fa9e0dd" (UID: "488c4b65-9dcd-4303-a4b1-4c640fa9e0dd"). InnerVolumeSpecName "kube-api-access-77qcg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:14:30 crc kubenswrapper[4792]: I0309 09:14:30.980234 4792 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/488c4b65-9dcd-4303-a4b1-4c640fa9e0dd-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\""
Mar 09 09:14:30 crc kubenswrapper[4792]: I0309 09:14:30.980717 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-77qcg\" (UniqueName: \"kubernetes.io/projected/488c4b65-9dcd-4303-a4b1-4c640fa9e0dd-kube-api-access-77qcg\") on node \"crc\" DevicePath \"\""
Mar 09 09:14:30 crc kubenswrapper[4792]: I0309 09:14:30.980783 4792 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0667075c-38b7-4fb6-ad69-a31987eae3cc-utilities\") on node \"crc\" DevicePath \"\""
Mar 09 09:14:30 crc kubenswrapper[4792]: I0309 09:14:30.980866 4792 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/488c4b65-9dcd-4303-a4b1-4c640fa9e0dd-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\""
Mar 09 09:14:30 crc kubenswrapper[4792]: I0309 09:14:30.980924 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m2j2c\" (UniqueName: \"kubernetes.io/projected/0667075c-38b7-4fb6-ad69-a31987eae3cc-kube-api-access-m2j2c\") on node \"crc\" DevicePath \"\""
Mar 09 09:14:31 crc kubenswrapper[4792]: I0309 09:14:31.023358 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0667075c-38b7-4fb6-ad69-a31987eae3cc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0667075c-38b7-4fb6-ad69-a31987eae3cc" (UID: "0667075c-38b7-4fb6-ad69-a31987eae3cc"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 09:14:31 crc kubenswrapper[4792]: I0309 09:14:31.081892 4792 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0667075c-38b7-4fb6-ad69-a31987eae3cc-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 09 09:14:31 crc kubenswrapper[4792]: I0309 09:14:31.257164 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jsnbn" event={"ID":"0667075c-38b7-4fb6-ad69-a31987eae3cc","Type":"ContainerDied","Data":"db3e667d578beaad49f5dd59c243f8ec8d7845f66db32a821bed09a6c5fc133c"}
Mar 09 09:14:31 crc kubenswrapper[4792]: I0309 09:14:31.257547 4792 scope.go:117] "RemoveContainer" containerID="3a9bd1731a5918976ad5f77b3bae28fac98086487f7a1cbcc0de527b1f267e90"
Mar 09 09:14:31 crc kubenswrapper[4792]: I0309 09:14:31.257195 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jsnbn"
Mar 09 09:14:31 crc kubenswrapper[4792]: I0309 09:14:31.261095 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8hkmv" event={"ID":"047810b2-277c-4d4c-822a-98b6d2a91fcc","Type":"ContainerDied","Data":"8bd2b905a42ac2a7914349acf7483e6a96b627bc84262d46412438104a33559b"}
Mar 09 09:14:31 crc kubenswrapper[4792]: I0309 09:14:31.261237 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8hkmv"
Mar 09 09:14:31 crc kubenswrapper[4792]: I0309 09:14:31.262855 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-m64ct" event={"ID":"0589998d-961b-4184-9884-0ad5eee48348","Type":"ContainerStarted","Data":"d0b030cf3d1ba71cbf1bcd87e466aab5732fd5cfe76f2c145956c007e57e1133"}
Mar 09 09:14:31 crc kubenswrapper[4792]: I0309 09:14:31.262896 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-m64ct" event={"ID":"0589998d-961b-4184-9884-0ad5eee48348","Type":"ContainerStarted","Data":"5d97734c31076ed7a45d593686f973c26242af277552bdaadc4b9355a5b1c19c"}
Mar 09 09:14:31 crc kubenswrapper[4792]: I0309 09:14:31.263840 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-m64ct"
Mar 09 09:14:31 crc kubenswrapper[4792]: I0309 09:14:31.266433 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-xgrs4" event={"ID":"488c4b65-9dcd-4303-a4b1-4c640fa9e0dd","Type":"ContainerDied","Data":"5fea6fd1132bfc00969a6558a6ed8706318829514fdd79249675c0c2c8d0b8dd"}
Mar 09 09:14:31 crc kubenswrapper[4792]: I0309 09:14:31.266576 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-xgrs4"
Mar 09 09:14:31 crc kubenswrapper[4792]: I0309 09:14:31.270973 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-m64ct"
Mar 09 09:14:31 crc kubenswrapper[4792]: I0309 09:14:31.290419 4792 scope.go:117] "RemoveContainer" containerID="0a2450ea3aee9ffbc6ba511e5ba06e3f63e426a3d5a7691de28f7994af64228c"
Mar 09 09:14:31 crc kubenswrapper[4792]: I0309 09:14:31.291826 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-m64ct" podStartSLOduration=2.291815599 podStartE2EDuration="2.291815599s" podCreationTimestamp="2026-03-09 09:14:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:14:31.291121941 +0000 UTC m=+436.321322683" watchObservedRunningTime="2026-03-09 09:14:31.291815599 +0000 UTC m=+436.322016351"
Mar 09 09:14:31 crc kubenswrapper[4792]: I0309 09:14:31.323797 4792 scope.go:117] "RemoveContainer" containerID="347d9ce1a7155a77f88a083b877b6c70fe58fb8197d05530eaf63170c0c44239"
Mar 09 09:14:31 crc kubenswrapper[4792]: I0309 09:14:31.352665 4792 scope.go:117] "RemoveContainer" containerID="29997cf552b1be261417eea2595d97e67a0cea55f8ee28656b602e5aee87155c"
Mar 09 09:14:31 crc kubenswrapper[4792]: I0309 09:14:31.375269 4792 scope.go:117] "RemoveContainer" containerID="815164b353e5081b24d2413cf5c92d3a572812d2d8925e1e5317af231c0d02f7"
Mar 09 09:14:31 crc kubenswrapper[4792]: I0309 09:14:31.375598 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-jsnbn"]
Mar 09 09:14:31 crc kubenswrapper[4792]: I0309 09:14:31.379809 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-jsnbn"]
Mar 09 09:14:31 crc kubenswrapper[4792]: I0309 09:14:31.391611 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8hkmv"]
Mar 09 09:14:31 crc kubenswrapper[4792]: I0309 09:14:31.396870 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-8hkmv"]
Mar 09 09:14:31 crc kubenswrapper[4792]: I0309 09:14:31.400168 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-xgrs4"]
Mar 09 09:14:31 crc kubenswrapper[4792]: I0309 09:14:31.402455 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-xgrs4"]
Mar 09 09:14:31 crc kubenswrapper[4792]: I0309 09:14:31.406758 4792 scope.go:117] "RemoveContainer" containerID="673e39c1476b6f6837498d128df6d61369642de88ed9d4adf2f1d891294db1d0"
Mar 09 09:14:31 crc kubenswrapper[4792]: I0309 09:14:31.423299 4792 scope.go:117] "RemoveContainer" containerID="2879877d0bda1b71228696a57f9ec7b1e55bbb5d2e1079338571a2cffc01949a"
Mar 09 09:14:31 crc kubenswrapper[4792]: I0309 09:14:31.667571 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="047810b2-277c-4d4c-822a-98b6d2a91fcc" path="/var/lib/kubelet/pods/047810b2-277c-4d4c-822a-98b6d2a91fcc/volumes"
Mar 09 09:14:31 crc kubenswrapper[4792]: I0309 09:14:31.668418 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0667075c-38b7-4fb6-ad69-a31987eae3cc" path="/var/lib/kubelet/pods/0667075c-38b7-4fb6-ad69-a31987eae3cc/volumes"
Mar 09 09:14:31 crc kubenswrapper[4792]: I0309 09:14:31.669157 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="488c4b65-9dcd-4303-a4b1-4c640fa9e0dd" path="/var/lib/kubelet/pods/488c4b65-9dcd-4303-a4b1-4c640fa9e0dd/volumes"
Mar 09 09:14:31 crc kubenswrapper[4792]: I0309 09:14:31.670237 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="69535a14-c11d-442a-837d-f1d6744cb530" path="/var/lib/kubelet/pods/69535a14-c11d-442a-837d-f1d6744cb530/volumes"
Mar 09 09:14:31 crc kubenswrapper[4792]: I0309 09:14:31.671206 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8606aa7f-7b07-40df-b9b8-f415a5e68b47" path="/var/lib/kubelet/pods/8606aa7f-7b07-40df-b9b8-f415a5e68b47/volumes"
Mar 09 09:14:31 crc kubenswrapper[4792]: I0309 09:14:31.928703 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-2rnrf"]
Mar 09 09:14:31 crc kubenswrapper[4792]: E0309 09:14:31.928890 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69535a14-c11d-442a-837d-f1d6744cb530" containerName="extract-content"
Mar 09 09:14:31 crc kubenswrapper[4792]: I0309 09:14:31.928902 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="69535a14-c11d-442a-837d-f1d6744cb530" containerName="extract-content"
Mar 09 09:14:31 crc kubenswrapper[4792]: E0309 09:14:31.928913 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8606aa7f-7b07-40df-b9b8-f415a5e68b47" containerName="extract-content"
Mar 09 09:14:31 crc kubenswrapper[4792]: I0309 09:14:31.928919 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="8606aa7f-7b07-40df-b9b8-f415a5e68b47" containerName="extract-content"
Mar 09 09:14:31 crc kubenswrapper[4792]: E0309 09:14:31.928926 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="047810b2-277c-4d4c-822a-98b6d2a91fcc" containerName="extract-utilities"
Mar 09 09:14:31 crc kubenswrapper[4792]: I0309 09:14:31.928932 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="047810b2-277c-4d4c-822a-98b6d2a91fcc" containerName="extract-utilities"
Mar 09 09:14:31 crc kubenswrapper[4792]: E0309 09:14:31.928939 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="047810b2-277c-4d4c-822a-98b6d2a91fcc" containerName="registry-server"
Mar 09 09:14:31 crc kubenswrapper[4792]: I0309 09:14:31.928945 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="047810b2-277c-4d4c-822a-98b6d2a91fcc" containerName="registry-server"
Mar 09 09:14:31 crc kubenswrapper[4792]: E0309 09:14:31.928953 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8606aa7f-7b07-40df-b9b8-f415a5e68b47" containerName="extract-utilities"
Mar 09 09:14:31 crc kubenswrapper[4792]: I0309 09:14:31.928959 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="8606aa7f-7b07-40df-b9b8-f415a5e68b47" containerName="extract-utilities"
Mar 09 09:14:31 crc kubenswrapper[4792]: E0309 09:14:31.928967 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69535a14-c11d-442a-837d-f1d6744cb530" containerName="registry-server"
Mar 09 09:14:31 crc kubenswrapper[4792]: I0309 09:14:31.928973 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="69535a14-c11d-442a-837d-f1d6744cb530" containerName="registry-server"
Mar 09 09:14:31 crc kubenswrapper[4792]: E0309 09:14:31.928982 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="047810b2-277c-4d4c-822a-98b6d2a91fcc" containerName="extract-content"
Mar 09 09:14:31 crc kubenswrapper[4792]: I0309 09:14:31.928988 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="047810b2-277c-4d4c-822a-98b6d2a91fcc" containerName="extract-content"
Mar 09 09:14:31 crc kubenswrapper[4792]: E0309 09:14:31.928996 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0667075c-38b7-4fb6-ad69-a31987eae3cc" containerName="registry-server"
Mar 09 09:14:31 crc kubenswrapper[4792]: I0309 09:14:31.929001 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="0667075c-38b7-4fb6-ad69-a31987eae3cc" containerName="registry-server"
Mar 09 09:14:31 crc kubenswrapper[4792]: E0309 09:14:31.929039 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0667075c-38b7-4fb6-ad69-a31987eae3cc" containerName="extract-content"
Mar 09 09:14:31 crc kubenswrapper[4792]: I0309 09:14:31.929045 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="0667075c-38b7-4fb6-ad69-a31987eae3cc" containerName="extract-content"
Mar 09 09:14:31 crc kubenswrapper[4792]: E0309 09:14:31.929055 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69535a14-c11d-442a-837d-f1d6744cb530" containerName="extract-utilities"
Mar 09 09:14:31 crc kubenswrapper[4792]: I0309 09:14:31.929060 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="69535a14-c11d-442a-837d-f1d6744cb530" containerName="extract-utilities"
Mar 09 09:14:31 crc kubenswrapper[4792]: E0309 09:14:31.929081 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="488c4b65-9dcd-4303-a4b1-4c640fa9e0dd" containerName="marketplace-operator"
Mar 09 09:14:31 crc kubenswrapper[4792]: I0309 09:14:31.929089 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="488c4b65-9dcd-4303-a4b1-4c640fa9e0dd" containerName="marketplace-operator"
Mar 09 09:14:31 crc kubenswrapper[4792]: E0309 09:14:31.929099 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="488c4b65-9dcd-4303-a4b1-4c640fa9e0dd" containerName="marketplace-operator"
Mar 09 09:14:31 crc kubenswrapper[4792]: I0309 09:14:31.929106 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="488c4b65-9dcd-4303-a4b1-4c640fa9e0dd" containerName="marketplace-operator"
Mar 09 09:14:31 crc kubenswrapper[4792]: E0309 09:14:31.929114 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8606aa7f-7b07-40df-b9b8-f415a5e68b47" containerName="registry-server"
Mar 09 09:14:31 crc kubenswrapper[4792]: I0309 09:14:31.929119 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="8606aa7f-7b07-40df-b9b8-f415a5e68b47" containerName="registry-server"
Mar 09 09:14:31 crc kubenswrapper[4792]: E0309 09:14:31.929127 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0667075c-38b7-4fb6-ad69-a31987eae3cc" containerName="extract-utilities"
Mar 09 09:14:31 crc kubenswrapper[4792]: I0309 09:14:31.929134 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="0667075c-38b7-4fb6-ad69-a31987eae3cc" containerName="extract-utilities"
Mar 09 09:14:31 crc kubenswrapper[4792]: I0309 09:14:31.929216 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="047810b2-277c-4d4c-822a-98b6d2a91fcc" containerName="registry-server"
Mar 09 09:14:31 crc kubenswrapper[4792]: I0309 09:14:31.929223 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="488c4b65-9dcd-4303-a4b1-4c640fa9e0dd" containerName="marketplace-operator"
Mar 09 09:14:31 crc kubenswrapper[4792]: I0309 09:14:31.929235 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="0667075c-38b7-4fb6-ad69-a31987eae3cc" containerName="registry-server"
Mar 09 09:14:31 crc kubenswrapper[4792]: I0309 09:14:31.929242 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="8606aa7f-7b07-40df-b9b8-f415a5e68b47" containerName="registry-server"
Mar 09 09:14:31 crc kubenswrapper[4792]: I0309 09:14:31.929251 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="69535a14-c11d-442a-837d-f1d6744cb530" containerName="registry-server"
Mar 09 09:14:31 crc kubenswrapper[4792]: I0309 09:14:31.929412 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="488c4b65-9dcd-4303-a4b1-4c640fa9e0dd" containerName="marketplace-operator"
Mar 09 09:14:31 crc kubenswrapper[4792]: I0309 09:14:31.929887 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2rnrf"
Mar 09 09:14:31 crc kubenswrapper[4792]: I0309 09:14:31.933762 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb"
Mar 09 09:14:31 crc kubenswrapper[4792]: I0309 09:14:31.942429 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2rnrf"]
Mar 09 09:14:32 crc kubenswrapper[4792]: I0309 09:14:32.097028 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6ac8fc64-583a-420b-b356-cfa0491d9b6f-catalog-content\") pod \"redhat-marketplace-2rnrf\" (UID: \"6ac8fc64-583a-420b-b356-cfa0491d9b6f\") " pod="openshift-marketplace/redhat-marketplace-2rnrf"
Mar 09 09:14:32 crc kubenswrapper[4792]: I0309 09:14:32.097123 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6ac8fc64-583a-420b-b356-cfa0491d9b6f-utilities\") pod \"redhat-marketplace-2rnrf\" (UID: \"6ac8fc64-583a-420b-b356-cfa0491d9b6f\") " pod="openshift-marketplace/redhat-marketplace-2rnrf"
Mar 09 09:14:32 crc kubenswrapper[4792]: I0309 09:14:32.097156 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rrp97\" (UniqueName: \"kubernetes.io/projected/6ac8fc64-583a-420b-b356-cfa0491d9b6f-kube-api-access-rrp97\") pod \"redhat-marketplace-2rnrf\" (UID: \"6ac8fc64-583a-420b-b356-cfa0491d9b6f\") " pod="openshift-marketplace/redhat-marketplace-2rnrf"
Mar 09 09:14:32 crc kubenswrapper[4792]: I0309 09:14:32.132007 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-mkwxh"]
Mar 09 09:14:32 crc kubenswrapper[4792]: I0309 09:14:32.133772 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mkwxh"
Mar 09 09:14:32 crc kubenswrapper[4792]: I0309 09:14:32.135755 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Mar 09 09:14:32 crc kubenswrapper[4792]: I0309 09:14:32.146957 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mkwxh"]
Mar 09 09:14:32 crc kubenswrapper[4792]: I0309 09:14:32.198501 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6ac8fc64-583a-420b-b356-cfa0491d9b6f-catalog-content\") pod \"redhat-marketplace-2rnrf\" (UID: \"6ac8fc64-583a-420b-b356-cfa0491d9b6f\") " pod="openshift-marketplace/redhat-marketplace-2rnrf"
Mar 09 09:14:32 crc kubenswrapper[4792]: I0309 09:14:32.198577 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6ac8fc64-583a-420b-b356-cfa0491d9b6f-utilities\") pod \"redhat-marketplace-2rnrf\" (UID: \"6ac8fc64-583a-420b-b356-cfa0491d9b6f\") " pod="openshift-marketplace/redhat-marketplace-2rnrf"
Mar 09 09:14:32 crc kubenswrapper[4792]: I0309 09:14:32.198618 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rrp97\" (UniqueName: \"kubernetes.io/projected/6ac8fc64-583a-420b-b356-cfa0491d9b6f-kube-api-access-rrp97\") pod \"redhat-marketplace-2rnrf\" (UID: \"6ac8fc64-583a-420b-b356-cfa0491d9b6f\") " pod="openshift-marketplace/redhat-marketplace-2rnrf"
Mar 09 09:14:32 crc kubenswrapper[4792]: I0309 09:14:32.199029 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6ac8fc64-583a-420b-b356-cfa0491d9b6f-utilities\") pod \"redhat-marketplace-2rnrf\" (UID: \"6ac8fc64-583a-420b-b356-cfa0491d9b6f\") " pod="openshift-marketplace/redhat-marketplace-2rnrf"
Mar 09 09:14:32 crc kubenswrapper[4792]: I0309 09:14:32.199130 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6ac8fc64-583a-420b-b356-cfa0491d9b6f-catalog-content\") pod \"redhat-marketplace-2rnrf\" (UID: \"6ac8fc64-583a-420b-b356-cfa0491d9b6f\") " pod="openshift-marketplace/redhat-marketplace-2rnrf"
Mar 09 09:14:32 crc kubenswrapper[4792]: I0309 09:14:32.218246 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rrp97\" (UniqueName: \"kubernetes.io/projected/6ac8fc64-583a-420b-b356-cfa0491d9b6f-kube-api-access-rrp97\") pod \"redhat-marketplace-2rnrf\" (UID: \"6ac8fc64-583a-420b-b356-cfa0491d9b6f\") " pod="openshift-marketplace/redhat-marketplace-2rnrf"
Mar 09 09:14:32 crc kubenswrapper[4792]: I0309 09:14:32.255740 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2rnrf"
Mar 09 09:14:32 crc kubenswrapper[4792]: I0309 09:14:32.299843 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gvmv8\" (UniqueName: \"kubernetes.io/projected/2bce5f9c-c863-4962-a276-2b5a3a69def9-kube-api-access-gvmv8\") pod \"certified-operators-mkwxh\" (UID: \"2bce5f9c-c863-4962-a276-2b5a3a69def9\") " pod="openshift-marketplace/certified-operators-mkwxh"
Mar 09 09:14:32 crc kubenswrapper[4792]: I0309 09:14:32.299915 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2bce5f9c-c863-4962-a276-2b5a3a69def9-utilities\") pod \"certified-operators-mkwxh\" (UID: \"2bce5f9c-c863-4962-a276-2b5a3a69def9\") " pod="openshift-marketplace/certified-operators-mkwxh"
Mar 09 09:14:32 crc kubenswrapper[4792]: I0309 09:14:32.299938 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2bce5f9c-c863-4962-a276-2b5a3a69def9-catalog-content\") pod \"certified-operators-mkwxh\" (UID: \"2bce5f9c-c863-4962-a276-2b5a3a69def9\") " pod="openshift-marketplace/certified-operators-mkwxh"
Mar 09 09:14:32 crc kubenswrapper[4792]: I0309 09:14:32.401655 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2bce5f9c-c863-4962-a276-2b5a3a69def9-utilities\") pod \"certified-operators-mkwxh\" (UID: \"2bce5f9c-c863-4962-a276-2b5a3a69def9\") " pod="openshift-marketplace/certified-operators-mkwxh"
Mar 09 09:14:32 crc kubenswrapper[4792]: I0309 09:14:32.401713 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2bce5f9c-c863-4962-a276-2b5a3a69def9-catalog-content\") pod \"certified-operators-mkwxh\" (UID: \"2bce5f9c-c863-4962-a276-2b5a3a69def9\") " pod="openshift-marketplace/certified-operators-mkwxh"
Mar 09 09:14:32 crc kubenswrapper[4792]: I0309 09:14:32.401749 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gvmv8\" (UniqueName: \"kubernetes.io/projected/2bce5f9c-c863-4962-a276-2b5a3a69def9-kube-api-access-gvmv8\") pod \"certified-operators-mkwxh\" (UID: \"2bce5f9c-c863-4962-a276-2b5a3a69def9\") " pod="openshift-marketplace/certified-operators-mkwxh"
Mar 09 09:14:32 crc kubenswrapper[4792]: I0309 09:14:32.402663 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2bce5f9c-c863-4962-a276-2b5a3a69def9-catalog-content\") pod \"certified-operators-mkwxh\" (UID: \"2bce5f9c-c863-4962-a276-2b5a3a69def9\") " pod="openshift-marketplace/certified-operators-mkwxh"
Mar 09 09:14:32 crc kubenswrapper[4792]: I0309 09:14:32.402873 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2bce5f9c-c863-4962-a276-2b5a3a69def9-utilities\") pod \"certified-operators-mkwxh\" (UID: \"2bce5f9c-c863-4962-a276-2b5a3a69def9\") " pod="openshift-marketplace/certified-operators-mkwxh"
Mar 09 09:14:32 crc kubenswrapper[4792]: I0309 09:14:32.423380 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gvmv8\" (UniqueName: \"kubernetes.io/projected/2bce5f9c-c863-4962-a276-2b5a3a69def9-kube-api-access-gvmv8\") pod \"certified-operators-mkwxh\" (UID: \"2bce5f9c-c863-4962-a276-2b5a3a69def9\") " pod="openshift-marketplace/certified-operators-mkwxh"
Mar 09 09:14:32 crc kubenswrapper[4792]: I0309 09:14:32.453784 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mkwxh"
Mar 09 09:14:32 crc kubenswrapper[4792]: I0309 09:14:32.612371 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mkwxh"]
Mar 09 09:14:32 crc kubenswrapper[4792]: W0309 09:14:32.618838 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2bce5f9c_c863_4962_a276_2b5a3a69def9.slice/crio-76caf74f44534759663a90d8cdc621a7747749d430b111d82391a35db93f0189 WatchSource:0}: Error finding container 76caf74f44534759663a90d8cdc621a7747749d430b111d82391a35db93f0189: Status 404 returned error can't find the container with id 76caf74f44534759663a90d8cdc621a7747749d430b111d82391a35db93f0189
Mar 09 09:14:32 crc kubenswrapper[4792]: I0309 09:14:32.643171 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2rnrf"]
Mar 09 09:14:32 crc kubenswrapper[4792]: W0309 09:14:32.651266 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6ac8fc64_583a_420b_b356_cfa0491d9b6f.slice/crio-9e0eee3fa42d67cae4f6b9b4ec942e348c7325f3ce866fe6dcbc2caaed37b2d6 WatchSource:0}: Error finding container 9e0eee3fa42d67cae4f6b9b4ec942e348c7325f3ce866fe6dcbc2caaed37b2d6: Status 404 returned error can't find the container with id 9e0eee3fa42d67cae4f6b9b4ec942e348c7325f3ce866fe6dcbc2caaed37b2d6
Mar 09 09:14:33 crc kubenswrapper[4792]: I0309 09:14:33.287050 4792 generic.go:334] "Generic (PLEG): container finished" podID="2bce5f9c-c863-4962-a276-2b5a3a69def9" containerID="4cb361059dbea8023d77273385a04f7d7a6cfdcecbb464731da37b586cd81abd" exitCode=0
Mar 09 09:14:33 crc kubenswrapper[4792]: I0309 09:14:33.287544 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mkwxh" event={"ID":"2bce5f9c-c863-4962-a276-2b5a3a69def9","Type":"ContainerDied","Data":"4cb361059dbea8023d77273385a04f7d7a6cfdcecbb464731da37b586cd81abd"}
Mar 09 09:14:33 crc kubenswrapper[4792]: I0309 09:14:33.287573 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mkwxh" event={"ID":"2bce5f9c-c863-4962-a276-2b5a3a69def9","Type":"ContainerStarted","Data":"76caf74f44534759663a90d8cdc621a7747749d430b111d82391a35db93f0189"}
Mar 09 09:14:33 crc kubenswrapper[4792]: I0309 09:14:33.305416 4792 generic.go:334] "Generic (PLEG): container finished" podID="6ac8fc64-583a-420b-b356-cfa0491d9b6f" containerID="a9105ffe48b20d03fa7ba916f68e354eeb761fda4bf05beda4c90cc83e2c4474" exitCode=0
Mar 09 09:14:33 crc kubenswrapper[4792]: I0309 09:14:33.306599 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2rnrf" event={"ID":"6ac8fc64-583a-420b-b356-cfa0491d9b6f","Type":"ContainerDied","Data":"a9105ffe48b20d03fa7ba916f68e354eeb761fda4bf05beda4c90cc83e2c4474"}
Mar 09 09:14:33 crc kubenswrapper[4792]: I0309 09:14:33.306627 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2rnrf" event={"ID":"6ac8fc64-583a-420b-b356-cfa0491d9b6f","Type":"ContainerStarted","Data":"9e0eee3fa42d67cae4f6b9b4ec942e348c7325f3ce866fe6dcbc2caaed37b2d6"}
Mar 09 09:14:34 crc kubenswrapper[4792]: I0309 09:14:34.286745 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-dj9wv"]
Mar 09 09:14:34 crc kubenswrapper[4792]: I0309 09:14:34.287479 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-dj9wv"
Mar 09 09:14:34 crc kubenswrapper[4792]: I0309 09:14:34.312476 4792 generic.go:334] "Generic (PLEG): container finished" podID="6ac8fc64-583a-420b-b356-cfa0491d9b6f" containerID="f024a9175af0db8d4bba93fcf51e88a85f0711bb1d9a11357e75cba3be6b68a7" exitCode=0
Mar 09 09:14:34 crc kubenswrapper[4792]: I0309 09:14:34.312779 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2rnrf" event={"ID":"6ac8fc64-583a-420b-b356-cfa0491d9b6f","Type":"ContainerDied","Data":"f024a9175af0db8d4bba93fcf51e88a85f0711bb1d9a11357e75cba3be6b68a7"}
Mar 09 09:14:34 crc kubenswrapper[4792]: I0309 09:14:34.313429 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-dj9wv"]
Mar 09 09:14:34 crc kubenswrapper[4792]: I0309 09:14:34.318494 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mkwxh" event={"ID":"2bce5f9c-c863-4962-a276-2b5a3a69def9","Type":"ContainerStarted","Data":"cae4a82fd56ff81ae984ef0522247de1bb190e8a8af76ad20995b917652c5ac5"}
Mar 09 09:14:34 crc kubenswrapper[4792]: I0309 09:14:34.362336 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-ms9zz"]
Mar 09 09:14:34 crc kubenswrapper[4792]: I0309 09:14:34.363234 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ms9zz"
Mar 09 09:14:34 crc kubenswrapper[4792]: I0309 09:14:34.367148 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh"
Mar 09 09:14:34 crc kubenswrapper[4792]: I0309 09:14:34.379617 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ms9zz"]
Mar 09 09:14:34 crc kubenswrapper[4792]: I0309 09:14:34.428931 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c3855d8f-3750-433b-9638-6070f72a3d61-bound-sa-token\") pod \"image-registry-66df7c8f76-dj9wv\" (UID: \"c3855d8f-3750-433b-9638-6070f72a3d61\") " pod="openshift-image-registry/image-registry-66df7c8f76-dj9wv"
Mar 09 09:14:34 crc kubenswrapper[4792]: I0309 09:14:34.428976 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9rswz\" (UniqueName: \"kubernetes.io/projected/c3855d8f-3750-433b-9638-6070f72a3d61-kube-api-access-9rswz\") pod \"image-registry-66df7c8f76-dj9wv\" (UID: \"c3855d8f-3750-433b-9638-6070f72a3d61\") " pod="openshift-image-registry/image-registry-66df7c8f76-dj9wv"
Mar 09 09:14:34 crc kubenswrapper[4792]: I0309 09:14:34.429037 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c3855d8f-3750-433b-9638-6070f72a3d61-registry-tls\") pod \"image-registry-66df7c8f76-dj9wv\" (UID: \"c3855d8f-3750-433b-9638-6070f72a3d61\") " pod="openshift-image-registry/image-registry-66df7c8f76-dj9wv"
Mar 09 09:14:34 crc kubenswrapper[4792]: I0309 09:14:34.429057 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c3855d8f-3750-433b-9638-6070f72a3d61-registry-certificates\") pod \"image-registry-66df7c8f76-dj9wv\" (UID: \"c3855d8f-3750-433b-9638-6070f72a3d61\") " pod="openshift-image-registry/image-registry-66df7c8f76-dj9wv"
Mar 09 09:14:34 crc kubenswrapper[4792]: I0309 09:14:34.429127 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c3855d8f-3750-433b-9638-6070f72a3d61-installation-pull-secrets\") pod \"image-registry-66df7c8f76-dj9wv\" (UID: \"c3855d8f-3750-433b-9638-6070f72a3d61\") " pod="openshift-image-registry/image-registry-66df7c8f76-dj9wv"
Mar 09 09:14:34 crc kubenswrapper[4792]: I0309 09:14:34.429172 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-dj9wv\" (UID: \"c3855d8f-3750-433b-9638-6070f72a3d61\") " pod="openshift-image-registry/image-registry-66df7c8f76-dj9wv"
Mar 09 09:14:34 crc kubenswrapper[4792]: I0309 09:14:34.429216 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c3855d8f-3750-433b-9638-6070f72a3d61-trusted-ca\") pod \"image-registry-66df7c8f76-dj9wv\" (UID: \"c3855d8f-3750-433b-9638-6070f72a3d61\") " pod="openshift-image-registry/image-registry-66df7c8f76-dj9wv"
Mar 09 09:14:34 crc kubenswrapper[4792]: I0309 09:14:34.429232 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c3855d8f-3750-433b-9638-6070f72a3d61-ca-trust-extracted\") pod \"image-registry-66df7c8f76-dj9wv\" (UID: \"c3855d8f-3750-433b-9638-6070f72a3d61\") " pod="openshift-image-registry/image-registry-66df7c8f76-dj9wv"
Mar 09 09:14:34 crc kubenswrapper[4792]: I0309 09:14:34.451987 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-dj9wv\" (UID: \"c3855d8f-3750-433b-9638-6070f72a3d61\") " pod="openshift-image-registry/image-registry-66df7c8f76-dj9wv"
Mar 09 09:14:34 crc kubenswrapper[4792]: I0309 09:14:34.530448 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-r4pf7"]
Mar 09 09:14:34 crc kubenswrapper[4792]: I0309 09:14:34.530821 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wwtkd\" (UniqueName: \"kubernetes.io/projected/a807ba61-7b14-443b-a870-6220b51d2bd6-kube-api-access-wwtkd\") pod \"redhat-operators-ms9zz\" (UID: \"a807ba61-7b14-443b-a870-6220b51d2bd6\") " pod="openshift-marketplace/redhat-operators-ms9zz"
Mar 09 09:14:34 crc kubenswrapper[4792]: I0309 09:14:34.530934 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c3855d8f-3750-433b-9638-6070f72a3d61-trusted-ca\") pod \"image-registry-66df7c8f76-dj9wv\" (UID: \"c3855d8f-3750-433b-9638-6070f72a3d61\") " pod="openshift-image-registry/image-registry-66df7c8f76-dj9wv"
Mar 09 09:14:34 crc kubenswrapper[4792]: I0309 09:14:34.530979 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c3855d8f-3750-433b-9638-6070f72a3d61-ca-trust-extracted\") pod \"image-registry-66df7c8f76-dj9wv\" (UID: \"c3855d8f-3750-433b-9638-6070f72a3d61\") " pod="openshift-image-registry/image-registry-66df7c8f76-dj9wv"
Mar 09 09:14:34 crc kubenswrapper[4792]: I0309 09:14:34.531026 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume
\"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c3855d8f-3750-433b-9638-6070f72a3d61-bound-sa-token\") pod \"image-registry-66df7c8f76-dj9wv\" (UID: \"c3855d8f-3750-433b-9638-6070f72a3d61\") " pod="openshift-image-registry/image-registry-66df7c8f76-dj9wv" Mar 09 09:14:34 crc kubenswrapper[4792]: I0309 09:14:34.531061 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9rswz\" (UniqueName: \"kubernetes.io/projected/c3855d8f-3750-433b-9638-6070f72a3d61-kube-api-access-9rswz\") pod \"image-registry-66df7c8f76-dj9wv\" (UID: \"c3855d8f-3750-433b-9638-6070f72a3d61\") " pod="openshift-image-registry/image-registry-66df7c8f76-dj9wv" Mar 09 09:14:34 crc kubenswrapper[4792]: I0309 09:14:34.531156 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a807ba61-7b14-443b-a870-6220b51d2bd6-utilities\") pod \"redhat-operators-ms9zz\" (UID: \"a807ba61-7b14-443b-a870-6220b51d2bd6\") " pod="openshift-marketplace/redhat-operators-ms9zz" Mar 09 09:14:34 crc kubenswrapper[4792]: I0309 09:14:34.531190 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a807ba61-7b14-443b-a870-6220b51d2bd6-catalog-content\") pod \"redhat-operators-ms9zz\" (UID: \"a807ba61-7b14-443b-a870-6220b51d2bd6\") " pod="openshift-marketplace/redhat-operators-ms9zz" Mar 09 09:14:34 crc kubenswrapper[4792]: I0309 09:14:34.531240 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c3855d8f-3750-433b-9638-6070f72a3d61-registry-tls\") pod \"image-registry-66df7c8f76-dj9wv\" (UID: \"c3855d8f-3750-433b-9638-6070f72a3d61\") " pod="openshift-image-registry/image-registry-66df7c8f76-dj9wv" Mar 09 09:14:34 crc kubenswrapper[4792]: I0309 09:14:34.531280 4792 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c3855d8f-3750-433b-9638-6070f72a3d61-registry-certificates\") pod \"image-registry-66df7c8f76-dj9wv\" (UID: \"c3855d8f-3750-433b-9638-6070f72a3d61\") " pod="openshift-image-registry/image-registry-66df7c8f76-dj9wv" Mar 09 09:14:34 crc kubenswrapper[4792]: I0309 09:14:34.531316 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c3855d8f-3750-433b-9638-6070f72a3d61-installation-pull-secrets\") pod \"image-registry-66df7c8f76-dj9wv\" (UID: \"c3855d8f-3750-433b-9638-6070f72a3d61\") " pod="openshift-image-registry/image-registry-66df7c8f76-dj9wv" Mar 09 09:14:34 crc kubenswrapper[4792]: I0309 09:14:34.531459 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-r4pf7" Mar 09 09:14:34 crc kubenswrapper[4792]: I0309 09:14:34.533013 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c3855d8f-3750-433b-9638-6070f72a3d61-trusted-ca\") pod \"image-registry-66df7c8f76-dj9wv\" (UID: \"c3855d8f-3750-433b-9638-6070f72a3d61\") " pod="openshift-image-registry/image-registry-66df7c8f76-dj9wv" Mar 09 09:14:34 crc kubenswrapper[4792]: I0309 09:14:34.533246 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c3855d8f-3750-433b-9638-6070f72a3d61-ca-trust-extracted\") pod \"image-registry-66df7c8f76-dj9wv\" (UID: \"c3855d8f-3750-433b-9638-6070f72a3d61\") " pod="openshift-image-registry/image-registry-66df7c8f76-dj9wv" Mar 09 09:14:34 crc kubenswrapper[4792]: I0309 09:14:34.533973 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: 
\"kubernetes.io/configmap/c3855d8f-3750-433b-9638-6070f72a3d61-registry-certificates\") pod \"image-registry-66df7c8f76-dj9wv\" (UID: \"c3855d8f-3750-433b-9638-6070f72a3d61\") " pod="openshift-image-registry/image-registry-66df7c8f76-dj9wv" Mar 09 09:14:34 crc kubenswrapper[4792]: I0309 09:14:34.537171 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 09 09:14:34 crc kubenswrapper[4792]: I0309 09:14:34.540427 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c3855d8f-3750-433b-9638-6070f72a3d61-registry-tls\") pod \"image-registry-66df7c8f76-dj9wv\" (UID: \"c3855d8f-3750-433b-9638-6070f72a3d61\") " pod="openshift-image-registry/image-registry-66df7c8f76-dj9wv" Mar 09 09:14:34 crc kubenswrapper[4792]: I0309 09:14:34.543870 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-r4pf7"] Mar 09 09:14:34 crc kubenswrapper[4792]: I0309 09:14:34.545052 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c3855d8f-3750-433b-9638-6070f72a3d61-installation-pull-secrets\") pod \"image-registry-66df7c8f76-dj9wv\" (UID: \"c3855d8f-3750-433b-9638-6070f72a3d61\") " pod="openshift-image-registry/image-registry-66df7c8f76-dj9wv" Mar 09 09:14:34 crc kubenswrapper[4792]: I0309 09:14:34.571278 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9rswz\" (UniqueName: \"kubernetes.io/projected/c3855d8f-3750-433b-9638-6070f72a3d61-kube-api-access-9rswz\") pod \"image-registry-66df7c8f76-dj9wv\" (UID: \"c3855d8f-3750-433b-9638-6070f72a3d61\") " pod="openshift-image-registry/image-registry-66df7c8f76-dj9wv" Mar 09 09:14:34 crc kubenswrapper[4792]: I0309 09:14:34.571313 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" 
(UniqueName: \"kubernetes.io/projected/c3855d8f-3750-433b-9638-6070f72a3d61-bound-sa-token\") pod \"image-registry-66df7c8f76-dj9wv\" (UID: \"c3855d8f-3750-433b-9638-6070f72a3d61\") " pod="openshift-image-registry/image-registry-66df7c8f76-dj9wv" Mar 09 09:14:34 crc kubenswrapper[4792]: I0309 09:14:34.605345 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-dj9wv" Mar 09 09:14:34 crc kubenswrapper[4792]: I0309 09:14:34.632907 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wwtkd\" (UniqueName: \"kubernetes.io/projected/a807ba61-7b14-443b-a870-6220b51d2bd6-kube-api-access-wwtkd\") pod \"redhat-operators-ms9zz\" (UID: \"a807ba61-7b14-443b-a870-6220b51d2bd6\") " pod="openshift-marketplace/redhat-operators-ms9zz" Mar 09 09:14:34 crc kubenswrapper[4792]: I0309 09:14:34.632948 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7344810f-1120-46c5-af73-cd48477e1143-utilities\") pod \"community-operators-r4pf7\" (UID: \"7344810f-1120-46c5-af73-cd48477e1143\") " pod="openshift-marketplace/community-operators-r4pf7" Mar 09 09:14:34 crc kubenswrapper[4792]: I0309 09:14:34.632976 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-82cbf\" (UniqueName: \"kubernetes.io/projected/7344810f-1120-46c5-af73-cd48477e1143-kube-api-access-82cbf\") pod \"community-operators-r4pf7\" (UID: \"7344810f-1120-46c5-af73-cd48477e1143\") " pod="openshift-marketplace/community-operators-r4pf7" Mar 09 09:14:34 crc kubenswrapper[4792]: I0309 09:14:34.633012 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a807ba61-7b14-443b-a870-6220b51d2bd6-utilities\") pod \"redhat-operators-ms9zz\" (UID: 
\"a807ba61-7b14-443b-a870-6220b51d2bd6\") " pod="openshift-marketplace/redhat-operators-ms9zz" Mar 09 09:14:34 crc kubenswrapper[4792]: I0309 09:14:34.633031 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a807ba61-7b14-443b-a870-6220b51d2bd6-catalog-content\") pod \"redhat-operators-ms9zz\" (UID: \"a807ba61-7b14-443b-a870-6220b51d2bd6\") " pod="openshift-marketplace/redhat-operators-ms9zz" Mar 09 09:14:34 crc kubenswrapper[4792]: I0309 09:14:34.633100 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7344810f-1120-46c5-af73-cd48477e1143-catalog-content\") pod \"community-operators-r4pf7\" (UID: \"7344810f-1120-46c5-af73-cd48477e1143\") " pod="openshift-marketplace/community-operators-r4pf7" Mar 09 09:14:34 crc kubenswrapper[4792]: I0309 09:14:34.633504 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a807ba61-7b14-443b-a870-6220b51d2bd6-utilities\") pod \"redhat-operators-ms9zz\" (UID: \"a807ba61-7b14-443b-a870-6220b51d2bd6\") " pod="openshift-marketplace/redhat-operators-ms9zz" Mar 09 09:14:34 crc kubenswrapper[4792]: I0309 09:14:34.633828 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a807ba61-7b14-443b-a870-6220b51d2bd6-catalog-content\") pod \"redhat-operators-ms9zz\" (UID: \"a807ba61-7b14-443b-a870-6220b51d2bd6\") " pod="openshift-marketplace/redhat-operators-ms9zz" Mar 09 09:14:34 crc kubenswrapper[4792]: I0309 09:14:34.650630 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wwtkd\" (UniqueName: \"kubernetes.io/projected/a807ba61-7b14-443b-a870-6220b51d2bd6-kube-api-access-wwtkd\") pod \"redhat-operators-ms9zz\" (UID: \"a807ba61-7b14-443b-a870-6220b51d2bd6\") " 
pod="openshift-marketplace/redhat-operators-ms9zz" Mar 09 09:14:34 crc kubenswrapper[4792]: I0309 09:14:34.679307 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ms9zz" Mar 09 09:14:34 crc kubenswrapper[4792]: I0309 09:14:34.734014 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7344810f-1120-46c5-af73-cd48477e1143-utilities\") pod \"community-operators-r4pf7\" (UID: \"7344810f-1120-46c5-af73-cd48477e1143\") " pod="openshift-marketplace/community-operators-r4pf7" Mar 09 09:14:34 crc kubenswrapper[4792]: I0309 09:14:34.734098 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-82cbf\" (UniqueName: \"kubernetes.io/projected/7344810f-1120-46c5-af73-cd48477e1143-kube-api-access-82cbf\") pod \"community-operators-r4pf7\" (UID: \"7344810f-1120-46c5-af73-cd48477e1143\") " pod="openshift-marketplace/community-operators-r4pf7" Mar 09 09:14:34 crc kubenswrapper[4792]: I0309 09:14:34.734192 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7344810f-1120-46c5-af73-cd48477e1143-catalog-content\") pod \"community-operators-r4pf7\" (UID: \"7344810f-1120-46c5-af73-cd48477e1143\") " pod="openshift-marketplace/community-operators-r4pf7" Mar 09 09:14:34 crc kubenswrapper[4792]: I0309 09:14:34.734766 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7344810f-1120-46c5-af73-cd48477e1143-catalog-content\") pod \"community-operators-r4pf7\" (UID: \"7344810f-1120-46c5-af73-cd48477e1143\") " pod="openshift-marketplace/community-operators-r4pf7" Mar 09 09:14:34 crc kubenswrapper[4792]: I0309 09:14:34.734775 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/7344810f-1120-46c5-af73-cd48477e1143-utilities\") pod \"community-operators-r4pf7\" (UID: \"7344810f-1120-46c5-af73-cd48477e1143\") " pod="openshift-marketplace/community-operators-r4pf7" Mar 09 09:14:34 crc kubenswrapper[4792]: I0309 09:14:34.751812 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-82cbf\" (UniqueName: \"kubernetes.io/projected/7344810f-1120-46c5-af73-cd48477e1143-kube-api-access-82cbf\") pod \"community-operators-r4pf7\" (UID: \"7344810f-1120-46c5-af73-cd48477e1143\") " pod="openshift-marketplace/community-operators-r4pf7" Mar 09 09:14:34 crc kubenswrapper[4792]: I0309 09:14:34.899473 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ms9zz"] Mar 09 09:14:34 crc kubenswrapper[4792]: I0309 09:14:34.966542 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-r4pf7" Mar 09 09:14:34 crc kubenswrapper[4792]: I0309 09:14:34.990124 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-dj9wv"] Mar 09 09:14:35 crc kubenswrapper[4792]: I0309 09:14:35.196530 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-r4pf7"] Mar 09 09:14:35 crc kubenswrapper[4792]: W0309 09:14:35.200668 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7344810f_1120_46c5_af73_cd48477e1143.slice/crio-94bbc9628628a1cd1ca91968ca7634c7f27d43dc6f1c42ce6ac7059a7a132ec9 WatchSource:0}: Error finding container 94bbc9628628a1cd1ca91968ca7634c7f27d43dc6f1c42ce6ac7059a7a132ec9: Status 404 returned error can't find the container with id 94bbc9628628a1cd1ca91968ca7634c7f27d43dc6f1c42ce6ac7059a7a132ec9 Mar 09 09:14:35 crc kubenswrapper[4792]: I0309 09:14:35.344282 4792 generic.go:334] "Generic (PLEG): container finished" 
podID="2bce5f9c-c863-4962-a276-2b5a3a69def9" containerID="cae4a82fd56ff81ae984ef0522247de1bb190e8a8af76ad20995b917652c5ac5" exitCode=0 Mar 09 09:14:35 crc kubenswrapper[4792]: I0309 09:14:35.344340 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mkwxh" event={"ID":"2bce5f9c-c863-4962-a276-2b5a3a69def9","Type":"ContainerDied","Data":"cae4a82fd56ff81ae984ef0522247de1bb190e8a8af76ad20995b917652c5ac5"} Mar 09 09:14:35 crc kubenswrapper[4792]: I0309 09:14:35.352516 4792 generic.go:334] "Generic (PLEG): container finished" podID="a807ba61-7b14-443b-a870-6220b51d2bd6" containerID="66b05ec3b9f4b7d406c7be927cee29057954433aa2d767a199f3639d32eb8d82" exitCode=0 Mar 09 09:14:35 crc kubenswrapper[4792]: I0309 09:14:35.352566 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ms9zz" event={"ID":"a807ba61-7b14-443b-a870-6220b51d2bd6","Type":"ContainerDied","Data":"66b05ec3b9f4b7d406c7be927cee29057954433aa2d767a199f3639d32eb8d82"} Mar 09 09:14:35 crc kubenswrapper[4792]: I0309 09:14:35.352585 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ms9zz" event={"ID":"a807ba61-7b14-443b-a870-6220b51d2bd6","Type":"ContainerStarted","Data":"9dbaa566e04a749f399c7c5ac2af58ba7c84667acf4a7c8b50e9ab7aabe4452c"} Mar 09 09:14:35 crc kubenswrapper[4792]: I0309 09:14:35.357603 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r4pf7" event={"ID":"7344810f-1120-46c5-af73-cd48477e1143","Type":"ContainerStarted","Data":"271e3cb63dfa5cbef6ff4e64917563772ccc1da7d35974bb9c0dd927cb9ae7c8"} Mar 09 09:14:35 crc kubenswrapper[4792]: I0309 09:14:35.357689 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r4pf7" event={"ID":"7344810f-1120-46c5-af73-cd48477e1143","Type":"ContainerStarted","Data":"94bbc9628628a1cd1ca91968ca7634c7f27d43dc6f1c42ce6ac7059a7a132ec9"} 
Mar 09 09:14:35 crc kubenswrapper[4792]: I0309 09:14:35.366134 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-dj9wv" event={"ID":"c3855d8f-3750-433b-9638-6070f72a3d61","Type":"ContainerStarted","Data":"3a798582dcba302e3c7b36067becf727c5a115f59e2b447bb787a9fd273745d8"} Mar 09 09:14:35 crc kubenswrapper[4792]: I0309 09:14:35.366201 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-dj9wv" event={"ID":"c3855d8f-3750-433b-9638-6070f72a3d61","Type":"ContainerStarted","Data":"df059c775d4b7fe01fec0b69338fc0fd0bb70103704f361c8e2c9d841937cbd5"} Mar 09 09:14:35 crc kubenswrapper[4792]: I0309 09:14:35.367234 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-dj9wv" Mar 09 09:14:35 crc kubenswrapper[4792]: I0309 09:14:35.371707 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2rnrf" event={"ID":"6ac8fc64-583a-420b-b356-cfa0491d9b6f","Type":"ContainerStarted","Data":"988e96b4c701172b26ce9d364293b4c2f01d4c1984696ddf3ee6f2d28fabb992"} Mar 09 09:14:35 crc kubenswrapper[4792]: I0309 09:14:35.414143 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-dj9wv" podStartSLOduration=1.4141200870000001 podStartE2EDuration="1.414120087s" podCreationTimestamp="2026-03-09 09:14:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:14:35.412564996 +0000 UTC m=+440.442765758" watchObservedRunningTime="2026-03-09 09:14:35.414120087 +0000 UTC m=+440.444320839" Mar 09 09:14:35 crc kubenswrapper[4792]: I0309 09:14:35.458232 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-2rnrf" podStartSLOduration=2.565020189 
podStartE2EDuration="4.458211369s" podCreationTimestamp="2026-03-09 09:14:31 +0000 UTC" firstStartedPulling="2026-03-09 09:14:33.311256649 +0000 UTC m=+438.341457431" lastFinishedPulling="2026-03-09 09:14:35.204447859 +0000 UTC m=+440.234648611" observedRunningTime="2026-03-09 09:14:35.453856944 +0000 UTC m=+440.484057706" watchObservedRunningTime="2026-03-09 09:14:35.458211369 +0000 UTC m=+440.488412121" Mar 09 09:14:36 crc kubenswrapper[4792]: I0309 09:14:36.379579 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mkwxh" event={"ID":"2bce5f9c-c863-4962-a276-2b5a3a69def9","Type":"ContainerStarted","Data":"d0efe33a5b5f4a1eb796f26f47ca7458130e82ee95fa5876c19d7d9913ca11c5"} Mar 09 09:14:36 crc kubenswrapper[4792]: I0309 09:14:36.391887 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ms9zz" event={"ID":"a807ba61-7b14-443b-a870-6220b51d2bd6","Type":"ContainerStarted","Data":"ff7687060a8ed54ccd4bf6336187531a606e1821d77161df6ca23a8d6924a8b4"} Mar 09 09:14:36 crc kubenswrapper[4792]: I0309 09:14:36.394478 4792 generic.go:334] "Generic (PLEG): container finished" podID="7344810f-1120-46c5-af73-cd48477e1143" containerID="271e3cb63dfa5cbef6ff4e64917563772ccc1da7d35974bb9c0dd927cb9ae7c8" exitCode=0 Mar 09 09:14:36 crc kubenswrapper[4792]: I0309 09:14:36.394680 4792 generic.go:334] "Generic (PLEG): container finished" podID="7344810f-1120-46c5-af73-cd48477e1143" containerID="482a7215f07b106d54aafd366fd817aabed5d5f936a4577645d2d03c20ffc652" exitCode=0 Mar 09 09:14:36 crc kubenswrapper[4792]: I0309 09:14:36.394659 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r4pf7" event={"ID":"7344810f-1120-46c5-af73-cd48477e1143","Type":"ContainerDied","Data":"271e3cb63dfa5cbef6ff4e64917563772ccc1da7d35974bb9c0dd927cb9ae7c8"} Mar 09 09:14:36 crc kubenswrapper[4792]: I0309 09:14:36.395115 4792 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/community-operators-r4pf7" event={"ID":"7344810f-1120-46c5-af73-cd48477e1143","Type":"ContainerDied","Data":"482a7215f07b106d54aafd366fd817aabed5d5f936a4577645d2d03c20ffc652"} Mar 09 09:14:36 crc kubenswrapper[4792]: I0309 09:14:36.410944 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-mkwxh" podStartSLOduration=1.897050331 podStartE2EDuration="4.410927263s" podCreationTimestamp="2026-03-09 09:14:32 +0000 UTC" firstStartedPulling="2026-03-09 09:14:33.290463425 +0000 UTC m=+438.320664177" lastFinishedPulling="2026-03-09 09:14:35.804340357 +0000 UTC m=+440.834541109" observedRunningTime="2026-03-09 09:14:36.406262158 +0000 UTC m=+441.436462910" watchObservedRunningTime="2026-03-09 09:14:36.410927263 +0000 UTC m=+441.441128015" Mar 09 09:14:37 crc kubenswrapper[4792]: I0309 09:14:37.401835 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r4pf7" event={"ID":"7344810f-1120-46c5-af73-cd48477e1143","Type":"ContainerStarted","Data":"525a735387e8304627db5de420d3fdb43e2c065b58f491effb834cad16c16b18"} Mar 09 09:14:37 crc kubenswrapper[4792]: I0309 09:14:37.403456 4792 generic.go:334] "Generic (PLEG): container finished" podID="a807ba61-7b14-443b-a870-6220b51d2bd6" containerID="ff7687060a8ed54ccd4bf6336187531a606e1821d77161df6ca23a8d6924a8b4" exitCode=0 Mar 09 09:14:37 crc kubenswrapper[4792]: I0309 09:14:37.403528 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ms9zz" event={"ID":"a807ba61-7b14-443b-a870-6220b51d2bd6","Type":"ContainerDied","Data":"ff7687060a8ed54ccd4bf6336187531a606e1821d77161df6ca23a8d6924a8b4"} Mar 09 09:14:37 crc kubenswrapper[4792]: I0309 09:14:37.427459 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-r4pf7" podStartSLOduration=1.7837526590000001 
podStartE2EDuration="3.427443663s" podCreationTimestamp="2026-03-09 09:14:34 +0000 UTC" firstStartedPulling="2026-03-09 09:14:35.358562139 +0000 UTC m=+440.388762881" lastFinishedPulling="2026-03-09 09:14:37.002253143 +0000 UTC m=+442.032453885" observedRunningTime="2026-03-09 09:14:37.421138675 +0000 UTC m=+442.451339437" watchObservedRunningTime="2026-03-09 09:14:37.427443663 +0000 UTC m=+442.457644415" Mar 09 09:14:38 crc kubenswrapper[4792]: I0309 09:14:38.410976 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ms9zz" event={"ID":"a807ba61-7b14-443b-a870-6220b51d2bd6","Type":"ContainerStarted","Data":"50328db96ea96f3873f96bde65e23a8bd56db9885218d59b44ec8f7169b19830"} Mar 09 09:14:38 crc kubenswrapper[4792]: I0309 09:14:38.437965 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-ms9zz" podStartSLOduration=1.978078298 podStartE2EDuration="4.437950973s" podCreationTimestamp="2026-03-09 09:14:34 +0000 UTC" firstStartedPulling="2026-03-09 09:14:35.355003074 +0000 UTC m=+440.385203826" lastFinishedPulling="2026-03-09 09:14:37.814875749 +0000 UTC m=+442.845076501" observedRunningTime="2026-03-09 09:14:38.433437143 +0000 UTC m=+443.463637895" watchObservedRunningTime="2026-03-09 09:14:38.437950973 +0000 UTC m=+443.468151725" Mar 09 09:14:42 crc kubenswrapper[4792]: I0309 09:14:42.255858 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-2rnrf" Mar 09 09:14:42 crc kubenswrapper[4792]: I0309 09:14:42.255960 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-2rnrf" Mar 09 09:14:42 crc kubenswrapper[4792]: I0309 09:14:42.302281 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-2rnrf" Mar 09 09:14:42 crc kubenswrapper[4792]: I0309 09:14:42.454538 4792 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-mkwxh" Mar 09 09:14:42 crc kubenswrapper[4792]: I0309 09:14:42.454973 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-mkwxh" Mar 09 09:14:42 crc kubenswrapper[4792]: I0309 09:14:42.470598 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-2rnrf" Mar 09 09:14:42 crc kubenswrapper[4792]: I0309 09:14:42.496687 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-mkwxh" Mar 09 09:14:43 crc kubenswrapper[4792]: I0309 09:14:43.214401 4792 patch_prober.go:28] interesting pod/machine-config-daemon-97tth container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 09:14:43 crc kubenswrapper[4792]: I0309 09:14:43.214467 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 09:14:43 crc kubenswrapper[4792]: I0309 09:14:43.473515 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-mkwxh" Mar 09 09:14:44 crc kubenswrapper[4792]: I0309 09:14:44.680194 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-ms9zz" Mar 09 09:14:44 crc kubenswrapper[4792]: I0309 09:14:44.680253 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/redhat-operators-ms9zz" Mar 09 09:14:44 crc kubenswrapper[4792]: I0309 09:14:44.731384 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-ms9zz" Mar 09 09:14:44 crc kubenswrapper[4792]: I0309 09:14:44.967034 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-r4pf7" Mar 09 09:14:44 crc kubenswrapper[4792]: I0309 09:14:44.967171 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-r4pf7" Mar 09 09:14:45 crc kubenswrapper[4792]: I0309 09:14:45.005147 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-r4pf7" Mar 09 09:14:45 crc kubenswrapper[4792]: I0309 09:14:45.507102 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-r4pf7" Mar 09 09:14:45 crc kubenswrapper[4792]: I0309 09:14:45.517291 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-ms9zz" Mar 09 09:14:54 crc kubenswrapper[4792]: I0309 09:14:54.611338 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-dj9wv" Mar 09 09:14:54 crc kubenswrapper[4792]: I0309 09:14:54.663890 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-kkrgv"] Mar 09 09:15:00 crc kubenswrapper[4792]: I0309 09:15:00.135122 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29550795-8jtvz"] Mar 09 09:15:00 crc kubenswrapper[4792]: I0309 09:15:00.136237 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29550795-8jtvz"
Mar 09 09:15:00 crc kubenswrapper[4792]: I0309 09:15:00.141502 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Mar 09 09:15:00 crc kubenswrapper[4792]: I0309 09:15:00.141816 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Mar 09 09:15:00 crc kubenswrapper[4792]: I0309 09:15:00.151037 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29550795-8jtvz"]
Mar 09 09:15:00 crc kubenswrapper[4792]: I0309 09:15:00.296802 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6chdw\" (UniqueName: \"kubernetes.io/projected/37afd809-afca-4aa4-a9c9-3fe02f105c23-kube-api-access-6chdw\") pod \"collect-profiles-29550795-8jtvz\" (UID: \"37afd809-afca-4aa4-a9c9-3fe02f105c23\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550795-8jtvz"
Mar 09 09:15:00 crc kubenswrapper[4792]: I0309 09:15:00.297233 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/37afd809-afca-4aa4-a9c9-3fe02f105c23-secret-volume\") pod \"collect-profiles-29550795-8jtvz\" (UID: \"37afd809-afca-4aa4-a9c9-3fe02f105c23\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550795-8jtvz"
Mar 09 09:15:00 crc kubenswrapper[4792]: I0309 09:15:00.297355 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/37afd809-afca-4aa4-a9c9-3fe02f105c23-config-volume\") pod \"collect-profiles-29550795-8jtvz\" (UID: \"37afd809-afca-4aa4-a9c9-3fe02f105c23\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550795-8jtvz"
Mar 09 09:15:00 crc kubenswrapper[4792]: I0309 09:15:00.398936 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6chdw\" (UniqueName: \"kubernetes.io/projected/37afd809-afca-4aa4-a9c9-3fe02f105c23-kube-api-access-6chdw\") pod \"collect-profiles-29550795-8jtvz\" (UID: \"37afd809-afca-4aa4-a9c9-3fe02f105c23\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550795-8jtvz"
Mar 09 09:15:00 crc kubenswrapper[4792]: I0309 09:15:00.399001 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/37afd809-afca-4aa4-a9c9-3fe02f105c23-secret-volume\") pod \"collect-profiles-29550795-8jtvz\" (UID: \"37afd809-afca-4aa4-a9c9-3fe02f105c23\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550795-8jtvz"
Mar 09 09:15:00 crc kubenswrapper[4792]: I0309 09:15:00.399041 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/37afd809-afca-4aa4-a9c9-3fe02f105c23-config-volume\") pod \"collect-profiles-29550795-8jtvz\" (UID: \"37afd809-afca-4aa4-a9c9-3fe02f105c23\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550795-8jtvz"
Mar 09 09:15:00 crc kubenswrapper[4792]: I0309 09:15:00.399862 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/37afd809-afca-4aa4-a9c9-3fe02f105c23-config-volume\") pod \"collect-profiles-29550795-8jtvz\" (UID: \"37afd809-afca-4aa4-a9c9-3fe02f105c23\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550795-8jtvz"
Mar 09 09:15:00 crc kubenswrapper[4792]: I0309 09:15:00.405510 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/37afd809-afca-4aa4-a9c9-3fe02f105c23-secret-volume\") pod \"collect-profiles-29550795-8jtvz\" (UID: \"37afd809-afca-4aa4-a9c9-3fe02f105c23\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550795-8jtvz"
Mar 09 09:15:00 crc kubenswrapper[4792]: I0309 09:15:00.419923 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6chdw\" (UniqueName: \"kubernetes.io/projected/37afd809-afca-4aa4-a9c9-3fe02f105c23-kube-api-access-6chdw\") pod \"collect-profiles-29550795-8jtvz\" (UID: \"37afd809-afca-4aa4-a9c9-3fe02f105c23\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550795-8jtvz"
Mar 09 09:15:00 crc kubenswrapper[4792]: I0309 09:15:00.452754 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29550795-8jtvz"
Mar 09 09:15:00 crc kubenswrapper[4792]: I0309 09:15:00.637770 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29550795-8jtvz"]
Mar 09 09:15:01 crc kubenswrapper[4792]: I0309 09:15:01.774903 4792 generic.go:334] "Generic (PLEG): container finished" podID="37afd809-afca-4aa4-a9c9-3fe02f105c23" containerID="99ce20c6de306eeb3ba18702bf7780f93634cfb30b3a7af923b5c1e36f1cc492" exitCode=0
Mar 09 09:15:01 crc kubenswrapper[4792]: I0309 09:15:01.775000 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29550795-8jtvz" event={"ID":"37afd809-afca-4aa4-a9c9-3fe02f105c23","Type":"ContainerDied","Data":"99ce20c6de306eeb3ba18702bf7780f93634cfb30b3a7af923b5c1e36f1cc492"}
Mar 09 09:15:01 crc kubenswrapper[4792]: I0309 09:15:01.775483 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29550795-8jtvz" event={"ID":"37afd809-afca-4aa4-a9c9-3fe02f105c23","Type":"ContainerStarted","Data":"509b121dd7eb9f01aa6706b658075b7c4662f1c081ac4a84d1b8d68d317be5a9"}
Mar 09 09:15:03 crc kubenswrapper[4792]: I0309 09:15:03.382783 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29550795-8jtvz"
Mar 09 09:15:03 crc kubenswrapper[4792]: I0309 09:15:03.389182 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/37afd809-afca-4aa4-a9c9-3fe02f105c23-config-volume\") pod \"37afd809-afca-4aa4-a9c9-3fe02f105c23\" (UID: \"37afd809-afca-4aa4-a9c9-3fe02f105c23\") "
Mar 09 09:15:03 crc kubenswrapper[4792]: I0309 09:15:03.389281 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/37afd809-afca-4aa4-a9c9-3fe02f105c23-secret-volume\") pod \"37afd809-afca-4aa4-a9c9-3fe02f105c23\" (UID: \"37afd809-afca-4aa4-a9c9-3fe02f105c23\") "
Mar 09 09:15:03 crc kubenswrapper[4792]: I0309 09:15:03.389358 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6chdw\" (UniqueName: \"kubernetes.io/projected/37afd809-afca-4aa4-a9c9-3fe02f105c23-kube-api-access-6chdw\") pod \"37afd809-afca-4aa4-a9c9-3fe02f105c23\" (UID: \"37afd809-afca-4aa4-a9c9-3fe02f105c23\") "
Mar 09 09:15:03 crc kubenswrapper[4792]: I0309 09:15:03.389922 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/37afd809-afca-4aa4-a9c9-3fe02f105c23-config-volume" (OuterVolumeSpecName: "config-volume") pod "37afd809-afca-4aa4-a9c9-3fe02f105c23" (UID: "37afd809-afca-4aa4-a9c9-3fe02f105c23"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:15:03 crc kubenswrapper[4792]: I0309 09:15:03.394788 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/37afd809-afca-4aa4-a9c9-3fe02f105c23-kube-api-access-6chdw" (OuterVolumeSpecName: "kube-api-access-6chdw") pod "37afd809-afca-4aa4-a9c9-3fe02f105c23" (UID: "37afd809-afca-4aa4-a9c9-3fe02f105c23"). InnerVolumeSpecName "kube-api-access-6chdw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:15:03 crc kubenswrapper[4792]: I0309 09:15:03.397438 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37afd809-afca-4aa4-a9c9-3fe02f105c23-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "37afd809-afca-4aa4-a9c9-3fe02f105c23" (UID: "37afd809-afca-4aa4-a9c9-3fe02f105c23"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:15:03 crc kubenswrapper[4792]: I0309 09:15:03.490858 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6chdw\" (UniqueName: \"kubernetes.io/projected/37afd809-afca-4aa4-a9c9-3fe02f105c23-kube-api-access-6chdw\") on node \"crc\" DevicePath \"\""
Mar 09 09:15:03 crc kubenswrapper[4792]: I0309 09:15:03.490890 4792 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/37afd809-afca-4aa4-a9c9-3fe02f105c23-config-volume\") on node \"crc\" DevicePath \"\""
Mar 09 09:15:03 crc kubenswrapper[4792]: I0309 09:15:03.490899 4792 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/37afd809-afca-4aa4-a9c9-3fe02f105c23-secret-volume\") on node \"crc\" DevicePath \"\""
Mar 09 09:15:03 crc kubenswrapper[4792]: I0309 09:15:03.785657 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29550795-8jtvz" event={"ID":"37afd809-afca-4aa4-a9c9-3fe02f105c23","Type":"ContainerDied","Data":"509b121dd7eb9f01aa6706b658075b7c4662f1c081ac4a84d1b8d68d317be5a9"}
Mar 09 09:15:03 crc kubenswrapper[4792]: I0309 09:15:03.785912 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="509b121dd7eb9f01aa6706b658075b7c4662f1c081ac4a84d1b8d68d317be5a9"
Mar 09 09:15:03 crc kubenswrapper[4792]: I0309 09:15:03.785771 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29550795-8jtvz"
Mar 09 09:15:13 crc kubenswrapper[4792]: I0309 09:15:13.215390 4792 patch_prober.go:28] interesting pod/machine-config-daemon-97tth container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 09 09:15:13 crc kubenswrapper[4792]: I0309 09:15:13.216047 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 09 09:15:13 crc kubenswrapper[4792]: I0309 09:15:13.216124 4792 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-97tth"
Mar 09 09:15:13 crc kubenswrapper[4792]: I0309 09:15:13.216727 4792 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"72d2b48cb291f76d23e0eb58c24e11749ce55457c1a4bd60f9c4519440ee7870"} pod="openshift-machine-config-operator/machine-config-daemon-97tth" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 09 09:15:13 crc kubenswrapper[4792]: I0309 09:15:13.216826 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" containerName="machine-config-daemon" containerID="cri-o://72d2b48cb291f76d23e0eb58c24e11749ce55457c1a4bd60f9c4519440ee7870" gracePeriod=600
Mar 09 09:15:13 crc kubenswrapper[4792]: I0309 09:15:13.866833 4792 generic.go:334] "Generic (PLEG): container finished" podID="bd11045a-d746-4b42-872c-8b8d1dd2d515" containerID="72d2b48cb291f76d23e0eb58c24e11749ce55457c1a4bd60f9c4519440ee7870" exitCode=0
Mar 09 09:15:13 crc kubenswrapper[4792]: I0309 09:15:13.866878 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-97tth" event={"ID":"bd11045a-d746-4b42-872c-8b8d1dd2d515","Type":"ContainerDied","Data":"72d2b48cb291f76d23e0eb58c24e11749ce55457c1a4bd60f9c4519440ee7870"}
Mar 09 09:15:13 crc kubenswrapper[4792]: I0309 09:15:13.867204 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-97tth" event={"ID":"bd11045a-d746-4b42-872c-8b8d1dd2d515","Type":"ContainerStarted","Data":"6e17fbe8c1658cdfd2034c0251ad7b1ba86ff152d37adcf43ef52a7d2b0de4db"}
Mar 09 09:15:13 crc kubenswrapper[4792]: I0309 09:15:13.867226 4792 scope.go:117] "RemoveContainer" containerID="d060627a577507a2b0030b6aea753d50e0c6766ac4876d95ac5d9d3401f9b818"
Mar 09 09:15:19 crc kubenswrapper[4792]: I0309 09:15:19.736292 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-kkrgv" podUID="e5b8826d-81fe-4d43-9177-33e8e34ca003" containerName="registry" containerID="cri-o://0cf5c49c1646fbd8845a08b2fbd7f0c2d61d3d205b399b69e8d659bbb23ed144" gracePeriod=30
Mar 09 09:15:19 crc kubenswrapper[4792]: I0309 09:15:19.913879 4792 generic.go:334] "Generic (PLEG): container finished" podID="e5b8826d-81fe-4d43-9177-33e8e34ca003" containerID="0cf5c49c1646fbd8845a08b2fbd7f0c2d61d3d205b399b69e8d659bbb23ed144" exitCode=0
Mar 09 09:15:19 crc kubenswrapper[4792]: I0309 09:15:19.913952 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-kkrgv" event={"ID":"e5b8826d-81fe-4d43-9177-33e8e34ca003","Type":"ContainerDied","Data":"0cf5c49c1646fbd8845a08b2fbd7f0c2d61d3d205b399b69e8d659bbb23ed144"}
Mar 09 09:15:20 crc kubenswrapper[4792]: I0309 09:15:20.204971 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-kkrgv"
Mar 09 09:15:20 crc kubenswrapper[4792]: I0309 09:15:20.292293 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e5b8826d-81fe-4d43-9177-33e8e34ca003-trusted-ca\") pod \"e5b8826d-81fe-4d43-9177-33e8e34ca003\" (UID: \"e5b8826d-81fe-4d43-9177-33e8e34ca003\") "
Mar 09 09:15:20 crc kubenswrapper[4792]: I0309 09:15:20.292329 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e5b8826d-81fe-4d43-9177-33e8e34ca003-registry-certificates\") pod \"e5b8826d-81fe-4d43-9177-33e8e34ca003\" (UID: \"e5b8826d-81fe-4d43-9177-33e8e34ca003\") "
Mar 09 09:15:20 crc kubenswrapper[4792]: I0309 09:15:20.292351 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e5b8826d-81fe-4d43-9177-33e8e34ca003-ca-trust-extracted\") pod \"e5b8826d-81fe-4d43-9177-33e8e34ca003\" (UID: \"e5b8826d-81fe-4d43-9177-33e8e34ca003\") "
Mar 09 09:15:20 crc kubenswrapper[4792]: I0309 09:15:20.292402 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e5b8826d-81fe-4d43-9177-33e8e34ca003-registry-tls\") pod \"e5b8826d-81fe-4d43-9177-33e8e34ca003\" (UID: \"e5b8826d-81fe-4d43-9177-33e8e34ca003\") "
Mar 09 09:15:20 crc kubenswrapper[4792]: I0309 09:15:20.293227 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e5b8826d-81fe-4d43-9177-33e8e34ca003-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "e5b8826d-81fe-4d43-9177-33e8e34ca003" (UID: "e5b8826d-81fe-4d43-9177-33e8e34ca003"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:15:20 crc kubenswrapper[4792]: I0309 09:15:20.293291 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-flkwq\" (UniqueName: \"kubernetes.io/projected/e5b8826d-81fe-4d43-9177-33e8e34ca003-kube-api-access-flkwq\") pod \"e5b8826d-81fe-4d43-9177-33e8e34ca003\" (UID: \"e5b8826d-81fe-4d43-9177-33e8e34ca003\") "
Mar 09 09:15:20 crc kubenswrapper[4792]: I0309 09:15:20.293349 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e5b8826d-81fe-4d43-9177-33e8e34ca003-bound-sa-token\") pod \"e5b8826d-81fe-4d43-9177-33e8e34ca003\" (UID: \"e5b8826d-81fe-4d43-9177-33e8e34ca003\") "
Mar 09 09:15:20 crc kubenswrapper[4792]: I0309 09:15:20.293382 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e5b8826d-81fe-4d43-9177-33e8e34ca003-installation-pull-secrets\") pod \"e5b8826d-81fe-4d43-9177-33e8e34ca003\" (UID: \"e5b8826d-81fe-4d43-9177-33e8e34ca003\") "
Mar 09 09:15:20 crc kubenswrapper[4792]: I0309 09:15:20.293513 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"e5b8826d-81fe-4d43-9177-33e8e34ca003\" (UID: \"e5b8826d-81fe-4d43-9177-33e8e34ca003\") "
Mar 09 09:15:20 crc kubenswrapper[4792]: I0309 09:15:20.293522 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e5b8826d-81fe-4d43-9177-33e8e34ca003-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "e5b8826d-81fe-4d43-9177-33e8e34ca003" (UID: "e5b8826d-81fe-4d43-9177-33e8e34ca003"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:15:20 crc kubenswrapper[4792]: I0309 09:15:20.293720 4792 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e5b8826d-81fe-4d43-9177-33e8e34ca003-trusted-ca\") on node \"crc\" DevicePath \"\""
Mar 09 09:15:20 crc kubenswrapper[4792]: I0309 09:15:20.293736 4792 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e5b8826d-81fe-4d43-9177-33e8e34ca003-registry-certificates\") on node \"crc\" DevicePath \"\""
Mar 09 09:15:20 crc kubenswrapper[4792]: I0309 09:15:20.299684 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5b8826d-81fe-4d43-9177-33e8e34ca003-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "e5b8826d-81fe-4d43-9177-33e8e34ca003" (UID: "e5b8826d-81fe-4d43-9177-33e8e34ca003"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:15:20 crc kubenswrapper[4792]: I0309 09:15:20.301712 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5b8826d-81fe-4d43-9177-33e8e34ca003-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "e5b8826d-81fe-4d43-9177-33e8e34ca003" (UID: "e5b8826d-81fe-4d43-9177-33e8e34ca003"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:15:20 crc kubenswrapper[4792]: I0309 09:15:20.302287 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5b8826d-81fe-4d43-9177-33e8e34ca003-kube-api-access-flkwq" (OuterVolumeSpecName: "kube-api-access-flkwq") pod "e5b8826d-81fe-4d43-9177-33e8e34ca003" (UID: "e5b8826d-81fe-4d43-9177-33e8e34ca003"). InnerVolumeSpecName "kube-api-access-flkwq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:15:20 crc kubenswrapper[4792]: I0309 09:15:20.303436 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5b8826d-81fe-4d43-9177-33e8e34ca003-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "e5b8826d-81fe-4d43-9177-33e8e34ca003" (UID: "e5b8826d-81fe-4d43-9177-33e8e34ca003"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:15:20 crc kubenswrapper[4792]: I0309 09:15:20.304405 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "e5b8826d-81fe-4d43-9177-33e8e34ca003" (UID: "e5b8826d-81fe-4d43-9177-33e8e34ca003"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue ""
Mar 09 09:15:20 crc kubenswrapper[4792]: I0309 09:15:20.312260 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e5b8826d-81fe-4d43-9177-33e8e34ca003-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "e5b8826d-81fe-4d43-9177-33e8e34ca003" (UID: "e5b8826d-81fe-4d43-9177-33e8e34ca003"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 09:15:20 crc kubenswrapper[4792]: I0309 09:15:20.394428 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-flkwq\" (UniqueName: \"kubernetes.io/projected/e5b8826d-81fe-4d43-9177-33e8e34ca003-kube-api-access-flkwq\") on node \"crc\" DevicePath \"\""
Mar 09 09:15:20 crc kubenswrapper[4792]: I0309 09:15:20.394648 4792 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e5b8826d-81fe-4d43-9177-33e8e34ca003-bound-sa-token\") on node \"crc\" DevicePath \"\""
Mar 09 09:15:20 crc kubenswrapper[4792]: I0309 09:15:20.394733 4792 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e5b8826d-81fe-4d43-9177-33e8e34ca003-installation-pull-secrets\") on node \"crc\" DevicePath \"\""
Mar 09 09:15:20 crc kubenswrapper[4792]: I0309 09:15:20.394804 4792 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e5b8826d-81fe-4d43-9177-33e8e34ca003-ca-trust-extracted\") on node \"crc\" DevicePath \"\""
Mar 09 09:15:20 crc kubenswrapper[4792]: I0309 09:15:20.394871 4792 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e5b8826d-81fe-4d43-9177-33e8e34ca003-registry-tls\") on node \"crc\" DevicePath \"\""
Mar 09 09:15:20 crc kubenswrapper[4792]: I0309 09:15:20.922289 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-kkrgv" event={"ID":"e5b8826d-81fe-4d43-9177-33e8e34ca003","Type":"ContainerDied","Data":"c05bac22bf2dc182e07d6c819987702fb7310ad04e1b8ed60575c583241eb583"}
Mar 09 09:15:20 crc kubenswrapper[4792]: I0309 09:15:20.922359 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-kkrgv"
Mar 09 09:15:20 crc kubenswrapper[4792]: I0309 09:15:20.922364 4792 scope.go:117] "RemoveContainer" containerID="0cf5c49c1646fbd8845a08b2fbd7f0c2d61d3d205b399b69e8d659bbb23ed144"
Mar 09 09:15:20 crc kubenswrapper[4792]: I0309 09:15:20.952904 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-kkrgv"]
Mar 09 09:15:20 crc kubenswrapper[4792]: I0309 09:15:20.961927 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-kkrgv"]
Mar 09 09:15:21 crc kubenswrapper[4792]: I0309 09:15:21.670623 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e5b8826d-81fe-4d43-9177-33e8e34ca003" path="/var/lib/kubelet/pods/e5b8826d-81fe-4d43-9177-33e8e34ca003/volumes"
Mar 09 09:16:00 crc kubenswrapper[4792]: I0309 09:16:00.145704 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550796-kjk86"]
Mar 09 09:16:00 crc kubenswrapper[4792]: E0309 09:16:00.148147 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5b8826d-81fe-4d43-9177-33e8e34ca003" containerName="registry"
Mar 09 09:16:00 crc kubenswrapper[4792]: I0309 09:16:00.148330 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5b8826d-81fe-4d43-9177-33e8e34ca003" containerName="registry"
Mar 09 09:16:00 crc kubenswrapper[4792]: E0309 09:16:00.148484 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37afd809-afca-4aa4-a9c9-3fe02f105c23" containerName="collect-profiles"
Mar 09 09:16:00 crc kubenswrapper[4792]: I0309 09:16:00.148619 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="37afd809-afca-4aa4-a9c9-3fe02f105c23" containerName="collect-profiles"
Mar 09 09:16:00 crc kubenswrapper[4792]: I0309 09:16:00.148958 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5b8826d-81fe-4d43-9177-33e8e34ca003" containerName="registry"
Mar 09 09:16:00 crc kubenswrapper[4792]: I0309 09:16:00.149150 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="37afd809-afca-4aa4-a9c9-3fe02f105c23" containerName="collect-profiles"
Mar 09 09:16:00 crc kubenswrapper[4792]: I0309 09:16:00.149909 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550796-kjk86"
Mar 09 09:16:00 crc kubenswrapper[4792]: I0309 09:16:00.153810 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fwclj"
Mar 09 09:16:00 crc kubenswrapper[4792]: I0309 09:16:00.154314 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 09 09:16:00 crc kubenswrapper[4792]: I0309 09:16:00.159166 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 09 09:16:00 crc kubenswrapper[4792]: I0309 09:16:00.159613 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550796-kjk86"]
Mar 09 09:16:00 crc kubenswrapper[4792]: I0309 09:16:00.252222 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n9mgg\" (UniqueName: \"kubernetes.io/projected/6c1b19ca-82b0-4ec9-9ef1-3cd8d0c97e92-kube-api-access-n9mgg\") pod \"auto-csr-approver-29550796-kjk86\" (UID: \"6c1b19ca-82b0-4ec9-9ef1-3cd8d0c97e92\") " pod="openshift-infra/auto-csr-approver-29550796-kjk86"
Mar 09 09:16:00 crc kubenswrapper[4792]: I0309 09:16:00.353596 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n9mgg\" (UniqueName: \"kubernetes.io/projected/6c1b19ca-82b0-4ec9-9ef1-3cd8d0c97e92-kube-api-access-n9mgg\") pod \"auto-csr-approver-29550796-kjk86\" (UID: \"6c1b19ca-82b0-4ec9-9ef1-3cd8d0c97e92\") " pod="openshift-infra/auto-csr-approver-29550796-kjk86"
Mar 09 09:16:00 crc kubenswrapper[4792]: I0309 09:16:00.371166 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n9mgg\" (UniqueName: \"kubernetes.io/projected/6c1b19ca-82b0-4ec9-9ef1-3cd8d0c97e92-kube-api-access-n9mgg\") pod \"auto-csr-approver-29550796-kjk86\" (UID: \"6c1b19ca-82b0-4ec9-9ef1-3cd8d0c97e92\") " pod="openshift-infra/auto-csr-approver-29550796-kjk86"
Mar 09 09:16:00 crc kubenswrapper[4792]: I0309 09:16:00.472816 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550796-kjk86"
Mar 09 09:16:00 crc kubenswrapper[4792]: I0309 09:16:00.650896 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550796-kjk86"]
Mar 09 09:16:00 crc kubenswrapper[4792]: I0309 09:16:00.661650 4792 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 09 09:16:01 crc kubenswrapper[4792]: I0309 09:16:01.179516 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550796-kjk86" event={"ID":"6c1b19ca-82b0-4ec9-9ef1-3cd8d0c97e92","Type":"ContainerStarted","Data":"4888c807af6cb1e8e8977565ee5a5d18df99e0feaf5851340dced331888f07ed"}
Mar 09 09:16:02 crc kubenswrapper[4792]: I0309 09:16:02.185879 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550796-kjk86" event={"ID":"6c1b19ca-82b0-4ec9-9ef1-3cd8d0c97e92","Type":"ContainerStarted","Data":"5be8783cce265fa21ed1012182d0b76ce654470b581e06540b32ebb82bc52e8e"}
Mar 09 09:16:02 crc kubenswrapper[4792]: I0309 09:16:02.200152 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29550796-kjk86" podStartSLOduration=1.205296298 podStartE2EDuration="2.200132826s" podCreationTimestamp="2026-03-09 09:16:00 +0000 UTC" firstStartedPulling="2026-03-09 09:16:00.661158475 +0000 UTC m=+525.691359227" lastFinishedPulling="2026-03-09 09:16:01.655995003 +0000 UTC m=+526.686195755" observedRunningTime="2026-03-09 09:16:02.197971624 +0000 UTC m=+527.228172376" watchObservedRunningTime="2026-03-09 09:16:02.200132826 +0000 UTC m=+527.230333588"
Mar 09 09:16:03 crc kubenswrapper[4792]: I0309 09:16:03.194496 4792 generic.go:334] "Generic (PLEG): container finished" podID="6c1b19ca-82b0-4ec9-9ef1-3cd8d0c97e92" containerID="5be8783cce265fa21ed1012182d0b76ce654470b581e06540b32ebb82bc52e8e" exitCode=0
Mar 09 09:16:03 crc kubenswrapper[4792]: I0309 09:16:03.194585 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550796-kjk86" event={"ID":"6c1b19ca-82b0-4ec9-9ef1-3cd8d0c97e92","Type":"ContainerDied","Data":"5be8783cce265fa21ed1012182d0b76ce654470b581e06540b32ebb82bc52e8e"}
Mar 09 09:16:04 crc kubenswrapper[4792]: I0309 09:16:04.414379 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550796-kjk86"
Mar 09 09:16:04 crc kubenswrapper[4792]: I0309 09:16:04.511334 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n9mgg\" (UniqueName: \"kubernetes.io/projected/6c1b19ca-82b0-4ec9-9ef1-3cd8d0c97e92-kube-api-access-n9mgg\") pod \"6c1b19ca-82b0-4ec9-9ef1-3cd8d0c97e92\" (UID: \"6c1b19ca-82b0-4ec9-9ef1-3cd8d0c97e92\") "
Mar 09 09:16:04 crc kubenswrapper[4792]: I0309 09:16:04.518327 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c1b19ca-82b0-4ec9-9ef1-3cd8d0c97e92-kube-api-access-n9mgg" (OuterVolumeSpecName: "kube-api-access-n9mgg") pod "6c1b19ca-82b0-4ec9-9ef1-3cd8d0c97e92" (UID: "6c1b19ca-82b0-4ec9-9ef1-3cd8d0c97e92"). InnerVolumeSpecName "kube-api-access-n9mgg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:16:04 crc kubenswrapper[4792]: I0309 09:16:04.612956 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n9mgg\" (UniqueName: \"kubernetes.io/projected/6c1b19ca-82b0-4ec9-9ef1-3cd8d0c97e92-kube-api-access-n9mgg\") on node \"crc\" DevicePath \"\""
Mar 09 09:16:05 crc kubenswrapper[4792]: I0309 09:16:05.207873 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550796-kjk86" event={"ID":"6c1b19ca-82b0-4ec9-9ef1-3cd8d0c97e92","Type":"ContainerDied","Data":"4888c807af6cb1e8e8977565ee5a5d18df99e0feaf5851340dced331888f07ed"}
Mar 09 09:16:05 crc kubenswrapper[4792]: I0309 09:16:05.207913 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4888c807af6cb1e8e8977565ee5a5d18df99e0feaf5851340dced331888f07ed"
Mar 09 09:16:05 crc kubenswrapper[4792]: I0309 09:16:05.207939 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550796-kjk86"
Mar 09 09:16:05 crc kubenswrapper[4792]: I0309 09:16:05.262195 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550790-q6wbp"]
Mar 09 09:16:05 crc kubenswrapper[4792]: I0309 09:16:05.271556 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550790-q6wbp"]
Mar 09 09:16:05 crc kubenswrapper[4792]: I0309 09:16:05.678906 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d3934c8c-f197-4ef6-ac5c-76560a192e50" path="/var/lib/kubelet/pods/d3934c8c-f197-4ef6-ac5c-76560a192e50/volumes"
Mar 09 09:17:13 crc kubenswrapper[4792]: I0309 09:17:13.215944 4792 patch_prober.go:28] interesting pod/machine-config-daemon-97tth container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 09 09:17:13 crc kubenswrapper[4792]: I0309 09:17:13.216692 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 09 09:17:43 crc kubenswrapper[4792]: I0309 09:17:43.214937 4792 patch_prober.go:28] interesting pod/machine-config-daemon-97tth container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 09 09:17:43 crc kubenswrapper[4792]: I0309 09:17:43.215707 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 09 09:18:00 crc kubenswrapper[4792]: I0309 09:18:00.148534 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550798-vnc97"]
Mar 09 09:18:00 crc kubenswrapper[4792]: E0309 09:18:00.149277 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c1b19ca-82b0-4ec9-9ef1-3cd8d0c97e92" containerName="oc"
Mar 09 09:18:00 crc kubenswrapper[4792]: I0309 09:18:00.149294 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c1b19ca-82b0-4ec9-9ef1-3cd8d0c97e92" containerName="oc"
Mar 09 09:18:00 crc kubenswrapper[4792]: I0309 09:18:00.149430 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c1b19ca-82b0-4ec9-9ef1-3cd8d0c97e92" containerName="oc"
Mar 09 09:18:00 crc kubenswrapper[4792]: I0309 09:18:00.149995 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550798-vnc97"
Mar 09 09:18:00 crc kubenswrapper[4792]: I0309 09:18:00.153314 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fwclj"
Mar 09 09:18:00 crc kubenswrapper[4792]: I0309 09:18:00.153739 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 09 09:18:00 crc kubenswrapper[4792]: I0309 09:18:00.161385 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 09 09:18:00 crc kubenswrapper[4792]: I0309 09:18:00.173665 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550798-vnc97"]
Mar 09 09:18:00 crc kubenswrapper[4792]: I0309 09:18:00.346708 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zdnxs\" (UniqueName: \"kubernetes.io/projected/cf209962-051d-4ac0-9cc9-ba55de9e97cc-kube-api-access-zdnxs\") pod \"auto-csr-approver-29550798-vnc97\" (UID: \"cf209962-051d-4ac0-9cc9-ba55de9e97cc\") " pod="openshift-infra/auto-csr-approver-29550798-vnc97"
Mar 09 09:18:00 crc kubenswrapper[4792]: I0309 09:18:00.448432 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zdnxs\" (UniqueName: \"kubernetes.io/projected/cf209962-051d-4ac0-9cc9-ba55de9e97cc-kube-api-access-zdnxs\") pod \"auto-csr-approver-29550798-vnc97\" (UID: \"cf209962-051d-4ac0-9cc9-ba55de9e97cc\") " pod="openshift-infra/auto-csr-approver-29550798-vnc97"
Mar 09 09:18:00 crc kubenswrapper[4792]: I0309 09:18:00.483616 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zdnxs\" (UniqueName: \"kubernetes.io/projected/cf209962-051d-4ac0-9cc9-ba55de9e97cc-kube-api-access-zdnxs\") pod \"auto-csr-approver-29550798-vnc97\" (UID: \"cf209962-051d-4ac0-9cc9-ba55de9e97cc\") " pod="openshift-infra/auto-csr-approver-29550798-vnc97"
Mar 09 09:18:00 crc kubenswrapper[4792]: I0309 09:18:00.767714 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550798-vnc97"
Mar 09 09:18:00 crc kubenswrapper[4792]: I0309 09:18:00.970056 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550798-vnc97"]
Mar 09 09:18:00 crc kubenswrapper[4792]: I0309 09:18:00.989332 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550798-vnc97" event={"ID":"cf209962-051d-4ac0-9cc9-ba55de9e97cc","Type":"ContainerStarted","Data":"b27502218fce99b6cf532260e8bc5f6fb64b93a06cba1ad4e10394500cd5153c"}
Mar 09 09:18:03 crc kubenswrapper[4792]: I0309 09:18:03.001390 4792 generic.go:334] "Generic (PLEG): container finished" podID="cf209962-051d-4ac0-9cc9-ba55de9e97cc" containerID="0e6ec5fc1a8e9497d34c0082dc535ea6ffdfa063b37c982cb2bec605f6c718fb" exitCode=0
Mar 09 09:18:03 crc kubenswrapper[4792]: I0309 09:18:03.001486 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550798-vnc97" event={"ID":"cf209962-051d-4ac0-9cc9-ba55de9e97cc","Type":"ContainerDied","Data":"0e6ec5fc1a8e9497d34c0082dc535ea6ffdfa063b37c982cb2bec605f6c718fb"}
Mar 09 09:18:04 crc kubenswrapper[4792]: I0309 09:18:04.265738 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550798-vnc97"
Mar 09 09:18:04 crc kubenswrapper[4792]: I0309 09:18:04.451256 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zdnxs\" (UniqueName: \"kubernetes.io/projected/cf209962-051d-4ac0-9cc9-ba55de9e97cc-kube-api-access-zdnxs\") pod \"cf209962-051d-4ac0-9cc9-ba55de9e97cc\" (UID: \"cf209962-051d-4ac0-9cc9-ba55de9e97cc\") "
Mar 09 09:18:04 crc kubenswrapper[4792]: I0309 09:18:04.458000 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf209962-051d-4ac0-9cc9-ba55de9e97cc-kube-api-access-zdnxs" (OuterVolumeSpecName: "kube-api-access-zdnxs") pod "cf209962-051d-4ac0-9cc9-ba55de9e97cc" (UID: "cf209962-051d-4ac0-9cc9-ba55de9e97cc"). InnerVolumeSpecName "kube-api-access-zdnxs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:18:04 crc kubenswrapper[4792]: I0309 09:18:04.552488 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zdnxs\" (UniqueName: \"kubernetes.io/projected/cf209962-051d-4ac0-9cc9-ba55de9e97cc-kube-api-access-zdnxs\") on node \"crc\" DevicePath \"\""
Mar 09 09:18:05 crc kubenswrapper[4792]: I0309 09:18:05.014764 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550798-vnc97" event={"ID":"cf209962-051d-4ac0-9cc9-ba55de9e97cc","Type":"ContainerDied","Data":"b27502218fce99b6cf532260e8bc5f6fb64b93a06cba1ad4e10394500cd5153c"}
Mar 09 09:18:05 crc kubenswrapper[4792]: I0309 09:18:05.015214 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b27502218fce99b6cf532260e8bc5f6fb64b93a06cba1ad4e10394500cd5153c"
Mar 09 09:18:05 crc kubenswrapper[4792]: I0309 09:18:05.014846 4792 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550798-vnc97" Mar 09 09:18:05 crc kubenswrapper[4792]: I0309 09:18:05.343439 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550792-cq6k6"] Mar 09 09:18:05 crc kubenswrapper[4792]: I0309 09:18:05.347576 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550792-cq6k6"] Mar 09 09:18:05 crc kubenswrapper[4792]: I0309 09:18:05.669921 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fc195fc-d546-4720-9790-ddadfb09282b" path="/var/lib/kubelet/pods/5fc195fc-d546-4720-9790-ddadfb09282b/volumes" Mar 09 09:18:13 crc kubenswrapper[4792]: I0309 09:18:13.214653 4792 patch_prober.go:28] interesting pod/machine-config-daemon-97tth container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 09:18:13 crc kubenswrapper[4792]: I0309 09:18:13.215415 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 09:18:13 crc kubenswrapper[4792]: I0309 09:18:13.215467 4792 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-97tth" Mar 09 09:18:13 crc kubenswrapper[4792]: I0309 09:18:13.216141 4792 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6e17fbe8c1658cdfd2034c0251ad7b1ba86ff152d37adcf43ef52a7d2b0de4db"} pod="openshift-machine-config-operator/machine-config-daemon-97tth" 
containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 09 09:18:13 crc kubenswrapper[4792]: I0309 09:18:13.216200 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" containerName="machine-config-daemon" containerID="cri-o://6e17fbe8c1658cdfd2034c0251ad7b1ba86ff152d37adcf43ef52a7d2b0de4db" gracePeriod=600 Mar 09 09:18:14 crc kubenswrapper[4792]: I0309 09:18:14.067757 4792 generic.go:334] "Generic (PLEG): container finished" podID="bd11045a-d746-4b42-872c-8b8d1dd2d515" containerID="6e17fbe8c1658cdfd2034c0251ad7b1ba86ff152d37adcf43ef52a7d2b0de4db" exitCode=0 Mar 09 09:18:14 crc kubenswrapper[4792]: I0309 09:18:14.067826 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-97tth" event={"ID":"bd11045a-d746-4b42-872c-8b8d1dd2d515","Type":"ContainerDied","Data":"6e17fbe8c1658cdfd2034c0251ad7b1ba86ff152d37adcf43ef52a7d2b0de4db"} Mar 09 09:18:14 crc kubenswrapper[4792]: I0309 09:18:14.068246 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-97tth" event={"ID":"bd11045a-d746-4b42-872c-8b8d1dd2d515","Type":"ContainerStarted","Data":"7ebe6e06d3acdb8dc390125e4fb4991f20773eff67765cf4ee7e42fe0e4d4167"} Mar 09 09:18:14 crc kubenswrapper[4792]: I0309 09:18:14.068286 4792 scope.go:117] "RemoveContainer" containerID="72d2b48cb291f76d23e0eb58c24e11749ce55457c1a4bd60f9c4519440ee7870" Mar 09 09:18:26 crc kubenswrapper[4792]: I0309 09:18:26.674853 4792 scope.go:117] "RemoveContainer" containerID="5f3d93339ac5aafeedc73bb985022ff918ac4be8b6414b466117da7a298b837b" Mar 09 09:18:26 crc kubenswrapper[4792]: I0309 09:18:26.713411 4792 scope.go:117] "RemoveContainer" containerID="ab562f4fb50dcd9d9211681341a256c3d39b392c0b54c9c2086d028f18d90ba4" Mar 09 09:19:45 crc kubenswrapper[4792]: I0309 
09:19:45.904669 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-xfgbm"] Mar 09 09:19:45 crc kubenswrapper[4792]: E0309 09:19:45.905593 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf209962-051d-4ac0-9cc9-ba55de9e97cc" containerName="oc" Mar 09 09:19:45 crc kubenswrapper[4792]: I0309 09:19:45.905613 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf209962-051d-4ac0-9cc9-ba55de9e97cc" containerName="oc" Mar 09 09:19:45 crc kubenswrapper[4792]: I0309 09:19:45.905757 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf209962-051d-4ac0-9cc9-ba55de9e97cc" containerName="oc" Mar 09 09:19:45 crc kubenswrapper[4792]: I0309 09:19:45.906297 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-xfgbm" Mar 09 09:19:45 crc kubenswrapper[4792]: I0309 09:19:45.912977 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Mar 09 09:19:45 crc kubenswrapper[4792]: I0309 09:19:45.918515 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-858654f9db-72tj7"] Mar 09 09:19:45 crc kubenswrapper[4792]: I0309 09:19:45.919503 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-858654f9db-72tj7" Mar 09 09:19:45 crc kubenswrapper[4792]: I0309 09:19:45.924652 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Mar 09 09:19:45 crc kubenswrapper[4792]: I0309 09:19:45.924837 4792 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-dc7m7" Mar 09 09:19:45 crc kubenswrapper[4792]: I0309 09:19:45.925222 4792 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-55s2v" Mar 09 09:19:45 crc kubenswrapper[4792]: I0309 09:19:45.938195 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-72tj7"] Mar 09 09:19:45 crc kubenswrapper[4792]: I0309 09:19:45.941148 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-xfgbm"] Mar 09 09:19:45 crc kubenswrapper[4792]: I0309 09:19:45.954651 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-7cft4"] Mar 09 09:19:45 crc kubenswrapper[4792]: I0309 09:19:45.955419 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-7cft4" Mar 09 09:19:45 crc kubenswrapper[4792]: I0309 09:19:45.959392 4792 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-s4t5n" Mar 09 09:19:45 crc kubenswrapper[4792]: I0309 09:19:45.972166 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-7cft4"] Mar 09 09:19:46 crc kubenswrapper[4792]: I0309 09:19:46.066215 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4lmnl\" (UniqueName: \"kubernetes.io/projected/68e071be-fad9-4996-a83f-cd58058fe0f3-kube-api-access-4lmnl\") pod \"cert-manager-webhook-687f57d79b-7cft4\" (UID: \"68e071be-fad9-4996-a83f-cd58058fe0f3\") " pod="cert-manager/cert-manager-webhook-687f57d79b-7cft4" Mar 09 09:19:46 crc kubenswrapper[4792]: I0309 09:19:46.066300 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mgx7n\" (UniqueName: \"kubernetes.io/projected/a6ad9459-0185-47b2-aebd-5a5a40554946-kube-api-access-mgx7n\") pod \"cert-manager-858654f9db-72tj7\" (UID: \"a6ad9459-0185-47b2-aebd-5a5a40554946\") " pod="cert-manager/cert-manager-858654f9db-72tj7" Mar 09 09:19:46 crc kubenswrapper[4792]: I0309 09:19:46.066322 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rpghn\" (UniqueName: \"kubernetes.io/projected/8c00ef29-8d91-4045-982c-8b4a6e98576b-kube-api-access-rpghn\") pod \"cert-manager-cainjector-cf98fcc89-xfgbm\" (UID: \"8c00ef29-8d91-4045-982c-8b4a6e98576b\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-xfgbm" Mar 09 09:19:46 crc kubenswrapper[4792]: I0309 09:19:46.167838 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4lmnl\" (UniqueName: 
\"kubernetes.io/projected/68e071be-fad9-4996-a83f-cd58058fe0f3-kube-api-access-4lmnl\") pod \"cert-manager-webhook-687f57d79b-7cft4\" (UID: \"68e071be-fad9-4996-a83f-cd58058fe0f3\") " pod="cert-manager/cert-manager-webhook-687f57d79b-7cft4" Mar 09 09:19:46 crc kubenswrapper[4792]: I0309 09:19:46.167954 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mgx7n\" (UniqueName: \"kubernetes.io/projected/a6ad9459-0185-47b2-aebd-5a5a40554946-kube-api-access-mgx7n\") pod \"cert-manager-858654f9db-72tj7\" (UID: \"a6ad9459-0185-47b2-aebd-5a5a40554946\") " pod="cert-manager/cert-manager-858654f9db-72tj7" Mar 09 09:19:46 crc kubenswrapper[4792]: I0309 09:19:46.167981 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rpghn\" (UniqueName: \"kubernetes.io/projected/8c00ef29-8d91-4045-982c-8b4a6e98576b-kube-api-access-rpghn\") pod \"cert-manager-cainjector-cf98fcc89-xfgbm\" (UID: \"8c00ef29-8d91-4045-982c-8b4a6e98576b\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-xfgbm" Mar 09 09:19:46 crc kubenswrapper[4792]: I0309 09:19:46.186891 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4lmnl\" (UniqueName: \"kubernetes.io/projected/68e071be-fad9-4996-a83f-cd58058fe0f3-kube-api-access-4lmnl\") pod \"cert-manager-webhook-687f57d79b-7cft4\" (UID: \"68e071be-fad9-4996-a83f-cd58058fe0f3\") " pod="cert-manager/cert-manager-webhook-687f57d79b-7cft4" Mar 09 09:19:46 crc kubenswrapper[4792]: I0309 09:19:46.186891 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mgx7n\" (UniqueName: \"kubernetes.io/projected/a6ad9459-0185-47b2-aebd-5a5a40554946-kube-api-access-mgx7n\") pod \"cert-manager-858654f9db-72tj7\" (UID: \"a6ad9459-0185-47b2-aebd-5a5a40554946\") " pod="cert-manager/cert-manager-858654f9db-72tj7" Mar 09 09:19:46 crc kubenswrapper[4792]: I0309 09:19:46.196551 4792 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-rpghn\" (UniqueName: \"kubernetes.io/projected/8c00ef29-8d91-4045-982c-8b4a6e98576b-kube-api-access-rpghn\") pod \"cert-manager-cainjector-cf98fcc89-xfgbm\" (UID: \"8c00ef29-8d91-4045-982c-8b4a6e98576b\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-xfgbm" Mar 09 09:19:46 crc kubenswrapper[4792]: I0309 09:19:46.225437 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-xfgbm" Mar 09 09:19:46 crc kubenswrapper[4792]: I0309 09:19:46.237165 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-72tj7" Mar 09 09:19:46 crc kubenswrapper[4792]: I0309 09:19:46.281855 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-7cft4" Mar 09 09:19:46 crc kubenswrapper[4792]: I0309 09:19:46.457548 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-72tj7"] Mar 09 09:19:46 crc kubenswrapper[4792]: I0309 09:19:46.530999 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-xfgbm"] Mar 09 09:19:46 crc kubenswrapper[4792]: W0309 09:19:46.536261 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8c00ef29_8d91_4045_982c_8b4a6e98576b.slice/crio-6dc9e9c7ca887a4f442a588fdeae9904b72a92a0ea691d5a05b80d8ad69cbd6f WatchSource:0}: Error finding container 6dc9e9c7ca887a4f442a588fdeae9904b72a92a0ea691d5a05b80d8ad69cbd6f: Status 404 returned error can't find the container with id 6dc9e9c7ca887a4f442a588fdeae9904b72a92a0ea691d5a05b80d8ad69cbd6f Mar 09 09:19:46 crc kubenswrapper[4792]: I0309 09:19:46.628278 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-xfgbm" 
event={"ID":"8c00ef29-8d91-4045-982c-8b4a6e98576b","Type":"ContainerStarted","Data":"6dc9e9c7ca887a4f442a588fdeae9904b72a92a0ea691d5a05b80d8ad69cbd6f"} Mar 09 09:19:46 crc kubenswrapper[4792]: I0309 09:19:46.629475 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-72tj7" event={"ID":"a6ad9459-0185-47b2-aebd-5a5a40554946","Type":"ContainerStarted","Data":"2faa47ffc093c1a83ca09a8d7c888cd438e4a51c37ce76c0453c6d413b33f956"} Mar 09 09:19:46 crc kubenswrapper[4792]: I0309 09:19:46.756666 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-7cft4"] Mar 09 09:19:46 crc kubenswrapper[4792]: W0309 09:19:46.761913 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod68e071be_fad9_4996_a83f_cd58058fe0f3.slice/crio-f0429149d35a03b6fc0f94389ab0ce403146a8181236f8117f79b8380598fd70 WatchSource:0}: Error finding container f0429149d35a03b6fc0f94389ab0ce403146a8181236f8117f79b8380598fd70: Status 404 returned error can't find the container with id f0429149d35a03b6fc0f94389ab0ce403146a8181236f8117f79b8380598fd70 Mar 09 09:19:47 crc kubenswrapper[4792]: I0309 09:19:47.638999 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-7cft4" event={"ID":"68e071be-fad9-4996-a83f-cd58058fe0f3","Type":"ContainerStarted","Data":"f0429149d35a03b6fc0f94389ab0ce403146a8181236f8117f79b8380598fd70"} Mar 09 09:19:50 crc kubenswrapper[4792]: I0309 09:19:50.673912 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-7cft4" event={"ID":"68e071be-fad9-4996-a83f-cd58058fe0f3","Type":"ContainerStarted","Data":"b0c47fa58189485b7ed535f7657ad96588c6ed75d7ba5ebae777ebafd74646fb"} Mar 09 09:19:50 crc kubenswrapper[4792]: I0309 09:19:50.674053 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="cert-manager/cert-manager-webhook-687f57d79b-7cft4" Mar 09 09:19:50 crc kubenswrapper[4792]: I0309 09:19:50.675677 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-xfgbm" event={"ID":"8c00ef29-8d91-4045-982c-8b4a6e98576b","Type":"ContainerStarted","Data":"581b693a717604a4d4ec57a18d977e683d5c21f09b8061d5e7a1cfa4a5b6acc9"} Mar 09 09:19:50 crc kubenswrapper[4792]: I0309 09:19:50.677789 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-72tj7" event={"ID":"a6ad9459-0185-47b2-aebd-5a5a40554946","Type":"ContainerStarted","Data":"eeaaa84cd2d7b3e756133b10d43bfe0139b85a01ca095e2c6c4f4b82a63f41da"} Mar 09 09:19:50 crc kubenswrapper[4792]: I0309 09:19:50.717268 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-858654f9db-72tj7" podStartSLOduration=2.232900419 podStartE2EDuration="5.717249839s" podCreationTimestamp="2026-03-09 09:19:45 +0000 UTC" firstStartedPulling="2026-03-09 09:19:46.47021843 +0000 UTC m=+751.500419172" lastFinishedPulling="2026-03-09 09:19:49.95456784 +0000 UTC m=+754.984768592" observedRunningTime="2026-03-09 09:19:50.713266061 +0000 UTC m=+755.743466853" watchObservedRunningTime="2026-03-09 09:19:50.717249839 +0000 UTC m=+755.747450591" Mar 09 09:19:50 crc kubenswrapper[4792]: I0309 09:19:50.717624 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-687f57d79b-7cft4" podStartSLOduration=2.440767537 podStartE2EDuration="5.717619319s" podCreationTimestamp="2026-03-09 09:19:45 +0000 UTC" firstStartedPulling="2026-03-09 09:19:46.764374626 +0000 UTC m=+751.794575368" lastFinishedPulling="2026-03-09 09:19:50.041226398 +0000 UTC m=+755.071427150" observedRunningTime="2026-03-09 09:19:50.696519304 +0000 UTC m=+755.726720056" watchObservedRunningTime="2026-03-09 09:19:50.717619319 +0000 UTC m=+755.747820071" Mar 09 09:19:50 crc 
kubenswrapper[4792]: I0309 09:19:50.752566 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-cf98fcc89-xfgbm" podStartSLOduration=2.451482714 podStartE2EDuration="5.752547794s" podCreationTimestamp="2026-03-09 09:19:45 +0000 UTC" firstStartedPulling="2026-03-09 09:19:46.538976077 +0000 UTC m=+751.569176829" lastFinishedPulling="2026-03-09 09:19:49.840041157 +0000 UTC m=+754.870241909" observedRunningTime="2026-03-09 09:19:50.750208385 +0000 UTC m=+755.780409157" watchObservedRunningTime="2026-03-09 09:19:50.752547794 +0000 UTC m=+755.782748546" Mar 09 09:19:55 crc kubenswrapper[4792]: I0309 09:19:55.845238 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-lfm2j"] Mar 09 09:19:55 crc kubenswrapper[4792]: I0309 09:19:55.845994 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-lfm2j" podUID="740550e5-d1a4-4f0c-8efd-1ccd8f9319e5" containerName="ovn-controller" containerID="cri-o://5fc091b21251a54d9eca892667bd681e944b35b6407316a8252562e837a1e265" gracePeriod=30 Mar 09 09:19:55 crc kubenswrapper[4792]: I0309 09:19:55.846284 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-lfm2j" podUID="740550e5-d1a4-4f0c-8efd-1ccd8f9319e5" containerName="northd" containerID="cri-o://8f72b0194cacf6d5d0c95ba804286d822d2f2e5a0f385c4c5fdf8559bf6240c6" gracePeriod=30 Mar 09 09:19:55 crc kubenswrapper[4792]: I0309 09:19:55.846374 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-lfm2j" podUID="740550e5-d1a4-4f0c-8efd-1ccd8f9319e5" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://9e0f50edd29f0791cc076a3a2974b7456aaf2a96534da791b083248dc84fa6af" gracePeriod=30 Mar 09 09:19:55 crc kubenswrapper[4792]: I0309 09:19:55.846414 4792 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-lfm2j" podUID="740550e5-d1a4-4f0c-8efd-1ccd8f9319e5" containerName="sbdb" containerID="cri-o://72a93106360fd23f597cf8fb963aca72a606f82557a5b17125a969e5b5d5918f" gracePeriod=30 Mar 09 09:19:55 crc kubenswrapper[4792]: I0309 09:19:55.846431 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-lfm2j" podUID="740550e5-d1a4-4f0c-8efd-1ccd8f9319e5" containerName="kube-rbac-proxy-node" containerID="cri-o://d93911614f6785ac12751349f50ab00c0716955c72dc48866083013e172cf3c9" gracePeriod=30 Mar 09 09:19:55 crc kubenswrapper[4792]: I0309 09:19:55.846471 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-lfm2j" podUID="740550e5-d1a4-4f0c-8efd-1ccd8f9319e5" containerName="nbdb" containerID="cri-o://4c80e0ef9426b7a764e7117a789d07cf4cf940a90f38fe3ce6b230f9bbd21bfc" gracePeriod=30 Mar 09 09:19:55 crc kubenswrapper[4792]: I0309 09:19:55.846484 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-lfm2j" podUID="740550e5-d1a4-4f0c-8efd-1ccd8f9319e5" containerName="ovn-acl-logging" containerID="cri-o://edf00db622558a346d238b2df6e90a686dc913634a1b5b4e8b010b5bf09a7290" gracePeriod=30 Mar 09 09:19:55 crc kubenswrapper[4792]: I0309 09:19:55.883732 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-lfm2j" podUID="740550e5-d1a4-4f0c-8efd-1ccd8f9319e5" containerName="ovnkube-controller" containerID="cri-o://1a6a0b78db67ad47051e996cb6c2c0467bd3c4f3682498fa3ba5e6224b6319bc" gracePeriod=30 Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.186723 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lfm2j_740550e5-d1a4-4f0c-8efd-1ccd8f9319e5/ovnkube-controller/3.log" Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 
09:19:56.189692 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lfm2j_740550e5-d1a4-4f0c-8efd-1ccd8f9319e5/ovn-acl-logging/0.log" Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.190194 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lfm2j_740550e5-d1a4-4f0c-8efd-1ccd8f9319e5/ovn-controller/0.log" Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.190757 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-lfm2j" Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.264933 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-mbdpz"] Mar 09 09:19:56 crc kubenswrapper[4792]: E0309 09:19:56.265163 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="740550e5-d1a4-4f0c-8efd-1ccd8f9319e5" containerName="northd" Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.265178 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="740550e5-d1a4-4f0c-8efd-1ccd8f9319e5" containerName="northd" Mar 09 09:19:56 crc kubenswrapper[4792]: E0309 09:19:56.265187 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="740550e5-d1a4-4f0c-8efd-1ccd8f9319e5" containerName="ovnkube-controller" Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.265194 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="740550e5-d1a4-4f0c-8efd-1ccd8f9319e5" containerName="ovnkube-controller" Mar 09 09:19:56 crc kubenswrapper[4792]: E0309 09:19:56.265203 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="740550e5-d1a4-4f0c-8efd-1ccd8f9319e5" containerName="ovn-controller" Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.265210 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="740550e5-d1a4-4f0c-8efd-1ccd8f9319e5" containerName="ovn-controller" Mar 09 09:19:56 crc kubenswrapper[4792]: E0309 
09:19:56.265218 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="740550e5-d1a4-4f0c-8efd-1ccd8f9319e5" containerName="ovn-acl-logging" Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.265225 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="740550e5-d1a4-4f0c-8efd-1ccd8f9319e5" containerName="ovn-acl-logging" Mar 09 09:19:56 crc kubenswrapper[4792]: E0309 09:19:56.265234 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="740550e5-d1a4-4f0c-8efd-1ccd8f9319e5" containerName="sbdb" Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.265240 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="740550e5-d1a4-4f0c-8efd-1ccd8f9319e5" containerName="sbdb" Mar 09 09:19:56 crc kubenswrapper[4792]: E0309 09:19:56.265247 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="740550e5-d1a4-4f0c-8efd-1ccd8f9319e5" containerName="kube-rbac-proxy-ovn-metrics" Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.265252 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="740550e5-d1a4-4f0c-8efd-1ccd8f9319e5" containerName="kube-rbac-proxy-ovn-metrics" Mar 09 09:19:56 crc kubenswrapper[4792]: E0309 09:19:56.265261 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="740550e5-d1a4-4f0c-8efd-1ccd8f9319e5" containerName="ovnkube-controller" Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.265267 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="740550e5-d1a4-4f0c-8efd-1ccd8f9319e5" containerName="ovnkube-controller" Mar 09 09:19:56 crc kubenswrapper[4792]: E0309 09:19:56.265274 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="740550e5-d1a4-4f0c-8efd-1ccd8f9319e5" containerName="ovnkube-controller" Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.265280 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="740550e5-d1a4-4f0c-8efd-1ccd8f9319e5" containerName="ovnkube-controller" Mar 09 09:19:56 crc kubenswrapper[4792]: E0309 
09:19:56.265287 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="740550e5-d1a4-4f0c-8efd-1ccd8f9319e5" containerName="nbdb" Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.265293 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="740550e5-d1a4-4f0c-8efd-1ccd8f9319e5" containerName="nbdb" Mar 09 09:19:56 crc kubenswrapper[4792]: E0309 09:19:56.265302 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="740550e5-d1a4-4f0c-8efd-1ccd8f9319e5" containerName="ovnkube-controller" Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.265307 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="740550e5-d1a4-4f0c-8efd-1ccd8f9319e5" containerName="ovnkube-controller" Mar 09 09:19:56 crc kubenswrapper[4792]: E0309 09:19:56.265314 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="740550e5-d1a4-4f0c-8efd-1ccd8f9319e5" containerName="kubecfg-setup" Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.265320 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="740550e5-d1a4-4f0c-8efd-1ccd8f9319e5" containerName="kubecfg-setup" Mar 09 09:19:56 crc kubenswrapper[4792]: E0309 09:19:56.265336 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="740550e5-d1a4-4f0c-8efd-1ccd8f9319e5" containerName="kube-rbac-proxy-node" Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.265342 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="740550e5-d1a4-4f0c-8efd-1ccd8f9319e5" containerName="kube-rbac-proxy-node" Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.265432 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="740550e5-d1a4-4f0c-8efd-1ccd8f9319e5" containerName="ovnkube-controller" Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.265440 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="740550e5-d1a4-4f0c-8efd-1ccd8f9319e5" containerName="ovnkube-controller" Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 
09:19:56.265447 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="740550e5-d1a4-4f0c-8efd-1ccd8f9319e5" containerName="nbdb" Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.265455 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="740550e5-d1a4-4f0c-8efd-1ccd8f9319e5" containerName="ovn-controller" Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.265465 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="740550e5-d1a4-4f0c-8efd-1ccd8f9319e5" containerName="kube-rbac-proxy-ovn-metrics" Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.265472 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="740550e5-d1a4-4f0c-8efd-1ccd8f9319e5" containerName="ovn-acl-logging" Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.265480 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="740550e5-d1a4-4f0c-8efd-1ccd8f9319e5" containerName="ovnkube-controller" Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.265487 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="740550e5-d1a4-4f0c-8efd-1ccd8f9319e5" containerName="northd" Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.265496 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="740550e5-d1a4-4f0c-8efd-1ccd8f9319e5" containerName="sbdb" Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.265505 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="740550e5-d1a4-4f0c-8efd-1ccd8f9319e5" containerName="kube-rbac-proxy-node" Mar 09 09:19:56 crc kubenswrapper[4792]: E0309 09:19:56.265586 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="740550e5-d1a4-4f0c-8efd-1ccd8f9319e5" containerName="ovnkube-controller" Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.265595 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="740550e5-d1a4-4f0c-8efd-1ccd8f9319e5" containerName="ovnkube-controller" Mar 09 09:19:56 crc 
kubenswrapper[4792]: I0309 09:19:56.265679 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="740550e5-d1a4-4f0c-8efd-1ccd8f9319e5" containerName="ovnkube-controller" Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.265838 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="740550e5-d1a4-4f0c-8efd-1ccd8f9319e5" containerName="ovnkube-controller" Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.267160 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-mbdpz" Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.302284 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-687f57d79b-7cft4" Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.310588 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/740550e5-d1a4-4f0c-8efd-1ccd8f9319e5-run-systemd\") pod \"740550e5-d1a4-4f0c-8efd-1ccd8f9319e5\" (UID: \"740550e5-d1a4-4f0c-8efd-1ccd8f9319e5\") " Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.310632 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gmxpq\" (UniqueName: \"kubernetes.io/projected/740550e5-d1a4-4f0c-8efd-1ccd8f9319e5-kube-api-access-gmxpq\") pod \"740550e5-d1a4-4f0c-8efd-1ccd8f9319e5\" (UID: \"740550e5-d1a4-4f0c-8efd-1ccd8f9319e5\") " Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.310667 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/740550e5-d1a4-4f0c-8efd-1ccd8f9319e5-log-socket\") pod \"740550e5-d1a4-4f0c-8efd-1ccd8f9319e5\" (UID: \"740550e5-d1a4-4f0c-8efd-1ccd8f9319e5\") " Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.310688 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"systemd-units\" (UniqueName: \"kubernetes.io/host-path/740550e5-d1a4-4f0c-8efd-1ccd8f9319e5-systemd-units\") pod \"740550e5-d1a4-4f0c-8efd-1ccd8f9319e5\" (UID: \"740550e5-d1a4-4f0c-8efd-1ccd8f9319e5\") " Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.310714 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/740550e5-d1a4-4f0c-8efd-1ccd8f9319e5-env-overrides\") pod \"740550e5-d1a4-4f0c-8efd-1ccd8f9319e5\" (UID: \"740550e5-d1a4-4f0c-8efd-1ccd8f9319e5\") " Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.310730 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/740550e5-d1a4-4f0c-8efd-1ccd8f9319e5-run-openvswitch\") pod \"740550e5-d1a4-4f0c-8efd-1ccd8f9319e5\" (UID: \"740550e5-d1a4-4f0c-8efd-1ccd8f9319e5\") " Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.310745 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/740550e5-d1a4-4f0c-8efd-1ccd8f9319e5-host-kubelet\") pod \"740550e5-d1a4-4f0c-8efd-1ccd8f9319e5\" (UID: \"740550e5-d1a4-4f0c-8efd-1ccd8f9319e5\") " Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.310759 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/740550e5-d1a4-4f0c-8efd-1ccd8f9319e5-host-run-netns\") pod \"740550e5-d1a4-4f0c-8efd-1ccd8f9319e5\" (UID: \"740550e5-d1a4-4f0c-8efd-1ccd8f9319e5\") " Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.310774 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/740550e5-d1a4-4f0c-8efd-1ccd8f9319e5-host-cni-netd\") pod \"740550e5-d1a4-4f0c-8efd-1ccd8f9319e5\" (UID: \"740550e5-d1a4-4f0c-8efd-1ccd8f9319e5\") " Mar 09 09:19:56 crc 
kubenswrapper[4792]: I0309 09:19:56.310787 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/740550e5-d1a4-4f0c-8efd-1ccd8f9319e5-var-lib-openvswitch\") pod \"740550e5-d1a4-4f0c-8efd-1ccd8f9319e5\" (UID: \"740550e5-d1a4-4f0c-8efd-1ccd8f9319e5\") " Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.310830 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/740550e5-d1a4-4f0c-8efd-1ccd8f9319e5-node-log\") pod \"740550e5-d1a4-4f0c-8efd-1ccd8f9319e5\" (UID: \"740550e5-d1a4-4f0c-8efd-1ccd8f9319e5\") " Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.310852 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/740550e5-d1a4-4f0c-8efd-1ccd8f9319e5-ovnkube-config\") pod \"740550e5-d1a4-4f0c-8efd-1ccd8f9319e5\" (UID: \"740550e5-d1a4-4f0c-8efd-1ccd8f9319e5\") " Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.310871 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/740550e5-d1a4-4f0c-8efd-1ccd8f9319e5-ovn-node-metrics-cert\") pod \"740550e5-d1a4-4f0c-8efd-1ccd8f9319e5\" (UID: \"740550e5-d1a4-4f0c-8efd-1ccd8f9319e5\") " Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.310888 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/740550e5-d1a4-4f0c-8efd-1ccd8f9319e5-run-ovn\") pod \"740550e5-d1a4-4f0c-8efd-1ccd8f9319e5\" (UID: \"740550e5-d1a4-4f0c-8efd-1ccd8f9319e5\") " Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.310913 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: 
\"kubernetes.io/configmap/740550e5-d1a4-4f0c-8efd-1ccd8f9319e5-ovnkube-script-lib\") pod \"740550e5-d1a4-4f0c-8efd-1ccd8f9319e5\" (UID: \"740550e5-d1a4-4f0c-8efd-1ccd8f9319e5\") " Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.310934 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/740550e5-d1a4-4f0c-8efd-1ccd8f9319e5-etc-openvswitch\") pod \"740550e5-d1a4-4f0c-8efd-1ccd8f9319e5\" (UID: \"740550e5-d1a4-4f0c-8efd-1ccd8f9319e5\") " Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.310946 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/740550e5-d1a4-4f0c-8efd-1ccd8f9319e5-host-cni-bin\") pod \"740550e5-d1a4-4f0c-8efd-1ccd8f9319e5\" (UID: \"740550e5-d1a4-4f0c-8efd-1ccd8f9319e5\") " Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.310964 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/740550e5-d1a4-4f0c-8efd-1ccd8f9319e5-host-var-lib-cni-networks-ovn-kubernetes\") pod \"740550e5-d1a4-4f0c-8efd-1ccd8f9319e5\" (UID: \"740550e5-d1a4-4f0c-8efd-1ccd8f9319e5\") " Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.310984 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/740550e5-d1a4-4f0c-8efd-1ccd8f9319e5-host-slash\") pod \"740550e5-d1a4-4f0c-8efd-1ccd8f9319e5\" (UID: \"740550e5-d1a4-4f0c-8efd-1ccd8f9319e5\") " Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.310999 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/740550e5-d1a4-4f0c-8efd-1ccd8f9319e5-host-run-ovn-kubernetes\") pod \"740550e5-d1a4-4f0c-8efd-1ccd8f9319e5\" (UID: 
\"740550e5-d1a4-4f0c-8efd-1ccd8f9319e5\") " Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.311237 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/740550e5-d1a4-4f0c-8efd-1ccd8f9319e5-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "740550e5-d1a4-4f0c-8efd-1ccd8f9319e5" (UID: "740550e5-d1a4-4f0c-8efd-1ccd8f9319e5"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.311981 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/740550e5-d1a4-4f0c-8efd-1ccd8f9319e5-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "740550e5-d1a4-4f0c-8efd-1ccd8f9319e5" (UID: "740550e5-d1a4-4f0c-8efd-1ccd8f9319e5"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.312008 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/740550e5-d1a4-4f0c-8efd-1ccd8f9319e5-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "740550e5-d1a4-4f0c-8efd-1ccd8f9319e5" (UID: "740550e5-d1a4-4f0c-8efd-1ccd8f9319e5"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.312027 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/740550e5-d1a4-4f0c-8efd-1ccd8f9319e5-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "740550e5-d1a4-4f0c-8efd-1ccd8f9319e5" (UID: "740550e5-d1a4-4f0c-8efd-1ccd8f9319e5"). InnerVolumeSpecName "host-kubelet". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.312089 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/740550e5-d1a4-4f0c-8efd-1ccd8f9319e5-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "740550e5-d1a4-4f0c-8efd-1ccd8f9319e5" (UID: "740550e5-d1a4-4f0c-8efd-1ccd8f9319e5"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.312453 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/740550e5-d1a4-4f0c-8efd-1ccd8f9319e5-log-socket" (OuterVolumeSpecName: "log-socket") pod "740550e5-d1a4-4f0c-8efd-1ccd8f9319e5" (UID: "740550e5-d1a4-4f0c-8efd-1ccd8f9319e5"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.312496 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/740550e5-d1a4-4f0c-8efd-1ccd8f9319e5-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "740550e5-d1a4-4f0c-8efd-1ccd8f9319e5" (UID: "740550e5-d1a4-4f0c-8efd-1ccd8f9319e5"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.312526 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/740550e5-d1a4-4f0c-8efd-1ccd8f9319e5-node-log" (OuterVolumeSpecName: "node-log") pod "740550e5-d1a4-4f0c-8efd-1ccd8f9319e5" (UID: "740550e5-d1a4-4f0c-8efd-1ccd8f9319e5"). InnerVolumeSpecName "node-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.312500 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/740550e5-d1a4-4f0c-8efd-1ccd8f9319e5-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "740550e5-d1a4-4f0c-8efd-1ccd8f9319e5" (UID: "740550e5-d1a4-4f0c-8efd-1ccd8f9319e5"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.312549 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/740550e5-d1a4-4f0c-8efd-1ccd8f9319e5-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "740550e5-d1a4-4f0c-8efd-1ccd8f9319e5" (UID: "740550e5-d1a4-4f0c-8efd-1ccd8f9319e5"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.312565 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/740550e5-d1a4-4f0c-8efd-1ccd8f9319e5-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "740550e5-d1a4-4f0c-8efd-1ccd8f9319e5" (UID: "740550e5-d1a4-4f0c-8efd-1ccd8f9319e5"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.312557 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/740550e5-d1a4-4f0c-8efd-1ccd8f9319e5-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "740550e5-d1a4-4f0c-8efd-1ccd8f9319e5" (UID: "740550e5-d1a4-4f0c-8efd-1ccd8f9319e5"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.312588 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/740550e5-d1a4-4f0c-8efd-1ccd8f9319e5-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "740550e5-d1a4-4f0c-8efd-1ccd8f9319e5" (UID: "740550e5-d1a4-4f0c-8efd-1ccd8f9319e5"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.312588 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/740550e5-d1a4-4f0c-8efd-1ccd8f9319e5-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "740550e5-d1a4-4f0c-8efd-1ccd8f9319e5" (UID: "740550e5-d1a4-4f0c-8efd-1ccd8f9319e5"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.312574 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/740550e5-d1a4-4f0c-8efd-1ccd8f9319e5-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "740550e5-d1a4-4f0c-8efd-1ccd8f9319e5" (UID: "740550e5-d1a4-4f0c-8efd-1ccd8f9319e5"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.312610 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/740550e5-d1a4-4f0c-8efd-1ccd8f9319e5-host-slash" (OuterVolumeSpecName: "host-slash") pod "740550e5-d1a4-4f0c-8efd-1ccd8f9319e5" (UID: "740550e5-d1a4-4f0c-8efd-1ccd8f9319e5"). InnerVolumeSpecName "host-slash". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.312693 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/740550e5-d1a4-4f0c-8efd-1ccd8f9319e5-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "740550e5-d1a4-4f0c-8efd-1ccd8f9319e5" (UID: "740550e5-d1a4-4f0c-8efd-1ccd8f9319e5"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.323635 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/740550e5-d1a4-4f0c-8efd-1ccd8f9319e5-kube-api-access-gmxpq" (OuterVolumeSpecName: "kube-api-access-gmxpq") pod "740550e5-d1a4-4f0c-8efd-1ccd8f9319e5" (UID: "740550e5-d1a4-4f0c-8efd-1ccd8f9319e5"). InnerVolumeSpecName "kube-api-access-gmxpq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.327492 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/740550e5-d1a4-4f0c-8efd-1ccd8f9319e5-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "740550e5-d1a4-4f0c-8efd-1ccd8f9319e5" (UID: "740550e5-d1a4-4f0c-8efd-1ccd8f9319e5"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.333536 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/740550e5-d1a4-4f0c-8efd-1ccd8f9319e5-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "740550e5-d1a4-4f0c-8efd-1ccd8f9319e5" (UID: "740550e5-d1a4-4f0c-8efd-1ccd8f9319e5"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.411869 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/49fb6cf3-9480-488f-90c5-07970635e9e1-run-systemd\") pod \"ovnkube-node-mbdpz\" (UID: \"49fb6cf3-9480-488f-90c5-07970635e9e1\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbdpz" Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.411928 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/49fb6cf3-9480-488f-90c5-07970635e9e1-run-ovn\") pod \"ovnkube-node-mbdpz\" (UID: \"49fb6cf3-9480-488f-90c5-07970635e9e1\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbdpz" Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.411985 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/49fb6cf3-9480-488f-90c5-07970635e9e1-etc-openvswitch\") pod \"ovnkube-node-mbdpz\" (UID: \"49fb6cf3-9480-488f-90c5-07970635e9e1\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbdpz" Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.412007 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/49fb6cf3-9480-488f-90c5-07970635e9e1-ovnkube-script-lib\") pod \"ovnkube-node-mbdpz\" (UID: \"49fb6cf3-9480-488f-90c5-07970635e9e1\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbdpz" Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.412024 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/49fb6cf3-9480-488f-90c5-07970635e9e1-node-log\") pod \"ovnkube-node-mbdpz\" (UID: \"49fb6cf3-9480-488f-90c5-07970635e9e1\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-mbdpz" Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.412100 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/49fb6cf3-9480-488f-90c5-07970635e9e1-host-cni-bin\") pod \"ovnkube-node-mbdpz\" (UID: \"49fb6cf3-9480-488f-90c5-07970635e9e1\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbdpz" Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.412117 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/49fb6cf3-9480-488f-90c5-07970635e9e1-host-slash\") pod \"ovnkube-node-mbdpz\" (UID: \"49fb6cf3-9480-488f-90c5-07970635e9e1\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbdpz" Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.412149 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/49fb6cf3-9480-488f-90c5-07970635e9e1-log-socket\") pod \"ovnkube-node-mbdpz\" (UID: \"49fb6cf3-9480-488f-90c5-07970635e9e1\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbdpz" Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.412162 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/49fb6cf3-9480-488f-90c5-07970635e9e1-host-run-ovn-kubernetes\") pod \"ovnkube-node-mbdpz\" (UID: \"49fb6cf3-9480-488f-90c5-07970635e9e1\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbdpz" Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.412180 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/49fb6cf3-9480-488f-90c5-07970635e9e1-env-overrides\") pod \"ovnkube-node-mbdpz\" (UID: 
\"49fb6cf3-9480-488f-90c5-07970635e9e1\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbdpz" Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.412222 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/49fb6cf3-9480-488f-90c5-07970635e9e1-host-kubelet\") pod \"ovnkube-node-mbdpz\" (UID: \"49fb6cf3-9480-488f-90c5-07970635e9e1\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbdpz" Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.412246 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/49fb6cf3-9480-488f-90c5-07970635e9e1-run-openvswitch\") pod \"ovnkube-node-mbdpz\" (UID: \"49fb6cf3-9480-488f-90c5-07970635e9e1\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbdpz" Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.412287 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/49fb6cf3-9480-488f-90c5-07970635e9e1-ovnkube-config\") pod \"ovnkube-node-mbdpz\" (UID: \"49fb6cf3-9480-488f-90c5-07970635e9e1\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbdpz" Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.412303 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zc547\" (UniqueName: \"kubernetes.io/projected/49fb6cf3-9480-488f-90c5-07970635e9e1-kube-api-access-zc547\") pod \"ovnkube-node-mbdpz\" (UID: \"49fb6cf3-9480-488f-90c5-07970635e9e1\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbdpz" Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.412348 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/49fb6cf3-9480-488f-90c5-07970635e9e1-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-mbdpz\" (UID: \"49fb6cf3-9480-488f-90c5-07970635e9e1\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbdpz" Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.412405 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/49fb6cf3-9480-488f-90c5-07970635e9e1-ovn-node-metrics-cert\") pod \"ovnkube-node-mbdpz\" (UID: \"49fb6cf3-9480-488f-90c5-07970635e9e1\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbdpz" Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.412494 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/49fb6cf3-9480-488f-90c5-07970635e9e1-host-run-netns\") pod \"ovnkube-node-mbdpz\" (UID: \"49fb6cf3-9480-488f-90c5-07970635e9e1\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbdpz" Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.412528 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/49fb6cf3-9480-488f-90c5-07970635e9e1-var-lib-openvswitch\") pod \"ovnkube-node-mbdpz\" (UID: \"49fb6cf3-9480-488f-90c5-07970635e9e1\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbdpz" Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.412562 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/49fb6cf3-9480-488f-90c5-07970635e9e1-systemd-units\") pod \"ovnkube-node-mbdpz\" (UID: \"49fb6cf3-9480-488f-90c5-07970635e9e1\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbdpz" Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.412596 4792 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/49fb6cf3-9480-488f-90c5-07970635e9e1-host-cni-netd\") pod \"ovnkube-node-mbdpz\" (UID: \"49fb6cf3-9480-488f-90c5-07970635e9e1\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbdpz" Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.412698 4792 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/740550e5-d1a4-4f0c-8efd-1ccd8f9319e5-run-systemd\") on node \"crc\" DevicePath \"\"" Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.412713 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gmxpq\" (UniqueName: \"kubernetes.io/projected/740550e5-d1a4-4f0c-8efd-1ccd8f9319e5-kube-api-access-gmxpq\") on node \"crc\" DevicePath \"\"" Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.412724 4792 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/740550e5-d1a4-4f0c-8efd-1ccd8f9319e5-log-socket\") on node \"crc\" DevicePath \"\"" Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.412733 4792 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/740550e5-d1a4-4f0c-8efd-1ccd8f9319e5-systemd-units\") on node \"crc\" DevicePath \"\"" Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.412742 4792 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/740550e5-d1a4-4f0c-8efd-1ccd8f9319e5-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.412750 4792 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/740550e5-d1a4-4f0c-8efd-1ccd8f9319e5-run-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.412758 4792 
reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/740550e5-d1a4-4f0c-8efd-1ccd8f9319e5-host-kubelet\") on node \"crc\" DevicePath \"\"" Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.412765 4792 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/740550e5-d1a4-4f0c-8efd-1ccd8f9319e5-host-run-netns\") on node \"crc\" DevicePath \"\"" Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.412774 4792 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/740550e5-d1a4-4f0c-8efd-1ccd8f9319e5-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.412781 4792 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/740550e5-d1a4-4f0c-8efd-1ccd8f9319e5-host-cni-netd\") on node \"crc\" DevicePath \"\"" Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.412790 4792 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/740550e5-d1a4-4f0c-8efd-1ccd8f9319e5-node-log\") on node \"crc\" DevicePath \"\"" Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.412800 4792 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/740550e5-d1a4-4f0c-8efd-1ccd8f9319e5-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.412808 4792 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/740550e5-d1a4-4f0c-8efd-1ccd8f9319e5-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.412817 4792 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/740550e5-d1a4-4f0c-8efd-1ccd8f9319e5-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.412825 4792 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/740550e5-d1a4-4f0c-8efd-1ccd8f9319e5-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.412833 4792 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/740550e5-d1a4-4f0c-8efd-1ccd8f9319e5-host-cni-bin\") on node \"crc\" DevicePath \"\"" Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.412842 4792 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/740550e5-d1a4-4f0c-8efd-1ccd8f9319e5-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.412850 4792 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/740550e5-d1a4-4f0c-8efd-1ccd8f9319e5-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.412859 4792 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/740550e5-d1a4-4f0c-8efd-1ccd8f9319e5-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.412867 4792 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/740550e5-d1a4-4f0c-8efd-1ccd8f9319e5-host-slash\") on node \"crc\" DevicePath \"\"" Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.513492 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: 
\"kubernetes.io/host-path/49fb6cf3-9480-488f-90c5-07970635e9e1-systemd-units\") pod \"ovnkube-node-mbdpz\" (UID: \"49fb6cf3-9480-488f-90c5-07970635e9e1\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbdpz" Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.513538 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/49fb6cf3-9480-488f-90c5-07970635e9e1-host-cni-netd\") pod \"ovnkube-node-mbdpz\" (UID: \"49fb6cf3-9480-488f-90c5-07970635e9e1\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbdpz" Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.513604 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/49fb6cf3-9480-488f-90c5-07970635e9e1-systemd-units\") pod \"ovnkube-node-mbdpz\" (UID: \"49fb6cf3-9480-488f-90c5-07970635e9e1\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbdpz" Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.513620 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/49fb6cf3-9480-488f-90c5-07970635e9e1-run-systemd\") pod \"ovnkube-node-mbdpz\" (UID: \"49fb6cf3-9480-488f-90c5-07970635e9e1\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbdpz" Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.513604 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/49fb6cf3-9480-488f-90c5-07970635e9e1-host-cni-netd\") pod \"ovnkube-node-mbdpz\" (UID: \"49fb6cf3-9480-488f-90c5-07970635e9e1\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbdpz" Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.513659 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/49fb6cf3-9480-488f-90c5-07970635e9e1-run-ovn\") pod \"ovnkube-node-mbdpz\" (UID: 
\"49fb6cf3-9480-488f-90c5-07970635e9e1\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbdpz" Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.513683 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/49fb6cf3-9480-488f-90c5-07970635e9e1-run-systemd\") pod \"ovnkube-node-mbdpz\" (UID: \"49fb6cf3-9480-488f-90c5-07970635e9e1\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbdpz" Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.513694 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/49fb6cf3-9480-488f-90c5-07970635e9e1-ovnkube-script-lib\") pod \"ovnkube-node-mbdpz\" (UID: \"49fb6cf3-9480-488f-90c5-07970635e9e1\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbdpz" Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.513722 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/49fb6cf3-9480-488f-90c5-07970635e9e1-etc-openvswitch\") pod \"ovnkube-node-mbdpz\" (UID: \"49fb6cf3-9480-488f-90c5-07970635e9e1\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbdpz" Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.513740 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/49fb6cf3-9480-488f-90c5-07970635e9e1-node-log\") pod \"ovnkube-node-mbdpz\" (UID: \"49fb6cf3-9480-488f-90c5-07970635e9e1\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbdpz" Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.513759 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/49fb6cf3-9480-488f-90c5-07970635e9e1-run-ovn\") pod \"ovnkube-node-mbdpz\" (UID: \"49fb6cf3-9480-488f-90c5-07970635e9e1\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbdpz" Mar 09 09:19:56 crc 
kubenswrapper[4792]: I0309 09:19:56.513772 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/49fb6cf3-9480-488f-90c5-07970635e9e1-etc-openvswitch\") pod \"ovnkube-node-mbdpz\" (UID: \"49fb6cf3-9480-488f-90c5-07970635e9e1\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbdpz" Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.513779 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/49fb6cf3-9480-488f-90c5-07970635e9e1-host-cni-bin\") pod \"ovnkube-node-mbdpz\" (UID: \"49fb6cf3-9480-488f-90c5-07970635e9e1\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbdpz" Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.513761 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/49fb6cf3-9480-488f-90c5-07970635e9e1-host-cni-bin\") pod \"ovnkube-node-mbdpz\" (UID: \"49fb6cf3-9480-488f-90c5-07970635e9e1\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbdpz" Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.513813 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/49fb6cf3-9480-488f-90c5-07970635e9e1-host-slash\") pod \"ovnkube-node-mbdpz\" (UID: \"49fb6cf3-9480-488f-90c5-07970635e9e1\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbdpz" Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.513829 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/49fb6cf3-9480-488f-90c5-07970635e9e1-log-socket\") pod \"ovnkube-node-mbdpz\" (UID: \"49fb6cf3-9480-488f-90c5-07970635e9e1\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbdpz" Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.513843 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/49fb6cf3-9480-488f-90c5-07970635e9e1-host-run-ovn-kubernetes\") pod \"ovnkube-node-mbdpz\" (UID: \"49fb6cf3-9480-488f-90c5-07970635e9e1\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbdpz" Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.513859 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/49fb6cf3-9480-488f-90c5-07970635e9e1-env-overrides\") pod \"ovnkube-node-mbdpz\" (UID: \"49fb6cf3-9480-488f-90c5-07970635e9e1\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbdpz" Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.513873 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/49fb6cf3-9480-488f-90c5-07970635e9e1-host-kubelet\") pod \"ovnkube-node-mbdpz\" (UID: \"49fb6cf3-9480-488f-90c5-07970635e9e1\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbdpz" Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.513887 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/49fb6cf3-9480-488f-90c5-07970635e9e1-run-openvswitch\") pod \"ovnkube-node-mbdpz\" (UID: \"49fb6cf3-9480-488f-90c5-07970635e9e1\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbdpz" Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.513912 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/49fb6cf3-9480-488f-90c5-07970635e9e1-ovnkube-config\") pod \"ovnkube-node-mbdpz\" (UID: \"49fb6cf3-9480-488f-90c5-07970635e9e1\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbdpz" Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.513927 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zc547\" (UniqueName: 
\"kubernetes.io/projected/49fb6cf3-9480-488f-90c5-07970635e9e1-kube-api-access-zc547\") pod \"ovnkube-node-mbdpz\" (UID: \"49fb6cf3-9480-488f-90c5-07970635e9e1\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbdpz" Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.513942 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/49fb6cf3-9480-488f-90c5-07970635e9e1-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-mbdpz\" (UID: \"49fb6cf3-9480-488f-90c5-07970635e9e1\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbdpz" Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.513958 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/49fb6cf3-9480-488f-90c5-07970635e9e1-ovn-node-metrics-cert\") pod \"ovnkube-node-mbdpz\" (UID: \"49fb6cf3-9480-488f-90c5-07970635e9e1\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbdpz" Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.513974 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/49fb6cf3-9480-488f-90c5-07970635e9e1-host-run-netns\") pod \"ovnkube-node-mbdpz\" (UID: \"49fb6cf3-9480-488f-90c5-07970635e9e1\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbdpz" Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.513992 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/49fb6cf3-9480-488f-90c5-07970635e9e1-var-lib-openvswitch\") pod \"ovnkube-node-mbdpz\" (UID: \"49fb6cf3-9480-488f-90c5-07970635e9e1\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbdpz" Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.514040 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/49fb6cf3-9480-488f-90c5-07970635e9e1-var-lib-openvswitch\") pod \"ovnkube-node-mbdpz\" (UID: \"49fb6cf3-9480-488f-90c5-07970635e9e1\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbdpz" Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.513786 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/49fb6cf3-9480-488f-90c5-07970635e9e1-node-log\") pod \"ovnkube-node-mbdpz\" (UID: \"49fb6cf3-9480-488f-90c5-07970635e9e1\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbdpz" Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.514079 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/49fb6cf3-9480-488f-90c5-07970635e9e1-host-slash\") pod \"ovnkube-node-mbdpz\" (UID: \"49fb6cf3-9480-488f-90c5-07970635e9e1\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbdpz" Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.514099 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/49fb6cf3-9480-488f-90c5-07970635e9e1-log-socket\") pod \"ovnkube-node-mbdpz\" (UID: \"49fb6cf3-9480-488f-90c5-07970635e9e1\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbdpz" Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.514117 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/49fb6cf3-9480-488f-90c5-07970635e9e1-host-run-ovn-kubernetes\") pod \"ovnkube-node-mbdpz\" (UID: \"49fb6cf3-9480-488f-90c5-07970635e9e1\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbdpz" Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.514635 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/49fb6cf3-9480-488f-90c5-07970635e9e1-env-overrides\") pod \"ovnkube-node-mbdpz\" (UID: 
\"49fb6cf3-9480-488f-90c5-07970635e9e1\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbdpz" Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.514668 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/49fb6cf3-9480-488f-90c5-07970635e9e1-host-kubelet\") pod \"ovnkube-node-mbdpz\" (UID: \"49fb6cf3-9480-488f-90c5-07970635e9e1\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbdpz" Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.514706 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/49fb6cf3-9480-488f-90c5-07970635e9e1-run-openvswitch\") pod \"ovnkube-node-mbdpz\" (UID: \"49fb6cf3-9480-488f-90c5-07970635e9e1\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbdpz" Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.515019 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/49fb6cf3-9480-488f-90c5-07970635e9e1-ovnkube-script-lib\") pod \"ovnkube-node-mbdpz\" (UID: \"49fb6cf3-9480-488f-90c5-07970635e9e1\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbdpz" Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.515141 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/49fb6cf3-9480-488f-90c5-07970635e9e1-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-mbdpz\" (UID: \"49fb6cf3-9480-488f-90c5-07970635e9e1\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbdpz" Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.515207 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/49fb6cf3-9480-488f-90c5-07970635e9e1-host-run-netns\") pod \"ovnkube-node-mbdpz\" (UID: \"49fb6cf3-9480-488f-90c5-07970635e9e1\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-mbdpz" Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.515269 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/49fb6cf3-9480-488f-90c5-07970635e9e1-ovnkube-config\") pod \"ovnkube-node-mbdpz\" (UID: \"49fb6cf3-9480-488f-90c5-07970635e9e1\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbdpz" Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.517915 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/49fb6cf3-9480-488f-90c5-07970635e9e1-ovn-node-metrics-cert\") pod \"ovnkube-node-mbdpz\" (UID: \"49fb6cf3-9480-488f-90c5-07970635e9e1\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbdpz" Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.537975 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zc547\" (UniqueName: \"kubernetes.io/projected/49fb6cf3-9480-488f-90c5-07970635e9e1-kube-api-access-zc547\") pod \"ovnkube-node-mbdpz\" (UID: \"49fb6cf3-9480-488f-90c5-07970635e9e1\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbdpz" Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.582392 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-mbdpz" Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.714051 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-vgtc9_626ea896-2e5c-4478-a7be-34a19acc242d/kube-multus/1.log" Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.714830 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-vgtc9_626ea896-2e5c-4478-a7be-34a19acc242d/kube-multus/0.log" Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.714869 4792 generic.go:334] "Generic (PLEG): container finished" podID="626ea896-2e5c-4478-a7be-34a19acc242d" containerID="1062e61dca9fb971dffc9cd101c8b11ac94fc421dad88dc86f8e9df3fa2c93c4" exitCode=2 Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.714923 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-vgtc9" event={"ID":"626ea896-2e5c-4478-a7be-34a19acc242d","Type":"ContainerDied","Data":"1062e61dca9fb971dffc9cd101c8b11ac94fc421dad88dc86f8e9df3fa2c93c4"} Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.714956 4792 scope.go:117] "RemoveContainer" containerID="57f7829120f56a8ab9ff342c3d9fd043ca5559518f8d818c0306764f491f4b3a" Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.716304 4792 scope.go:117] "RemoveContainer" containerID="1062e61dca9fb971dffc9cd101c8b11ac94fc421dad88dc86f8e9df3fa2c93c4" Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.718393 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mbdpz" event={"ID":"49fb6cf3-9480-488f-90c5-07970635e9e1","Type":"ContainerStarted","Data":"0466776d05f8298166134ff778cf7550a6a5085b0d621614183cad717087499c"} Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.718449 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mbdpz" 
event={"ID":"49fb6cf3-9480-488f-90c5-07970635e9e1","Type":"ContainerStarted","Data":"338c11b6d6f83a652d84441148ef8cbb6617b1bb67838000d6505d63f5cbb680"} Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.721989 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lfm2j_740550e5-d1a4-4f0c-8efd-1ccd8f9319e5/ovnkube-controller/3.log" Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.735919 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lfm2j_740550e5-d1a4-4f0c-8efd-1ccd8f9319e5/ovn-acl-logging/0.log" Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.736591 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lfm2j_740550e5-d1a4-4f0c-8efd-1ccd8f9319e5/ovn-controller/0.log" Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.738326 4792 generic.go:334] "Generic (PLEG): container finished" podID="740550e5-d1a4-4f0c-8efd-1ccd8f9319e5" containerID="1a6a0b78db67ad47051e996cb6c2c0467bd3c4f3682498fa3ba5e6224b6319bc" exitCode=0 Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.738357 4792 generic.go:334] "Generic (PLEG): container finished" podID="740550e5-d1a4-4f0c-8efd-1ccd8f9319e5" containerID="72a93106360fd23f597cf8fb963aca72a606f82557a5b17125a969e5b5d5918f" exitCode=0 Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.738368 4792 generic.go:334] "Generic (PLEG): container finished" podID="740550e5-d1a4-4f0c-8efd-1ccd8f9319e5" containerID="4c80e0ef9426b7a764e7117a789d07cf4cf940a90f38fe3ce6b230f9bbd21bfc" exitCode=0 Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.738379 4792 generic.go:334] "Generic (PLEG): container finished" podID="740550e5-d1a4-4f0c-8efd-1ccd8f9319e5" containerID="8f72b0194cacf6d5d0c95ba804286d822d2f2e5a0f385c4c5fdf8559bf6240c6" exitCode=0 Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.738387 4792 generic.go:334] "Generic (PLEG): container finished" 
podID="740550e5-d1a4-4f0c-8efd-1ccd8f9319e5" containerID="9e0f50edd29f0791cc076a3a2974b7456aaf2a96534da791b083248dc84fa6af" exitCode=0 Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.738398 4792 generic.go:334] "Generic (PLEG): container finished" podID="740550e5-d1a4-4f0c-8efd-1ccd8f9319e5" containerID="d93911614f6785ac12751349f50ab00c0716955c72dc48866083013e172cf3c9" exitCode=0 Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.738525 4792 generic.go:334] "Generic (PLEG): container finished" podID="740550e5-d1a4-4f0c-8efd-1ccd8f9319e5" containerID="edf00db622558a346d238b2df6e90a686dc913634a1b5b4e8b010b5bf09a7290" exitCode=143 Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.738540 4792 generic.go:334] "Generic (PLEG): container finished" podID="740550e5-d1a4-4f0c-8efd-1ccd8f9319e5" containerID="5fc091b21251a54d9eca892667bd681e944b35b6407316a8252562e837a1e265" exitCode=143 Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.738566 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lfm2j" event={"ID":"740550e5-d1a4-4f0c-8efd-1ccd8f9319e5","Type":"ContainerDied","Data":"1a6a0b78db67ad47051e996cb6c2c0467bd3c4f3682498fa3ba5e6224b6319bc"} Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.738597 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lfm2j" event={"ID":"740550e5-d1a4-4f0c-8efd-1ccd8f9319e5","Type":"ContainerDied","Data":"72a93106360fd23f597cf8fb963aca72a606f82557a5b17125a969e5b5d5918f"} Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.738611 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lfm2j" event={"ID":"740550e5-d1a4-4f0c-8efd-1ccd8f9319e5","Type":"ContainerDied","Data":"4c80e0ef9426b7a764e7117a789d07cf4cf940a90f38fe3ce6b230f9bbd21bfc"} Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.738624 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-lfm2j" event={"ID":"740550e5-d1a4-4f0c-8efd-1ccd8f9319e5","Type":"ContainerDied","Data":"8f72b0194cacf6d5d0c95ba804286d822d2f2e5a0f385c4c5fdf8559bf6240c6"} Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.738634 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lfm2j" event={"ID":"740550e5-d1a4-4f0c-8efd-1ccd8f9319e5","Type":"ContainerDied","Data":"9e0f50edd29f0791cc076a3a2974b7456aaf2a96534da791b083248dc84fa6af"} Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.738645 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lfm2j" event={"ID":"740550e5-d1a4-4f0c-8efd-1ccd8f9319e5","Type":"ContainerDied","Data":"d93911614f6785ac12751349f50ab00c0716955c72dc48866083013e172cf3c9"} Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.738658 4792 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1a6a0b78db67ad47051e996cb6c2c0467bd3c4f3682498fa3ba5e6224b6319bc"} Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.738668 4792 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1244a18a5a9df2128cb16421f7d44aba05bb92b6a91b26fc2845847a9b4c91bf"} Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.738700 4792 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"72a93106360fd23f597cf8fb963aca72a606f82557a5b17125a969e5b5d5918f"} Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.738707 4792 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4c80e0ef9426b7a764e7117a789d07cf4cf940a90f38fe3ce6b230f9bbd21bfc"} Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.738713 4792 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"8f72b0194cacf6d5d0c95ba804286d822d2f2e5a0f385c4c5fdf8559bf6240c6"} Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.738720 4792 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9e0f50edd29f0791cc076a3a2974b7456aaf2a96534da791b083248dc84fa6af"} Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.738726 4792 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d93911614f6785ac12751349f50ab00c0716955c72dc48866083013e172cf3c9"} Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.738735 4792 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"edf00db622558a346d238b2df6e90a686dc913634a1b5b4e8b010b5bf09a7290"} Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.738741 4792 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5fc091b21251a54d9eca892667bd681e944b35b6407316a8252562e837a1e265"} Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.738766 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-lfm2j" Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.738768 4792 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9abf97193d88754a49bcc12e1cfbe83371778e157d028dd3d1084abfa8526822"} Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.739602 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lfm2j" event={"ID":"740550e5-d1a4-4f0c-8efd-1ccd8f9319e5","Type":"ContainerDied","Data":"edf00db622558a346d238b2df6e90a686dc913634a1b5b4e8b010b5bf09a7290"} Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.739625 4792 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1a6a0b78db67ad47051e996cb6c2c0467bd3c4f3682498fa3ba5e6224b6319bc"} Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.739636 4792 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1244a18a5a9df2128cb16421f7d44aba05bb92b6a91b26fc2845847a9b4c91bf"} Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.739643 4792 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"72a93106360fd23f597cf8fb963aca72a606f82557a5b17125a969e5b5d5918f"} Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.739650 4792 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4c80e0ef9426b7a764e7117a789d07cf4cf940a90f38fe3ce6b230f9bbd21bfc"} Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.739657 4792 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8f72b0194cacf6d5d0c95ba804286d822d2f2e5a0f385c4c5fdf8559bf6240c6"} Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.739664 4792 pod_container_deletor.go:114] "Failed to 
issue the request to remove container" containerID={"Type":"cri-o","ID":"9e0f50edd29f0791cc076a3a2974b7456aaf2a96534da791b083248dc84fa6af"} Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.739670 4792 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d93911614f6785ac12751349f50ab00c0716955c72dc48866083013e172cf3c9"} Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.739677 4792 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"edf00db622558a346d238b2df6e90a686dc913634a1b5b4e8b010b5bf09a7290"} Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.739684 4792 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5fc091b21251a54d9eca892667bd681e944b35b6407316a8252562e837a1e265"} Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.739690 4792 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9abf97193d88754a49bcc12e1cfbe83371778e157d028dd3d1084abfa8526822"} Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.739699 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lfm2j" event={"ID":"740550e5-d1a4-4f0c-8efd-1ccd8f9319e5","Type":"ContainerDied","Data":"5fc091b21251a54d9eca892667bd681e944b35b6407316a8252562e837a1e265"} Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.739709 4792 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1a6a0b78db67ad47051e996cb6c2c0467bd3c4f3682498fa3ba5e6224b6319bc"} Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.739716 4792 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1244a18a5a9df2128cb16421f7d44aba05bb92b6a91b26fc2845847a9b4c91bf"} Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 
09:19:56.739722 4792 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"72a93106360fd23f597cf8fb963aca72a606f82557a5b17125a969e5b5d5918f"} Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.739727 4792 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4c80e0ef9426b7a764e7117a789d07cf4cf940a90f38fe3ce6b230f9bbd21bfc"} Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.739733 4792 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8f72b0194cacf6d5d0c95ba804286d822d2f2e5a0f385c4c5fdf8559bf6240c6"} Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.739739 4792 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9e0f50edd29f0791cc076a3a2974b7456aaf2a96534da791b083248dc84fa6af"} Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.739745 4792 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d93911614f6785ac12751349f50ab00c0716955c72dc48866083013e172cf3c9"} Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.739751 4792 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"edf00db622558a346d238b2df6e90a686dc913634a1b5b4e8b010b5bf09a7290"} Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.739756 4792 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5fc091b21251a54d9eca892667bd681e944b35b6407316a8252562e837a1e265"} Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.739763 4792 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9abf97193d88754a49bcc12e1cfbe83371778e157d028dd3d1084abfa8526822"} Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 
09:19:56.739771 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lfm2j" event={"ID":"740550e5-d1a4-4f0c-8efd-1ccd8f9319e5","Type":"ContainerDied","Data":"a5c2dc4c4f3013e1c887f80aebacfbc5517d8d00bc6fda475aa52ec17b3d5b7b"} Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.739779 4792 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1a6a0b78db67ad47051e996cb6c2c0467bd3c4f3682498fa3ba5e6224b6319bc"} Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.739787 4792 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1244a18a5a9df2128cb16421f7d44aba05bb92b6a91b26fc2845847a9b4c91bf"} Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.739793 4792 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"72a93106360fd23f597cf8fb963aca72a606f82557a5b17125a969e5b5d5918f"} Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.739799 4792 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4c80e0ef9426b7a764e7117a789d07cf4cf940a90f38fe3ce6b230f9bbd21bfc"} Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.739806 4792 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8f72b0194cacf6d5d0c95ba804286d822d2f2e5a0f385c4c5fdf8559bf6240c6"} Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.739813 4792 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9e0f50edd29f0791cc076a3a2974b7456aaf2a96534da791b083248dc84fa6af"} Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.739819 4792 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"d93911614f6785ac12751349f50ab00c0716955c72dc48866083013e172cf3c9"} Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.739825 4792 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"edf00db622558a346d238b2df6e90a686dc913634a1b5b4e8b010b5bf09a7290"} Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.739831 4792 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5fc091b21251a54d9eca892667bd681e944b35b6407316a8252562e837a1e265"} Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.739837 4792 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9abf97193d88754a49bcc12e1cfbe83371778e157d028dd3d1084abfa8526822"} Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.784425 4792 scope.go:117] "RemoveContainer" containerID="1a6a0b78db67ad47051e996cb6c2c0467bd3c4f3682498fa3ba5e6224b6319bc" Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.791341 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-lfm2j"] Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.795623 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-lfm2j"] Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.830517 4792 scope.go:117] "RemoveContainer" containerID="1244a18a5a9df2128cb16421f7d44aba05bb92b6a91b26fc2845847a9b4c91bf" Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.847187 4792 scope.go:117] "RemoveContainer" containerID="72a93106360fd23f597cf8fb963aca72a606f82557a5b17125a969e5b5d5918f" Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.866590 4792 scope.go:117] "RemoveContainer" containerID="4c80e0ef9426b7a764e7117a789d07cf4cf940a90f38fe3ce6b230f9bbd21bfc" Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.882401 4792 scope.go:117] 
"RemoveContainer" containerID="8f72b0194cacf6d5d0c95ba804286d822d2f2e5a0f385c4c5fdf8559bf6240c6" Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.896699 4792 scope.go:117] "RemoveContainer" containerID="9e0f50edd29f0791cc076a3a2974b7456aaf2a96534da791b083248dc84fa6af" Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.912243 4792 scope.go:117] "RemoveContainer" containerID="d93911614f6785ac12751349f50ab00c0716955c72dc48866083013e172cf3c9" Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.941734 4792 scope.go:117] "RemoveContainer" containerID="edf00db622558a346d238b2df6e90a686dc913634a1b5b4e8b010b5bf09a7290" Mar 09 09:19:56 crc kubenswrapper[4792]: I0309 09:19:56.953855 4792 scope.go:117] "RemoveContainer" containerID="5fc091b21251a54d9eca892667bd681e944b35b6407316a8252562e837a1e265" Mar 09 09:19:57 crc kubenswrapper[4792]: I0309 09:19:57.009402 4792 scope.go:117] "RemoveContainer" containerID="9abf97193d88754a49bcc12e1cfbe83371778e157d028dd3d1084abfa8526822" Mar 09 09:19:57 crc kubenswrapper[4792]: I0309 09:19:57.027740 4792 scope.go:117] "RemoveContainer" containerID="1a6a0b78db67ad47051e996cb6c2c0467bd3c4f3682498fa3ba5e6224b6319bc" Mar 09 09:19:57 crc kubenswrapper[4792]: E0309 09:19:57.028453 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1a6a0b78db67ad47051e996cb6c2c0467bd3c4f3682498fa3ba5e6224b6319bc\": container with ID starting with 1a6a0b78db67ad47051e996cb6c2c0467bd3c4f3682498fa3ba5e6224b6319bc not found: ID does not exist" containerID="1a6a0b78db67ad47051e996cb6c2c0467bd3c4f3682498fa3ba5e6224b6319bc" Mar 09 09:19:57 crc kubenswrapper[4792]: I0309 09:19:57.028492 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a6a0b78db67ad47051e996cb6c2c0467bd3c4f3682498fa3ba5e6224b6319bc"} err="failed to get container status \"1a6a0b78db67ad47051e996cb6c2c0467bd3c4f3682498fa3ba5e6224b6319bc\": rpc error: code = NotFound 
desc = could not find container \"1a6a0b78db67ad47051e996cb6c2c0467bd3c4f3682498fa3ba5e6224b6319bc\": container with ID starting with 1a6a0b78db67ad47051e996cb6c2c0467bd3c4f3682498fa3ba5e6224b6319bc not found: ID does not exist" Mar 09 09:19:57 crc kubenswrapper[4792]: I0309 09:19:57.028519 4792 scope.go:117] "RemoveContainer" containerID="1244a18a5a9df2128cb16421f7d44aba05bb92b6a91b26fc2845847a9b4c91bf" Mar 09 09:19:57 crc kubenswrapper[4792]: E0309 09:19:57.028882 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1244a18a5a9df2128cb16421f7d44aba05bb92b6a91b26fc2845847a9b4c91bf\": container with ID starting with 1244a18a5a9df2128cb16421f7d44aba05bb92b6a91b26fc2845847a9b4c91bf not found: ID does not exist" containerID="1244a18a5a9df2128cb16421f7d44aba05bb92b6a91b26fc2845847a9b4c91bf" Mar 09 09:19:57 crc kubenswrapper[4792]: I0309 09:19:57.028910 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1244a18a5a9df2128cb16421f7d44aba05bb92b6a91b26fc2845847a9b4c91bf"} err="failed to get container status \"1244a18a5a9df2128cb16421f7d44aba05bb92b6a91b26fc2845847a9b4c91bf\": rpc error: code = NotFound desc = could not find container \"1244a18a5a9df2128cb16421f7d44aba05bb92b6a91b26fc2845847a9b4c91bf\": container with ID starting with 1244a18a5a9df2128cb16421f7d44aba05bb92b6a91b26fc2845847a9b4c91bf not found: ID does not exist" Mar 09 09:19:57 crc kubenswrapper[4792]: I0309 09:19:57.028927 4792 scope.go:117] "RemoveContainer" containerID="72a93106360fd23f597cf8fb963aca72a606f82557a5b17125a969e5b5d5918f" Mar 09 09:19:57 crc kubenswrapper[4792]: E0309 09:19:57.029266 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"72a93106360fd23f597cf8fb963aca72a606f82557a5b17125a969e5b5d5918f\": container with ID starting with 
72a93106360fd23f597cf8fb963aca72a606f82557a5b17125a969e5b5d5918f not found: ID does not exist" containerID="72a93106360fd23f597cf8fb963aca72a606f82557a5b17125a969e5b5d5918f" Mar 09 09:19:57 crc kubenswrapper[4792]: I0309 09:19:57.029291 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"72a93106360fd23f597cf8fb963aca72a606f82557a5b17125a969e5b5d5918f"} err="failed to get container status \"72a93106360fd23f597cf8fb963aca72a606f82557a5b17125a969e5b5d5918f\": rpc error: code = NotFound desc = could not find container \"72a93106360fd23f597cf8fb963aca72a606f82557a5b17125a969e5b5d5918f\": container with ID starting with 72a93106360fd23f597cf8fb963aca72a606f82557a5b17125a969e5b5d5918f not found: ID does not exist" Mar 09 09:19:57 crc kubenswrapper[4792]: I0309 09:19:57.029310 4792 scope.go:117] "RemoveContainer" containerID="4c80e0ef9426b7a764e7117a789d07cf4cf940a90f38fe3ce6b230f9bbd21bfc" Mar 09 09:19:57 crc kubenswrapper[4792]: E0309 09:19:57.029870 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4c80e0ef9426b7a764e7117a789d07cf4cf940a90f38fe3ce6b230f9bbd21bfc\": container with ID starting with 4c80e0ef9426b7a764e7117a789d07cf4cf940a90f38fe3ce6b230f9bbd21bfc not found: ID does not exist" containerID="4c80e0ef9426b7a764e7117a789d07cf4cf940a90f38fe3ce6b230f9bbd21bfc" Mar 09 09:19:57 crc kubenswrapper[4792]: I0309 09:19:57.029899 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c80e0ef9426b7a764e7117a789d07cf4cf940a90f38fe3ce6b230f9bbd21bfc"} err="failed to get container status \"4c80e0ef9426b7a764e7117a789d07cf4cf940a90f38fe3ce6b230f9bbd21bfc\": rpc error: code = NotFound desc = could not find container \"4c80e0ef9426b7a764e7117a789d07cf4cf940a90f38fe3ce6b230f9bbd21bfc\": container with ID starting with 4c80e0ef9426b7a764e7117a789d07cf4cf940a90f38fe3ce6b230f9bbd21bfc not found: ID does not 
exist" Mar 09 09:19:57 crc kubenswrapper[4792]: I0309 09:19:57.029916 4792 scope.go:117] "RemoveContainer" containerID="8f72b0194cacf6d5d0c95ba804286d822d2f2e5a0f385c4c5fdf8559bf6240c6" Mar 09 09:19:57 crc kubenswrapper[4792]: E0309 09:19:57.030282 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8f72b0194cacf6d5d0c95ba804286d822d2f2e5a0f385c4c5fdf8559bf6240c6\": container with ID starting with 8f72b0194cacf6d5d0c95ba804286d822d2f2e5a0f385c4c5fdf8559bf6240c6 not found: ID does not exist" containerID="8f72b0194cacf6d5d0c95ba804286d822d2f2e5a0f385c4c5fdf8559bf6240c6" Mar 09 09:19:57 crc kubenswrapper[4792]: I0309 09:19:57.030307 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f72b0194cacf6d5d0c95ba804286d822d2f2e5a0f385c4c5fdf8559bf6240c6"} err="failed to get container status \"8f72b0194cacf6d5d0c95ba804286d822d2f2e5a0f385c4c5fdf8559bf6240c6\": rpc error: code = NotFound desc = could not find container \"8f72b0194cacf6d5d0c95ba804286d822d2f2e5a0f385c4c5fdf8559bf6240c6\": container with ID starting with 8f72b0194cacf6d5d0c95ba804286d822d2f2e5a0f385c4c5fdf8559bf6240c6 not found: ID does not exist" Mar 09 09:19:57 crc kubenswrapper[4792]: I0309 09:19:57.030323 4792 scope.go:117] "RemoveContainer" containerID="9e0f50edd29f0791cc076a3a2974b7456aaf2a96534da791b083248dc84fa6af" Mar 09 09:19:57 crc kubenswrapper[4792]: E0309 09:19:57.030615 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9e0f50edd29f0791cc076a3a2974b7456aaf2a96534da791b083248dc84fa6af\": container with ID starting with 9e0f50edd29f0791cc076a3a2974b7456aaf2a96534da791b083248dc84fa6af not found: ID does not exist" containerID="9e0f50edd29f0791cc076a3a2974b7456aaf2a96534da791b083248dc84fa6af" Mar 09 09:19:57 crc kubenswrapper[4792]: I0309 09:19:57.030636 4792 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e0f50edd29f0791cc076a3a2974b7456aaf2a96534da791b083248dc84fa6af"} err="failed to get container status \"9e0f50edd29f0791cc076a3a2974b7456aaf2a96534da791b083248dc84fa6af\": rpc error: code = NotFound desc = could not find container \"9e0f50edd29f0791cc076a3a2974b7456aaf2a96534da791b083248dc84fa6af\": container with ID starting with 9e0f50edd29f0791cc076a3a2974b7456aaf2a96534da791b083248dc84fa6af not found: ID does not exist" Mar 09 09:19:57 crc kubenswrapper[4792]: I0309 09:19:57.030653 4792 scope.go:117] "RemoveContainer" containerID="d93911614f6785ac12751349f50ab00c0716955c72dc48866083013e172cf3c9" Mar 09 09:19:57 crc kubenswrapper[4792]: E0309 09:19:57.030914 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d93911614f6785ac12751349f50ab00c0716955c72dc48866083013e172cf3c9\": container with ID starting with d93911614f6785ac12751349f50ab00c0716955c72dc48866083013e172cf3c9 not found: ID does not exist" containerID="d93911614f6785ac12751349f50ab00c0716955c72dc48866083013e172cf3c9" Mar 09 09:19:57 crc kubenswrapper[4792]: I0309 09:19:57.030941 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d93911614f6785ac12751349f50ab00c0716955c72dc48866083013e172cf3c9"} err="failed to get container status \"d93911614f6785ac12751349f50ab00c0716955c72dc48866083013e172cf3c9\": rpc error: code = NotFound desc = could not find container \"d93911614f6785ac12751349f50ab00c0716955c72dc48866083013e172cf3c9\": container with ID starting with d93911614f6785ac12751349f50ab00c0716955c72dc48866083013e172cf3c9 not found: ID does not exist" Mar 09 09:19:57 crc kubenswrapper[4792]: I0309 09:19:57.030957 4792 scope.go:117] "RemoveContainer" containerID="edf00db622558a346d238b2df6e90a686dc913634a1b5b4e8b010b5bf09a7290" Mar 09 09:19:57 crc kubenswrapper[4792]: E0309 09:19:57.031306 4792 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"edf00db622558a346d238b2df6e90a686dc913634a1b5b4e8b010b5bf09a7290\": container with ID starting with edf00db622558a346d238b2df6e90a686dc913634a1b5b4e8b010b5bf09a7290 not found: ID does not exist" containerID="edf00db622558a346d238b2df6e90a686dc913634a1b5b4e8b010b5bf09a7290" Mar 09 09:19:57 crc kubenswrapper[4792]: I0309 09:19:57.031333 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"edf00db622558a346d238b2df6e90a686dc913634a1b5b4e8b010b5bf09a7290"} err="failed to get container status \"edf00db622558a346d238b2df6e90a686dc913634a1b5b4e8b010b5bf09a7290\": rpc error: code = NotFound desc = could not find container \"edf00db622558a346d238b2df6e90a686dc913634a1b5b4e8b010b5bf09a7290\": container with ID starting with edf00db622558a346d238b2df6e90a686dc913634a1b5b4e8b010b5bf09a7290 not found: ID does not exist" Mar 09 09:19:57 crc kubenswrapper[4792]: I0309 09:19:57.031349 4792 scope.go:117] "RemoveContainer" containerID="5fc091b21251a54d9eca892667bd681e944b35b6407316a8252562e837a1e265" Mar 09 09:19:57 crc kubenswrapper[4792]: E0309 09:19:57.031590 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5fc091b21251a54d9eca892667bd681e944b35b6407316a8252562e837a1e265\": container with ID starting with 5fc091b21251a54d9eca892667bd681e944b35b6407316a8252562e837a1e265 not found: ID does not exist" containerID="5fc091b21251a54d9eca892667bd681e944b35b6407316a8252562e837a1e265" Mar 09 09:19:57 crc kubenswrapper[4792]: I0309 09:19:57.031616 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5fc091b21251a54d9eca892667bd681e944b35b6407316a8252562e837a1e265"} err="failed to get container status \"5fc091b21251a54d9eca892667bd681e944b35b6407316a8252562e837a1e265\": rpc error: code = NotFound desc = could 
not find container \"5fc091b21251a54d9eca892667bd681e944b35b6407316a8252562e837a1e265\": container with ID starting with 5fc091b21251a54d9eca892667bd681e944b35b6407316a8252562e837a1e265 not found: ID does not exist" Mar 09 09:19:57 crc kubenswrapper[4792]: I0309 09:19:57.031633 4792 scope.go:117] "RemoveContainer" containerID="9abf97193d88754a49bcc12e1cfbe83371778e157d028dd3d1084abfa8526822" Mar 09 09:19:57 crc kubenswrapper[4792]: E0309 09:19:57.031883 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9abf97193d88754a49bcc12e1cfbe83371778e157d028dd3d1084abfa8526822\": container with ID starting with 9abf97193d88754a49bcc12e1cfbe83371778e157d028dd3d1084abfa8526822 not found: ID does not exist" containerID="9abf97193d88754a49bcc12e1cfbe83371778e157d028dd3d1084abfa8526822" Mar 09 09:19:57 crc kubenswrapper[4792]: I0309 09:19:57.031904 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9abf97193d88754a49bcc12e1cfbe83371778e157d028dd3d1084abfa8526822"} err="failed to get container status \"9abf97193d88754a49bcc12e1cfbe83371778e157d028dd3d1084abfa8526822\": rpc error: code = NotFound desc = could not find container \"9abf97193d88754a49bcc12e1cfbe83371778e157d028dd3d1084abfa8526822\": container with ID starting with 9abf97193d88754a49bcc12e1cfbe83371778e157d028dd3d1084abfa8526822 not found: ID does not exist" Mar 09 09:19:57 crc kubenswrapper[4792]: I0309 09:19:57.031921 4792 scope.go:117] "RemoveContainer" containerID="1a6a0b78db67ad47051e996cb6c2c0467bd3c4f3682498fa3ba5e6224b6319bc" Mar 09 09:19:57 crc kubenswrapper[4792]: I0309 09:19:57.032221 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a6a0b78db67ad47051e996cb6c2c0467bd3c4f3682498fa3ba5e6224b6319bc"} err="failed to get container status \"1a6a0b78db67ad47051e996cb6c2c0467bd3c4f3682498fa3ba5e6224b6319bc\": rpc error: code = NotFound 
desc = could not find container \"1a6a0b78db67ad47051e996cb6c2c0467bd3c4f3682498fa3ba5e6224b6319bc\": container with ID starting with 1a6a0b78db67ad47051e996cb6c2c0467bd3c4f3682498fa3ba5e6224b6319bc not found: ID does not exist" Mar 09 09:19:57 crc kubenswrapper[4792]: I0309 09:19:57.032285 4792 scope.go:117] "RemoveContainer" containerID="1244a18a5a9df2128cb16421f7d44aba05bb92b6a91b26fc2845847a9b4c91bf" Mar 09 09:19:57 crc kubenswrapper[4792]: I0309 09:19:57.032552 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1244a18a5a9df2128cb16421f7d44aba05bb92b6a91b26fc2845847a9b4c91bf"} err="failed to get container status \"1244a18a5a9df2128cb16421f7d44aba05bb92b6a91b26fc2845847a9b4c91bf\": rpc error: code = NotFound desc = could not find container \"1244a18a5a9df2128cb16421f7d44aba05bb92b6a91b26fc2845847a9b4c91bf\": container with ID starting with 1244a18a5a9df2128cb16421f7d44aba05bb92b6a91b26fc2845847a9b4c91bf not found: ID does not exist" Mar 09 09:19:57 crc kubenswrapper[4792]: I0309 09:19:57.032577 4792 scope.go:117] "RemoveContainer" containerID="72a93106360fd23f597cf8fb963aca72a606f82557a5b17125a969e5b5d5918f" Mar 09 09:19:57 crc kubenswrapper[4792]: I0309 09:19:57.033228 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"72a93106360fd23f597cf8fb963aca72a606f82557a5b17125a969e5b5d5918f"} err="failed to get container status \"72a93106360fd23f597cf8fb963aca72a606f82557a5b17125a969e5b5d5918f\": rpc error: code = NotFound desc = could not find container \"72a93106360fd23f597cf8fb963aca72a606f82557a5b17125a969e5b5d5918f\": container with ID starting with 72a93106360fd23f597cf8fb963aca72a606f82557a5b17125a969e5b5d5918f not found: ID does not exist" Mar 09 09:19:57 crc kubenswrapper[4792]: I0309 09:19:57.033278 4792 scope.go:117] "RemoveContainer" containerID="4c80e0ef9426b7a764e7117a789d07cf4cf940a90f38fe3ce6b230f9bbd21bfc" Mar 09 09:19:57 crc kubenswrapper[4792]: I0309 
09:19:57.033851 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c80e0ef9426b7a764e7117a789d07cf4cf940a90f38fe3ce6b230f9bbd21bfc"} err="failed to get container status \"4c80e0ef9426b7a764e7117a789d07cf4cf940a90f38fe3ce6b230f9bbd21bfc\": rpc error: code = NotFound desc = could not find container \"4c80e0ef9426b7a764e7117a789d07cf4cf940a90f38fe3ce6b230f9bbd21bfc\": container with ID starting with 4c80e0ef9426b7a764e7117a789d07cf4cf940a90f38fe3ce6b230f9bbd21bfc not found: ID does not exist" Mar 09 09:19:57 crc kubenswrapper[4792]: I0309 09:19:57.033878 4792 scope.go:117] "RemoveContainer" containerID="8f72b0194cacf6d5d0c95ba804286d822d2f2e5a0f385c4c5fdf8559bf6240c6" Mar 09 09:19:57 crc kubenswrapper[4792]: I0309 09:19:57.034301 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f72b0194cacf6d5d0c95ba804286d822d2f2e5a0f385c4c5fdf8559bf6240c6"} err="failed to get container status \"8f72b0194cacf6d5d0c95ba804286d822d2f2e5a0f385c4c5fdf8559bf6240c6\": rpc error: code = NotFound desc = could not find container \"8f72b0194cacf6d5d0c95ba804286d822d2f2e5a0f385c4c5fdf8559bf6240c6\": container with ID starting with 8f72b0194cacf6d5d0c95ba804286d822d2f2e5a0f385c4c5fdf8559bf6240c6 not found: ID does not exist" Mar 09 09:19:57 crc kubenswrapper[4792]: I0309 09:19:57.034329 4792 scope.go:117] "RemoveContainer" containerID="9e0f50edd29f0791cc076a3a2974b7456aaf2a96534da791b083248dc84fa6af" Mar 09 09:19:57 crc kubenswrapper[4792]: I0309 09:19:57.034762 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e0f50edd29f0791cc076a3a2974b7456aaf2a96534da791b083248dc84fa6af"} err="failed to get container status \"9e0f50edd29f0791cc076a3a2974b7456aaf2a96534da791b083248dc84fa6af\": rpc error: code = NotFound desc = could not find container \"9e0f50edd29f0791cc076a3a2974b7456aaf2a96534da791b083248dc84fa6af\": container with ID starting with 
9e0f50edd29f0791cc076a3a2974b7456aaf2a96534da791b083248dc84fa6af not found: ID does not exist" Mar 09 09:19:57 crc kubenswrapper[4792]: I0309 09:19:57.034788 4792 scope.go:117] "RemoveContainer" containerID="d93911614f6785ac12751349f50ab00c0716955c72dc48866083013e172cf3c9" Mar 09 09:19:57 crc kubenswrapper[4792]: I0309 09:19:57.035122 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d93911614f6785ac12751349f50ab00c0716955c72dc48866083013e172cf3c9"} err="failed to get container status \"d93911614f6785ac12751349f50ab00c0716955c72dc48866083013e172cf3c9\": rpc error: code = NotFound desc = could not find container \"d93911614f6785ac12751349f50ab00c0716955c72dc48866083013e172cf3c9\": container with ID starting with d93911614f6785ac12751349f50ab00c0716955c72dc48866083013e172cf3c9 not found: ID does not exist" Mar 09 09:19:57 crc kubenswrapper[4792]: I0309 09:19:57.035145 4792 scope.go:117] "RemoveContainer" containerID="edf00db622558a346d238b2df6e90a686dc913634a1b5b4e8b010b5bf09a7290" Mar 09 09:19:57 crc kubenswrapper[4792]: I0309 09:19:57.035599 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"edf00db622558a346d238b2df6e90a686dc913634a1b5b4e8b010b5bf09a7290"} err="failed to get container status \"edf00db622558a346d238b2df6e90a686dc913634a1b5b4e8b010b5bf09a7290\": rpc error: code = NotFound desc = could not find container \"edf00db622558a346d238b2df6e90a686dc913634a1b5b4e8b010b5bf09a7290\": container with ID starting with edf00db622558a346d238b2df6e90a686dc913634a1b5b4e8b010b5bf09a7290 not found: ID does not exist" Mar 09 09:19:57 crc kubenswrapper[4792]: I0309 09:19:57.035624 4792 scope.go:117] "RemoveContainer" containerID="5fc091b21251a54d9eca892667bd681e944b35b6407316a8252562e837a1e265" Mar 09 09:19:57 crc kubenswrapper[4792]: I0309 09:19:57.035995 4792 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"5fc091b21251a54d9eca892667bd681e944b35b6407316a8252562e837a1e265"} err="failed to get container status \"5fc091b21251a54d9eca892667bd681e944b35b6407316a8252562e837a1e265\": rpc error: code = NotFound desc = could not find container \"5fc091b21251a54d9eca892667bd681e944b35b6407316a8252562e837a1e265\": container with ID starting with 5fc091b21251a54d9eca892667bd681e944b35b6407316a8252562e837a1e265 not found: ID does not exist" Mar 09 09:19:57 crc kubenswrapper[4792]: I0309 09:19:57.036021 4792 scope.go:117] "RemoveContainer" containerID="9abf97193d88754a49bcc12e1cfbe83371778e157d028dd3d1084abfa8526822" Mar 09 09:19:57 crc kubenswrapper[4792]: I0309 09:19:57.036380 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9abf97193d88754a49bcc12e1cfbe83371778e157d028dd3d1084abfa8526822"} err="failed to get container status \"9abf97193d88754a49bcc12e1cfbe83371778e157d028dd3d1084abfa8526822\": rpc error: code = NotFound desc = could not find container \"9abf97193d88754a49bcc12e1cfbe83371778e157d028dd3d1084abfa8526822\": container with ID starting with 9abf97193d88754a49bcc12e1cfbe83371778e157d028dd3d1084abfa8526822 not found: ID does not exist" Mar 09 09:19:57 crc kubenswrapper[4792]: I0309 09:19:57.036406 4792 scope.go:117] "RemoveContainer" containerID="1a6a0b78db67ad47051e996cb6c2c0467bd3c4f3682498fa3ba5e6224b6319bc" Mar 09 09:19:57 crc kubenswrapper[4792]: I0309 09:19:57.036741 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a6a0b78db67ad47051e996cb6c2c0467bd3c4f3682498fa3ba5e6224b6319bc"} err="failed to get container status \"1a6a0b78db67ad47051e996cb6c2c0467bd3c4f3682498fa3ba5e6224b6319bc\": rpc error: code = NotFound desc = could not find container \"1a6a0b78db67ad47051e996cb6c2c0467bd3c4f3682498fa3ba5e6224b6319bc\": container with ID starting with 1a6a0b78db67ad47051e996cb6c2c0467bd3c4f3682498fa3ba5e6224b6319bc not found: ID does not 
exist" Mar 09 09:19:57 crc kubenswrapper[4792]: I0309 09:19:57.036768 4792 scope.go:117] "RemoveContainer" containerID="1244a18a5a9df2128cb16421f7d44aba05bb92b6a91b26fc2845847a9b4c91bf" Mar 09 09:19:57 crc kubenswrapper[4792]: I0309 09:19:57.037205 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1244a18a5a9df2128cb16421f7d44aba05bb92b6a91b26fc2845847a9b4c91bf"} err="failed to get container status \"1244a18a5a9df2128cb16421f7d44aba05bb92b6a91b26fc2845847a9b4c91bf\": rpc error: code = NotFound desc = could not find container \"1244a18a5a9df2128cb16421f7d44aba05bb92b6a91b26fc2845847a9b4c91bf\": container with ID starting with 1244a18a5a9df2128cb16421f7d44aba05bb92b6a91b26fc2845847a9b4c91bf not found: ID does not exist" Mar 09 09:19:57 crc kubenswrapper[4792]: I0309 09:19:57.037230 4792 scope.go:117] "RemoveContainer" containerID="72a93106360fd23f597cf8fb963aca72a606f82557a5b17125a969e5b5d5918f" Mar 09 09:19:57 crc kubenswrapper[4792]: I0309 09:19:57.037548 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"72a93106360fd23f597cf8fb963aca72a606f82557a5b17125a969e5b5d5918f"} err="failed to get container status \"72a93106360fd23f597cf8fb963aca72a606f82557a5b17125a969e5b5d5918f\": rpc error: code = NotFound desc = could not find container \"72a93106360fd23f597cf8fb963aca72a606f82557a5b17125a969e5b5d5918f\": container with ID starting with 72a93106360fd23f597cf8fb963aca72a606f82557a5b17125a969e5b5d5918f not found: ID does not exist" Mar 09 09:19:57 crc kubenswrapper[4792]: I0309 09:19:57.037571 4792 scope.go:117] "RemoveContainer" containerID="4c80e0ef9426b7a764e7117a789d07cf4cf940a90f38fe3ce6b230f9bbd21bfc" Mar 09 09:19:57 crc kubenswrapper[4792]: I0309 09:19:57.037985 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c80e0ef9426b7a764e7117a789d07cf4cf940a90f38fe3ce6b230f9bbd21bfc"} err="failed to get container status 
\"4c80e0ef9426b7a764e7117a789d07cf4cf940a90f38fe3ce6b230f9bbd21bfc\": rpc error: code = NotFound desc = could not find container \"4c80e0ef9426b7a764e7117a789d07cf4cf940a90f38fe3ce6b230f9bbd21bfc\": container with ID starting with 4c80e0ef9426b7a764e7117a789d07cf4cf940a90f38fe3ce6b230f9bbd21bfc not found: ID does not exist" Mar 09 09:19:57 crc kubenswrapper[4792]: I0309 09:19:57.038010 4792 scope.go:117] "RemoveContainer" containerID="8f72b0194cacf6d5d0c95ba804286d822d2f2e5a0f385c4c5fdf8559bf6240c6" Mar 09 09:19:57 crc kubenswrapper[4792]: I0309 09:19:57.038341 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f72b0194cacf6d5d0c95ba804286d822d2f2e5a0f385c4c5fdf8559bf6240c6"} err="failed to get container status \"8f72b0194cacf6d5d0c95ba804286d822d2f2e5a0f385c4c5fdf8559bf6240c6\": rpc error: code = NotFound desc = could not find container \"8f72b0194cacf6d5d0c95ba804286d822d2f2e5a0f385c4c5fdf8559bf6240c6\": container with ID starting with 8f72b0194cacf6d5d0c95ba804286d822d2f2e5a0f385c4c5fdf8559bf6240c6 not found: ID does not exist" Mar 09 09:19:57 crc kubenswrapper[4792]: I0309 09:19:57.038366 4792 scope.go:117] "RemoveContainer" containerID="9e0f50edd29f0791cc076a3a2974b7456aaf2a96534da791b083248dc84fa6af" Mar 09 09:19:57 crc kubenswrapper[4792]: I0309 09:19:57.038680 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e0f50edd29f0791cc076a3a2974b7456aaf2a96534da791b083248dc84fa6af"} err="failed to get container status \"9e0f50edd29f0791cc076a3a2974b7456aaf2a96534da791b083248dc84fa6af\": rpc error: code = NotFound desc = could not find container \"9e0f50edd29f0791cc076a3a2974b7456aaf2a96534da791b083248dc84fa6af\": container with ID starting with 9e0f50edd29f0791cc076a3a2974b7456aaf2a96534da791b083248dc84fa6af not found: ID does not exist" Mar 09 09:19:57 crc kubenswrapper[4792]: I0309 09:19:57.038704 4792 scope.go:117] "RemoveContainer" 
containerID="d93911614f6785ac12751349f50ab00c0716955c72dc48866083013e172cf3c9" Mar 09 09:19:57 crc kubenswrapper[4792]: I0309 09:19:57.039125 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d93911614f6785ac12751349f50ab00c0716955c72dc48866083013e172cf3c9"} err="failed to get container status \"d93911614f6785ac12751349f50ab00c0716955c72dc48866083013e172cf3c9\": rpc error: code = NotFound desc = could not find container \"d93911614f6785ac12751349f50ab00c0716955c72dc48866083013e172cf3c9\": container with ID starting with d93911614f6785ac12751349f50ab00c0716955c72dc48866083013e172cf3c9 not found: ID does not exist" Mar 09 09:19:57 crc kubenswrapper[4792]: I0309 09:19:57.039150 4792 scope.go:117] "RemoveContainer" containerID="edf00db622558a346d238b2df6e90a686dc913634a1b5b4e8b010b5bf09a7290" Mar 09 09:19:57 crc kubenswrapper[4792]: I0309 09:19:57.039492 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"edf00db622558a346d238b2df6e90a686dc913634a1b5b4e8b010b5bf09a7290"} err="failed to get container status \"edf00db622558a346d238b2df6e90a686dc913634a1b5b4e8b010b5bf09a7290\": rpc error: code = NotFound desc = could not find container \"edf00db622558a346d238b2df6e90a686dc913634a1b5b4e8b010b5bf09a7290\": container with ID starting with edf00db622558a346d238b2df6e90a686dc913634a1b5b4e8b010b5bf09a7290 not found: ID does not exist" Mar 09 09:19:57 crc kubenswrapper[4792]: I0309 09:19:57.039514 4792 scope.go:117] "RemoveContainer" containerID="5fc091b21251a54d9eca892667bd681e944b35b6407316a8252562e837a1e265" Mar 09 09:19:57 crc kubenswrapper[4792]: I0309 09:19:57.039827 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5fc091b21251a54d9eca892667bd681e944b35b6407316a8252562e837a1e265"} err="failed to get container status \"5fc091b21251a54d9eca892667bd681e944b35b6407316a8252562e837a1e265\": rpc error: code = NotFound desc = could 
not find container \"5fc091b21251a54d9eca892667bd681e944b35b6407316a8252562e837a1e265\": container with ID starting with 5fc091b21251a54d9eca892667bd681e944b35b6407316a8252562e837a1e265 not found: ID does not exist" Mar 09 09:19:57 crc kubenswrapper[4792]: I0309 09:19:57.039848 4792 scope.go:117] "RemoveContainer" containerID="9abf97193d88754a49bcc12e1cfbe83371778e157d028dd3d1084abfa8526822" Mar 09 09:19:57 crc kubenswrapper[4792]: I0309 09:19:57.040234 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9abf97193d88754a49bcc12e1cfbe83371778e157d028dd3d1084abfa8526822"} err="failed to get container status \"9abf97193d88754a49bcc12e1cfbe83371778e157d028dd3d1084abfa8526822\": rpc error: code = NotFound desc = could not find container \"9abf97193d88754a49bcc12e1cfbe83371778e157d028dd3d1084abfa8526822\": container with ID starting with 9abf97193d88754a49bcc12e1cfbe83371778e157d028dd3d1084abfa8526822 not found: ID does not exist" Mar 09 09:19:57 crc kubenswrapper[4792]: I0309 09:19:57.040258 4792 scope.go:117] "RemoveContainer" containerID="1a6a0b78db67ad47051e996cb6c2c0467bd3c4f3682498fa3ba5e6224b6319bc" Mar 09 09:19:57 crc kubenswrapper[4792]: I0309 09:19:57.040571 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a6a0b78db67ad47051e996cb6c2c0467bd3c4f3682498fa3ba5e6224b6319bc"} err="failed to get container status \"1a6a0b78db67ad47051e996cb6c2c0467bd3c4f3682498fa3ba5e6224b6319bc\": rpc error: code = NotFound desc = could not find container \"1a6a0b78db67ad47051e996cb6c2c0467bd3c4f3682498fa3ba5e6224b6319bc\": container with ID starting with 1a6a0b78db67ad47051e996cb6c2c0467bd3c4f3682498fa3ba5e6224b6319bc not found: ID does not exist" Mar 09 09:19:57 crc kubenswrapper[4792]: I0309 09:19:57.040592 4792 scope.go:117] "RemoveContainer" containerID="1244a18a5a9df2128cb16421f7d44aba05bb92b6a91b26fc2845847a9b4c91bf" Mar 09 09:19:57 crc kubenswrapper[4792]: I0309 
09:19:57.040918 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1244a18a5a9df2128cb16421f7d44aba05bb92b6a91b26fc2845847a9b4c91bf"} err="failed to get container status \"1244a18a5a9df2128cb16421f7d44aba05bb92b6a91b26fc2845847a9b4c91bf\": rpc error: code = NotFound desc = could not find container \"1244a18a5a9df2128cb16421f7d44aba05bb92b6a91b26fc2845847a9b4c91bf\": container with ID starting with 1244a18a5a9df2128cb16421f7d44aba05bb92b6a91b26fc2845847a9b4c91bf not found: ID does not exist" Mar 09 09:19:57 crc kubenswrapper[4792]: I0309 09:19:57.040939 4792 scope.go:117] "RemoveContainer" containerID="72a93106360fd23f597cf8fb963aca72a606f82557a5b17125a969e5b5d5918f" Mar 09 09:19:57 crc kubenswrapper[4792]: I0309 09:19:57.041263 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"72a93106360fd23f597cf8fb963aca72a606f82557a5b17125a969e5b5d5918f"} err="failed to get container status \"72a93106360fd23f597cf8fb963aca72a606f82557a5b17125a969e5b5d5918f\": rpc error: code = NotFound desc = could not find container \"72a93106360fd23f597cf8fb963aca72a606f82557a5b17125a969e5b5d5918f\": container with ID starting with 72a93106360fd23f597cf8fb963aca72a606f82557a5b17125a969e5b5d5918f not found: ID does not exist" Mar 09 09:19:57 crc kubenswrapper[4792]: I0309 09:19:57.041286 4792 scope.go:117] "RemoveContainer" containerID="4c80e0ef9426b7a764e7117a789d07cf4cf940a90f38fe3ce6b230f9bbd21bfc" Mar 09 09:19:57 crc kubenswrapper[4792]: I0309 09:19:57.041640 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c80e0ef9426b7a764e7117a789d07cf4cf940a90f38fe3ce6b230f9bbd21bfc"} err="failed to get container status \"4c80e0ef9426b7a764e7117a789d07cf4cf940a90f38fe3ce6b230f9bbd21bfc\": rpc error: code = NotFound desc = could not find container \"4c80e0ef9426b7a764e7117a789d07cf4cf940a90f38fe3ce6b230f9bbd21bfc\": container with ID starting with 
4c80e0ef9426b7a764e7117a789d07cf4cf940a90f38fe3ce6b230f9bbd21bfc not found: ID does not exist" Mar 09 09:19:57 crc kubenswrapper[4792]: I0309 09:19:57.041664 4792 scope.go:117] "RemoveContainer" containerID="8f72b0194cacf6d5d0c95ba804286d822d2f2e5a0f385c4c5fdf8559bf6240c6" Mar 09 09:19:57 crc kubenswrapper[4792]: I0309 09:19:57.041992 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f72b0194cacf6d5d0c95ba804286d822d2f2e5a0f385c4c5fdf8559bf6240c6"} err="failed to get container status \"8f72b0194cacf6d5d0c95ba804286d822d2f2e5a0f385c4c5fdf8559bf6240c6\": rpc error: code = NotFound desc = could not find container \"8f72b0194cacf6d5d0c95ba804286d822d2f2e5a0f385c4c5fdf8559bf6240c6\": container with ID starting with 8f72b0194cacf6d5d0c95ba804286d822d2f2e5a0f385c4c5fdf8559bf6240c6 not found: ID does not exist" Mar 09 09:19:57 crc kubenswrapper[4792]: I0309 09:19:57.042015 4792 scope.go:117] "RemoveContainer" containerID="9e0f50edd29f0791cc076a3a2974b7456aaf2a96534da791b083248dc84fa6af" Mar 09 09:19:57 crc kubenswrapper[4792]: I0309 09:19:57.042870 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e0f50edd29f0791cc076a3a2974b7456aaf2a96534da791b083248dc84fa6af"} err="failed to get container status \"9e0f50edd29f0791cc076a3a2974b7456aaf2a96534da791b083248dc84fa6af\": rpc error: code = NotFound desc = could not find container \"9e0f50edd29f0791cc076a3a2974b7456aaf2a96534da791b083248dc84fa6af\": container with ID starting with 9e0f50edd29f0791cc076a3a2974b7456aaf2a96534da791b083248dc84fa6af not found: ID does not exist" Mar 09 09:19:57 crc kubenswrapper[4792]: I0309 09:19:57.042897 4792 scope.go:117] "RemoveContainer" containerID="d93911614f6785ac12751349f50ab00c0716955c72dc48866083013e172cf3c9" Mar 09 09:19:57 crc kubenswrapper[4792]: I0309 09:19:57.043183 4792 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"d93911614f6785ac12751349f50ab00c0716955c72dc48866083013e172cf3c9"} err="failed to get container status \"d93911614f6785ac12751349f50ab00c0716955c72dc48866083013e172cf3c9\": rpc error: code = NotFound desc = could not find container \"d93911614f6785ac12751349f50ab00c0716955c72dc48866083013e172cf3c9\": container with ID starting with d93911614f6785ac12751349f50ab00c0716955c72dc48866083013e172cf3c9 not found: ID does not exist" Mar 09 09:19:57 crc kubenswrapper[4792]: I0309 09:19:57.043208 4792 scope.go:117] "RemoveContainer" containerID="edf00db622558a346d238b2df6e90a686dc913634a1b5b4e8b010b5bf09a7290" Mar 09 09:19:57 crc kubenswrapper[4792]: I0309 09:19:57.043467 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"edf00db622558a346d238b2df6e90a686dc913634a1b5b4e8b010b5bf09a7290"} err="failed to get container status \"edf00db622558a346d238b2df6e90a686dc913634a1b5b4e8b010b5bf09a7290\": rpc error: code = NotFound desc = could not find container \"edf00db622558a346d238b2df6e90a686dc913634a1b5b4e8b010b5bf09a7290\": container with ID starting with edf00db622558a346d238b2df6e90a686dc913634a1b5b4e8b010b5bf09a7290 not found: ID does not exist" Mar 09 09:19:57 crc kubenswrapper[4792]: I0309 09:19:57.043495 4792 scope.go:117] "RemoveContainer" containerID="5fc091b21251a54d9eca892667bd681e944b35b6407316a8252562e837a1e265" Mar 09 09:19:57 crc kubenswrapper[4792]: I0309 09:19:57.043789 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5fc091b21251a54d9eca892667bd681e944b35b6407316a8252562e837a1e265"} err="failed to get container status \"5fc091b21251a54d9eca892667bd681e944b35b6407316a8252562e837a1e265\": rpc error: code = NotFound desc = could not find container \"5fc091b21251a54d9eca892667bd681e944b35b6407316a8252562e837a1e265\": container with ID starting with 5fc091b21251a54d9eca892667bd681e944b35b6407316a8252562e837a1e265 not found: ID does not 
exist" Mar 09 09:19:57 crc kubenswrapper[4792]: I0309 09:19:57.043813 4792 scope.go:117] "RemoveContainer" containerID="9abf97193d88754a49bcc12e1cfbe83371778e157d028dd3d1084abfa8526822" Mar 09 09:19:57 crc kubenswrapper[4792]: I0309 09:19:57.044058 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9abf97193d88754a49bcc12e1cfbe83371778e157d028dd3d1084abfa8526822"} err="failed to get container status \"9abf97193d88754a49bcc12e1cfbe83371778e157d028dd3d1084abfa8526822\": rpc error: code = NotFound desc = could not find container \"9abf97193d88754a49bcc12e1cfbe83371778e157d028dd3d1084abfa8526822\": container with ID starting with 9abf97193d88754a49bcc12e1cfbe83371778e157d028dd3d1084abfa8526822 not found: ID does not exist" Mar 09 09:19:57 crc kubenswrapper[4792]: I0309 09:19:57.668325 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="740550e5-d1a4-4f0c-8efd-1ccd8f9319e5" path="/var/lib/kubelet/pods/740550e5-d1a4-4f0c-8efd-1ccd8f9319e5/volumes" Mar 09 09:19:57 crc kubenswrapper[4792]: I0309 09:19:57.745710 4792 generic.go:334] "Generic (PLEG): container finished" podID="49fb6cf3-9480-488f-90c5-07970635e9e1" containerID="0466776d05f8298166134ff778cf7550a6a5085b0d621614183cad717087499c" exitCode=0 Mar 09 09:19:57 crc kubenswrapper[4792]: I0309 09:19:57.745799 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mbdpz" event={"ID":"49fb6cf3-9480-488f-90c5-07970635e9e1","Type":"ContainerDied","Data":"0466776d05f8298166134ff778cf7550a6a5085b0d621614183cad717087499c"} Mar 09 09:19:57 crc kubenswrapper[4792]: I0309 09:19:57.745861 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mbdpz" event={"ID":"49fb6cf3-9480-488f-90c5-07970635e9e1","Type":"ContainerStarted","Data":"2822d3ab3613ebf1cff9823fa40f4a1ff2a56dfafa2a5d08e08852440a8e2aab"} Mar 09 09:19:57 crc kubenswrapper[4792]: I0309 09:19:57.745875 4792 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mbdpz" event={"ID":"49fb6cf3-9480-488f-90c5-07970635e9e1","Type":"ContainerStarted","Data":"f8ba59141a1d8e760855ef03209a69fd458218529d8c1df1e4c1f3f0c78df7f3"} Mar 09 09:19:57 crc kubenswrapper[4792]: I0309 09:19:57.745887 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mbdpz" event={"ID":"49fb6cf3-9480-488f-90c5-07970635e9e1","Type":"ContainerStarted","Data":"7e3685251a5de4f762aef6ada2f730dc9c3e47f3f0e5af9203caaae2cacf2a4c"} Mar 09 09:19:57 crc kubenswrapper[4792]: I0309 09:19:57.745898 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mbdpz" event={"ID":"49fb6cf3-9480-488f-90c5-07970635e9e1","Type":"ContainerStarted","Data":"1cac964f16e2cfc5d4e054614a68e9ec68f5fb4c83cf571618575d88db705d22"} Mar 09 09:19:57 crc kubenswrapper[4792]: I0309 09:19:57.745909 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mbdpz" event={"ID":"49fb6cf3-9480-488f-90c5-07970635e9e1","Type":"ContainerStarted","Data":"f754691125ded75975019b6fa7f41ea406d551b72dd214162be15720f9060941"} Mar 09 09:19:57 crc kubenswrapper[4792]: I0309 09:19:57.745923 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mbdpz" event={"ID":"49fb6cf3-9480-488f-90c5-07970635e9e1","Type":"ContainerStarted","Data":"409d23419dc587f92ed65a4f6a64057e218bd0853fd9fa177d69441561207609"} Mar 09 09:19:57 crc kubenswrapper[4792]: I0309 09:19:57.748865 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-vgtc9_626ea896-2e5c-4478-a7be-34a19acc242d/kube-multus/1.log" Mar 09 09:19:57 crc kubenswrapper[4792]: I0309 09:19:57.748920 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-vgtc9" 
event={"ID":"626ea896-2e5c-4478-a7be-34a19acc242d","Type":"ContainerStarted","Data":"275f54d45cdd96a5324208676921996e2ef2803c9fff804c96ddf7f195f92bac"} Mar 09 09:19:59 crc kubenswrapper[4792]: I0309 09:19:59.764959 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mbdpz" event={"ID":"49fb6cf3-9480-488f-90c5-07970635e9e1","Type":"ContainerStarted","Data":"1f8e3b36fce988993787b6f5b1cb2cff479d9aa0a6dce3c29ffa27ea7588755f"} Mar 09 09:20:00 crc kubenswrapper[4792]: I0309 09:20:00.155748 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550800-99fr8"] Mar 09 09:20:00 crc kubenswrapper[4792]: I0309 09:20:00.156993 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550800-99fr8" Mar 09 09:20:00 crc kubenswrapper[4792]: I0309 09:20:00.158664 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fwclj" Mar 09 09:20:00 crc kubenswrapper[4792]: I0309 09:20:00.159227 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 09:20:00 crc kubenswrapper[4792]: I0309 09:20:00.159314 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 09:20:00 crc kubenswrapper[4792]: I0309 09:20:00.274485 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dbv7q\" (UniqueName: \"kubernetes.io/projected/1058fa4d-1b24-4f5a-b99b-0f4b2b6f041d-kube-api-access-dbv7q\") pod \"auto-csr-approver-29550800-99fr8\" (UID: \"1058fa4d-1b24-4f5a-b99b-0f4b2b6f041d\") " pod="openshift-infra/auto-csr-approver-29550800-99fr8" Mar 09 09:20:00 crc kubenswrapper[4792]: I0309 09:20:00.376229 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dbv7q\" (UniqueName: 
\"kubernetes.io/projected/1058fa4d-1b24-4f5a-b99b-0f4b2b6f041d-kube-api-access-dbv7q\") pod \"auto-csr-approver-29550800-99fr8\" (UID: \"1058fa4d-1b24-4f5a-b99b-0f4b2b6f041d\") " pod="openshift-infra/auto-csr-approver-29550800-99fr8" Mar 09 09:20:00 crc kubenswrapper[4792]: I0309 09:20:00.396329 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dbv7q\" (UniqueName: \"kubernetes.io/projected/1058fa4d-1b24-4f5a-b99b-0f4b2b6f041d-kube-api-access-dbv7q\") pod \"auto-csr-approver-29550800-99fr8\" (UID: \"1058fa4d-1b24-4f5a-b99b-0f4b2b6f041d\") " pod="openshift-infra/auto-csr-approver-29550800-99fr8" Mar 09 09:20:00 crc kubenswrapper[4792]: I0309 09:20:00.475812 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550800-99fr8" Mar 09 09:20:00 crc kubenswrapper[4792]: E0309 09:20:00.498901 4792 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29550800-99fr8_openshift-infra_1058fa4d-1b24-4f5a-b99b-0f4b2b6f041d_0(8b637649b852633c708d60ec015511a4a7f2b59c0d3272bc909b4328bc994ad5): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 09 09:20:00 crc kubenswrapper[4792]: E0309 09:20:00.499123 4792 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29550800-99fr8_openshift-infra_1058fa4d-1b24-4f5a-b99b-0f4b2b6f041d_0(8b637649b852633c708d60ec015511a4a7f2b59c0d3272bc909b4328bc994ad5): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-infra/auto-csr-approver-29550800-99fr8" Mar 09 09:20:00 crc kubenswrapper[4792]: E0309 09:20:00.499223 4792 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29550800-99fr8_openshift-infra_1058fa4d-1b24-4f5a-b99b-0f4b2b6f041d_0(8b637649b852633c708d60ec015511a4a7f2b59c0d3272bc909b4328bc994ad5): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-infra/auto-csr-approver-29550800-99fr8" Mar 09 09:20:00 crc kubenswrapper[4792]: E0309 09:20:00.499353 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"auto-csr-approver-29550800-99fr8_openshift-infra(1058fa4d-1b24-4f5a-b99b-0f4b2b6f041d)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"auto-csr-approver-29550800-99fr8_openshift-infra(1058fa4d-1b24-4f5a-b99b-0f4b2b6f041d)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29550800-99fr8_openshift-infra_1058fa4d-1b24-4f5a-b99b-0f4b2b6f041d_0(8b637649b852633c708d60ec015511a4a7f2b59c0d3272bc909b4328bc994ad5): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-infra/auto-csr-approver-29550800-99fr8" podUID="1058fa4d-1b24-4f5a-b99b-0f4b2b6f041d" Mar 09 09:20:02 crc kubenswrapper[4792]: I0309 09:20:02.550348 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550800-99fr8"] Mar 09 09:20:02 crc kubenswrapper[4792]: I0309 09:20:02.551902 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550800-99fr8" Mar 09 09:20:02 crc kubenswrapper[4792]: I0309 09:20:02.552960 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550800-99fr8" Mar 09 09:20:02 crc kubenswrapper[4792]: E0309 09:20:02.580370 4792 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29550800-99fr8_openshift-infra_1058fa4d-1b24-4f5a-b99b-0f4b2b6f041d_0(606112bd34a28eaefeac39a6620f2f3240ac7f93cfd5d2cc446d378a340c242a): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 09 09:20:02 crc kubenswrapper[4792]: E0309 09:20:02.580443 4792 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29550800-99fr8_openshift-infra_1058fa4d-1b24-4f5a-b99b-0f4b2b6f041d_0(606112bd34a28eaefeac39a6620f2f3240ac7f93cfd5d2cc446d378a340c242a): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-infra/auto-csr-approver-29550800-99fr8" Mar 09 09:20:02 crc kubenswrapper[4792]: E0309 09:20:02.580472 4792 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29550800-99fr8_openshift-infra_1058fa4d-1b24-4f5a-b99b-0f4b2b6f041d_0(606112bd34a28eaefeac39a6620f2f3240ac7f93cfd5d2cc446d378a340c242a): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-infra/auto-csr-approver-29550800-99fr8" Mar 09 09:20:02 crc kubenswrapper[4792]: E0309 09:20:02.580543 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"auto-csr-approver-29550800-99fr8_openshift-infra(1058fa4d-1b24-4f5a-b99b-0f4b2b6f041d)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"auto-csr-approver-29550800-99fr8_openshift-infra(1058fa4d-1b24-4f5a-b99b-0f4b2b6f041d)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29550800-99fr8_openshift-infra_1058fa4d-1b24-4f5a-b99b-0f4b2b6f041d_0(606112bd34a28eaefeac39a6620f2f3240ac7f93cfd5d2cc446d378a340c242a): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-infra/auto-csr-approver-29550800-99fr8" podUID="1058fa4d-1b24-4f5a-b99b-0f4b2b6f041d" Mar 09 09:20:02 crc kubenswrapper[4792]: I0309 09:20:02.784745 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mbdpz" event={"ID":"49fb6cf3-9480-488f-90c5-07970635e9e1","Type":"ContainerStarted","Data":"2f6c60f7fe9347a0752d8f199abf9880bbfa90ae3a29b9081bd4b21de86d465b"} Mar 09 09:20:02 crc kubenswrapper[4792]: I0309 09:20:02.785203 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-mbdpz" Mar 09 09:20:02 crc kubenswrapper[4792]: I0309 09:20:02.785352 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-mbdpz" Mar 09 09:20:02 crc kubenswrapper[4792]: I0309 09:20:02.817308 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-mbdpz" Mar 09 09:20:02 crc kubenswrapper[4792]: I0309 09:20:02.827879 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-mbdpz" podStartSLOduration=6.827857305 
podStartE2EDuration="6.827857305s" podCreationTimestamp="2026-03-09 09:19:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:20:02.825567136 +0000 UTC m=+767.855767898" watchObservedRunningTime="2026-03-09 09:20:02.827857305 +0000 UTC m=+767.858058057" Mar 09 09:20:03 crc kubenswrapper[4792]: I0309 09:20:03.803011 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-mbdpz" Mar 09 09:20:03 crc kubenswrapper[4792]: I0309 09:20:03.897866 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-mbdpz" Mar 09 09:20:13 crc kubenswrapper[4792]: I0309 09:20:13.214785 4792 patch_prober.go:28] interesting pod/machine-config-daemon-97tth container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 09:20:13 crc kubenswrapper[4792]: I0309 09:20:13.215194 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 09:20:13 crc kubenswrapper[4792]: I0309 09:20:13.662000 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550800-99fr8" Mar 09 09:20:13 crc kubenswrapper[4792]: I0309 09:20:13.662706 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550800-99fr8" Mar 09 09:20:13 crc kubenswrapper[4792]: I0309 09:20:13.845427 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550800-99fr8"] Mar 09 09:20:13 crc kubenswrapper[4792]: W0309 09:20:13.855755 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1058fa4d_1b24_4f5a_b99b_0f4b2b6f041d.slice/crio-507a6590ca61fa22786afbca51749b6e78ab515619e0cc327ede7336863f0c90 WatchSource:0}: Error finding container 507a6590ca61fa22786afbca51749b6e78ab515619e0cc327ede7336863f0c90: Status 404 returned error can't find the container with id 507a6590ca61fa22786afbca51749b6e78ab515619e0cc327ede7336863f0c90 Mar 09 09:20:14 crc kubenswrapper[4792]: I0309 09:20:14.865233 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550800-99fr8" event={"ID":"1058fa4d-1b24-4f5a-b99b-0f4b2b6f041d","Type":"ContainerStarted","Data":"507a6590ca61fa22786afbca51749b6e78ab515619e0cc327ede7336863f0c90"} Mar 09 09:20:15 crc kubenswrapper[4792]: I0309 09:20:15.872497 4792 generic.go:334] "Generic (PLEG): container finished" podID="1058fa4d-1b24-4f5a-b99b-0f4b2b6f041d" containerID="f319a389f728f78767a6ee2c87b636f3af093985e69ff0e0fe771ff633407050" exitCode=0 Mar 09 09:20:15 crc kubenswrapper[4792]: I0309 09:20:15.872696 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550800-99fr8" event={"ID":"1058fa4d-1b24-4f5a-b99b-0f4b2b6f041d","Type":"ContainerDied","Data":"f319a389f728f78767a6ee2c87b636f3af093985e69ff0e0fe771ff633407050"} Mar 09 09:20:17 crc kubenswrapper[4792]: I0309 09:20:17.093279 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550800-99fr8" Mar 09 09:20:17 crc kubenswrapper[4792]: I0309 09:20:17.190404 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbv7q\" (UniqueName: \"kubernetes.io/projected/1058fa4d-1b24-4f5a-b99b-0f4b2b6f041d-kube-api-access-dbv7q\") pod \"1058fa4d-1b24-4f5a-b99b-0f4b2b6f041d\" (UID: \"1058fa4d-1b24-4f5a-b99b-0f4b2b6f041d\") " Mar 09 09:20:17 crc kubenswrapper[4792]: I0309 09:20:17.198367 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1058fa4d-1b24-4f5a-b99b-0f4b2b6f041d-kube-api-access-dbv7q" (OuterVolumeSpecName: "kube-api-access-dbv7q") pod "1058fa4d-1b24-4f5a-b99b-0f4b2b6f041d" (UID: "1058fa4d-1b24-4f5a-b99b-0f4b2b6f041d"). InnerVolumeSpecName "kube-api-access-dbv7q". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:20:17 crc kubenswrapper[4792]: I0309 09:20:17.292699 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbv7q\" (UniqueName: \"kubernetes.io/projected/1058fa4d-1b24-4f5a-b99b-0f4b2b6f041d-kube-api-access-dbv7q\") on node \"crc\" DevicePath \"\"" Mar 09 09:20:17 crc kubenswrapper[4792]: I0309 09:20:17.886372 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550800-99fr8" event={"ID":"1058fa4d-1b24-4f5a-b99b-0f4b2b6f041d","Type":"ContainerDied","Data":"507a6590ca61fa22786afbca51749b6e78ab515619e0cc327ede7336863f0c90"} Mar 09 09:20:17 crc kubenswrapper[4792]: I0309 09:20:17.886776 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="507a6590ca61fa22786afbca51749b6e78ab515619e0cc327ede7336863f0c90" Mar 09 09:20:17 crc kubenswrapper[4792]: I0309 09:20:17.886420 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550800-99fr8" Mar 09 09:20:18 crc kubenswrapper[4792]: I0309 09:20:18.172389 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550794-wqfmq"] Mar 09 09:20:18 crc kubenswrapper[4792]: I0309 09:20:18.179792 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550794-wqfmq"] Mar 09 09:20:19 crc kubenswrapper[4792]: I0309 09:20:19.671638 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4b589961-be09-4888-9ee2-49bf55db091a" path="/var/lib/kubelet/pods/4b589961-be09-4888-9ee2-49bf55db091a/volumes" Mar 09 09:20:26 crc kubenswrapper[4792]: I0309 09:20:26.631838 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-mbdpz" Mar 09 09:20:26 crc kubenswrapper[4792]: I0309 09:20:26.782519 4792 scope.go:117] "RemoveContainer" containerID="f3ad56061bf5596cb1f3785983e5f3c56a9884125309ab977bf20b656b7ae980" Mar 09 09:20:34 crc kubenswrapper[4792]: I0309 09:20:34.907001 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82dsz92"] Mar 09 09:20:34 crc kubenswrapper[4792]: E0309 09:20:34.908129 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1058fa4d-1b24-4f5a-b99b-0f4b2b6f041d" containerName="oc" Mar 09 09:20:34 crc kubenswrapper[4792]: I0309 09:20:34.908145 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="1058fa4d-1b24-4f5a-b99b-0f4b2b6f041d" containerName="oc" Mar 09 09:20:34 crc kubenswrapper[4792]: I0309 09:20:34.908266 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="1058fa4d-1b24-4f5a-b99b-0f4b2b6f041d" containerName="oc" Mar 09 09:20:34 crc kubenswrapper[4792]: I0309 09:20:34.909200 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82dsz92" Mar 09 09:20:34 crc kubenswrapper[4792]: I0309 09:20:34.912500 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 09 09:20:34 crc kubenswrapper[4792]: I0309 09:20:34.922783 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82dsz92"] Mar 09 09:20:34 crc kubenswrapper[4792]: I0309 09:20:34.923674 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/95e345ef-d076-4754-b2e9-db935995c8c0-util\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82dsz92\" (UID: \"95e345ef-d076-4754-b2e9-db935995c8c0\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82dsz92" Mar 09 09:20:34 crc kubenswrapper[4792]: I0309 09:20:34.923749 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/95e345ef-d076-4754-b2e9-db935995c8c0-bundle\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82dsz92\" (UID: \"95e345ef-d076-4754-b2e9-db935995c8c0\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82dsz92" Mar 09 09:20:34 crc kubenswrapper[4792]: I0309 09:20:34.923785 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m5j27\" (UniqueName: \"kubernetes.io/projected/95e345ef-d076-4754-b2e9-db935995c8c0-kube-api-access-m5j27\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82dsz92\" (UID: \"95e345ef-d076-4754-b2e9-db935995c8c0\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82dsz92" Mar 09 09:20:35 crc kubenswrapper[4792]: 
I0309 09:20:35.025487 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/95e345ef-d076-4754-b2e9-db935995c8c0-bundle\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82dsz92\" (UID: \"95e345ef-d076-4754-b2e9-db935995c8c0\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82dsz92" Mar 09 09:20:35 crc kubenswrapper[4792]: I0309 09:20:35.025569 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m5j27\" (UniqueName: \"kubernetes.io/projected/95e345ef-d076-4754-b2e9-db935995c8c0-kube-api-access-m5j27\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82dsz92\" (UID: \"95e345ef-d076-4754-b2e9-db935995c8c0\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82dsz92" Mar 09 09:20:35 crc kubenswrapper[4792]: I0309 09:20:35.025620 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/95e345ef-d076-4754-b2e9-db935995c8c0-util\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82dsz92\" (UID: \"95e345ef-d076-4754-b2e9-db935995c8c0\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82dsz92" Mar 09 09:20:35 crc kubenswrapper[4792]: I0309 09:20:35.025989 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/95e345ef-d076-4754-b2e9-db935995c8c0-bundle\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82dsz92\" (UID: \"95e345ef-d076-4754-b2e9-db935995c8c0\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82dsz92" Mar 09 09:20:35 crc kubenswrapper[4792]: I0309 09:20:35.026161 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/95e345ef-d076-4754-b2e9-db935995c8c0-util\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82dsz92\" (UID: \"95e345ef-d076-4754-b2e9-db935995c8c0\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82dsz92" Mar 09 09:20:35 crc kubenswrapper[4792]: I0309 09:20:35.057817 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m5j27\" (UniqueName: \"kubernetes.io/projected/95e345ef-d076-4754-b2e9-db935995c8c0-kube-api-access-m5j27\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82dsz92\" (UID: \"95e345ef-d076-4754-b2e9-db935995c8c0\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82dsz92" Mar 09 09:20:35 crc kubenswrapper[4792]: I0309 09:20:35.230812 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82dsz92" Mar 09 09:20:36 crc kubenswrapper[4792]: I0309 09:20:35.448794 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82dsz92"] Mar 09 09:20:36 crc kubenswrapper[4792]: I0309 09:20:36.014696 4792 generic.go:334] "Generic (PLEG): container finished" podID="95e345ef-d076-4754-b2e9-db935995c8c0" containerID="e23893ff2ebf12398fc8c71ae7360e2d3ef2ee32e4469a966460e28e38c3336d" exitCode=0 Mar 09 09:20:36 crc kubenswrapper[4792]: I0309 09:20:36.014888 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82dsz92" event={"ID":"95e345ef-d076-4754-b2e9-db935995c8c0","Type":"ContainerDied","Data":"e23893ff2ebf12398fc8c71ae7360e2d3ef2ee32e4469a966460e28e38c3336d"} Mar 09 09:20:36 crc kubenswrapper[4792]: I0309 09:20:36.015385 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82dsz92" event={"ID":"95e345ef-d076-4754-b2e9-db935995c8c0","Type":"ContainerStarted","Data":"745ef2c7b2675a8e68e08cb73662efecfc28c782803988a906b70c37553bc5c8"} Mar 09 09:20:40 crc kubenswrapper[4792]: I0309 09:20:40.047574 4792 generic.go:334] "Generic (PLEG): container finished" podID="95e345ef-d076-4754-b2e9-db935995c8c0" containerID="990e6be0f4cf7619fd612d7bc88d68571c8a7d5f7cc09e43ae99ca36635f5b0f" exitCode=0 Mar 09 09:20:40 crc kubenswrapper[4792]: I0309 09:20:40.047668 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82dsz92" event={"ID":"95e345ef-d076-4754-b2e9-db935995c8c0","Type":"ContainerDied","Data":"990e6be0f4cf7619fd612d7bc88d68571c8a7d5f7cc09e43ae99ca36635f5b0f"} Mar 09 09:20:41 crc kubenswrapper[4792]: I0309 09:20:41.059273 4792 generic.go:334] "Generic (PLEG): container finished" podID="95e345ef-d076-4754-b2e9-db935995c8c0" containerID="ac51e93db9be1fb8f25740ea8f4b78e9d508f4fea118a91efd538504bfad28b6" exitCode=0 Mar 09 09:20:41 crc kubenswrapper[4792]: I0309 09:20:41.059368 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82dsz92" event={"ID":"95e345ef-d076-4754-b2e9-db935995c8c0","Type":"ContainerDied","Data":"ac51e93db9be1fb8f25740ea8f4b78e9d508f4fea118a91efd538504bfad28b6"} Mar 09 09:20:42 crc kubenswrapper[4792]: I0309 09:20:42.300441 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82dsz92" Mar 09 09:20:42 crc kubenswrapper[4792]: I0309 09:20:42.436260 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/95e345ef-d076-4754-b2e9-db935995c8c0-util\") pod \"95e345ef-d076-4754-b2e9-db935995c8c0\" (UID: \"95e345ef-d076-4754-b2e9-db935995c8c0\") " Mar 09 09:20:42 crc kubenswrapper[4792]: I0309 09:20:42.436327 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/95e345ef-d076-4754-b2e9-db935995c8c0-bundle\") pod \"95e345ef-d076-4754-b2e9-db935995c8c0\" (UID: \"95e345ef-d076-4754-b2e9-db935995c8c0\") " Mar 09 09:20:42 crc kubenswrapper[4792]: I0309 09:20:42.436377 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m5j27\" (UniqueName: \"kubernetes.io/projected/95e345ef-d076-4754-b2e9-db935995c8c0-kube-api-access-m5j27\") pod \"95e345ef-d076-4754-b2e9-db935995c8c0\" (UID: \"95e345ef-d076-4754-b2e9-db935995c8c0\") " Mar 09 09:20:42 crc kubenswrapper[4792]: I0309 09:20:42.437120 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/95e345ef-d076-4754-b2e9-db935995c8c0-bundle" (OuterVolumeSpecName: "bundle") pod "95e345ef-d076-4754-b2e9-db935995c8c0" (UID: "95e345ef-d076-4754-b2e9-db935995c8c0"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:20:42 crc kubenswrapper[4792]: I0309 09:20:42.442683 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/95e345ef-d076-4754-b2e9-db935995c8c0-kube-api-access-m5j27" (OuterVolumeSpecName: "kube-api-access-m5j27") pod "95e345ef-d076-4754-b2e9-db935995c8c0" (UID: "95e345ef-d076-4754-b2e9-db935995c8c0"). InnerVolumeSpecName "kube-api-access-m5j27". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:20:42 crc kubenswrapper[4792]: I0309 09:20:42.448163 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/95e345ef-d076-4754-b2e9-db935995c8c0-util" (OuterVolumeSpecName: "util") pod "95e345ef-d076-4754-b2e9-db935995c8c0" (UID: "95e345ef-d076-4754-b2e9-db935995c8c0"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:20:42 crc kubenswrapper[4792]: I0309 09:20:42.537747 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m5j27\" (UniqueName: \"kubernetes.io/projected/95e345ef-d076-4754-b2e9-db935995c8c0-kube-api-access-m5j27\") on node \"crc\" DevicePath \"\"" Mar 09 09:20:42 crc kubenswrapper[4792]: I0309 09:20:42.537784 4792 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/95e345ef-d076-4754-b2e9-db935995c8c0-util\") on node \"crc\" DevicePath \"\"" Mar 09 09:20:42 crc kubenswrapper[4792]: I0309 09:20:42.537796 4792 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/95e345ef-d076-4754-b2e9-db935995c8c0-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 09:20:43 crc kubenswrapper[4792]: I0309 09:20:43.074820 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82dsz92" event={"ID":"95e345ef-d076-4754-b2e9-db935995c8c0","Type":"ContainerDied","Data":"745ef2c7b2675a8e68e08cb73662efecfc28c782803988a906b70c37553bc5c8"} Mar 09 09:20:43 crc kubenswrapper[4792]: I0309 09:20:43.074889 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="745ef2c7b2675a8e68e08cb73662efecfc28c782803988a906b70c37553bc5c8" Mar 09 09:20:43 crc kubenswrapper[4792]: I0309 09:20:43.074980 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82dsz92" Mar 09 09:20:43 crc kubenswrapper[4792]: I0309 09:20:43.214737 4792 patch_prober.go:28] interesting pod/machine-config-daemon-97tth container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 09:20:43 crc kubenswrapper[4792]: I0309 09:20:43.214832 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 09:20:46 crc kubenswrapper[4792]: I0309 09:20:46.374780 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-75c5dccd6c-rqfwt"] Mar 09 09:20:46 crc kubenswrapper[4792]: E0309 09:20:46.376228 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95e345ef-d076-4754-b2e9-db935995c8c0" containerName="extract" Mar 09 09:20:46 crc kubenswrapper[4792]: I0309 09:20:46.376305 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="95e345ef-d076-4754-b2e9-db935995c8c0" containerName="extract" Mar 09 09:20:46 crc kubenswrapper[4792]: E0309 09:20:46.376371 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95e345ef-d076-4754-b2e9-db935995c8c0" containerName="util" Mar 09 09:20:46 crc kubenswrapper[4792]: I0309 09:20:46.376432 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="95e345ef-d076-4754-b2e9-db935995c8c0" containerName="util" Mar 09 09:20:46 crc kubenswrapper[4792]: E0309 09:20:46.376508 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95e345ef-d076-4754-b2e9-db935995c8c0" containerName="pull" Mar 09 
09:20:46 crc kubenswrapper[4792]: I0309 09:20:46.376560 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="95e345ef-d076-4754-b2e9-db935995c8c0" containerName="pull" Mar 09 09:20:46 crc kubenswrapper[4792]: I0309 09:20:46.376718 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="95e345ef-d076-4754-b2e9-db935995c8c0" containerName="extract" Mar 09 09:20:46 crc kubenswrapper[4792]: I0309 09:20:46.377207 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-75c5dccd6c-rqfwt" Mar 09 09:20:46 crc kubenswrapper[4792]: I0309 09:20:46.380195 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-dp4lv" Mar 09 09:20:46 crc kubenswrapper[4792]: I0309 09:20:46.380265 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Mar 09 09:20:46 crc kubenswrapper[4792]: I0309 09:20:46.380980 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Mar 09 09:20:46 crc kubenswrapper[4792]: I0309 09:20:46.396889 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-75c5dccd6c-rqfwt"] Mar 09 09:20:46 crc kubenswrapper[4792]: I0309 09:20:46.493097 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v9fp5\" (UniqueName: \"kubernetes.io/projected/c08f74f8-f8d6-48e8-bde0-9369c92969b0-kube-api-access-v9fp5\") pod \"nmstate-operator-75c5dccd6c-rqfwt\" (UID: \"c08f74f8-f8d6-48e8-bde0-9369c92969b0\") " pod="openshift-nmstate/nmstate-operator-75c5dccd6c-rqfwt" Mar 09 09:20:46 crc kubenswrapper[4792]: I0309 09:20:46.594544 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v9fp5\" (UniqueName: \"kubernetes.io/projected/c08f74f8-f8d6-48e8-bde0-9369c92969b0-kube-api-access-v9fp5\") pod 
\"nmstate-operator-75c5dccd6c-rqfwt\" (UID: \"c08f74f8-f8d6-48e8-bde0-9369c92969b0\") " pod="openshift-nmstate/nmstate-operator-75c5dccd6c-rqfwt" Mar 09 09:20:46 crc kubenswrapper[4792]: I0309 09:20:46.626575 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v9fp5\" (UniqueName: \"kubernetes.io/projected/c08f74f8-f8d6-48e8-bde0-9369c92969b0-kube-api-access-v9fp5\") pod \"nmstate-operator-75c5dccd6c-rqfwt\" (UID: \"c08f74f8-f8d6-48e8-bde0-9369c92969b0\") " pod="openshift-nmstate/nmstate-operator-75c5dccd6c-rqfwt" Mar 09 09:20:46 crc kubenswrapper[4792]: I0309 09:20:46.693191 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-75c5dccd6c-rqfwt" Mar 09 09:20:47 crc kubenswrapper[4792]: I0309 09:20:47.133329 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-75c5dccd6c-rqfwt"] Mar 09 09:20:47 crc kubenswrapper[4792]: I0309 09:20:47.635303 4792 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 09 09:20:48 crc kubenswrapper[4792]: I0309 09:20:48.102604 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-75c5dccd6c-rqfwt" event={"ID":"c08f74f8-f8d6-48e8-bde0-9369c92969b0","Type":"ContainerStarted","Data":"2281178a37be8312dce125c4237626d257c28467bb729eeba9c84415646fd67d"} Mar 09 09:20:50 crc kubenswrapper[4792]: I0309 09:20:50.118766 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-75c5dccd6c-rqfwt" event={"ID":"c08f74f8-f8d6-48e8-bde0-9369c92969b0","Type":"ContainerStarted","Data":"faebcddde3e68cd57a98240d219b976f52d401d617e960db0aa8524fa45da764"} Mar 09 09:20:50 crc kubenswrapper[4792]: I0309 09:20:50.141933 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-75c5dccd6c-rqfwt" 
podStartSLOduration=1.7244763760000001 podStartE2EDuration="4.141911343s" podCreationTimestamp="2026-03-09 09:20:46 +0000 UTC" firstStartedPulling="2026-03-09 09:20:47.142285386 +0000 UTC m=+812.172486138" lastFinishedPulling="2026-03-09 09:20:49.559720353 +0000 UTC m=+814.589921105" observedRunningTime="2026-03-09 09:20:50.14148512 +0000 UTC m=+815.171685882" watchObservedRunningTime="2026-03-09 09:20:50.141911343 +0000 UTC m=+815.172112105" Mar 09 09:20:54 crc kubenswrapper[4792]: I0309 09:20:54.871011 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-69594cc75-rwpn2"] Mar 09 09:20:54 crc kubenswrapper[4792]: I0309 09:20:54.872617 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-69594cc75-rwpn2" Mar 09 09:20:54 crc kubenswrapper[4792]: I0309 09:20:54.874594 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-786f45cff4-bsxt5"] Mar 09 09:20:54 crc kubenswrapper[4792]: W0309 09:20:54.874644 4792 reflector.go:561] object-"openshift-nmstate"/"nmstate-handler-dockercfg-95mj9": failed to list *v1.Secret: secrets "nmstate-handler-dockercfg-95mj9" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-nmstate": no relationship found between node 'crc' and this object Mar 09 09:20:54 crc kubenswrapper[4792]: E0309 09:20:54.874682 4792 reflector.go:158] "Unhandled Error" err="object-\"openshift-nmstate\"/\"nmstate-handler-dockercfg-95mj9\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"nmstate-handler-dockercfg-95mj9\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-nmstate\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 09 09:20:54 crc kubenswrapper[4792]: I0309 09:20:54.875294 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-786f45cff4-bsxt5" Mar 09 09:20:54 crc kubenswrapper[4792]: I0309 09:20:54.877971 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Mar 09 09:20:54 crc kubenswrapper[4792]: I0309 09:20:54.891579 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-786f45cff4-bsxt5"] Mar 09 09:20:54 crc kubenswrapper[4792]: I0309 09:20:54.908326 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-5bwpq"] Mar 09 09:20:54 crc kubenswrapper[4792]: I0309 09:20:54.909022 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-5bwpq" Mar 09 09:20:54 crc kubenswrapper[4792]: I0309 09:20:54.922237 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-69594cc75-rwpn2"] Mar 09 09:20:54 crc kubenswrapper[4792]: I0309 09:20:54.995438 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/7370c580-bd4f-4659-8fe6-79d9f8b31c05-nmstate-lock\") pod \"nmstate-handler-5bwpq\" (UID: \"7370c580-bd4f-4659-8fe6-79d9f8b31c05\") " pod="openshift-nmstate/nmstate-handler-5bwpq" Mar 09 09:20:54 crc kubenswrapper[4792]: I0309 09:20:54.995490 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xf5r5\" (UniqueName: \"kubernetes.io/projected/0681a6fd-5531-4a3c-b2d8-59dfecd186c2-kube-api-access-xf5r5\") pod \"nmstate-metrics-69594cc75-rwpn2\" (UID: \"0681a6fd-5531-4a3c-b2d8-59dfecd186c2\") " pod="openshift-nmstate/nmstate-metrics-69594cc75-rwpn2" Mar 09 09:20:54 crc kubenswrapper[4792]: I0309 09:20:54.995520 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: 
\"kubernetes.io/secret/a8fdeb8b-8024-4916-b835-83a6da0b4ced-tls-key-pair\") pod \"nmstate-webhook-786f45cff4-bsxt5\" (UID: \"a8fdeb8b-8024-4916-b835-83a6da0b4ced\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-bsxt5" Mar 09 09:20:54 crc kubenswrapper[4792]: I0309 09:20:54.995534 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/7370c580-bd4f-4659-8fe6-79d9f8b31c05-dbus-socket\") pod \"nmstate-handler-5bwpq\" (UID: \"7370c580-bd4f-4659-8fe6-79d9f8b31c05\") " pod="openshift-nmstate/nmstate-handler-5bwpq" Mar 09 09:20:54 crc kubenswrapper[4792]: I0309 09:20:54.995548 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7445r\" (UniqueName: \"kubernetes.io/projected/7370c580-bd4f-4659-8fe6-79d9f8b31c05-kube-api-access-7445r\") pod \"nmstate-handler-5bwpq\" (UID: \"7370c580-bd4f-4659-8fe6-79d9f8b31c05\") " pod="openshift-nmstate/nmstate-handler-5bwpq" Mar 09 09:20:54 crc kubenswrapper[4792]: I0309 09:20:54.995566 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/7370c580-bd4f-4659-8fe6-79d9f8b31c05-ovs-socket\") pod \"nmstate-handler-5bwpq\" (UID: \"7370c580-bd4f-4659-8fe6-79d9f8b31c05\") " pod="openshift-nmstate/nmstate-handler-5bwpq" Mar 09 09:20:54 crc kubenswrapper[4792]: I0309 09:20:54.995582 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dmsth\" (UniqueName: \"kubernetes.io/projected/a8fdeb8b-8024-4916-b835-83a6da0b4ced-kube-api-access-dmsth\") pod \"nmstate-webhook-786f45cff4-bsxt5\" (UID: \"a8fdeb8b-8024-4916-b835-83a6da0b4ced\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-bsxt5" Mar 09 09:20:55 crc kubenswrapper[4792]: I0309 09:20:55.047058 4792 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-5hgb5"] Mar 09 09:20:55 crc kubenswrapper[4792]: I0309 09:20:55.047670 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-5hgb5" Mar 09 09:20:55 crc kubenswrapper[4792]: I0309 09:20:55.049893 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Mar 09 09:20:55 crc kubenswrapper[4792]: I0309 09:20:55.050101 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-ftc8j" Mar 09 09:20:55 crc kubenswrapper[4792]: I0309 09:20:55.050567 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Mar 09 09:20:55 crc kubenswrapper[4792]: I0309 09:20:55.106747 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dmsth\" (UniqueName: \"kubernetes.io/projected/a8fdeb8b-8024-4916-b835-83a6da0b4ced-kube-api-access-dmsth\") pod \"nmstate-webhook-786f45cff4-bsxt5\" (UID: \"a8fdeb8b-8024-4916-b835-83a6da0b4ced\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-bsxt5" Mar 09 09:20:55 crc kubenswrapper[4792]: I0309 09:20:55.107023 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/7370c580-bd4f-4659-8fe6-79d9f8b31c05-nmstate-lock\") pod \"nmstate-handler-5bwpq\" (UID: \"7370c580-bd4f-4659-8fe6-79d9f8b31c05\") " pod="openshift-nmstate/nmstate-handler-5bwpq" Mar 09 09:20:55 crc kubenswrapper[4792]: I0309 09:20:55.107122 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xf5r5\" (UniqueName: \"kubernetes.io/projected/0681a6fd-5531-4a3c-b2d8-59dfecd186c2-kube-api-access-xf5r5\") pod \"nmstate-metrics-69594cc75-rwpn2\" (UID: \"0681a6fd-5531-4a3c-b2d8-59dfecd186c2\") " pod="openshift-nmstate/nmstate-metrics-69594cc75-rwpn2" 
Mar 09 09:20:55 crc kubenswrapper[4792]: I0309 09:20:55.107226 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/a8fdeb8b-8024-4916-b835-83a6da0b4ced-tls-key-pair\") pod \"nmstate-webhook-786f45cff4-bsxt5\" (UID: \"a8fdeb8b-8024-4916-b835-83a6da0b4ced\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-bsxt5" Mar 09 09:20:55 crc kubenswrapper[4792]: I0309 09:20:55.107466 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/7370c580-bd4f-4659-8fe6-79d9f8b31c05-dbus-socket\") pod \"nmstate-handler-5bwpq\" (UID: \"7370c580-bd4f-4659-8fe6-79d9f8b31c05\") " pod="openshift-nmstate/nmstate-handler-5bwpq" Mar 09 09:20:55 crc kubenswrapper[4792]: I0309 09:20:55.107539 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7445r\" (UniqueName: \"kubernetes.io/projected/7370c580-bd4f-4659-8fe6-79d9f8b31c05-kube-api-access-7445r\") pod \"nmstate-handler-5bwpq\" (UID: \"7370c580-bd4f-4659-8fe6-79d9f8b31c05\") " pod="openshift-nmstate/nmstate-handler-5bwpq" Mar 09 09:20:55 crc kubenswrapper[4792]: I0309 09:20:55.107642 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/7370c580-bd4f-4659-8fe6-79d9f8b31c05-ovs-socket\") pod \"nmstate-handler-5bwpq\" (UID: \"7370c580-bd4f-4659-8fe6-79d9f8b31c05\") " pod="openshift-nmstate/nmstate-handler-5bwpq" Mar 09 09:20:55 crc kubenswrapper[4792]: I0309 09:20:55.107769 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/7370c580-bd4f-4659-8fe6-79d9f8b31c05-ovs-socket\") pod \"nmstate-handler-5bwpq\" (UID: \"7370c580-bd4f-4659-8fe6-79d9f8b31c05\") " pod="openshift-nmstate/nmstate-handler-5bwpq" Mar 09 09:20:55 crc kubenswrapper[4792]: I0309 09:20:55.108901 4792 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/7370c580-bd4f-4659-8fe6-79d9f8b31c05-nmstate-lock\") pod \"nmstate-handler-5bwpq\" (UID: \"7370c580-bd4f-4659-8fe6-79d9f8b31c05\") " pod="openshift-nmstate/nmstate-handler-5bwpq" Mar 09 09:20:55 crc kubenswrapper[4792]: E0309 09:20:55.109600 4792 secret.go:188] Couldn't get secret openshift-nmstate/openshift-nmstate-webhook: secret "openshift-nmstate-webhook" not found Mar 09 09:20:55 crc kubenswrapper[4792]: E0309 09:20:55.110328 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a8fdeb8b-8024-4916-b835-83a6da0b4ced-tls-key-pair podName:a8fdeb8b-8024-4916-b835-83a6da0b4ced nodeName:}" failed. No retries permitted until 2026-03-09 09:20:55.610311204 +0000 UTC m=+820.640511956 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-key-pair" (UniqueName: "kubernetes.io/secret/a8fdeb8b-8024-4916-b835-83a6da0b4ced-tls-key-pair") pod "nmstate-webhook-786f45cff4-bsxt5" (UID: "a8fdeb8b-8024-4916-b835-83a6da0b4ced") : secret "openshift-nmstate-webhook" not found Mar 09 09:20:55 crc kubenswrapper[4792]: I0309 09:20:55.109764 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/7370c580-bd4f-4659-8fe6-79d9f8b31c05-dbus-socket\") pod \"nmstate-handler-5bwpq\" (UID: \"7370c580-bd4f-4659-8fe6-79d9f8b31c05\") " pod="openshift-nmstate/nmstate-handler-5bwpq" Mar 09 09:20:55 crc kubenswrapper[4792]: I0309 09:20:55.113710 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-5hgb5"] Mar 09 09:20:55 crc kubenswrapper[4792]: I0309 09:20:55.132178 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xf5r5\" (UniqueName: \"kubernetes.io/projected/0681a6fd-5531-4a3c-b2d8-59dfecd186c2-kube-api-access-xf5r5\") pod 
\"nmstate-metrics-69594cc75-rwpn2\" (UID: \"0681a6fd-5531-4a3c-b2d8-59dfecd186c2\") " pod="openshift-nmstate/nmstate-metrics-69594cc75-rwpn2" Mar 09 09:20:55 crc kubenswrapper[4792]: I0309 09:20:55.135453 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7445r\" (UniqueName: \"kubernetes.io/projected/7370c580-bd4f-4659-8fe6-79d9f8b31c05-kube-api-access-7445r\") pod \"nmstate-handler-5bwpq\" (UID: \"7370c580-bd4f-4659-8fe6-79d9f8b31c05\") " pod="openshift-nmstate/nmstate-handler-5bwpq" Mar 09 09:20:55 crc kubenswrapper[4792]: I0309 09:20:55.138648 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dmsth\" (UniqueName: \"kubernetes.io/projected/a8fdeb8b-8024-4916-b835-83a6da0b4ced-kube-api-access-dmsth\") pod \"nmstate-webhook-786f45cff4-bsxt5\" (UID: \"a8fdeb8b-8024-4916-b835-83a6da0b4ced\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-bsxt5" Mar 09 09:20:55 crc kubenswrapper[4792]: I0309 09:20:55.208399 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/dffb3a22-ee53-4b05-921e-bf92456a5518-nginx-conf\") pod \"nmstate-console-plugin-5dcbbd79cf-5hgb5\" (UID: \"dffb3a22-ee53-4b05-921e-bf92456a5518\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-5hgb5" Mar 09 09:20:55 crc kubenswrapper[4792]: I0309 09:20:55.208515 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/dffb3a22-ee53-4b05-921e-bf92456a5518-plugin-serving-cert\") pod \"nmstate-console-plugin-5dcbbd79cf-5hgb5\" (UID: \"dffb3a22-ee53-4b05-921e-bf92456a5518\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-5hgb5" Mar 09 09:20:55 crc kubenswrapper[4792]: I0309 09:20:55.208542 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-h2849\" (UniqueName: \"kubernetes.io/projected/dffb3a22-ee53-4b05-921e-bf92456a5518-kube-api-access-h2849\") pod \"nmstate-console-plugin-5dcbbd79cf-5hgb5\" (UID: \"dffb3a22-ee53-4b05-921e-bf92456a5518\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-5hgb5" Mar 09 09:20:55 crc kubenswrapper[4792]: I0309 09:20:55.309655 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/dffb3a22-ee53-4b05-921e-bf92456a5518-plugin-serving-cert\") pod \"nmstate-console-plugin-5dcbbd79cf-5hgb5\" (UID: \"dffb3a22-ee53-4b05-921e-bf92456a5518\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-5hgb5" Mar 09 09:20:55 crc kubenswrapper[4792]: I0309 09:20:55.310017 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h2849\" (UniqueName: \"kubernetes.io/projected/dffb3a22-ee53-4b05-921e-bf92456a5518-kube-api-access-h2849\") pod \"nmstate-console-plugin-5dcbbd79cf-5hgb5\" (UID: \"dffb3a22-ee53-4b05-921e-bf92456a5518\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-5hgb5" Mar 09 09:20:55 crc kubenswrapper[4792]: I0309 09:20:55.310056 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/dffb3a22-ee53-4b05-921e-bf92456a5518-nginx-conf\") pod \"nmstate-console-plugin-5dcbbd79cf-5hgb5\" (UID: \"dffb3a22-ee53-4b05-921e-bf92456a5518\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-5hgb5" Mar 09 09:20:55 crc kubenswrapper[4792]: I0309 09:20:55.310884 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/dffb3a22-ee53-4b05-921e-bf92456a5518-nginx-conf\") pod \"nmstate-console-plugin-5dcbbd79cf-5hgb5\" (UID: \"dffb3a22-ee53-4b05-921e-bf92456a5518\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-5hgb5" Mar 09 09:20:55 crc 
kubenswrapper[4792]: I0309 09:20:55.319825 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/dffb3a22-ee53-4b05-921e-bf92456a5518-plugin-serving-cert\") pod \"nmstate-console-plugin-5dcbbd79cf-5hgb5\" (UID: \"dffb3a22-ee53-4b05-921e-bf92456a5518\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-5hgb5" Mar 09 09:20:55 crc kubenswrapper[4792]: I0309 09:20:55.353091 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h2849\" (UniqueName: \"kubernetes.io/projected/dffb3a22-ee53-4b05-921e-bf92456a5518-kube-api-access-h2849\") pod \"nmstate-console-plugin-5dcbbd79cf-5hgb5\" (UID: \"dffb3a22-ee53-4b05-921e-bf92456a5518\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-5hgb5" Mar 09 09:20:55 crc kubenswrapper[4792]: I0309 09:20:55.359291 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-5hgb5" Mar 09 09:20:55 crc kubenswrapper[4792]: I0309 09:20:55.374342 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-c798f89cd-9ggkq"] Mar 09 09:20:55 crc kubenswrapper[4792]: I0309 09:20:55.375284 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-c798f89cd-9ggkq" Mar 09 09:20:55 crc kubenswrapper[4792]: I0309 09:20:55.437350 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-c798f89cd-9ggkq"] Mar 09 09:20:55 crc kubenswrapper[4792]: I0309 09:20:55.512263 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8cd6d484-fe09-4086-a079-0f65d36aa252-service-ca\") pod \"console-c798f89cd-9ggkq\" (UID: \"8cd6d484-fe09-4086-a079-0f65d36aa252\") " pod="openshift-console/console-c798f89cd-9ggkq" Mar 09 09:20:55 crc kubenswrapper[4792]: I0309 09:20:55.512299 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8cd6d484-fe09-4086-a079-0f65d36aa252-console-config\") pod \"console-c798f89cd-9ggkq\" (UID: \"8cd6d484-fe09-4086-a079-0f65d36aa252\") " pod="openshift-console/console-c798f89cd-9ggkq" Mar 09 09:20:55 crc kubenswrapper[4792]: I0309 09:20:55.512321 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8cd6d484-fe09-4086-a079-0f65d36aa252-oauth-serving-cert\") pod \"console-c798f89cd-9ggkq\" (UID: \"8cd6d484-fe09-4086-a079-0f65d36aa252\") " pod="openshift-console/console-c798f89cd-9ggkq" Mar 09 09:20:55 crc kubenswrapper[4792]: I0309 09:20:55.512408 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8cd6d484-fe09-4086-a079-0f65d36aa252-console-oauth-config\") pod \"console-c798f89cd-9ggkq\" (UID: \"8cd6d484-fe09-4086-a079-0f65d36aa252\") " pod="openshift-console/console-c798f89cd-9ggkq" Mar 09 09:20:55 crc kubenswrapper[4792]: I0309 09:20:55.512468 4792 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8cd6d484-fe09-4086-a079-0f65d36aa252-trusted-ca-bundle\") pod \"console-c798f89cd-9ggkq\" (UID: \"8cd6d484-fe09-4086-a079-0f65d36aa252\") " pod="openshift-console/console-c798f89cd-9ggkq" Mar 09 09:20:55 crc kubenswrapper[4792]: I0309 09:20:55.512558 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8cd6d484-fe09-4086-a079-0f65d36aa252-console-serving-cert\") pod \"console-c798f89cd-9ggkq\" (UID: \"8cd6d484-fe09-4086-a079-0f65d36aa252\") " pod="openshift-console/console-c798f89cd-9ggkq" Mar 09 09:20:55 crc kubenswrapper[4792]: I0309 09:20:55.512596 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k84d8\" (UniqueName: \"kubernetes.io/projected/8cd6d484-fe09-4086-a079-0f65d36aa252-kube-api-access-k84d8\") pod \"console-c798f89cd-9ggkq\" (UID: \"8cd6d484-fe09-4086-a079-0f65d36aa252\") " pod="openshift-console/console-c798f89cd-9ggkq" Mar 09 09:20:55 crc kubenswrapper[4792]: I0309 09:20:55.602274 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-5hgb5"] Mar 09 09:20:55 crc kubenswrapper[4792]: I0309 09:20:55.614308 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8cd6d484-fe09-4086-a079-0f65d36aa252-console-serving-cert\") pod \"console-c798f89cd-9ggkq\" (UID: \"8cd6d484-fe09-4086-a079-0f65d36aa252\") " pod="openshift-console/console-c798f89cd-9ggkq" Mar 09 09:20:55 crc kubenswrapper[4792]: I0309 09:20:55.614372 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k84d8\" (UniqueName: 
\"kubernetes.io/projected/8cd6d484-fe09-4086-a079-0f65d36aa252-kube-api-access-k84d8\") pod \"console-c798f89cd-9ggkq\" (UID: \"8cd6d484-fe09-4086-a079-0f65d36aa252\") " pod="openshift-console/console-c798f89cd-9ggkq" Mar 09 09:20:55 crc kubenswrapper[4792]: I0309 09:20:55.614400 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8cd6d484-fe09-4086-a079-0f65d36aa252-service-ca\") pod \"console-c798f89cd-9ggkq\" (UID: \"8cd6d484-fe09-4086-a079-0f65d36aa252\") " pod="openshift-console/console-c798f89cd-9ggkq" Mar 09 09:20:55 crc kubenswrapper[4792]: I0309 09:20:55.614424 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8cd6d484-fe09-4086-a079-0f65d36aa252-console-config\") pod \"console-c798f89cd-9ggkq\" (UID: \"8cd6d484-fe09-4086-a079-0f65d36aa252\") " pod="openshift-console/console-c798f89cd-9ggkq" Mar 09 09:20:55 crc kubenswrapper[4792]: I0309 09:20:55.614450 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8cd6d484-fe09-4086-a079-0f65d36aa252-oauth-serving-cert\") pod \"console-c798f89cd-9ggkq\" (UID: \"8cd6d484-fe09-4086-a079-0f65d36aa252\") " pod="openshift-console/console-c798f89cd-9ggkq" Mar 09 09:20:55 crc kubenswrapper[4792]: I0309 09:20:55.614482 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/a8fdeb8b-8024-4916-b835-83a6da0b4ced-tls-key-pair\") pod \"nmstate-webhook-786f45cff4-bsxt5\" (UID: \"a8fdeb8b-8024-4916-b835-83a6da0b4ced\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-bsxt5" Mar 09 09:20:55 crc kubenswrapper[4792]: I0309 09:20:55.614514 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/8cd6d484-fe09-4086-a079-0f65d36aa252-console-oauth-config\") pod \"console-c798f89cd-9ggkq\" (UID: \"8cd6d484-fe09-4086-a079-0f65d36aa252\") " pod="openshift-console/console-c798f89cd-9ggkq" Mar 09 09:20:55 crc kubenswrapper[4792]: I0309 09:20:55.614552 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8cd6d484-fe09-4086-a079-0f65d36aa252-trusted-ca-bundle\") pod \"console-c798f89cd-9ggkq\" (UID: \"8cd6d484-fe09-4086-a079-0f65d36aa252\") " pod="openshift-console/console-c798f89cd-9ggkq" Mar 09 09:20:55 crc kubenswrapper[4792]: I0309 09:20:55.615514 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8cd6d484-fe09-4086-a079-0f65d36aa252-service-ca\") pod \"console-c798f89cd-9ggkq\" (UID: \"8cd6d484-fe09-4086-a079-0f65d36aa252\") " pod="openshift-console/console-c798f89cd-9ggkq" Mar 09 09:20:55 crc kubenswrapper[4792]: I0309 09:20:55.615926 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8cd6d484-fe09-4086-a079-0f65d36aa252-console-config\") pod \"console-c798f89cd-9ggkq\" (UID: \"8cd6d484-fe09-4086-a079-0f65d36aa252\") " pod="openshift-console/console-c798f89cd-9ggkq" Mar 09 09:20:55 crc kubenswrapper[4792]: I0309 09:20:55.615970 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8cd6d484-fe09-4086-a079-0f65d36aa252-trusted-ca-bundle\") pod \"console-c798f89cd-9ggkq\" (UID: \"8cd6d484-fe09-4086-a079-0f65d36aa252\") " pod="openshift-console/console-c798f89cd-9ggkq" Mar 09 09:20:55 crc kubenswrapper[4792]: I0309 09:20:55.617224 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8cd6d484-fe09-4086-a079-0f65d36aa252-oauth-serving-cert\") 
pod \"console-c798f89cd-9ggkq\" (UID: \"8cd6d484-fe09-4086-a079-0f65d36aa252\") " pod="openshift-console/console-c798f89cd-9ggkq" Mar 09 09:20:55 crc kubenswrapper[4792]: I0309 09:20:55.618550 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8cd6d484-fe09-4086-a079-0f65d36aa252-console-oauth-config\") pod \"console-c798f89cd-9ggkq\" (UID: \"8cd6d484-fe09-4086-a079-0f65d36aa252\") " pod="openshift-console/console-c798f89cd-9ggkq" Mar 09 09:20:55 crc kubenswrapper[4792]: I0309 09:20:55.621210 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8cd6d484-fe09-4086-a079-0f65d36aa252-console-serving-cert\") pod \"console-c798f89cd-9ggkq\" (UID: \"8cd6d484-fe09-4086-a079-0f65d36aa252\") " pod="openshift-console/console-c798f89cd-9ggkq" Mar 09 09:20:55 crc kubenswrapper[4792]: I0309 09:20:55.621300 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/a8fdeb8b-8024-4916-b835-83a6da0b4ced-tls-key-pair\") pod \"nmstate-webhook-786f45cff4-bsxt5\" (UID: \"a8fdeb8b-8024-4916-b835-83a6da0b4ced\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-bsxt5" Mar 09 09:20:55 crc kubenswrapper[4792]: I0309 09:20:55.630923 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k84d8\" (UniqueName: \"kubernetes.io/projected/8cd6d484-fe09-4086-a079-0f65d36aa252-kube-api-access-k84d8\") pod \"console-c798f89cd-9ggkq\" (UID: \"8cd6d484-fe09-4086-a079-0f65d36aa252\") " pod="openshift-console/console-c798f89cd-9ggkq" Mar 09 09:20:55 crc kubenswrapper[4792]: I0309 09:20:55.710674 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-c798f89cd-9ggkq" Mar 09 09:20:55 crc kubenswrapper[4792]: I0309 09:20:55.930919 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-c798f89cd-9ggkq"] Mar 09 09:20:56 crc kubenswrapper[4792]: I0309 09:20:56.155979 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-5hgb5" event={"ID":"dffb3a22-ee53-4b05-921e-bf92456a5518","Type":"ContainerStarted","Data":"81e3fe551642869b60dc5472d6dab671a92e1ba204b2d5c68b9f7e5556f8d81e"} Mar 09 09:20:56 crc kubenswrapper[4792]: I0309 09:20:56.157698 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-c798f89cd-9ggkq" event={"ID":"8cd6d484-fe09-4086-a079-0f65d36aa252","Type":"ContainerStarted","Data":"5a18421375088b35f950eccfc2656119103504d22df06d99e20db7fbdd8d9a3a"} Mar 09 09:20:56 crc kubenswrapper[4792]: I0309 09:20:56.157727 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-c798f89cd-9ggkq" event={"ID":"8cd6d484-fe09-4086-a079-0f65d36aa252","Type":"ContainerStarted","Data":"71bf81c6e48f2f7f1b315ac41c75dc7216ecf30483f4e9b54f3e7598f072c5f6"} Mar 09 09:20:56 crc kubenswrapper[4792]: I0309 09:20:56.174670 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-c798f89cd-9ggkq" podStartSLOduration=1.174648889 podStartE2EDuration="1.174648889s" podCreationTimestamp="2026-03-09 09:20:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:20:56.171880077 +0000 UTC m=+821.202080839" watchObservedRunningTime="2026-03-09 09:20:56.174648889 +0000 UTC m=+821.204849641" Mar 09 09:20:56 crc kubenswrapper[4792]: I0309 09:20:56.225955 4792 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." 
pod="openshift-nmstate/nmstate-metrics-69594cc75-rwpn2" secret="" err="failed to sync secret cache: timed out waiting for the condition" Mar 09 09:20:56 crc kubenswrapper[4792]: I0309 09:20:56.226059 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-69594cc75-rwpn2" Mar 09 09:20:56 crc kubenswrapper[4792]: I0309 09:20:56.261979 4792 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." pod="openshift-nmstate/nmstate-handler-5bwpq" secret="" err="failed to sync secret cache: timed out waiting for the condition" Mar 09 09:20:56 crc kubenswrapper[4792]: I0309 09:20:56.262259 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-5bwpq" Mar 09 09:20:56 crc kubenswrapper[4792]: W0309 09:20:56.290721 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7370c580_bd4f_4659_8fe6_79d9f8b31c05.slice/crio-816103f6176105beae271aeb2798374ede9142edfa11a2f829a68c3ce1b4c065 WatchSource:0}: Error finding container 816103f6176105beae271aeb2798374ede9142edfa11a2f829a68c3ce1b4c065: Status 404 returned error can't find the container with id 816103f6176105beae271aeb2798374ede9142edfa11a2f829a68c3ce1b4c065 Mar 09 09:20:56 crc kubenswrapper[4792]: I0309 09:20:56.305170 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-95mj9" Mar 09 09:20:56 crc kubenswrapper[4792]: I0309 09:20:56.305209 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-786f45cff4-bsxt5" Mar 09 09:20:56 crc kubenswrapper[4792]: I0309 09:20:56.522365 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-69594cc75-rwpn2"] Mar 09 09:20:56 crc kubenswrapper[4792]: W0309 09:20:56.530677 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0681a6fd_5531_4a3c_b2d8_59dfecd186c2.slice/crio-817c56e6106e39a8c13c7cb2dd4076c496decd980eea5cf118e0e256b50397a9 WatchSource:0}: Error finding container 817c56e6106e39a8c13c7cb2dd4076c496decd980eea5cf118e0e256b50397a9: Status 404 returned error can't find the container with id 817c56e6106e39a8c13c7cb2dd4076c496decd980eea5cf118e0e256b50397a9 Mar 09 09:20:56 crc kubenswrapper[4792]: I0309 09:20:56.561923 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-786f45cff4-bsxt5"] Mar 09 09:20:56 crc kubenswrapper[4792]: W0309 09:20:56.570718 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda8fdeb8b_8024_4916_b835_83a6da0b4ced.slice/crio-9bad46a31d82c93989ad5785012079265aa9eeb4244ff0a299912a5a78103a6d WatchSource:0}: Error finding container 9bad46a31d82c93989ad5785012079265aa9eeb4244ff0a299912a5a78103a6d: Status 404 returned error can't find the container with id 9bad46a31d82c93989ad5785012079265aa9eeb4244ff0a299912a5a78103a6d Mar 09 09:20:57 crc kubenswrapper[4792]: I0309 09:20:57.169906 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-786f45cff4-bsxt5" event={"ID":"a8fdeb8b-8024-4916-b835-83a6da0b4ced","Type":"ContainerStarted","Data":"9bad46a31d82c93989ad5785012079265aa9eeb4244ff0a299912a5a78103a6d"} Mar 09 09:20:57 crc kubenswrapper[4792]: I0309 09:20:57.171248 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-nmstate/nmstate-metrics-69594cc75-rwpn2" event={"ID":"0681a6fd-5531-4a3c-b2d8-59dfecd186c2","Type":"ContainerStarted","Data":"817c56e6106e39a8c13c7cb2dd4076c496decd980eea5cf118e0e256b50397a9"} Mar 09 09:20:57 crc kubenswrapper[4792]: I0309 09:20:57.172465 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-5bwpq" event={"ID":"7370c580-bd4f-4659-8fe6-79d9f8b31c05","Type":"ContainerStarted","Data":"816103f6176105beae271aeb2798374ede9142edfa11a2f829a68c3ce1b4c065"} Mar 09 09:20:59 crc kubenswrapper[4792]: I0309 09:20:59.185661 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-5hgb5" event={"ID":"dffb3a22-ee53-4b05-921e-bf92456a5518","Type":"ContainerStarted","Data":"da5faa08aefb7af7123fde9c39a5e9cefa4ce52e873be34dbd02dd0f743a4c06"} Mar 09 09:20:59 crc kubenswrapper[4792]: I0309 09:20:59.188715 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-786f45cff4-bsxt5" event={"ID":"a8fdeb8b-8024-4916-b835-83a6da0b4ced","Type":"ContainerStarted","Data":"a010aafb95b8b6ddd61d98cdbf5bbeb971569b585665843847f377ed199982b4"} Mar 09 09:20:59 crc kubenswrapper[4792]: I0309 09:20:59.189304 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-786f45cff4-bsxt5" Mar 09 09:20:59 crc kubenswrapper[4792]: I0309 09:20:59.190601 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-69594cc75-rwpn2" event={"ID":"0681a6fd-5531-4a3c-b2d8-59dfecd186c2","Type":"ContainerStarted","Data":"fe708b5083962a6e5cbb4108f5c9feed74976c84f604d02caae1ed61266743cb"} Mar 09 09:20:59 crc kubenswrapper[4792]: I0309 09:20:59.202782 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-5hgb5" podStartSLOduration=1.753115529 podStartE2EDuration="4.202768331s" podCreationTimestamp="2026-03-09 
09:20:55 +0000 UTC" firstStartedPulling="2026-03-09 09:20:55.610948457 +0000 UTC m=+820.641149209" lastFinishedPulling="2026-03-09 09:20:58.060601259 +0000 UTC m=+823.090802011" observedRunningTime="2026-03-09 09:20:59.198885146 +0000 UTC m=+824.229085898" watchObservedRunningTime="2026-03-09 09:20:59.202768331 +0000 UTC m=+824.232969083" Mar 09 09:20:59 crc kubenswrapper[4792]: I0309 09:20:59.225811 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-786f45cff4-bsxt5" podStartSLOduration=2.831713968 podStartE2EDuration="5.225793893s" podCreationTimestamp="2026-03-09 09:20:54 +0000 UTC" firstStartedPulling="2026-03-09 09:20:56.572899679 +0000 UTC m=+821.603100431" lastFinishedPulling="2026-03-09 09:20:58.966979604 +0000 UTC m=+823.997180356" observedRunningTime="2026-03-09 09:20:59.224294139 +0000 UTC m=+824.254494911" watchObservedRunningTime="2026-03-09 09:20:59.225793893 +0000 UTC m=+824.255994645" Mar 09 09:21:00 crc kubenswrapper[4792]: I0309 09:21:00.195990 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-5bwpq" event={"ID":"7370c580-bd4f-4659-8fe6-79d9f8b31c05","Type":"ContainerStarted","Data":"9a7fc1e3d869a72e9fc40d944f82e3d7661f649686b59a49c34cb8a85fba5ac2"} Mar 09 09:21:00 crc kubenswrapper[4792]: I0309 09:21:00.210888 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-5bwpq" podStartSLOduration=3.559531013 podStartE2EDuration="6.21086812s" podCreationTimestamp="2026-03-09 09:20:54 +0000 UTC" firstStartedPulling="2026-03-09 09:20:56.293184692 +0000 UTC m=+821.323385444" lastFinishedPulling="2026-03-09 09:20:58.944521799 +0000 UTC m=+823.974722551" observedRunningTime="2026-03-09 09:21:00.208606003 +0000 UTC m=+825.238806755" watchObservedRunningTime="2026-03-09 09:21:00.21086812 +0000 UTC m=+825.241068872" Mar 09 09:21:01 crc kubenswrapper[4792]: I0309 09:21:01.201537 4792 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-5bwpq" Mar 09 09:21:05 crc kubenswrapper[4792]: I0309 09:21:05.225837 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-69594cc75-rwpn2" event={"ID":"0681a6fd-5531-4a3c-b2d8-59dfecd186c2","Type":"ContainerStarted","Data":"72e76ff7b4c698d3887159d367058865367405babc69393da92365081ad8c1df"} Mar 09 09:21:05 crc kubenswrapper[4792]: I0309 09:21:05.239453 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-69594cc75-rwpn2" podStartSLOduration=3.268354095 podStartE2EDuration="11.239435692s" podCreationTimestamp="2026-03-09 09:20:54 +0000 UTC" firstStartedPulling="2026-03-09 09:20:56.533609345 +0000 UTC m=+821.563810097" lastFinishedPulling="2026-03-09 09:21:04.504690942 +0000 UTC m=+829.534891694" observedRunningTime="2026-03-09 09:21:05.236941548 +0000 UTC m=+830.267142300" watchObservedRunningTime="2026-03-09 09:21:05.239435692 +0000 UTC m=+830.269636444" Mar 09 09:21:05 crc kubenswrapper[4792]: I0309 09:21:05.711710 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-c798f89cd-9ggkq" Mar 09 09:21:05 crc kubenswrapper[4792]: I0309 09:21:05.711797 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-c798f89cd-9ggkq" Mar 09 09:21:05 crc kubenswrapper[4792]: I0309 09:21:05.716509 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-c798f89cd-9ggkq" Mar 09 09:21:06 crc kubenswrapper[4792]: I0309 09:21:06.237993 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-c798f89cd-9ggkq" Mar 09 09:21:06 crc kubenswrapper[4792]: I0309 09:21:06.285082 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-jh5pl"] Mar 09 09:21:06 crc kubenswrapper[4792]: 
I0309 09:21:06.352036 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-5bwpq" Mar 09 09:21:13 crc kubenswrapper[4792]: I0309 09:21:13.214628 4792 patch_prober.go:28] interesting pod/machine-config-daemon-97tth container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 09:21:13 crc kubenswrapper[4792]: I0309 09:21:13.214936 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 09:21:13 crc kubenswrapper[4792]: I0309 09:21:13.214981 4792 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-97tth" Mar 09 09:21:13 crc kubenswrapper[4792]: I0309 09:21:13.215603 4792 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7ebe6e06d3acdb8dc390125e4fb4991f20773eff67765cf4ee7e42fe0e4d4167"} pod="openshift-machine-config-operator/machine-config-daemon-97tth" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 09 09:21:13 crc kubenswrapper[4792]: I0309 09:21:13.215656 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" containerName="machine-config-daemon" containerID="cri-o://7ebe6e06d3acdb8dc390125e4fb4991f20773eff67765cf4ee7e42fe0e4d4167" gracePeriod=600 Mar 09 09:21:14 crc kubenswrapper[4792]: I0309 09:21:14.292336 4792 
generic.go:334] "Generic (PLEG): container finished" podID="bd11045a-d746-4b42-872c-8b8d1dd2d515" containerID="7ebe6e06d3acdb8dc390125e4fb4991f20773eff67765cf4ee7e42fe0e4d4167" exitCode=0 Mar 09 09:21:14 crc kubenswrapper[4792]: I0309 09:21:14.292434 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-97tth" event={"ID":"bd11045a-d746-4b42-872c-8b8d1dd2d515","Type":"ContainerDied","Data":"7ebe6e06d3acdb8dc390125e4fb4991f20773eff67765cf4ee7e42fe0e4d4167"} Mar 09 09:21:14 crc kubenswrapper[4792]: I0309 09:21:14.293027 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-97tth" event={"ID":"bd11045a-d746-4b42-872c-8b8d1dd2d515","Type":"ContainerStarted","Data":"db2023e6b3ec28be4276e65d0d9cd090ae22fa8851acb261970e9cecf046c144"} Mar 09 09:21:14 crc kubenswrapper[4792]: I0309 09:21:14.293056 4792 scope.go:117] "RemoveContainer" containerID="6e17fbe8c1658cdfd2034c0251ad7b1ba86ff152d37adcf43ef52a7d2b0de4db" Mar 09 09:21:16 crc kubenswrapper[4792]: I0309 09:21:16.310936 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-786f45cff4-bsxt5" Mar 09 09:21:26 crc kubenswrapper[4792]: I0309 09:21:26.499858 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-jzgx2"] Mar 09 09:21:26 crc kubenswrapper[4792]: I0309 09:21:26.508219 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jzgx2" Mar 09 09:21:26 crc kubenswrapper[4792]: I0309 09:21:26.542346 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-jzgx2"] Mar 09 09:21:26 crc kubenswrapper[4792]: I0309 09:21:26.642995 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0123630e-dc7f-49f0-9a9f-13146cec7a10-utilities\") pod \"redhat-marketplace-jzgx2\" (UID: \"0123630e-dc7f-49f0-9a9f-13146cec7a10\") " pod="openshift-marketplace/redhat-marketplace-jzgx2" Mar 09 09:21:26 crc kubenswrapper[4792]: I0309 09:21:26.643045 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t48br\" (UniqueName: \"kubernetes.io/projected/0123630e-dc7f-49f0-9a9f-13146cec7a10-kube-api-access-t48br\") pod \"redhat-marketplace-jzgx2\" (UID: \"0123630e-dc7f-49f0-9a9f-13146cec7a10\") " pod="openshift-marketplace/redhat-marketplace-jzgx2" Mar 09 09:21:26 crc kubenswrapper[4792]: I0309 09:21:26.643090 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0123630e-dc7f-49f0-9a9f-13146cec7a10-catalog-content\") pod \"redhat-marketplace-jzgx2\" (UID: \"0123630e-dc7f-49f0-9a9f-13146cec7a10\") " pod="openshift-marketplace/redhat-marketplace-jzgx2" Mar 09 09:21:26 crc kubenswrapper[4792]: I0309 09:21:26.743995 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0123630e-dc7f-49f0-9a9f-13146cec7a10-utilities\") pod \"redhat-marketplace-jzgx2\" (UID: \"0123630e-dc7f-49f0-9a9f-13146cec7a10\") " pod="openshift-marketplace/redhat-marketplace-jzgx2" Mar 09 09:21:26 crc kubenswrapper[4792]: I0309 09:21:26.744043 4792 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-t48br\" (UniqueName: \"kubernetes.io/projected/0123630e-dc7f-49f0-9a9f-13146cec7a10-kube-api-access-t48br\") pod \"redhat-marketplace-jzgx2\" (UID: \"0123630e-dc7f-49f0-9a9f-13146cec7a10\") " pod="openshift-marketplace/redhat-marketplace-jzgx2" Mar 09 09:21:26 crc kubenswrapper[4792]: I0309 09:21:26.744063 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0123630e-dc7f-49f0-9a9f-13146cec7a10-catalog-content\") pod \"redhat-marketplace-jzgx2\" (UID: \"0123630e-dc7f-49f0-9a9f-13146cec7a10\") " pod="openshift-marketplace/redhat-marketplace-jzgx2" Mar 09 09:21:26 crc kubenswrapper[4792]: I0309 09:21:26.744648 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0123630e-dc7f-49f0-9a9f-13146cec7a10-utilities\") pod \"redhat-marketplace-jzgx2\" (UID: \"0123630e-dc7f-49f0-9a9f-13146cec7a10\") " pod="openshift-marketplace/redhat-marketplace-jzgx2" Mar 09 09:21:26 crc kubenswrapper[4792]: I0309 09:21:26.745162 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0123630e-dc7f-49f0-9a9f-13146cec7a10-catalog-content\") pod \"redhat-marketplace-jzgx2\" (UID: \"0123630e-dc7f-49f0-9a9f-13146cec7a10\") " pod="openshift-marketplace/redhat-marketplace-jzgx2" Mar 09 09:21:26 crc kubenswrapper[4792]: I0309 09:21:26.774922 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t48br\" (UniqueName: \"kubernetes.io/projected/0123630e-dc7f-49f0-9a9f-13146cec7a10-kube-api-access-t48br\") pod \"redhat-marketplace-jzgx2\" (UID: \"0123630e-dc7f-49f0-9a9f-13146cec7a10\") " pod="openshift-marketplace/redhat-marketplace-jzgx2" Mar 09 09:21:26 crc kubenswrapper[4792]: I0309 09:21:26.842386 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jzgx2" Mar 09 09:21:27 crc kubenswrapper[4792]: I0309 09:21:27.282172 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-jzgx2"] Mar 09 09:21:27 crc kubenswrapper[4792]: W0309 09:21:27.289380 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0123630e_dc7f_49f0_9a9f_13146cec7a10.slice/crio-dac366d724d42295fe9b806c4857e3a87d653b50131cb795dbc1444292d68e2a WatchSource:0}: Error finding container dac366d724d42295fe9b806c4857e3a87d653b50131cb795dbc1444292d68e2a: Status 404 returned error can't find the container with id dac366d724d42295fe9b806c4857e3a87d653b50131cb795dbc1444292d68e2a Mar 09 09:21:27 crc kubenswrapper[4792]: I0309 09:21:27.368845 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jzgx2" event={"ID":"0123630e-dc7f-49f0-9a9f-13146cec7a10","Type":"ContainerStarted","Data":"dac366d724d42295fe9b806c4857e3a87d653b50131cb795dbc1444292d68e2a"} Mar 09 09:21:28 crc kubenswrapper[4792]: I0309 09:21:28.376232 4792 generic.go:334] "Generic (PLEG): container finished" podID="0123630e-dc7f-49f0-9a9f-13146cec7a10" containerID="2e942a81dbad387994bffbdb98a57a643a343fd6e182048d3e09dfd5e4cba7db" exitCode=0 Mar 09 09:21:28 crc kubenswrapper[4792]: I0309 09:21:28.376277 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jzgx2" event={"ID":"0123630e-dc7f-49f0-9a9f-13146cec7a10","Type":"ContainerDied","Data":"2e942a81dbad387994bffbdb98a57a643a343fd6e182048d3e09dfd5e4cba7db"} Mar 09 09:21:28 crc kubenswrapper[4792]: I0309 09:21:28.378888 4792 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 09 09:21:29 crc kubenswrapper[4792]: I0309 09:21:29.384153 4792 generic.go:334] "Generic (PLEG): container finished" 
podID="0123630e-dc7f-49f0-9a9f-13146cec7a10" containerID="fba5d746920aecf244c35d7a65104514b0bde8667d1b8ae66e2366866dffc130" exitCode=0 Mar 09 09:21:29 crc kubenswrapper[4792]: I0309 09:21:29.384344 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jzgx2" event={"ID":"0123630e-dc7f-49f0-9a9f-13146cec7a10","Type":"ContainerDied","Data":"fba5d746920aecf244c35d7a65104514b0bde8667d1b8ae66e2366866dffc130"} Mar 09 09:21:30 crc kubenswrapper[4792]: I0309 09:21:30.307946 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4h5lz6"] Mar 09 09:21:30 crc kubenswrapper[4792]: I0309 09:21:30.309312 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4h5lz6" Mar 09 09:21:30 crc kubenswrapper[4792]: I0309 09:21:30.311922 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 09 09:21:30 crc kubenswrapper[4792]: I0309 09:21:30.359635 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4h5lz6"] Mar 09 09:21:30 crc kubenswrapper[4792]: I0309 09:21:30.386034 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3419b911-375b-44c5-8be3-074ce9531ac5-bundle\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4h5lz6\" (UID: \"3419b911-375b-44c5-8be3-074ce9531ac5\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4h5lz6" Mar 09 09:21:30 crc kubenswrapper[4792]: I0309 09:21:30.386101 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/3419b911-375b-44c5-8be3-074ce9531ac5-util\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4h5lz6\" (UID: \"3419b911-375b-44c5-8be3-074ce9531ac5\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4h5lz6" Mar 09 09:21:30 crc kubenswrapper[4792]: I0309 09:21:30.386182 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9r9z5\" (UniqueName: \"kubernetes.io/projected/3419b911-375b-44c5-8be3-074ce9531ac5-kube-api-access-9r9z5\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4h5lz6\" (UID: \"3419b911-375b-44c5-8be3-074ce9531ac5\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4h5lz6" Mar 09 09:21:30 crc kubenswrapper[4792]: I0309 09:21:30.393290 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jzgx2" event={"ID":"0123630e-dc7f-49f0-9a9f-13146cec7a10","Type":"ContainerStarted","Data":"689de895fb0174e98f6bdaca176efff2dbfd4fad885bfa42ca2782238c615bff"} Mar 09 09:21:30 crc kubenswrapper[4792]: I0309 09:21:30.417442 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-jzgx2" podStartSLOduration=2.992403881 podStartE2EDuration="4.417424365s" podCreationTimestamp="2026-03-09 09:21:26 +0000 UTC" firstStartedPulling="2026-03-09 09:21:28.378219096 +0000 UTC m=+853.408419868" lastFinishedPulling="2026-03-09 09:21:29.8032396 +0000 UTC m=+854.833440352" observedRunningTime="2026-03-09 09:21:30.41281205 +0000 UTC m=+855.443012812" watchObservedRunningTime="2026-03-09 09:21:30.417424365 +0000 UTC m=+855.447625117" Mar 09 09:21:30 crc kubenswrapper[4792]: I0309 09:21:30.487524 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3419b911-375b-44c5-8be3-074ce9531ac5-bundle\") pod 
\"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4h5lz6\" (UID: \"3419b911-375b-44c5-8be3-074ce9531ac5\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4h5lz6" Mar 09 09:21:30 crc kubenswrapper[4792]: I0309 09:21:30.487583 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3419b911-375b-44c5-8be3-074ce9531ac5-util\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4h5lz6\" (UID: \"3419b911-375b-44c5-8be3-074ce9531ac5\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4h5lz6" Mar 09 09:21:30 crc kubenswrapper[4792]: I0309 09:21:30.487621 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9r9z5\" (UniqueName: \"kubernetes.io/projected/3419b911-375b-44c5-8be3-074ce9531ac5-kube-api-access-9r9z5\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4h5lz6\" (UID: \"3419b911-375b-44c5-8be3-074ce9531ac5\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4h5lz6" Mar 09 09:21:30 crc kubenswrapper[4792]: I0309 09:21:30.488328 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3419b911-375b-44c5-8be3-074ce9531ac5-util\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4h5lz6\" (UID: \"3419b911-375b-44c5-8be3-074ce9531ac5\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4h5lz6" Mar 09 09:21:30 crc kubenswrapper[4792]: I0309 09:21:30.488453 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3419b911-375b-44c5-8be3-074ce9531ac5-bundle\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4h5lz6\" (UID: \"3419b911-375b-44c5-8be3-074ce9531ac5\") " 
pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4h5lz6" Mar 09 09:21:30 crc kubenswrapper[4792]: I0309 09:21:30.507918 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9r9z5\" (UniqueName: \"kubernetes.io/projected/3419b911-375b-44c5-8be3-074ce9531ac5-kube-api-access-9r9z5\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4h5lz6\" (UID: \"3419b911-375b-44c5-8be3-074ce9531ac5\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4h5lz6" Mar 09 09:21:30 crc kubenswrapper[4792]: I0309 09:21:30.624159 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4h5lz6" Mar 09 09:21:30 crc kubenswrapper[4792]: I0309 09:21:30.835494 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4h5lz6"] Mar 09 09:21:31 crc kubenswrapper[4792]: I0309 09:21:31.375124 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-jh5pl" podUID="894f7c69-0119-4c19-b205-9780fb52b06e" containerName="console" containerID="cri-o://fe99716a26831a4feb1cab8ab0e7ab21cfc6e045d9fc4351aa58551f5f881b40" gracePeriod=15 Mar 09 09:21:31 crc kubenswrapper[4792]: I0309 09:21:31.398596 4792 generic.go:334] "Generic (PLEG): container finished" podID="3419b911-375b-44c5-8be3-074ce9531ac5" containerID="6a7cdaa06ceca0323e5f97f14d91b70cafd78656ddf34c6ff7889f732d0ebe1c" exitCode=0 Mar 09 09:21:31 crc kubenswrapper[4792]: I0309 09:21:31.398679 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4h5lz6" event={"ID":"3419b911-375b-44c5-8be3-074ce9531ac5","Type":"ContainerDied","Data":"6a7cdaa06ceca0323e5f97f14d91b70cafd78656ddf34c6ff7889f732d0ebe1c"} Mar 09 09:21:31 
crc kubenswrapper[4792]: I0309 09:21:31.398848 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4h5lz6" event={"ID":"3419b911-375b-44c5-8be3-074ce9531ac5","Type":"ContainerStarted","Data":"79eaa7a7607afdabba0cdabe458f73edc2a9c7b346bced4e04e1d887398e1fbb"} Mar 09 09:21:31 crc kubenswrapper[4792]: I0309 09:21:31.747693 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-jh5pl_894f7c69-0119-4c19-b205-9780fb52b06e/console/0.log" Mar 09 09:21:31 crc kubenswrapper[4792]: I0309 09:21:31.747788 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-jh5pl" Mar 09 09:21:31 crc kubenswrapper[4792]: I0309 09:21:31.807119 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/894f7c69-0119-4c19-b205-9780fb52b06e-console-serving-cert\") pod \"894f7c69-0119-4c19-b205-9780fb52b06e\" (UID: \"894f7c69-0119-4c19-b205-9780fb52b06e\") " Mar 09 09:21:31 crc kubenswrapper[4792]: I0309 09:21:31.807186 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/894f7c69-0119-4c19-b205-9780fb52b06e-console-oauth-config\") pod \"894f7c69-0119-4c19-b205-9780fb52b06e\" (UID: \"894f7c69-0119-4c19-b205-9780fb52b06e\") " Mar 09 09:21:31 crc kubenswrapper[4792]: I0309 09:21:31.807233 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-krplw\" (UniqueName: \"kubernetes.io/projected/894f7c69-0119-4c19-b205-9780fb52b06e-kube-api-access-krplw\") pod \"894f7c69-0119-4c19-b205-9780fb52b06e\" (UID: \"894f7c69-0119-4c19-b205-9780fb52b06e\") " Mar 09 09:21:31 crc kubenswrapper[4792]: I0309 09:21:31.807276 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/894f7c69-0119-4c19-b205-9780fb52b06e-oauth-serving-cert\") pod \"894f7c69-0119-4c19-b205-9780fb52b06e\" (UID: \"894f7c69-0119-4c19-b205-9780fb52b06e\") " Mar 09 09:21:31 crc kubenswrapper[4792]: I0309 09:21:31.807383 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/894f7c69-0119-4c19-b205-9780fb52b06e-console-config\") pod \"894f7c69-0119-4c19-b205-9780fb52b06e\" (UID: \"894f7c69-0119-4c19-b205-9780fb52b06e\") " Mar 09 09:21:31 crc kubenswrapper[4792]: I0309 09:21:31.807424 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/894f7c69-0119-4c19-b205-9780fb52b06e-service-ca\") pod \"894f7c69-0119-4c19-b205-9780fb52b06e\" (UID: \"894f7c69-0119-4c19-b205-9780fb52b06e\") " Mar 09 09:21:31 crc kubenswrapper[4792]: I0309 09:21:31.807463 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/894f7c69-0119-4c19-b205-9780fb52b06e-trusted-ca-bundle\") pod \"894f7c69-0119-4c19-b205-9780fb52b06e\" (UID: \"894f7c69-0119-4c19-b205-9780fb52b06e\") " Mar 09 09:21:31 crc kubenswrapper[4792]: I0309 09:21:31.808593 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/894f7c69-0119-4c19-b205-9780fb52b06e-console-config" (OuterVolumeSpecName: "console-config") pod "894f7c69-0119-4c19-b205-9780fb52b06e" (UID: "894f7c69-0119-4c19-b205-9780fb52b06e"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:21:31 crc kubenswrapper[4792]: I0309 09:21:31.808625 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/894f7c69-0119-4c19-b205-9780fb52b06e-service-ca" (OuterVolumeSpecName: "service-ca") pod "894f7c69-0119-4c19-b205-9780fb52b06e" (UID: "894f7c69-0119-4c19-b205-9780fb52b06e"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:21:31 crc kubenswrapper[4792]: I0309 09:21:31.808639 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/894f7c69-0119-4c19-b205-9780fb52b06e-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "894f7c69-0119-4c19-b205-9780fb52b06e" (UID: "894f7c69-0119-4c19-b205-9780fb52b06e"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:21:31 crc kubenswrapper[4792]: I0309 09:21:31.808908 4792 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/894f7c69-0119-4c19-b205-9780fb52b06e-console-config\") on node \"crc\" DevicePath \"\"" Mar 09 09:21:31 crc kubenswrapper[4792]: I0309 09:21:31.808930 4792 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/894f7c69-0119-4c19-b205-9780fb52b06e-service-ca\") on node \"crc\" DevicePath \"\"" Mar 09 09:21:31 crc kubenswrapper[4792]: I0309 09:21:31.808943 4792 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/894f7c69-0119-4c19-b205-9780fb52b06e-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 09:21:31 crc kubenswrapper[4792]: I0309 09:21:31.809822 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/894f7c69-0119-4c19-b205-9780fb52b06e-oauth-serving-cert" (OuterVolumeSpecName: 
"oauth-serving-cert") pod "894f7c69-0119-4c19-b205-9780fb52b06e" (UID: "894f7c69-0119-4c19-b205-9780fb52b06e"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:21:31 crc kubenswrapper[4792]: I0309 09:21:31.813831 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/894f7c69-0119-4c19-b205-9780fb52b06e-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "894f7c69-0119-4c19-b205-9780fb52b06e" (UID: "894f7c69-0119-4c19-b205-9780fb52b06e"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:21:31 crc kubenswrapper[4792]: I0309 09:21:31.814728 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/894f7c69-0119-4c19-b205-9780fb52b06e-kube-api-access-krplw" (OuterVolumeSpecName: "kube-api-access-krplw") pod "894f7c69-0119-4c19-b205-9780fb52b06e" (UID: "894f7c69-0119-4c19-b205-9780fb52b06e"). InnerVolumeSpecName "kube-api-access-krplw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:21:31 crc kubenswrapper[4792]: I0309 09:21:31.818589 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/894f7c69-0119-4c19-b205-9780fb52b06e-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "894f7c69-0119-4c19-b205-9780fb52b06e" (UID: "894f7c69-0119-4c19-b205-9780fb52b06e"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:21:31 crc kubenswrapper[4792]: I0309 09:21:31.909998 4792 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/894f7c69-0119-4c19-b205-9780fb52b06e-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 09:21:31 crc kubenswrapper[4792]: I0309 09:21:31.910042 4792 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/894f7c69-0119-4c19-b205-9780fb52b06e-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 09:21:31 crc kubenswrapper[4792]: I0309 09:21:31.910053 4792 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/894f7c69-0119-4c19-b205-9780fb52b06e-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 09 09:21:31 crc kubenswrapper[4792]: I0309 09:21:31.910062 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-krplw\" (UniqueName: \"kubernetes.io/projected/894f7c69-0119-4c19-b205-9780fb52b06e-kube-api-access-krplw\") on node \"crc\" DevicePath \"\"" Mar 09 09:21:32 crc kubenswrapper[4792]: I0309 09:21:32.405718 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-jh5pl_894f7c69-0119-4c19-b205-9780fb52b06e/console/0.log" Mar 09 09:21:32 crc kubenswrapper[4792]: I0309 09:21:32.406055 4792 generic.go:334] "Generic (PLEG): container finished" podID="894f7c69-0119-4c19-b205-9780fb52b06e" containerID="fe99716a26831a4feb1cab8ab0e7ab21cfc6e045d9fc4351aa58551f5f881b40" exitCode=2 Mar 09 09:21:32 crc kubenswrapper[4792]: I0309 09:21:32.406111 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-jh5pl" Mar 09 09:21:32 crc kubenswrapper[4792]: I0309 09:21:32.406106 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-jh5pl" event={"ID":"894f7c69-0119-4c19-b205-9780fb52b06e","Type":"ContainerDied","Data":"fe99716a26831a4feb1cab8ab0e7ab21cfc6e045d9fc4351aa58551f5f881b40"} Mar 09 09:21:32 crc kubenswrapper[4792]: I0309 09:21:32.406216 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-jh5pl" event={"ID":"894f7c69-0119-4c19-b205-9780fb52b06e","Type":"ContainerDied","Data":"c0c53cdb88646265b987d527b693d1063543d83446fd0feeac46711ec26bab27"} Mar 09 09:21:32 crc kubenswrapper[4792]: I0309 09:21:32.406237 4792 scope.go:117] "RemoveContainer" containerID="fe99716a26831a4feb1cab8ab0e7ab21cfc6e045d9fc4351aa58551f5f881b40" Mar 09 09:21:32 crc kubenswrapper[4792]: I0309 09:21:32.424499 4792 scope.go:117] "RemoveContainer" containerID="fe99716a26831a4feb1cab8ab0e7ab21cfc6e045d9fc4351aa58551f5f881b40" Mar 09 09:21:32 crc kubenswrapper[4792]: E0309 09:21:32.424951 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fe99716a26831a4feb1cab8ab0e7ab21cfc6e045d9fc4351aa58551f5f881b40\": container with ID starting with fe99716a26831a4feb1cab8ab0e7ab21cfc6e045d9fc4351aa58551f5f881b40 not found: ID does not exist" containerID="fe99716a26831a4feb1cab8ab0e7ab21cfc6e045d9fc4351aa58551f5f881b40" Mar 09 09:21:32 crc kubenswrapper[4792]: I0309 09:21:32.424991 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe99716a26831a4feb1cab8ab0e7ab21cfc6e045d9fc4351aa58551f5f881b40"} err="failed to get container status \"fe99716a26831a4feb1cab8ab0e7ab21cfc6e045d9fc4351aa58551f5f881b40\": rpc error: code = NotFound desc = could not find container \"fe99716a26831a4feb1cab8ab0e7ab21cfc6e045d9fc4351aa58551f5f881b40\": 
container with ID starting with fe99716a26831a4feb1cab8ab0e7ab21cfc6e045d9fc4351aa58551f5f881b40 not found: ID does not exist" Mar 09 09:21:32 crc kubenswrapper[4792]: I0309 09:21:32.443557 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-jh5pl"] Mar 09 09:21:32 crc kubenswrapper[4792]: I0309 09:21:32.449626 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-jh5pl"] Mar 09 09:21:32 crc kubenswrapper[4792]: I0309 09:21:32.671996 4792 patch_prober.go:28] interesting pod/console-f9d7485db-jh5pl container/console namespace/openshift-console: Readiness probe status=failure output="Get \"https://10.217.0.30:8443/health\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 09:21:32 crc kubenswrapper[4792]: I0309 09:21:32.672102 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/console-f9d7485db-jh5pl" podUID="894f7c69-0119-4c19-b205-9780fb52b06e" containerName="console" probeResult="failure" output="Get \"https://10.217.0.30:8443/health\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 09 09:21:33 crc kubenswrapper[4792]: I0309 09:21:33.415241 4792 generic.go:334] "Generic (PLEG): container finished" podID="3419b911-375b-44c5-8be3-074ce9531ac5" containerID="9e710ca669b3b47ebd6e61db37ef254e756e4878737c0975a631a9ca1101cec0" exitCode=0 Mar 09 09:21:33 crc kubenswrapper[4792]: I0309 09:21:33.415484 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4h5lz6" event={"ID":"3419b911-375b-44c5-8be3-074ce9531ac5","Type":"ContainerDied","Data":"9e710ca669b3b47ebd6e61db37ef254e756e4878737c0975a631a9ca1101cec0"} Mar 09 09:21:33 crc kubenswrapper[4792]: I0309 09:21:33.674168 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="894f7c69-0119-4c19-b205-9780fb52b06e" path="/var/lib/kubelet/pods/894f7c69-0119-4c19-b205-9780fb52b06e/volumes" Mar 09 09:21:33 crc kubenswrapper[4792]: I0309 09:21:33.873060 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-jn5tt"] Mar 09 09:21:33 crc kubenswrapper[4792]: E0309 09:21:33.873351 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="894f7c69-0119-4c19-b205-9780fb52b06e" containerName="console" Mar 09 09:21:33 crc kubenswrapper[4792]: I0309 09:21:33.873371 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="894f7c69-0119-4c19-b205-9780fb52b06e" containerName="console" Mar 09 09:21:33 crc kubenswrapper[4792]: I0309 09:21:33.873485 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="894f7c69-0119-4c19-b205-9780fb52b06e" containerName="console" Mar 09 09:21:33 crc kubenswrapper[4792]: I0309 09:21:33.874383 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jn5tt" Mar 09 09:21:33 crc kubenswrapper[4792]: I0309 09:21:33.887850 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jn5tt"] Mar 09 09:21:33 crc kubenswrapper[4792]: I0309 09:21:33.933542 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/996a5916-f3e2-4592-a243-bb84050a16c7-catalog-content\") pod \"redhat-operators-jn5tt\" (UID: \"996a5916-f3e2-4592-a243-bb84050a16c7\") " pod="openshift-marketplace/redhat-operators-jn5tt" Mar 09 09:21:33 crc kubenswrapper[4792]: I0309 09:21:33.933620 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q468j\" (UniqueName: \"kubernetes.io/projected/996a5916-f3e2-4592-a243-bb84050a16c7-kube-api-access-q468j\") pod \"redhat-operators-jn5tt\" (UID: 
\"996a5916-f3e2-4592-a243-bb84050a16c7\") " pod="openshift-marketplace/redhat-operators-jn5tt" Mar 09 09:21:33 crc kubenswrapper[4792]: I0309 09:21:33.933693 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/996a5916-f3e2-4592-a243-bb84050a16c7-utilities\") pod \"redhat-operators-jn5tt\" (UID: \"996a5916-f3e2-4592-a243-bb84050a16c7\") " pod="openshift-marketplace/redhat-operators-jn5tt" Mar 09 09:21:34 crc kubenswrapper[4792]: I0309 09:21:34.034655 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q468j\" (UniqueName: \"kubernetes.io/projected/996a5916-f3e2-4592-a243-bb84050a16c7-kube-api-access-q468j\") pod \"redhat-operators-jn5tt\" (UID: \"996a5916-f3e2-4592-a243-bb84050a16c7\") " pod="openshift-marketplace/redhat-operators-jn5tt" Mar 09 09:21:34 crc kubenswrapper[4792]: I0309 09:21:34.034755 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/996a5916-f3e2-4592-a243-bb84050a16c7-utilities\") pod \"redhat-operators-jn5tt\" (UID: \"996a5916-f3e2-4592-a243-bb84050a16c7\") " pod="openshift-marketplace/redhat-operators-jn5tt" Mar 09 09:21:34 crc kubenswrapper[4792]: I0309 09:21:34.034830 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/996a5916-f3e2-4592-a243-bb84050a16c7-catalog-content\") pod \"redhat-operators-jn5tt\" (UID: \"996a5916-f3e2-4592-a243-bb84050a16c7\") " pod="openshift-marketplace/redhat-operators-jn5tt" Mar 09 09:21:34 crc kubenswrapper[4792]: I0309 09:21:34.035277 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/996a5916-f3e2-4592-a243-bb84050a16c7-utilities\") pod \"redhat-operators-jn5tt\" (UID: \"996a5916-f3e2-4592-a243-bb84050a16c7\") " 
pod="openshift-marketplace/redhat-operators-jn5tt" Mar 09 09:21:34 crc kubenswrapper[4792]: I0309 09:21:34.035760 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/996a5916-f3e2-4592-a243-bb84050a16c7-catalog-content\") pod \"redhat-operators-jn5tt\" (UID: \"996a5916-f3e2-4592-a243-bb84050a16c7\") " pod="openshift-marketplace/redhat-operators-jn5tt" Mar 09 09:21:34 crc kubenswrapper[4792]: I0309 09:21:34.057827 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q468j\" (UniqueName: \"kubernetes.io/projected/996a5916-f3e2-4592-a243-bb84050a16c7-kube-api-access-q468j\") pod \"redhat-operators-jn5tt\" (UID: \"996a5916-f3e2-4592-a243-bb84050a16c7\") " pod="openshift-marketplace/redhat-operators-jn5tt" Mar 09 09:21:34 crc kubenswrapper[4792]: I0309 09:21:34.192271 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jn5tt" Mar 09 09:21:34 crc kubenswrapper[4792]: I0309 09:21:34.431081 4792 generic.go:334] "Generic (PLEG): container finished" podID="3419b911-375b-44c5-8be3-074ce9531ac5" containerID="bc519a63758ec224795577d4ee188f4d2b35be4358d146b001bc0387f7704a14" exitCode=0 Mar 09 09:21:34 crc kubenswrapper[4792]: I0309 09:21:34.431124 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4h5lz6" event={"ID":"3419b911-375b-44c5-8be3-074ce9531ac5","Type":"ContainerDied","Data":"bc519a63758ec224795577d4ee188f4d2b35be4358d146b001bc0387f7704a14"} Mar 09 09:21:34 crc kubenswrapper[4792]: I0309 09:21:34.445481 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jn5tt"] Mar 09 09:21:35 crc kubenswrapper[4792]: I0309 09:21:35.437491 4792 generic.go:334] "Generic (PLEG): container finished" podID="996a5916-f3e2-4592-a243-bb84050a16c7" 
containerID="0fb8bec21976a905be87b2e72682285d83af4c3cd1d8843c325b30290ea8c142" exitCode=0 Mar 09 09:21:35 crc kubenswrapper[4792]: I0309 09:21:35.437563 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jn5tt" event={"ID":"996a5916-f3e2-4592-a243-bb84050a16c7","Type":"ContainerDied","Data":"0fb8bec21976a905be87b2e72682285d83af4c3cd1d8843c325b30290ea8c142"} Mar 09 09:21:35 crc kubenswrapper[4792]: I0309 09:21:35.437820 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jn5tt" event={"ID":"996a5916-f3e2-4592-a243-bb84050a16c7","Type":"ContainerStarted","Data":"b357ef4af764c44a9860c60c0b8fc385ce24389957d1465f0cbbe24f4f5667bb"} Mar 09 09:21:35 crc kubenswrapper[4792]: I0309 09:21:35.690321 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4h5lz6" Mar 09 09:21:35 crc kubenswrapper[4792]: I0309 09:21:35.760032 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9r9z5\" (UniqueName: \"kubernetes.io/projected/3419b911-375b-44c5-8be3-074ce9531ac5-kube-api-access-9r9z5\") pod \"3419b911-375b-44c5-8be3-074ce9531ac5\" (UID: \"3419b911-375b-44c5-8be3-074ce9531ac5\") " Mar 09 09:21:35 crc kubenswrapper[4792]: I0309 09:21:35.760476 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3419b911-375b-44c5-8be3-074ce9531ac5-bundle\") pod \"3419b911-375b-44c5-8be3-074ce9531ac5\" (UID: \"3419b911-375b-44c5-8be3-074ce9531ac5\") " Mar 09 09:21:35 crc kubenswrapper[4792]: I0309 09:21:35.760555 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3419b911-375b-44c5-8be3-074ce9531ac5-util\") pod \"3419b911-375b-44c5-8be3-074ce9531ac5\" (UID: 
\"3419b911-375b-44c5-8be3-074ce9531ac5\") " Mar 09 09:21:35 crc kubenswrapper[4792]: I0309 09:21:35.761883 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3419b911-375b-44c5-8be3-074ce9531ac5-bundle" (OuterVolumeSpecName: "bundle") pod "3419b911-375b-44c5-8be3-074ce9531ac5" (UID: "3419b911-375b-44c5-8be3-074ce9531ac5"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:21:35 crc kubenswrapper[4792]: I0309 09:21:35.766372 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3419b911-375b-44c5-8be3-074ce9531ac5-kube-api-access-9r9z5" (OuterVolumeSpecName: "kube-api-access-9r9z5") pod "3419b911-375b-44c5-8be3-074ce9531ac5" (UID: "3419b911-375b-44c5-8be3-074ce9531ac5"). InnerVolumeSpecName "kube-api-access-9r9z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:21:35 crc kubenswrapper[4792]: I0309 09:21:35.774732 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3419b911-375b-44c5-8be3-074ce9531ac5-util" (OuterVolumeSpecName: "util") pod "3419b911-375b-44c5-8be3-074ce9531ac5" (UID: "3419b911-375b-44c5-8be3-074ce9531ac5"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:21:35 crc kubenswrapper[4792]: I0309 09:21:35.862783 4792 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3419b911-375b-44c5-8be3-074ce9531ac5-util\") on node \"crc\" DevicePath \"\"" Mar 09 09:21:35 crc kubenswrapper[4792]: I0309 09:21:35.862849 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9r9z5\" (UniqueName: \"kubernetes.io/projected/3419b911-375b-44c5-8be3-074ce9531ac5-kube-api-access-9r9z5\") on node \"crc\" DevicePath \"\"" Mar 09 09:21:35 crc kubenswrapper[4792]: I0309 09:21:35.862864 4792 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3419b911-375b-44c5-8be3-074ce9531ac5-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 09:21:36 crc kubenswrapper[4792]: I0309 09:21:36.446389 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jn5tt" event={"ID":"996a5916-f3e2-4592-a243-bb84050a16c7","Type":"ContainerStarted","Data":"1c15ac0a645a8709a0f5975a14ed87404c884ba24142610498088a650b4eac01"} Mar 09 09:21:36 crc kubenswrapper[4792]: I0309 09:21:36.449895 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4h5lz6" event={"ID":"3419b911-375b-44c5-8be3-074ce9531ac5","Type":"ContainerDied","Data":"79eaa7a7607afdabba0cdabe458f73edc2a9c7b346bced4e04e1d887398e1fbb"} Mar 09 09:21:36 crc kubenswrapper[4792]: I0309 09:21:36.449939 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="79eaa7a7607afdabba0cdabe458f73edc2a9c7b346bced4e04e1d887398e1fbb" Mar 09 09:21:36 crc kubenswrapper[4792]: I0309 09:21:36.450147 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4h5lz6" Mar 09 09:21:36 crc kubenswrapper[4792]: I0309 09:21:36.843150 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-jzgx2" Mar 09 09:21:36 crc kubenswrapper[4792]: I0309 09:21:36.843515 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-jzgx2" Mar 09 09:21:36 crc kubenswrapper[4792]: I0309 09:21:36.892502 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-jzgx2" Mar 09 09:21:37 crc kubenswrapper[4792]: I0309 09:21:37.456768 4792 generic.go:334] "Generic (PLEG): container finished" podID="996a5916-f3e2-4592-a243-bb84050a16c7" containerID="1c15ac0a645a8709a0f5975a14ed87404c884ba24142610498088a650b4eac01" exitCode=0 Mar 09 09:21:37 crc kubenswrapper[4792]: I0309 09:21:37.456846 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jn5tt" event={"ID":"996a5916-f3e2-4592-a243-bb84050a16c7","Type":"ContainerDied","Data":"1c15ac0a645a8709a0f5975a14ed87404c884ba24142610498088a650b4eac01"} Mar 09 09:21:37 crc kubenswrapper[4792]: I0309 09:21:37.505646 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-jzgx2" Mar 09 09:21:38 crc kubenswrapper[4792]: I0309 09:21:38.464152 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jn5tt" event={"ID":"996a5916-f3e2-4592-a243-bb84050a16c7","Type":"ContainerStarted","Data":"1229d80ff5830468b173ad15f1146eb795ac9980ba6a65344043785252ab3b42"} Mar 09 09:21:38 crc kubenswrapper[4792]: I0309 09:21:38.512349 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-jn5tt" podStartSLOduration=2.964133207 
podStartE2EDuration="5.51233425s" podCreationTimestamp="2026-03-09 09:21:33 +0000 UTC" firstStartedPulling="2026-03-09 09:21:35.43912527 +0000 UTC m=+860.469326022" lastFinishedPulling="2026-03-09 09:21:37.987326313 +0000 UTC m=+863.017527065" observedRunningTime="2026-03-09 09:21:38.509206558 +0000 UTC m=+863.539407310" watchObservedRunningTime="2026-03-09 09:21:38.51233425 +0000 UTC m=+863.542535002" Mar 09 09:21:40 crc kubenswrapper[4792]: I0309 09:21:40.456029 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-jzgx2"] Mar 09 09:21:40 crc kubenswrapper[4792]: I0309 09:21:40.456262 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-jzgx2" podUID="0123630e-dc7f-49f0-9a9f-13146cec7a10" containerName="registry-server" containerID="cri-o://689de895fb0174e98f6bdaca176efff2dbfd4fad885bfa42ca2782238c615bff" gracePeriod=2 Mar 09 09:21:40 crc kubenswrapper[4792]: I0309 09:21:40.848337 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jzgx2" Mar 09 09:21:40 crc kubenswrapper[4792]: I0309 09:21:40.933776 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0123630e-dc7f-49f0-9a9f-13146cec7a10-catalog-content\") pod \"0123630e-dc7f-49f0-9a9f-13146cec7a10\" (UID: \"0123630e-dc7f-49f0-9a9f-13146cec7a10\") " Mar 09 09:21:40 crc kubenswrapper[4792]: I0309 09:21:40.933843 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t48br\" (UniqueName: \"kubernetes.io/projected/0123630e-dc7f-49f0-9a9f-13146cec7a10-kube-api-access-t48br\") pod \"0123630e-dc7f-49f0-9a9f-13146cec7a10\" (UID: \"0123630e-dc7f-49f0-9a9f-13146cec7a10\") " Mar 09 09:21:40 crc kubenswrapper[4792]: I0309 09:21:40.934014 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0123630e-dc7f-49f0-9a9f-13146cec7a10-utilities\") pod \"0123630e-dc7f-49f0-9a9f-13146cec7a10\" (UID: \"0123630e-dc7f-49f0-9a9f-13146cec7a10\") " Mar 09 09:21:40 crc kubenswrapper[4792]: I0309 09:21:40.935413 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0123630e-dc7f-49f0-9a9f-13146cec7a10-utilities" (OuterVolumeSpecName: "utilities") pod "0123630e-dc7f-49f0-9a9f-13146cec7a10" (UID: "0123630e-dc7f-49f0-9a9f-13146cec7a10"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:21:40 crc kubenswrapper[4792]: I0309 09:21:40.940086 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0123630e-dc7f-49f0-9a9f-13146cec7a10-kube-api-access-t48br" (OuterVolumeSpecName: "kube-api-access-t48br") pod "0123630e-dc7f-49f0-9a9f-13146cec7a10" (UID: "0123630e-dc7f-49f0-9a9f-13146cec7a10"). InnerVolumeSpecName "kube-api-access-t48br". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:21:40 crc kubenswrapper[4792]: I0309 09:21:40.968094 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0123630e-dc7f-49f0-9a9f-13146cec7a10-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0123630e-dc7f-49f0-9a9f-13146cec7a10" (UID: "0123630e-dc7f-49f0-9a9f-13146cec7a10"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:21:41 crc kubenswrapper[4792]: I0309 09:21:41.034995 4792 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0123630e-dc7f-49f0-9a9f-13146cec7a10-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 09:21:41 crc kubenswrapper[4792]: I0309 09:21:41.035301 4792 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0123630e-dc7f-49f0-9a9f-13146cec7a10-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 09:21:41 crc kubenswrapper[4792]: I0309 09:21:41.035366 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t48br\" (UniqueName: \"kubernetes.io/projected/0123630e-dc7f-49f0-9a9f-13146cec7a10-kube-api-access-t48br\") on node \"crc\" DevicePath \"\"" Mar 09 09:21:41 crc kubenswrapper[4792]: I0309 09:21:41.481653 4792 generic.go:334] "Generic (PLEG): container finished" podID="0123630e-dc7f-49f0-9a9f-13146cec7a10" containerID="689de895fb0174e98f6bdaca176efff2dbfd4fad885bfa42ca2782238c615bff" exitCode=0 Mar 09 09:21:41 crc kubenswrapper[4792]: I0309 09:21:41.481698 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jzgx2" event={"ID":"0123630e-dc7f-49f0-9a9f-13146cec7a10","Type":"ContainerDied","Data":"689de895fb0174e98f6bdaca176efff2dbfd4fad885bfa42ca2782238c615bff"} Mar 09 09:21:41 crc kubenswrapper[4792]: I0309 09:21:41.481722 4792 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-jzgx2" event={"ID":"0123630e-dc7f-49f0-9a9f-13146cec7a10","Type":"ContainerDied","Data":"dac366d724d42295fe9b806c4857e3a87d653b50131cb795dbc1444292d68e2a"} Mar 09 09:21:41 crc kubenswrapper[4792]: I0309 09:21:41.481732 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jzgx2" Mar 09 09:21:41 crc kubenswrapper[4792]: I0309 09:21:41.481741 4792 scope.go:117] "RemoveContainer" containerID="689de895fb0174e98f6bdaca176efff2dbfd4fad885bfa42ca2782238c615bff" Mar 09 09:21:41 crc kubenswrapper[4792]: I0309 09:21:41.499647 4792 scope.go:117] "RemoveContainer" containerID="fba5d746920aecf244c35d7a65104514b0bde8667d1b8ae66e2366866dffc130" Mar 09 09:21:41 crc kubenswrapper[4792]: I0309 09:21:41.519304 4792 scope.go:117] "RemoveContainer" containerID="2e942a81dbad387994bffbdb98a57a643a343fd6e182048d3e09dfd5e4cba7db" Mar 09 09:21:41 crc kubenswrapper[4792]: I0309 09:21:41.521768 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-jzgx2"] Mar 09 09:21:41 crc kubenswrapper[4792]: I0309 09:21:41.537872 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-jzgx2"] Mar 09 09:21:41 crc kubenswrapper[4792]: I0309 09:21:41.551138 4792 scope.go:117] "RemoveContainer" containerID="689de895fb0174e98f6bdaca176efff2dbfd4fad885bfa42ca2782238c615bff" Mar 09 09:21:41 crc kubenswrapper[4792]: E0309 09:21:41.551629 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"689de895fb0174e98f6bdaca176efff2dbfd4fad885bfa42ca2782238c615bff\": container with ID starting with 689de895fb0174e98f6bdaca176efff2dbfd4fad885bfa42ca2782238c615bff not found: ID does not exist" containerID="689de895fb0174e98f6bdaca176efff2dbfd4fad885bfa42ca2782238c615bff" Mar 09 09:21:41 crc kubenswrapper[4792]: I0309 09:21:41.551676 4792 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"689de895fb0174e98f6bdaca176efff2dbfd4fad885bfa42ca2782238c615bff"} err="failed to get container status \"689de895fb0174e98f6bdaca176efff2dbfd4fad885bfa42ca2782238c615bff\": rpc error: code = NotFound desc = could not find container \"689de895fb0174e98f6bdaca176efff2dbfd4fad885bfa42ca2782238c615bff\": container with ID starting with 689de895fb0174e98f6bdaca176efff2dbfd4fad885bfa42ca2782238c615bff not found: ID does not exist" Mar 09 09:21:41 crc kubenswrapper[4792]: I0309 09:21:41.551709 4792 scope.go:117] "RemoveContainer" containerID="fba5d746920aecf244c35d7a65104514b0bde8667d1b8ae66e2366866dffc130" Mar 09 09:21:41 crc kubenswrapper[4792]: E0309 09:21:41.552059 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fba5d746920aecf244c35d7a65104514b0bde8667d1b8ae66e2366866dffc130\": container with ID starting with fba5d746920aecf244c35d7a65104514b0bde8667d1b8ae66e2366866dffc130 not found: ID does not exist" containerID="fba5d746920aecf244c35d7a65104514b0bde8667d1b8ae66e2366866dffc130" Mar 09 09:21:41 crc kubenswrapper[4792]: I0309 09:21:41.552117 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fba5d746920aecf244c35d7a65104514b0bde8667d1b8ae66e2366866dffc130"} err="failed to get container status \"fba5d746920aecf244c35d7a65104514b0bde8667d1b8ae66e2366866dffc130\": rpc error: code = NotFound desc = could not find container \"fba5d746920aecf244c35d7a65104514b0bde8667d1b8ae66e2366866dffc130\": container with ID starting with fba5d746920aecf244c35d7a65104514b0bde8667d1b8ae66e2366866dffc130 not found: ID does not exist" Mar 09 09:21:41 crc kubenswrapper[4792]: I0309 09:21:41.552146 4792 scope.go:117] "RemoveContainer" containerID="2e942a81dbad387994bffbdb98a57a643a343fd6e182048d3e09dfd5e4cba7db" Mar 09 09:21:41 crc kubenswrapper[4792]: E0309 
09:21:41.552424 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2e942a81dbad387994bffbdb98a57a643a343fd6e182048d3e09dfd5e4cba7db\": container with ID starting with 2e942a81dbad387994bffbdb98a57a643a343fd6e182048d3e09dfd5e4cba7db not found: ID does not exist" containerID="2e942a81dbad387994bffbdb98a57a643a343fd6e182048d3e09dfd5e4cba7db" Mar 09 09:21:41 crc kubenswrapper[4792]: I0309 09:21:41.552458 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e942a81dbad387994bffbdb98a57a643a343fd6e182048d3e09dfd5e4cba7db"} err="failed to get container status \"2e942a81dbad387994bffbdb98a57a643a343fd6e182048d3e09dfd5e4cba7db\": rpc error: code = NotFound desc = could not find container \"2e942a81dbad387994bffbdb98a57a643a343fd6e182048d3e09dfd5e4cba7db\": container with ID starting with 2e942a81dbad387994bffbdb98a57a643a343fd6e182048d3e09dfd5e4cba7db not found: ID does not exist" Mar 09 09:21:41 crc kubenswrapper[4792]: I0309 09:21:41.674156 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0123630e-dc7f-49f0-9a9f-13146cec7a10" path="/var/lib/kubelet/pods/0123630e-dc7f-49f0-9a9f-13146cec7a10/volumes" Mar 09 09:21:44 crc kubenswrapper[4792]: I0309 09:21:44.192498 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-jn5tt" Mar 09 09:21:44 crc kubenswrapper[4792]: I0309 09:21:44.192785 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-jn5tt" Mar 09 09:21:44 crc kubenswrapper[4792]: I0309 09:21:44.552713 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-5d5f56c665-gxjds"] Mar 09 09:21:44 crc kubenswrapper[4792]: E0309 09:21:44.552974 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0123630e-dc7f-49f0-9a9f-13146cec7a10" 
containerName="extract-utilities" Mar 09 09:21:44 crc kubenswrapper[4792]: I0309 09:21:44.552998 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="0123630e-dc7f-49f0-9a9f-13146cec7a10" containerName="extract-utilities" Mar 09 09:21:44 crc kubenswrapper[4792]: E0309 09:21:44.553018 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3419b911-375b-44c5-8be3-074ce9531ac5" containerName="util" Mar 09 09:21:44 crc kubenswrapper[4792]: I0309 09:21:44.553027 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="3419b911-375b-44c5-8be3-074ce9531ac5" containerName="util" Mar 09 09:21:44 crc kubenswrapper[4792]: E0309 09:21:44.553042 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0123630e-dc7f-49f0-9a9f-13146cec7a10" containerName="extract-content" Mar 09 09:21:44 crc kubenswrapper[4792]: I0309 09:21:44.553051 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="0123630e-dc7f-49f0-9a9f-13146cec7a10" containerName="extract-content" Mar 09 09:21:44 crc kubenswrapper[4792]: E0309 09:21:44.553064 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3419b911-375b-44c5-8be3-074ce9531ac5" containerName="extract" Mar 09 09:21:44 crc kubenswrapper[4792]: I0309 09:21:44.553088 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="3419b911-375b-44c5-8be3-074ce9531ac5" containerName="extract" Mar 09 09:21:44 crc kubenswrapper[4792]: E0309 09:21:44.553104 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3419b911-375b-44c5-8be3-074ce9531ac5" containerName="pull" Mar 09 09:21:44 crc kubenswrapper[4792]: I0309 09:21:44.553111 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="3419b911-375b-44c5-8be3-074ce9531ac5" containerName="pull" Mar 09 09:21:44 crc kubenswrapper[4792]: E0309 09:21:44.553122 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0123630e-dc7f-49f0-9a9f-13146cec7a10" containerName="registry-server" Mar 09 09:21:44 crc 
kubenswrapper[4792]: I0309 09:21:44.553131 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="0123630e-dc7f-49f0-9a9f-13146cec7a10" containerName="registry-server" Mar 09 09:21:44 crc kubenswrapper[4792]: I0309 09:21:44.553259 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="0123630e-dc7f-49f0-9a9f-13146cec7a10" containerName="registry-server" Mar 09 09:21:44 crc kubenswrapper[4792]: I0309 09:21:44.553274 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="3419b911-375b-44c5-8be3-074ce9531ac5" containerName="extract" Mar 09 09:21:44 crc kubenswrapper[4792]: I0309 09:21:44.553741 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-5d5f56c665-gxjds" Mar 09 09:21:44 crc kubenswrapper[4792]: I0309 09:21:44.556441 4792 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Mar 09 09:21:44 crc kubenswrapper[4792]: I0309 09:21:44.556529 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Mar 09 09:21:44 crc kubenswrapper[4792]: I0309 09:21:44.556773 4792 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Mar 09 09:21:44 crc kubenswrapper[4792]: I0309 09:21:44.556987 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Mar 09 09:21:44 crc kubenswrapper[4792]: I0309 09:21:44.557149 4792 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-8rq5b" Mar 09 09:21:44 crc kubenswrapper[4792]: I0309 09:21:44.577014 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-5d5f56c665-gxjds"] Mar 09 09:21:44 crc kubenswrapper[4792]: I0309 09:21:44.709323 4792 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/5cb925f9-fcd8-47a5-8959-76bfdbbc2979-webhook-cert\") pod \"metallb-operator-controller-manager-5d5f56c665-gxjds\" (UID: \"5cb925f9-fcd8-47a5-8959-76bfdbbc2979\") " pod="metallb-system/metallb-operator-controller-manager-5d5f56c665-gxjds" Mar 09 09:21:44 crc kubenswrapper[4792]: I0309 09:21:44.709672 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w2pxz\" (UniqueName: \"kubernetes.io/projected/5cb925f9-fcd8-47a5-8959-76bfdbbc2979-kube-api-access-w2pxz\") pod \"metallb-operator-controller-manager-5d5f56c665-gxjds\" (UID: \"5cb925f9-fcd8-47a5-8959-76bfdbbc2979\") " pod="metallb-system/metallb-operator-controller-manager-5d5f56c665-gxjds" Mar 09 09:21:44 crc kubenswrapper[4792]: I0309 09:21:44.709696 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/5cb925f9-fcd8-47a5-8959-76bfdbbc2979-apiservice-cert\") pod \"metallb-operator-controller-manager-5d5f56c665-gxjds\" (UID: \"5cb925f9-fcd8-47a5-8959-76bfdbbc2979\") " pod="metallb-system/metallb-operator-controller-manager-5d5f56c665-gxjds" Mar 09 09:21:44 crc kubenswrapper[4792]: I0309 09:21:44.810537 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w2pxz\" (UniqueName: \"kubernetes.io/projected/5cb925f9-fcd8-47a5-8959-76bfdbbc2979-kube-api-access-w2pxz\") pod \"metallb-operator-controller-manager-5d5f56c665-gxjds\" (UID: \"5cb925f9-fcd8-47a5-8959-76bfdbbc2979\") " pod="metallb-system/metallb-operator-controller-manager-5d5f56c665-gxjds" Mar 09 09:21:44 crc kubenswrapper[4792]: I0309 09:21:44.810597 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: 
\"kubernetes.io/secret/5cb925f9-fcd8-47a5-8959-76bfdbbc2979-apiservice-cert\") pod \"metallb-operator-controller-manager-5d5f56c665-gxjds\" (UID: \"5cb925f9-fcd8-47a5-8959-76bfdbbc2979\") " pod="metallb-system/metallb-operator-controller-manager-5d5f56c665-gxjds" Mar 09 09:21:44 crc kubenswrapper[4792]: I0309 09:21:44.810745 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/5cb925f9-fcd8-47a5-8959-76bfdbbc2979-webhook-cert\") pod \"metallb-operator-controller-manager-5d5f56c665-gxjds\" (UID: \"5cb925f9-fcd8-47a5-8959-76bfdbbc2979\") " pod="metallb-system/metallb-operator-controller-manager-5d5f56c665-gxjds" Mar 09 09:21:44 crc kubenswrapper[4792]: I0309 09:21:44.817525 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/5cb925f9-fcd8-47a5-8959-76bfdbbc2979-apiservice-cert\") pod \"metallb-operator-controller-manager-5d5f56c665-gxjds\" (UID: \"5cb925f9-fcd8-47a5-8959-76bfdbbc2979\") " pod="metallb-system/metallb-operator-controller-manager-5d5f56c665-gxjds" Mar 09 09:21:44 crc kubenswrapper[4792]: I0309 09:21:44.821234 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-6d4cf89d46-x6c57"] Mar 09 09:21:44 crc kubenswrapper[4792]: I0309 09:21:44.822089 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-6d4cf89d46-x6c57" Mar 09 09:21:44 crc kubenswrapper[4792]: I0309 09:21:44.825613 4792 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-lg4n7" Mar 09 09:21:44 crc kubenswrapper[4792]: I0309 09:21:44.825820 4792 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Mar 09 09:21:44 crc kubenswrapper[4792]: I0309 09:21:44.825923 4792 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Mar 09 09:21:44 crc kubenswrapper[4792]: I0309 09:21:44.837482 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/5cb925f9-fcd8-47a5-8959-76bfdbbc2979-webhook-cert\") pod \"metallb-operator-controller-manager-5d5f56c665-gxjds\" (UID: \"5cb925f9-fcd8-47a5-8959-76bfdbbc2979\") " pod="metallb-system/metallb-operator-controller-manager-5d5f56c665-gxjds" Mar 09 09:21:44 crc kubenswrapper[4792]: I0309 09:21:44.850009 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w2pxz\" (UniqueName: \"kubernetes.io/projected/5cb925f9-fcd8-47a5-8959-76bfdbbc2979-kube-api-access-w2pxz\") pod \"metallb-operator-controller-manager-5d5f56c665-gxjds\" (UID: \"5cb925f9-fcd8-47a5-8959-76bfdbbc2979\") " pod="metallb-system/metallb-operator-controller-manager-5d5f56c665-gxjds" Mar 09 09:21:44 crc kubenswrapper[4792]: I0309 09:21:44.872173 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-5d5f56c665-gxjds" Mar 09 09:21:44 crc kubenswrapper[4792]: I0309 09:21:44.911963 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-69j4k\" (UniqueName: \"kubernetes.io/projected/491ea032-e688-454c-a67d-09966007bb7f-kube-api-access-69j4k\") pod \"metallb-operator-webhook-server-6d4cf89d46-x6c57\" (UID: \"491ea032-e688-454c-a67d-09966007bb7f\") " pod="metallb-system/metallb-operator-webhook-server-6d4cf89d46-x6c57" Mar 09 09:21:44 crc kubenswrapper[4792]: I0309 09:21:44.912099 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/491ea032-e688-454c-a67d-09966007bb7f-webhook-cert\") pod \"metallb-operator-webhook-server-6d4cf89d46-x6c57\" (UID: \"491ea032-e688-454c-a67d-09966007bb7f\") " pod="metallb-system/metallb-operator-webhook-server-6d4cf89d46-x6c57" Mar 09 09:21:44 crc kubenswrapper[4792]: I0309 09:21:44.912168 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/491ea032-e688-454c-a67d-09966007bb7f-apiservice-cert\") pod \"metallb-operator-webhook-server-6d4cf89d46-x6c57\" (UID: \"491ea032-e688-454c-a67d-09966007bb7f\") " pod="metallb-system/metallb-operator-webhook-server-6d4cf89d46-x6c57" Mar 09 09:21:44 crc kubenswrapper[4792]: I0309 09:21:44.963081 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-6d4cf89d46-x6c57"] Mar 09 09:21:45 crc kubenswrapper[4792]: I0309 09:21:45.014666 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-69j4k\" (UniqueName: \"kubernetes.io/projected/491ea032-e688-454c-a67d-09966007bb7f-kube-api-access-69j4k\") pod \"metallb-operator-webhook-server-6d4cf89d46-x6c57\" (UID: 
\"491ea032-e688-454c-a67d-09966007bb7f\") " pod="metallb-system/metallb-operator-webhook-server-6d4cf89d46-x6c57" Mar 09 09:21:45 crc kubenswrapper[4792]: I0309 09:21:45.014727 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/491ea032-e688-454c-a67d-09966007bb7f-webhook-cert\") pod \"metallb-operator-webhook-server-6d4cf89d46-x6c57\" (UID: \"491ea032-e688-454c-a67d-09966007bb7f\") " pod="metallb-system/metallb-operator-webhook-server-6d4cf89d46-x6c57" Mar 09 09:21:45 crc kubenswrapper[4792]: I0309 09:21:45.014751 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/491ea032-e688-454c-a67d-09966007bb7f-apiservice-cert\") pod \"metallb-operator-webhook-server-6d4cf89d46-x6c57\" (UID: \"491ea032-e688-454c-a67d-09966007bb7f\") " pod="metallb-system/metallb-operator-webhook-server-6d4cf89d46-x6c57" Mar 09 09:21:45 crc kubenswrapper[4792]: I0309 09:21:45.020592 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/491ea032-e688-454c-a67d-09966007bb7f-apiservice-cert\") pod \"metallb-operator-webhook-server-6d4cf89d46-x6c57\" (UID: \"491ea032-e688-454c-a67d-09966007bb7f\") " pod="metallb-system/metallb-operator-webhook-server-6d4cf89d46-x6c57" Mar 09 09:21:45 crc kubenswrapper[4792]: I0309 09:21:45.021715 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/491ea032-e688-454c-a67d-09966007bb7f-webhook-cert\") pod \"metallb-operator-webhook-server-6d4cf89d46-x6c57\" (UID: \"491ea032-e688-454c-a67d-09966007bb7f\") " pod="metallb-system/metallb-operator-webhook-server-6d4cf89d46-x6c57" Mar 09 09:21:45 crc kubenswrapper[4792]: I0309 09:21:45.046430 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-69j4k\" (UniqueName: 
\"kubernetes.io/projected/491ea032-e688-454c-a67d-09966007bb7f-kube-api-access-69j4k\") pod \"metallb-operator-webhook-server-6d4cf89d46-x6c57\" (UID: \"491ea032-e688-454c-a67d-09966007bb7f\") " pod="metallb-system/metallb-operator-webhook-server-6d4cf89d46-x6c57" Mar 09 09:21:45 crc kubenswrapper[4792]: I0309 09:21:45.234605 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-6d4cf89d46-x6c57" Mar 09 09:21:45 crc kubenswrapper[4792]: I0309 09:21:45.261951 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-jn5tt" podUID="996a5916-f3e2-4592-a243-bb84050a16c7" containerName="registry-server" probeResult="failure" output=< Mar 09 09:21:45 crc kubenswrapper[4792]: timeout: failed to connect service ":50051" within 1s Mar 09 09:21:45 crc kubenswrapper[4792]: > Mar 09 09:21:45 crc kubenswrapper[4792]: I0309 09:21:45.352341 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-5d5f56c665-gxjds"] Mar 09 09:21:45 crc kubenswrapper[4792]: W0309 09:21:45.368579 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5cb925f9_fcd8_47a5_8959_76bfdbbc2979.slice/crio-a82e5416065d06999034b9a5769a7719ec1b4978a345d6793898c6e60040e573 WatchSource:0}: Error finding container a82e5416065d06999034b9a5769a7719ec1b4978a345d6793898c6e60040e573: Status 404 returned error can't find the container with id a82e5416065d06999034b9a5769a7719ec1b4978a345d6793898c6e60040e573 Mar 09 09:21:45 crc kubenswrapper[4792]: I0309 09:21:45.509567 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-5d5f56c665-gxjds" event={"ID":"5cb925f9-fcd8-47a5-8959-76bfdbbc2979","Type":"ContainerStarted","Data":"a82e5416065d06999034b9a5769a7719ec1b4978a345d6793898c6e60040e573"} Mar 09 09:21:45 crc 
kubenswrapper[4792]: W0309 09:21:45.749311 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod491ea032_e688_454c_a67d_09966007bb7f.slice/crio-7e68e0775e2c719fc9d117e39c57cc704dedf476e1bee48d7dc8657e8ecea25d WatchSource:0}: Error finding container 7e68e0775e2c719fc9d117e39c57cc704dedf476e1bee48d7dc8657e8ecea25d: Status 404 returned error can't find the container with id 7e68e0775e2c719fc9d117e39c57cc704dedf476e1bee48d7dc8657e8ecea25d Mar 09 09:21:45 crc kubenswrapper[4792]: I0309 09:21:45.751820 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-6d4cf89d46-x6c57"] Mar 09 09:21:46 crc kubenswrapper[4792]: I0309 09:21:46.515032 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-6d4cf89d46-x6c57" event={"ID":"491ea032-e688-454c-a67d-09966007bb7f","Type":"ContainerStarted","Data":"7e68e0775e2c719fc9d117e39c57cc704dedf476e1bee48d7dc8657e8ecea25d"} Mar 09 09:21:49 crc kubenswrapper[4792]: I0309 09:21:49.535166 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-5d5f56c665-gxjds" event={"ID":"5cb925f9-fcd8-47a5-8959-76bfdbbc2979","Type":"ContainerStarted","Data":"45a96e2f0597065452a28f31b9e319b3a6c1e12241f9a3f634d5f6d155e15326"} Mar 09 09:21:49 crc kubenswrapper[4792]: I0309 09:21:49.535615 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-5d5f56c665-gxjds" Mar 09 09:21:49 crc kubenswrapper[4792]: I0309 09:21:49.565253 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-5d5f56c665-gxjds" podStartSLOduration=1.816372334 podStartE2EDuration="5.565228394s" podCreationTimestamp="2026-03-09 09:21:44 +0000 UTC" firstStartedPulling="2026-03-09 09:21:45.371033068 +0000 UTC 
m=+870.401233820" lastFinishedPulling="2026-03-09 09:21:49.119889128 +0000 UTC m=+874.150089880" observedRunningTime="2026-03-09 09:21:49.560410003 +0000 UTC m=+874.590610775" watchObservedRunningTime="2026-03-09 09:21:49.565228394 +0000 UTC m=+874.595429156" Mar 09 09:21:51 crc kubenswrapper[4792]: I0309 09:21:51.547327 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-6d4cf89d46-x6c57" event={"ID":"491ea032-e688-454c-a67d-09966007bb7f","Type":"ContainerStarted","Data":"c741e7a6552fe8e93c2c9880ce5888e0db6f8811a1e1f0f55542abd0f69c93fb"} Mar 09 09:21:51 crc kubenswrapper[4792]: I0309 09:21:51.547649 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-6d4cf89d46-x6c57" Mar 09 09:21:51 crc kubenswrapper[4792]: I0309 09:21:51.567328 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-6d4cf89d46-x6c57" podStartSLOduration=2.455649092 podStartE2EDuration="7.567310729s" podCreationTimestamp="2026-03-09 09:21:44 +0000 UTC" firstStartedPulling="2026-03-09 09:21:45.754677905 +0000 UTC m=+870.784878657" lastFinishedPulling="2026-03-09 09:21:50.866339542 +0000 UTC m=+875.896540294" observedRunningTime="2026-03-09 09:21:51.566141674 +0000 UTC m=+876.596342426" watchObservedRunningTime="2026-03-09 09:21:51.567310729 +0000 UTC m=+876.597511481" Mar 09 09:21:54 crc kubenswrapper[4792]: I0309 09:21:54.231818 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-jn5tt" Mar 09 09:21:54 crc kubenswrapper[4792]: I0309 09:21:54.280028 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-jn5tt" Mar 09 09:21:54 crc kubenswrapper[4792]: I0309 09:21:54.465158 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-jn5tt"] Mar 
09 09:21:55 crc kubenswrapper[4792]: I0309 09:21:55.571364 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-jn5tt" podUID="996a5916-f3e2-4592-a243-bb84050a16c7" containerName="registry-server" containerID="cri-o://1229d80ff5830468b173ad15f1146eb795ac9980ba6a65344043785252ab3b42" gracePeriod=2 Mar 09 09:21:56 crc kubenswrapper[4792]: I0309 09:21:56.070831 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jn5tt" Mar 09 09:21:56 crc kubenswrapper[4792]: I0309 09:21:56.174162 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/996a5916-f3e2-4592-a243-bb84050a16c7-catalog-content\") pod \"996a5916-f3e2-4592-a243-bb84050a16c7\" (UID: \"996a5916-f3e2-4592-a243-bb84050a16c7\") " Mar 09 09:21:56 crc kubenswrapper[4792]: I0309 09:21:56.174261 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/996a5916-f3e2-4592-a243-bb84050a16c7-utilities\") pod \"996a5916-f3e2-4592-a243-bb84050a16c7\" (UID: \"996a5916-f3e2-4592-a243-bb84050a16c7\") " Mar 09 09:21:56 crc kubenswrapper[4792]: I0309 09:21:56.174294 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q468j\" (UniqueName: \"kubernetes.io/projected/996a5916-f3e2-4592-a243-bb84050a16c7-kube-api-access-q468j\") pod \"996a5916-f3e2-4592-a243-bb84050a16c7\" (UID: \"996a5916-f3e2-4592-a243-bb84050a16c7\") " Mar 09 09:21:56 crc kubenswrapper[4792]: I0309 09:21:56.175331 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/996a5916-f3e2-4592-a243-bb84050a16c7-utilities" (OuterVolumeSpecName: "utilities") pod "996a5916-f3e2-4592-a243-bb84050a16c7" (UID: "996a5916-f3e2-4592-a243-bb84050a16c7"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:21:56 crc kubenswrapper[4792]: I0309 09:21:56.190350 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/996a5916-f3e2-4592-a243-bb84050a16c7-kube-api-access-q468j" (OuterVolumeSpecName: "kube-api-access-q468j") pod "996a5916-f3e2-4592-a243-bb84050a16c7" (UID: "996a5916-f3e2-4592-a243-bb84050a16c7"). InnerVolumeSpecName "kube-api-access-q468j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:21:56 crc kubenswrapper[4792]: I0309 09:21:56.278234 4792 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/996a5916-f3e2-4592-a243-bb84050a16c7-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 09:21:56 crc kubenswrapper[4792]: I0309 09:21:56.278289 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q468j\" (UniqueName: \"kubernetes.io/projected/996a5916-f3e2-4592-a243-bb84050a16c7-kube-api-access-q468j\") on node \"crc\" DevicePath \"\"" Mar 09 09:21:56 crc kubenswrapper[4792]: I0309 09:21:56.305926 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/996a5916-f3e2-4592-a243-bb84050a16c7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "996a5916-f3e2-4592-a243-bb84050a16c7" (UID: "996a5916-f3e2-4592-a243-bb84050a16c7"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:21:56 crc kubenswrapper[4792]: I0309 09:21:56.380089 4792 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/996a5916-f3e2-4592-a243-bb84050a16c7-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 09:21:56 crc kubenswrapper[4792]: I0309 09:21:56.578605 4792 generic.go:334] "Generic (PLEG): container finished" podID="996a5916-f3e2-4592-a243-bb84050a16c7" containerID="1229d80ff5830468b173ad15f1146eb795ac9980ba6a65344043785252ab3b42" exitCode=0 Mar 09 09:21:56 crc kubenswrapper[4792]: I0309 09:21:56.578647 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jn5tt" event={"ID":"996a5916-f3e2-4592-a243-bb84050a16c7","Type":"ContainerDied","Data":"1229d80ff5830468b173ad15f1146eb795ac9980ba6a65344043785252ab3b42"} Mar 09 09:21:56 crc kubenswrapper[4792]: I0309 09:21:56.578652 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-jn5tt" Mar 09 09:21:56 crc kubenswrapper[4792]: I0309 09:21:56.578678 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jn5tt" event={"ID":"996a5916-f3e2-4592-a243-bb84050a16c7","Type":"ContainerDied","Data":"b357ef4af764c44a9860c60c0b8fc385ce24389957d1465f0cbbe24f4f5667bb"} Mar 09 09:21:56 crc kubenswrapper[4792]: I0309 09:21:56.578696 4792 scope.go:117] "RemoveContainer" containerID="1229d80ff5830468b173ad15f1146eb795ac9980ba6a65344043785252ab3b42" Mar 09 09:21:56 crc kubenswrapper[4792]: I0309 09:21:56.608461 4792 scope.go:117] "RemoveContainer" containerID="1c15ac0a645a8709a0f5975a14ed87404c884ba24142610498088a650b4eac01" Mar 09 09:21:56 crc kubenswrapper[4792]: I0309 09:21:56.621664 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-jn5tt"] Mar 09 09:21:56 crc kubenswrapper[4792]: I0309 09:21:56.656009 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-jn5tt"] Mar 09 09:21:56 crc kubenswrapper[4792]: I0309 09:21:56.660811 4792 scope.go:117] "RemoveContainer" containerID="0fb8bec21976a905be87b2e72682285d83af4c3cd1d8843c325b30290ea8c142" Mar 09 09:21:56 crc kubenswrapper[4792]: I0309 09:21:56.680319 4792 scope.go:117] "RemoveContainer" containerID="1229d80ff5830468b173ad15f1146eb795ac9980ba6a65344043785252ab3b42" Mar 09 09:21:56 crc kubenswrapper[4792]: E0309 09:21:56.680844 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1229d80ff5830468b173ad15f1146eb795ac9980ba6a65344043785252ab3b42\": container with ID starting with 1229d80ff5830468b173ad15f1146eb795ac9980ba6a65344043785252ab3b42 not found: ID does not exist" containerID="1229d80ff5830468b173ad15f1146eb795ac9980ba6a65344043785252ab3b42" Mar 09 09:21:56 crc kubenswrapper[4792]: I0309 09:21:56.680943 4792 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1229d80ff5830468b173ad15f1146eb795ac9980ba6a65344043785252ab3b42"} err="failed to get container status \"1229d80ff5830468b173ad15f1146eb795ac9980ba6a65344043785252ab3b42\": rpc error: code = NotFound desc = could not find container \"1229d80ff5830468b173ad15f1146eb795ac9980ba6a65344043785252ab3b42\": container with ID starting with 1229d80ff5830468b173ad15f1146eb795ac9980ba6a65344043785252ab3b42 not found: ID does not exist" Mar 09 09:21:56 crc kubenswrapper[4792]: I0309 09:21:56.681017 4792 scope.go:117] "RemoveContainer" containerID="1c15ac0a645a8709a0f5975a14ed87404c884ba24142610498088a650b4eac01" Mar 09 09:21:56 crc kubenswrapper[4792]: E0309 09:21:56.681552 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1c15ac0a645a8709a0f5975a14ed87404c884ba24142610498088a650b4eac01\": container with ID starting with 1c15ac0a645a8709a0f5975a14ed87404c884ba24142610498088a650b4eac01 not found: ID does not exist" containerID="1c15ac0a645a8709a0f5975a14ed87404c884ba24142610498088a650b4eac01" Mar 09 09:21:56 crc kubenswrapper[4792]: I0309 09:21:56.681662 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c15ac0a645a8709a0f5975a14ed87404c884ba24142610498088a650b4eac01"} err="failed to get container status \"1c15ac0a645a8709a0f5975a14ed87404c884ba24142610498088a650b4eac01\": rpc error: code = NotFound desc = could not find container \"1c15ac0a645a8709a0f5975a14ed87404c884ba24142610498088a650b4eac01\": container with ID starting with 1c15ac0a645a8709a0f5975a14ed87404c884ba24142610498088a650b4eac01 not found: ID does not exist" Mar 09 09:21:56 crc kubenswrapper[4792]: I0309 09:21:56.681737 4792 scope.go:117] "RemoveContainer" containerID="0fb8bec21976a905be87b2e72682285d83af4c3cd1d8843c325b30290ea8c142" Mar 09 09:21:56 crc kubenswrapper[4792]: E0309 
09:21:56.682034 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0fb8bec21976a905be87b2e72682285d83af4c3cd1d8843c325b30290ea8c142\": container with ID starting with 0fb8bec21976a905be87b2e72682285d83af4c3cd1d8843c325b30290ea8c142 not found: ID does not exist" containerID="0fb8bec21976a905be87b2e72682285d83af4c3cd1d8843c325b30290ea8c142" Mar 09 09:21:56 crc kubenswrapper[4792]: I0309 09:21:56.682061 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0fb8bec21976a905be87b2e72682285d83af4c3cd1d8843c325b30290ea8c142"} err="failed to get container status \"0fb8bec21976a905be87b2e72682285d83af4c3cd1d8843c325b30290ea8c142\": rpc error: code = NotFound desc = could not find container \"0fb8bec21976a905be87b2e72682285d83af4c3cd1d8843c325b30290ea8c142\": container with ID starting with 0fb8bec21976a905be87b2e72682285d83af4c3cd1d8843c325b30290ea8c142 not found: ID does not exist" Mar 09 09:21:57 crc kubenswrapper[4792]: I0309 09:21:57.670586 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="996a5916-f3e2-4592-a243-bb84050a16c7" path="/var/lib/kubelet/pods/996a5916-f3e2-4592-a243-bb84050a16c7/volumes" Mar 09 09:22:00 crc kubenswrapper[4792]: I0309 09:22:00.134069 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550802-wths8"] Mar 09 09:22:00 crc kubenswrapper[4792]: E0309 09:22:00.134333 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="996a5916-f3e2-4592-a243-bb84050a16c7" containerName="registry-server" Mar 09 09:22:00 crc kubenswrapper[4792]: I0309 09:22:00.134346 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="996a5916-f3e2-4592-a243-bb84050a16c7" containerName="registry-server" Mar 09 09:22:00 crc kubenswrapper[4792]: E0309 09:22:00.134360 4792 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="996a5916-f3e2-4592-a243-bb84050a16c7" containerName="extract-content" Mar 09 09:22:00 crc kubenswrapper[4792]: I0309 09:22:00.134366 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="996a5916-f3e2-4592-a243-bb84050a16c7" containerName="extract-content" Mar 09 09:22:00 crc kubenswrapper[4792]: E0309 09:22:00.134375 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="996a5916-f3e2-4592-a243-bb84050a16c7" containerName="extract-utilities" Mar 09 09:22:00 crc kubenswrapper[4792]: I0309 09:22:00.134381 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="996a5916-f3e2-4592-a243-bb84050a16c7" containerName="extract-utilities" Mar 09 09:22:00 crc kubenswrapper[4792]: I0309 09:22:00.134480 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="996a5916-f3e2-4592-a243-bb84050a16c7" containerName="registry-server" Mar 09 09:22:00 crc kubenswrapper[4792]: I0309 09:22:00.134859 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550802-wths8" Mar 09 09:22:00 crc kubenswrapper[4792]: I0309 09:22:00.137809 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 09:22:00 crc kubenswrapper[4792]: I0309 09:22:00.138210 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fwclj" Mar 09 09:22:00 crc kubenswrapper[4792]: I0309 09:22:00.138437 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 09:22:00 crc kubenswrapper[4792]: I0309 09:22:00.149608 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550802-wths8"] Mar 09 09:22:00 crc kubenswrapper[4792]: I0309 09:22:00.232905 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-96qtj\" (UniqueName: 
\"kubernetes.io/projected/b21ea9f4-b8cf-42c7-8df7-ba22b8e1df2b-kube-api-access-96qtj\") pod \"auto-csr-approver-29550802-wths8\" (UID: \"b21ea9f4-b8cf-42c7-8df7-ba22b8e1df2b\") " pod="openshift-infra/auto-csr-approver-29550802-wths8" Mar 09 09:22:00 crc kubenswrapper[4792]: I0309 09:22:00.334775 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-96qtj\" (UniqueName: \"kubernetes.io/projected/b21ea9f4-b8cf-42c7-8df7-ba22b8e1df2b-kube-api-access-96qtj\") pod \"auto-csr-approver-29550802-wths8\" (UID: \"b21ea9f4-b8cf-42c7-8df7-ba22b8e1df2b\") " pod="openshift-infra/auto-csr-approver-29550802-wths8" Mar 09 09:22:00 crc kubenswrapper[4792]: I0309 09:22:00.364390 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-96qtj\" (UniqueName: \"kubernetes.io/projected/b21ea9f4-b8cf-42c7-8df7-ba22b8e1df2b-kube-api-access-96qtj\") pod \"auto-csr-approver-29550802-wths8\" (UID: \"b21ea9f4-b8cf-42c7-8df7-ba22b8e1df2b\") " pod="openshift-infra/auto-csr-approver-29550802-wths8" Mar 09 09:22:00 crc kubenswrapper[4792]: I0309 09:22:00.458359 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550802-wths8"
Mar 09 09:22:00 crc kubenswrapper[4792]: I0309 09:22:00.792245 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550802-wths8"]
Mar 09 09:22:01 crc kubenswrapper[4792]: I0309 09:22:01.614055 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550802-wths8" event={"ID":"b21ea9f4-b8cf-42c7-8df7-ba22b8e1df2b","Type":"ContainerStarted","Data":"780f6f14697dd211f46de5cef0a261bbc3b80fff7d957eec6bafdd306ee9768c"}
Mar 09 09:22:02 crc kubenswrapper[4792]: I0309 09:22:02.621143 4792 generic.go:334] "Generic (PLEG): container finished" podID="b21ea9f4-b8cf-42c7-8df7-ba22b8e1df2b" containerID="398445458e5f6ed76028563a72dfed72bde3be9109ed7bca5f9787a23837f624" exitCode=0
Mar 09 09:22:02 crc kubenswrapper[4792]: I0309 09:22:02.621235 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550802-wths8" event={"ID":"b21ea9f4-b8cf-42c7-8df7-ba22b8e1df2b","Type":"ContainerDied","Data":"398445458e5f6ed76028563a72dfed72bde3be9109ed7bca5f9787a23837f624"}
Mar 09 09:22:03 crc kubenswrapper[4792]: I0309 09:22:03.902107 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550802-wths8"
Mar 09 09:22:04 crc kubenswrapper[4792]: I0309 09:22:04.085287 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-96qtj\" (UniqueName: \"kubernetes.io/projected/b21ea9f4-b8cf-42c7-8df7-ba22b8e1df2b-kube-api-access-96qtj\") pod \"b21ea9f4-b8cf-42c7-8df7-ba22b8e1df2b\" (UID: \"b21ea9f4-b8cf-42c7-8df7-ba22b8e1df2b\") "
Mar 09 09:22:04 crc kubenswrapper[4792]: I0309 09:22:04.091355 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b21ea9f4-b8cf-42c7-8df7-ba22b8e1df2b-kube-api-access-96qtj" (OuterVolumeSpecName: "kube-api-access-96qtj") pod "b21ea9f4-b8cf-42c7-8df7-ba22b8e1df2b" (UID: "b21ea9f4-b8cf-42c7-8df7-ba22b8e1df2b"). InnerVolumeSpecName "kube-api-access-96qtj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:22:04 crc kubenswrapper[4792]: I0309 09:22:04.187327 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-96qtj\" (UniqueName: \"kubernetes.io/projected/b21ea9f4-b8cf-42c7-8df7-ba22b8e1df2b-kube-api-access-96qtj\") on node \"crc\" DevicePath \"\""
Mar 09 09:22:04 crc kubenswrapper[4792]: I0309 09:22:04.635061 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550802-wths8" event={"ID":"b21ea9f4-b8cf-42c7-8df7-ba22b8e1df2b","Type":"ContainerDied","Data":"780f6f14697dd211f46de5cef0a261bbc3b80fff7d957eec6bafdd306ee9768c"}
Mar 09 09:22:04 crc kubenswrapper[4792]: I0309 09:22:04.635521 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="780f6f14697dd211f46de5cef0a261bbc3b80fff7d957eec6bafdd306ee9768c"
Mar 09 09:22:04 crc kubenswrapper[4792]: I0309 09:22:04.635463 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550802-wths8"
Mar 09 09:22:04 crc kubenswrapper[4792]: I0309 09:22:04.951698 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550796-kjk86"]
Mar 09 09:22:04 crc kubenswrapper[4792]: I0309 09:22:04.959247 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550796-kjk86"]
Mar 09 09:22:05 crc kubenswrapper[4792]: I0309 09:22:05.241762 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-6d4cf89d46-x6c57"
Mar 09 09:22:05 crc kubenswrapper[4792]: I0309 09:22:05.670735 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6c1b19ca-82b0-4ec9-9ef1-3cd8d0c97e92" path="/var/lib/kubelet/pods/6c1b19ca-82b0-4ec9-9ef1-3cd8d0c97e92/volumes"
Mar 09 09:22:24 crc kubenswrapper[4792]: I0309 09:22:24.874550 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-5d5f56c665-gxjds"
Mar 09 09:22:25 crc kubenswrapper[4792]: I0309 09:22:25.572156 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-rll29"]
Mar 09 09:22:25 crc kubenswrapper[4792]: E0309 09:22:25.572399 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b21ea9f4-b8cf-42c7-8df7-ba22b8e1df2b" containerName="oc"
Mar 09 09:22:25 crc kubenswrapper[4792]: I0309 09:22:25.572411 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="b21ea9f4-b8cf-42c7-8df7-ba22b8e1df2b" containerName="oc"
Mar 09 09:22:25 crc kubenswrapper[4792]: I0309 09:22:25.572518 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="b21ea9f4-b8cf-42c7-8df7-ba22b8e1df2b" containerName="oc"
Mar 09 09:22:25 crc kubenswrapper[4792]: I0309 09:22:25.574175 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-rll29"
Mar 09 09:22:25 crc kubenswrapper[4792]: I0309 09:22:25.578980 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup"
Mar 09 09:22:25 crc kubenswrapper[4792]: I0309 09:22:25.579015 4792 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret"
Mar 09 09:22:25 crc kubenswrapper[4792]: I0309 09:22:25.579942 4792 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-nnnhj"
Mar 09 09:22:25 crc kubenswrapper[4792]: I0309 09:22:25.629582 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-7f989f654f-gm7zm"]
Mar 09 09:22:25 crc kubenswrapper[4792]: I0309 09:22:25.641178 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-gm7zm"
Mar 09 09:22:25 crc kubenswrapper[4792]: I0309 09:22:25.642988 4792 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert"
Mar 09 09:22:25 crc kubenswrapper[4792]: I0309 09:22:25.653976 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7f989f654f-gm7zm"]
Mar 09 09:22:25 crc kubenswrapper[4792]: I0309 09:22:25.721366 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-9fj2f"]
Mar 09 09:22:25 crc kubenswrapper[4792]: I0309 09:22:25.722451 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-9fj2f"
Mar 09 09:22:25 crc kubenswrapper[4792]: I0309 09:22:25.725061 4792 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist"
Mar 09 09:22:25 crc kubenswrapper[4792]: I0309 09:22:25.725532 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2"
Mar 09 09:22:25 crc kubenswrapper[4792]: I0309 09:22:25.729716 4792 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret"
Mar 09 09:22:25 crc kubenswrapper[4792]: I0309 09:22:25.729901 4792 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-mqrg6"
Mar 09 09:22:25 crc kubenswrapper[4792]: I0309 09:22:25.741250 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-86ddb6bd46-2lfdv"]
Mar 09 09:22:25 crc kubenswrapper[4792]: I0309 09:22:25.742391 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-86ddb6bd46-2lfdv"
Mar 09 09:22:25 crc kubenswrapper[4792]: I0309 09:22:25.747625 4792 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret"
Mar 09 09:22:25 crc kubenswrapper[4792]: I0309 09:22:25.768267 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/6ed0a6a3-dfa2-49c2-bbb2-96a6f3cfc4f9-metallb-excludel2\") pod \"speaker-9fj2f\" (UID: \"6ed0a6a3-dfa2-49c2-bbb2-96a6f3cfc4f9\") " pod="metallb-system/speaker-9fj2f"
Mar 09 09:22:25 crc kubenswrapper[4792]: I0309 09:22:25.768637 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/10a41b58-f88e-4dad-960f-cd70b006c3e7-frr-startup\") pod \"frr-k8s-rll29\" (UID: \"10a41b58-f88e-4dad-960f-cd70b006c3e7\") " pod="metallb-system/frr-k8s-rll29"
Mar 09 09:22:25 crc kubenswrapper[4792]: I0309 09:22:25.768738 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/98a5f3b6-5d33-4542-9382-ea1d94e5f59f-cert\") pod \"controller-86ddb6bd46-2lfdv\" (UID: \"98a5f3b6-5d33-4542-9382-ea1d94e5f59f\") " pod="metallb-system/controller-86ddb6bd46-2lfdv"
Mar 09 09:22:25 crc kubenswrapper[4792]: I0309 09:22:25.768822 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/10a41b58-f88e-4dad-960f-cd70b006c3e7-metrics\") pod \"frr-k8s-rll29\" (UID: \"10a41b58-f88e-4dad-960f-cd70b006c3e7\") " pod="metallb-system/frr-k8s-rll29"
Mar 09 09:22:25 crc kubenswrapper[4792]: I0309 09:22:25.768904 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/10a41b58-f88e-4dad-960f-cd70b006c3e7-metrics-certs\") pod \"frr-k8s-rll29\" (UID: \"10a41b58-f88e-4dad-960f-cd70b006c3e7\") " pod="metallb-system/frr-k8s-rll29"
Mar 09 09:22:25 crc kubenswrapper[4792]: I0309 09:22:25.769004 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6ed0a6a3-dfa2-49c2-bbb2-96a6f3cfc4f9-metrics-certs\") pod \"speaker-9fj2f\" (UID: \"6ed0a6a3-dfa2-49c2-bbb2-96a6f3cfc4f9\") " pod="metallb-system/speaker-9fj2f"
Mar 09 09:22:25 crc kubenswrapper[4792]: I0309 09:22:25.769106 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/98a5f3b6-5d33-4542-9382-ea1d94e5f59f-metrics-certs\") pod \"controller-86ddb6bd46-2lfdv\" (UID: \"98a5f3b6-5d33-4542-9382-ea1d94e5f59f\") " pod="metallb-system/controller-86ddb6bd46-2lfdv"
Mar 09 09:22:25 crc kubenswrapper[4792]: I0309 09:22:25.769193 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/10a41b58-f88e-4dad-960f-cd70b006c3e7-reloader\") pod \"frr-k8s-rll29\" (UID: \"10a41b58-f88e-4dad-960f-cd70b006c3e7\") " pod="metallb-system/frr-k8s-rll29"
Mar 09 09:22:25 crc kubenswrapper[4792]: I0309 09:22:25.769286 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/10a41b58-f88e-4dad-960f-cd70b006c3e7-frr-conf\") pod \"frr-k8s-rll29\" (UID: \"10a41b58-f88e-4dad-960f-cd70b006c3e7\") " pod="metallb-system/frr-k8s-rll29"
Mar 09 09:22:25 crc kubenswrapper[4792]: I0309 09:22:25.769379 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7ac24411-eccd-496a-8a49-d9b552a92691-cert\") pod \"frr-k8s-webhook-server-7f989f654f-gm7zm\" (UID: \"7ac24411-eccd-496a-8a49-d9b552a92691\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-gm7zm"
Mar 09 09:22:25 crc kubenswrapper[4792]: I0309 09:22:25.769444 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z67th\" (UniqueName: \"kubernetes.io/projected/6ed0a6a3-dfa2-49c2-bbb2-96a6f3cfc4f9-kube-api-access-z67th\") pod \"speaker-9fj2f\" (UID: \"6ed0a6a3-dfa2-49c2-bbb2-96a6f3cfc4f9\") " pod="metallb-system/speaker-9fj2f"
Mar 09 09:22:25 crc kubenswrapper[4792]: I0309 09:22:25.769515 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z8dj5\" (UniqueName: \"kubernetes.io/projected/7ac24411-eccd-496a-8a49-d9b552a92691-kube-api-access-z8dj5\") pod \"frr-k8s-webhook-server-7f989f654f-gm7zm\" (UID: \"7ac24411-eccd-496a-8a49-d9b552a92691\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-gm7zm"
Mar 09 09:22:25 crc kubenswrapper[4792]: I0309 09:22:25.769586 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/6ed0a6a3-dfa2-49c2-bbb2-96a6f3cfc4f9-memberlist\") pod \"speaker-9fj2f\" (UID: \"6ed0a6a3-dfa2-49c2-bbb2-96a6f3cfc4f9\") " pod="metallb-system/speaker-9fj2f"
Mar 09 09:22:25 crc kubenswrapper[4792]: I0309 09:22:25.769665 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/10a41b58-f88e-4dad-960f-cd70b006c3e7-frr-sockets\") pod \"frr-k8s-rll29\" (UID: \"10a41b58-f88e-4dad-960f-cd70b006c3e7\") " pod="metallb-system/frr-k8s-rll29"
Mar 09 09:22:25 crc kubenswrapper[4792]: I0309 09:22:25.769744 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b5n9g\" (UniqueName: \"kubernetes.io/projected/98a5f3b6-5d33-4542-9382-ea1d94e5f59f-kube-api-access-b5n9g\") pod \"controller-86ddb6bd46-2lfdv\" (UID: \"98a5f3b6-5d33-4542-9382-ea1d94e5f59f\") " pod="metallb-system/controller-86ddb6bd46-2lfdv"
Mar 09 09:22:25 crc kubenswrapper[4792]: I0309 09:22:25.769813 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qjdfg\" (UniqueName: \"kubernetes.io/projected/10a41b58-f88e-4dad-960f-cd70b006c3e7-kube-api-access-qjdfg\") pod \"frr-k8s-rll29\" (UID: \"10a41b58-f88e-4dad-960f-cd70b006c3e7\") " pod="metallb-system/frr-k8s-rll29"
Mar 09 09:22:25 crc kubenswrapper[4792]: I0309 09:22:25.775176 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-86ddb6bd46-2lfdv"]
Mar 09 09:22:25 crc kubenswrapper[4792]: I0309 09:22:25.871418 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/10a41b58-f88e-4dad-960f-cd70b006c3e7-frr-conf\") pod \"frr-k8s-rll29\" (UID: \"10a41b58-f88e-4dad-960f-cd70b006c3e7\") " pod="metallb-system/frr-k8s-rll29"
Mar 09 09:22:25 crc kubenswrapper[4792]: I0309 09:22:25.871495 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7ac24411-eccd-496a-8a49-d9b552a92691-cert\") pod \"frr-k8s-webhook-server-7f989f654f-gm7zm\" (UID: \"7ac24411-eccd-496a-8a49-d9b552a92691\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-gm7zm"
Mar 09 09:22:25 crc kubenswrapper[4792]: I0309 09:22:25.871518 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z67th\" (UniqueName: \"kubernetes.io/projected/6ed0a6a3-dfa2-49c2-bbb2-96a6f3cfc4f9-kube-api-access-z67th\") pod \"speaker-9fj2f\" (UID: \"6ed0a6a3-dfa2-49c2-bbb2-96a6f3cfc4f9\") " pod="metallb-system/speaker-9fj2f"
Mar 09 09:22:25 crc kubenswrapper[4792]: I0309 09:22:25.871541 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z8dj5\" (UniqueName: \"kubernetes.io/projected/7ac24411-eccd-496a-8a49-d9b552a92691-kube-api-access-z8dj5\") pod \"frr-k8s-webhook-server-7f989f654f-gm7zm\" (UID: \"7ac24411-eccd-496a-8a49-d9b552a92691\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-gm7zm"
Mar 09 09:22:25 crc kubenswrapper[4792]: I0309 09:22:25.871566 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/6ed0a6a3-dfa2-49c2-bbb2-96a6f3cfc4f9-memberlist\") pod \"speaker-9fj2f\" (UID: \"6ed0a6a3-dfa2-49c2-bbb2-96a6f3cfc4f9\") " pod="metallb-system/speaker-9fj2f"
Mar 09 09:22:25 crc kubenswrapper[4792]: I0309 09:22:25.871596 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/10a41b58-f88e-4dad-960f-cd70b006c3e7-frr-sockets\") pod \"frr-k8s-rll29\" (UID: \"10a41b58-f88e-4dad-960f-cd70b006c3e7\") " pod="metallb-system/frr-k8s-rll29"
Mar 09 09:22:25 crc kubenswrapper[4792]: I0309 09:22:25.871631 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b5n9g\" (UniqueName: \"kubernetes.io/projected/98a5f3b6-5d33-4542-9382-ea1d94e5f59f-kube-api-access-b5n9g\") pod \"controller-86ddb6bd46-2lfdv\" (UID: \"98a5f3b6-5d33-4542-9382-ea1d94e5f59f\") " pod="metallb-system/controller-86ddb6bd46-2lfdv"
Mar 09 09:22:25 crc kubenswrapper[4792]: I0309 09:22:25.871656 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qjdfg\" (UniqueName: \"kubernetes.io/projected/10a41b58-f88e-4dad-960f-cd70b006c3e7-kube-api-access-qjdfg\") pod \"frr-k8s-rll29\" (UID: \"10a41b58-f88e-4dad-960f-cd70b006c3e7\") " pod="metallb-system/frr-k8s-rll29"
Mar 09 09:22:25 crc kubenswrapper[4792]: I0309 09:22:25.871706 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/6ed0a6a3-dfa2-49c2-bbb2-96a6f3cfc4f9-metallb-excludel2\") pod \"speaker-9fj2f\" (UID: \"6ed0a6a3-dfa2-49c2-bbb2-96a6f3cfc4f9\") " pod="metallb-system/speaker-9fj2f"
Mar 09 09:22:25 crc kubenswrapper[4792]: I0309 09:22:25.871730 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/10a41b58-f88e-4dad-960f-cd70b006c3e7-frr-startup\") pod \"frr-k8s-rll29\" (UID: \"10a41b58-f88e-4dad-960f-cd70b006c3e7\") " pod="metallb-system/frr-k8s-rll29"
Mar 09 09:22:25 crc kubenswrapper[4792]: E0309 09:22:25.871733 4792 secret.go:188] Couldn't get secret metallb-system/frr-k8s-webhook-server-cert: secret "frr-k8s-webhook-server-cert" not found
Mar 09 09:22:25 crc kubenswrapper[4792]: I0309 09:22:25.871757 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/98a5f3b6-5d33-4542-9382-ea1d94e5f59f-cert\") pod \"controller-86ddb6bd46-2lfdv\" (UID: \"98a5f3b6-5d33-4542-9382-ea1d94e5f59f\") " pod="metallb-system/controller-86ddb6bd46-2lfdv"
Mar 09 09:22:25 crc kubenswrapper[4792]: I0309 09:22:25.871781 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/10a41b58-f88e-4dad-960f-cd70b006c3e7-metrics\") pod \"frr-k8s-rll29\" (UID: \"10a41b58-f88e-4dad-960f-cd70b006c3e7\") " pod="metallb-system/frr-k8s-rll29"
Mar 09 09:22:25 crc kubenswrapper[4792]: E0309 09:22:25.871812 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7ac24411-eccd-496a-8a49-d9b552a92691-cert podName:7ac24411-eccd-496a-8a49-d9b552a92691 nodeName:}" failed. No retries permitted until 2026-03-09 09:22:26.371790782 +0000 UTC m=+911.401991534 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7ac24411-eccd-496a-8a49-d9b552a92691-cert") pod "frr-k8s-webhook-server-7f989f654f-gm7zm" (UID: "7ac24411-eccd-496a-8a49-d9b552a92691") : secret "frr-k8s-webhook-server-cert" not found
Mar 09 09:22:25 crc kubenswrapper[4792]: I0309 09:22:25.871839 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/10a41b58-f88e-4dad-960f-cd70b006c3e7-metrics-certs\") pod \"frr-k8s-rll29\" (UID: \"10a41b58-f88e-4dad-960f-cd70b006c3e7\") " pod="metallb-system/frr-k8s-rll29"
Mar 09 09:22:25 crc kubenswrapper[4792]: I0309 09:22:25.871880 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6ed0a6a3-dfa2-49c2-bbb2-96a6f3cfc4f9-metrics-certs\") pod \"speaker-9fj2f\" (UID: \"6ed0a6a3-dfa2-49c2-bbb2-96a6f3cfc4f9\") " pod="metallb-system/speaker-9fj2f"
Mar 09 09:22:25 crc kubenswrapper[4792]: I0309 09:22:25.871912 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/98a5f3b6-5d33-4542-9382-ea1d94e5f59f-metrics-certs\") pod \"controller-86ddb6bd46-2lfdv\" (UID: \"98a5f3b6-5d33-4542-9382-ea1d94e5f59f\") " pod="metallb-system/controller-86ddb6bd46-2lfdv"
Mar 09 09:22:25 crc kubenswrapper[4792]: I0309 09:22:25.871943 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/10a41b58-f88e-4dad-960f-cd70b006c3e7-reloader\") pod \"frr-k8s-rll29\" (UID: \"10a41b58-f88e-4dad-960f-cd70b006c3e7\") " pod="metallb-system/frr-k8s-rll29"
Mar 09 09:22:25 crc kubenswrapper[4792]: I0309 09:22:25.872298 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/10a41b58-f88e-4dad-960f-cd70b006c3e7-metrics\") pod \"frr-k8s-rll29\" (UID: \"10a41b58-f88e-4dad-960f-cd70b006c3e7\") " pod="metallb-system/frr-k8s-rll29"
Mar 09 09:22:25 crc kubenswrapper[4792]: I0309 09:22:25.872371 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/10a41b58-f88e-4dad-960f-cd70b006c3e7-reloader\") pod \"frr-k8s-rll29\" (UID: \"10a41b58-f88e-4dad-960f-cd70b006c3e7\") " pod="metallb-system/frr-k8s-rll29"
Mar 09 09:22:25 crc kubenswrapper[4792]: I0309 09:22:25.872575 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/10a41b58-f88e-4dad-960f-cd70b006c3e7-frr-conf\") pod \"frr-k8s-rll29\" (UID: \"10a41b58-f88e-4dad-960f-cd70b006c3e7\") " pod="metallb-system/frr-k8s-rll29"
Mar 09 09:22:25 crc kubenswrapper[4792]: E0309 09:22:25.872768 4792 secret.go:188] Couldn't get secret metallb-system/controller-certs-secret: secret "controller-certs-secret" not found
Mar 09 09:22:25 crc kubenswrapper[4792]: E0309 09:22:25.872907 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/98a5f3b6-5d33-4542-9382-ea1d94e5f59f-metrics-certs podName:98a5f3b6-5d33-4542-9382-ea1d94e5f59f nodeName:}" failed. No retries permitted until 2026-03-09 09:22:26.372884424 +0000 UTC m=+911.403085176 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/98a5f3b6-5d33-4542-9382-ea1d94e5f59f-metrics-certs") pod "controller-86ddb6bd46-2lfdv" (UID: "98a5f3b6-5d33-4542-9382-ea1d94e5f59f") : secret "controller-certs-secret" not found
Mar 09 09:22:25 crc kubenswrapper[4792]: E0309 09:22:25.873044 4792 secret.go:188] Couldn't get secret metallb-system/speaker-certs-secret: secret "speaker-certs-secret" not found
Mar 09 09:22:25 crc kubenswrapper[4792]: E0309 09:22:25.873175 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6ed0a6a3-dfa2-49c2-bbb2-96a6f3cfc4f9-metrics-certs podName:6ed0a6a3-dfa2-49c2-bbb2-96a6f3cfc4f9 nodeName:}" failed. No retries permitted until 2026-03-09 09:22:26.373163712 +0000 UTC m=+911.403364464 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6ed0a6a3-dfa2-49c2-bbb2-96a6f3cfc4f9-metrics-certs") pod "speaker-9fj2f" (UID: "6ed0a6a3-dfa2-49c2-bbb2-96a6f3cfc4f9") : secret "speaker-certs-secret" not found
Mar 09 09:22:25 crc kubenswrapper[4792]: I0309 09:22:25.873498 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/6ed0a6a3-dfa2-49c2-bbb2-96a6f3cfc4f9-metallb-excludel2\") pod \"speaker-9fj2f\" (UID: \"6ed0a6a3-dfa2-49c2-bbb2-96a6f3cfc4f9\") " pod="metallb-system/speaker-9fj2f"
Mar 09 09:22:25 crc kubenswrapper[4792]: I0309 09:22:25.873612 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/10a41b58-f88e-4dad-960f-cd70b006c3e7-frr-startup\") pod \"frr-k8s-rll29\" (UID: \"10a41b58-f88e-4dad-960f-cd70b006c3e7\") " pod="metallb-system/frr-k8s-rll29"
Mar 09 09:22:25 crc kubenswrapper[4792]: E0309 09:22:25.873763 4792 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found
Mar 09 09:22:25 crc kubenswrapper[4792]: E0309 09:22:25.873805 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6ed0a6a3-dfa2-49c2-bbb2-96a6f3cfc4f9-memberlist podName:6ed0a6a3-dfa2-49c2-bbb2-96a6f3cfc4f9 nodeName:}" failed. No retries permitted until 2026-03-09 09:22:26.373792761 +0000 UTC m=+911.403993613 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/6ed0a6a3-dfa2-49c2-bbb2-96a6f3cfc4f9-memberlist") pod "speaker-9fj2f" (UID: "6ed0a6a3-dfa2-49c2-bbb2-96a6f3cfc4f9") : secret "metallb-memberlist" not found
Mar 09 09:22:25 crc kubenswrapper[4792]: I0309 09:22:25.873975 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/10a41b58-f88e-4dad-960f-cd70b006c3e7-frr-sockets\") pod \"frr-k8s-rll29\" (UID: \"10a41b58-f88e-4dad-960f-cd70b006c3e7\") " pod="metallb-system/frr-k8s-rll29"
Mar 09 09:22:25 crc kubenswrapper[4792]: I0309 09:22:25.878778 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/10a41b58-f88e-4dad-960f-cd70b006c3e7-metrics-certs\") pod \"frr-k8s-rll29\" (UID: \"10a41b58-f88e-4dad-960f-cd70b006c3e7\") " pod="metallb-system/frr-k8s-rll29"
Mar 09 09:22:25 crc kubenswrapper[4792]: I0309 09:22:25.880245 4792 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert"
Mar 09 09:22:25 crc kubenswrapper[4792]: I0309 09:22:25.887990 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/98a5f3b6-5d33-4542-9382-ea1d94e5f59f-cert\") pod \"controller-86ddb6bd46-2lfdv\" (UID: \"98a5f3b6-5d33-4542-9382-ea1d94e5f59f\") " pod="metallb-system/controller-86ddb6bd46-2lfdv"
Mar 09 09:22:25 crc kubenswrapper[4792]: I0309 09:22:25.896939 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qjdfg\" (UniqueName: \"kubernetes.io/projected/10a41b58-f88e-4dad-960f-cd70b006c3e7-kube-api-access-qjdfg\") pod \"frr-k8s-rll29\" (UID: \"10a41b58-f88e-4dad-960f-cd70b006c3e7\") " pod="metallb-system/frr-k8s-rll29"
Mar 09 09:22:25 crc kubenswrapper[4792]: I0309 09:22:25.905564 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z67th\" (UniqueName: \"kubernetes.io/projected/6ed0a6a3-dfa2-49c2-bbb2-96a6f3cfc4f9-kube-api-access-z67th\") pod \"speaker-9fj2f\" (UID: \"6ed0a6a3-dfa2-49c2-bbb2-96a6f3cfc4f9\") " pod="metallb-system/speaker-9fj2f"
Mar 09 09:22:25 crc kubenswrapper[4792]: I0309 09:22:25.908549 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z8dj5\" (UniqueName: \"kubernetes.io/projected/7ac24411-eccd-496a-8a49-d9b552a92691-kube-api-access-z8dj5\") pod \"frr-k8s-webhook-server-7f989f654f-gm7zm\" (UID: \"7ac24411-eccd-496a-8a49-d9b552a92691\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-gm7zm"
Mar 09 09:22:25 crc kubenswrapper[4792]: I0309 09:22:25.928669 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b5n9g\" (UniqueName: \"kubernetes.io/projected/98a5f3b6-5d33-4542-9382-ea1d94e5f59f-kube-api-access-b5n9g\") pod \"controller-86ddb6bd46-2lfdv\" (UID: \"98a5f3b6-5d33-4542-9382-ea1d94e5f59f\") " pod="metallb-system/controller-86ddb6bd46-2lfdv"
Mar 09 09:22:26 crc kubenswrapper[4792]: I0309 09:22:26.188223 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-rll29"
Mar 09 09:22:26 crc kubenswrapper[4792]: I0309 09:22:26.376271 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6ed0a6a3-dfa2-49c2-bbb2-96a6f3cfc4f9-metrics-certs\") pod \"speaker-9fj2f\" (UID: \"6ed0a6a3-dfa2-49c2-bbb2-96a6f3cfc4f9\") " pod="metallb-system/speaker-9fj2f"
Mar 09 09:22:26 crc kubenswrapper[4792]: I0309 09:22:26.376324 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/98a5f3b6-5d33-4542-9382-ea1d94e5f59f-metrics-certs\") pod \"controller-86ddb6bd46-2lfdv\" (UID: \"98a5f3b6-5d33-4542-9382-ea1d94e5f59f\") " pod="metallb-system/controller-86ddb6bd46-2lfdv"
Mar 09 09:22:26 crc kubenswrapper[4792]: I0309 09:22:26.376365 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7ac24411-eccd-496a-8a49-d9b552a92691-cert\") pod \"frr-k8s-webhook-server-7f989f654f-gm7zm\" (UID: \"7ac24411-eccd-496a-8a49-d9b552a92691\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-gm7zm"
Mar 09 09:22:26 crc kubenswrapper[4792]: I0309 09:22:26.376390 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/6ed0a6a3-dfa2-49c2-bbb2-96a6f3cfc4f9-memberlist\") pod \"speaker-9fj2f\" (UID: \"6ed0a6a3-dfa2-49c2-bbb2-96a6f3cfc4f9\") " pod="metallb-system/speaker-9fj2f"
Mar 09 09:22:26 crc kubenswrapper[4792]: E0309 09:22:26.376493 4792 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found
Mar 09 09:22:26 crc kubenswrapper[4792]: E0309 09:22:26.376536 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6ed0a6a3-dfa2-49c2-bbb2-96a6f3cfc4f9-memberlist podName:6ed0a6a3-dfa2-49c2-bbb2-96a6f3cfc4f9 nodeName:}" failed. No retries permitted until 2026-03-09 09:22:27.37652321 +0000 UTC m=+912.406723952 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/6ed0a6a3-dfa2-49c2-bbb2-96a6f3cfc4f9-memberlist") pod "speaker-9fj2f" (UID: "6ed0a6a3-dfa2-49c2-bbb2-96a6f3cfc4f9") : secret "metallb-memberlist" not found
Mar 09 09:22:26 crc kubenswrapper[4792]: I0309 09:22:26.380244 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6ed0a6a3-dfa2-49c2-bbb2-96a6f3cfc4f9-metrics-certs\") pod \"speaker-9fj2f\" (UID: \"6ed0a6a3-dfa2-49c2-bbb2-96a6f3cfc4f9\") " pod="metallb-system/speaker-9fj2f"
Mar 09 09:22:26 crc kubenswrapper[4792]: I0309 09:22:26.380400 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/98a5f3b6-5d33-4542-9382-ea1d94e5f59f-metrics-certs\") pod \"controller-86ddb6bd46-2lfdv\" (UID: \"98a5f3b6-5d33-4542-9382-ea1d94e5f59f\") " pod="metallb-system/controller-86ddb6bd46-2lfdv"
Mar 09 09:22:26 crc kubenswrapper[4792]: I0309 09:22:26.380538 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7ac24411-eccd-496a-8a49-d9b552a92691-cert\") pod \"frr-k8s-webhook-server-7f989f654f-gm7zm\" (UID: \"7ac24411-eccd-496a-8a49-d9b552a92691\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-gm7zm"
Mar 09 09:22:26 crc kubenswrapper[4792]: I0309 09:22:26.575653 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-gm7zm"
Mar 09 09:22:26 crc kubenswrapper[4792]: I0309 09:22:26.668546 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-86ddb6bd46-2lfdv"
Mar 09 09:22:26 crc kubenswrapper[4792]: I0309 09:22:26.759572 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-rll29" event={"ID":"10a41b58-f88e-4dad-960f-cd70b006c3e7","Type":"ContainerStarted","Data":"45ace741549f94d5a0891c9f3532cd7911c8c3d08d1d5fd639e039a57277a7b5"}
Mar 09 09:22:26 crc kubenswrapper[4792]: I0309 09:22:26.872592 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-86ddb6bd46-2lfdv"]
Mar 09 09:22:26 crc kubenswrapper[4792]: I0309 09:22:26.885683 4792 scope.go:117] "RemoveContainer" containerID="5be8783cce265fa21ed1012182d0b76ce654470b581e06540b32ebb82bc52e8e"
Mar 09 09:22:27 crc kubenswrapper[4792]: I0309 09:22:27.020726 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7f989f654f-gm7zm"]
Mar 09 09:22:27 crc kubenswrapper[4792]: W0309 09:22:27.037900 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7ac24411_eccd_496a_8a49_d9b552a92691.slice/crio-3f6cc95eb900a96815a888fde8c5871827a6956124a01cc83382b8c9df97f8e6 WatchSource:0}: Error finding container 3f6cc95eb900a96815a888fde8c5871827a6956124a01cc83382b8c9df97f8e6: Status 404 returned error can't find the container with id 3f6cc95eb900a96815a888fde8c5871827a6956124a01cc83382b8c9df97f8e6
Mar 09 09:22:27 crc kubenswrapper[4792]: I0309 09:22:27.397046 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/6ed0a6a3-dfa2-49c2-bbb2-96a6f3cfc4f9-memberlist\") pod \"speaker-9fj2f\" (UID: \"6ed0a6a3-dfa2-49c2-bbb2-96a6f3cfc4f9\") " pod="metallb-system/speaker-9fj2f"
Mar 09 09:22:27 crc kubenswrapper[4792]: I0309 09:22:27.406772 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/6ed0a6a3-dfa2-49c2-bbb2-96a6f3cfc4f9-memberlist\") pod \"speaker-9fj2f\" (UID: \"6ed0a6a3-dfa2-49c2-bbb2-96a6f3cfc4f9\") " pod="metallb-system/speaker-9fj2f"
Mar 09 09:22:27 crc kubenswrapper[4792]: I0309 09:22:27.543391 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-9fj2f"
Mar 09 09:22:27 crc kubenswrapper[4792]: W0309 09:22:27.562118 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6ed0a6a3_dfa2_49c2_bbb2_96a6f3cfc4f9.slice/crio-04b1358b0f01f982fd2b05f1bd26b80d45244ab8e2f77d07b468a5a05a7dff92 WatchSource:0}: Error finding container 04b1358b0f01f982fd2b05f1bd26b80d45244ab8e2f77d07b468a5a05a7dff92: Status 404 returned error can't find the container with id 04b1358b0f01f982fd2b05f1bd26b80d45244ab8e2f77d07b468a5a05a7dff92
Mar 09 09:22:27 crc kubenswrapper[4792]: I0309 09:22:27.774076 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-gm7zm" event={"ID":"7ac24411-eccd-496a-8a49-d9b552a92691","Type":"ContainerStarted","Data":"3f6cc95eb900a96815a888fde8c5871827a6956124a01cc83382b8c9df97f8e6"}
Mar 09 09:22:27 crc kubenswrapper[4792]: I0309 09:22:27.775489 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-9fj2f" event={"ID":"6ed0a6a3-dfa2-49c2-bbb2-96a6f3cfc4f9","Type":"ContainerStarted","Data":"04b1358b0f01f982fd2b05f1bd26b80d45244ab8e2f77d07b468a5a05a7dff92"}
Mar 09 09:22:27 crc kubenswrapper[4792]: I0309 09:22:27.776976 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-86ddb6bd46-2lfdv" event={"ID":"98a5f3b6-5d33-4542-9382-ea1d94e5f59f","Type":"ContainerStarted","Data":"032110f1ae13f8ca40e82bf2fdc4b3cb37e5d6d8ee04a788171fd5a2f1c0c5da"}
Mar 09 09:22:27 crc kubenswrapper[4792]: I0309 09:22:27.777000 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-86ddb6bd46-2lfdv" event={"ID":"98a5f3b6-5d33-4542-9382-ea1d94e5f59f","Type":"ContainerStarted","Data":"002e9055a821a35dca91bb9c42db35ce57339a7c38405b99ab83a137cf30621f"}
Mar 09 09:22:27 crc kubenswrapper[4792]: I0309 09:22:27.777012 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-86ddb6bd46-2lfdv" event={"ID":"98a5f3b6-5d33-4542-9382-ea1d94e5f59f","Type":"ContainerStarted","Data":"1f9c146ec6d48b1286d235d399858156a72fc6823bbb50e3bf0a3a4e3130dc31"}
Mar 09 09:22:27 crc kubenswrapper[4792]: I0309 09:22:27.777988 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-86ddb6bd46-2lfdv"
Mar 09 09:22:27 crc kubenswrapper[4792]: I0309 09:22:27.814906 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-86ddb6bd46-2lfdv" podStartSLOduration=2.814889657 podStartE2EDuration="2.814889657s" podCreationTimestamp="2026-03-09 09:22:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:22:27.810485997 +0000 UTC m=+912.840686749" watchObservedRunningTime="2026-03-09 09:22:27.814889657 +0000 UTC m=+912.845090409"
Mar 09 09:22:28 crc kubenswrapper[4792]: I0309 09:22:28.792811 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-9fj2f" event={"ID":"6ed0a6a3-dfa2-49c2-bbb2-96a6f3cfc4f9","Type":"ContainerStarted","Data":"dc8141d06cbf69fdfb68dd723bbae6ade7123635482cf7f0ee9b8c4b801edf09"}
Mar 09 09:22:28 crc kubenswrapper[4792]: I0309 09:22:28.793048 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-9fj2f"
Mar 09 09:22:28 crc kubenswrapper[4792]: I0309 09:22:28.793058 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-9fj2f" event={"ID":"6ed0a6a3-dfa2-49c2-bbb2-96a6f3cfc4f9","Type":"ContainerStarted","Data":"b7f41412c113046e90ba8756f49aa70b9eb6ba801fa421d991711ad46fa8056f"}
Mar 09 09:22:28 crc kubenswrapper[4792]: I0309 09:22:28.821821 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-9fj2f" podStartSLOduration=3.821800057 podStartE2EDuration="3.821800057s" podCreationTimestamp="2026-03-09 09:22:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:22:28.81882288 +0000 UTC m=+913.849023632" watchObservedRunningTime="2026-03-09 09:22:28.821800057 +0000 UTC m=+913.852000809"
Mar 09 09:22:32 crc kubenswrapper[4792]: I0309 09:22:32.019169 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-84rfv"]
Mar 09 09:22:32 crc kubenswrapper[4792]: I0309 09:22:32.024318 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-84rfv"
Mar 09 09:22:32 crc kubenswrapper[4792]: I0309 09:22:32.025525 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-84rfv"]
Mar 09 09:22:32 crc kubenswrapper[4792]: I0309 09:22:32.090047 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ad6f0506-47f5-4eb1-9ee8-6a025fc97c59-utilities\") pod \"certified-operators-84rfv\" (UID: \"ad6f0506-47f5-4eb1-9ee8-6a025fc97c59\") " pod="openshift-marketplace/certified-operators-84rfv"
Mar 09 09:22:32 crc kubenswrapper[4792]: I0309 09:22:32.090408 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ad6f0506-47f5-4eb1-9ee8-6a025fc97c59-catalog-content\") pod \"certified-operators-84rfv\" (UID:
\"ad6f0506-47f5-4eb1-9ee8-6a025fc97c59\") " pod="openshift-marketplace/certified-operators-84rfv" Mar 09 09:22:32 crc kubenswrapper[4792]: I0309 09:22:32.090497 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q78t7\" (UniqueName: \"kubernetes.io/projected/ad6f0506-47f5-4eb1-9ee8-6a025fc97c59-kube-api-access-q78t7\") pod \"certified-operators-84rfv\" (UID: \"ad6f0506-47f5-4eb1-9ee8-6a025fc97c59\") " pod="openshift-marketplace/certified-operators-84rfv" Mar 09 09:22:32 crc kubenswrapper[4792]: I0309 09:22:32.192420 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ad6f0506-47f5-4eb1-9ee8-6a025fc97c59-utilities\") pod \"certified-operators-84rfv\" (UID: \"ad6f0506-47f5-4eb1-9ee8-6a025fc97c59\") " pod="openshift-marketplace/certified-operators-84rfv" Mar 09 09:22:32 crc kubenswrapper[4792]: I0309 09:22:32.192814 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ad6f0506-47f5-4eb1-9ee8-6a025fc97c59-catalog-content\") pod \"certified-operators-84rfv\" (UID: \"ad6f0506-47f5-4eb1-9ee8-6a025fc97c59\") " pod="openshift-marketplace/certified-operators-84rfv" Mar 09 09:22:32 crc kubenswrapper[4792]: I0309 09:22:32.192908 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q78t7\" (UniqueName: \"kubernetes.io/projected/ad6f0506-47f5-4eb1-9ee8-6a025fc97c59-kube-api-access-q78t7\") pod \"certified-operators-84rfv\" (UID: \"ad6f0506-47f5-4eb1-9ee8-6a025fc97c59\") " pod="openshift-marketplace/certified-operators-84rfv" Mar 09 09:22:32 crc kubenswrapper[4792]: I0309 09:22:32.193195 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ad6f0506-47f5-4eb1-9ee8-6a025fc97c59-utilities\") pod \"certified-operators-84rfv\" (UID: 
\"ad6f0506-47f5-4eb1-9ee8-6a025fc97c59\") " pod="openshift-marketplace/certified-operators-84rfv" Mar 09 09:22:32 crc kubenswrapper[4792]: I0309 09:22:32.193539 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ad6f0506-47f5-4eb1-9ee8-6a025fc97c59-catalog-content\") pod \"certified-operators-84rfv\" (UID: \"ad6f0506-47f5-4eb1-9ee8-6a025fc97c59\") " pod="openshift-marketplace/certified-operators-84rfv" Mar 09 09:22:32 crc kubenswrapper[4792]: I0309 09:22:32.222446 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q78t7\" (UniqueName: \"kubernetes.io/projected/ad6f0506-47f5-4eb1-9ee8-6a025fc97c59-kube-api-access-q78t7\") pod \"certified-operators-84rfv\" (UID: \"ad6f0506-47f5-4eb1-9ee8-6a025fc97c59\") " pod="openshift-marketplace/certified-operators-84rfv" Mar 09 09:22:32 crc kubenswrapper[4792]: I0309 09:22:32.344047 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-84rfv" Mar 09 09:22:35 crc kubenswrapper[4792]: I0309 09:22:35.334627 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-84rfv"] Mar 09 09:22:35 crc kubenswrapper[4792]: W0309 09:22:35.344327 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podad6f0506_47f5_4eb1_9ee8_6a025fc97c59.slice/crio-ee101272a5188524ab28218eb5593c3a504d37f8488bfa9408de8372545c91f2 WatchSource:0}: Error finding container ee101272a5188524ab28218eb5593c3a504d37f8488bfa9408de8372545c91f2: Status 404 returned error can't find the container with id ee101272a5188524ab28218eb5593c3a504d37f8488bfa9408de8372545c91f2 Mar 09 09:22:35 crc kubenswrapper[4792]: I0309 09:22:35.836037 4792 generic.go:334] "Generic (PLEG): container finished" podID="10a41b58-f88e-4dad-960f-cd70b006c3e7" containerID="c9627025f1f7879dbad9595d76d72e76870a76617d25b8b61bcb261fb48d872a" exitCode=0 Mar 09 09:22:35 crc kubenswrapper[4792]: I0309 09:22:35.836155 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-rll29" event={"ID":"10a41b58-f88e-4dad-960f-cd70b006c3e7","Type":"ContainerDied","Data":"c9627025f1f7879dbad9595d76d72e76870a76617d25b8b61bcb261fb48d872a"} Mar 09 09:22:35 crc kubenswrapper[4792]: I0309 09:22:35.838410 4792 generic.go:334] "Generic (PLEG): container finished" podID="ad6f0506-47f5-4eb1-9ee8-6a025fc97c59" containerID="0ad262434eab6e4d216f3615bb38b6a991ce02979313282a21c18b3aeda7cad4" exitCode=0 Mar 09 09:22:35 crc kubenswrapper[4792]: I0309 09:22:35.838462 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-84rfv" event={"ID":"ad6f0506-47f5-4eb1-9ee8-6a025fc97c59","Type":"ContainerDied","Data":"0ad262434eab6e4d216f3615bb38b6a991ce02979313282a21c18b3aeda7cad4"} Mar 09 09:22:35 crc kubenswrapper[4792]: I0309 09:22:35.838506 4792 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-84rfv" event={"ID":"ad6f0506-47f5-4eb1-9ee8-6a025fc97c59","Type":"ContainerStarted","Data":"ee101272a5188524ab28218eb5593c3a504d37f8488bfa9408de8372545c91f2"} Mar 09 09:22:35 crc kubenswrapper[4792]: I0309 09:22:35.840040 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-gm7zm" event={"ID":"7ac24411-eccd-496a-8a49-d9b552a92691","Type":"ContainerStarted","Data":"341b0ce3527e9e7df507b61e9bda66f89e75aab992aafb6ad045e8f03544e178"} Mar 09 09:22:35 crc kubenswrapper[4792]: I0309 09:22:35.840268 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-gm7zm" Mar 09 09:22:35 crc kubenswrapper[4792]: I0309 09:22:35.901216 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-gm7zm" podStartSLOduration=2.83092253 podStartE2EDuration="10.901192547s" podCreationTimestamp="2026-03-09 09:22:25 +0000 UTC" firstStartedPulling="2026-03-09 09:22:27.043466942 +0000 UTC m=+912.073667694" lastFinishedPulling="2026-03-09 09:22:35.113736959 +0000 UTC m=+920.143937711" observedRunningTime="2026-03-09 09:22:35.896424906 +0000 UTC m=+920.926625668" watchObservedRunningTime="2026-03-09 09:22:35.901192547 +0000 UTC m=+920.931393299" Mar 09 09:22:36 crc kubenswrapper[4792]: I0309 09:22:36.847293 4792 generic.go:334] "Generic (PLEG): container finished" podID="10a41b58-f88e-4dad-960f-cd70b006c3e7" containerID="04d6009af41a55469a6c6e9cfbb3e4b53fce41afc212df6a645cec898eb7a2ce" exitCode=0 Mar 09 09:22:36 crc kubenswrapper[4792]: I0309 09:22:36.847352 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-rll29" event={"ID":"10a41b58-f88e-4dad-960f-cd70b006c3e7","Type":"ContainerDied","Data":"04d6009af41a55469a6c6e9cfbb3e4b53fce41afc212df6a645cec898eb7a2ce"} Mar 09 09:22:36 crc 
kubenswrapper[4792]: I0309 09:22:36.857693 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-84rfv" event={"ID":"ad6f0506-47f5-4eb1-9ee8-6a025fc97c59","Type":"ContainerStarted","Data":"b3f8cf2f113d6b1df07de4b3d94a19484a5ba1e94d7ecfd38697c87f015d5ce5"} Mar 09 09:22:37 crc kubenswrapper[4792]: I0309 09:22:37.547428 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-9fj2f" Mar 09 09:22:37 crc kubenswrapper[4792]: I0309 09:22:37.862949 4792 generic.go:334] "Generic (PLEG): container finished" podID="ad6f0506-47f5-4eb1-9ee8-6a025fc97c59" containerID="b3f8cf2f113d6b1df07de4b3d94a19484a5ba1e94d7ecfd38697c87f015d5ce5" exitCode=0 Mar 09 09:22:37 crc kubenswrapper[4792]: I0309 09:22:37.863908 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-84rfv" event={"ID":"ad6f0506-47f5-4eb1-9ee8-6a025fc97c59","Type":"ContainerDied","Data":"b3f8cf2f113d6b1df07de4b3d94a19484a5ba1e94d7ecfd38697c87f015d5ce5"} Mar 09 09:22:37 crc kubenswrapper[4792]: I0309 09:22:37.867259 4792 generic.go:334] "Generic (PLEG): container finished" podID="10a41b58-f88e-4dad-960f-cd70b006c3e7" containerID="ab5c0865978b74a22de6bf27505e5c23fd04cd537700257cc05177155f6d9ac2" exitCode=0 Mar 09 09:22:37 crc kubenswrapper[4792]: I0309 09:22:37.867292 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-rll29" event={"ID":"10a41b58-f88e-4dad-960f-cd70b006c3e7","Type":"ContainerDied","Data":"ab5c0865978b74a22de6bf27505e5c23fd04cd537700257cc05177155f6d9ac2"} Mar 09 09:22:38 crc kubenswrapper[4792]: I0309 09:22:38.878032 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-rll29" event={"ID":"10a41b58-f88e-4dad-960f-cd70b006c3e7","Type":"ContainerStarted","Data":"2dd4e5926042ed828b0f81121b5d368794cb386539f7bedf2136c8c699232a65"} Mar 09 09:22:38 crc kubenswrapper[4792]: I0309 09:22:38.878387 4792 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-rll29" event={"ID":"10a41b58-f88e-4dad-960f-cd70b006c3e7","Type":"ContainerStarted","Data":"2a57624fcbbcee2d0ebd59f8e06dad8d19aadf5c0d2d3d1bf836775228f0e6fb"} Mar 09 09:22:38 crc kubenswrapper[4792]: I0309 09:22:38.878399 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-rll29" event={"ID":"10a41b58-f88e-4dad-960f-cd70b006c3e7","Type":"ContainerStarted","Data":"96e2ec6d19448bb885411058be8da22ab806e8586150e9b9dd1d421de9b86eaf"} Mar 09 09:22:38 crc kubenswrapper[4792]: I0309 09:22:38.878407 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-rll29" event={"ID":"10a41b58-f88e-4dad-960f-cd70b006c3e7","Type":"ContainerStarted","Data":"81b27db7d87ddc86fd49ec311f80c2564de553a83a2ea4dfdcccb007be7dfb08"} Mar 09 09:22:38 crc kubenswrapper[4792]: I0309 09:22:38.878416 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-rll29" event={"ID":"10a41b58-f88e-4dad-960f-cd70b006c3e7","Type":"ContainerStarted","Data":"02cd5b543a3cb3d2b235ef5ca99cdd805af65f8c91d62b0dee83cfae56c6756d"} Mar 09 09:22:38 crc kubenswrapper[4792]: I0309 09:22:38.878423 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-rll29" event={"ID":"10a41b58-f88e-4dad-960f-cd70b006c3e7","Type":"ContainerStarted","Data":"4d19e0e9fcc0d9c0a1673458c4d1247c52ceeeb07aaa7560aae083aff979946c"} Mar 09 09:22:38 crc kubenswrapper[4792]: I0309 09:22:38.878462 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-rll29" Mar 09 09:22:38 crc kubenswrapper[4792]: I0309 09:22:38.880593 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-84rfv" event={"ID":"ad6f0506-47f5-4eb1-9ee8-6a025fc97c59","Type":"ContainerStarted","Data":"eced964fc51442edd7796bed1b7aa92b9b48794b292da397bf4fa97966a27c57"} Mar 09 09:22:38 crc kubenswrapper[4792]: I0309 
09:22:38.902688 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-rll29" podStartSLOduration=5.194723504 podStartE2EDuration="13.9026664s" podCreationTimestamp="2026-03-09 09:22:25 +0000 UTC" firstStartedPulling="2026-03-09 09:22:26.358857579 +0000 UTC m=+911.389058331" lastFinishedPulling="2026-03-09 09:22:35.066800475 +0000 UTC m=+920.097001227" observedRunningTime="2026-03-09 09:22:38.90129595 +0000 UTC m=+923.931496702" watchObservedRunningTime="2026-03-09 09:22:38.9026664 +0000 UTC m=+923.932867152" Mar 09 09:22:38 crc kubenswrapper[4792]: I0309 09:22:38.920746 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-84rfv" podStartSLOduration=4.4735818 podStartE2EDuration="6.920732793s" podCreationTimestamp="2026-03-09 09:22:32 +0000 UTC" firstStartedPulling="2026-03-09 09:22:35.839756205 +0000 UTC m=+920.869956957" lastFinishedPulling="2026-03-09 09:22:38.286907198 +0000 UTC m=+923.317107950" observedRunningTime="2026-03-09 09:22:38.917687763 +0000 UTC m=+923.947888525" watchObservedRunningTime="2026-03-09 09:22:38.920732793 +0000 UTC m=+923.950933545" Mar 09 09:22:41 crc kubenswrapper[4792]: I0309 09:22:41.188809 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-rll29" Mar 09 09:22:41 crc kubenswrapper[4792]: I0309 09:22:41.239196 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-rll29" Mar 09 09:22:42 crc kubenswrapper[4792]: I0309 09:22:42.345157 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-84rfv" Mar 09 09:22:42 crc kubenswrapper[4792]: I0309 09:22:42.345954 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-84rfv" Mar 09 09:22:42 crc kubenswrapper[4792]: I0309 09:22:42.384295 4792 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-84rfv" Mar 09 09:22:42 crc kubenswrapper[4792]: I0309 09:22:42.795335 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-rjx9b"] Mar 09 09:22:42 crc kubenswrapper[4792]: I0309 09:22:42.796039 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-rjx9b" Mar 09 09:22:42 crc kubenswrapper[4792]: I0309 09:22:42.800344 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Mar 09 09:22:42 crc kubenswrapper[4792]: I0309 09:22:42.800632 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Mar 09 09:22:42 crc kubenswrapper[4792]: I0309 09:22:42.801560 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-2svzr" Mar 09 09:22:42 crc kubenswrapper[4792]: I0309 09:22:42.806792 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-rjx9b"] Mar 09 09:22:42 crc kubenswrapper[4792]: I0309 09:22:42.846582 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xq7nh\" (UniqueName: \"kubernetes.io/projected/93e20f26-20b1-409a-8663-61cd1a7a71d3-kube-api-access-xq7nh\") pod \"openstack-operator-index-rjx9b\" (UID: \"93e20f26-20b1-409a-8663-61cd1a7a71d3\") " pod="openstack-operators/openstack-operator-index-rjx9b" Mar 09 09:22:42 crc kubenswrapper[4792]: I0309 09:22:42.947747 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xq7nh\" (UniqueName: \"kubernetes.io/projected/93e20f26-20b1-409a-8663-61cd1a7a71d3-kube-api-access-xq7nh\") pod \"openstack-operator-index-rjx9b\" (UID: \"93e20f26-20b1-409a-8663-61cd1a7a71d3\") " 
pod="openstack-operators/openstack-operator-index-rjx9b" Mar 09 09:22:42 crc kubenswrapper[4792]: I0309 09:22:42.967203 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xq7nh\" (UniqueName: \"kubernetes.io/projected/93e20f26-20b1-409a-8663-61cd1a7a71d3-kube-api-access-xq7nh\") pod \"openstack-operator-index-rjx9b\" (UID: \"93e20f26-20b1-409a-8663-61cd1a7a71d3\") " pod="openstack-operators/openstack-operator-index-rjx9b" Mar 09 09:22:43 crc kubenswrapper[4792]: I0309 09:22:43.115318 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-rjx9b" Mar 09 09:22:43 crc kubenswrapper[4792]: I0309 09:22:43.522067 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-rjx9b"] Mar 09 09:22:43 crc kubenswrapper[4792]: I0309 09:22:43.909716 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-rjx9b" event={"ID":"93e20f26-20b1-409a-8663-61cd1a7a71d3","Type":"ContainerStarted","Data":"3aa33f951c6f45e2514ebc350a0e3cabc6a8cf43b4f8d8af64b9974edd8f3ea6"} Mar 09 09:22:43 crc kubenswrapper[4792]: I0309 09:22:43.949688 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-84rfv" Mar 09 09:22:44 crc kubenswrapper[4792]: I0309 09:22:44.916932 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-rjx9b" event={"ID":"93e20f26-20b1-409a-8663-61cd1a7a71d3","Type":"ContainerStarted","Data":"1ffe88450e1d5dbb4af1e5883e539df31e6b3c11be653d65554cf311e26f7f19"} Mar 09 09:22:44 crc kubenswrapper[4792]: I0309 09:22:44.932542 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-rjx9b" podStartSLOduration=2.188252019 podStartE2EDuration="2.932523552s" podCreationTimestamp="2026-03-09 09:22:42 +0000 UTC" 
firstStartedPulling="2026-03-09 09:22:43.536192255 +0000 UTC m=+928.566393017" lastFinishedPulling="2026-03-09 09:22:44.280463798 +0000 UTC m=+929.310664550" observedRunningTime="2026-03-09 09:22:44.929314427 +0000 UTC m=+929.959515179" watchObservedRunningTime="2026-03-09 09:22:44.932523552 +0000 UTC m=+929.962724334" Mar 09 09:22:46 crc kubenswrapper[4792]: I0309 09:22:46.580487 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-gm7zm" Mar 09 09:22:46 crc kubenswrapper[4792]: I0309 09:22:46.673261 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-86ddb6bd46-2lfdv" Mar 09 09:22:47 crc kubenswrapper[4792]: I0309 09:22:47.203785 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-84rfv"] Mar 09 09:22:47 crc kubenswrapper[4792]: I0309 09:22:47.204574 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-84rfv" podUID="ad6f0506-47f5-4eb1-9ee8-6a025fc97c59" containerName="registry-server" containerID="cri-o://eced964fc51442edd7796bed1b7aa92b9b48794b292da397bf4fa97966a27c57" gracePeriod=2 Mar 09 09:22:47 crc kubenswrapper[4792]: I0309 09:22:47.937220 4792 generic.go:334] "Generic (PLEG): container finished" podID="ad6f0506-47f5-4eb1-9ee8-6a025fc97c59" containerID="eced964fc51442edd7796bed1b7aa92b9b48794b292da397bf4fa97966a27c57" exitCode=0 Mar 09 09:22:47 crc kubenswrapper[4792]: I0309 09:22:47.937262 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-84rfv" event={"ID":"ad6f0506-47f5-4eb1-9ee8-6a025fc97c59","Type":"ContainerDied","Data":"eced964fc51442edd7796bed1b7aa92b9b48794b292da397bf4fa97966a27c57"} Mar 09 09:22:48 crc kubenswrapper[4792]: I0309 09:22:48.147917 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-84rfv" Mar 09 09:22:48 crc kubenswrapper[4792]: I0309 09:22:48.215245 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ad6f0506-47f5-4eb1-9ee8-6a025fc97c59-catalog-content\") pod \"ad6f0506-47f5-4eb1-9ee8-6a025fc97c59\" (UID: \"ad6f0506-47f5-4eb1-9ee8-6a025fc97c59\") " Mar 09 09:22:48 crc kubenswrapper[4792]: I0309 09:22:48.215318 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q78t7\" (UniqueName: \"kubernetes.io/projected/ad6f0506-47f5-4eb1-9ee8-6a025fc97c59-kube-api-access-q78t7\") pod \"ad6f0506-47f5-4eb1-9ee8-6a025fc97c59\" (UID: \"ad6f0506-47f5-4eb1-9ee8-6a025fc97c59\") " Mar 09 09:22:48 crc kubenswrapper[4792]: I0309 09:22:48.215394 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ad6f0506-47f5-4eb1-9ee8-6a025fc97c59-utilities\") pod \"ad6f0506-47f5-4eb1-9ee8-6a025fc97c59\" (UID: \"ad6f0506-47f5-4eb1-9ee8-6a025fc97c59\") " Mar 09 09:22:48 crc kubenswrapper[4792]: I0309 09:22:48.216674 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ad6f0506-47f5-4eb1-9ee8-6a025fc97c59-utilities" (OuterVolumeSpecName: "utilities") pod "ad6f0506-47f5-4eb1-9ee8-6a025fc97c59" (UID: "ad6f0506-47f5-4eb1-9ee8-6a025fc97c59"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:22:48 crc kubenswrapper[4792]: I0309 09:22:48.221318 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad6f0506-47f5-4eb1-9ee8-6a025fc97c59-kube-api-access-q78t7" (OuterVolumeSpecName: "kube-api-access-q78t7") pod "ad6f0506-47f5-4eb1-9ee8-6a025fc97c59" (UID: "ad6f0506-47f5-4eb1-9ee8-6a025fc97c59"). InnerVolumeSpecName "kube-api-access-q78t7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:22:48 crc kubenswrapper[4792]: I0309 09:22:48.274802 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ad6f0506-47f5-4eb1-9ee8-6a025fc97c59-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ad6f0506-47f5-4eb1-9ee8-6a025fc97c59" (UID: "ad6f0506-47f5-4eb1-9ee8-6a025fc97c59"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:22:48 crc kubenswrapper[4792]: I0309 09:22:48.317044 4792 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ad6f0506-47f5-4eb1-9ee8-6a025fc97c59-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 09:22:48 crc kubenswrapper[4792]: I0309 09:22:48.317098 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q78t7\" (UniqueName: \"kubernetes.io/projected/ad6f0506-47f5-4eb1-9ee8-6a025fc97c59-kube-api-access-q78t7\") on node \"crc\" DevicePath \"\"" Mar 09 09:22:48 crc kubenswrapper[4792]: I0309 09:22:48.317114 4792 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ad6f0506-47f5-4eb1-9ee8-6a025fc97c59-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 09:22:48 crc kubenswrapper[4792]: I0309 09:22:48.944253 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-84rfv" event={"ID":"ad6f0506-47f5-4eb1-9ee8-6a025fc97c59","Type":"ContainerDied","Data":"ee101272a5188524ab28218eb5593c3a504d37f8488bfa9408de8372545c91f2"} Mar 09 09:22:48 crc kubenswrapper[4792]: I0309 09:22:48.944300 4792 scope.go:117] "RemoveContainer" containerID="eced964fc51442edd7796bed1b7aa92b9b48794b292da397bf4fa97966a27c57" Mar 09 09:22:48 crc kubenswrapper[4792]: I0309 09:22:48.944396 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-84rfv" Mar 09 09:22:48 crc kubenswrapper[4792]: I0309 09:22:48.965081 4792 scope.go:117] "RemoveContainer" containerID="b3f8cf2f113d6b1df07de4b3d94a19484a5ba1e94d7ecfd38697c87f015d5ce5" Mar 09 09:22:48 crc kubenswrapper[4792]: I0309 09:22:48.978737 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-84rfv"] Mar 09 09:22:48 crc kubenswrapper[4792]: I0309 09:22:48.987802 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-84rfv"] Mar 09 09:22:49 crc kubenswrapper[4792]: I0309 09:22:49.000092 4792 scope.go:117] "RemoveContainer" containerID="0ad262434eab6e4d216f3615bb38b6a991ce02979313282a21c18b3aeda7cad4" Mar 09 09:22:49 crc kubenswrapper[4792]: I0309 09:22:49.672508 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ad6f0506-47f5-4eb1-9ee8-6a025fc97c59" path="/var/lib/kubelet/pods/ad6f0506-47f5-4eb1-9ee8-6a025fc97c59/volumes" Mar 09 09:22:53 crc kubenswrapper[4792]: I0309 09:22:53.116001 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-rjx9b" Mar 09 09:22:53 crc kubenswrapper[4792]: I0309 09:22:53.116420 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-rjx9b" Mar 09 09:22:53 crc kubenswrapper[4792]: I0309 09:22:53.154261 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-rjx9b" Mar 09 09:22:54 crc kubenswrapper[4792]: I0309 09:22:54.000688 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-rjx9b" Mar 09 09:22:55 crc kubenswrapper[4792]: I0309 09:22:55.232666 4792 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack-operators/ff3fd74506e7f9e2c6e8ba7181b4867be2b110d904ba1aef106c523cfadtbcp"] Mar 09 09:22:55 crc kubenswrapper[4792]: E0309 09:22:55.233220 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad6f0506-47f5-4eb1-9ee8-6a025fc97c59" containerName="extract-utilities" Mar 09 09:22:55 crc kubenswrapper[4792]: I0309 09:22:55.233237 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad6f0506-47f5-4eb1-9ee8-6a025fc97c59" containerName="extract-utilities" Mar 09 09:22:55 crc kubenswrapper[4792]: E0309 09:22:55.233254 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad6f0506-47f5-4eb1-9ee8-6a025fc97c59" containerName="registry-server" Mar 09 09:22:55 crc kubenswrapper[4792]: I0309 09:22:55.233262 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad6f0506-47f5-4eb1-9ee8-6a025fc97c59" containerName="registry-server" Mar 09 09:22:55 crc kubenswrapper[4792]: E0309 09:22:55.233294 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad6f0506-47f5-4eb1-9ee8-6a025fc97c59" containerName="extract-content" Mar 09 09:22:55 crc kubenswrapper[4792]: I0309 09:22:55.233303 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad6f0506-47f5-4eb1-9ee8-6a025fc97c59" containerName="extract-content" Mar 09 09:22:55 crc kubenswrapper[4792]: I0309 09:22:55.233449 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad6f0506-47f5-4eb1-9ee8-6a025fc97c59" containerName="registry-server" Mar 09 09:22:55 crc kubenswrapper[4792]: I0309 09:22:55.234467 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ff3fd74506e7f9e2c6e8ba7181b4867be2b110d904ba1aef106c523cfadtbcp" Mar 09 09:22:55 crc kubenswrapper[4792]: I0309 09:22:55.237099 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-xlcf5" Mar 09 09:22:55 crc kubenswrapper[4792]: I0309 09:22:55.248323 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ff3fd74506e7f9e2c6e8ba7181b4867be2b110d904ba1aef106c523cfadtbcp"] Mar 09 09:22:55 crc kubenswrapper[4792]: I0309 09:22:55.313339 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3412edec-dc99-4713-b6bf-cebdace9f6a6-util\") pod \"ff3fd74506e7f9e2c6e8ba7181b4867be2b110d904ba1aef106c523cfadtbcp\" (UID: \"3412edec-dc99-4713-b6bf-cebdace9f6a6\") " pod="openstack-operators/ff3fd74506e7f9e2c6e8ba7181b4867be2b110d904ba1aef106c523cfadtbcp" Mar 09 09:22:55 crc kubenswrapper[4792]: I0309 09:22:55.313394 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3412edec-dc99-4713-b6bf-cebdace9f6a6-bundle\") pod \"ff3fd74506e7f9e2c6e8ba7181b4867be2b110d904ba1aef106c523cfadtbcp\" (UID: \"3412edec-dc99-4713-b6bf-cebdace9f6a6\") " pod="openstack-operators/ff3fd74506e7f9e2c6e8ba7181b4867be2b110d904ba1aef106c523cfadtbcp" Mar 09 09:22:55 crc kubenswrapper[4792]: I0309 09:22:55.313531 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h8846\" (UniqueName: \"kubernetes.io/projected/3412edec-dc99-4713-b6bf-cebdace9f6a6-kube-api-access-h8846\") pod \"ff3fd74506e7f9e2c6e8ba7181b4867be2b110d904ba1aef106c523cfadtbcp\" (UID: \"3412edec-dc99-4713-b6bf-cebdace9f6a6\") " pod="openstack-operators/ff3fd74506e7f9e2c6e8ba7181b4867be2b110d904ba1aef106c523cfadtbcp" Mar 09 09:22:55 crc kubenswrapper[4792]: I0309 
09:22:55.414154 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h8846\" (UniqueName: \"kubernetes.io/projected/3412edec-dc99-4713-b6bf-cebdace9f6a6-kube-api-access-h8846\") pod \"ff3fd74506e7f9e2c6e8ba7181b4867be2b110d904ba1aef106c523cfadtbcp\" (UID: \"3412edec-dc99-4713-b6bf-cebdace9f6a6\") " pod="openstack-operators/ff3fd74506e7f9e2c6e8ba7181b4867be2b110d904ba1aef106c523cfadtbcp" Mar 09 09:22:55 crc kubenswrapper[4792]: I0309 09:22:55.414469 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3412edec-dc99-4713-b6bf-cebdace9f6a6-util\") pod \"ff3fd74506e7f9e2c6e8ba7181b4867be2b110d904ba1aef106c523cfadtbcp\" (UID: \"3412edec-dc99-4713-b6bf-cebdace9f6a6\") " pod="openstack-operators/ff3fd74506e7f9e2c6e8ba7181b4867be2b110d904ba1aef106c523cfadtbcp" Mar 09 09:22:55 crc kubenswrapper[4792]: I0309 09:22:55.414501 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3412edec-dc99-4713-b6bf-cebdace9f6a6-bundle\") pod \"ff3fd74506e7f9e2c6e8ba7181b4867be2b110d904ba1aef106c523cfadtbcp\" (UID: \"3412edec-dc99-4713-b6bf-cebdace9f6a6\") " pod="openstack-operators/ff3fd74506e7f9e2c6e8ba7181b4867be2b110d904ba1aef106c523cfadtbcp" Mar 09 09:22:55 crc kubenswrapper[4792]: I0309 09:22:55.414956 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3412edec-dc99-4713-b6bf-cebdace9f6a6-bundle\") pod \"ff3fd74506e7f9e2c6e8ba7181b4867be2b110d904ba1aef106c523cfadtbcp\" (UID: \"3412edec-dc99-4713-b6bf-cebdace9f6a6\") " pod="openstack-operators/ff3fd74506e7f9e2c6e8ba7181b4867be2b110d904ba1aef106c523cfadtbcp" Mar 09 09:22:55 crc kubenswrapper[4792]: I0309 09:22:55.415051 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/3412edec-dc99-4713-b6bf-cebdace9f6a6-util\") pod \"ff3fd74506e7f9e2c6e8ba7181b4867be2b110d904ba1aef106c523cfadtbcp\" (UID: \"3412edec-dc99-4713-b6bf-cebdace9f6a6\") " pod="openstack-operators/ff3fd74506e7f9e2c6e8ba7181b4867be2b110d904ba1aef106c523cfadtbcp" Mar 09 09:22:55 crc kubenswrapper[4792]: I0309 09:22:55.432954 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h8846\" (UniqueName: \"kubernetes.io/projected/3412edec-dc99-4713-b6bf-cebdace9f6a6-kube-api-access-h8846\") pod \"ff3fd74506e7f9e2c6e8ba7181b4867be2b110d904ba1aef106c523cfadtbcp\" (UID: \"3412edec-dc99-4713-b6bf-cebdace9f6a6\") " pod="openstack-operators/ff3fd74506e7f9e2c6e8ba7181b4867be2b110d904ba1aef106c523cfadtbcp" Mar 09 09:22:55 crc kubenswrapper[4792]: I0309 09:22:55.554625 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ff3fd74506e7f9e2c6e8ba7181b4867be2b110d904ba1aef106c523cfadtbcp" Mar 09 09:22:55 crc kubenswrapper[4792]: I0309 09:22:55.756329 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ff3fd74506e7f9e2c6e8ba7181b4867be2b110d904ba1aef106c523cfadtbcp"] Mar 09 09:22:55 crc kubenswrapper[4792]: W0309 09:22:55.759549 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3412edec_dc99_4713_b6bf_cebdace9f6a6.slice/crio-0efd6c3d57e7ef07dc9d04800a158d1edbee05c9080c83ae53290b29fd0ab8d0 WatchSource:0}: Error finding container 0efd6c3d57e7ef07dc9d04800a158d1edbee05c9080c83ae53290b29fd0ab8d0: Status 404 returned error can't find the container with id 0efd6c3d57e7ef07dc9d04800a158d1edbee05c9080c83ae53290b29fd0ab8d0 Mar 09 09:22:55 crc kubenswrapper[4792]: I0309 09:22:55.990703 4792 generic.go:334] "Generic (PLEG): container finished" podID="3412edec-dc99-4713-b6bf-cebdace9f6a6" containerID="53f37dbbc9f70790188b1944ca633e5f84d46e79b52e282177d6a7138a51c840" exitCode=0 Mar 09 
09:22:55 crc kubenswrapper[4792]: I0309 09:22:55.990768 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ff3fd74506e7f9e2c6e8ba7181b4867be2b110d904ba1aef106c523cfadtbcp" event={"ID":"3412edec-dc99-4713-b6bf-cebdace9f6a6","Type":"ContainerDied","Data":"53f37dbbc9f70790188b1944ca633e5f84d46e79b52e282177d6a7138a51c840"} Mar 09 09:22:55 crc kubenswrapper[4792]: I0309 09:22:55.990792 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ff3fd74506e7f9e2c6e8ba7181b4867be2b110d904ba1aef106c523cfadtbcp" event={"ID":"3412edec-dc99-4713-b6bf-cebdace9f6a6","Type":"ContainerStarted","Data":"0efd6c3d57e7ef07dc9d04800a158d1edbee05c9080c83ae53290b29fd0ab8d0"} Mar 09 09:22:56 crc kubenswrapper[4792]: I0309 09:22:56.192439 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-rll29" Mar 09 09:22:57 crc kubenswrapper[4792]: I0309 09:22:57.000414 4792 generic.go:334] "Generic (PLEG): container finished" podID="3412edec-dc99-4713-b6bf-cebdace9f6a6" containerID="46609a2697045460797e0effa3db1411c5fefb7d75f4ca69a3b7a17d4fafe6e3" exitCode=0 Mar 09 09:22:57 crc kubenswrapper[4792]: I0309 09:22:57.000515 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ff3fd74506e7f9e2c6e8ba7181b4867be2b110d904ba1aef106c523cfadtbcp" event={"ID":"3412edec-dc99-4713-b6bf-cebdace9f6a6","Type":"ContainerDied","Data":"46609a2697045460797e0effa3db1411c5fefb7d75f4ca69a3b7a17d4fafe6e3"} Mar 09 09:22:58 crc kubenswrapper[4792]: I0309 09:22:58.008659 4792 generic.go:334] "Generic (PLEG): container finished" podID="3412edec-dc99-4713-b6bf-cebdace9f6a6" containerID="43030727049b07e2e9ec069588227dd9dc06b4f6e03b58088d903d0e38e0714c" exitCode=0 Mar 09 09:22:58 crc kubenswrapper[4792]: I0309 09:22:58.008718 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ff3fd74506e7f9e2c6e8ba7181b4867be2b110d904ba1aef106c523cfadtbcp" 
event={"ID":"3412edec-dc99-4713-b6bf-cebdace9f6a6","Type":"ContainerDied","Data":"43030727049b07e2e9ec069588227dd9dc06b4f6e03b58088d903d0e38e0714c"} Mar 09 09:22:59 crc kubenswrapper[4792]: I0309 09:22:59.373639 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ff3fd74506e7f9e2c6e8ba7181b4867be2b110d904ba1aef106c523cfadtbcp" Mar 09 09:22:59 crc kubenswrapper[4792]: I0309 09:22:59.469548 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3412edec-dc99-4713-b6bf-cebdace9f6a6-util\") pod \"3412edec-dc99-4713-b6bf-cebdace9f6a6\" (UID: \"3412edec-dc99-4713-b6bf-cebdace9f6a6\") " Mar 09 09:22:59 crc kubenswrapper[4792]: I0309 09:22:59.469588 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h8846\" (UniqueName: \"kubernetes.io/projected/3412edec-dc99-4713-b6bf-cebdace9f6a6-kube-api-access-h8846\") pod \"3412edec-dc99-4713-b6bf-cebdace9f6a6\" (UID: \"3412edec-dc99-4713-b6bf-cebdace9f6a6\") " Mar 09 09:22:59 crc kubenswrapper[4792]: I0309 09:22:59.469636 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3412edec-dc99-4713-b6bf-cebdace9f6a6-bundle\") pod \"3412edec-dc99-4713-b6bf-cebdace9f6a6\" (UID: \"3412edec-dc99-4713-b6bf-cebdace9f6a6\") " Mar 09 09:22:59 crc kubenswrapper[4792]: I0309 09:22:59.470310 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3412edec-dc99-4713-b6bf-cebdace9f6a6-bundle" (OuterVolumeSpecName: "bundle") pod "3412edec-dc99-4713-b6bf-cebdace9f6a6" (UID: "3412edec-dc99-4713-b6bf-cebdace9f6a6"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:22:59 crc kubenswrapper[4792]: I0309 09:22:59.481232 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3412edec-dc99-4713-b6bf-cebdace9f6a6-kube-api-access-h8846" (OuterVolumeSpecName: "kube-api-access-h8846") pod "3412edec-dc99-4713-b6bf-cebdace9f6a6" (UID: "3412edec-dc99-4713-b6bf-cebdace9f6a6"). InnerVolumeSpecName "kube-api-access-h8846". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:22:59 crc kubenswrapper[4792]: I0309 09:22:59.483014 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3412edec-dc99-4713-b6bf-cebdace9f6a6-util" (OuterVolumeSpecName: "util") pod "3412edec-dc99-4713-b6bf-cebdace9f6a6" (UID: "3412edec-dc99-4713-b6bf-cebdace9f6a6"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:22:59 crc kubenswrapper[4792]: I0309 09:22:59.572163 4792 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3412edec-dc99-4713-b6bf-cebdace9f6a6-util\") on node \"crc\" DevicePath \"\"" Mar 09 09:22:59 crc kubenswrapper[4792]: I0309 09:22:59.572239 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h8846\" (UniqueName: \"kubernetes.io/projected/3412edec-dc99-4713-b6bf-cebdace9f6a6-kube-api-access-h8846\") on node \"crc\" DevicePath \"\"" Mar 09 09:22:59 crc kubenswrapper[4792]: I0309 09:22:59.572291 4792 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3412edec-dc99-4713-b6bf-cebdace9f6a6-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 09:23:00 crc kubenswrapper[4792]: I0309 09:23:00.027829 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ff3fd74506e7f9e2c6e8ba7181b4867be2b110d904ba1aef106c523cfadtbcp" 
event={"ID":"3412edec-dc99-4713-b6bf-cebdace9f6a6","Type":"ContainerDied","Data":"0efd6c3d57e7ef07dc9d04800a158d1edbee05c9080c83ae53290b29fd0ab8d0"} Mar 09 09:23:00 crc kubenswrapper[4792]: I0309 09:23:00.027882 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0efd6c3d57e7ef07dc9d04800a158d1edbee05c9080c83ae53290b29fd0ab8d0" Mar 09 09:23:00 crc kubenswrapper[4792]: I0309 09:23:00.027926 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ff3fd74506e7f9e2c6e8ba7181b4867be2b110d904ba1aef106c523cfadtbcp" Mar 09 09:23:02 crc kubenswrapper[4792]: I0309 09:23:02.823103 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-init-568b7cf6db-hz254"] Mar 09 09:23:02 crc kubenswrapper[4792]: E0309 09:23:02.823693 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3412edec-dc99-4713-b6bf-cebdace9f6a6" containerName="util" Mar 09 09:23:02 crc kubenswrapper[4792]: I0309 09:23:02.823709 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="3412edec-dc99-4713-b6bf-cebdace9f6a6" containerName="util" Mar 09 09:23:02 crc kubenswrapper[4792]: E0309 09:23:02.823729 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3412edec-dc99-4713-b6bf-cebdace9f6a6" containerName="extract" Mar 09 09:23:02 crc kubenswrapper[4792]: I0309 09:23:02.823737 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="3412edec-dc99-4713-b6bf-cebdace9f6a6" containerName="extract" Mar 09 09:23:02 crc kubenswrapper[4792]: E0309 09:23:02.823748 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3412edec-dc99-4713-b6bf-cebdace9f6a6" containerName="pull" Mar 09 09:23:02 crc kubenswrapper[4792]: I0309 09:23:02.823756 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="3412edec-dc99-4713-b6bf-cebdace9f6a6" containerName="pull" Mar 09 09:23:02 crc kubenswrapper[4792]: I0309 09:23:02.823883 4792 
memory_manager.go:354] "RemoveStaleState removing state" podUID="3412edec-dc99-4713-b6bf-cebdace9f6a6" containerName="extract" Mar 09 09:23:02 crc kubenswrapper[4792]: I0309 09:23:02.824286 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-568b7cf6db-hz254" Mar 09 09:23:02 crc kubenswrapper[4792]: I0309 09:23:02.826581 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-init-dockercfg-t5rpj" Mar 09 09:23:02 crc kubenswrapper[4792]: I0309 09:23:02.878882 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-568b7cf6db-hz254"] Mar 09 09:23:02 crc kubenswrapper[4792]: I0309 09:23:02.914966 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rhwhh\" (UniqueName: \"kubernetes.io/projected/03eb7926-dd55-4d02-a695-5abcb5a02cdc-kube-api-access-rhwhh\") pod \"openstack-operator-controller-init-568b7cf6db-hz254\" (UID: \"03eb7926-dd55-4d02-a695-5abcb5a02cdc\") " pod="openstack-operators/openstack-operator-controller-init-568b7cf6db-hz254" Mar 09 09:23:03 crc kubenswrapper[4792]: I0309 09:23:03.016161 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rhwhh\" (UniqueName: \"kubernetes.io/projected/03eb7926-dd55-4d02-a695-5abcb5a02cdc-kube-api-access-rhwhh\") pod \"openstack-operator-controller-init-568b7cf6db-hz254\" (UID: \"03eb7926-dd55-4d02-a695-5abcb5a02cdc\") " pod="openstack-operators/openstack-operator-controller-init-568b7cf6db-hz254" Mar 09 09:23:03 crc kubenswrapper[4792]: I0309 09:23:03.035713 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rhwhh\" (UniqueName: \"kubernetes.io/projected/03eb7926-dd55-4d02-a695-5abcb5a02cdc-kube-api-access-rhwhh\") pod 
\"openstack-operator-controller-init-568b7cf6db-hz254\" (UID: \"03eb7926-dd55-4d02-a695-5abcb5a02cdc\") " pod="openstack-operators/openstack-operator-controller-init-568b7cf6db-hz254" Mar 09 09:23:03 crc kubenswrapper[4792]: I0309 09:23:03.149690 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-568b7cf6db-hz254" Mar 09 09:23:03 crc kubenswrapper[4792]: I0309 09:23:03.634650 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-568b7cf6db-hz254"] Mar 09 09:23:04 crc kubenswrapper[4792]: I0309 09:23:04.050016 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-568b7cf6db-hz254" event={"ID":"03eb7926-dd55-4d02-a695-5abcb5a02cdc","Type":"ContainerStarted","Data":"56d6092cc90328ea949d8741389b33e2a4b40d8463357bb0852292e416b669fb"} Mar 09 09:23:09 crc kubenswrapper[4792]: I0309 09:23:09.082809 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-568b7cf6db-hz254" event={"ID":"03eb7926-dd55-4d02-a695-5abcb5a02cdc","Type":"ContainerStarted","Data":"ea21a936dcbcf7eef62ff42d9c88f6abe81dbcd79e3787b438ff64cec029eca1"} Mar 09 09:23:09 crc kubenswrapper[4792]: I0309 09:23:09.084364 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-init-568b7cf6db-hz254" Mar 09 09:23:13 crc kubenswrapper[4792]: I0309 09:23:13.152545 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-init-568b7cf6db-hz254" Mar 09 09:23:13 crc kubenswrapper[4792]: I0309 09:23:13.178563 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-init-568b7cf6db-hz254" podStartSLOduration=6.22660768 podStartE2EDuration="11.178540306s" 
podCreationTimestamp="2026-03-09 09:23:02 +0000 UTC" firstStartedPulling="2026-03-09 09:23:03.648597565 +0000 UTC m=+948.678798317" lastFinishedPulling="2026-03-09 09:23:08.600530191 +0000 UTC m=+953.630730943" observedRunningTime="2026-03-09 09:23:09.125270629 +0000 UTC m=+954.155471401" watchObservedRunningTime="2026-03-09 09:23:13.178540306 +0000 UTC m=+958.208741068" Mar 09 09:23:13 crc kubenswrapper[4792]: I0309 09:23:13.214254 4792 patch_prober.go:28] interesting pod/machine-config-daemon-97tth container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 09:23:13 crc kubenswrapper[4792]: I0309 09:23:13.214312 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 09:23:32 crc kubenswrapper[4792]: I0309 09:23:32.488985 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-6db6876945-9jfp7"] Mar 09 09:23:32 crc kubenswrapper[4792]: I0309 09:23:32.491674 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-6db6876945-9jfp7" Mar 09 09:23:32 crc kubenswrapper[4792]: I0309 09:23:32.494091 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-j8ql9" Mar 09 09:23:32 crc kubenswrapper[4792]: I0309 09:23:32.505171 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-6db6876945-9jfp7"] Mar 09 09:23:32 crc kubenswrapper[4792]: I0309 09:23:32.514199 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-55d77d7b5c-7fdsl"] Mar 09 09:23:32 crc kubenswrapper[4792]: I0309 09:23:32.515029 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-7fdsl" Mar 09 09:23:32 crc kubenswrapper[4792]: I0309 09:23:32.526098 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-h969x" Mar 09 09:23:32 crc kubenswrapper[4792]: I0309 09:23:32.527997 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-5d87c9d997-r5d6p"] Mar 09 09:23:32 crc kubenswrapper[4792]: I0309 09:23:32.529035 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-r5d6p" Mar 09 09:23:32 crc kubenswrapper[4792]: I0309 09:23:32.535420 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-shxdn" Mar 09 09:23:32 crc kubenswrapper[4792]: I0309 09:23:32.555058 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zk6dh\" (UniqueName: \"kubernetes.io/projected/d9dc8da2-0584-4db0-ad3a-f1c59c2f6028-kube-api-access-zk6dh\") pod \"barbican-operator-controller-manager-6db6876945-9jfp7\" (UID: \"d9dc8da2-0584-4db0-ad3a-f1c59c2f6028\") " pod="openstack-operators/barbican-operator-controller-manager-6db6876945-9jfp7" Mar 09 09:23:32 crc kubenswrapper[4792]: I0309 09:23:32.555197 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hlstv\" (UniqueName: \"kubernetes.io/projected/89b0f1f9-11f1-4d01-a2b8-ca2f1fae3bb2-kube-api-access-hlstv\") pod \"cinder-operator-controller-manager-55d77d7b5c-7fdsl\" (UID: \"89b0f1f9-11f1-4d01-a2b8-ca2f1fae3bb2\") " pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-7fdsl" Mar 09 09:23:32 crc kubenswrapper[4792]: I0309 09:23:32.555226 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ghkjg\" (UniqueName: \"kubernetes.io/projected/20b2fb83-c944-4553-b506-9ff3c9c199f5-kube-api-access-ghkjg\") pod \"designate-operator-controller-manager-5d87c9d997-r5d6p\" (UID: \"20b2fb83-c944-4553-b506-9ff3c9c199f5\") " pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-r5d6p" Mar 09 09:23:32 crc kubenswrapper[4792]: I0309 09:23:32.566403 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-5d87c9d997-r5d6p"] Mar 09 09:23:32 crc kubenswrapper[4792]: 
I0309 09:23:32.593109 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-55d77d7b5c-7fdsl"] Mar 09 09:23:32 crc kubenswrapper[4792]: I0309 09:23:32.605869 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-64db6967f8-tfz6b"] Mar 09 09:23:32 crc kubenswrapper[4792]: I0309 09:23:32.606886 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-64db6967f8-tfz6b" Mar 09 09:23:32 crc kubenswrapper[4792]: I0309 09:23:32.610716 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-kblcj" Mar 09 09:23:32 crc kubenswrapper[4792]: I0309 09:23:32.611236 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-cf99c678f-lw2kp"] Mar 09 09:23:32 crc kubenswrapper[4792]: I0309 09:23:32.611968 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-cf99c678f-lw2kp" Mar 09 09:23:32 crc kubenswrapper[4792]: I0309 09:23:32.615259 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-sr2hk" Mar 09 09:23:32 crc kubenswrapper[4792]: I0309 09:23:32.635856 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-64db6967f8-tfz6b"] Mar 09 09:23:32 crc kubenswrapper[4792]: I0309 09:23:32.650340 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-cf99c678f-lw2kp"] Mar 09 09:23:32 crc kubenswrapper[4792]: I0309 09:23:32.659444 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jmt88\" (UniqueName: \"kubernetes.io/projected/b1140422-6cf3-4e92-95e2-6ea31179de28-kube-api-access-jmt88\") pod \"heat-operator-controller-manager-cf99c678f-lw2kp\" (UID: \"b1140422-6cf3-4e92-95e2-6ea31179de28\") " pod="openstack-operators/heat-operator-controller-manager-cf99c678f-lw2kp" Mar 09 09:23:32 crc kubenswrapper[4792]: I0309 09:23:32.659521 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zk6dh\" (UniqueName: \"kubernetes.io/projected/d9dc8da2-0584-4db0-ad3a-f1c59c2f6028-kube-api-access-zk6dh\") pod \"barbican-operator-controller-manager-6db6876945-9jfp7\" (UID: \"d9dc8da2-0584-4db0-ad3a-f1c59c2f6028\") " pod="openstack-operators/barbican-operator-controller-manager-6db6876945-9jfp7" Mar 09 09:23:32 crc kubenswrapper[4792]: I0309 09:23:32.659579 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xcfht\" (UniqueName: \"kubernetes.io/projected/b74999f3-cb46-4b35-a70f-71977b54d944-kube-api-access-xcfht\") pod \"glance-operator-controller-manager-64db6967f8-tfz6b\" (UID: 
\"b74999f3-cb46-4b35-a70f-71977b54d944\") " pod="openstack-operators/glance-operator-controller-manager-64db6967f8-tfz6b" Mar 09 09:23:32 crc kubenswrapper[4792]: I0309 09:23:32.659604 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hlstv\" (UniqueName: \"kubernetes.io/projected/89b0f1f9-11f1-4d01-a2b8-ca2f1fae3bb2-kube-api-access-hlstv\") pod \"cinder-operator-controller-manager-55d77d7b5c-7fdsl\" (UID: \"89b0f1f9-11f1-4d01-a2b8-ca2f1fae3bb2\") " pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-7fdsl" Mar 09 09:23:32 crc kubenswrapper[4792]: I0309 09:23:32.659630 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ghkjg\" (UniqueName: \"kubernetes.io/projected/20b2fb83-c944-4553-b506-9ff3c9c199f5-kube-api-access-ghkjg\") pod \"designate-operator-controller-manager-5d87c9d997-r5d6p\" (UID: \"20b2fb83-c944-4553-b506-9ff3c9c199f5\") " pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-r5d6p" Mar 09 09:23:32 crc kubenswrapper[4792]: I0309 09:23:32.683307 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-jhmcx"] Mar 09 09:23:32 crc kubenswrapper[4792]: I0309 09:23:32.684794 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-jhmcx" Mar 09 09:23:32 crc kubenswrapper[4792]: I0309 09:23:32.700449 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zk6dh\" (UniqueName: \"kubernetes.io/projected/d9dc8da2-0584-4db0-ad3a-f1c59c2f6028-kube-api-access-zk6dh\") pod \"barbican-operator-controller-manager-6db6876945-9jfp7\" (UID: \"d9dc8da2-0584-4db0-ad3a-f1c59c2f6028\") " pod="openstack-operators/barbican-operator-controller-manager-6db6876945-9jfp7" Mar 09 09:23:32 crc kubenswrapper[4792]: I0309 09:23:32.700535 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-l45tk" Mar 09 09:23:32 crc kubenswrapper[4792]: I0309 09:23:32.703608 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ghkjg\" (UniqueName: \"kubernetes.io/projected/20b2fb83-c944-4553-b506-9ff3c9c199f5-kube-api-access-ghkjg\") pod \"designate-operator-controller-manager-5d87c9d997-r5d6p\" (UID: \"20b2fb83-c944-4553-b506-9ff3c9c199f5\") " pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-r5d6p" Mar 09 09:23:32 crc kubenswrapper[4792]: I0309 09:23:32.709313 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-jhmcx"] Mar 09 09:23:32 crc kubenswrapper[4792]: I0309 09:23:32.710742 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hlstv\" (UniqueName: \"kubernetes.io/projected/89b0f1f9-11f1-4d01-a2b8-ca2f1fae3bb2-kube-api-access-hlstv\") pod \"cinder-operator-controller-manager-55d77d7b5c-7fdsl\" (UID: \"89b0f1f9-11f1-4d01-a2b8-ca2f1fae3bb2\") " pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-7fdsl" Mar 09 09:23:32 crc kubenswrapper[4792]: I0309 09:23:32.720136 4792 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack-operators/infra-operator-controller-manager-f7fcc58b9-swtlr"] Mar 09 09:23:32 crc kubenswrapper[4792]: I0309 09:23:32.721096 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-swtlr" Mar 09 09:23:32 crc kubenswrapper[4792]: I0309 09:23:32.726285 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-9gr2t" Mar 09 09:23:32 crc kubenswrapper[4792]: I0309 09:23:32.726633 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Mar 09 09:23:32 crc kubenswrapper[4792]: I0309 09:23:32.729220 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-f7fcc58b9-swtlr"] Mar 09 09:23:32 crc kubenswrapper[4792]: I0309 09:23:32.742545 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-545456dc4-x7n9b"] Mar 09 09:23:32 crc kubenswrapper[4792]: I0309 09:23:32.743494 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-545456dc4-x7n9b" Mar 09 09:23:32 crc kubenswrapper[4792]: I0309 09:23:32.749789 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-dvmf8" Mar 09 09:23:32 crc kubenswrapper[4792]: I0309 09:23:32.755844 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7c789f89c6-dktrj"] Mar 09 09:23:32 crc kubenswrapper[4792]: I0309 09:23:32.757196 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-dktrj" Mar 09 09:23:32 crc kubenswrapper[4792]: I0309 09:23:32.759222 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-4wfjz" Mar 09 09:23:32 crc kubenswrapper[4792]: I0309 09:23:32.760623 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jmt88\" (UniqueName: \"kubernetes.io/projected/b1140422-6cf3-4e92-95e2-6ea31179de28-kube-api-access-jmt88\") pod \"heat-operator-controller-manager-cf99c678f-lw2kp\" (UID: \"b1140422-6cf3-4e92-95e2-6ea31179de28\") " pod="openstack-operators/heat-operator-controller-manager-cf99c678f-lw2kp" Mar 09 09:23:32 crc kubenswrapper[4792]: I0309 09:23:32.760754 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qqmvh\" (UniqueName: \"kubernetes.io/projected/98ba9a2a-30d6-45f2-af47-2994c292fe05-kube-api-access-qqmvh\") pod \"horizon-operator-controller-manager-78bc7f9bd9-jhmcx\" (UID: \"98ba9a2a-30d6-45f2-af47-2994c292fe05\") " pod="openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-jhmcx" Mar 09 09:23:32 crc kubenswrapper[4792]: I0309 09:23:32.760834 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mm2k9\" (UniqueName: \"kubernetes.io/projected/fe547e1c-cb50-4541-b867-5154dae69ec3-kube-api-access-mm2k9\") pod \"infra-operator-controller-manager-f7fcc58b9-swtlr\" (UID: \"fe547e1c-cb50-4541-b867-5154dae69ec3\") " pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-swtlr" Mar 09 09:23:32 crc kubenswrapper[4792]: I0309 09:23:32.760927 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xcfht\" (UniqueName: \"kubernetes.io/projected/b74999f3-cb46-4b35-a70f-71977b54d944-kube-api-access-xcfht\") pod 
\"glance-operator-controller-manager-64db6967f8-tfz6b\" (UID: \"b74999f3-cb46-4b35-a70f-71977b54d944\") " pod="openstack-operators/glance-operator-controller-manager-64db6967f8-tfz6b" Mar 09 09:23:32 crc kubenswrapper[4792]: I0309 09:23:32.761351 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fe547e1c-cb50-4541-b867-5154dae69ec3-cert\") pod \"infra-operator-controller-manager-f7fcc58b9-swtlr\" (UID: \"fe547e1c-cb50-4541-b867-5154dae69ec3\") " pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-swtlr" Mar 09 09:23:32 crc kubenswrapper[4792]: I0309 09:23:32.761433 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zrgkt\" (UniqueName: \"kubernetes.io/projected/ac60ffe8-71d2-4ea1-bbc5-d377fc70d940-kube-api-access-zrgkt\") pod \"ironic-operator-controller-manager-545456dc4-x7n9b\" (UID: \"ac60ffe8-71d2-4ea1-bbc5-d377fc70d940\") " pod="openstack-operators/ironic-operator-controller-manager-545456dc4-x7n9b" Mar 09 09:23:32 crc kubenswrapper[4792]: I0309 09:23:32.765690 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-545456dc4-x7n9b"] Mar 09 09:23:32 crc kubenswrapper[4792]: I0309 09:23:32.788607 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7c789f89c6-dktrj"] Mar 09 09:23:32 crc kubenswrapper[4792]: I0309 09:23:32.806657 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-67d996989d-4775c"] Mar 09 09:23:32 crc kubenswrapper[4792]: I0309 09:23:32.807506 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-67d996989d-4775c" Mar 09 09:23:32 crc kubenswrapper[4792]: I0309 09:23:32.814448 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-hljsw" Mar 09 09:23:32 crc kubenswrapper[4792]: I0309 09:23:32.817316 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-6db6876945-9jfp7" Mar 09 09:23:32 crc kubenswrapper[4792]: I0309 09:23:32.826924 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-67d996989d-4775c"] Mar 09 09:23:32 crc kubenswrapper[4792]: I0309 09:23:32.836846 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-7fdsl" Mar 09 09:23:32 crc kubenswrapper[4792]: I0309 09:23:32.854147 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xcfht\" (UniqueName: \"kubernetes.io/projected/b74999f3-cb46-4b35-a70f-71977b54d944-kube-api-access-xcfht\") pod \"glance-operator-controller-manager-64db6967f8-tfz6b\" (UID: \"b74999f3-cb46-4b35-a70f-71977b54d944\") " pod="openstack-operators/glance-operator-controller-manager-64db6967f8-tfz6b" Mar 09 09:23:32 crc kubenswrapper[4792]: I0309 09:23:32.854314 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jmt88\" (UniqueName: \"kubernetes.io/projected/b1140422-6cf3-4e92-95e2-6ea31179de28-kube-api-access-jmt88\") pod \"heat-operator-controller-manager-cf99c678f-lw2kp\" (UID: \"b1140422-6cf3-4e92-95e2-6ea31179de28\") " pod="openstack-operators/heat-operator-controller-manager-cf99c678f-lw2kp" Mar 09 09:23:32 crc kubenswrapper[4792]: I0309 09:23:32.869275 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-ghq8t\" (UniqueName: \"kubernetes.io/projected/55f715a3-ef6e-40d8-9f9b-3100b2847b8d-kube-api-access-ghq8t\") pod \"keystone-operator-controller-manager-7c789f89c6-dktrj\" (UID: \"55f715a3-ef6e-40d8-9f9b-3100b2847b8d\") " pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-dktrj" Mar 09 09:23:32 crc kubenswrapper[4792]: I0309 09:23:32.869340 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-72ksw\" (UniqueName: \"kubernetes.io/projected/2c678a62-a744-4384-8403-618b566ed91e-kube-api-access-72ksw\") pod \"manila-operator-controller-manager-67d996989d-4775c\" (UID: \"2c678a62-a744-4384-8403-618b566ed91e\") " pod="openstack-operators/manila-operator-controller-manager-67d996989d-4775c" Mar 09 09:23:32 crc kubenswrapper[4792]: I0309 09:23:32.869377 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qqmvh\" (UniqueName: \"kubernetes.io/projected/98ba9a2a-30d6-45f2-af47-2994c292fe05-kube-api-access-qqmvh\") pod \"horizon-operator-controller-manager-78bc7f9bd9-jhmcx\" (UID: \"98ba9a2a-30d6-45f2-af47-2994c292fe05\") " pod="openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-jhmcx" Mar 09 09:23:32 crc kubenswrapper[4792]: I0309 09:23:32.869407 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mm2k9\" (UniqueName: \"kubernetes.io/projected/fe547e1c-cb50-4541-b867-5154dae69ec3-kube-api-access-mm2k9\") pod \"infra-operator-controller-manager-f7fcc58b9-swtlr\" (UID: \"fe547e1c-cb50-4541-b867-5154dae69ec3\") " pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-swtlr" Mar 09 09:23:32 crc kubenswrapper[4792]: I0309 09:23:32.869451 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fe547e1c-cb50-4541-b867-5154dae69ec3-cert\") pod 
\"infra-operator-controller-manager-f7fcc58b9-swtlr\" (UID: \"fe547e1c-cb50-4541-b867-5154dae69ec3\") " pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-swtlr" Mar 09 09:23:32 crc kubenswrapper[4792]: I0309 09:23:32.869475 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zrgkt\" (UniqueName: \"kubernetes.io/projected/ac60ffe8-71d2-4ea1-bbc5-d377fc70d940-kube-api-access-zrgkt\") pod \"ironic-operator-controller-manager-545456dc4-x7n9b\" (UID: \"ac60ffe8-71d2-4ea1-bbc5-d377fc70d940\") " pod="openstack-operators/ironic-operator-controller-manager-545456dc4-x7n9b" Mar 09 09:23:32 crc kubenswrapper[4792]: E0309 09:23:32.870227 4792 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 09 09:23:32 crc kubenswrapper[4792]: E0309 09:23:32.870280 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fe547e1c-cb50-4541-b867-5154dae69ec3-cert podName:fe547e1c-cb50-4541-b867-5154dae69ec3 nodeName:}" failed. No retries permitted until 2026-03-09 09:23:33.370259472 +0000 UTC m=+978.400460224 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/fe547e1c-cb50-4541-b867-5154dae69ec3-cert") pod "infra-operator-controller-manager-f7fcc58b9-swtlr" (UID: "fe547e1c-cb50-4541-b867-5154dae69ec3") : secret "infra-operator-webhook-server-cert" not found Mar 09 09:23:32 crc kubenswrapper[4792]: I0309 09:23:32.875215 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-r5d6p" Mar 09 09:23:32 crc kubenswrapper[4792]: I0309 09:23:32.880185 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-ckrbc"] Mar 09 09:23:32 crc kubenswrapper[4792]: I0309 09:23:32.882646 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-ckrbc" Mar 09 09:23:32 crc kubenswrapper[4792]: I0309 09:23:32.910289 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zrgkt\" (UniqueName: \"kubernetes.io/projected/ac60ffe8-71d2-4ea1-bbc5-d377fc70d940-kube-api-access-zrgkt\") pod \"ironic-operator-controller-manager-545456dc4-x7n9b\" (UID: \"ac60ffe8-71d2-4ea1-bbc5-d377fc70d940\") " pod="openstack-operators/ironic-operator-controller-manager-545456dc4-x7n9b" Mar 09 09:23:32 crc kubenswrapper[4792]: I0309 09:23:32.917639 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-ckrbc"] Mar 09 09:23:32 crc kubenswrapper[4792]: I0309 09:23:32.921999 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-s5qkp" Mar 09 09:23:32 crc kubenswrapper[4792]: I0309 09:23:32.923852 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qqmvh\" (UniqueName: \"kubernetes.io/projected/98ba9a2a-30d6-45f2-af47-2994c292fe05-kube-api-access-qqmvh\") pod \"horizon-operator-controller-manager-78bc7f9bd9-jhmcx\" (UID: \"98ba9a2a-30d6-45f2-af47-2994c292fe05\") " pod="openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-jhmcx" Mar 09 09:23:32 crc kubenswrapper[4792]: I0309 09:23:32.929691 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-64db6967f8-tfz6b" Mar 09 09:23:32 crc kubenswrapper[4792]: I0309 09:23:32.946229 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-74b6b5dc96-dpvjg"] Mar 09 09:23:32 crc kubenswrapper[4792]: I0309 09:23:32.947092 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-dpvjg" Mar 09 09:23:32 crc kubenswrapper[4792]: I0309 09:23:32.949635 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-cf99c678f-lw2kp" Mar 09 09:23:32 crc kubenswrapper[4792]: I0309 09:23:32.956680 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mm2k9\" (UniqueName: \"kubernetes.io/projected/fe547e1c-cb50-4541-b867-5154dae69ec3-kube-api-access-mm2k9\") pod \"infra-operator-controller-manager-f7fcc58b9-swtlr\" (UID: \"fe547e1c-cb50-4541-b867-5154dae69ec3\") " pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-swtlr" Mar 09 09:23:32 crc kubenswrapper[4792]: I0309 09:23:32.962353 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-4ff77" Mar 09 09:23:32 crc kubenswrapper[4792]: I0309 09:23:32.965490 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-54688575f-5k4db"] Mar 09 09:23:32 crc kubenswrapper[4792]: I0309 09:23:32.966408 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-54688575f-5k4db" Mar 09 09:23:32 crc kubenswrapper[4792]: I0309 09:23:32.972361 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g85vs\" (UniqueName: \"kubernetes.io/projected/e27b7b35-b064-4e02-99e6-cb34af5ff0e9-kube-api-access-g85vs\") pod \"mariadb-operator-controller-manager-7b6bfb6475-ckrbc\" (UID: \"e27b7b35-b064-4e02-99e6-cb34af5ff0e9\") " pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-ckrbc" Mar 09 09:23:32 crc kubenswrapper[4792]: I0309 09:23:32.972421 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ghq8t\" (UniqueName: \"kubernetes.io/projected/55f715a3-ef6e-40d8-9f9b-3100b2847b8d-kube-api-access-ghq8t\") pod \"keystone-operator-controller-manager-7c789f89c6-dktrj\" (UID: \"55f715a3-ef6e-40d8-9f9b-3100b2847b8d\") " pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-dktrj" Mar 09 09:23:32 crc kubenswrapper[4792]: I0309 09:23:32.972461 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-72ksw\" (UniqueName: \"kubernetes.io/projected/2c678a62-a744-4384-8403-618b566ed91e-kube-api-access-72ksw\") pod \"manila-operator-controller-manager-67d996989d-4775c\" (UID: \"2c678a62-a744-4384-8403-618b566ed91e\") " pod="openstack-operators/manila-operator-controller-manager-67d996989d-4775c" Mar 09 09:23:32 crc kubenswrapper[4792]: I0309 09:23:32.972544 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l5gg2\" (UniqueName: \"kubernetes.io/projected/9063ee68-9840-4f35-8d4d-44ab947477d5-kube-api-access-l5gg2\") pod \"nova-operator-controller-manager-74b6b5dc96-dpvjg\" (UID: \"9063ee68-9840-4f35-8d4d-44ab947477d5\") " pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-dpvjg" Mar 09 09:23:32 crc 
kubenswrapper[4792]: I0309 09:23:32.973545 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-wbzxw" Mar 09 09:23:32 crc kubenswrapper[4792]: I0309 09:23:32.982018 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-74b6b5dc96-dpvjg"] Mar 09 09:23:33 crc kubenswrapper[4792]: I0309 09:23:33.049830 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ghq8t\" (UniqueName: \"kubernetes.io/projected/55f715a3-ef6e-40d8-9f9b-3100b2847b8d-kube-api-access-ghq8t\") pod \"keystone-operator-controller-manager-7c789f89c6-dktrj\" (UID: \"55f715a3-ef6e-40d8-9f9b-3100b2847b8d\") " pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-dktrj" Mar 09 09:23:33 crc kubenswrapper[4792]: I0309 09:23:33.077414 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4xrbz\" (UniqueName: \"kubernetes.io/projected/8fd39edc-ff27-4feb-b138-ee11a440c0ca-kube-api-access-4xrbz\") pod \"neutron-operator-controller-manager-54688575f-5k4db\" (UID: \"8fd39edc-ff27-4feb-b138-ee11a440c0ca\") " pod="openstack-operators/neutron-operator-controller-manager-54688575f-5k4db" Mar 09 09:23:33 crc kubenswrapper[4792]: I0309 09:23:33.077492 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l5gg2\" (UniqueName: \"kubernetes.io/projected/9063ee68-9840-4f35-8d4d-44ab947477d5-kube-api-access-l5gg2\") pod \"nova-operator-controller-manager-74b6b5dc96-dpvjg\" (UID: \"9063ee68-9840-4f35-8d4d-44ab947477d5\") " pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-dpvjg" Mar 09 09:23:33 crc kubenswrapper[4792]: I0309 09:23:33.077570 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g85vs\" (UniqueName: 
\"kubernetes.io/projected/e27b7b35-b064-4e02-99e6-cb34af5ff0e9-kube-api-access-g85vs\") pod \"mariadb-operator-controller-manager-7b6bfb6475-ckrbc\" (UID: \"e27b7b35-b064-4e02-99e6-cb34af5ff0e9\") " pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-ckrbc" Mar 09 09:23:33 crc kubenswrapper[4792]: I0309 09:23:33.079536 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-jhmcx" Mar 09 09:23:33 crc kubenswrapper[4792]: I0309 09:23:33.080397 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-72ksw\" (UniqueName: \"kubernetes.io/projected/2c678a62-a744-4384-8403-618b566ed91e-kube-api-access-72ksw\") pod \"manila-operator-controller-manager-67d996989d-4775c\" (UID: \"2c678a62-a744-4384-8403-618b566ed91e\") " pod="openstack-operators/manila-operator-controller-manager-67d996989d-4775c" Mar 09 09:23:33 crc kubenswrapper[4792]: I0309 09:23:33.100475 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-r44dt"] Mar 09 09:23:33 crc kubenswrapper[4792]: I0309 09:23:33.113091 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-545456dc4-x7n9b" Mar 09 09:23:33 crc kubenswrapper[4792]: I0309 09:23:33.144880 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-dktrj" Mar 09 09:23:33 crc kubenswrapper[4792]: I0309 09:23:33.149497 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-r44dt" Mar 09 09:23:33 crc kubenswrapper[4792]: I0309 09:23:33.151105 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-54688575f-5k4db"] Mar 09 09:23:33 crc kubenswrapper[4792]: I0309 09:23:33.151372 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-l2b28" Mar 09 09:23:33 crc kubenswrapper[4792]: I0309 09:23:33.151702 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l5gg2\" (UniqueName: \"kubernetes.io/projected/9063ee68-9840-4f35-8d4d-44ab947477d5-kube-api-access-l5gg2\") pod \"nova-operator-controller-manager-74b6b5dc96-dpvjg\" (UID: \"9063ee68-9840-4f35-8d4d-44ab947477d5\") " pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-dpvjg" Mar 09 09:23:33 crc kubenswrapper[4792]: I0309 09:23:33.163110 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-r44dt"] Mar 09 09:23:33 crc kubenswrapper[4792]: I0309 09:23:33.190860 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ckmnn\" (UniqueName: \"kubernetes.io/projected/c28488b2-919b-4307-9a70-b2f5f1280e2a-kube-api-access-ckmnn\") pod \"octavia-operator-controller-manager-5d86c7ddb7-r44dt\" (UID: \"c28488b2-919b-4307-9a70-b2f5f1280e2a\") " pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-r44dt" Mar 09 09:23:33 crc kubenswrapper[4792]: I0309 09:23:33.191344 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4xrbz\" (UniqueName: \"kubernetes.io/projected/8fd39edc-ff27-4feb-b138-ee11a440c0ca-kube-api-access-4xrbz\") pod \"neutron-operator-controller-manager-54688575f-5k4db\" (UID: 
\"8fd39edc-ff27-4feb-b138-ee11a440c0ca\") " pod="openstack-operators/neutron-operator-controller-manager-54688575f-5k4db" Mar 09 09:23:33 crc kubenswrapper[4792]: I0309 09:23:33.197373 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g85vs\" (UniqueName: \"kubernetes.io/projected/e27b7b35-b064-4e02-99e6-cb34af5ff0e9-kube-api-access-g85vs\") pod \"mariadb-operator-controller-manager-7b6bfb6475-ckrbc\" (UID: \"e27b7b35-b064-4e02-99e6-cb34af5ff0e9\") " pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-ckrbc" Mar 09 09:23:33 crc kubenswrapper[4792]: I0309 09:23:33.206448 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-67d996989d-4775c" Mar 09 09:23:33 crc kubenswrapper[4792]: I0309 09:23:33.242337 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-6f64bd8c755s6r4"] Mar 09 09:23:33 crc kubenswrapper[4792]: I0309 09:23:33.244060 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f64bd8c755s6r4" Mar 09 09:23:33 crc kubenswrapper[4792]: I0309 09:23:33.254948 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-2fnnb" Mar 09 09:23:33 crc kubenswrapper[4792]: I0309 09:23:33.263299 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4xrbz\" (UniqueName: \"kubernetes.io/projected/8fd39edc-ff27-4feb-b138-ee11a440c0ca-kube-api-access-4xrbz\") pod \"neutron-operator-controller-manager-54688575f-5k4db\" (UID: \"8fd39edc-ff27-4feb-b138-ee11a440c0ca\") " pod="openstack-operators/neutron-operator-controller-manager-54688575f-5k4db" Mar 09 09:23:33 crc kubenswrapper[4792]: I0309 09:23:33.269350 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Mar 09 09:23:33 crc kubenswrapper[4792]: I0309 09:23:33.277708 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-ckrbc" Mar 09 09:23:33 crc kubenswrapper[4792]: I0309 09:23:33.293653 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ckmnn\" (UniqueName: \"kubernetes.io/projected/c28488b2-919b-4307-9a70-b2f5f1280e2a-kube-api-access-ckmnn\") pod \"octavia-operator-controller-manager-5d86c7ddb7-r44dt\" (UID: \"c28488b2-919b-4307-9a70-b2f5f1280e2a\") " pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-r44dt" Mar 09 09:23:33 crc kubenswrapper[4792]: I0309 09:23:33.293717 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9ca7aa92-3367-4c2e-a86e-33ba41fe81cb-cert\") pod \"openstack-baremetal-operator-controller-manager-6f64bd8c755s6r4\" (UID: \"9ca7aa92-3367-4c2e-a86e-33ba41fe81cb\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f64bd8c755s6r4" Mar 09 09:23:33 crc kubenswrapper[4792]: I0309 09:23:33.293754 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-slhmm\" (UniqueName: \"kubernetes.io/projected/9ca7aa92-3367-4c2e-a86e-33ba41fe81cb-kube-api-access-slhmm\") pod \"openstack-baremetal-operator-controller-manager-6f64bd8c755s6r4\" (UID: \"9ca7aa92-3367-4c2e-a86e-33ba41fe81cb\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f64bd8c755s6r4" Mar 09 09:23:33 crc kubenswrapper[4792]: I0309 09:23:33.350365 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ckmnn\" (UniqueName: \"kubernetes.io/projected/c28488b2-919b-4307-9a70-b2f5f1280e2a-kube-api-access-ckmnn\") pod \"octavia-operator-controller-manager-5d86c7ddb7-r44dt\" (UID: \"c28488b2-919b-4307-9a70-b2f5f1280e2a\") " pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-r44dt" Mar 09 
09:23:33 crc kubenswrapper[4792]: I0309 09:23:33.358856 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-dpvjg" Mar 09 09:23:33 crc kubenswrapper[4792]: I0309 09:23:33.359402 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-75684d597f-8vt8s"] Mar 09 09:23:33 crc kubenswrapper[4792]: I0309 09:23:33.360954 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-75684d597f-8vt8s" Mar 09 09:23:33 crc kubenswrapper[4792]: I0309 09:23:33.364879 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-j8f5f" Mar 09 09:23:33 crc kubenswrapper[4792]: I0309 09:23:33.382286 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-54688575f-5k4db" Mar 09 09:23:33 crc kubenswrapper[4792]: I0309 09:23:33.394576 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9ca7aa92-3367-4c2e-a86e-33ba41fe81cb-cert\") pod \"openstack-baremetal-operator-controller-manager-6f64bd8c755s6r4\" (UID: \"9ca7aa92-3367-4c2e-a86e-33ba41fe81cb\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f64bd8c755s6r4" Mar 09 09:23:33 crc kubenswrapper[4792]: I0309 09:23:33.394642 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-slhmm\" (UniqueName: \"kubernetes.io/projected/9ca7aa92-3367-4c2e-a86e-33ba41fe81cb-kube-api-access-slhmm\") pod \"openstack-baremetal-operator-controller-manager-6f64bd8c755s6r4\" (UID: \"9ca7aa92-3367-4c2e-a86e-33ba41fe81cb\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f64bd8c755s6r4" Mar 09 09:23:33 crc 
kubenswrapper[4792]: I0309 09:23:33.394668 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j867v\" (UniqueName: \"kubernetes.io/projected/82689eba-1f75-4e2e-8c27-a5b90e2805af-kube-api-access-j867v\") pod \"ovn-operator-controller-manager-75684d597f-8vt8s\" (UID: \"82689eba-1f75-4e2e-8c27-a5b90e2805af\") " pod="openstack-operators/ovn-operator-controller-manager-75684d597f-8vt8s" Mar 09 09:23:33 crc kubenswrapper[4792]: I0309 09:23:33.394697 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fe547e1c-cb50-4541-b867-5154dae69ec3-cert\") pod \"infra-operator-controller-manager-f7fcc58b9-swtlr\" (UID: \"fe547e1c-cb50-4541-b867-5154dae69ec3\") " pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-swtlr" Mar 09 09:23:33 crc kubenswrapper[4792]: E0309 09:23:33.394849 4792 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 09 09:23:33 crc kubenswrapper[4792]: E0309 09:23:33.394897 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fe547e1c-cb50-4541-b867-5154dae69ec3-cert podName:fe547e1c-cb50-4541-b867-5154dae69ec3 nodeName:}" failed. No retries permitted until 2026-03-09 09:23:34.394881378 +0000 UTC m=+979.425082130 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/fe547e1c-cb50-4541-b867-5154dae69ec3-cert") pod "infra-operator-controller-manager-f7fcc58b9-swtlr" (UID: "fe547e1c-cb50-4541-b867-5154dae69ec3") : secret "infra-operator-webhook-server-cert" not found Mar 09 09:23:33 crc kubenswrapper[4792]: E0309 09:23:33.395574 4792 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 09 09:23:33 crc kubenswrapper[4792]: E0309 09:23:33.395609 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9ca7aa92-3367-4c2e-a86e-33ba41fe81cb-cert podName:9ca7aa92-3367-4c2e-a86e-33ba41fe81cb nodeName:}" failed. No retries permitted until 2026-03-09 09:23:33.895601969 +0000 UTC m=+978.925802721 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9ca7aa92-3367-4c2e-a86e-33ba41fe81cb-cert") pod "openstack-baremetal-operator-controller-manager-6f64bd8c755s6r4" (UID: "9ca7aa92-3367-4c2e-a86e-33ba41fe81cb") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 09 09:23:33 crc kubenswrapper[4792]: I0309 09:23:33.438540 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-slhmm\" (UniqueName: \"kubernetes.io/projected/9ca7aa92-3367-4c2e-a86e-33ba41fe81cb-kube-api-access-slhmm\") pod \"openstack-baremetal-operator-controller-manager-6f64bd8c755s6r4\" (UID: \"9ca7aa92-3367-4c2e-a86e-33ba41fe81cb\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f64bd8c755s6r4" Mar 09 09:23:33 crc kubenswrapper[4792]: I0309 09:23:33.443379 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-6f64bd8c755s6r4"] Mar 09 09:23:33 crc kubenswrapper[4792]: I0309 09:23:33.452410 4792 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack-operators/ovn-operator-controller-manager-75684d597f-8vt8s"] Mar 09 09:23:33 crc kubenswrapper[4792]: I0309 09:23:33.462217 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-648564c9fc-z5tts"] Mar 09 09:23:33 crc kubenswrapper[4792]: I0309 09:23:33.463116 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-648564c9fc-z5tts" Mar 09 09:23:33 crc kubenswrapper[4792]: I0309 09:23:33.465330 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-j8fhm" Mar 09 09:23:33 crc kubenswrapper[4792]: I0309 09:23:33.471371 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-648564c9fc-z5tts"] Mar 09 09:23:33 crc kubenswrapper[4792]: I0309 09:23:33.485802 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-9b9ff9f4d-vhf7p"] Mar 09 09:23:33 crc kubenswrapper[4792]: I0309 09:23:33.486887 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-vhf7p" Mar 09 09:23:33 crc kubenswrapper[4792]: I0309 09:23:33.492556 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-jdbbp" Mar 09 09:23:33 crc kubenswrapper[4792]: I0309 09:23:33.495847 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dcdvf\" (UniqueName: \"kubernetes.io/projected/d53acf43-fee2-4bdf-9cdb-883641a56d48-kube-api-access-dcdvf\") pod \"placement-operator-controller-manager-648564c9fc-z5tts\" (UID: \"d53acf43-fee2-4bdf-9cdb-883641a56d48\") " pod="openstack-operators/placement-operator-controller-manager-648564c9fc-z5tts" Mar 09 09:23:33 crc kubenswrapper[4792]: I0309 09:23:33.495951 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j867v\" (UniqueName: \"kubernetes.io/projected/82689eba-1f75-4e2e-8c27-a5b90e2805af-kube-api-access-j867v\") pod \"ovn-operator-controller-manager-75684d597f-8vt8s\" (UID: \"82689eba-1f75-4e2e-8c27-a5b90e2805af\") " pod="openstack-operators/ovn-operator-controller-manager-75684d597f-8vt8s" Mar 09 09:23:33 crc kubenswrapper[4792]: I0309 09:23:33.510505 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-r44dt" Mar 09 09:23:33 crc kubenswrapper[4792]: I0309 09:23:33.516445 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-5fdb694969-mzpqx"] Mar 09 09:23:33 crc kubenswrapper[4792]: I0309 09:23:33.517364 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-5fdb694969-mzpqx"
Mar 09 09:23:33 crc kubenswrapper[4792]: I0309 09:23:33.521752 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-hrgkl"
Mar 09 09:23:33 crc kubenswrapper[4792]: I0309 09:23:33.542127 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j867v\" (UniqueName: \"kubernetes.io/projected/82689eba-1f75-4e2e-8c27-a5b90e2805af-kube-api-access-j867v\") pod \"ovn-operator-controller-manager-75684d597f-8vt8s\" (UID: \"82689eba-1f75-4e2e-8c27-a5b90e2805af\") " pod="openstack-operators/ovn-operator-controller-manager-75684d597f-8vt8s"
Mar 09 09:23:33 crc kubenswrapper[4792]: I0309 09:23:33.560024 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-9b9ff9f4d-vhf7p"]
Mar 09 09:23:33 crc kubenswrapper[4792]: I0309 09:23:33.594035 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-5fdb694969-mzpqx"]
Mar 09 09:23:33 crc kubenswrapper[4792]: I0309 09:23:33.604137 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bkbrz\" (UniqueName: \"kubernetes.io/projected/d4313901-b530-42e8-a975-d21aefbc0506-kube-api-access-bkbrz\") pod \"telemetry-operator-controller-manager-5fdb694969-mzpqx\" (UID: \"d4313901-b530-42e8-a975-d21aefbc0506\") " pod="openstack-operators/telemetry-operator-controller-manager-5fdb694969-mzpqx"
Mar 09 09:23:33 crc kubenswrapper[4792]: I0309 09:23:33.604192 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f4hls\" (UniqueName: \"kubernetes.io/projected/533287c3-78f0-46ea-baa9-fafb1ce7615b-kube-api-access-f4hls\") pod \"swift-operator-controller-manager-9b9ff9f4d-vhf7p\" (UID: \"533287c3-78f0-46ea-baa9-fafb1ce7615b\") " pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-vhf7p"
Mar 09 09:23:33 crc kubenswrapper[4792]: I0309 09:23:33.604256 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dcdvf\" (UniqueName: \"kubernetes.io/projected/d53acf43-fee2-4bdf-9cdb-883641a56d48-kube-api-access-dcdvf\") pod \"placement-operator-controller-manager-648564c9fc-z5tts\" (UID: \"d53acf43-fee2-4bdf-9cdb-883641a56d48\") " pod="openstack-operators/placement-operator-controller-manager-648564c9fc-z5tts"
Mar 09 09:23:33 crc kubenswrapper[4792]: I0309 09:23:33.636234 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-55b5ff4dbb-z4lgh"]
Mar 09 09:23:33 crc kubenswrapper[4792]: I0309 09:23:33.640384 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-z4lgh"
Mar 09 09:23:33 crc kubenswrapper[4792]: I0309 09:23:33.643422 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-cc9lq"
Mar 09 09:23:33 crc kubenswrapper[4792]: I0309 09:23:33.712848 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dcdvf\" (UniqueName: \"kubernetes.io/projected/d53acf43-fee2-4bdf-9cdb-883641a56d48-kube-api-access-dcdvf\") pod \"placement-operator-controller-manager-648564c9fc-z5tts\" (UID: \"d53acf43-fee2-4bdf-9cdb-883641a56d48\") " pod="openstack-operators/placement-operator-controller-manager-648564c9fc-z5tts"
Mar 09 09:23:33 crc kubenswrapper[4792]: I0309 09:23:33.713337 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bkbrz\" (UniqueName: \"kubernetes.io/projected/d4313901-b530-42e8-a975-d21aefbc0506-kube-api-access-bkbrz\") pod \"telemetry-operator-controller-manager-5fdb694969-mzpqx\" (UID: \"d4313901-b530-42e8-a975-d21aefbc0506\") " pod="openstack-operators/telemetry-operator-controller-manager-5fdb694969-mzpqx"
Mar 09 09:23:33 crc kubenswrapper[4792]: I0309 09:23:33.713378 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f4hls\" (UniqueName: \"kubernetes.io/projected/533287c3-78f0-46ea-baa9-fafb1ce7615b-kube-api-access-f4hls\") pod \"swift-operator-controller-manager-9b9ff9f4d-vhf7p\" (UID: \"533287c3-78f0-46ea-baa9-fafb1ce7615b\") " pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-vhf7p"
Mar 09 09:23:33 crc kubenswrapper[4792]: I0309 09:23:33.713409 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zkgff\" (UniqueName: \"kubernetes.io/projected/e56405f7-7121-4d52-b276-3feeddabd667-kube-api-access-zkgff\") pod \"test-operator-controller-manager-55b5ff4dbb-z4lgh\" (UID: \"e56405f7-7121-4d52-b276-3feeddabd667\") " pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-z4lgh"
Mar 09 09:23:33 crc kubenswrapper[4792]: I0309 09:23:33.753253 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-75684d597f-8vt8s"
Mar 09 09:23:33 crc kubenswrapper[4792]: I0309 09:23:33.782817 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-bccc79885-vj8ds"]
Mar 09 09:23:33 crc kubenswrapper[4792]: I0309 09:23:33.783484 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-55b5ff4dbb-z4lgh"]
Mar 09 09:23:33 crc kubenswrapper[4792]: I0309 09:23:33.783502 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-bccc79885-vj8ds"]
Mar 09 09:23:33 crc kubenswrapper[4792]: I0309 09:23:33.783515 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-59b6c9788f-qh4rf"]
Mar 09 09:23:33 crc kubenswrapper[4792]: I0309 09:23:33.784020 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-59b6c9788f-qh4rf"
Mar 09 09:23:33 crc kubenswrapper[4792]: I0309 09:23:33.784404 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-vj8ds"
Mar 09 09:23:33 crc kubenswrapper[4792]: I0309 09:23:33.796224 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert"
Mar 09 09:23:33 crc kubenswrapper[4792]: I0309 09:23:33.796389 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert"
Mar 09 09:23:33 crc kubenswrapper[4792]: I0309 09:23:33.796496 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-72dh5"
Mar 09 09:23:33 crc kubenswrapper[4792]: I0309 09:23:33.796591 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-7ts8l"
Mar 09 09:23:33 crc kubenswrapper[4792]: I0309 09:23:33.798201 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bkbrz\" (UniqueName: \"kubernetes.io/projected/d4313901-b530-42e8-a975-d21aefbc0506-kube-api-access-bkbrz\") pod \"telemetry-operator-controller-manager-5fdb694969-mzpqx\" (UID: \"d4313901-b530-42e8-a975-d21aefbc0506\") " pod="openstack-operators/telemetry-operator-controller-manager-5fdb694969-mzpqx"
Mar 09 09:23:33 crc kubenswrapper[4792]: I0309 09:23:33.807529 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-59b6c9788f-qh4rf"]
Mar 09 09:23:33 crc kubenswrapper[4792]: I0309 09:23:33.814924 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9jp2l\" (UniqueName: \"kubernetes.io/projected/e42c0d5f-7c0c-420f-a14b-59316b524101-kube-api-access-9jp2l\") pod \"openstack-operator-controller-manager-59b6c9788f-qh4rf\" (UID: \"e42c0d5f-7c0c-420f-a14b-59316b524101\") " pod="openstack-operators/openstack-operator-controller-manager-59b6c9788f-qh4rf"
Mar 09 09:23:33 crc kubenswrapper[4792]: I0309 09:23:33.814973 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/e42c0d5f-7c0c-420f-a14b-59316b524101-webhook-certs\") pod \"openstack-operator-controller-manager-59b6c9788f-qh4rf\" (UID: \"e42c0d5f-7c0c-420f-a14b-59316b524101\") " pod="openstack-operators/openstack-operator-controller-manager-59b6c9788f-qh4rf"
Mar 09 09:23:33 crc kubenswrapper[4792]: I0309 09:23:33.815052 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zkgff\" (UniqueName: \"kubernetes.io/projected/e56405f7-7121-4d52-b276-3feeddabd667-kube-api-access-zkgff\") pod \"test-operator-controller-manager-55b5ff4dbb-z4lgh\" (UID: \"e56405f7-7121-4d52-b276-3feeddabd667\") " pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-z4lgh"
Mar 09 09:23:33 crc kubenswrapper[4792]: I0309 09:23:33.815108 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e42c0d5f-7c0c-420f-a14b-59316b524101-metrics-certs\") pod \"openstack-operator-controller-manager-59b6c9788f-qh4rf\" (UID: \"e42c0d5f-7c0c-420f-a14b-59316b524101\") " pod="openstack-operators/openstack-operator-controller-manager-59b6c9788f-qh4rf"
Mar 09 09:23:33 crc kubenswrapper[4792]: I0309 09:23:33.815132 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8pr76\" (UniqueName: \"kubernetes.io/projected/41f3c31e-77a7-4912-a933-04b32c0db0dc-kube-api-access-8pr76\") pod \"watcher-operator-controller-manager-bccc79885-vj8ds\" (UID: \"41f3c31e-77a7-4912-a933-04b32c0db0dc\") " pod="openstack-operators/watcher-operator-controller-manager-bccc79885-vj8ds"
Mar 09 09:23:33 crc kubenswrapper[4792]: I0309 09:23:33.822188 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f4hls\" (UniqueName: \"kubernetes.io/projected/533287c3-78f0-46ea-baa9-fafb1ce7615b-kube-api-access-f4hls\") pod \"swift-operator-controller-manager-9b9ff9f4d-vhf7p\" (UID: \"533287c3-78f0-46ea-baa9-fafb1ce7615b\") " pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-vhf7p"
Mar 09 09:23:33 crc kubenswrapper[4792]: I0309 09:23:33.843308 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-648564c9fc-z5tts"
Mar 09 09:23:33 crc kubenswrapper[4792]: I0309 09:23:33.887843 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zkgff\" (UniqueName: \"kubernetes.io/projected/e56405f7-7121-4d52-b276-3feeddabd667-kube-api-access-zkgff\") pod \"test-operator-controller-manager-55b5ff4dbb-z4lgh\" (UID: \"e56405f7-7121-4d52-b276-3feeddabd667\") " pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-z4lgh"
Mar 09 09:23:33 crc kubenswrapper[4792]: I0309 09:23:33.880044 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-kjtwp"]
Mar 09 09:23:33 crc kubenswrapper[4792]: I0309 09:23:33.893388 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-kjtwp"
Mar 09 09:23:33 crc kubenswrapper[4792]: I0309 09:23:33.917164 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-vhf7p"
Mar 09 09:23:33 crc kubenswrapper[4792]: I0309 09:23:33.921160 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9jp2l\" (UniqueName: \"kubernetes.io/projected/e42c0d5f-7c0c-420f-a14b-59316b524101-kube-api-access-9jp2l\") pod \"openstack-operator-controller-manager-59b6c9788f-qh4rf\" (UID: \"e42c0d5f-7c0c-420f-a14b-59316b524101\") " pod="openstack-operators/openstack-operator-controller-manager-59b6c9788f-qh4rf"
Mar 09 09:23:33 crc kubenswrapper[4792]: I0309 09:23:33.921226 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/e42c0d5f-7c0c-420f-a14b-59316b524101-webhook-certs\") pod \"openstack-operator-controller-manager-59b6c9788f-qh4rf\" (UID: \"e42c0d5f-7c0c-420f-a14b-59316b524101\") " pod="openstack-operators/openstack-operator-controller-manager-59b6c9788f-qh4rf"
Mar 09 09:23:33 crc kubenswrapper[4792]: I0309 09:23:33.921282 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9ca7aa92-3367-4c2e-a86e-33ba41fe81cb-cert\") pod \"openstack-baremetal-operator-controller-manager-6f64bd8c755s6r4\" (UID: \"9ca7aa92-3367-4c2e-a86e-33ba41fe81cb\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f64bd8c755s6r4"
Mar 09 09:23:33 crc kubenswrapper[4792]: I0309 09:23:33.921431 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e42c0d5f-7c0c-420f-a14b-59316b524101-metrics-certs\") pod \"openstack-operator-controller-manager-59b6c9788f-qh4rf\" (UID: \"e42c0d5f-7c0c-420f-a14b-59316b524101\") " pod="openstack-operators/openstack-operator-controller-manager-59b6c9788f-qh4rf"
Mar 09 09:23:33 crc kubenswrapper[4792]: I0309 09:23:33.921489 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8pr76\" (UniqueName: \"kubernetes.io/projected/41f3c31e-77a7-4912-a933-04b32c0db0dc-kube-api-access-8pr76\") pod \"watcher-operator-controller-manager-bccc79885-vj8ds\" (UID: \"41f3c31e-77a7-4912-a933-04b32c0db0dc\") " pod="openstack-operators/watcher-operator-controller-manager-bccc79885-vj8ds"
Mar 09 09:23:33 crc kubenswrapper[4792]: E0309 09:23:33.928325 4792 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found
Mar 09 09:23:33 crc kubenswrapper[4792]: E0309 09:23:33.928433 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e42c0d5f-7c0c-420f-a14b-59316b524101-webhook-certs podName:e42c0d5f-7c0c-420f-a14b-59316b524101 nodeName:}" failed. No retries permitted until 2026-03-09 09:23:34.428399195 +0000 UTC m=+979.458599937 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/e42c0d5f-7c0c-420f-a14b-59316b524101-webhook-certs") pod "openstack-operator-controller-manager-59b6c9788f-qh4rf" (UID: "e42c0d5f-7c0c-420f-a14b-59316b524101") : secret "webhook-server-cert" not found
Mar 09 09:23:33 crc kubenswrapper[4792]: E0309 09:23:33.936118 4792 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found
Mar 09 09:23:33 crc kubenswrapper[4792]: E0309 09:23:33.936225 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9ca7aa92-3367-4c2e-a86e-33ba41fe81cb-cert podName:9ca7aa92-3367-4c2e-a86e-33ba41fe81cb nodeName:}" failed. No retries permitted until 2026-03-09 09:23:34.936195163 +0000 UTC m=+979.966395915 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9ca7aa92-3367-4c2e-a86e-33ba41fe81cb-cert") pod "openstack-baremetal-operator-controller-manager-6f64bd8c755s6r4" (UID: "9ca7aa92-3367-4c2e-a86e-33ba41fe81cb") : secret "openstack-baremetal-operator-webhook-server-cert" not found
Mar 09 09:23:33 crc kubenswrapper[4792]: E0309 09:23:33.937298 4792 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found
Mar 09 09:23:33 crc kubenswrapper[4792]: E0309 09:23:33.937341 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e42c0d5f-7c0c-420f-a14b-59316b524101-metrics-certs podName:e42c0d5f-7c0c-420f-a14b-59316b524101 nodeName:}" failed. No retries permitted until 2026-03-09 09:23:34.437330467 +0000 UTC m=+979.467531219 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e42c0d5f-7c0c-420f-a14b-59316b524101-metrics-certs") pod "openstack-operator-controller-manager-59b6c9788f-qh4rf" (UID: "e42c0d5f-7c0c-420f-a14b-59316b524101") : secret "metrics-server-cert" not found
Mar 09 09:23:33 crc kubenswrapper[4792]: I0309 09:23:33.947414 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-5fdb694969-mzpqx"
Mar 09 09:23:33 crc kubenswrapper[4792]: I0309 09:23:33.953812 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-wlrrk"
Mar 09 09:23:33 crc kubenswrapper[4792]: I0309 09:23:33.969146 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-kjtwp"]
Mar 09 09:23:33 crc kubenswrapper[4792]: I0309 09:23:33.978546 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9jp2l\" (UniqueName: \"kubernetes.io/projected/e42c0d5f-7c0c-420f-a14b-59316b524101-kube-api-access-9jp2l\") pod \"openstack-operator-controller-manager-59b6c9788f-qh4rf\" (UID: \"e42c0d5f-7c0c-420f-a14b-59316b524101\") " pod="openstack-operators/openstack-operator-controller-manager-59b6c9788f-qh4rf"
Mar 09 09:23:33 crc kubenswrapper[4792]: I0309 09:23:33.988987 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-z4lgh"
Mar 09 09:23:34 crc kubenswrapper[4792]: I0309 09:23:34.006326 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-6db6876945-9jfp7"]
Mar 09 09:23:34 crc kubenswrapper[4792]: I0309 09:23:34.007178 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8pr76\" (UniqueName: \"kubernetes.io/projected/41f3c31e-77a7-4912-a933-04b32c0db0dc-kube-api-access-8pr76\") pod \"watcher-operator-controller-manager-bccc79885-vj8ds\" (UID: \"41f3c31e-77a7-4912-a933-04b32c0db0dc\") " pod="openstack-operators/watcher-operator-controller-manager-bccc79885-vj8ds"
Mar 09 09:23:34 crc kubenswrapper[4792]: I0309 09:23:34.023443 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x9p8x\" (UniqueName: \"kubernetes.io/projected/92a6c902-5189-421e-b1a1-ed3e64e7bca4-kube-api-access-x9p8x\") pod \"rabbitmq-cluster-operator-manager-668c99d594-kjtwp\" (UID: \"92a6c902-5189-421e-b1a1-ed3e64e7bca4\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-kjtwp"
Mar 09 09:23:34 crc kubenswrapper[4792]: I0309 09:23:34.125890 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x9p8x\" (UniqueName: \"kubernetes.io/projected/92a6c902-5189-421e-b1a1-ed3e64e7bca4-kube-api-access-x9p8x\") pod \"rabbitmq-cluster-operator-manager-668c99d594-kjtwp\" (UID: \"92a6c902-5189-421e-b1a1-ed3e64e7bca4\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-kjtwp"
Mar 09 09:23:34 crc kubenswrapper[4792]: I0309 09:23:34.160851 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x9p8x\" (UniqueName: \"kubernetes.io/projected/92a6c902-5189-421e-b1a1-ed3e64e7bca4-kube-api-access-x9p8x\") pod \"rabbitmq-cluster-operator-manager-668c99d594-kjtwp\" (UID: \"92a6c902-5189-421e-b1a1-ed3e64e7bca4\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-kjtwp"
Mar 09 09:23:34 crc kubenswrapper[4792]: I0309 09:23:34.299416 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-vj8ds"
Mar 09 09:23:34 crc kubenswrapper[4792]: I0309 09:23:34.333050 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-6db6876945-9jfp7" event={"ID":"d9dc8da2-0584-4db0-ad3a-f1c59c2f6028","Type":"ContainerStarted","Data":"81547ab474d689133515339bc25d40e3de029446278dcb4f66b984df3ccf48da"}
Mar 09 09:23:34 crc kubenswrapper[4792]: I0309 09:23:34.400045 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-55d77d7b5c-7fdsl"]
Mar 09 09:23:34 crc kubenswrapper[4792]: I0309 09:23:34.404267 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-kjtwp"
Mar 09 09:23:34 crc kubenswrapper[4792]: I0309 09:23:34.442449 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/e42c0d5f-7c0c-420f-a14b-59316b524101-webhook-certs\") pod \"openstack-operator-controller-manager-59b6c9788f-qh4rf\" (UID: \"e42c0d5f-7c0c-420f-a14b-59316b524101\") " pod="openstack-operators/openstack-operator-controller-manager-59b6c9788f-qh4rf"
Mar 09 09:23:34 crc kubenswrapper[4792]: E0309 09:23:34.442586 4792 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found
Mar 09 09:23:34 crc kubenswrapper[4792]: E0309 09:23:34.442722 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e42c0d5f-7c0c-420f-a14b-59316b524101-webhook-certs podName:e42c0d5f-7c0c-420f-a14b-59316b524101 nodeName:}" failed. No retries permitted until 2026-03-09 09:23:35.442706261 +0000 UTC m=+980.472907013 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/e42c0d5f-7c0c-420f-a14b-59316b524101-webhook-certs") pod "openstack-operator-controller-manager-59b6c9788f-qh4rf" (UID: "e42c0d5f-7c0c-420f-a14b-59316b524101") : secret "webhook-server-cert" not found
Mar 09 09:23:34 crc kubenswrapper[4792]: I0309 09:23:34.443103 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fe547e1c-cb50-4541-b867-5154dae69ec3-cert\") pod \"infra-operator-controller-manager-f7fcc58b9-swtlr\" (UID: \"fe547e1c-cb50-4541-b867-5154dae69ec3\") " pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-swtlr"
Mar 09 09:23:34 crc kubenswrapper[4792]: E0309 09:23:34.443188 4792 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found
Mar 09 09:23:34 crc kubenswrapper[4792]: E0309 09:23:34.443218 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fe547e1c-cb50-4541-b867-5154dae69ec3-cert podName:fe547e1c-cb50-4541-b867-5154dae69ec3 nodeName:}" failed. No retries permitted until 2026-03-09 09:23:36.443210595 +0000 UTC m=+981.473411347 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/fe547e1c-cb50-4541-b867-5154dae69ec3-cert") pod "infra-operator-controller-manager-f7fcc58b9-swtlr" (UID: "fe547e1c-cb50-4541-b867-5154dae69ec3") : secret "infra-operator-webhook-server-cert" not found
Mar 09 09:23:34 crc kubenswrapper[4792]: I0309 09:23:34.443250 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e42c0d5f-7c0c-420f-a14b-59316b524101-metrics-certs\") pod \"openstack-operator-controller-manager-59b6c9788f-qh4rf\" (UID: \"e42c0d5f-7c0c-420f-a14b-59316b524101\") " pod="openstack-operators/openstack-operator-controller-manager-59b6c9788f-qh4rf"
Mar 09 09:23:34 crc kubenswrapper[4792]: E0309 09:23:34.443331 4792 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found
Mar 09 09:23:34 crc kubenswrapper[4792]: E0309 09:23:34.443363 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e42c0d5f-7c0c-420f-a14b-59316b524101-metrics-certs podName:e42c0d5f-7c0c-420f-a14b-59316b524101 nodeName:}" failed. No retries permitted until 2026-03-09 09:23:35.443356919 +0000 UTC m=+980.473557671 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e42c0d5f-7c0c-420f-a14b-59316b524101-metrics-certs") pod "openstack-operator-controller-manager-59b6c9788f-qh4rf" (UID: "e42c0d5f-7c0c-420f-a14b-59316b524101") : secret "metrics-server-cert" not found
Mar 09 09:23:34 crc kubenswrapper[4792]: I0309 09:23:34.529514 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7c789f89c6-dktrj"]
Mar 09 09:23:34 crc kubenswrapper[4792]: I0309 09:23:34.645786 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-5d87c9d997-r5d6p"]
Mar 09 09:23:34 crc kubenswrapper[4792]: I0309 09:23:34.836417 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-64db6967f8-tfz6b"]
Mar 09 09:23:34 crc kubenswrapper[4792]: I0309 09:23:34.906458 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-cf99c678f-lw2kp"]
Mar 09 09:23:34 crc kubenswrapper[4792]: I0309 09:23:34.932560 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-545456dc4-x7n9b"]
Mar 09 09:23:34 crc kubenswrapper[4792]: W0309 09:23:34.948935 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb1140422_6cf3_4e92_95e2_6ea31179de28.slice/crio-0e5f9d13f3bf1f5f1cd067c19f0437979f2b5233a872c7d34a2dda145f6a6b4a WatchSource:0}: Error finding container 0e5f9d13f3bf1f5f1cd067c19f0437979f2b5233a872c7d34a2dda145f6a6b4a: Status 404 returned error can't find the container with id 0e5f9d13f3bf1f5f1cd067c19f0437979f2b5233a872c7d34a2dda145f6a6b4a
Mar 09 09:23:34 crc kubenswrapper[4792]: W0309 09:23:34.961223 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podac60ffe8_71d2_4ea1_bbc5_d377fc70d940.slice/crio-4605ef5b1dbb7df25f410027cda62bc176fc985d8d5d51bba8c93019c6e6ca4c WatchSource:0}: Error finding container 4605ef5b1dbb7df25f410027cda62bc176fc985d8d5d51bba8c93019c6e6ca4c: Status 404 returned error can't find the container with id 4605ef5b1dbb7df25f410027cda62bc176fc985d8d5d51bba8c93019c6e6ca4c
Mar 09 09:23:34 crc kubenswrapper[4792]: I0309 09:23:34.963820 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9ca7aa92-3367-4c2e-a86e-33ba41fe81cb-cert\") pod \"openstack-baremetal-operator-controller-manager-6f64bd8c755s6r4\" (UID: \"9ca7aa92-3367-4c2e-a86e-33ba41fe81cb\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f64bd8c755s6r4"
Mar 09 09:23:34 crc kubenswrapper[4792]: E0309 09:23:34.964280 4792 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found
Mar 09 09:23:34 crc kubenswrapper[4792]: E0309 09:23:34.964331 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9ca7aa92-3367-4c2e-a86e-33ba41fe81cb-cert podName:9ca7aa92-3367-4c2e-a86e-33ba41fe81cb nodeName:}" failed. No retries permitted until 2026-03-09 09:23:36.964313149 +0000 UTC m=+981.994513901 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9ca7aa92-3367-4c2e-a86e-33ba41fe81cb-cert") pod "openstack-baremetal-operator-controller-manager-6f64bd8c755s6r4" (UID: "9ca7aa92-3367-4c2e-a86e-33ba41fe81cb") : secret "openstack-baremetal-operator-webhook-server-cert" not found
Mar 09 09:23:35 crc kubenswrapper[4792]: I0309 09:23:35.061279 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-jhmcx"]
Mar 09 09:23:35 crc kubenswrapper[4792]: I0309 09:23:35.081703 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-ckrbc"]
Mar 09 09:23:35 crc kubenswrapper[4792]: W0309 09:23:35.095506 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod98ba9a2a_30d6_45f2_af47_2994c292fe05.slice/crio-dd80f28952769ba9a02a3c2241115ce93c9733df7f3491191619243576e24634 WatchSource:0}: Error finding container dd80f28952769ba9a02a3c2241115ce93c9733df7f3491191619243576e24634: Status 404 returned error can't find the container with id dd80f28952769ba9a02a3c2241115ce93c9733df7f3491191619243576e24634
Mar 09 09:23:35 crc kubenswrapper[4792]: I0309 09:23:35.191314 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-54688575f-5k4db"]
Mar 09 09:23:35 crc kubenswrapper[4792]: W0309 09:23:35.199115 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8fd39edc_ff27_4feb_b138_ee11a440c0ca.slice/crio-e59d635387ddb5c6f5531f2edc6d0eeae83fc0b88ded775ba5d884e5e8672fa4 WatchSource:0}: Error finding container e59d635387ddb5c6f5531f2edc6d0eeae83fc0b88ded775ba5d884e5e8672fa4: Status 404 returned error can't find the container with id e59d635387ddb5c6f5531f2edc6d0eeae83fc0b88ded775ba5d884e5e8672fa4
Mar 09 09:23:35 crc kubenswrapper[4792]: I0309 09:23:35.284825 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-67d996989d-4775c"]
Mar 09 09:23:35 crc kubenswrapper[4792]: W0309 09:23:35.302361 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2c678a62_a744_4384_8403_618b566ed91e.slice/crio-3053c2cb41696740e1019a7ff80b6d91c9e54f98df627308a954703ff7b272b7 WatchSource:0}: Error finding container 3053c2cb41696740e1019a7ff80b6d91c9e54f98df627308a954703ff7b272b7: Status 404 returned error can't find the container with id 3053c2cb41696740e1019a7ff80b6d91c9e54f98df627308a954703ff7b272b7
Mar 09 09:23:35 crc kubenswrapper[4792]: I0309 09:23:35.347211 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-dktrj" event={"ID":"55f715a3-ef6e-40d8-9f9b-3100b2847b8d","Type":"ContainerStarted","Data":"d6d77678bbf9e9a3882fcdea502e983eecd118ecf810391f1b070a7b44584fa2"}
Mar 09 09:23:35 crc kubenswrapper[4792]: I0309 09:23:35.349722 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-jhmcx" event={"ID":"98ba9a2a-30d6-45f2-af47-2994c292fe05","Type":"ContainerStarted","Data":"dd80f28952769ba9a02a3c2241115ce93c9733df7f3491191619243576e24634"}
Mar 09 09:23:35 crc kubenswrapper[4792]: I0309 09:23:35.351127 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-67d996989d-4775c" event={"ID":"2c678a62-a744-4384-8403-618b566ed91e","Type":"ContainerStarted","Data":"3053c2cb41696740e1019a7ff80b6d91c9e54f98df627308a954703ff7b272b7"}
Mar 09 09:23:35 crc kubenswrapper[4792]: I0309 09:23:35.356553 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-7fdsl" event={"ID":"89b0f1f9-11f1-4d01-a2b8-ca2f1fae3bb2","Type":"ContainerStarted","Data":"792e6943c5e2c5da5386891bb624f35b571e590acb3fbac5883db2946278bfc0"}
Mar 09 09:23:35 crc kubenswrapper[4792]: I0309 09:23:35.368099 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-9b9ff9f4d-vhf7p"]
Mar 09 09:23:35 crc kubenswrapper[4792]: I0309 09:23:35.372400 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-54688575f-5k4db" event={"ID":"8fd39edc-ff27-4feb-b138-ee11a440c0ca","Type":"ContainerStarted","Data":"e59d635387ddb5c6f5531f2edc6d0eeae83fc0b88ded775ba5d884e5e8672fa4"}
Mar 09 09:23:35 crc kubenswrapper[4792]: I0309 09:23:35.373351 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-545456dc4-x7n9b" event={"ID":"ac60ffe8-71d2-4ea1-bbc5-d377fc70d940","Type":"ContainerStarted","Data":"4605ef5b1dbb7df25f410027cda62bc176fc985d8d5d51bba8c93019c6e6ca4c"}
Mar 09 09:23:35 crc kubenswrapper[4792]: I0309 09:23:35.374442 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-r5d6p" event={"ID":"20b2fb83-c944-4553-b506-9ff3c9c199f5","Type":"ContainerStarted","Data":"a5beccaa75d8dfac8cc582e34ea56389789af779bbd4dd09895cd8f66c23282c"}
Mar 09 09:23:35 crc kubenswrapper[4792]: I0309 09:23:35.376443 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-ckrbc" event={"ID":"e27b7b35-b064-4e02-99e6-cb34af5ff0e9","Type":"ContainerStarted","Data":"7ad6462b6ecda6dfb468edcb7b776f6a99af4b4cc279799e02ac52c7758c6e8c"}
Mar 09 09:23:35 crc kubenswrapper[4792]: I0309 09:23:35.394943 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-64db6967f8-tfz6b" event={"ID":"b74999f3-cb46-4b35-a70f-71977b54d944","Type":"ContainerStarted","Data":"57213bf5cee75486b104a64992c85c4c40d53fcefe7b8f98019e64a9054fa257"}
Mar 09 09:23:35 crc kubenswrapper[4792]: I0309 09:23:35.404114 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-74b6b5dc96-dpvjg"]
Mar 09 09:23:35 crc kubenswrapper[4792]: I0309 09:23:35.412105 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-r44dt"]
Mar 09 09:23:35 crc kubenswrapper[4792]: I0309 09:23:35.418846 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-cf99c678f-lw2kp" event={"ID":"b1140422-6cf3-4e92-95e2-6ea31179de28","Type":"ContainerStarted","Data":"0e5f9d13f3bf1f5f1cd067c19f0437979f2b5233a872c7d34a2dda145f6a6b4a"}
Mar 09 09:23:35 crc kubenswrapper[4792]: W0309 09:23:35.424137 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod533287c3_78f0_46ea_baa9_fafb1ce7615b.slice/crio-2534c7187522724aa26b43b27dbde0ae8022099f9260b32eb4cd8c6ecc3be9cb WatchSource:0}: Error finding container 2534c7187522724aa26b43b27dbde0ae8022099f9260b32eb4cd8c6ecc3be9cb: Status 404 returned error can't find the container with id 2534c7187522724aa26b43b27dbde0ae8022099f9260b32eb4cd8c6ecc3be9cb
Mar 09 09:23:35 crc kubenswrapper[4792]: W0309 09:23:35.436384 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc28488b2_919b_4307_9a70_b2f5f1280e2a.slice/crio-c5619c7402cb221e25c7ac274e8f80b32dc189a5fb305c6076c53246790d9cce WatchSource:0}: Error finding container c5619c7402cb221e25c7ac274e8f80b32dc189a5fb305c6076c53246790d9cce: Status 404 returned error can't find the container with id c5619c7402cb221e25c7ac274e8f80b32dc189a5fb305c6076c53246790d9cce
Mar 09 09:23:35 crc kubenswrapper[4792]: I0309 09:23:35.474865 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e42c0d5f-7c0c-420f-a14b-59316b524101-metrics-certs\") pod \"openstack-operator-controller-manager-59b6c9788f-qh4rf\" (UID: \"e42c0d5f-7c0c-420f-a14b-59316b524101\") " pod="openstack-operators/openstack-operator-controller-manager-59b6c9788f-qh4rf"
Mar 09 09:23:35 crc kubenswrapper[4792]: I0309 09:23:35.474948 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/e42c0d5f-7c0c-420f-a14b-59316b524101-webhook-certs\") pod \"openstack-operator-controller-manager-59b6c9788f-qh4rf\" (UID: \"e42c0d5f-7c0c-420f-a14b-59316b524101\") " pod="openstack-operators/openstack-operator-controller-manager-59b6c9788f-qh4rf"
Mar 09 09:23:35 crc kubenswrapper[4792]: E0309 09:23:35.475356 4792 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found
Mar 09 09:23:35 crc kubenswrapper[4792]: E0309 09:23:35.475403 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e42c0d5f-7c0c-420f-a14b-59316b524101-webhook-certs podName:e42c0d5f-7c0c-420f-a14b-59316b524101 nodeName:}" failed. No retries permitted until 2026-03-09 09:23:37.47538975 +0000 UTC m=+982.505590502 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/e42c0d5f-7c0c-420f-a14b-59316b524101-webhook-certs") pod "openstack-operator-controller-manager-59b6c9788f-qh4rf" (UID: "e42c0d5f-7c0c-420f-a14b-59316b524101") : secret "webhook-server-cert" not found
Mar 09 09:23:35 crc kubenswrapper[4792]: E0309 09:23:35.475878 4792 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found
Mar 09 09:23:35 crc kubenswrapper[4792]: E0309 09:23:35.476027 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e42c0d5f-7c0c-420f-a14b-59316b524101-metrics-certs podName:e42c0d5f-7c0c-420f-a14b-59316b524101 nodeName:}" failed. No retries permitted until 2026-03-09 09:23:37.475999158 +0000 UTC m=+982.506199910 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e42c0d5f-7c0c-420f-a14b-59316b524101-metrics-certs") pod "openstack-operator-controller-manager-59b6c9788f-qh4rf" (UID: "e42c0d5f-7c0c-420f-a14b-59316b524101") : secret "metrics-server-cert" not found
Mar 09 09:23:35 crc kubenswrapper[4792]: I0309 09:23:35.528167 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-kjtwp"]
Mar 09 09:23:35 crc kubenswrapper[4792]: W0309 09:23:35.548215 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod92a6c902_5189_421e_b1a1_ed3e64e7bca4.slice/crio-0ce4c98acd7e75d84efda953c119941d65075e41d58a3cfa53309123fa9fbf7e WatchSource:0}: Error finding container 0ce4c98acd7e75d84efda953c119941d65075e41d58a3cfa53309123fa9fbf7e: Status 404 returned error can't find the container with id 0ce4c98acd7e75d84efda953c119941d65075e41d58a3cfa53309123fa9fbf7e
Mar 09 09:23:35 crc kubenswrapper[4792]: I0309 09:23:35.556639 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api"
pods=["openstack-operators/ovn-operator-controller-manager-75684d597f-8vt8s"] Mar 09 09:23:35 crc kubenswrapper[4792]: I0309 09:23:35.558753 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-648564c9fc-z5tts"] Mar 09 09:23:35 crc kubenswrapper[4792]: I0309 09:23:35.587776 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-bccc79885-vj8ds"] Mar 09 09:23:35 crc kubenswrapper[4792]: I0309 09:23:35.599335 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-5fdb694969-mzpqx"] Mar 09 09:23:35 crc kubenswrapper[4792]: I0309 09:23:35.609105 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-55b5ff4dbb-z4lgh"] Mar 09 09:23:35 crc kubenswrapper[4792]: E0309 09:23:35.618866 4792 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:1b9074a4ce16396d8bd2d30a475fc8c2f004f75a023e3eef8950661e89c0bcc6,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-bkbrz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-5fdb694969-mzpqx_openstack-operators(d4313901-b530-42e8-a975-d21aefbc0506): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 09 09:23:35 crc kubenswrapper[4792]: E0309 09:23:35.619277 4792 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:9f73c84a9581b5739d8da333c7b64403d7b7ca284b22c624d0effe07f3d2819c,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-j867v,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-75684d597f-8vt8s_openstack-operators(82689eba-1f75-4e2e-8c27-a5b90e2805af): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 09 09:23:35 crc kubenswrapper[4792]: E0309 09:23:35.620243 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/telemetry-operator-controller-manager-5fdb694969-mzpqx" podUID="d4313901-b530-42e8-a975-d21aefbc0506" Mar 09 09:23:35 crc kubenswrapper[4792]: E0309 09:23:35.620447 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/ovn-operator-controller-manager-75684d597f-8vt8s" podUID="82689eba-1f75-4e2e-8c27-a5b90e2805af" Mar 09 09:23:35 crc kubenswrapper[4792]: E0309 09:23:35.625133 4792 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:bb939885bd04593ad03af901adb77ee2a2d18529b328c23288c7cc7a2ba5282e,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-dcdvf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-648564c9fc-z5tts_openstack-operators(d53acf43-fee2-4bdf-9cdb-883641a56d48): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 09 09:23:35 crc kubenswrapper[4792]: E0309 09:23:35.626368 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/placement-operator-controller-manager-648564c9fc-z5tts" podUID="d53acf43-fee2-4bdf-9cdb-883641a56d48" Mar 09 09:23:35 crc kubenswrapper[4792]: W0309 09:23:35.646517 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode56405f7_7121_4d52_b276_3feeddabd667.slice/crio-38e64d62f179a55470335918842f57c91997c98d6f9c9cf30d90b32c2e848743 WatchSource:0}: Error finding container 38e64d62f179a55470335918842f57c91997c98d6f9c9cf30d90b32c2e848743: Status 404 returned error can't find the container with id 38e64d62f179a55470335918842f57c91997c98d6f9c9cf30d90b32c2e848743 Mar 09 09:23:35 crc kubenswrapper[4792]: E0309 09:23:35.655219 4792 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:9d03f03aa9a460f1fcac8875064808c03e4ecd0388873bbfb9c7dc58331f3968,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-zkgff,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-55b5ff4dbb-z4lgh_openstack-operators(e56405f7-7121-4d52-b276-3feeddabd667): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 09 09:23:35 crc kubenswrapper[4792]: E0309 09:23:35.657135 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-z4lgh" podUID="e56405f7-7121-4d52-b276-3feeddabd667" Mar 09 09:23:36 crc kubenswrapper[4792]: I0309 09:23:36.442095 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-648564c9fc-z5tts" event={"ID":"d53acf43-fee2-4bdf-9cdb-883641a56d48","Type":"ContainerStarted","Data":"1222b091d2ad9dcd834022419b68fbc197b9737ca2d2a64cbd375003e854a8d7"} Mar 09 09:23:36 crc kubenswrapper[4792]: I0309 09:23:36.445590 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-kjtwp" 
event={"ID":"92a6c902-5189-421e-b1a1-ed3e64e7bca4","Type":"ContainerStarted","Data":"0ce4c98acd7e75d84efda953c119941d65075e41d58a3cfa53309123fa9fbf7e"} Mar 09 09:23:36 crc kubenswrapper[4792]: E0309 09:23:36.450206 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:bb939885bd04593ad03af901adb77ee2a2d18529b328c23288c7cc7a2ba5282e\\\"\"" pod="openstack-operators/placement-operator-controller-manager-648564c9fc-z5tts" podUID="d53acf43-fee2-4bdf-9cdb-883641a56d48" Mar 09 09:23:36 crc kubenswrapper[4792]: I0309 09:23:36.455415 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-75684d597f-8vt8s" event={"ID":"82689eba-1f75-4e2e-8c27-a5b90e2805af","Type":"ContainerStarted","Data":"8686e067a0742e3ccf3e4a63aa246807b9e457ca81f50abfe2411196742a8df7"} Mar 09 09:23:36 crc kubenswrapper[4792]: E0309 09:23:36.456964 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:9f73c84a9581b5739d8da333c7b64403d7b7ca284b22c624d0effe07f3d2819c\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-75684d597f-8vt8s" podUID="82689eba-1f75-4e2e-8c27-a5b90e2805af" Mar 09 09:23:36 crc kubenswrapper[4792]: I0309 09:23:36.462278 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-vj8ds" event={"ID":"41f3c31e-77a7-4912-a933-04b32c0db0dc","Type":"ContainerStarted","Data":"4426acfbc6684db01952ac4747099d4202798b796c9d1255a77b89e1ddb327eb"} Mar 09 09:23:36 crc kubenswrapper[4792]: I0309 09:23:36.481555 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-vhf7p" 
event={"ID":"533287c3-78f0-46ea-baa9-fafb1ce7615b","Type":"ContainerStarted","Data":"2534c7187522724aa26b43b27dbde0ae8022099f9260b32eb4cd8c6ecc3be9cb"} Mar 09 09:23:36 crc kubenswrapper[4792]: I0309 09:23:36.493971 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-z4lgh" event={"ID":"e56405f7-7121-4d52-b276-3feeddabd667","Type":"ContainerStarted","Data":"38e64d62f179a55470335918842f57c91997c98d6f9c9cf30d90b32c2e848743"} Mar 09 09:23:36 crc kubenswrapper[4792]: I0309 09:23:36.504957 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fe547e1c-cb50-4541-b867-5154dae69ec3-cert\") pod \"infra-operator-controller-manager-f7fcc58b9-swtlr\" (UID: \"fe547e1c-cb50-4541-b867-5154dae69ec3\") " pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-swtlr" Mar 09 09:23:36 crc kubenswrapper[4792]: E0309 09:23:36.505354 4792 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 09 09:23:36 crc kubenswrapper[4792]: E0309 09:23:36.505397 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fe547e1c-cb50-4541-b867-5154dae69ec3-cert podName:fe547e1c-cb50-4541-b867-5154dae69ec3 nodeName:}" failed. No retries permitted until 2026-03-09 09:23:40.505383531 +0000 UTC m=+985.535584283 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/fe547e1c-cb50-4541-b867-5154dae69ec3-cert") pod "infra-operator-controller-manager-f7fcc58b9-swtlr" (UID: "fe547e1c-cb50-4541-b867-5154dae69ec3") : secret "infra-operator-webhook-server-cert" not found Mar 09 09:23:36 crc kubenswrapper[4792]: E0309 09:23:36.521499 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:9d03f03aa9a460f1fcac8875064808c03e4ecd0388873bbfb9c7dc58331f3968\\\"\"" pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-z4lgh" podUID="e56405f7-7121-4d52-b276-3feeddabd667" Mar 09 09:23:36 crc kubenswrapper[4792]: I0309 09:23:36.525749 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-dpvjg" event={"ID":"9063ee68-9840-4f35-8d4d-44ab947477d5","Type":"ContainerStarted","Data":"2a2dab8ecb29983304cd8318bc1c99428ce7c0d444bd2cdc4c227a223b1adf19"} Mar 09 09:23:36 crc kubenswrapper[4792]: I0309 09:23:36.558817 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-5fdb694969-mzpqx" event={"ID":"d4313901-b530-42e8-a975-d21aefbc0506","Type":"ContainerStarted","Data":"805c3748fcf17387a323751bf1a4a8e987322bcd30e1df39a3bf06f2e4835adf"} Mar 09 09:23:36 crc kubenswrapper[4792]: E0309 09:23:36.567558 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:1b9074a4ce16396d8bd2d30a475fc8c2f004f75a023e3eef8950661e89c0bcc6\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-5fdb694969-mzpqx" podUID="d4313901-b530-42e8-a975-d21aefbc0506" Mar 09 09:23:36 crc kubenswrapper[4792]: I0309 09:23:36.569131 4792 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-r44dt" event={"ID":"c28488b2-919b-4307-9a70-b2f5f1280e2a","Type":"ContainerStarted","Data":"c5619c7402cb221e25c7ac274e8f80b32dc189a5fb305c6076c53246790d9cce"} Mar 09 09:23:37 crc kubenswrapper[4792]: I0309 09:23:37.022397 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9ca7aa92-3367-4c2e-a86e-33ba41fe81cb-cert\") pod \"openstack-baremetal-operator-controller-manager-6f64bd8c755s6r4\" (UID: \"9ca7aa92-3367-4c2e-a86e-33ba41fe81cb\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f64bd8c755s6r4" Mar 09 09:23:37 crc kubenswrapper[4792]: E0309 09:23:37.022858 4792 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 09 09:23:37 crc kubenswrapper[4792]: E0309 09:23:37.022982 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9ca7aa92-3367-4c2e-a86e-33ba41fe81cb-cert podName:9ca7aa92-3367-4c2e-a86e-33ba41fe81cb nodeName:}" failed. No retries permitted until 2026-03-09 09:23:41.022954421 +0000 UTC m=+986.053155173 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9ca7aa92-3367-4c2e-a86e-33ba41fe81cb-cert") pod "openstack-baremetal-operator-controller-manager-6f64bd8c755s6r4" (UID: "9ca7aa92-3367-4c2e-a86e-33ba41fe81cb") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 09 09:23:37 crc kubenswrapper[4792]: I0309 09:23:37.538222 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/e42c0d5f-7c0c-420f-a14b-59316b524101-webhook-certs\") pod \"openstack-operator-controller-manager-59b6c9788f-qh4rf\" (UID: \"e42c0d5f-7c0c-420f-a14b-59316b524101\") " pod="openstack-operators/openstack-operator-controller-manager-59b6c9788f-qh4rf" Mar 09 09:23:37 crc kubenswrapper[4792]: E0309 09:23:37.538321 4792 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 09 09:23:37 crc kubenswrapper[4792]: I0309 09:23:37.538344 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e42c0d5f-7c0c-420f-a14b-59316b524101-metrics-certs\") pod \"openstack-operator-controller-manager-59b6c9788f-qh4rf\" (UID: \"e42c0d5f-7c0c-420f-a14b-59316b524101\") " pod="openstack-operators/openstack-operator-controller-manager-59b6c9788f-qh4rf" Mar 09 09:23:37 crc kubenswrapper[4792]: E0309 09:23:37.538374 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e42c0d5f-7c0c-420f-a14b-59316b524101-webhook-certs podName:e42c0d5f-7c0c-420f-a14b-59316b524101 nodeName:}" failed. No retries permitted until 2026-03-09 09:23:41.538358809 +0000 UTC m=+986.568559561 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/e42c0d5f-7c0c-420f-a14b-59316b524101-webhook-certs") pod "openstack-operator-controller-manager-59b6c9788f-qh4rf" (UID: "e42c0d5f-7c0c-420f-a14b-59316b524101") : secret "webhook-server-cert" not found Mar 09 09:23:37 crc kubenswrapper[4792]: E0309 09:23:37.538445 4792 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 09 09:23:37 crc kubenswrapper[4792]: E0309 09:23:37.538480 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e42c0d5f-7c0c-420f-a14b-59316b524101-metrics-certs podName:e42c0d5f-7c0c-420f-a14b-59316b524101 nodeName:}" failed. No retries permitted until 2026-03-09 09:23:41.538470992 +0000 UTC m=+986.568671744 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e42c0d5f-7c0c-420f-a14b-59316b524101-metrics-certs") pod "openstack-operator-controller-manager-59b6c9788f-qh4rf" (UID: "e42c0d5f-7c0c-420f-a14b-59316b524101") : secret "metrics-server-cert" not found Mar 09 09:23:37 crc kubenswrapper[4792]: E0309 09:23:37.592165 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:bb939885bd04593ad03af901adb77ee2a2d18529b328c23288c7cc7a2ba5282e\\\"\"" pod="openstack-operators/placement-operator-controller-manager-648564c9fc-z5tts" podUID="d53acf43-fee2-4bdf-9cdb-883641a56d48" Mar 09 09:23:37 crc kubenswrapper[4792]: E0309 09:23:37.592255 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:9f73c84a9581b5739d8da333c7b64403d7b7ca284b22c624d0effe07f3d2819c\\\"\"" 
pod="openstack-operators/ovn-operator-controller-manager-75684d597f-8vt8s" podUID="82689eba-1f75-4e2e-8c27-a5b90e2805af" Mar 09 09:23:37 crc kubenswrapper[4792]: E0309 09:23:37.592292 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:1b9074a4ce16396d8bd2d30a475fc8c2f004f75a023e3eef8950661e89c0bcc6\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-5fdb694969-mzpqx" podUID="d4313901-b530-42e8-a975-d21aefbc0506" Mar 09 09:23:37 crc kubenswrapper[4792]: E0309 09:23:37.592577 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:9d03f03aa9a460f1fcac8875064808c03e4ecd0388873bbfb9c7dc58331f3968\\\"\"" pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-z4lgh" podUID="e56405f7-7121-4d52-b276-3feeddabd667" Mar 09 09:23:40 crc kubenswrapper[4792]: I0309 09:23:40.600986 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fe547e1c-cb50-4541-b867-5154dae69ec3-cert\") pod \"infra-operator-controller-manager-f7fcc58b9-swtlr\" (UID: \"fe547e1c-cb50-4541-b867-5154dae69ec3\") " pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-swtlr" Mar 09 09:23:40 crc kubenswrapper[4792]: E0309 09:23:40.601213 4792 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 09 09:23:40 crc kubenswrapper[4792]: E0309 09:23:40.601305 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fe547e1c-cb50-4541-b867-5154dae69ec3-cert podName:fe547e1c-cb50-4541-b867-5154dae69ec3 nodeName:}" failed. 
No retries permitted until 2026-03-09 09:23:48.60128239 +0000 UTC m=+993.631483142 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/fe547e1c-cb50-4541-b867-5154dae69ec3-cert") pod "infra-operator-controller-manager-f7fcc58b9-swtlr" (UID: "fe547e1c-cb50-4541-b867-5154dae69ec3") : secret "infra-operator-webhook-server-cert" not found Mar 09 09:23:41 crc kubenswrapper[4792]: I0309 09:23:41.108498 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9ca7aa92-3367-4c2e-a86e-33ba41fe81cb-cert\") pod \"openstack-baremetal-operator-controller-manager-6f64bd8c755s6r4\" (UID: \"9ca7aa92-3367-4c2e-a86e-33ba41fe81cb\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f64bd8c755s6r4" Mar 09 09:23:41 crc kubenswrapper[4792]: E0309 09:23:41.108701 4792 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 09 09:23:41 crc kubenswrapper[4792]: E0309 09:23:41.108824 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9ca7aa92-3367-4c2e-a86e-33ba41fe81cb-cert podName:9ca7aa92-3367-4c2e-a86e-33ba41fe81cb nodeName:}" failed. No retries permitted until 2026-03-09 09:23:49.108768636 +0000 UTC m=+994.138969408 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9ca7aa92-3367-4c2e-a86e-33ba41fe81cb-cert") pod "openstack-baremetal-operator-controller-manager-6f64bd8c755s6r4" (UID: "9ca7aa92-3367-4c2e-a86e-33ba41fe81cb") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 09 09:23:41 crc kubenswrapper[4792]: I0309 09:23:41.616102 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/e42c0d5f-7c0c-420f-a14b-59316b524101-webhook-certs\") pod \"openstack-operator-controller-manager-59b6c9788f-qh4rf\" (UID: \"e42c0d5f-7c0c-420f-a14b-59316b524101\") " pod="openstack-operators/openstack-operator-controller-manager-59b6c9788f-qh4rf" Mar 09 09:23:41 crc kubenswrapper[4792]: I0309 09:23:41.616568 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e42c0d5f-7c0c-420f-a14b-59316b524101-metrics-certs\") pod \"openstack-operator-controller-manager-59b6c9788f-qh4rf\" (UID: \"e42c0d5f-7c0c-420f-a14b-59316b524101\") " pod="openstack-operators/openstack-operator-controller-manager-59b6c9788f-qh4rf" Mar 09 09:23:41 crc kubenswrapper[4792]: E0309 09:23:41.616206 4792 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 09 09:23:41 crc kubenswrapper[4792]: E0309 09:23:41.616711 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e42c0d5f-7c0c-420f-a14b-59316b524101-webhook-certs podName:e42c0d5f-7c0c-420f-a14b-59316b524101 nodeName:}" failed. No retries permitted until 2026-03-09 09:23:49.616695284 +0000 UTC m=+994.646896036 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/e42c0d5f-7c0c-420f-a14b-59316b524101-webhook-certs") pod "openstack-operator-controller-manager-59b6c9788f-qh4rf" (UID: "e42c0d5f-7c0c-420f-a14b-59316b524101") : secret "webhook-server-cert" not found Mar 09 09:23:41 crc kubenswrapper[4792]: E0309 09:23:41.616661 4792 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 09 09:23:41 crc kubenswrapper[4792]: E0309 09:23:41.617024 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e42c0d5f-7c0c-420f-a14b-59316b524101-metrics-certs podName:e42c0d5f-7c0c-420f-a14b-59316b524101 nodeName:}" failed. No retries permitted until 2026-03-09 09:23:49.617014234 +0000 UTC m=+994.647214996 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e42c0d5f-7c0c-420f-a14b-59316b524101-metrics-certs") pod "openstack-operator-controller-manager-59b6c9788f-qh4rf" (UID: "e42c0d5f-7c0c-420f-a14b-59316b524101") : secret "metrics-server-cert" not found Mar 09 09:23:43 crc kubenswrapper[4792]: I0309 09:23:43.214398 4792 patch_prober.go:28] interesting pod/machine-config-daemon-97tth container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 09:23:43 crc kubenswrapper[4792]: I0309 09:23:43.214460 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 09:23:48 crc kubenswrapper[4792]: I0309 09:23:48.613584 4792 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fe547e1c-cb50-4541-b867-5154dae69ec3-cert\") pod \"infra-operator-controller-manager-f7fcc58b9-swtlr\" (UID: \"fe547e1c-cb50-4541-b867-5154dae69ec3\") " pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-swtlr" Mar 09 09:23:48 crc kubenswrapper[4792]: E0309 09:23:48.613784 4792 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 09 09:23:48 crc kubenswrapper[4792]: E0309 09:23:48.614388 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fe547e1c-cb50-4541-b867-5154dae69ec3-cert podName:fe547e1c-cb50-4541-b867-5154dae69ec3 nodeName:}" failed. No retries permitted until 2026-03-09 09:24:04.614369435 +0000 UTC m=+1009.644570187 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/fe547e1c-cb50-4541-b867-5154dae69ec3-cert") pod "infra-operator-controller-manager-f7fcc58b9-swtlr" (UID: "fe547e1c-cb50-4541-b867-5154dae69ec3") : secret "infra-operator-webhook-server-cert" not found Mar 09 09:23:49 crc kubenswrapper[4792]: I0309 09:23:49.120625 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9ca7aa92-3367-4c2e-a86e-33ba41fe81cb-cert\") pod \"openstack-baremetal-operator-controller-manager-6f64bd8c755s6r4\" (UID: \"9ca7aa92-3367-4c2e-a86e-33ba41fe81cb\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f64bd8c755s6r4" Mar 09 09:23:49 crc kubenswrapper[4792]: I0309 09:23:49.128001 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9ca7aa92-3367-4c2e-a86e-33ba41fe81cb-cert\") pod \"openstack-baremetal-operator-controller-manager-6f64bd8c755s6r4\" (UID: \"9ca7aa92-3367-4c2e-a86e-33ba41fe81cb\") " 
pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f64bd8c755s6r4" Mar 09 09:23:49 crc kubenswrapper[4792]: E0309 09:23:49.172766 4792 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/swift-operator@sha256:f309cdea8084a4b1e8cbcd732d6e250fd93c55cfd1b48ba9026907c8591faab7" Mar 09 09:23:49 crc kubenswrapper[4792]: E0309 09:23:49.172940 4792 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:f309cdea8084a4b1e8cbcd732d6e250fd93c55cfd1b48ba9026907c8591faab7,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-f4hls,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-9b9ff9f4d-vhf7p_openstack-operators(533287c3-78f0-46ea-baa9-fafb1ce7615b): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 09 09:23:49 crc kubenswrapper[4792]: E0309 09:23:49.174125 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-vhf7p" podUID="533287c3-78f0-46ea-baa9-fafb1ce7615b" Mar 09 09:23:49 crc kubenswrapper[4792]: I0309 09:23:49.213376 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f64bd8c755s6r4" Mar 09 09:23:49 crc kubenswrapper[4792]: I0309 09:23:49.627192 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e42c0d5f-7c0c-420f-a14b-59316b524101-metrics-certs\") pod \"openstack-operator-controller-manager-59b6c9788f-qh4rf\" (UID: \"e42c0d5f-7c0c-420f-a14b-59316b524101\") " pod="openstack-operators/openstack-operator-controller-manager-59b6c9788f-qh4rf" Mar 09 09:23:49 crc kubenswrapper[4792]: I0309 09:23:49.627645 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/e42c0d5f-7c0c-420f-a14b-59316b524101-webhook-certs\") pod \"openstack-operator-controller-manager-59b6c9788f-qh4rf\" (UID: \"e42c0d5f-7c0c-420f-a14b-59316b524101\") " pod="openstack-operators/openstack-operator-controller-manager-59b6c9788f-qh4rf" Mar 09 09:23:49 crc kubenswrapper[4792]: E0309 09:23:49.627836 4792 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 09 09:23:49 crc kubenswrapper[4792]: E0309 09:23:49.627933 4792 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e42c0d5f-7c0c-420f-a14b-59316b524101-webhook-certs podName:e42c0d5f-7c0c-420f-a14b-59316b524101 nodeName:}" failed. No retries permitted until 2026-03-09 09:24:05.627912385 +0000 UTC m=+1010.658113127 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/e42c0d5f-7c0c-420f-a14b-59316b524101-webhook-certs") pod "openstack-operator-controller-manager-59b6c9788f-qh4rf" (UID: "e42c0d5f-7c0c-420f-a14b-59316b524101") : secret "webhook-server-cert" not found Mar 09 09:23:49 crc kubenswrapper[4792]: I0309 09:23:49.648466 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e42c0d5f-7c0c-420f-a14b-59316b524101-metrics-certs\") pod \"openstack-operator-controller-manager-59b6c9788f-qh4rf\" (UID: \"e42c0d5f-7c0c-420f-a14b-59316b524101\") " pod="openstack-operators/openstack-operator-controller-manager-59b6c9788f-qh4rf" Mar 09 09:23:49 crc kubenswrapper[4792]: E0309 09:23:49.682835 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:f309cdea8084a4b1e8cbcd732d6e250fd93c55cfd1b48ba9026907c8591faab7\\\"\"" pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-vhf7p" podUID="533287c3-78f0-46ea-baa9-fafb1ce7615b" Mar 09 09:23:51 crc kubenswrapper[4792]: I0309 09:23:51.025983 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-bdllp"] Mar 09 09:23:51 crc kubenswrapper[4792]: I0309 09:23:51.027286 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-bdllp" Mar 09 09:23:51 crc kubenswrapper[4792]: I0309 09:23:51.038137 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bdllp"] Mar 09 09:23:51 crc kubenswrapper[4792]: I0309 09:23:51.152253 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4sk9b\" (UniqueName: \"kubernetes.io/projected/ea84e439-1556-4df6-81bf-0042e1554901-kube-api-access-4sk9b\") pod \"community-operators-bdllp\" (UID: \"ea84e439-1556-4df6-81bf-0042e1554901\") " pod="openshift-marketplace/community-operators-bdllp" Mar 09 09:23:51 crc kubenswrapper[4792]: I0309 09:23:51.152341 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ea84e439-1556-4df6-81bf-0042e1554901-catalog-content\") pod \"community-operators-bdllp\" (UID: \"ea84e439-1556-4df6-81bf-0042e1554901\") " pod="openshift-marketplace/community-operators-bdllp" Mar 09 09:23:51 crc kubenswrapper[4792]: I0309 09:23:51.152406 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ea84e439-1556-4df6-81bf-0042e1554901-utilities\") pod \"community-operators-bdllp\" (UID: \"ea84e439-1556-4df6-81bf-0042e1554901\") " pod="openshift-marketplace/community-operators-bdllp" Mar 09 09:23:51 crc kubenswrapper[4792]: I0309 09:23:51.254189 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ea84e439-1556-4df6-81bf-0042e1554901-utilities\") pod \"community-operators-bdllp\" (UID: \"ea84e439-1556-4df6-81bf-0042e1554901\") " pod="openshift-marketplace/community-operators-bdllp" Mar 09 09:23:51 crc kubenswrapper[4792]: I0309 09:23:51.254695 4792 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ea84e439-1556-4df6-81bf-0042e1554901-utilities\") pod \"community-operators-bdllp\" (UID: \"ea84e439-1556-4df6-81bf-0042e1554901\") " pod="openshift-marketplace/community-operators-bdllp" Mar 09 09:23:51 crc kubenswrapper[4792]: I0309 09:23:51.254801 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4sk9b\" (UniqueName: \"kubernetes.io/projected/ea84e439-1556-4df6-81bf-0042e1554901-kube-api-access-4sk9b\") pod \"community-operators-bdllp\" (UID: \"ea84e439-1556-4df6-81bf-0042e1554901\") " pod="openshift-marketplace/community-operators-bdllp" Mar 09 09:23:51 crc kubenswrapper[4792]: I0309 09:23:51.254849 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ea84e439-1556-4df6-81bf-0042e1554901-catalog-content\") pod \"community-operators-bdllp\" (UID: \"ea84e439-1556-4df6-81bf-0042e1554901\") " pod="openshift-marketplace/community-operators-bdllp" Mar 09 09:23:51 crc kubenswrapper[4792]: I0309 09:23:51.255212 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ea84e439-1556-4df6-81bf-0042e1554901-catalog-content\") pod \"community-operators-bdllp\" (UID: \"ea84e439-1556-4df6-81bf-0042e1554901\") " pod="openshift-marketplace/community-operators-bdllp" Mar 09 09:23:51 crc kubenswrapper[4792]: I0309 09:23:51.275448 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4sk9b\" (UniqueName: \"kubernetes.io/projected/ea84e439-1556-4df6-81bf-0042e1554901-kube-api-access-4sk9b\") pod \"community-operators-bdllp\" (UID: \"ea84e439-1556-4df6-81bf-0042e1554901\") " pod="openshift-marketplace/community-operators-bdllp" Mar 09 09:23:51 crc kubenswrapper[4792]: I0309 09:23:51.360511 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-bdllp" Mar 09 09:23:51 crc kubenswrapper[4792]: E0309 09:23:51.876669 4792 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/octavia-operator@sha256:2d59045b8d8e6f9c5483c4fdda7c5057218d553200dc4bcf26789980ac1d9abd" Mar 09 09:23:51 crc kubenswrapper[4792]: E0309 09:23:51.876838 4792 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/octavia-operator@sha256:2d59045b8d8e6f9c5483c4fdda7c5057218d553200dc4bcf26789980ac1d9abd,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-ckmnn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-5d86c7ddb7-r44dt_openstack-operators(c28488b2-919b-4307-9a70-b2f5f1280e2a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 09 09:23:51 crc kubenswrapper[4792]: E0309 09:23:51.878016 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-r44dt" podUID="c28488b2-919b-4307-9a70-b2f5f1280e2a" Mar 09 09:23:52 crc kubenswrapper[4792]: E0309 09:23:52.665681 4792 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/neutron-operator@sha256:b242403a27609ac87a0ed3a7dd788aceaf8f3da3620981cf5e000d56862d77a4" Mar 09 09:23:52 crc kubenswrapper[4792]: E0309 09:23:52.665993 4792 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/neutron-operator@sha256:b242403a27609ac87a0ed3a7dd788aceaf8f3da3620981cf5e000d56862d77a4,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-4xrbz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod neutron-operator-controller-manager-54688575f-5k4db_openstack-operators(8fd39edc-ff27-4feb-b138-ee11a440c0ca): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 09 09:23:52 crc kubenswrapper[4792]: E0309 09:23:52.667476 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/neutron-operator-controller-manager-54688575f-5k4db" podUID="8fd39edc-ff27-4feb-b138-ee11a440c0ca" Mar 09 09:23:52 crc kubenswrapper[4792]: E0309 09:23:52.712415 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:2d59045b8d8e6f9c5483c4fdda7c5057218d553200dc4bcf26789980ac1d9abd\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-r44dt" podUID="c28488b2-919b-4307-9a70-b2f5f1280e2a" Mar 09 09:23:52 crc kubenswrapper[4792]: E0309 09:23:52.712467 4792 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/neutron-operator@sha256:b242403a27609ac87a0ed3a7dd788aceaf8f3da3620981cf5e000d56862d77a4\\\"\"" pod="openstack-operators/neutron-operator-controller-manager-54688575f-5k4db" podUID="8fd39edc-ff27-4feb-b138-ee11a440c0ca" Mar 09 09:23:53 crc kubenswrapper[4792]: E0309 09:23:53.412592 4792 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/ironic-operator@sha256:e41dfadd2c3bbcae29f8c43cd2feea6724a48cdef127d65d1d37816bb9945a01" Mar 09 09:23:53 crc kubenswrapper[4792]: E0309 09:23:53.412748 4792 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ironic-operator@sha256:e41dfadd2c3bbcae29f8c43cd2feea6724a48cdef127d65d1d37816bb9945a01,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-zrgkt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ironic-operator-controller-manager-545456dc4-x7n9b_openstack-operators(ac60ffe8-71d2-4ea1-bbc5-d377fc70d940): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 09 09:23:53 crc kubenswrapper[4792]: E0309 09:23:53.414043 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/ironic-operator-controller-manager-545456dc4-x7n9b" podUID="ac60ffe8-71d2-4ea1-bbc5-d377fc70d940" Mar 09 09:23:53 crc kubenswrapper[4792]: E0309 09:23:53.714705 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/ironic-operator@sha256:e41dfadd2c3bbcae29f8c43cd2feea6724a48cdef127d65d1d37816bb9945a01\\\"\"" pod="openstack-operators/ironic-operator-controller-manager-545456dc4-x7n9b" podUID="ac60ffe8-71d2-4ea1-bbc5-d377fc70d940" Mar 09 09:23:55 crc kubenswrapper[4792]: E0309 09:23:55.245262 4792 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/watcher-operator@sha256:06311600a491c689493552e7ff26e36df740fa4e7c143fca874bef19f24afb97" Mar 09 09:23:55 crc kubenswrapper[4792]: E0309 09:23:55.245738 4792 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:06311600a491c689493552e7ff26e36df740fa4e7c143fca874bef19f24afb97,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-8pr76,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-bccc79885-vj8ds_openstack-operators(41f3c31e-77a7-4912-a933-04b32c0db0dc): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 09 09:23:55 crc kubenswrapper[4792]: E0309 09:23:55.246949 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-vj8ds" podUID="41f3c31e-77a7-4912-a933-04b32c0db0dc" Mar 09 09:23:55 crc kubenswrapper[4792]: E0309 09:23:55.730760 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:06311600a491c689493552e7ff26e36df740fa4e7c143fca874bef19f24afb97\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-vj8ds" podUID="41f3c31e-77a7-4912-a933-04b32c0db0dc" Mar 09 09:23:55 crc kubenswrapper[4792]: E0309 09:23:55.901744 4792 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/barbican-operator@sha256:3f9b0446a124745439306dc3bb7faec8c02c0b6be33f788b9d455fa57fb60120" Mar 09 09:23:55 crc kubenswrapper[4792]: E0309 09:23:55.901945 4792 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/barbican-operator@sha256:3f9b0446a124745439306dc3bb7faec8c02c0b6be33f788b9d455fa57fb60120,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-zk6dh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-operator-controller-manager-6db6876945-9jfp7_openstack-operators(d9dc8da2-0584-4db0-ad3a-f1c59c2f6028): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 09 09:23:55 crc kubenswrapper[4792]: E0309 09:23:55.903148 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/barbican-operator-controller-manager-6db6876945-9jfp7" podUID="d9dc8da2-0584-4db0-ad3a-f1c59c2f6028" Mar 09 09:23:56 crc kubenswrapper[4792]: E0309 09:23:56.762877 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/barbican-operator@sha256:3f9b0446a124745439306dc3bb7faec8c02c0b6be33f788b9d455fa57fb60120\\\"\"" pod="openstack-operators/barbican-operator-controller-manager-6db6876945-9jfp7" podUID="d9dc8da2-0584-4db0-ad3a-f1c59c2f6028" Mar 09 09:23:57 crc kubenswrapper[4792]: E0309 09:23:57.616735 4792 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/designate-operator@sha256:508859beb0e5b69169393dbb0039dc03a9d4ba05f16f6ff74f9b25e19d446214" Mar 09 09:23:57 crc kubenswrapper[4792]: E0309 09:23:57.617427 4792 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/designate-operator@sha256:508859beb0e5b69169393dbb0039dc03a9d4ba05f16f6ff74f9b25e19d446214,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-ghkjg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod designate-operator-controller-manager-5d87c9d997-r5d6p_openstack-operators(20b2fb83-c944-4553-b506-9ff3c9c199f5): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 09 09:23:57 crc kubenswrapper[4792]: E0309 09:23:57.619358 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-r5d6p" podUID="20b2fb83-c944-4553-b506-9ff3c9c199f5" Mar 09 09:23:59 crc kubenswrapper[4792]: E0309 09:23:59.081687 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/designate-operator@sha256:508859beb0e5b69169393dbb0039dc03a9d4ba05f16f6ff74f9b25e19d446214\\\"\"" pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-r5d6p" podUID="20b2fb83-c944-4553-b506-9ff3c9c199f5" Mar 09 09:23:59 crc kubenswrapper[4792]: E0309 09:23:59.440234 4792 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/nova-operator@sha256:172f24bd4603ac3498536a8a2c8fffb07cf9113dd52bc132778ea0aa275c6b84" Mar 09 09:23:59 crc kubenswrapper[4792]: E0309 09:23:59.440524 4792 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:172f24bd4603ac3498536a8a2c8fffb07cf9113dd52bc132778ea0aa275c6b84,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-l5gg2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-74b6b5dc96-dpvjg_openstack-operators(9063ee68-9840-4f35-8d4d-44ab947477d5): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 09 09:23:59 crc kubenswrapper[4792]: E0309 09:23:59.441794 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-dpvjg" podUID="9063ee68-9840-4f35-8d4d-44ab947477d5" Mar 09 09:24:00 crc kubenswrapper[4792]: E0309 09:24:00.002896 4792 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" 
image="quay.io/openstack-k8s-operators/keystone-operator@sha256:9d723ab33964ee44704eed3223b64e828349d45dee04695434a6fcf4b6807d4c" Mar 09 09:24:00 crc kubenswrapper[4792]: E0309 09:24:00.003502 4792 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/keystone-operator@sha256:9d723ab33964ee44704eed3223b64e828349d45dee04695434a6fcf4b6807d4c,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-ghq8t,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-7c789f89c6-dktrj_openstack-operators(55f715a3-ef6e-40d8-9f9b-3100b2847b8d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 09 09:24:00 crc kubenswrapper[4792]: E0309 09:24:00.005694 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-dktrj" podUID="55f715a3-ef6e-40d8-9f9b-3100b2847b8d" Mar 09 09:24:00 crc kubenswrapper[4792]: E0309 09:24:00.065609 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:172f24bd4603ac3498536a8a2c8fffb07cf9113dd52bc132778ea0aa275c6b84\\\"\"" pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-dpvjg" podUID="9063ee68-9840-4f35-8d4d-44ab947477d5" Mar 09 09:24:00 crc kubenswrapper[4792]: E0309 09:24:00.068289 4792 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/keystone-operator@sha256:9d723ab33964ee44704eed3223b64e828349d45dee04695434a6fcf4b6807d4c\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-dktrj" podUID="55f715a3-ef6e-40d8-9f9b-3100b2847b8d" Mar 09 09:24:00 crc kubenswrapper[4792]: I0309 09:24:00.144402 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550804-98tkj"] Mar 09 09:24:00 crc kubenswrapper[4792]: I0309 09:24:00.145284 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550804-98tkj" Mar 09 09:24:00 crc kubenswrapper[4792]: I0309 09:24:00.148589 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 09:24:00 crc kubenswrapper[4792]: I0309 09:24:00.148633 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fwclj" Mar 09 09:24:00 crc kubenswrapper[4792]: I0309 09:24:00.148815 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 09:24:00 crc kubenswrapper[4792]: I0309 09:24:00.151640 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550804-98tkj"] Mar 09 09:24:00 crc kubenswrapper[4792]: I0309 09:24:00.274863 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ghnhl\" (UniqueName: \"kubernetes.io/projected/f1eec157-3f1a-4ca8-afb9-7dc14b7bc433-kube-api-access-ghnhl\") pod \"auto-csr-approver-29550804-98tkj\" (UID: \"f1eec157-3f1a-4ca8-afb9-7dc14b7bc433\") " pod="openshift-infra/auto-csr-approver-29550804-98tkj" Mar 09 09:24:00 crc kubenswrapper[4792]: I0309 09:24:00.378649 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-ghnhl\" (UniqueName: \"kubernetes.io/projected/f1eec157-3f1a-4ca8-afb9-7dc14b7bc433-kube-api-access-ghnhl\") pod \"auto-csr-approver-29550804-98tkj\" (UID: \"f1eec157-3f1a-4ca8-afb9-7dc14b7bc433\") " pod="openshift-infra/auto-csr-approver-29550804-98tkj" Mar 09 09:24:00 crc kubenswrapper[4792]: I0309 09:24:00.404134 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ghnhl\" (UniqueName: \"kubernetes.io/projected/f1eec157-3f1a-4ca8-afb9-7dc14b7bc433-kube-api-access-ghnhl\") pod \"auto-csr-approver-29550804-98tkj\" (UID: \"f1eec157-3f1a-4ca8-afb9-7dc14b7bc433\") " pod="openshift-infra/auto-csr-approver-29550804-98tkj" Mar 09 09:24:00 crc kubenswrapper[4792]: I0309 09:24:00.473607 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550804-98tkj" Mar 09 09:24:00 crc kubenswrapper[4792]: E0309 09:24:00.521359 4792 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2" Mar 09 09:24:00 crc kubenswrapper[4792]: E0309 09:24:00.521546 4792 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-x9p8x,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-kjtwp_openstack-operators(92a6c902-5189-421e-b1a1-ed3e64e7bca4): 
ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 09 09:24:00 crc kubenswrapper[4792]: E0309 09:24:00.525182 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-kjtwp" podUID="92a6c902-5189-421e-b1a1-ed3e64e7bca4" Mar 09 09:24:01 crc kubenswrapper[4792]: E0309 09:24:01.051110 4792 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/heat-operator@sha256:ee642fcf655f9897d480460008cba2e98b497d3ffdf7ab1d48ea460eb20c2053" Mar 09 09:24:01 crc kubenswrapper[4792]: E0309 09:24:01.052292 4792 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/heat-operator@sha256:ee642fcf655f9897d480460008cba2e98b497d3ffdf7ab1d48ea460eb20c2053,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-jmt88,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod heat-operator-controller-manager-cf99c678f-lw2kp_openstack-operators(b1140422-6cf3-4e92-95e2-6ea31179de28): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 09 09:24:01 crc kubenswrapper[4792]: E0309 09:24:01.053813 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack-operators/heat-operator-controller-manager-cf99c678f-lw2kp" podUID="b1140422-6cf3-4e92-95e2-6ea31179de28" Mar 09 09:24:01 crc kubenswrapper[4792]: E0309 09:24:01.071961 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-kjtwp" podUID="92a6c902-5189-421e-b1a1-ed3e64e7bca4" Mar 09 09:24:01 crc kubenswrapper[4792]: E0309 09:24:01.075618 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/heat-operator@sha256:ee642fcf655f9897d480460008cba2e98b497d3ffdf7ab1d48ea460eb20c2053\\\"\"" pod="openstack-operators/heat-operator-controller-manager-cf99c678f-lw2kp" podUID="b1140422-6cf3-4e92-95e2-6ea31179de28" Mar 09 09:24:01 crc kubenswrapper[4792]: I0309 09:24:01.561381 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bdllp"] Mar 09 09:24:02 crc kubenswrapper[4792]: E0309 09:24:02.337314 4792 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/telemetry-operator@sha256:1b9074a4ce16396d8bd2d30a475fc8c2f004f75a023e3eef8950661e89c0bcc6" Mar 09 09:24:02 crc kubenswrapper[4792]: E0309 09:24:02.337780 4792 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:1b9074a4ce16396d8bd2d30a475fc8c2f004f75a023e3eef8950661e89c0bcc6,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-bkbrz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-5fdb694969-mzpqx_openstack-operators(d4313901-b530-42e8-a975-d21aefbc0506): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 09 09:24:02 crc kubenswrapper[4792]: E0309 09:24:02.338991 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/telemetry-operator-controller-manager-5fdb694969-mzpqx" podUID="d4313901-b530-42e8-a975-d21aefbc0506" Mar 09 09:24:03 crc kubenswrapper[4792]: I0309 09:24:03.709205 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-6f64bd8c755s6r4"] Mar 09 09:24:03 crc kubenswrapper[4792]: W0309 09:24:03.803219 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9ca7aa92_3367_4c2e_a86e_33ba41fe81cb.slice/crio-49ef41633ac0ac6ddf4806e6774f97f03aeee4f53448eca4bb30646251a4219c WatchSource:0}: Error finding container 
49ef41633ac0ac6ddf4806e6774f97f03aeee4f53448eca4bb30646251a4219c: Status 404 returned error can't find the container with id 49ef41633ac0ac6ddf4806e6774f97f03aeee4f53448eca4bb30646251a4219c Mar 09 09:24:03 crc kubenswrapper[4792]: I0309 09:24:03.843532 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550804-98tkj"] Mar 09 09:24:03 crc kubenswrapper[4792]: W0309 09:24:03.861917 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf1eec157_3f1a_4ca8_afb9_7dc14b7bc433.slice/crio-cf78384b45d98ca7f3e48128d369f265c748d4f5efc3eab1b63ebdb65953a824 WatchSource:0}: Error finding container cf78384b45d98ca7f3e48128d369f265c748d4f5efc3eab1b63ebdb65953a824: Status 404 returned error can't find the container with id cf78384b45d98ca7f3e48128d369f265c748d4f5efc3eab1b63ebdb65953a824 Mar 09 09:24:04 crc kubenswrapper[4792]: I0309 09:24:04.093881 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-75684d597f-8vt8s" event={"ID":"82689eba-1f75-4e2e-8c27-a5b90e2805af","Type":"ContainerStarted","Data":"fd2930fe9f3186a171116a69f94d5e8d6cf1bbc447d75608fa559e4610c99cbf"} Mar 09 09:24:04 crc kubenswrapper[4792]: I0309 09:24:04.094089 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-75684d597f-8vt8s" Mar 09 09:24:04 crc kubenswrapper[4792]: I0309 09:24:04.095986 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-7fdsl" event={"ID":"89b0f1f9-11f1-4d01-a2b8-ca2f1fae3bb2","Type":"ContainerStarted","Data":"297e7348d8fe14fd2254efb6563ba0c0797660645a8dc495b2ebe605adfea071"} Mar 09 09:24:04 crc kubenswrapper[4792]: I0309 09:24:04.096295 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-7fdsl" Mar 09 09:24:04 crc kubenswrapper[4792]: I0309 09:24:04.098200 4792 generic.go:334] "Generic (PLEG): container finished" podID="ea84e439-1556-4df6-81bf-0042e1554901" containerID="ebd7a8076890c7074f5edc74056da7c275b812b8eda600f67ba95f2f42f93a48" exitCode=0 Mar 09 09:24:04 crc kubenswrapper[4792]: I0309 09:24:04.098264 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bdllp" event={"ID":"ea84e439-1556-4df6-81bf-0042e1554901","Type":"ContainerDied","Data":"ebd7a8076890c7074f5edc74056da7c275b812b8eda600f67ba95f2f42f93a48"} Mar 09 09:24:04 crc kubenswrapper[4792]: I0309 09:24:04.098287 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bdllp" event={"ID":"ea84e439-1556-4df6-81bf-0042e1554901","Type":"ContainerStarted","Data":"2a7c64478844c0c74b40239ee9b87dab9512360a321006a5fc70d17b07d3aff4"} Mar 09 09:24:04 crc kubenswrapper[4792]: I0309 09:24:04.101527 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-67d996989d-4775c" event={"ID":"2c678a62-a744-4384-8403-618b566ed91e","Type":"ContainerStarted","Data":"06af8d4af8cbf63e92a54ca8f67ac80dd894a26294fd53bfa4a584543332fb81"} Mar 09 09:24:04 crc kubenswrapper[4792]: I0309 09:24:04.101993 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-67d996989d-4775c" Mar 09 09:24:04 crc kubenswrapper[4792]: I0309 09:24:04.103625 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f64bd8c755s6r4" event={"ID":"9ca7aa92-3367-4c2e-a86e-33ba41fe81cb","Type":"ContainerStarted","Data":"49ef41633ac0ac6ddf4806e6774f97f03aeee4f53448eca4bb30646251a4219c"} Mar 09 09:24:04 crc kubenswrapper[4792]: I0309 09:24:04.106991 4792 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-jhmcx" event={"ID":"98ba9a2a-30d6-45f2-af47-2994c292fe05","Type":"ContainerStarted","Data":"727f182f22aeb9725e589d0ae29a97b40416d14f5a5c2e02e025552315b40fd8"} Mar 09 09:24:04 crc kubenswrapper[4792]: I0309 09:24:04.107370 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-jhmcx" Mar 09 09:24:04 crc kubenswrapper[4792]: I0309 09:24:04.108933 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-648564c9fc-z5tts" event={"ID":"d53acf43-fee2-4bdf-9cdb-883641a56d48","Type":"ContainerStarted","Data":"846939d2114e0124b0f4fffc90b49812cbce523d6dc4702c1190a5ae0f75702f"} Mar 09 09:24:04 crc kubenswrapper[4792]: I0309 09:24:04.109318 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-648564c9fc-z5tts" Mar 09 09:24:04 crc kubenswrapper[4792]: I0309 09:24:04.110457 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-64db6967f8-tfz6b" event={"ID":"b74999f3-cb46-4b35-a70f-71977b54d944","Type":"ContainerStarted","Data":"19b3597e89cfc3613051880ac1a88ebad02b2576b8caa20d82662925c5d3ee68"} Mar 09 09:24:04 crc kubenswrapper[4792]: I0309 09:24:04.110803 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-64db6967f8-tfz6b" Mar 09 09:24:04 crc kubenswrapper[4792]: I0309 09:24:04.111479 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550804-98tkj" event={"ID":"f1eec157-3f1a-4ca8-afb9-7dc14b7bc433","Type":"ContainerStarted","Data":"cf78384b45d98ca7f3e48128d369f265c748d4f5efc3eab1b63ebdb65953a824"} Mar 09 09:24:04 crc kubenswrapper[4792]: I0309 09:24:04.113049 4792 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-ckrbc" event={"ID":"e27b7b35-b064-4e02-99e6-cb34af5ff0e9","Type":"ContainerStarted","Data":"f6e8c0a88894c327d71d4ffc5f216292cddbb20a7e1fbf61417fa1cb22e25202"} Mar 09 09:24:04 crc kubenswrapper[4792]: I0309 09:24:04.113411 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-ckrbc" Mar 09 09:24:04 crc kubenswrapper[4792]: I0309 09:24:04.114915 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-z4lgh" event={"ID":"e56405f7-7121-4d52-b276-3feeddabd667","Type":"ContainerStarted","Data":"176d0a655f48c9eaf375692935040e2f3298b2e8a4cc1b85480ce618cb38f0a8"} Mar 09 09:24:04 crc kubenswrapper[4792]: I0309 09:24:04.115194 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-z4lgh" Mar 09 09:24:04 crc kubenswrapper[4792]: I0309 09:24:04.122716 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-75684d597f-8vt8s" podStartSLOduration=4.181635925 podStartE2EDuration="32.122698256s" podCreationTimestamp="2026-03-09 09:23:32 +0000 UTC" firstStartedPulling="2026-03-09 09:23:35.619186904 +0000 UTC m=+980.649387656" lastFinishedPulling="2026-03-09 09:24:03.560249235 +0000 UTC m=+1008.590449987" observedRunningTime="2026-03-09 09:24:04.121038958 +0000 UTC m=+1009.151239730" watchObservedRunningTime="2026-03-09 09:24:04.122698256 +0000 UTC m=+1009.152899008" Mar 09 09:24:04 crc kubenswrapper[4792]: I0309 09:24:04.168428 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-648564c9fc-z5tts" podStartSLOduration=3.222167842 podStartE2EDuration="31.168412793s" podCreationTimestamp="2026-03-09 
09:23:33 +0000 UTC" firstStartedPulling="2026-03-09 09:23:35.624929192 +0000 UTC m=+980.655129944" lastFinishedPulling="2026-03-09 09:24:03.571174143 +0000 UTC m=+1008.601374895" observedRunningTime="2026-03-09 09:24:04.149709597 +0000 UTC m=+1009.179910349" watchObservedRunningTime="2026-03-09 09:24:04.168412793 +0000 UTC m=+1009.198613545" Mar 09 09:24:04 crc kubenswrapper[4792]: I0309 09:24:04.171682 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-67d996989d-4775c" podStartSLOduration=6.437076871 podStartE2EDuration="32.171673158s" podCreationTimestamp="2026-03-09 09:23:32 +0000 UTC" firstStartedPulling="2026-03-09 09:23:35.304810783 +0000 UTC m=+980.335011535" lastFinishedPulling="2026-03-09 09:24:01.03940707 +0000 UTC m=+1006.069607822" observedRunningTime="2026-03-09 09:24:04.165958151 +0000 UTC m=+1009.196158903" watchObservedRunningTime="2026-03-09 09:24:04.171673158 +0000 UTC m=+1009.201873910" Mar 09 09:24:04 crc kubenswrapper[4792]: I0309 09:24:04.204003 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-jhmcx" podStartSLOduration=6.269113651 podStartE2EDuration="32.203986243s" podCreationTimestamp="2026-03-09 09:23:32 +0000 UTC" firstStartedPulling="2026-03-09 09:23:35.104693952 +0000 UTC m=+980.134894704" lastFinishedPulling="2026-03-09 09:24:01.039566544 +0000 UTC m=+1006.069767296" observedRunningTime="2026-03-09 09:24:04.199249575 +0000 UTC m=+1009.229450327" watchObservedRunningTime="2026-03-09 09:24:04.203986243 +0000 UTC m=+1009.234187005" Mar 09 09:24:04 crc kubenswrapper[4792]: I0309 09:24:04.221157 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-64db6967f8-tfz6b" podStartSLOduration=6.073365489 podStartE2EDuration="32.221139535s" podCreationTimestamp="2026-03-09 09:23:32 +0000 UTC" 
firstStartedPulling="2026-03-09 09:23:34.900271816 +0000 UTC m=+979.930472568" lastFinishedPulling="2026-03-09 09:24:01.048045862 +0000 UTC m=+1006.078246614" observedRunningTime="2026-03-09 09:24:04.213644725 +0000 UTC m=+1009.243845487" watchObservedRunningTime="2026-03-09 09:24:04.221139535 +0000 UTC m=+1009.251340297" Mar 09 09:24:04 crc kubenswrapper[4792]: I0309 09:24:04.240767 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-ckrbc" podStartSLOduration=6.294184214 podStartE2EDuration="32.240750008s" podCreationTimestamp="2026-03-09 09:23:32 +0000 UTC" firstStartedPulling="2026-03-09 09:23:35.091293521 +0000 UTC m=+980.121494273" lastFinishedPulling="2026-03-09 09:24:01.037859315 +0000 UTC m=+1006.068060067" observedRunningTime="2026-03-09 09:24:04.237960547 +0000 UTC m=+1009.268161299" watchObservedRunningTime="2026-03-09 09:24:04.240750008 +0000 UTC m=+1009.270950760" Mar 09 09:24:04 crc kubenswrapper[4792]: I0309 09:24:04.298997 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-7fdsl" podStartSLOduration=6.359545905 podStartE2EDuration="32.29898193s" podCreationTimestamp="2026-03-09 09:23:32 +0000 UTC" firstStartedPulling="2026-03-09 09:23:34.561853744 +0000 UTC m=+979.592054496" lastFinishedPulling="2026-03-09 09:24:00.501289769 +0000 UTC m=+1005.531490521" observedRunningTime="2026-03-09 09:24:04.294603822 +0000 UTC m=+1009.324804574" watchObservedRunningTime="2026-03-09 09:24:04.29898193 +0000 UTC m=+1009.329182682" Mar 09 09:24:04 crc kubenswrapper[4792]: I0309 09:24:04.299231 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-z4lgh" podStartSLOduration=3.393995833 podStartE2EDuration="31.299226187s" podCreationTimestamp="2026-03-09 09:23:33 +0000 UTC" 
firstStartedPulling="2026-03-09 09:23:35.655017661 +0000 UTC m=+980.685218413" lastFinishedPulling="2026-03-09 09:24:03.560248015 +0000 UTC m=+1008.590448767" observedRunningTime="2026-03-09 09:24:04.270669893 +0000 UTC m=+1009.300870645" watchObservedRunningTime="2026-03-09 09:24:04.299226187 +0000 UTC m=+1009.329426939" Mar 09 09:24:04 crc kubenswrapper[4792]: I0309 09:24:04.697828 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fe547e1c-cb50-4541-b867-5154dae69ec3-cert\") pod \"infra-operator-controller-manager-f7fcc58b9-swtlr\" (UID: \"fe547e1c-cb50-4541-b867-5154dae69ec3\") " pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-swtlr" Mar 09 09:24:04 crc kubenswrapper[4792]: I0309 09:24:04.720936 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fe547e1c-cb50-4541-b867-5154dae69ec3-cert\") pod \"infra-operator-controller-manager-f7fcc58b9-swtlr\" (UID: \"fe547e1c-cb50-4541-b867-5154dae69ec3\") " pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-swtlr" Mar 09 09:24:04 crc kubenswrapper[4792]: I0309 09:24:04.886491 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-swtlr" Mar 09 09:24:05 crc kubenswrapper[4792]: I0309 09:24:05.544967 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-f7fcc58b9-swtlr"] Mar 09 09:24:05 crc kubenswrapper[4792]: I0309 09:24:05.719897 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/e42c0d5f-7c0c-420f-a14b-59316b524101-webhook-certs\") pod \"openstack-operator-controller-manager-59b6c9788f-qh4rf\" (UID: \"e42c0d5f-7c0c-420f-a14b-59316b524101\") " pod="openstack-operators/openstack-operator-controller-manager-59b6c9788f-qh4rf" Mar 09 09:24:05 crc kubenswrapper[4792]: I0309 09:24:05.725159 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/e42c0d5f-7c0c-420f-a14b-59316b524101-webhook-certs\") pod \"openstack-operator-controller-manager-59b6c9788f-qh4rf\" (UID: \"e42c0d5f-7c0c-420f-a14b-59316b524101\") " pod="openstack-operators/openstack-operator-controller-manager-59b6c9788f-qh4rf" Mar 09 09:24:05 crc kubenswrapper[4792]: I0309 09:24:05.944965 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-59b6c9788f-qh4rf" Mar 09 09:24:06 crc kubenswrapper[4792]: I0309 09:24:06.136655 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-swtlr" event={"ID":"fe547e1c-cb50-4541-b867-5154dae69ec3","Type":"ContainerStarted","Data":"530ccd7087185f2c2558758ab78fbe5ae74e825283d1333fca64440b179df21f"} Mar 09 09:24:06 crc kubenswrapper[4792]: I0309 09:24:06.454100 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-59b6c9788f-qh4rf"] Mar 09 09:24:06 crc kubenswrapper[4792]: W0309 09:24:06.470586 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode42c0d5f_7c0c_420f_a14b_59316b524101.slice/crio-a6af0d167b6664349f15bed9bb175ec9b42851455730af81cb618b63caa840db WatchSource:0}: Error finding container a6af0d167b6664349f15bed9bb175ec9b42851455730af81cb618b63caa840db: Status 404 returned error can't find the container with id a6af0d167b6664349f15bed9bb175ec9b42851455730af81cb618b63caa840db Mar 09 09:24:07 crc kubenswrapper[4792]: I0309 09:24:07.148921 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-vhf7p" event={"ID":"533287c3-78f0-46ea-baa9-fafb1ce7615b","Type":"ContainerStarted","Data":"f27c75a82a6ad2f733c7499cb2186a4aa2e1eeb3997b47de43a70920e18da807"} Mar 09 09:24:07 crc kubenswrapper[4792]: I0309 09:24:07.149536 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-vhf7p" Mar 09 09:24:07 crc kubenswrapper[4792]: I0309 09:24:07.155228 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550804-98tkj" 
event={"ID":"f1eec157-3f1a-4ca8-afb9-7dc14b7bc433","Type":"ContainerStarted","Data":"97a93524fa266ceaa10e471f79bcccc036fd79b65c803c32c104f10b86e35c1d"} Mar 09 09:24:07 crc kubenswrapper[4792]: I0309 09:24:07.157682 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-59b6c9788f-qh4rf" event={"ID":"e42c0d5f-7c0c-420f-a14b-59316b524101","Type":"ContainerStarted","Data":"d3e1c2dc48687b165172d7697ac66108b2ad85c90b2a8f4af661456badb07afd"} Mar 09 09:24:07 crc kubenswrapper[4792]: I0309 09:24:07.157712 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-59b6c9788f-qh4rf" event={"ID":"e42c0d5f-7c0c-420f-a14b-59316b524101","Type":"ContainerStarted","Data":"a6af0d167b6664349f15bed9bb175ec9b42851455730af81cb618b63caa840db"} Mar 09 09:24:07 crc kubenswrapper[4792]: I0309 09:24:07.157821 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-59b6c9788f-qh4rf" Mar 09 09:24:07 crc kubenswrapper[4792]: I0309 09:24:07.159554 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bdllp" event={"ID":"ea84e439-1556-4df6-81bf-0042e1554901","Type":"ContainerStarted","Data":"c012c80daaf47bd044b3b3b9f4de1479df4b268dd237d1601f508140add9df10"} Mar 09 09:24:07 crc kubenswrapper[4792]: I0309 09:24:07.167057 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-vhf7p" podStartSLOduration=2.786318779 podStartE2EDuration="34.167041615s" podCreationTimestamp="2026-03-09 09:23:33 +0000 UTC" firstStartedPulling="2026-03-09 09:23:35.437912014 +0000 UTC m=+980.468112756" lastFinishedPulling="2026-03-09 09:24:06.81863485 +0000 UTC m=+1011.848835592" observedRunningTime="2026-03-09 09:24:07.164941414 +0000 UTC m=+1012.195142166" watchObservedRunningTime="2026-03-09 
09:24:07.167041615 +0000 UTC m=+1012.197242367" Mar 09 09:24:07 crc kubenswrapper[4792]: I0309 09:24:07.281231 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29550804-98tkj" podStartSLOduration=4.597505579 podStartE2EDuration="7.281212643s" podCreationTimestamp="2026-03-09 09:24:00 +0000 UTC" firstStartedPulling="2026-03-09 09:24:03.866741315 +0000 UTC m=+1008.896942067" lastFinishedPulling="2026-03-09 09:24:06.550448389 +0000 UTC m=+1011.580649131" observedRunningTime="2026-03-09 09:24:07.227448021 +0000 UTC m=+1012.257648773" watchObservedRunningTime="2026-03-09 09:24:07.281212643 +0000 UTC m=+1012.311413395" Mar 09 09:24:07 crc kubenswrapper[4792]: I0309 09:24:07.281559 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-59b6c9788f-qh4rf" podStartSLOduration=34.281552642 podStartE2EDuration="34.281552642s" podCreationTimestamp="2026-03-09 09:23:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:24:07.280724069 +0000 UTC m=+1012.310924831" watchObservedRunningTime="2026-03-09 09:24:07.281552642 +0000 UTC m=+1012.311753394" Mar 09 09:24:08 crc kubenswrapper[4792]: I0309 09:24:08.165671 4792 generic.go:334] "Generic (PLEG): container finished" podID="f1eec157-3f1a-4ca8-afb9-7dc14b7bc433" containerID="97a93524fa266ceaa10e471f79bcccc036fd79b65c803c32c104f10b86e35c1d" exitCode=0 Mar 09 09:24:08 crc kubenswrapper[4792]: I0309 09:24:08.166649 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550804-98tkj" event={"ID":"f1eec157-3f1a-4ca8-afb9-7dc14b7bc433","Type":"ContainerDied","Data":"97a93524fa266ceaa10e471f79bcccc036fd79b65c803c32c104f10b86e35c1d"} Mar 09 09:24:08 crc kubenswrapper[4792]: I0309 09:24:08.168427 4792 generic.go:334] "Generic (PLEG): container finished" 
podID="ea84e439-1556-4df6-81bf-0042e1554901" containerID="c012c80daaf47bd044b3b3b9f4de1479df4b268dd237d1601f508140add9df10" exitCode=0 Mar 09 09:24:08 crc kubenswrapper[4792]: I0309 09:24:08.168467 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bdllp" event={"ID":"ea84e439-1556-4df6-81bf-0042e1554901","Type":"ContainerDied","Data":"c012c80daaf47bd044b3b3b9f4de1479df4b268dd237d1601f508140add9df10"} Mar 09 09:24:09 crc kubenswrapper[4792]: I0309 09:24:09.444777 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550804-98tkj" Mar 09 09:24:09 crc kubenswrapper[4792]: I0309 09:24:09.568777 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ghnhl\" (UniqueName: \"kubernetes.io/projected/f1eec157-3f1a-4ca8-afb9-7dc14b7bc433-kube-api-access-ghnhl\") pod \"f1eec157-3f1a-4ca8-afb9-7dc14b7bc433\" (UID: \"f1eec157-3f1a-4ca8-afb9-7dc14b7bc433\") " Mar 09 09:24:09 crc kubenswrapper[4792]: I0309 09:24:09.573839 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f1eec157-3f1a-4ca8-afb9-7dc14b7bc433-kube-api-access-ghnhl" (OuterVolumeSpecName: "kube-api-access-ghnhl") pod "f1eec157-3f1a-4ca8-afb9-7dc14b7bc433" (UID: "f1eec157-3f1a-4ca8-afb9-7dc14b7bc433"). InnerVolumeSpecName "kube-api-access-ghnhl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:24:09 crc kubenswrapper[4792]: I0309 09:24:09.670387 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ghnhl\" (UniqueName: \"kubernetes.io/projected/f1eec157-3f1a-4ca8-afb9-7dc14b7bc433-kube-api-access-ghnhl\") on node \"crc\" DevicePath \"\"" Mar 09 09:24:10 crc kubenswrapper[4792]: I0309 09:24:10.182321 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550804-98tkj" event={"ID":"f1eec157-3f1a-4ca8-afb9-7dc14b7bc433","Type":"ContainerDied","Data":"cf78384b45d98ca7f3e48128d369f265c748d4f5efc3eab1b63ebdb65953a824"} Mar 09 09:24:10 crc kubenswrapper[4792]: I0309 09:24:10.182363 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cf78384b45d98ca7f3e48128d369f265c748d4f5efc3eab1b63ebdb65953a824" Mar 09 09:24:10 crc kubenswrapper[4792]: I0309 09:24:10.182389 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550804-98tkj" Mar 09 09:24:10 crc kubenswrapper[4792]: I0309 09:24:10.505653 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550798-vnc97"] Mar 09 09:24:10 crc kubenswrapper[4792]: I0309 09:24:10.512953 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550798-vnc97"] Mar 09 09:24:11 crc kubenswrapper[4792]: I0309 09:24:11.704721 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf209962-051d-4ac0-9cc9-ba55de9e97cc" path="/var/lib/kubelet/pods/cf209962-051d-4ac0-9cc9-ba55de9e97cc/volumes" Mar 09 09:24:12 crc kubenswrapper[4792]: I0309 09:24:12.840670 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-7fdsl" Mar 09 09:24:12 crc kubenswrapper[4792]: I0309 09:24:12.932329 4792 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-64db6967f8-tfz6b" Mar 09 09:24:13 crc kubenswrapper[4792]: I0309 09:24:13.082053 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-jhmcx" Mar 09 09:24:13 crc kubenswrapper[4792]: I0309 09:24:13.214135 4792 patch_prober.go:28] interesting pod/machine-config-daemon-97tth container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 09:24:13 crc kubenswrapper[4792]: I0309 09:24:13.214187 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 09:24:13 crc kubenswrapper[4792]: I0309 09:24:13.214227 4792 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-97tth" Mar 09 09:24:13 crc kubenswrapper[4792]: I0309 09:24:13.214801 4792 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"db2023e6b3ec28be4276e65d0d9cd090ae22fa8851acb261970e9cecf046c144"} pod="openshift-machine-config-operator/machine-config-daemon-97tth" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 09 09:24:13 crc kubenswrapper[4792]: I0309 09:24:13.214856 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" 
containerName="machine-config-daemon" containerID="cri-o://db2023e6b3ec28be4276e65d0d9cd090ae22fa8851acb261970e9cecf046c144" gracePeriod=600 Mar 09 09:24:13 crc kubenswrapper[4792]: I0309 09:24:13.216372 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-67d996989d-4775c" Mar 09 09:24:13 crc kubenswrapper[4792]: I0309 09:24:13.241180 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-r44dt" event={"ID":"c28488b2-919b-4307-9a70-b2f5f1280e2a","Type":"ContainerStarted","Data":"a3f032f9d79e9c279188a7ad2c833d846a8338c47285222965c793a75fbeebf3"} Mar 09 09:24:13 crc kubenswrapper[4792]: I0309 09:24:13.241777 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-r44dt" Mar 09 09:24:13 crc kubenswrapper[4792]: I0309 09:24:13.251535 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bdllp" event={"ID":"ea84e439-1556-4df6-81bf-0042e1554901","Type":"ContainerStarted","Data":"bd94222d429d2e6268f2c6a00365456caa95f949339c66f3da6e238f6f081aa1"} Mar 09 09:24:13 crc kubenswrapper[4792]: I0309 09:24:13.258099 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-54688575f-5k4db" event={"ID":"8fd39edc-ff27-4feb-b138-ee11a440c0ca","Type":"ContainerStarted","Data":"bf3abec17da4402040b144bb84a435cdaefff6ba24d90aacf94eb1125e39c995"} Mar 09 09:24:13 crc kubenswrapper[4792]: I0309 09:24:13.258295 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-54688575f-5k4db" Mar 09 09:24:13 crc kubenswrapper[4792]: I0309 09:24:13.260049 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f64bd8c755s6r4" event={"ID":"9ca7aa92-3367-4c2e-a86e-33ba41fe81cb","Type":"ContainerStarted","Data":"720205d4d051163ec3b4656b00869cd1ca0e8bf3b0c8beb5da3f3a552a70aafe"} Mar 09 09:24:13 crc kubenswrapper[4792]: I0309 09:24:13.260122 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f64bd8c755s6r4" Mar 09 09:24:13 crc kubenswrapper[4792]: I0309 09:24:13.263318 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-6db6876945-9jfp7" event={"ID":"d9dc8da2-0584-4db0-ad3a-f1c59c2f6028","Type":"ContainerStarted","Data":"2a9092a771432f76ef745add0ada16a30cfa40480a57ae45c82d83d0c86a4a0d"} Mar 09 09:24:13 crc kubenswrapper[4792]: I0309 09:24:13.263531 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-6db6876945-9jfp7" Mar 09 09:24:13 crc kubenswrapper[4792]: I0309 09:24:13.266389 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-swtlr" event={"ID":"fe547e1c-cb50-4541-b867-5154dae69ec3","Type":"ContainerStarted","Data":"4c85f112efdcfdb0c51dc1d6cdf80a1d34c2d3634270d3b1e339665a38d81cdc"} Mar 09 09:24:13 crc kubenswrapper[4792]: I0309 09:24:13.266531 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-swtlr" Mar 09 09:24:13 crc kubenswrapper[4792]: I0309 09:24:13.267849 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-r5d6p" event={"ID":"20b2fb83-c944-4553-b506-9ff3c9c199f5","Type":"ContainerStarted","Data":"1392aec3a1e7478daac654db6665ae2e4bafbcee4bde06083657f9055079f4d3"} Mar 09 09:24:13 crc kubenswrapper[4792]: I0309 09:24:13.268034 4792 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-r5d6p" Mar 09 09:24:13 crc kubenswrapper[4792]: I0309 09:24:13.270950 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-vj8ds" event={"ID":"41f3c31e-77a7-4912-a933-04b32c0db0dc","Type":"ContainerStarted","Data":"b2ee3812b6c4aa2000cc5b1f76a1aca3552a3bfe2267989ec5cc4db94dc07dc3"} Mar 09 09:24:13 crc kubenswrapper[4792]: I0309 09:24:13.271137 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-vj8ds" Mar 09 09:24:13 crc kubenswrapper[4792]: I0309 09:24:13.273304 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-cf99c678f-lw2kp" event={"ID":"b1140422-6cf3-4e92-95e2-6ea31179de28","Type":"ContainerStarted","Data":"8aaa26016dd5d2af13fff5f0938ff001229d67666a1334911310fb62b7d425b4"} Mar 09 09:24:13 crc kubenswrapper[4792]: I0309 09:24:13.273489 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-cf99c678f-lw2kp" Mar 09 09:24:13 crc kubenswrapper[4792]: I0309 09:24:13.274323 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-dktrj" event={"ID":"55f715a3-ef6e-40d8-9f9b-3100b2847b8d","Type":"ContainerStarted","Data":"a861aa4c0931e46636532037e71aa21f944884232e346edad17e2bb503e93fa7"} Mar 09 09:24:13 crc kubenswrapper[4792]: I0309 09:24:13.274487 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-dktrj" Mar 09 09:24:13 crc kubenswrapper[4792]: I0309 09:24:13.275372 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/ironic-operator-controller-manager-545456dc4-x7n9b" event={"ID":"ac60ffe8-71d2-4ea1-bbc5-d377fc70d940","Type":"ContainerStarted","Data":"c9dc781dc98e2c848c28cd062595d5770cdec1bccbf5fed1a9a50ce67ac451c1"} Mar 09 09:24:13 crc kubenswrapper[4792]: I0309 09:24:13.275519 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-545456dc4-x7n9b" Mar 09 09:24:13 crc kubenswrapper[4792]: I0309 09:24:13.280402 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-ckrbc" Mar 09 09:24:13 crc kubenswrapper[4792]: I0309 09:24:13.302622 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-54688575f-5k4db" podStartSLOduration=4.11707789 podStartE2EDuration="41.302593352s" podCreationTimestamp="2026-03-09 09:23:32 +0000 UTC" firstStartedPulling="2026-03-09 09:23:35.204669906 +0000 UTC m=+980.234870648" lastFinishedPulling="2026-03-09 09:24:12.390185358 +0000 UTC m=+1017.420386110" observedRunningTime="2026-03-09 09:24:13.29603327 +0000 UTC m=+1018.326234022" watchObservedRunningTime="2026-03-09 09:24:13.302593352 +0000 UTC m=+1018.332794104" Mar 09 09:24:13 crc kubenswrapper[4792]: I0309 09:24:13.391845 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-bdllp" podStartSLOduration=15.142993964 podStartE2EDuration="23.39182694s" podCreationTimestamp="2026-03-09 09:23:50 +0000 UTC" firstStartedPulling="2026-03-09 09:24:04.101148097 +0000 UTC m=+1009.131348839" lastFinishedPulling="2026-03-09 09:24:12.349981063 +0000 UTC m=+1017.380181815" observedRunningTime="2026-03-09 09:24:13.32785062 +0000 UTC m=+1018.358051382" watchObservedRunningTime="2026-03-09 09:24:13.39182694 +0000 UTC m=+1018.422027682" Mar 09 09:24:13 crc kubenswrapper[4792]: I0309 
09:24:13.392107 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-r44dt" podStartSLOduration=4.471353727 podStartE2EDuration="41.392102099s" podCreationTimestamp="2026-03-09 09:23:32 +0000 UTC" firstStartedPulling="2026-03-09 09:23:35.443458006 +0000 UTC m=+980.473658758" lastFinishedPulling="2026-03-09 09:24:12.364206378 +0000 UTC m=+1017.394407130" observedRunningTime="2026-03-09 09:24:13.387623777 +0000 UTC m=+1018.417824529" watchObservedRunningTime="2026-03-09 09:24:13.392102099 +0000 UTC m=+1018.422302841" Mar 09 09:24:13 crc kubenswrapper[4792]: I0309 09:24:13.540824 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f64bd8c755s6r4" podStartSLOduration=33.017941059 podStartE2EDuration="41.540805676s" podCreationTimestamp="2026-03-09 09:23:32 +0000 UTC" firstStartedPulling="2026-03-09 09:24:03.826955101 +0000 UTC m=+1008.857155843" lastFinishedPulling="2026-03-09 09:24:12.349819708 +0000 UTC m=+1017.380020460" observedRunningTime="2026-03-09 09:24:13.538513759 +0000 UTC m=+1018.568714511" watchObservedRunningTime="2026-03-09 09:24:13.540805676 +0000 UTC m=+1018.571006428" Mar 09 09:24:13 crc kubenswrapper[4792]: I0309 09:24:13.543995 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-6db6876945-9jfp7" podStartSLOduration=3.164675896 podStartE2EDuration="41.543980668s" podCreationTimestamp="2026-03-09 09:23:32 +0000 UTC" firstStartedPulling="2026-03-09 09:23:34.045258812 +0000 UTC m=+979.075459564" lastFinishedPulling="2026-03-09 09:24:12.424563584 +0000 UTC m=+1017.454764336" observedRunningTime="2026-03-09 09:24:13.474129926 +0000 UTC m=+1018.504330678" watchObservedRunningTime="2026-03-09 09:24:13.543980668 +0000 UTC m=+1018.574181420" Mar 09 09:24:13 crc kubenswrapper[4792]: I0309 
09:24:13.759147 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-75684d597f-8vt8s" Mar 09 09:24:13 crc kubenswrapper[4792]: I0309 09:24:13.807195 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-cf99c678f-lw2kp" podStartSLOduration=4.40820338 podStartE2EDuration="41.807177193s" podCreationTimestamp="2026-03-09 09:23:32 +0000 UTC" firstStartedPulling="2026-03-09 09:23:34.964868065 +0000 UTC m=+979.995068817" lastFinishedPulling="2026-03-09 09:24:12.363841878 +0000 UTC m=+1017.394042630" observedRunningTime="2026-03-09 09:24:13.803964479 +0000 UTC m=+1018.834165231" watchObservedRunningTime="2026-03-09 09:24:13.807177193 +0000 UTC m=+1018.837377945" Mar 09 09:24:13 crc kubenswrapper[4792]: I0309 09:24:13.807643 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-r5d6p" podStartSLOduration=4.144007335 podStartE2EDuration="41.807636736s" podCreationTimestamp="2026-03-09 09:23:32 +0000 UTC" firstStartedPulling="2026-03-09 09:23:34.761343895 +0000 UTC m=+979.791544647" lastFinishedPulling="2026-03-09 09:24:12.424973296 +0000 UTC m=+1017.455174048" observedRunningTime="2026-03-09 09:24:13.706103898 +0000 UTC m=+1018.736304650" watchObservedRunningTime="2026-03-09 09:24:13.807636736 +0000 UTC m=+1018.837837488" Mar 09 09:24:13 crc kubenswrapper[4792]: I0309 09:24:13.867124 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-648564c9fc-z5tts" Mar 09 09:24:13 crc kubenswrapper[4792]: I0309 09:24:13.923344 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-vhf7p" Mar 09 09:24:13 crc kubenswrapper[4792]: I0309 09:24:13.958291 4792 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-545456dc4-x7n9b" podStartSLOduration=4.575277544 podStartE2EDuration="41.95826704s" podCreationTimestamp="2026-03-09 09:23:32 +0000 UTC" firstStartedPulling="2026-03-09 09:23:34.982281884 +0000 UTC m=+980.012482636" lastFinishedPulling="2026-03-09 09:24:12.36527138 +0000 UTC m=+1017.395472132" observedRunningTime="2026-03-09 09:24:13.949609227 +0000 UTC m=+1018.979809989" watchObservedRunningTime="2026-03-09 09:24:13.95826704 +0000 UTC m=+1018.988467792" Mar 09 09:24:13 crc kubenswrapper[4792]: I0309 09:24:13.992539 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-z4lgh" Mar 09 09:24:14 crc kubenswrapper[4792]: I0309 09:24:14.004911 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-vj8ds" podStartSLOduration=4.240948055 podStartE2EDuration="41.004891893s" podCreationTimestamp="2026-03-09 09:23:33 +0000 UTC" firstStartedPulling="2026-03-09 09:23:35.600841628 +0000 UTC m=+980.631042380" lastFinishedPulling="2026-03-09 09:24:12.364785466 +0000 UTC m=+1017.394986218" observedRunningTime="2026-03-09 09:24:14.003973126 +0000 UTC m=+1019.034173878" watchObservedRunningTime="2026-03-09 09:24:14.004891893 +0000 UTC m=+1019.035092645" Mar 09 09:24:14 crc kubenswrapper[4792]: I0309 09:24:14.106297 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-swtlr" podStartSLOduration=35.324789248 podStartE2EDuration="42.106279677s" podCreationTimestamp="2026-03-09 09:23:32 +0000 UTC" firstStartedPulling="2026-03-09 09:24:05.569670568 +0000 UTC m=+1010.599871320" lastFinishedPulling="2026-03-09 09:24:12.351160997 +0000 UTC m=+1017.381361749" observedRunningTime="2026-03-09 
09:24:14.10092921 +0000 UTC m=+1019.131129962" watchObservedRunningTime="2026-03-09 09:24:14.106279677 +0000 UTC m=+1019.136480429" Mar 09 09:24:14 crc kubenswrapper[4792]: I0309 09:24:14.284329 4792 generic.go:334] "Generic (PLEG): container finished" podID="bd11045a-d746-4b42-872c-8b8d1dd2d515" containerID="db2023e6b3ec28be4276e65d0d9cd090ae22fa8851acb261970e9cecf046c144" exitCode=0 Mar 09 09:24:14 crc kubenswrapper[4792]: I0309 09:24:14.284382 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-97tth" event={"ID":"bd11045a-d746-4b42-872c-8b8d1dd2d515","Type":"ContainerDied","Data":"db2023e6b3ec28be4276e65d0d9cd090ae22fa8851acb261970e9cecf046c144"} Mar 09 09:24:14 crc kubenswrapper[4792]: I0309 09:24:14.284412 4792 scope.go:117] "RemoveContainer" containerID="7ebe6e06d3acdb8dc390125e4fb4991f20773eff67765cf4ee7e42fe0e4d4167" Mar 09 09:24:14 crc kubenswrapper[4792]: I0309 09:24:14.288755 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-dpvjg" event={"ID":"9063ee68-9840-4f35-8d4d-44ab947477d5","Type":"ContainerStarted","Data":"e5ffe5ef2064998d0c6997cd635f1a923cf25476603abc4f0ec1017fa4749c4f"} Mar 09 09:24:14 crc kubenswrapper[4792]: I0309 09:24:14.289058 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-dpvjg" Mar 09 09:24:14 crc kubenswrapper[4792]: I0309 09:24:14.306857 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-dktrj" podStartSLOduration=4.489570969 podStartE2EDuration="42.30683517s" podCreationTimestamp="2026-03-09 09:23:32 +0000 UTC" firstStartedPulling="2026-03-09 09:23:34.608377714 +0000 UTC m=+979.638578466" lastFinishedPulling="2026-03-09 09:24:12.425641915 +0000 UTC m=+1017.455842667" observedRunningTime="2026-03-09 
09:24:14.204295612 +0000 UTC m=+1019.234496354" watchObservedRunningTime="2026-03-09 09:24:14.30683517 +0000 UTC m=+1019.337035922" Mar 09 09:24:14 crc kubenswrapper[4792]: I0309 09:24:14.741431 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-dpvjg" podStartSLOduration=5.061261772 podStartE2EDuration="42.741415484s" podCreationTimestamp="2026-03-09 09:23:32 +0000 UTC" firstStartedPulling="2026-03-09 09:23:35.420197617 +0000 UTC m=+980.450398369" lastFinishedPulling="2026-03-09 09:24:13.100351329 +0000 UTC m=+1018.130552081" observedRunningTime="2026-03-09 09:24:14.729558488 +0000 UTC m=+1019.759759260" watchObservedRunningTime="2026-03-09 09:24:14.741415484 +0000 UTC m=+1019.771616236" Mar 09 09:24:15 crc kubenswrapper[4792]: I0309 09:24:15.297793 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-97tth" event={"ID":"bd11045a-d746-4b42-872c-8b8d1dd2d515","Type":"ContainerStarted","Data":"338559ddc83aaf62922dd4a2c3548afe39aad0e71765a9e21715a3c207fc6015"} Mar 09 09:24:15 crc kubenswrapper[4792]: I0309 09:24:15.299757 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-kjtwp" event={"ID":"92a6c902-5189-421e-b1a1-ed3e64e7bca4","Type":"ContainerStarted","Data":"727a9219cdab0454bd37dcadbf752accb6fdc0832646f938f3e49c7a73109369"} Mar 09 09:24:15 crc kubenswrapper[4792]: I0309 09:24:15.396920 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-kjtwp" podStartSLOduration=3.8581402320000002 podStartE2EDuration="42.396900826s" podCreationTimestamp="2026-03-09 09:23:33 +0000 UTC" firstStartedPulling="2026-03-09 09:23:35.559275392 +0000 UTC m=+980.589476144" lastFinishedPulling="2026-03-09 09:24:14.098035986 +0000 UTC m=+1019.128236738" 
observedRunningTime="2026-03-09 09:24:15.394555789 +0000 UTC m=+1020.424756541" watchObservedRunningTime="2026-03-09 09:24:15.396900826 +0000 UTC m=+1020.427101578" Mar 09 09:24:15 crc kubenswrapper[4792]: E0309 09:24:15.677010 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:1b9074a4ce16396d8bd2d30a475fc8c2f004f75a023e3eef8950661e89c0bcc6\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-5fdb694969-mzpqx" podUID="d4313901-b530-42e8-a975-d21aefbc0506" Mar 09 09:24:15 crc kubenswrapper[4792]: I0309 09:24:15.954008 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-59b6c9788f-qh4rf" Mar 09 09:24:19 crc kubenswrapper[4792]: I0309 09:24:19.220408 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f64bd8c755s6r4" Mar 09 09:24:21 crc kubenswrapper[4792]: I0309 09:24:21.361698 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-bdllp" Mar 09 09:24:21 crc kubenswrapper[4792]: I0309 09:24:21.362301 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-bdllp" Mar 09 09:24:22 crc kubenswrapper[4792]: I0309 09:24:22.401714 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-bdllp" podUID="ea84e439-1556-4df6-81bf-0042e1554901" containerName="registry-server" probeResult="failure" output=< Mar 09 09:24:22 crc kubenswrapper[4792]: timeout: failed to connect service ":50051" within 1s Mar 09 09:24:22 crc kubenswrapper[4792]: > Mar 09 09:24:22 crc kubenswrapper[4792]: I0309 09:24:22.820829 4792 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-6db6876945-9jfp7" Mar 09 09:24:22 crc kubenswrapper[4792]: I0309 09:24:22.880669 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-r5d6p" Mar 09 09:24:22 crc kubenswrapper[4792]: I0309 09:24:22.957653 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-cf99c678f-lw2kp" Mar 09 09:24:23 crc kubenswrapper[4792]: I0309 09:24:23.130164 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-545456dc4-x7n9b" Mar 09 09:24:23 crc kubenswrapper[4792]: I0309 09:24:23.150263 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-dktrj" Mar 09 09:24:23 crc kubenswrapper[4792]: I0309 09:24:23.369103 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-dpvjg" Mar 09 09:24:23 crc kubenswrapper[4792]: I0309 09:24:23.390469 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-54688575f-5k4db" Mar 09 09:24:23 crc kubenswrapper[4792]: I0309 09:24:23.514642 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-r44dt" Mar 09 09:24:24 crc kubenswrapper[4792]: I0309 09:24:24.302758 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-vj8ds" Mar 09 09:24:24 crc kubenswrapper[4792]: I0309 09:24:24.893789 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-swtlr" Mar 09 09:24:27 crc kubenswrapper[4792]: I0309 09:24:27.040594 4792 scope.go:117] "RemoveContainer" containerID="0e6ec5fc1a8e9497d34c0082dc535ea6ffdfa063b37c982cb2bec605f6c718fb" Mar 09 09:24:28 crc kubenswrapper[4792]: I0309 09:24:28.391491 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-5fdb694969-mzpqx" event={"ID":"d4313901-b530-42e8-a975-d21aefbc0506","Type":"ContainerStarted","Data":"e7aa3add52a759bffdc0ba0af2ec1a0b21f9f81e57808d57027d94c712568a7c"} Mar 09 09:24:28 crc kubenswrapper[4792]: I0309 09:24:28.391765 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-5fdb694969-mzpqx" Mar 09 09:24:28 crc kubenswrapper[4792]: I0309 09:24:28.408737 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-5fdb694969-mzpqx" podStartSLOduration=2.9741712319999998 podStartE2EDuration="55.408722645s" podCreationTimestamp="2026-03-09 09:23:33 +0000 UTC" firstStartedPulling="2026-03-09 09:23:35.618759312 +0000 UTC m=+980.648960064" lastFinishedPulling="2026-03-09 09:24:28.053310725 +0000 UTC m=+1033.083511477" observedRunningTime="2026-03-09 09:24:28.404496381 +0000 UTC m=+1033.434697153" watchObservedRunningTime="2026-03-09 09:24:28.408722645 +0000 UTC m=+1033.438923387" Mar 09 09:24:31 crc kubenswrapper[4792]: I0309 09:24:31.419228 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-bdllp" Mar 09 09:24:31 crc kubenswrapper[4792]: I0309 09:24:31.472998 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-bdllp" Mar 09 09:24:31 crc kubenswrapper[4792]: I0309 09:24:31.657227 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/community-operators-bdllp"] Mar 09 09:24:33 crc kubenswrapper[4792]: I0309 09:24:33.420081 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-bdllp" podUID="ea84e439-1556-4df6-81bf-0042e1554901" containerName="registry-server" containerID="cri-o://bd94222d429d2e6268f2c6a00365456caa95f949339c66f3da6e238f6f081aa1" gracePeriod=2 Mar 09 09:24:33 crc kubenswrapper[4792]: I0309 09:24:33.785773 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bdllp" Mar 09 09:24:33 crc kubenswrapper[4792]: I0309 09:24:33.903422 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ea84e439-1556-4df6-81bf-0042e1554901-catalog-content\") pod \"ea84e439-1556-4df6-81bf-0042e1554901\" (UID: \"ea84e439-1556-4df6-81bf-0042e1554901\") " Mar 09 09:24:33 crc kubenswrapper[4792]: I0309 09:24:33.903477 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4sk9b\" (UniqueName: \"kubernetes.io/projected/ea84e439-1556-4df6-81bf-0042e1554901-kube-api-access-4sk9b\") pod \"ea84e439-1556-4df6-81bf-0042e1554901\" (UID: \"ea84e439-1556-4df6-81bf-0042e1554901\") " Mar 09 09:24:33 crc kubenswrapper[4792]: I0309 09:24:33.903509 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ea84e439-1556-4df6-81bf-0042e1554901-utilities\") pod \"ea84e439-1556-4df6-81bf-0042e1554901\" (UID: \"ea84e439-1556-4df6-81bf-0042e1554901\") " Mar 09 09:24:33 crc kubenswrapper[4792]: I0309 09:24:33.904570 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ea84e439-1556-4df6-81bf-0042e1554901-utilities" (OuterVolumeSpecName: "utilities") pod "ea84e439-1556-4df6-81bf-0042e1554901" (UID: 
"ea84e439-1556-4df6-81bf-0042e1554901"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:24:33 crc kubenswrapper[4792]: I0309 09:24:33.909417 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea84e439-1556-4df6-81bf-0042e1554901-kube-api-access-4sk9b" (OuterVolumeSpecName: "kube-api-access-4sk9b") pod "ea84e439-1556-4df6-81bf-0042e1554901" (UID: "ea84e439-1556-4df6-81bf-0042e1554901"). InnerVolumeSpecName "kube-api-access-4sk9b". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:24:33 crc kubenswrapper[4792]: I0309 09:24:33.952045 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-5fdb694969-mzpqx" Mar 09 09:24:33 crc kubenswrapper[4792]: I0309 09:24:33.968281 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ea84e439-1556-4df6-81bf-0042e1554901-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ea84e439-1556-4df6-81bf-0042e1554901" (UID: "ea84e439-1556-4df6-81bf-0042e1554901"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:24:34 crc kubenswrapper[4792]: I0309 09:24:34.005266 4792 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ea84e439-1556-4df6-81bf-0042e1554901-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 09:24:34 crc kubenswrapper[4792]: I0309 09:24:34.005492 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4sk9b\" (UniqueName: \"kubernetes.io/projected/ea84e439-1556-4df6-81bf-0042e1554901-kube-api-access-4sk9b\") on node \"crc\" DevicePath \"\"" Mar 09 09:24:34 crc kubenswrapper[4792]: I0309 09:24:34.005588 4792 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ea84e439-1556-4df6-81bf-0042e1554901-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 09:24:34 crc kubenswrapper[4792]: I0309 09:24:34.427736 4792 generic.go:334] "Generic (PLEG): container finished" podID="ea84e439-1556-4df6-81bf-0042e1554901" containerID="bd94222d429d2e6268f2c6a00365456caa95f949339c66f3da6e238f6f081aa1" exitCode=0 Mar 09 09:24:34 crc kubenswrapper[4792]: I0309 09:24:34.427784 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-bdllp" Mar 09 09:24:34 crc kubenswrapper[4792]: I0309 09:24:34.427804 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bdllp" event={"ID":"ea84e439-1556-4df6-81bf-0042e1554901","Type":"ContainerDied","Data":"bd94222d429d2e6268f2c6a00365456caa95f949339c66f3da6e238f6f081aa1"} Mar 09 09:24:34 crc kubenswrapper[4792]: I0309 09:24:34.429348 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bdllp" event={"ID":"ea84e439-1556-4df6-81bf-0042e1554901","Type":"ContainerDied","Data":"2a7c64478844c0c74b40239ee9b87dab9512360a321006a5fc70d17b07d3aff4"} Mar 09 09:24:34 crc kubenswrapper[4792]: I0309 09:24:34.429436 4792 scope.go:117] "RemoveContainer" containerID="bd94222d429d2e6268f2c6a00365456caa95f949339c66f3da6e238f6f081aa1" Mar 09 09:24:34 crc kubenswrapper[4792]: I0309 09:24:34.448111 4792 scope.go:117] "RemoveContainer" containerID="c012c80daaf47bd044b3b3b9f4de1479df4b268dd237d1601f508140add9df10" Mar 09 09:24:34 crc kubenswrapper[4792]: I0309 09:24:34.458817 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-bdllp"] Mar 09 09:24:34 crc kubenswrapper[4792]: I0309 09:24:34.468471 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-bdllp"] Mar 09 09:24:34 crc kubenswrapper[4792]: I0309 09:24:34.470647 4792 scope.go:117] "RemoveContainer" containerID="ebd7a8076890c7074f5edc74056da7c275b812b8eda600f67ba95f2f42f93a48" Mar 09 09:24:34 crc kubenswrapper[4792]: I0309 09:24:34.491419 4792 scope.go:117] "RemoveContainer" containerID="bd94222d429d2e6268f2c6a00365456caa95f949339c66f3da6e238f6f081aa1" Mar 09 09:24:34 crc kubenswrapper[4792]: E0309 09:24:34.491803 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"bd94222d429d2e6268f2c6a00365456caa95f949339c66f3da6e238f6f081aa1\": container with ID starting with bd94222d429d2e6268f2c6a00365456caa95f949339c66f3da6e238f6f081aa1 not found: ID does not exist" containerID="bd94222d429d2e6268f2c6a00365456caa95f949339c66f3da6e238f6f081aa1" Mar 09 09:24:34 crc kubenswrapper[4792]: I0309 09:24:34.491827 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd94222d429d2e6268f2c6a00365456caa95f949339c66f3da6e238f6f081aa1"} err="failed to get container status \"bd94222d429d2e6268f2c6a00365456caa95f949339c66f3da6e238f6f081aa1\": rpc error: code = NotFound desc = could not find container \"bd94222d429d2e6268f2c6a00365456caa95f949339c66f3da6e238f6f081aa1\": container with ID starting with bd94222d429d2e6268f2c6a00365456caa95f949339c66f3da6e238f6f081aa1 not found: ID does not exist" Mar 09 09:24:34 crc kubenswrapper[4792]: I0309 09:24:34.491846 4792 scope.go:117] "RemoveContainer" containerID="c012c80daaf47bd044b3b3b9f4de1479df4b268dd237d1601f508140add9df10" Mar 09 09:24:34 crc kubenswrapper[4792]: E0309 09:24:34.492003 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c012c80daaf47bd044b3b3b9f4de1479df4b268dd237d1601f508140add9df10\": container with ID starting with c012c80daaf47bd044b3b3b9f4de1479df4b268dd237d1601f508140add9df10 not found: ID does not exist" containerID="c012c80daaf47bd044b3b3b9f4de1479df4b268dd237d1601f508140add9df10" Mar 09 09:24:34 crc kubenswrapper[4792]: I0309 09:24:34.492018 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c012c80daaf47bd044b3b3b9f4de1479df4b268dd237d1601f508140add9df10"} err="failed to get container status \"c012c80daaf47bd044b3b3b9f4de1479df4b268dd237d1601f508140add9df10\": rpc error: code = NotFound desc = could not find container \"c012c80daaf47bd044b3b3b9f4de1479df4b268dd237d1601f508140add9df10\": container with ID 
starting with c012c80daaf47bd044b3b3b9f4de1479df4b268dd237d1601f508140add9df10 not found: ID does not exist" Mar 09 09:24:34 crc kubenswrapper[4792]: I0309 09:24:34.492030 4792 scope.go:117] "RemoveContainer" containerID="ebd7a8076890c7074f5edc74056da7c275b812b8eda600f67ba95f2f42f93a48" Mar 09 09:24:34 crc kubenswrapper[4792]: E0309 09:24:34.492222 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ebd7a8076890c7074f5edc74056da7c275b812b8eda600f67ba95f2f42f93a48\": container with ID starting with ebd7a8076890c7074f5edc74056da7c275b812b8eda600f67ba95f2f42f93a48 not found: ID does not exist" containerID="ebd7a8076890c7074f5edc74056da7c275b812b8eda600f67ba95f2f42f93a48" Mar 09 09:24:34 crc kubenswrapper[4792]: I0309 09:24:34.492238 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ebd7a8076890c7074f5edc74056da7c275b812b8eda600f67ba95f2f42f93a48"} err="failed to get container status \"ebd7a8076890c7074f5edc74056da7c275b812b8eda600f67ba95f2f42f93a48\": rpc error: code = NotFound desc = could not find container \"ebd7a8076890c7074f5edc74056da7c275b812b8eda600f67ba95f2f42f93a48\": container with ID starting with ebd7a8076890c7074f5edc74056da7c275b812b8eda600f67ba95f2f42f93a48 not found: ID does not exist" Mar 09 09:24:35 crc kubenswrapper[4792]: I0309 09:24:35.672374 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ea84e439-1556-4df6-81bf-0042e1554901" path="/var/lib/kubelet/pods/ea84e439-1556-4df6-81bf-0042e1554901/volumes" Mar 09 09:24:49 crc kubenswrapper[4792]: I0309 09:24:49.191653 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-589db6c89c-2522t"] Mar 09 09:24:49 crc kubenswrapper[4792]: E0309 09:24:49.192482 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea84e439-1556-4df6-81bf-0042e1554901" containerName="registry-server" Mar 09 09:24:49 crc 
kubenswrapper[4792]: I0309 09:24:49.192494 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea84e439-1556-4df6-81bf-0042e1554901" containerName="registry-server" Mar 09 09:24:49 crc kubenswrapper[4792]: E0309 09:24:49.192512 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea84e439-1556-4df6-81bf-0042e1554901" containerName="extract-utilities" Mar 09 09:24:49 crc kubenswrapper[4792]: I0309 09:24:49.192519 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea84e439-1556-4df6-81bf-0042e1554901" containerName="extract-utilities" Mar 09 09:24:49 crc kubenswrapper[4792]: E0309 09:24:49.192531 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea84e439-1556-4df6-81bf-0042e1554901" containerName="extract-content" Mar 09 09:24:49 crc kubenswrapper[4792]: I0309 09:24:49.192537 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea84e439-1556-4df6-81bf-0042e1554901" containerName="extract-content" Mar 09 09:24:49 crc kubenswrapper[4792]: E0309 09:24:49.192546 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1eec157-3f1a-4ca8-afb9-7dc14b7bc433" containerName="oc" Mar 09 09:24:49 crc kubenswrapper[4792]: I0309 09:24:49.192552 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1eec157-3f1a-4ca8-afb9-7dc14b7bc433" containerName="oc" Mar 09 09:24:49 crc kubenswrapper[4792]: I0309 09:24:49.192694 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea84e439-1556-4df6-81bf-0042e1554901" containerName="registry-server" Mar 09 09:24:49 crc kubenswrapper[4792]: I0309 09:24:49.192710 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1eec157-3f1a-4ca8-afb9-7dc14b7bc433" containerName="oc" Mar 09 09:24:49 crc kubenswrapper[4792]: I0309 09:24:49.193490 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-589db6c89c-2522t" Mar 09 09:24:49 crc kubenswrapper[4792]: I0309 09:24:49.202558 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-mdks4" Mar 09 09:24:49 crc kubenswrapper[4792]: I0309 09:24:49.202889 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Mar 09 09:24:49 crc kubenswrapper[4792]: I0309 09:24:49.203104 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Mar 09 09:24:49 crc kubenswrapper[4792]: I0309 09:24:49.203210 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Mar 09 09:24:49 crc kubenswrapper[4792]: I0309 09:24:49.228454 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-589db6c89c-2522t"] Mar 09 09:24:49 crc kubenswrapper[4792]: I0309 09:24:49.252285 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-86bbd886cf-n4gp5"] Mar 09 09:24:49 crc kubenswrapper[4792]: I0309 09:24:49.257346 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86bbd886cf-n4gp5" Mar 09 09:24:49 crc kubenswrapper[4792]: I0309 09:24:49.263403 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Mar 09 09:24:49 crc kubenswrapper[4792]: I0309 09:24:49.285890 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86bbd886cf-n4gp5"] Mar 09 09:24:49 crc kubenswrapper[4792]: I0309 09:24:49.317099 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/12b1d30a-41a4-4b18-a997-bc273133d3ab-config\") pod \"dnsmasq-dns-86bbd886cf-n4gp5\" (UID: \"12b1d30a-41a4-4b18-a997-bc273133d3ab\") " pod="openstack/dnsmasq-dns-86bbd886cf-n4gp5" Mar 09 09:24:49 crc kubenswrapper[4792]: I0309 09:24:49.317422 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a3964aa2-1832-4f88-973b-7cad0595f31f-config\") pod \"dnsmasq-dns-589db6c89c-2522t\" (UID: \"a3964aa2-1832-4f88-973b-7cad0595f31f\") " pod="openstack/dnsmasq-dns-589db6c89c-2522t" Mar 09 09:24:49 crc kubenswrapper[4792]: I0309 09:24:49.317616 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jfxrh\" (UniqueName: \"kubernetes.io/projected/a3964aa2-1832-4f88-973b-7cad0595f31f-kube-api-access-jfxrh\") pod \"dnsmasq-dns-589db6c89c-2522t\" (UID: \"a3964aa2-1832-4f88-973b-7cad0595f31f\") " pod="openstack/dnsmasq-dns-589db6c89c-2522t" Mar 09 09:24:49 crc kubenswrapper[4792]: I0309 09:24:49.317709 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/12b1d30a-41a4-4b18-a997-bc273133d3ab-dns-svc\") pod \"dnsmasq-dns-86bbd886cf-n4gp5\" (UID: \"12b1d30a-41a4-4b18-a997-bc273133d3ab\") " pod="openstack/dnsmasq-dns-86bbd886cf-n4gp5" Mar 09 
09:24:49 crc kubenswrapper[4792]: I0309 09:24:49.317837 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c9k8r\" (UniqueName: \"kubernetes.io/projected/12b1d30a-41a4-4b18-a997-bc273133d3ab-kube-api-access-c9k8r\") pod \"dnsmasq-dns-86bbd886cf-n4gp5\" (UID: \"12b1d30a-41a4-4b18-a997-bc273133d3ab\") " pod="openstack/dnsmasq-dns-86bbd886cf-n4gp5" Mar 09 09:24:49 crc kubenswrapper[4792]: I0309 09:24:49.419106 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a3964aa2-1832-4f88-973b-7cad0595f31f-config\") pod \"dnsmasq-dns-589db6c89c-2522t\" (UID: \"a3964aa2-1832-4f88-973b-7cad0595f31f\") " pod="openstack/dnsmasq-dns-589db6c89c-2522t" Mar 09 09:24:49 crc kubenswrapper[4792]: I0309 09:24:49.419186 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jfxrh\" (UniqueName: \"kubernetes.io/projected/a3964aa2-1832-4f88-973b-7cad0595f31f-kube-api-access-jfxrh\") pod \"dnsmasq-dns-589db6c89c-2522t\" (UID: \"a3964aa2-1832-4f88-973b-7cad0595f31f\") " pod="openstack/dnsmasq-dns-589db6c89c-2522t" Mar 09 09:24:49 crc kubenswrapper[4792]: I0309 09:24:49.419215 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/12b1d30a-41a4-4b18-a997-bc273133d3ab-dns-svc\") pod \"dnsmasq-dns-86bbd886cf-n4gp5\" (UID: \"12b1d30a-41a4-4b18-a997-bc273133d3ab\") " pod="openstack/dnsmasq-dns-86bbd886cf-n4gp5" Mar 09 09:24:49 crc kubenswrapper[4792]: I0309 09:24:49.419234 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c9k8r\" (UniqueName: \"kubernetes.io/projected/12b1d30a-41a4-4b18-a997-bc273133d3ab-kube-api-access-c9k8r\") pod \"dnsmasq-dns-86bbd886cf-n4gp5\" (UID: \"12b1d30a-41a4-4b18-a997-bc273133d3ab\") " pod="openstack/dnsmasq-dns-86bbd886cf-n4gp5" Mar 09 09:24:49 crc 
kubenswrapper[4792]: I0309 09:24:49.419258 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/12b1d30a-41a4-4b18-a997-bc273133d3ab-config\") pod \"dnsmasq-dns-86bbd886cf-n4gp5\" (UID: \"12b1d30a-41a4-4b18-a997-bc273133d3ab\") " pod="openstack/dnsmasq-dns-86bbd886cf-n4gp5" Mar 09 09:24:49 crc kubenswrapper[4792]: I0309 09:24:49.420270 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/12b1d30a-41a4-4b18-a997-bc273133d3ab-config\") pod \"dnsmasq-dns-86bbd886cf-n4gp5\" (UID: \"12b1d30a-41a4-4b18-a997-bc273133d3ab\") " pod="openstack/dnsmasq-dns-86bbd886cf-n4gp5" Mar 09 09:24:49 crc kubenswrapper[4792]: I0309 09:24:49.420656 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/12b1d30a-41a4-4b18-a997-bc273133d3ab-dns-svc\") pod \"dnsmasq-dns-86bbd886cf-n4gp5\" (UID: \"12b1d30a-41a4-4b18-a997-bc273133d3ab\") " pod="openstack/dnsmasq-dns-86bbd886cf-n4gp5" Mar 09 09:24:49 crc kubenswrapper[4792]: I0309 09:24:49.422517 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a3964aa2-1832-4f88-973b-7cad0595f31f-config\") pod \"dnsmasq-dns-589db6c89c-2522t\" (UID: \"a3964aa2-1832-4f88-973b-7cad0595f31f\") " pod="openstack/dnsmasq-dns-589db6c89c-2522t" Mar 09 09:24:49 crc kubenswrapper[4792]: I0309 09:24:49.443912 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jfxrh\" (UniqueName: \"kubernetes.io/projected/a3964aa2-1832-4f88-973b-7cad0595f31f-kube-api-access-jfxrh\") pod \"dnsmasq-dns-589db6c89c-2522t\" (UID: \"a3964aa2-1832-4f88-973b-7cad0595f31f\") " pod="openstack/dnsmasq-dns-589db6c89c-2522t" Mar 09 09:24:49 crc kubenswrapper[4792]: I0309 09:24:49.443922 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-c9k8r\" (UniqueName: \"kubernetes.io/projected/12b1d30a-41a4-4b18-a997-bc273133d3ab-kube-api-access-c9k8r\") pod \"dnsmasq-dns-86bbd886cf-n4gp5\" (UID: \"12b1d30a-41a4-4b18-a997-bc273133d3ab\") " pod="openstack/dnsmasq-dns-86bbd886cf-n4gp5" Mar 09 09:24:49 crc kubenswrapper[4792]: I0309 09:24:49.515018 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-589db6c89c-2522t" Mar 09 09:24:49 crc kubenswrapper[4792]: I0309 09:24:49.577420 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86bbd886cf-n4gp5" Mar 09 09:24:50 crc kubenswrapper[4792]: I0309 09:24:50.004769 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86bbd886cf-n4gp5"] Mar 09 09:24:50 crc kubenswrapper[4792]: I0309 09:24:50.117399 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-589db6c89c-2522t"] Mar 09 09:24:50 crc kubenswrapper[4792]: W0309 09:24:50.126669 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda3964aa2_1832_4f88_973b_7cad0595f31f.slice/crio-cca6af35a2199ca947e080a5436e0422d80bcc8673125630b216ab7628ad2bba WatchSource:0}: Error finding container cca6af35a2199ca947e080a5436e0422d80bcc8673125630b216ab7628ad2bba: Status 404 returned error can't find the container with id cca6af35a2199ca947e080a5436e0422d80bcc8673125630b216ab7628ad2bba Mar 09 09:24:50 crc kubenswrapper[4792]: I0309 09:24:50.538510 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86bbd886cf-n4gp5" event={"ID":"12b1d30a-41a4-4b18-a997-bc273133d3ab","Type":"ContainerStarted","Data":"bd4359596d9822f360fd4d49e03f93d4eacbae5e00975f2dd8ba468aecb1a9ad"} Mar 09 09:24:50 crc kubenswrapper[4792]: I0309 09:24:50.540729 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-589db6c89c-2522t" 
event={"ID":"a3964aa2-1832-4f88-973b-7cad0595f31f","Type":"ContainerStarted","Data":"cca6af35a2199ca947e080a5436e0422d80bcc8673125630b216ab7628ad2bba"} Mar 09 09:24:52 crc kubenswrapper[4792]: I0309 09:24:52.249831 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-589db6c89c-2522t"] Mar 09 09:24:52 crc kubenswrapper[4792]: I0309 09:24:52.288560 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78cb4465c9-wbtzh"] Mar 09 09:24:52 crc kubenswrapper[4792]: I0309 09:24:52.289829 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78cb4465c9-wbtzh" Mar 09 09:24:52 crc kubenswrapper[4792]: I0309 09:24:52.311723 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78cb4465c9-wbtzh"] Mar 09 09:24:52 crc kubenswrapper[4792]: I0309 09:24:52.373523 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b750127e-0708-4867-8cba-b31296edae00-config\") pod \"dnsmasq-dns-78cb4465c9-wbtzh\" (UID: \"b750127e-0708-4867-8cba-b31296edae00\") " pod="openstack/dnsmasq-dns-78cb4465c9-wbtzh" Mar 09 09:24:52 crc kubenswrapper[4792]: I0309 09:24:52.373836 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vlc8t\" (UniqueName: \"kubernetes.io/projected/b750127e-0708-4867-8cba-b31296edae00-kube-api-access-vlc8t\") pod \"dnsmasq-dns-78cb4465c9-wbtzh\" (UID: \"b750127e-0708-4867-8cba-b31296edae00\") " pod="openstack/dnsmasq-dns-78cb4465c9-wbtzh" Mar 09 09:24:52 crc kubenswrapper[4792]: I0309 09:24:52.373875 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b750127e-0708-4867-8cba-b31296edae00-dns-svc\") pod \"dnsmasq-dns-78cb4465c9-wbtzh\" (UID: \"b750127e-0708-4867-8cba-b31296edae00\") " 
pod="openstack/dnsmasq-dns-78cb4465c9-wbtzh" Mar 09 09:24:52 crc kubenswrapper[4792]: I0309 09:24:52.476865 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vlc8t\" (UniqueName: \"kubernetes.io/projected/b750127e-0708-4867-8cba-b31296edae00-kube-api-access-vlc8t\") pod \"dnsmasq-dns-78cb4465c9-wbtzh\" (UID: \"b750127e-0708-4867-8cba-b31296edae00\") " pod="openstack/dnsmasq-dns-78cb4465c9-wbtzh" Mar 09 09:24:52 crc kubenswrapper[4792]: I0309 09:24:52.476947 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b750127e-0708-4867-8cba-b31296edae00-dns-svc\") pod \"dnsmasq-dns-78cb4465c9-wbtzh\" (UID: \"b750127e-0708-4867-8cba-b31296edae00\") " pod="openstack/dnsmasq-dns-78cb4465c9-wbtzh" Mar 09 09:24:52 crc kubenswrapper[4792]: I0309 09:24:52.477034 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b750127e-0708-4867-8cba-b31296edae00-config\") pod \"dnsmasq-dns-78cb4465c9-wbtzh\" (UID: \"b750127e-0708-4867-8cba-b31296edae00\") " pod="openstack/dnsmasq-dns-78cb4465c9-wbtzh" Mar 09 09:24:52 crc kubenswrapper[4792]: I0309 09:24:52.478043 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b750127e-0708-4867-8cba-b31296edae00-config\") pod \"dnsmasq-dns-78cb4465c9-wbtzh\" (UID: \"b750127e-0708-4867-8cba-b31296edae00\") " pod="openstack/dnsmasq-dns-78cb4465c9-wbtzh" Mar 09 09:24:52 crc kubenswrapper[4792]: I0309 09:24:52.478640 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b750127e-0708-4867-8cba-b31296edae00-dns-svc\") pod \"dnsmasq-dns-78cb4465c9-wbtzh\" (UID: \"b750127e-0708-4867-8cba-b31296edae00\") " pod="openstack/dnsmasq-dns-78cb4465c9-wbtzh" Mar 09 09:24:52 crc kubenswrapper[4792]: I0309 09:24:52.504320 4792 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vlc8t\" (UniqueName: \"kubernetes.io/projected/b750127e-0708-4867-8cba-b31296edae00-kube-api-access-vlc8t\") pod \"dnsmasq-dns-78cb4465c9-wbtzh\" (UID: \"b750127e-0708-4867-8cba-b31296edae00\") " pod="openstack/dnsmasq-dns-78cb4465c9-wbtzh" Mar 09 09:24:52 crc kubenswrapper[4792]: I0309 09:24:52.612413 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78cb4465c9-wbtzh" Mar 09 09:24:52 crc kubenswrapper[4792]: I0309 09:24:52.642447 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86bbd886cf-n4gp5"] Mar 09 09:24:52 crc kubenswrapper[4792]: I0309 09:24:52.688829 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7c47bcb9f9-vvmxj"] Mar 09 09:24:52 crc kubenswrapper[4792]: I0309 09:24:52.689921 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7c47bcb9f9-vvmxj" Mar 09 09:24:52 crc kubenswrapper[4792]: I0309 09:24:52.756299 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7c47bcb9f9-vvmxj"] Mar 09 09:24:52 crc kubenswrapper[4792]: I0309 09:24:52.788112 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0acc36a0-f564-4683-a8d5-d76348b1cf3f-dns-svc\") pod \"dnsmasq-dns-7c47bcb9f9-vvmxj\" (UID: \"0acc36a0-f564-4683-a8d5-d76348b1cf3f\") " pod="openstack/dnsmasq-dns-7c47bcb9f9-vvmxj" Mar 09 09:24:52 crc kubenswrapper[4792]: I0309 09:24:52.788163 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0acc36a0-f564-4683-a8d5-d76348b1cf3f-config\") pod \"dnsmasq-dns-7c47bcb9f9-vvmxj\" (UID: \"0acc36a0-f564-4683-a8d5-d76348b1cf3f\") " pod="openstack/dnsmasq-dns-7c47bcb9f9-vvmxj" Mar 09 09:24:52 crc 
kubenswrapper[4792]: I0309 09:24:52.788204 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wszxb\" (UniqueName: \"kubernetes.io/projected/0acc36a0-f564-4683-a8d5-d76348b1cf3f-kube-api-access-wszxb\") pod \"dnsmasq-dns-7c47bcb9f9-vvmxj\" (UID: \"0acc36a0-f564-4683-a8d5-d76348b1cf3f\") " pod="openstack/dnsmasq-dns-7c47bcb9f9-vvmxj" Mar 09 09:24:52 crc kubenswrapper[4792]: I0309 09:24:52.889319 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wszxb\" (UniqueName: \"kubernetes.io/projected/0acc36a0-f564-4683-a8d5-d76348b1cf3f-kube-api-access-wszxb\") pod \"dnsmasq-dns-7c47bcb9f9-vvmxj\" (UID: \"0acc36a0-f564-4683-a8d5-d76348b1cf3f\") " pod="openstack/dnsmasq-dns-7c47bcb9f9-vvmxj" Mar 09 09:24:52 crc kubenswrapper[4792]: I0309 09:24:52.889421 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0acc36a0-f564-4683-a8d5-d76348b1cf3f-dns-svc\") pod \"dnsmasq-dns-7c47bcb9f9-vvmxj\" (UID: \"0acc36a0-f564-4683-a8d5-d76348b1cf3f\") " pod="openstack/dnsmasq-dns-7c47bcb9f9-vvmxj" Mar 09 09:24:52 crc kubenswrapper[4792]: I0309 09:24:52.889449 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0acc36a0-f564-4683-a8d5-d76348b1cf3f-config\") pod \"dnsmasq-dns-7c47bcb9f9-vvmxj\" (UID: \"0acc36a0-f564-4683-a8d5-d76348b1cf3f\") " pod="openstack/dnsmasq-dns-7c47bcb9f9-vvmxj" Mar 09 09:24:52 crc kubenswrapper[4792]: I0309 09:24:52.890206 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0acc36a0-f564-4683-a8d5-d76348b1cf3f-config\") pod \"dnsmasq-dns-7c47bcb9f9-vvmxj\" (UID: \"0acc36a0-f564-4683-a8d5-d76348b1cf3f\") " pod="openstack/dnsmasq-dns-7c47bcb9f9-vvmxj" Mar 09 09:24:52 crc kubenswrapper[4792]: I0309 09:24:52.890947 4792 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0acc36a0-f564-4683-a8d5-d76348b1cf3f-dns-svc\") pod \"dnsmasq-dns-7c47bcb9f9-vvmxj\" (UID: \"0acc36a0-f564-4683-a8d5-d76348b1cf3f\") " pod="openstack/dnsmasq-dns-7c47bcb9f9-vvmxj" Mar 09 09:24:52 crc kubenswrapper[4792]: I0309 09:24:52.933124 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wszxb\" (UniqueName: \"kubernetes.io/projected/0acc36a0-f564-4683-a8d5-d76348b1cf3f-kube-api-access-wszxb\") pod \"dnsmasq-dns-7c47bcb9f9-vvmxj\" (UID: \"0acc36a0-f564-4683-a8d5-d76348b1cf3f\") " pod="openstack/dnsmasq-dns-7c47bcb9f9-vvmxj" Mar 09 09:24:53 crc kubenswrapper[4792]: I0309 09:24:53.088039 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7c47bcb9f9-vvmxj" Mar 09 09:24:53 crc kubenswrapper[4792]: I0309 09:24:53.411543 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78cb4465c9-wbtzh"] Mar 09 09:24:53 crc kubenswrapper[4792]: I0309 09:24:53.453373 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 09 09:24:53 crc kubenswrapper[4792]: I0309 09:24:53.454686 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 09 09:24:53 crc kubenswrapper[4792]: I0309 09:24:53.459522 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Mar 09 09:24:53 crc kubenswrapper[4792]: I0309 09:24:53.461155 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Mar 09 09:24:53 crc kubenswrapper[4792]: I0309 09:24:53.461521 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Mar 09 09:24:53 crc kubenswrapper[4792]: I0309 09:24:53.461783 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Mar 09 09:24:53 crc kubenswrapper[4792]: I0309 09:24:53.462047 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-94bgh" Mar 09 09:24:53 crc kubenswrapper[4792]: I0309 09:24:53.462276 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Mar 09 09:24:53 crc kubenswrapper[4792]: I0309 09:24:53.476205 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Mar 09 09:24:53 crc kubenswrapper[4792]: I0309 09:24:53.476554 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 09 09:24:53 crc kubenswrapper[4792]: I0309 09:24:53.625941 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/0ee86e97-a22c-4089-9ce4-363cb0571173-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"0ee86e97-a22c-4089-9ce4-363cb0571173\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 09:24:53 crc kubenswrapper[4792]: I0309 09:24:53.626022 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/0ee86e97-a22c-4089-9ce4-363cb0571173-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"0ee86e97-a22c-4089-9ce4-363cb0571173\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 09:24:53 crc kubenswrapper[4792]: I0309 09:24:53.626048 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8nzbf\" (UniqueName: \"kubernetes.io/projected/0ee86e97-a22c-4089-9ce4-363cb0571173-kube-api-access-8nzbf\") pod \"rabbitmq-cell1-server-0\" (UID: \"0ee86e97-a22c-4089-9ce4-363cb0571173\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 09:24:53 crc kubenswrapper[4792]: I0309 09:24:53.626114 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0ee86e97-a22c-4089-9ce4-363cb0571173-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"0ee86e97-a22c-4089-9ce4-363cb0571173\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 09:24:53 crc kubenswrapper[4792]: I0309 09:24:53.626177 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/0ee86e97-a22c-4089-9ce4-363cb0571173-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"0ee86e97-a22c-4089-9ce4-363cb0571173\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 09:24:53 crc kubenswrapper[4792]: I0309 09:24:53.626281 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/0ee86e97-a22c-4089-9ce4-363cb0571173-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"0ee86e97-a22c-4089-9ce4-363cb0571173\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 09:24:53 crc kubenswrapper[4792]: I0309 09:24:53.626336 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/0ee86e97-a22c-4089-9ce4-363cb0571173-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"0ee86e97-a22c-4089-9ce4-363cb0571173\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 09:24:53 crc kubenswrapper[4792]: I0309 09:24:53.626431 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/0ee86e97-a22c-4089-9ce4-363cb0571173-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"0ee86e97-a22c-4089-9ce4-363cb0571173\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 09:24:53 crc kubenswrapper[4792]: I0309 09:24:53.626660 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"0ee86e97-a22c-4089-9ce4-363cb0571173\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 09:24:53 crc kubenswrapper[4792]: I0309 09:24:53.626688 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/0ee86e97-a22c-4089-9ce4-363cb0571173-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"0ee86e97-a22c-4089-9ce4-363cb0571173\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 09:24:53 crc kubenswrapper[4792]: I0309 09:24:53.626709 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/0ee86e97-a22c-4089-9ce4-363cb0571173-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"0ee86e97-a22c-4089-9ce4-363cb0571173\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 09:24:53 crc kubenswrapper[4792]: I0309 09:24:53.728435 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: 
\"kubernetes.io/projected/0ee86e97-a22c-4089-9ce4-363cb0571173-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"0ee86e97-a22c-4089-9ce4-363cb0571173\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 09:24:53 crc kubenswrapper[4792]: I0309 09:24:53.728508 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/0ee86e97-a22c-4089-9ce4-363cb0571173-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"0ee86e97-a22c-4089-9ce4-363cb0571173\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 09:24:53 crc kubenswrapper[4792]: I0309 09:24:53.728536 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8nzbf\" (UniqueName: \"kubernetes.io/projected/0ee86e97-a22c-4089-9ce4-363cb0571173-kube-api-access-8nzbf\") pod \"rabbitmq-cell1-server-0\" (UID: \"0ee86e97-a22c-4089-9ce4-363cb0571173\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 09:24:53 crc kubenswrapper[4792]: I0309 09:24:53.728565 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0ee86e97-a22c-4089-9ce4-363cb0571173-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"0ee86e97-a22c-4089-9ce4-363cb0571173\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 09:24:53 crc kubenswrapper[4792]: I0309 09:24:53.728626 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/0ee86e97-a22c-4089-9ce4-363cb0571173-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"0ee86e97-a22c-4089-9ce4-363cb0571173\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 09:24:53 crc kubenswrapper[4792]: I0309 09:24:53.728786 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/0ee86e97-a22c-4089-9ce4-363cb0571173-plugins-conf\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"0ee86e97-a22c-4089-9ce4-363cb0571173\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 09:24:53 crc kubenswrapper[4792]: I0309 09:24:53.728815 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/0ee86e97-a22c-4089-9ce4-363cb0571173-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"0ee86e97-a22c-4089-9ce4-363cb0571173\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 09:24:53 crc kubenswrapper[4792]: I0309 09:24:53.729819 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0ee86e97-a22c-4089-9ce4-363cb0571173-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"0ee86e97-a22c-4089-9ce4-363cb0571173\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 09:24:53 crc kubenswrapper[4792]: I0309 09:24:53.730237 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/0ee86e97-a22c-4089-9ce4-363cb0571173-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"0ee86e97-a22c-4089-9ce4-363cb0571173\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 09:24:53 crc kubenswrapper[4792]: I0309 09:24:53.731176 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/0ee86e97-a22c-4089-9ce4-363cb0571173-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"0ee86e97-a22c-4089-9ce4-363cb0571173\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 09:24:53 crc kubenswrapper[4792]: I0309 09:24:53.731381 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/0ee86e97-a22c-4089-9ce4-363cb0571173-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"0ee86e97-a22c-4089-9ce4-363cb0571173\") " pod="openstack/rabbitmq-cell1-server-0" 
Mar 09 09:24:53 crc kubenswrapper[4792]: I0309 09:24:53.731558 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/0ee86e97-a22c-4089-9ce4-363cb0571173-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"0ee86e97-a22c-4089-9ce4-363cb0571173\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 09:24:53 crc kubenswrapper[4792]: I0309 09:24:53.732055 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"0ee86e97-a22c-4089-9ce4-363cb0571173\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 09:24:53 crc kubenswrapper[4792]: I0309 09:24:53.732162 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/0ee86e97-a22c-4089-9ce4-363cb0571173-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"0ee86e97-a22c-4089-9ce4-363cb0571173\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 09:24:53 crc kubenswrapper[4792]: I0309 09:24:53.732289 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/0ee86e97-a22c-4089-9ce4-363cb0571173-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"0ee86e97-a22c-4089-9ce4-363cb0571173\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 09:24:53 crc kubenswrapper[4792]: I0309 09:24:53.732909 4792 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"0ee86e97-a22c-4089-9ce4-363cb0571173\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/rabbitmq-cell1-server-0" Mar 09 09:24:53 crc kubenswrapper[4792]: I0309 09:24:53.735023 4792 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/0ee86e97-a22c-4089-9ce4-363cb0571173-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"0ee86e97-a22c-4089-9ce4-363cb0571173\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 09:24:53 crc kubenswrapper[4792]: I0309 09:24:53.735447 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/0ee86e97-a22c-4089-9ce4-363cb0571173-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"0ee86e97-a22c-4089-9ce4-363cb0571173\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 09:24:53 crc kubenswrapper[4792]: I0309 09:24:53.736430 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/0ee86e97-a22c-4089-9ce4-363cb0571173-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"0ee86e97-a22c-4089-9ce4-363cb0571173\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 09:24:53 crc kubenswrapper[4792]: I0309 09:24:53.750867 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/0ee86e97-a22c-4089-9ce4-363cb0571173-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"0ee86e97-a22c-4089-9ce4-363cb0571173\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 09:24:53 crc kubenswrapper[4792]: I0309 09:24:53.751702 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/0ee86e97-a22c-4089-9ce4-363cb0571173-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"0ee86e97-a22c-4089-9ce4-363cb0571173\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 09:24:53 crc kubenswrapper[4792]: I0309 09:24:53.757942 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8nzbf\" (UniqueName: 
\"kubernetes.io/projected/0ee86e97-a22c-4089-9ce4-363cb0571173-kube-api-access-8nzbf\") pod \"rabbitmq-cell1-server-0\" (UID: \"0ee86e97-a22c-4089-9ce4-363cb0571173\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 09:24:53 crc kubenswrapper[4792]: I0309 09:24:53.778525 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"0ee86e97-a22c-4089-9ce4-363cb0571173\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 09:24:53 crc kubenswrapper[4792]: I0309 09:24:53.826599 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 09 09:24:53 crc kubenswrapper[4792]: I0309 09:24:53.856168 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Mar 09 09:24:53 crc kubenswrapper[4792]: I0309 09:24:53.857295 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 09 09:24:53 crc kubenswrapper[4792]: I0309 09:24:53.860460 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Mar 09 09:24:53 crc kubenswrapper[4792]: I0309 09:24:53.860592 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Mar 09 09:24:53 crc kubenswrapper[4792]: I0309 09:24:53.860707 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Mar 09 09:24:53 crc kubenswrapper[4792]: I0309 09:24:53.860807 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-96c2v" Mar 09 09:24:53 crc kubenswrapper[4792]: I0309 09:24:53.860940 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Mar 09 09:24:53 crc kubenswrapper[4792]: I0309 09:24:53.861053 4792 reflector.go:368] Caches 
populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Mar 09 09:24:53 crc kubenswrapper[4792]: I0309 09:24:53.861591 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Mar 09 09:24:53 crc kubenswrapper[4792]: I0309 09:24:53.879171 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 09 09:24:53 crc kubenswrapper[4792]: W0309 09:24:53.923570 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb750127e_0708_4867_8cba_b31296edae00.slice/crio-dc5d3a28135d82dfe0b6378840df5f0c938f3ea74925957163ae9ddada17a129 WatchSource:0}: Error finding container dc5d3a28135d82dfe0b6378840df5f0c938f3ea74925957163ae9ddada17a129: Status 404 returned error can't find the container with id dc5d3a28135d82dfe0b6378840df5f0c938f3ea74925957163ae9ddada17a129 Mar 09 09:24:53 crc kubenswrapper[4792]: I0309 09:24:53.937689 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/42b40fb0-d2c9-4bc2-a13f-4c099b244ced-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"42b40fb0-d2c9-4bc2-a13f-4c099b244ced\") " pod="openstack/rabbitmq-server-0" Mar 09 09:24:53 crc kubenswrapper[4792]: I0309 09:24:53.937737 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/42b40fb0-d2c9-4bc2-a13f-4c099b244ced-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"42b40fb0-d2c9-4bc2-a13f-4c099b244ced\") " pod="openstack/rabbitmq-server-0" Mar 09 09:24:53 crc kubenswrapper[4792]: I0309 09:24:53.937786 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/42b40fb0-d2c9-4bc2-a13f-4c099b244ced-erlang-cookie-secret\") pod 
\"rabbitmq-server-0\" (UID: \"42b40fb0-d2c9-4bc2-a13f-4c099b244ced\") " pod="openstack/rabbitmq-server-0" Mar 09 09:24:53 crc kubenswrapper[4792]: I0309 09:24:53.937827 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fm6kj\" (UniqueName: \"kubernetes.io/projected/42b40fb0-d2c9-4bc2-a13f-4c099b244ced-kube-api-access-fm6kj\") pod \"rabbitmq-server-0\" (UID: \"42b40fb0-d2c9-4bc2-a13f-4c099b244ced\") " pod="openstack/rabbitmq-server-0" Mar 09 09:24:53 crc kubenswrapper[4792]: I0309 09:24:53.937854 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/42b40fb0-d2c9-4bc2-a13f-4c099b244ced-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"42b40fb0-d2c9-4bc2-a13f-4c099b244ced\") " pod="openstack/rabbitmq-server-0" Mar 09 09:24:53 crc kubenswrapper[4792]: I0309 09:24:53.937873 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-server-0\" (UID: \"42b40fb0-d2c9-4bc2-a13f-4c099b244ced\") " pod="openstack/rabbitmq-server-0" Mar 09 09:24:53 crc kubenswrapper[4792]: I0309 09:24:53.937905 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/42b40fb0-d2c9-4bc2-a13f-4c099b244ced-pod-info\") pod \"rabbitmq-server-0\" (UID: \"42b40fb0-d2c9-4bc2-a13f-4c099b244ced\") " pod="openstack/rabbitmq-server-0" Mar 09 09:24:53 crc kubenswrapper[4792]: I0309 09:24:53.937954 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/42b40fb0-d2c9-4bc2-a13f-4c099b244ced-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"42b40fb0-d2c9-4bc2-a13f-4c099b244ced\") " 
pod="openstack/rabbitmq-server-0" Mar 09 09:24:53 crc kubenswrapper[4792]: I0309 09:24:53.937981 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/42b40fb0-d2c9-4bc2-a13f-4c099b244ced-config-data\") pod \"rabbitmq-server-0\" (UID: \"42b40fb0-d2c9-4bc2-a13f-4c099b244ced\") " pod="openstack/rabbitmq-server-0" Mar 09 09:24:53 crc kubenswrapper[4792]: I0309 09:24:53.937995 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/42b40fb0-d2c9-4bc2-a13f-4c099b244ced-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"42b40fb0-d2c9-4bc2-a13f-4c099b244ced\") " pod="openstack/rabbitmq-server-0" Mar 09 09:24:53 crc kubenswrapper[4792]: I0309 09:24:53.938015 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/42b40fb0-d2c9-4bc2-a13f-4c099b244ced-server-conf\") pod \"rabbitmq-server-0\" (UID: \"42b40fb0-d2c9-4bc2-a13f-4c099b244ced\") " pod="openstack/rabbitmq-server-0" Mar 09 09:24:54 crc kubenswrapper[4792]: I0309 09:24:54.040988 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fm6kj\" (UniqueName: \"kubernetes.io/projected/42b40fb0-d2c9-4bc2-a13f-4c099b244ced-kube-api-access-fm6kj\") pod \"rabbitmq-server-0\" (UID: \"42b40fb0-d2c9-4bc2-a13f-4c099b244ced\") " pod="openstack/rabbitmq-server-0" Mar 09 09:24:54 crc kubenswrapper[4792]: I0309 09:24:54.041047 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/42b40fb0-d2c9-4bc2-a13f-4c099b244ced-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"42b40fb0-d2c9-4bc2-a13f-4c099b244ced\") " pod="openstack/rabbitmq-server-0" Mar 09 09:24:54 crc kubenswrapper[4792]: I0309 09:24:54.041095 4792 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-server-0\" (UID: \"42b40fb0-d2c9-4bc2-a13f-4c099b244ced\") " pod="openstack/rabbitmq-server-0" Mar 09 09:24:54 crc kubenswrapper[4792]: I0309 09:24:54.041110 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/42b40fb0-d2c9-4bc2-a13f-4c099b244ced-pod-info\") pod \"rabbitmq-server-0\" (UID: \"42b40fb0-d2c9-4bc2-a13f-4c099b244ced\") " pod="openstack/rabbitmq-server-0" Mar 09 09:24:54 crc kubenswrapper[4792]: I0309 09:24:54.041156 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/42b40fb0-d2c9-4bc2-a13f-4c099b244ced-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"42b40fb0-d2c9-4bc2-a13f-4c099b244ced\") " pod="openstack/rabbitmq-server-0" Mar 09 09:24:54 crc kubenswrapper[4792]: I0309 09:24:54.041187 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/42b40fb0-d2c9-4bc2-a13f-4c099b244ced-config-data\") pod \"rabbitmq-server-0\" (UID: \"42b40fb0-d2c9-4bc2-a13f-4c099b244ced\") " pod="openstack/rabbitmq-server-0" Mar 09 09:24:54 crc kubenswrapper[4792]: I0309 09:24:54.041223 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/42b40fb0-d2c9-4bc2-a13f-4c099b244ced-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"42b40fb0-d2c9-4bc2-a13f-4c099b244ced\") " pod="openstack/rabbitmq-server-0" Mar 09 09:24:54 crc kubenswrapper[4792]: I0309 09:24:54.041249 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/42b40fb0-d2c9-4bc2-a13f-4c099b244ced-server-conf\") 
pod \"rabbitmq-server-0\" (UID: \"42b40fb0-d2c9-4bc2-a13f-4c099b244ced\") " pod="openstack/rabbitmq-server-0" Mar 09 09:24:54 crc kubenswrapper[4792]: I0309 09:24:54.041282 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/42b40fb0-d2c9-4bc2-a13f-4c099b244ced-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"42b40fb0-d2c9-4bc2-a13f-4c099b244ced\") " pod="openstack/rabbitmq-server-0" Mar 09 09:24:54 crc kubenswrapper[4792]: I0309 09:24:54.041309 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/42b40fb0-d2c9-4bc2-a13f-4c099b244ced-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"42b40fb0-d2c9-4bc2-a13f-4c099b244ced\") " pod="openstack/rabbitmq-server-0" Mar 09 09:24:54 crc kubenswrapper[4792]: I0309 09:24:54.041360 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/42b40fb0-d2c9-4bc2-a13f-4c099b244ced-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"42b40fb0-d2c9-4bc2-a13f-4c099b244ced\") " pod="openstack/rabbitmq-server-0" Mar 09 09:24:54 crc kubenswrapper[4792]: I0309 09:24:54.045183 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/42b40fb0-d2c9-4bc2-a13f-4c099b244ced-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"42b40fb0-d2c9-4bc2-a13f-4c099b244ced\") " pod="openstack/rabbitmq-server-0" Mar 09 09:24:54 crc kubenswrapper[4792]: I0309 09:24:54.047187 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/42b40fb0-d2c9-4bc2-a13f-4c099b244ced-config-data\") pod \"rabbitmq-server-0\" (UID: \"42b40fb0-d2c9-4bc2-a13f-4c099b244ced\") " pod="openstack/rabbitmq-server-0" Mar 09 09:24:54 crc kubenswrapper[4792]: I0309 09:24:54.051029 4792 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/42b40fb0-d2c9-4bc2-a13f-4c099b244ced-pod-info\") pod \"rabbitmq-server-0\" (UID: \"42b40fb0-d2c9-4bc2-a13f-4c099b244ced\") " pod="openstack/rabbitmq-server-0" Mar 09 09:24:54 crc kubenswrapper[4792]: I0309 09:24:54.051584 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/42b40fb0-d2c9-4bc2-a13f-4c099b244ced-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"42b40fb0-d2c9-4bc2-a13f-4c099b244ced\") " pod="openstack/rabbitmq-server-0" Mar 09 09:24:54 crc kubenswrapper[4792]: I0309 09:24:54.053693 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/42b40fb0-d2c9-4bc2-a13f-4c099b244ced-server-conf\") pod \"rabbitmq-server-0\" (UID: \"42b40fb0-d2c9-4bc2-a13f-4c099b244ced\") " pod="openstack/rabbitmq-server-0" Mar 09 09:24:54 crc kubenswrapper[4792]: I0309 09:24:54.055858 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/42b40fb0-d2c9-4bc2-a13f-4c099b244ced-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"42b40fb0-d2c9-4bc2-a13f-4c099b244ced\") " pod="openstack/rabbitmq-server-0" Mar 09 09:24:54 crc kubenswrapper[4792]: I0309 09:24:54.056512 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/42b40fb0-d2c9-4bc2-a13f-4c099b244ced-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"42b40fb0-d2c9-4bc2-a13f-4c099b244ced\") " pod="openstack/rabbitmq-server-0" Mar 09 09:24:54 crc kubenswrapper[4792]: I0309 09:24:54.058601 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/42b40fb0-d2c9-4bc2-a13f-4c099b244ced-rabbitmq-tls\") pod 
\"rabbitmq-server-0\" (UID: \"42b40fb0-d2c9-4bc2-a13f-4c099b244ced\") " pod="openstack/rabbitmq-server-0" Mar 09 09:24:54 crc kubenswrapper[4792]: I0309 09:24:54.061987 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/42b40fb0-d2c9-4bc2-a13f-4c099b244ced-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"42b40fb0-d2c9-4bc2-a13f-4c099b244ced\") " pod="openstack/rabbitmq-server-0" Mar 09 09:24:54 crc kubenswrapper[4792]: I0309 09:24:54.062475 4792 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-server-0\" (UID: \"42b40fb0-d2c9-4bc2-a13f-4c099b244ced\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/rabbitmq-server-0" Mar 09 09:24:54 crc kubenswrapper[4792]: I0309 09:24:54.062949 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fm6kj\" (UniqueName: \"kubernetes.io/projected/42b40fb0-d2c9-4bc2-a13f-4c099b244ced-kube-api-access-fm6kj\") pod \"rabbitmq-server-0\" (UID: \"42b40fb0-d2c9-4bc2-a13f-4c099b244ced\") " pod="openstack/rabbitmq-server-0" Mar 09 09:24:54 crc kubenswrapper[4792]: I0309 09:24:54.107551 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-server-0\" (UID: \"42b40fb0-d2c9-4bc2-a13f-4c099b244ced\") " pod="openstack/rabbitmq-server-0" Mar 09 09:24:54 crc kubenswrapper[4792]: I0309 09:24:54.274600 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 09 09:24:54 crc kubenswrapper[4792]: I0309 09:24:54.399389 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 09 09:24:54 crc kubenswrapper[4792]: W0309 09:24:54.445894 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0ee86e97_a22c_4089_9ce4_363cb0571173.slice/crio-447d895ca227edf5d86a3419d838c6872239e8f426e3c2b57ff42f8e6ff1964f WatchSource:0}: Error finding container 447d895ca227edf5d86a3419d838c6872239e8f426e3c2b57ff42f8e6ff1964f: Status 404 returned error can't find the container with id 447d895ca227edf5d86a3419d838c6872239e8f426e3c2b57ff42f8e6ff1964f Mar 09 09:24:54 crc kubenswrapper[4792]: I0309 09:24:54.580088 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"0ee86e97-a22c-4089-9ce4-363cb0571173","Type":"ContainerStarted","Data":"447d895ca227edf5d86a3419d838c6872239e8f426e3c2b57ff42f8e6ff1964f"} Mar 09 09:24:54 crc kubenswrapper[4792]: I0309 09:24:54.588090 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78cb4465c9-wbtzh" event={"ID":"b750127e-0708-4867-8cba-b31296edae00","Type":"ContainerStarted","Data":"dc5d3a28135d82dfe0b6378840df5f0c938f3ea74925957163ae9ddada17a129"} Mar 09 09:24:54 crc kubenswrapper[4792]: I0309 09:24:54.726659 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7c47bcb9f9-vvmxj"] Mar 09 09:24:54 crc kubenswrapper[4792]: I0309 09:24:54.878942 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 09 09:24:55 crc kubenswrapper[4792]: I0309 09:24:55.187192 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Mar 09 09:24:55 crc kubenswrapper[4792]: I0309 09:24:55.188935 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Mar 09 09:24:55 crc kubenswrapper[4792]: I0309 09:24:55.193507 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-7cvc7" Mar 09 09:24:55 crc kubenswrapper[4792]: I0309 09:24:55.193806 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Mar 09 09:24:55 crc kubenswrapper[4792]: I0309 09:24:55.195827 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Mar 09 09:24:55 crc kubenswrapper[4792]: I0309 09:24:55.205834 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Mar 09 09:24:55 crc kubenswrapper[4792]: I0309 09:24:55.206115 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Mar 09 09:24:55 crc kubenswrapper[4792]: I0309 09:24:55.209916 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Mar 09 09:24:55 crc kubenswrapper[4792]: I0309 09:24:55.272584 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-galera-0\" (UID: \"7dd0ce66-42bf-4c00-8e99-3c58defcc87f\") " pod="openstack/openstack-galera-0" Mar 09 09:24:55 crc kubenswrapper[4792]: I0309 09:24:55.272648 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7dd0ce66-42bf-4c00-8e99-3c58defcc87f-operator-scripts\") pod \"openstack-galera-0\" (UID: \"7dd0ce66-42bf-4c00-8e99-3c58defcc87f\") " pod="openstack/openstack-galera-0" Mar 09 09:24:55 crc kubenswrapper[4792]: I0309 09:24:55.272679 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/7dd0ce66-42bf-4c00-8e99-3c58defcc87f-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"7dd0ce66-42bf-4c00-8e99-3c58defcc87f\") " pod="openstack/openstack-galera-0" Mar 09 09:24:55 crc kubenswrapper[4792]: I0309 09:24:55.272727 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/7dd0ce66-42bf-4c00-8e99-3c58defcc87f-config-data-default\") pod \"openstack-galera-0\" (UID: \"7dd0ce66-42bf-4c00-8e99-3c58defcc87f\") " pod="openstack/openstack-galera-0" Mar 09 09:24:55 crc kubenswrapper[4792]: I0309 09:24:55.272759 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/7dd0ce66-42bf-4c00-8e99-3c58defcc87f-config-data-generated\") pod \"openstack-galera-0\" (UID: \"7dd0ce66-42bf-4c00-8e99-3c58defcc87f\") " pod="openstack/openstack-galera-0" Mar 09 09:24:55 crc kubenswrapper[4792]: I0309 09:24:55.272819 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7nlxd\" (UniqueName: \"kubernetes.io/projected/7dd0ce66-42bf-4c00-8e99-3c58defcc87f-kube-api-access-7nlxd\") pod \"openstack-galera-0\" (UID: \"7dd0ce66-42bf-4c00-8e99-3c58defcc87f\") " pod="openstack/openstack-galera-0" Mar 09 09:24:55 crc kubenswrapper[4792]: I0309 09:24:55.272844 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/7dd0ce66-42bf-4c00-8e99-3c58defcc87f-kolla-config\") pod \"openstack-galera-0\" (UID: \"7dd0ce66-42bf-4c00-8e99-3c58defcc87f\") " pod="openstack/openstack-galera-0" Mar 09 09:24:55 crc kubenswrapper[4792]: I0309 09:24:55.272893 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/7dd0ce66-42bf-4c00-8e99-3c58defcc87f-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"7dd0ce66-42bf-4c00-8e99-3c58defcc87f\") " pod="openstack/openstack-galera-0" Mar 09 09:24:55 crc kubenswrapper[4792]: I0309 09:24:55.373300 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/7dd0ce66-42bf-4c00-8e99-3c58defcc87f-kolla-config\") pod \"openstack-galera-0\" (UID: \"7dd0ce66-42bf-4c00-8e99-3c58defcc87f\") " pod="openstack/openstack-galera-0" Mar 09 09:24:55 crc kubenswrapper[4792]: I0309 09:24:55.373696 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7dd0ce66-42bf-4c00-8e99-3c58defcc87f-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"7dd0ce66-42bf-4c00-8e99-3c58defcc87f\") " pod="openstack/openstack-galera-0" Mar 09 09:24:55 crc kubenswrapper[4792]: I0309 09:24:55.373741 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-galera-0\" (UID: \"7dd0ce66-42bf-4c00-8e99-3c58defcc87f\") " pod="openstack/openstack-galera-0" Mar 09 09:24:55 crc kubenswrapper[4792]: I0309 09:24:55.373770 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7dd0ce66-42bf-4c00-8e99-3c58defcc87f-operator-scripts\") pod \"openstack-galera-0\" (UID: \"7dd0ce66-42bf-4c00-8e99-3c58defcc87f\") " pod="openstack/openstack-galera-0" Mar 09 09:24:55 crc kubenswrapper[4792]: I0309 09:24:55.373791 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/7dd0ce66-42bf-4c00-8e99-3c58defcc87f-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"7dd0ce66-42bf-4c00-8e99-3c58defcc87f\") " 
pod="openstack/openstack-galera-0" Mar 09 09:24:55 crc kubenswrapper[4792]: I0309 09:24:55.373849 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/7dd0ce66-42bf-4c00-8e99-3c58defcc87f-config-data-default\") pod \"openstack-galera-0\" (UID: \"7dd0ce66-42bf-4c00-8e99-3c58defcc87f\") " pod="openstack/openstack-galera-0" Mar 09 09:24:55 crc kubenswrapper[4792]: I0309 09:24:55.373875 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/7dd0ce66-42bf-4c00-8e99-3c58defcc87f-config-data-generated\") pod \"openstack-galera-0\" (UID: \"7dd0ce66-42bf-4c00-8e99-3c58defcc87f\") " pod="openstack/openstack-galera-0" Mar 09 09:24:55 crc kubenswrapper[4792]: I0309 09:24:55.374099 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7nlxd\" (UniqueName: \"kubernetes.io/projected/7dd0ce66-42bf-4c00-8e99-3c58defcc87f-kube-api-access-7nlxd\") pod \"openstack-galera-0\" (UID: \"7dd0ce66-42bf-4c00-8e99-3c58defcc87f\") " pod="openstack/openstack-galera-0" Mar 09 09:24:55 crc kubenswrapper[4792]: I0309 09:24:55.374894 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/7dd0ce66-42bf-4c00-8e99-3c58defcc87f-kolla-config\") pod \"openstack-galera-0\" (UID: \"7dd0ce66-42bf-4c00-8e99-3c58defcc87f\") " pod="openstack/openstack-galera-0" Mar 09 09:24:55 crc kubenswrapper[4792]: I0309 09:24:55.375460 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7dd0ce66-42bf-4c00-8e99-3c58defcc87f-operator-scripts\") pod \"openstack-galera-0\" (UID: \"7dd0ce66-42bf-4c00-8e99-3c58defcc87f\") " pod="openstack/openstack-galera-0" Mar 09 09:24:55 crc kubenswrapper[4792]: I0309 09:24:55.375633 4792 operation_generator.go:580] 
"MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-galera-0\" (UID: \"7dd0ce66-42bf-4c00-8e99-3c58defcc87f\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/openstack-galera-0" Mar 09 09:24:55 crc kubenswrapper[4792]: I0309 09:24:55.375704 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/7dd0ce66-42bf-4c00-8e99-3c58defcc87f-config-data-generated\") pod \"openstack-galera-0\" (UID: \"7dd0ce66-42bf-4c00-8e99-3c58defcc87f\") " pod="openstack/openstack-galera-0" Mar 09 09:24:55 crc kubenswrapper[4792]: I0309 09:24:55.392178 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7dd0ce66-42bf-4c00-8e99-3c58defcc87f-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"7dd0ce66-42bf-4c00-8e99-3c58defcc87f\") " pod="openstack/openstack-galera-0" Mar 09 09:24:55 crc kubenswrapper[4792]: I0309 09:24:55.404984 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/7dd0ce66-42bf-4c00-8e99-3c58defcc87f-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"7dd0ce66-42bf-4c00-8e99-3c58defcc87f\") " pod="openstack/openstack-galera-0" Mar 09 09:24:55 crc kubenswrapper[4792]: I0309 09:24:55.405511 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/7dd0ce66-42bf-4c00-8e99-3c58defcc87f-config-data-default\") pod \"openstack-galera-0\" (UID: \"7dd0ce66-42bf-4c00-8e99-3c58defcc87f\") " pod="openstack/openstack-galera-0" Mar 09 09:24:55 crc kubenswrapper[4792]: I0309 09:24:55.410910 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod 
\"openstack-galera-0\" (UID: \"7dd0ce66-42bf-4c00-8e99-3c58defcc87f\") " pod="openstack/openstack-galera-0" Mar 09 09:24:55 crc kubenswrapper[4792]: I0309 09:24:55.417042 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7nlxd\" (UniqueName: \"kubernetes.io/projected/7dd0ce66-42bf-4c00-8e99-3c58defcc87f-kube-api-access-7nlxd\") pod \"openstack-galera-0\" (UID: \"7dd0ce66-42bf-4c00-8e99-3c58defcc87f\") " pod="openstack/openstack-galera-0" Mar 09 09:24:55 crc kubenswrapper[4792]: I0309 09:24:55.524373 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Mar 09 09:24:55 crc kubenswrapper[4792]: I0309 09:24:55.606560 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"42b40fb0-d2c9-4bc2-a13f-4c099b244ced","Type":"ContainerStarted","Data":"591d12d177417a851e27ab947e142a016d5d0839679e66795a8e5a5cc9c7798a"} Mar 09 09:24:55 crc kubenswrapper[4792]: I0309 09:24:55.611903 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c47bcb9f9-vvmxj" event={"ID":"0acc36a0-f564-4683-a8d5-d76348b1cf3f","Type":"ContainerStarted","Data":"e7d85deacd14b93c3ab238292c1cdd06af0a418eabc2cc6bfad0f4f408a75f98"} Mar 09 09:24:55 crc kubenswrapper[4792]: I0309 09:24:55.952728 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 09 09:24:55 crc kubenswrapper[4792]: I0309 09:24:55.954713 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 09 09:24:55 crc kubenswrapper[4792]: I0309 09:24:55.959618 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-k8w6x" Mar 09 09:24:55 crc kubenswrapper[4792]: I0309 09:24:55.959843 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Mar 09 09:24:55 crc kubenswrapper[4792]: I0309 09:24:55.972211 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Mar 09 09:24:55 crc kubenswrapper[4792]: I0309 09:24:55.972291 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Mar 09 09:24:55 crc kubenswrapper[4792]: I0309 09:24:55.978090 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 09 09:24:56 crc kubenswrapper[4792]: I0309 09:24:56.090937 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/1731fe55-4bf2-4410-85f9-58124ed652c9-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"1731fe55-4bf2-4410-85f9-58124ed652c9\") " pod="openstack/openstack-cell1-galera-0" Mar 09 09:24:56 crc kubenswrapper[4792]: I0309 09:24:56.091002 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1731fe55-4bf2-4410-85f9-58124ed652c9-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"1731fe55-4bf2-4410-85f9-58124ed652c9\") " pod="openstack/openstack-cell1-galera-0" Mar 09 09:24:56 crc kubenswrapper[4792]: I0309 09:24:56.091034 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") 
pod \"openstack-cell1-galera-0\" (UID: \"1731fe55-4bf2-4410-85f9-58124ed652c9\") " pod="openstack/openstack-cell1-galera-0" Mar 09 09:24:56 crc kubenswrapper[4792]: I0309 09:24:56.091093 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1731fe55-4bf2-4410-85f9-58124ed652c9-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"1731fe55-4bf2-4410-85f9-58124ed652c9\") " pod="openstack/openstack-cell1-galera-0" Mar 09 09:24:56 crc kubenswrapper[4792]: I0309 09:24:56.091126 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/1731fe55-4bf2-4410-85f9-58124ed652c9-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"1731fe55-4bf2-4410-85f9-58124ed652c9\") " pod="openstack/openstack-cell1-galera-0" Mar 09 09:24:56 crc kubenswrapper[4792]: I0309 09:24:56.091157 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/1731fe55-4bf2-4410-85f9-58124ed652c9-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"1731fe55-4bf2-4410-85f9-58124ed652c9\") " pod="openstack/openstack-cell1-galera-0" Mar 09 09:24:56 crc kubenswrapper[4792]: I0309 09:24:56.091198 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cw7jc\" (UniqueName: \"kubernetes.io/projected/1731fe55-4bf2-4410-85f9-58124ed652c9-kube-api-access-cw7jc\") pod \"openstack-cell1-galera-0\" (UID: \"1731fe55-4bf2-4410-85f9-58124ed652c9\") " pod="openstack/openstack-cell1-galera-0" Mar 09 09:24:56 crc kubenswrapper[4792]: I0309 09:24:56.091228 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: 
\"kubernetes.io/empty-dir/1731fe55-4bf2-4410-85f9-58124ed652c9-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"1731fe55-4bf2-4410-85f9-58124ed652c9\") " pod="openstack/openstack-cell1-galera-0" Mar 09 09:24:56 crc kubenswrapper[4792]: I0309 09:24:56.135465 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Mar 09 09:24:56 crc kubenswrapper[4792]: W0309 09:24:56.165329 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7dd0ce66_42bf_4c00_8e99_3c58defcc87f.slice/crio-52da7b6d637ba90c8a7bac11cc7256b41d2a073f4bbe49f7e8e5f2789af3fd6e WatchSource:0}: Error finding container 52da7b6d637ba90c8a7bac11cc7256b41d2a073f4bbe49f7e8e5f2789af3fd6e: Status 404 returned error can't find the container with id 52da7b6d637ba90c8a7bac11cc7256b41d2a073f4bbe49f7e8e5f2789af3fd6e Mar 09 09:24:56 crc kubenswrapper[4792]: I0309 09:24:56.192896 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/1731fe55-4bf2-4410-85f9-58124ed652c9-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"1731fe55-4bf2-4410-85f9-58124ed652c9\") " pod="openstack/openstack-cell1-galera-0" Mar 09 09:24:56 crc kubenswrapper[4792]: I0309 09:24:56.192987 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cw7jc\" (UniqueName: \"kubernetes.io/projected/1731fe55-4bf2-4410-85f9-58124ed652c9-kube-api-access-cw7jc\") pod \"openstack-cell1-galera-0\" (UID: \"1731fe55-4bf2-4410-85f9-58124ed652c9\") " pod="openstack/openstack-cell1-galera-0" Mar 09 09:24:56 crc kubenswrapper[4792]: I0309 09:24:56.193019 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/1731fe55-4bf2-4410-85f9-58124ed652c9-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: 
\"1731fe55-4bf2-4410-85f9-58124ed652c9\") " pod="openstack/openstack-cell1-galera-0" Mar 09 09:24:56 crc kubenswrapper[4792]: I0309 09:24:56.193062 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/1731fe55-4bf2-4410-85f9-58124ed652c9-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"1731fe55-4bf2-4410-85f9-58124ed652c9\") " pod="openstack/openstack-cell1-galera-0" Mar 09 09:24:56 crc kubenswrapper[4792]: I0309 09:24:56.193115 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1731fe55-4bf2-4410-85f9-58124ed652c9-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"1731fe55-4bf2-4410-85f9-58124ed652c9\") " pod="openstack/openstack-cell1-galera-0" Mar 09 09:24:56 crc kubenswrapper[4792]: I0309 09:24:56.193148 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-cell1-galera-0\" (UID: \"1731fe55-4bf2-4410-85f9-58124ed652c9\") " pod="openstack/openstack-cell1-galera-0" Mar 09 09:24:56 crc kubenswrapper[4792]: I0309 09:24:56.193195 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1731fe55-4bf2-4410-85f9-58124ed652c9-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"1731fe55-4bf2-4410-85f9-58124ed652c9\") " pod="openstack/openstack-cell1-galera-0" Mar 09 09:24:56 crc kubenswrapper[4792]: I0309 09:24:56.193230 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/1731fe55-4bf2-4410-85f9-58124ed652c9-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"1731fe55-4bf2-4410-85f9-58124ed652c9\") " pod="openstack/openstack-cell1-galera-0" Mar 09 09:24:56 
crc kubenswrapper[4792]: I0309 09:24:56.194124 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/1731fe55-4bf2-4410-85f9-58124ed652c9-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"1731fe55-4bf2-4410-85f9-58124ed652c9\") " pod="openstack/openstack-cell1-galera-0" Mar 09 09:24:56 crc kubenswrapper[4792]: I0309 09:24:56.195450 4792 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-cell1-galera-0\" (UID: \"1731fe55-4bf2-4410-85f9-58124ed652c9\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/openstack-cell1-galera-0" Mar 09 09:24:56 crc kubenswrapper[4792]: I0309 09:24:56.195978 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/1731fe55-4bf2-4410-85f9-58124ed652c9-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"1731fe55-4bf2-4410-85f9-58124ed652c9\") " pod="openstack/openstack-cell1-galera-0" Mar 09 09:24:56 crc kubenswrapper[4792]: I0309 09:24:56.196303 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/1731fe55-4bf2-4410-85f9-58124ed652c9-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"1731fe55-4bf2-4410-85f9-58124ed652c9\") " pod="openstack/openstack-cell1-galera-0" Mar 09 09:24:56 crc kubenswrapper[4792]: I0309 09:24:56.197025 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1731fe55-4bf2-4410-85f9-58124ed652c9-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"1731fe55-4bf2-4410-85f9-58124ed652c9\") " pod="openstack/openstack-cell1-galera-0" Mar 09 09:24:56 crc kubenswrapper[4792]: I0309 09:24:56.206994 4792 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/1731fe55-4bf2-4410-85f9-58124ed652c9-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"1731fe55-4bf2-4410-85f9-58124ed652c9\") " pod="openstack/openstack-cell1-galera-0" Mar 09 09:24:56 crc kubenswrapper[4792]: I0309 09:24:56.219997 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1731fe55-4bf2-4410-85f9-58124ed652c9-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"1731fe55-4bf2-4410-85f9-58124ed652c9\") " pod="openstack/openstack-cell1-galera-0" Mar 09 09:24:56 crc kubenswrapper[4792]: I0309 09:24:56.231469 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cw7jc\" (UniqueName: \"kubernetes.io/projected/1731fe55-4bf2-4410-85f9-58124ed652c9-kube-api-access-cw7jc\") pod \"openstack-cell1-galera-0\" (UID: \"1731fe55-4bf2-4410-85f9-58124ed652c9\") " pod="openstack/openstack-cell1-galera-0" Mar 09 09:24:56 crc kubenswrapper[4792]: I0309 09:24:56.278189 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-cell1-galera-0\" (UID: \"1731fe55-4bf2-4410-85f9-58124ed652c9\") " pod="openstack/openstack-cell1-galera-0" Mar 09 09:24:56 crc kubenswrapper[4792]: I0309 09:24:56.306756 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 09 09:24:56 crc kubenswrapper[4792]: I0309 09:24:56.344148 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Mar 09 09:24:56 crc kubenswrapper[4792]: I0309 09:24:56.345789 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Mar 09 09:24:56 crc kubenswrapper[4792]: I0309 09:24:56.349699 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-4fxrj" Mar 09 09:24:56 crc kubenswrapper[4792]: I0309 09:24:56.349899 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Mar 09 09:24:56 crc kubenswrapper[4792]: I0309 09:24:56.350322 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Mar 09 09:24:56 crc kubenswrapper[4792]: I0309 09:24:56.423384 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Mar 09 09:24:56 crc kubenswrapper[4792]: I0309 09:24:56.496817 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22afdfd4-ea58-4efb-b316-bcb40c906952-combined-ca-bundle\") pod \"memcached-0\" (UID: \"22afdfd4-ea58-4efb-b316-bcb40c906952\") " pod="openstack/memcached-0" Mar 09 09:24:56 crc kubenswrapper[4792]: I0309 09:24:56.496939 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/22afdfd4-ea58-4efb-b316-bcb40c906952-kolla-config\") pod \"memcached-0\" (UID: \"22afdfd4-ea58-4efb-b316-bcb40c906952\") " pod="openstack/memcached-0" Mar 09 09:24:56 crc kubenswrapper[4792]: I0309 09:24:56.497000 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/22afdfd4-ea58-4efb-b316-bcb40c906952-memcached-tls-certs\") pod \"memcached-0\" (UID: \"22afdfd4-ea58-4efb-b316-bcb40c906952\") " pod="openstack/memcached-0" Mar 09 09:24:56 crc kubenswrapper[4792]: I0309 09:24:56.497159 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-sksvd\" (UniqueName: \"kubernetes.io/projected/22afdfd4-ea58-4efb-b316-bcb40c906952-kube-api-access-sksvd\") pod \"memcached-0\" (UID: \"22afdfd4-ea58-4efb-b316-bcb40c906952\") " pod="openstack/memcached-0" Mar 09 09:24:56 crc kubenswrapper[4792]: I0309 09:24:56.497196 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/22afdfd4-ea58-4efb-b316-bcb40c906952-config-data\") pod \"memcached-0\" (UID: \"22afdfd4-ea58-4efb-b316-bcb40c906952\") " pod="openstack/memcached-0" Mar 09 09:24:56 crc kubenswrapper[4792]: I0309 09:24:56.599333 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22afdfd4-ea58-4efb-b316-bcb40c906952-combined-ca-bundle\") pod \"memcached-0\" (UID: \"22afdfd4-ea58-4efb-b316-bcb40c906952\") " pod="openstack/memcached-0" Mar 09 09:24:56 crc kubenswrapper[4792]: I0309 09:24:56.600238 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/22afdfd4-ea58-4efb-b316-bcb40c906952-kolla-config\") pod \"memcached-0\" (UID: \"22afdfd4-ea58-4efb-b316-bcb40c906952\") " pod="openstack/memcached-0" Mar 09 09:24:56 crc kubenswrapper[4792]: I0309 09:24:56.600280 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/22afdfd4-ea58-4efb-b316-bcb40c906952-memcached-tls-certs\") pod \"memcached-0\" (UID: \"22afdfd4-ea58-4efb-b316-bcb40c906952\") " pod="openstack/memcached-0" Mar 09 09:24:56 crc kubenswrapper[4792]: I0309 09:24:56.600742 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sksvd\" (UniqueName: \"kubernetes.io/projected/22afdfd4-ea58-4efb-b316-bcb40c906952-kube-api-access-sksvd\") pod \"memcached-0\" (UID: 
\"22afdfd4-ea58-4efb-b316-bcb40c906952\") " pod="openstack/memcached-0" Mar 09 09:24:56 crc kubenswrapper[4792]: I0309 09:24:56.600789 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/22afdfd4-ea58-4efb-b316-bcb40c906952-config-data\") pod \"memcached-0\" (UID: \"22afdfd4-ea58-4efb-b316-bcb40c906952\") " pod="openstack/memcached-0" Mar 09 09:24:56 crc kubenswrapper[4792]: I0309 09:24:56.602007 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/22afdfd4-ea58-4efb-b316-bcb40c906952-config-data\") pod \"memcached-0\" (UID: \"22afdfd4-ea58-4efb-b316-bcb40c906952\") " pod="openstack/memcached-0" Mar 09 09:24:56 crc kubenswrapper[4792]: I0309 09:24:56.602750 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/22afdfd4-ea58-4efb-b316-bcb40c906952-kolla-config\") pod \"memcached-0\" (UID: \"22afdfd4-ea58-4efb-b316-bcb40c906952\") " pod="openstack/memcached-0" Mar 09 09:24:56 crc kubenswrapper[4792]: I0309 09:24:56.603865 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22afdfd4-ea58-4efb-b316-bcb40c906952-combined-ca-bundle\") pod \"memcached-0\" (UID: \"22afdfd4-ea58-4efb-b316-bcb40c906952\") " pod="openstack/memcached-0" Mar 09 09:24:56 crc kubenswrapper[4792]: I0309 09:24:56.611849 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/22afdfd4-ea58-4efb-b316-bcb40c906952-memcached-tls-certs\") pod \"memcached-0\" (UID: \"22afdfd4-ea58-4efb-b316-bcb40c906952\") " pod="openstack/memcached-0" Mar 09 09:24:56 crc kubenswrapper[4792]: I0309 09:24:56.634709 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sksvd\" (UniqueName: 
\"kubernetes.io/projected/22afdfd4-ea58-4efb-b316-bcb40c906952-kube-api-access-sksvd\") pod \"memcached-0\" (UID: \"22afdfd4-ea58-4efb-b316-bcb40c906952\") " pod="openstack/memcached-0" Mar 09 09:24:56 crc kubenswrapper[4792]: I0309 09:24:56.645242 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"7dd0ce66-42bf-4c00-8e99-3c58defcc87f","Type":"ContainerStarted","Data":"52da7b6d637ba90c8a7bac11cc7256b41d2a073f4bbe49f7e8e5f2789af3fd6e"} Mar 09 09:24:56 crc kubenswrapper[4792]: I0309 09:24:56.685953 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Mar 09 09:24:56 crc kubenswrapper[4792]: I0309 09:24:56.861519 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 09 09:24:57 crc kubenswrapper[4792]: I0309 09:24:57.618978 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Mar 09 09:24:57 crc kubenswrapper[4792]: I0309 09:24:57.704706 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"1731fe55-4bf2-4410-85f9-58124ed652c9","Type":"ContainerStarted","Data":"aad4b77521497e1c03d30ebe3be305d9d48c95b4417a70954e2851459e2859df"} Mar 09 09:24:58 crc kubenswrapper[4792]: I0309 09:24:58.503793 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Mar 09 09:24:58 crc kubenswrapper[4792]: I0309 09:24:58.504879 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 09 09:24:58 crc kubenswrapper[4792]: I0309 09:24:58.511992 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-z4q5c" Mar 09 09:24:58 crc kubenswrapper[4792]: I0309 09:24:58.537126 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 09 09:24:58 crc kubenswrapper[4792]: I0309 09:24:58.667578 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g98j9\" (UniqueName: \"kubernetes.io/projected/b03cd03b-4cb1-4f61-8c17-e9f9b2986e8d-kube-api-access-g98j9\") pod \"kube-state-metrics-0\" (UID: \"b03cd03b-4cb1-4f61-8c17-e9f9b2986e8d\") " pod="openstack/kube-state-metrics-0" Mar 09 09:24:58 crc kubenswrapper[4792]: I0309 09:24:58.769288 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g98j9\" (UniqueName: \"kubernetes.io/projected/b03cd03b-4cb1-4f61-8c17-e9f9b2986e8d-kube-api-access-g98j9\") pod \"kube-state-metrics-0\" (UID: \"b03cd03b-4cb1-4f61-8c17-e9f9b2986e8d\") " pod="openstack/kube-state-metrics-0" Mar 09 09:24:58 crc kubenswrapper[4792]: I0309 09:24:58.794908 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g98j9\" (UniqueName: \"kubernetes.io/projected/b03cd03b-4cb1-4f61-8c17-e9f9b2986e8d-kube-api-access-g98j9\") pod \"kube-state-metrics-0\" (UID: \"b03cd03b-4cb1-4f61-8c17-e9f9b2986e8d\") " pod="openstack/kube-state-metrics-0" Mar 09 09:24:58 crc kubenswrapper[4792]: I0309 09:24:58.843028 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 09 09:25:01 crc kubenswrapper[4792]: I0309 09:25:01.609367 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-kj9d8"] Mar 09 09:25:01 crc kubenswrapper[4792]: I0309 09:25:01.611301 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-kj9d8" Mar 09 09:25:01 crc kubenswrapper[4792]: I0309 09:25:01.615651 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-kjjd9" Mar 09 09:25:01 crc kubenswrapper[4792]: I0309 09:25:01.616126 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Mar 09 09:25:01 crc kubenswrapper[4792]: I0309 09:25:01.616465 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Mar 09 09:25:01 crc kubenswrapper[4792]: I0309 09:25:01.635744 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-kj9d8"] Mar 09 09:25:01 crc kubenswrapper[4792]: I0309 09:25:01.653651 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-gw65t"] Mar 09 09:25:01 crc kubenswrapper[4792]: I0309 09:25:01.655365 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-gw65t" Mar 09 09:25:01 crc kubenswrapper[4792]: I0309 09:25:01.704759 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-gw65t"] Mar 09 09:25:01 crc kubenswrapper[4792]: I0309 09:25:01.767808 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/438d928b-7565-4fe1-a005-2c6402835edf-ovn-controller-tls-certs\") pod \"ovn-controller-kj9d8\" (UID: \"438d928b-7565-4fe1-a005-2c6402835edf\") " pod="openstack/ovn-controller-kj9d8" Mar 09 09:25:01 crc kubenswrapper[4792]: I0309 09:25:01.767862 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wpk5t\" (UniqueName: \"kubernetes.io/projected/2fd40118-2613-4e01-a557-f7fc5f24e07c-kube-api-access-wpk5t\") pod \"ovn-controller-ovs-gw65t\" (UID: \"2fd40118-2613-4e01-a557-f7fc5f24e07c\") " pod="openstack/ovn-controller-ovs-gw65t" Mar 09 09:25:01 crc kubenswrapper[4792]: I0309 09:25:01.767902 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2fd40118-2613-4e01-a557-f7fc5f24e07c-scripts\") pod \"ovn-controller-ovs-gw65t\" (UID: \"2fd40118-2613-4e01-a557-f7fc5f24e07c\") " pod="openstack/ovn-controller-ovs-gw65t" Mar 09 09:25:01 crc kubenswrapper[4792]: I0309 09:25:01.767927 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/2fd40118-2613-4e01-a557-f7fc5f24e07c-var-run\") pod \"ovn-controller-ovs-gw65t\" (UID: \"2fd40118-2613-4e01-a557-f7fc5f24e07c\") " pod="openstack/ovn-controller-ovs-gw65t" Mar 09 09:25:01 crc kubenswrapper[4792]: I0309 09:25:01.767941 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/438d928b-7565-4fe1-a005-2c6402835edf-combined-ca-bundle\") pod \"ovn-controller-kj9d8\" (UID: \"438d928b-7565-4fe1-a005-2c6402835edf\") " pod="openstack/ovn-controller-kj9d8" Mar 09 09:25:01 crc kubenswrapper[4792]: I0309 09:25:01.767960 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/2fd40118-2613-4e01-a557-f7fc5f24e07c-var-lib\") pod \"ovn-controller-ovs-gw65t\" (UID: \"2fd40118-2613-4e01-a557-f7fc5f24e07c\") " pod="openstack/ovn-controller-ovs-gw65t" Mar 09 09:25:01 crc kubenswrapper[4792]: I0309 09:25:01.767978 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7lmpf\" (UniqueName: \"kubernetes.io/projected/438d928b-7565-4fe1-a005-2c6402835edf-kube-api-access-7lmpf\") pod \"ovn-controller-kj9d8\" (UID: \"438d928b-7565-4fe1-a005-2c6402835edf\") " pod="openstack/ovn-controller-kj9d8" Mar 09 09:25:01 crc kubenswrapper[4792]: I0309 09:25:01.767995 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/438d928b-7565-4fe1-a005-2c6402835edf-var-log-ovn\") pod \"ovn-controller-kj9d8\" (UID: \"438d928b-7565-4fe1-a005-2c6402835edf\") " pod="openstack/ovn-controller-kj9d8" Mar 09 09:25:01 crc kubenswrapper[4792]: I0309 09:25:01.768013 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/438d928b-7565-4fe1-a005-2c6402835edf-scripts\") pod \"ovn-controller-kj9d8\" (UID: \"438d928b-7565-4fe1-a005-2c6402835edf\") " pod="openstack/ovn-controller-kj9d8" Mar 09 09:25:01 crc kubenswrapper[4792]: I0309 09:25:01.768043 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: 
\"kubernetes.io/host-path/438d928b-7565-4fe1-a005-2c6402835edf-var-run\") pod \"ovn-controller-kj9d8\" (UID: \"438d928b-7565-4fe1-a005-2c6402835edf\") " pod="openstack/ovn-controller-kj9d8" Mar 09 09:25:01 crc kubenswrapper[4792]: I0309 09:25:01.768086 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/2fd40118-2613-4e01-a557-f7fc5f24e07c-var-log\") pod \"ovn-controller-ovs-gw65t\" (UID: \"2fd40118-2613-4e01-a557-f7fc5f24e07c\") " pod="openstack/ovn-controller-ovs-gw65t" Mar 09 09:25:01 crc kubenswrapper[4792]: I0309 09:25:01.768116 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/438d928b-7565-4fe1-a005-2c6402835edf-var-run-ovn\") pod \"ovn-controller-kj9d8\" (UID: \"438d928b-7565-4fe1-a005-2c6402835edf\") " pod="openstack/ovn-controller-kj9d8" Mar 09 09:25:01 crc kubenswrapper[4792]: I0309 09:25:01.768129 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/2fd40118-2613-4e01-a557-f7fc5f24e07c-etc-ovs\") pod \"ovn-controller-ovs-gw65t\" (UID: \"2fd40118-2613-4e01-a557-f7fc5f24e07c\") " pod="openstack/ovn-controller-ovs-gw65t" Mar 09 09:25:01 crc kubenswrapper[4792]: I0309 09:25:01.869624 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/438d928b-7565-4fe1-a005-2c6402835edf-var-run\") pod \"ovn-controller-kj9d8\" (UID: \"438d928b-7565-4fe1-a005-2c6402835edf\") " pod="openstack/ovn-controller-kj9d8" Mar 09 09:25:01 crc kubenswrapper[4792]: I0309 09:25:01.869677 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/2fd40118-2613-4e01-a557-f7fc5f24e07c-var-log\") pod \"ovn-controller-ovs-gw65t\" (UID: 
\"2fd40118-2613-4e01-a557-f7fc5f24e07c\") " pod="openstack/ovn-controller-ovs-gw65t" Mar 09 09:25:01 crc kubenswrapper[4792]: I0309 09:25:01.869722 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/438d928b-7565-4fe1-a005-2c6402835edf-var-run-ovn\") pod \"ovn-controller-kj9d8\" (UID: \"438d928b-7565-4fe1-a005-2c6402835edf\") " pod="openstack/ovn-controller-kj9d8" Mar 09 09:25:01 crc kubenswrapper[4792]: I0309 09:25:01.869737 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/2fd40118-2613-4e01-a557-f7fc5f24e07c-etc-ovs\") pod \"ovn-controller-ovs-gw65t\" (UID: \"2fd40118-2613-4e01-a557-f7fc5f24e07c\") " pod="openstack/ovn-controller-ovs-gw65t" Mar 09 09:25:01 crc kubenswrapper[4792]: I0309 09:25:01.869771 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/438d928b-7565-4fe1-a005-2c6402835edf-ovn-controller-tls-certs\") pod \"ovn-controller-kj9d8\" (UID: \"438d928b-7565-4fe1-a005-2c6402835edf\") " pod="openstack/ovn-controller-kj9d8" Mar 09 09:25:01 crc kubenswrapper[4792]: I0309 09:25:01.869804 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wpk5t\" (UniqueName: \"kubernetes.io/projected/2fd40118-2613-4e01-a557-f7fc5f24e07c-kube-api-access-wpk5t\") pod \"ovn-controller-ovs-gw65t\" (UID: \"2fd40118-2613-4e01-a557-f7fc5f24e07c\") " pod="openstack/ovn-controller-ovs-gw65t" Mar 09 09:25:01 crc kubenswrapper[4792]: I0309 09:25:01.869859 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2fd40118-2613-4e01-a557-f7fc5f24e07c-scripts\") pod \"ovn-controller-ovs-gw65t\" (UID: \"2fd40118-2613-4e01-a557-f7fc5f24e07c\") " pod="openstack/ovn-controller-ovs-gw65t" Mar 09 09:25:01 crc 
kubenswrapper[4792]: I0309 09:25:01.869880 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/2fd40118-2613-4e01-a557-f7fc5f24e07c-var-run\") pod \"ovn-controller-ovs-gw65t\" (UID: \"2fd40118-2613-4e01-a557-f7fc5f24e07c\") " pod="openstack/ovn-controller-ovs-gw65t" Mar 09 09:25:01 crc kubenswrapper[4792]: I0309 09:25:01.869894 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/438d928b-7565-4fe1-a005-2c6402835edf-combined-ca-bundle\") pod \"ovn-controller-kj9d8\" (UID: \"438d928b-7565-4fe1-a005-2c6402835edf\") " pod="openstack/ovn-controller-kj9d8" Mar 09 09:25:01 crc kubenswrapper[4792]: I0309 09:25:01.869910 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/2fd40118-2613-4e01-a557-f7fc5f24e07c-var-lib\") pod \"ovn-controller-ovs-gw65t\" (UID: \"2fd40118-2613-4e01-a557-f7fc5f24e07c\") " pod="openstack/ovn-controller-ovs-gw65t" Mar 09 09:25:01 crc kubenswrapper[4792]: I0309 09:25:01.869929 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7lmpf\" (UniqueName: \"kubernetes.io/projected/438d928b-7565-4fe1-a005-2c6402835edf-kube-api-access-7lmpf\") pod \"ovn-controller-kj9d8\" (UID: \"438d928b-7565-4fe1-a005-2c6402835edf\") " pod="openstack/ovn-controller-kj9d8" Mar 09 09:25:01 crc kubenswrapper[4792]: I0309 09:25:01.869945 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/438d928b-7565-4fe1-a005-2c6402835edf-var-log-ovn\") pod \"ovn-controller-kj9d8\" (UID: \"438d928b-7565-4fe1-a005-2c6402835edf\") " pod="openstack/ovn-controller-kj9d8" Mar 09 09:25:01 crc kubenswrapper[4792]: I0309 09:25:01.869960 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/configmap/438d928b-7565-4fe1-a005-2c6402835edf-scripts\") pod \"ovn-controller-kj9d8\" (UID: \"438d928b-7565-4fe1-a005-2c6402835edf\") " pod="openstack/ovn-controller-kj9d8" Mar 09 09:25:01 crc kubenswrapper[4792]: I0309 09:25:01.870225 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/438d928b-7565-4fe1-a005-2c6402835edf-var-run\") pod \"ovn-controller-kj9d8\" (UID: \"438d928b-7565-4fe1-a005-2c6402835edf\") " pod="openstack/ovn-controller-kj9d8" Mar 09 09:25:01 crc kubenswrapper[4792]: I0309 09:25:01.870313 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/2fd40118-2613-4e01-a557-f7fc5f24e07c-var-log\") pod \"ovn-controller-ovs-gw65t\" (UID: \"2fd40118-2613-4e01-a557-f7fc5f24e07c\") " pod="openstack/ovn-controller-ovs-gw65t" Mar 09 09:25:01 crc kubenswrapper[4792]: I0309 09:25:01.870799 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/438d928b-7565-4fe1-a005-2c6402835edf-var-run-ovn\") pod \"ovn-controller-kj9d8\" (UID: \"438d928b-7565-4fe1-a005-2c6402835edf\") " pod="openstack/ovn-controller-kj9d8" Mar 09 09:25:01 crc kubenswrapper[4792]: I0309 09:25:01.870831 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/2fd40118-2613-4e01-a557-f7fc5f24e07c-var-lib\") pod \"ovn-controller-ovs-gw65t\" (UID: \"2fd40118-2613-4e01-a557-f7fc5f24e07c\") " pod="openstack/ovn-controller-ovs-gw65t" Mar 09 09:25:01 crc kubenswrapper[4792]: I0309 09:25:01.870925 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/438d928b-7565-4fe1-a005-2c6402835edf-var-log-ovn\") pod \"ovn-controller-kj9d8\" (UID: \"438d928b-7565-4fe1-a005-2c6402835edf\") " 
pod="openstack/ovn-controller-kj9d8" Mar 09 09:25:01 crc kubenswrapper[4792]: I0309 09:25:01.871190 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/2fd40118-2613-4e01-a557-f7fc5f24e07c-etc-ovs\") pod \"ovn-controller-ovs-gw65t\" (UID: \"2fd40118-2613-4e01-a557-f7fc5f24e07c\") " pod="openstack/ovn-controller-ovs-gw65t" Mar 09 09:25:01 crc kubenswrapper[4792]: I0309 09:25:01.871240 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/2fd40118-2613-4e01-a557-f7fc5f24e07c-var-run\") pod \"ovn-controller-ovs-gw65t\" (UID: \"2fd40118-2613-4e01-a557-f7fc5f24e07c\") " pod="openstack/ovn-controller-ovs-gw65t" Mar 09 09:25:01 crc kubenswrapper[4792]: I0309 09:25:01.871995 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/438d928b-7565-4fe1-a005-2c6402835edf-scripts\") pod \"ovn-controller-kj9d8\" (UID: \"438d928b-7565-4fe1-a005-2c6402835edf\") " pod="openstack/ovn-controller-kj9d8" Mar 09 09:25:01 crc kubenswrapper[4792]: I0309 09:25:01.872800 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2fd40118-2613-4e01-a557-f7fc5f24e07c-scripts\") pod \"ovn-controller-ovs-gw65t\" (UID: \"2fd40118-2613-4e01-a557-f7fc5f24e07c\") " pod="openstack/ovn-controller-ovs-gw65t" Mar 09 09:25:01 crc kubenswrapper[4792]: I0309 09:25:01.880279 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/438d928b-7565-4fe1-a005-2c6402835edf-combined-ca-bundle\") pod \"ovn-controller-kj9d8\" (UID: \"438d928b-7565-4fe1-a005-2c6402835edf\") " pod="openstack/ovn-controller-kj9d8" Mar 09 09:25:01 crc kubenswrapper[4792]: I0309 09:25:01.883576 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/438d928b-7565-4fe1-a005-2c6402835edf-ovn-controller-tls-certs\") pod \"ovn-controller-kj9d8\" (UID: \"438d928b-7565-4fe1-a005-2c6402835edf\") " pod="openstack/ovn-controller-kj9d8" Mar 09 09:25:01 crc kubenswrapper[4792]: I0309 09:25:01.897594 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7lmpf\" (UniqueName: \"kubernetes.io/projected/438d928b-7565-4fe1-a005-2c6402835edf-kube-api-access-7lmpf\") pod \"ovn-controller-kj9d8\" (UID: \"438d928b-7565-4fe1-a005-2c6402835edf\") " pod="openstack/ovn-controller-kj9d8" Mar 09 09:25:01 crc kubenswrapper[4792]: I0309 09:25:01.925600 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 09 09:25:01 crc kubenswrapper[4792]: I0309 09:25:01.926799 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 09 09:25:01 crc kubenswrapper[4792]: I0309 09:25:01.933576 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Mar 09 09:25:01 crc kubenswrapper[4792]: I0309 09:25:01.933760 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-bww52" Mar 09 09:25:01 crc kubenswrapper[4792]: I0309 09:25:01.933876 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Mar 09 09:25:01 crc kubenswrapper[4792]: I0309 09:25:01.934168 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Mar 09 09:25:01 crc kubenswrapper[4792]: I0309 09:25:01.934299 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Mar 09 09:25:01 crc kubenswrapper[4792]: I0309 09:25:01.936084 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wpk5t\" (UniqueName: 
\"kubernetes.io/projected/2fd40118-2613-4e01-a557-f7fc5f24e07c-kube-api-access-wpk5t\") pod \"ovn-controller-ovs-gw65t\" (UID: \"2fd40118-2613-4e01-a557-f7fc5f24e07c\") " pod="openstack/ovn-controller-ovs-gw65t" Mar 09 09:25:01 crc kubenswrapper[4792]: I0309 09:25:01.952839 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 09 09:25:01 crc kubenswrapper[4792]: I0309 09:25:01.965499 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-kj9d8" Mar 09 09:25:02 crc kubenswrapper[4792]: I0309 09:25:02.023625 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-gw65t" Mar 09 09:25:02 crc kubenswrapper[4792]: I0309 09:25:02.079406 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/02a198ef-826d-49cf-a6c5-134da45ad28b-config\") pod \"ovsdbserver-nb-0\" (UID: \"02a198ef-826d-49cf-a6c5-134da45ad28b\") " pod="openstack/ovsdbserver-nb-0" Mar 09 09:25:02 crc kubenswrapper[4792]: I0309 09:25:02.079628 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/02a198ef-826d-49cf-a6c5-134da45ad28b-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"02a198ef-826d-49cf-a6c5-134da45ad28b\") " pod="openstack/ovsdbserver-nb-0" Mar 09 09:25:02 crc kubenswrapper[4792]: I0309 09:25:02.079668 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-49p65\" (UniqueName: \"kubernetes.io/projected/02a198ef-826d-49cf-a6c5-134da45ad28b-kube-api-access-49p65\") pod \"ovsdbserver-nb-0\" (UID: \"02a198ef-826d-49cf-a6c5-134da45ad28b\") " pod="openstack/ovsdbserver-nb-0" Mar 09 09:25:02 crc kubenswrapper[4792]: I0309 09:25:02.084389 4792 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/02a198ef-826d-49cf-a6c5-134da45ad28b-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"02a198ef-826d-49cf-a6c5-134da45ad28b\") " pod="openstack/ovsdbserver-nb-0" Mar 09 09:25:02 crc kubenswrapper[4792]: I0309 09:25:02.084468 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/02a198ef-826d-49cf-a6c5-134da45ad28b-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"02a198ef-826d-49cf-a6c5-134da45ad28b\") " pod="openstack/ovsdbserver-nb-0" Mar 09 09:25:02 crc kubenswrapper[4792]: I0309 09:25:02.084531 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02a198ef-826d-49cf-a6c5-134da45ad28b-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"02a198ef-826d-49cf-a6c5-134da45ad28b\") " pod="openstack/ovsdbserver-nb-0" Mar 09 09:25:02 crc kubenswrapper[4792]: I0309 09:25:02.084560 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-nb-0\" (UID: \"02a198ef-826d-49cf-a6c5-134da45ad28b\") " pod="openstack/ovsdbserver-nb-0" Mar 09 09:25:02 crc kubenswrapper[4792]: I0309 09:25:02.084704 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/02a198ef-826d-49cf-a6c5-134da45ad28b-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"02a198ef-826d-49cf-a6c5-134da45ad28b\") " pod="openstack/ovsdbserver-nb-0" Mar 09 09:25:02 crc kubenswrapper[4792]: I0309 09:25:02.187876 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/02a198ef-826d-49cf-a6c5-134da45ad28b-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"02a198ef-826d-49cf-a6c5-134da45ad28b\") " pod="openstack/ovsdbserver-nb-0" Mar 09 09:25:02 crc kubenswrapper[4792]: I0309 09:25:02.187922 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-49p65\" (UniqueName: \"kubernetes.io/projected/02a198ef-826d-49cf-a6c5-134da45ad28b-kube-api-access-49p65\") pod \"ovsdbserver-nb-0\" (UID: \"02a198ef-826d-49cf-a6c5-134da45ad28b\") " pod="openstack/ovsdbserver-nb-0" Mar 09 09:25:02 crc kubenswrapper[4792]: I0309 09:25:02.187967 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/02a198ef-826d-49cf-a6c5-134da45ad28b-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"02a198ef-826d-49cf-a6c5-134da45ad28b\") " pod="openstack/ovsdbserver-nb-0" Mar 09 09:25:02 crc kubenswrapper[4792]: I0309 09:25:02.188006 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/02a198ef-826d-49cf-a6c5-134da45ad28b-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"02a198ef-826d-49cf-a6c5-134da45ad28b\") " pod="openstack/ovsdbserver-nb-0" Mar 09 09:25:02 crc kubenswrapper[4792]: I0309 09:25:02.188028 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02a198ef-826d-49cf-a6c5-134da45ad28b-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"02a198ef-826d-49cf-a6c5-134da45ad28b\") " pod="openstack/ovsdbserver-nb-0" Mar 09 09:25:02 crc kubenswrapper[4792]: I0309 09:25:02.188043 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-nb-0\" (UID: \"02a198ef-826d-49cf-a6c5-134da45ad28b\") " 
pod="openstack/ovsdbserver-nb-0" Mar 09 09:25:02 crc kubenswrapper[4792]: I0309 09:25:02.188121 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/02a198ef-826d-49cf-a6c5-134da45ad28b-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"02a198ef-826d-49cf-a6c5-134da45ad28b\") " pod="openstack/ovsdbserver-nb-0" Mar 09 09:25:02 crc kubenswrapper[4792]: I0309 09:25:02.188155 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/02a198ef-826d-49cf-a6c5-134da45ad28b-config\") pod \"ovsdbserver-nb-0\" (UID: \"02a198ef-826d-49cf-a6c5-134da45ad28b\") " pod="openstack/ovsdbserver-nb-0" Mar 09 09:25:02 crc kubenswrapper[4792]: I0309 09:25:02.189349 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/02a198ef-826d-49cf-a6c5-134da45ad28b-config\") pod \"ovsdbserver-nb-0\" (UID: \"02a198ef-826d-49cf-a6c5-134da45ad28b\") " pod="openstack/ovsdbserver-nb-0" Mar 09 09:25:02 crc kubenswrapper[4792]: I0309 09:25:02.191014 4792 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-nb-0\" (UID: \"02a198ef-826d-49cf-a6c5-134da45ad28b\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/ovsdbserver-nb-0" Mar 09 09:25:02 crc kubenswrapper[4792]: I0309 09:25:02.192299 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/02a198ef-826d-49cf-a6c5-134da45ad28b-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"02a198ef-826d-49cf-a6c5-134da45ad28b\") " pod="openstack/ovsdbserver-nb-0" Mar 09 09:25:02 crc kubenswrapper[4792]: I0309 09:25:02.194686 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/02a198ef-826d-49cf-a6c5-134da45ad28b-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"02a198ef-826d-49cf-a6c5-134da45ad28b\") " pod="openstack/ovsdbserver-nb-0" Mar 09 09:25:02 crc kubenswrapper[4792]: I0309 09:25:02.196791 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/02a198ef-826d-49cf-a6c5-134da45ad28b-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"02a198ef-826d-49cf-a6c5-134da45ad28b\") " pod="openstack/ovsdbserver-nb-0" Mar 09 09:25:02 crc kubenswrapper[4792]: I0309 09:25:02.199599 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/02a198ef-826d-49cf-a6c5-134da45ad28b-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"02a198ef-826d-49cf-a6c5-134da45ad28b\") " pod="openstack/ovsdbserver-nb-0" Mar 09 09:25:02 crc kubenswrapper[4792]: I0309 09:25:02.209215 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-49p65\" (UniqueName: \"kubernetes.io/projected/02a198ef-826d-49cf-a6c5-134da45ad28b-kube-api-access-49p65\") pod \"ovsdbserver-nb-0\" (UID: \"02a198ef-826d-49cf-a6c5-134da45ad28b\") " pod="openstack/ovsdbserver-nb-0" Mar 09 09:25:02 crc kubenswrapper[4792]: I0309 09:25:02.221134 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/02a198ef-826d-49cf-a6c5-134da45ad28b-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"02a198ef-826d-49cf-a6c5-134da45ad28b\") " pod="openstack/ovsdbserver-nb-0" Mar 09 09:25:02 crc kubenswrapper[4792]: I0309 09:25:02.259460 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-nb-0\" (UID: \"02a198ef-826d-49cf-a6c5-134da45ad28b\") " pod="openstack/ovsdbserver-nb-0" Mar 09 
09:25:02 crc kubenswrapper[4792]: I0309 09:25:02.289623 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 09 09:25:05 crc kubenswrapper[4792]: I0309 09:25:05.574190 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 09 09:25:05 crc kubenswrapper[4792]: I0309 09:25:05.576138 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 09 09:25:05 crc kubenswrapper[4792]: I0309 09:25:05.578797 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-t6wsp" Mar 09 09:25:05 crc kubenswrapper[4792]: I0309 09:25:05.579166 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Mar 09 09:25:05 crc kubenswrapper[4792]: I0309 09:25:05.579343 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Mar 09 09:25:05 crc kubenswrapper[4792]: I0309 09:25:05.579556 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Mar 09 09:25:05 crc kubenswrapper[4792]: I0309 09:25:05.588477 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 09 09:25:05 crc kubenswrapper[4792]: I0309 09:25:05.668410 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b99fdd60-0b01-4b3e-ad0b-0f32f7427f48-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"b99fdd60-0b01-4b3e-ad0b-0f32f7427f48\") " pod="openstack/ovsdbserver-sb-0" Mar 09 09:25:05 crc kubenswrapper[4792]: I0309 09:25:05.668526 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod 
\"ovsdbserver-sb-0\" (UID: \"b99fdd60-0b01-4b3e-ad0b-0f32f7427f48\") " pod="openstack/ovsdbserver-sb-0" Mar 09 09:25:05 crc kubenswrapper[4792]: I0309 09:25:05.668566 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b99fdd60-0b01-4b3e-ad0b-0f32f7427f48-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"b99fdd60-0b01-4b3e-ad0b-0f32f7427f48\") " pod="openstack/ovsdbserver-sb-0" Mar 09 09:25:05 crc kubenswrapper[4792]: I0309 09:25:05.668585 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b99fdd60-0b01-4b3e-ad0b-0f32f7427f48-config\") pod \"ovsdbserver-sb-0\" (UID: \"b99fdd60-0b01-4b3e-ad0b-0f32f7427f48\") " pod="openstack/ovsdbserver-sb-0" Mar 09 09:25:05 crc kubenswrapper[4792]: I0309 09:25:05.668628 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b99fdd60-0b01-4b3e-ad0b-0f32f7427f48-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"b99fdd60-0b01-4b3e-ad0b-0f32f7427f48\") " pod="openstack/ovsdbserver-sb-0" Mar 09 09:25:05 crc kubenswrapper[4792]: I0309 09:25:05.668661 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b66g9\" (UniqueName: \"kubernetes.io/projected/b99fdd60-0b01-4b3e-ad0b-0f32f7427f48-kube-api-access-b66g9\") pod \"ovsdbserver-sb-0\" (UID: \"b99fdd60-0b01-4b3e-ad0b-0f32f7427f48\") " pod="openstack/ovsdbserver-sb-0" Mar 09 09:25:05 crc kubenswrapper[4792]: I0309 09:25:05.668686 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b99fdd60-0b01-4b3e-ad0b-0f32f7427f48-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"b99fdd60-0b01-4b3e-ad0b-0f32f7427f48\") " 
pod="openstack/ovsdbserver-sb-0" Mar 09 09:25:05 crc kubenswrapper[4792]: I0309 09:25:05.668719 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b99fdd60-0b01-4b3e-ad0b-0f32f7427f48-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"b99fdd60-0b01-4b3e-ad0b-0f32f7427f48\") " pod="openstack/ovsdbserver-sb-0" Mar 09 09:25:05 crc kubenswrapper[4792]: I0309 09:25:05.770469 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b99fdd60-0b01-4b3e-ad0b-0f32f7427f48-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"b99fdd60-0b01-4b3e-ad0b-0f32f7427f48\") " pod="openstack/ovsdbserver-sb-0" Mar 09 09:25:05 crc kubenswrapper[4792]: I0309 09:25:05.770527 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b99fdd60-0b01-4b3e-ad0b-0f32f7427f48-config\") pod \"ovsdbserver-sb-0\" (UID: \"b99fdd60-0b01-4b3e-ad0b-0f32f7427f48\") " pod="openstack/ovsdbserver-sb-0" Mar 09 09:25:05 crc kubenswrapper[4792]: I0309 09:25:05.770590 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b99fdd60-0b01-4b3e-ad0b-0f32f7427f48-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"b99fdd60-0b01-4b3e-ad0b-0f32f7427f48\") " pod="openstack/ovsdbserver-sb-0" Mar 09 09:25:05 crc kubenswrapper[4792]: I0309 09:25:05.770627 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b66g9\" (UniqueName: \"kubernetes.io/projected/b99fdd60-0b01-4b3e-ad0b-0f32f7427f48-kube-api-access-b66g9\") pod \"ovsdbserver-sb-0\" (UID: \"b99fdd60-0b01-4b3e-ad0b-0f32f7427f48\") " pod="openstack/ovsdbserver-sb-0" Mar 09 09:25:05 crc kubenswrapper[4792]: I0309 09:25:05.770655 4792 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b99fdd60-0b01-4b3e-ad0b-0f32f7427f48-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"b99fdd60-0b01-4b3e-ad0b-0f32f7427f48\") " pod="openstack/ovsdbserver-sb-0" Mar 09 09:25:05 crc kubenswrapper[4792]: I0309 09:25:05.770721 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b99fdd60-0b01-4b3e-ad0b-0f32f7427f48-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"b99fdd60-0b01-4b3e-ad0b-0f32f7427f48\") " pod="openstack/ovsdbserver-sb-0" Mar 09 09:25:05 crc kubenswrapper[4792]: I0309 09:25:05.770765 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b99fdd60-0b01-4b3e-ad0b-0f32f7427f48-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"b99fdd60-0b01-4b3e-ad0b-0f32f7427f48\") " pod="openstack/ovsdbserver-sb-0" Mar 09 09:25:05 crc kubenswrapper[4792]: I0309 09:25:05.770820 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"ovsdbserver-sb-0\" (UID: \"b99fdd60-0b01-4b3e-ad0b-0f32f7427f48\") " pod="openstack/ovsdbserver-sb-0" Mar 09 09:25:05 crc kubenswrapper[4792]: I0309 09:25:05.772029 4792 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"ovsdbserver-sb-0\" (UID: \"b99fdd60-0b01-4b3e-ad0b-0f32f7427f48\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/ovsdbserver-sb-0" Mar 09 09:25:05 crc kubenswrapper[4792]: I0309 09:25:05.772636 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b99fdd60-0b01-4b3e-ad0b-0f32f7427f48-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: 
\"b99fdd60-0b01-4b3e-ad0b-0f32f7427f48\") " pod="openstack/ovsdbserver-sb-0" Mar 09 09:25:05 crc kubenswrapper[4792]: I0309 09:25:05.774412 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b99fdd60-0b01-4b3e-ad0b-0f32f7427f48-config\") pod \"ovsdbserver-sb-0\" (UID: \"b99fdd60-0b01-4b3e-ad0b-0f32f7427f48\") " pod="openstack/ovsdbserver-sb-0" Mar 09 09:25:05 crc kubenswrapper[4792]: I0309 09:25:05.774409 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b99fdd60-0b01-4b3e-ad0b-0f32f7427f48-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"b99fdd60-0b01-4b3e-ad0b-0f32f7427f48\") " pod="openstack/ovsdbserver-sb-0" Mar 09 09:25:05 crc kubenswrapper[4792]: I0309 09:25:05.777063 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b99fdd60-0b01-4b3e-ad0b-0f32f7427f48-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"b99fdd60-0b01-4b3e-ad0b-0f32f7427f48\") " pod="openstack/ovsdbserver-sb-0" Mar 09 09:25:05 crc kubenswrapper[4792]: I0309 09:25:05.778366 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b99fdd60-0b01-4b3e-ad0b-0f32f7427f48-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"b99fdd60-0b01-4b3e-ad0b-0f32f7427f48\") " pod="openstack/ovsdbserver-sb-0" Mar 09 09:25:05 crc kubenswrapper[4792]: I0309 09:25:05.779384 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b99fdd60-0b01-4b3e-ad0b-0f32f7427f48-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"b99fdd60-0b01-4b3e-ad0b-0f32f7427f48\") " pod="openstack/ovsdbserver-sb-0" Mar 09 09:25:05 crc kubenswrapper[4792]: I0309 09:25:05.793081 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-b66g9\" (UniqueName: \"kubernetes.io/projected/b99fdd60-0b01-4b3e-ad0b-0f32f7427f48-kube-api-access-b66g9\") pod \"ovsdbserver-sb-0\" (UID: \"b99fdd60-0b01-4b3e-ad0b-0f32f7427f48\") " pod="openstack/ovsdbserver-sb-0" Mar 09 09:25:05 crc kubenswrapper[4792]: I0309 09:25:05.799437 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"ovsdbserver-sb-0\" (UID: \"b99fdd60-0b01-4b3e-ad0b-0f32f7427f48\") " pod="openstack/ovsdbserver-sb-0" Mar 09 09:25:05 crc kubenswrapper[4792]: I0309 09:25:05.816007 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"22afdfd4-ea58-4efb-b316-bcb40c906952","Type":"ContainerStarted","Data":"391d7a33e7f280ad497714d1de5a5f29c0caf207af55fe04bb2399963ffbb4b2"} Mar 09 09:25:05 crc kubenswrapper[4792]: I0309 09:25:05.901019 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 09 09:25:06 crc kubenswrapper[4792]: I0309 09:25:06.369814 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-kj9d8"] Mar 09 09:25:15 crc kubenswrapper[4792]: E0309 09:25:15.965284 4792 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-rabbitmq@sha256:9e7397d61095b02a8c1deb24bca874bc0032aa18019f12d53e0eda8998b85447" Mar 09 09:25:15 crc kubenswrapper[4792]: E0309 09:25:15.966009 4792 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:setup-container,Image:quay.io/podified-antelope-centos9/openstack-rabbitmq@sha256:9e7397d61095b02a8c1deb24bca874bc0032aa18019f12d53e0eda8998b85447,Command:[sh -c cp /tmp/erlang-cookie-secret/.erlang.cookie /var/lib/rabbitmq/.erlang.cookie && chmod 600 /var/lib/rabbitmq/.erlang.cookie ; cp 
/tmp/rabbitmq-plugins/enabled_plugins /operator/enabled_plugins ; echo '[default]' > /var/lib/rabbitmq/.rabbitmqadmin.conf && sed -e 's/default_user/username/' -e 's/default_pass/password/' /tmp/default_user.conf >> /var/lib/rabbitmq/.rabbitmqadmin.conf && chmod 600 /var/lib/rabbitmq/.rabbitmqadmin.conf ; sleep 30],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:plugins-conf,ReadOnly:false,MountPath:/tmp/rabbitmq-plugins/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-erlang-cookie,ReadOnly:false,MountPath:/var/lib/rabbitmq/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:erlang-cookie-secret,ReadOnly:false,MountPath:/tmp/erlang-cookie-secret/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-plugins,ReadOnly:false,MountPath:/operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:persistence,ReadOnly:false,MountPath:/var/lib/rabbitmq/mnesia/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-confd,ReadOnly:false,MountPath:/tmp/default_user.conf,SubPath:default_user.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8nzbf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,Re
adOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cell1-server-0_openstack(0ee86e97-a22c-4089-9ce4-363cb0571173): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 09 09:25:15 crc kubenswrapper[4792]: E0309 09:25:15.967285 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/rabbitmq-cell1-server-0" podUID="0ee86e97-a22c-4089-9ce4-363cb0571173" Mar 09 09:25:16 crc kubenswrapper[4792]: E0309 09:25:16.903125 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-rabbitmq@sha256:9e7397d61095b02a8c1deb24bca874bc0032aa18019f12d53e0eda8998b85447\\\"\"" pod="openstack/rabbitmq-cell1-server-0" podUID="0ee86e97-a22c-4089-9ce4-363cb0571173" Mar 09 09:25:18 crc kubenswrapper[4792]: W0309 09:25:18.606345 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod438d928b_7565_4fe1_a005_2c6402835edf.slice/crio-c6fb741a76b7ac91460d55adbdeee8101518dc08a72c9c2c226975551eff9453 WatchSource:0}: Error finding container c6fb741a76b7ac91460d55adbdeee8101518dc08a72c9c2c226975551eff9453: Status 404 returned error can't find the container with id c6fb741a76b7ac91460d55adbdeee8101518dc08a72c9c2c226975551eff9453 Mar 09 09:25:18 crc kubenswrapper[4792]: E0309 09:25:18.636326 4792 log.go:32] "PullImage from image service 
failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-rabbitmq@sha256:9e7397d61095b02a8c1deb24bca874bc0032aa18019f12d53e0eda8998b85447" Mar 09 09:25:18 crc kubenswrapper[4792]: E0309 09:25:18.636512 4792 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:setup-container,Image:quay.io/podified-antelope-centos9/openstack-rabbitmq@sha256:9e7397d61095b02a8c1deb24bca874bc0032aa18019f12d53e0eda8998b85447,Command:[sh -c cp /tmp/erlang-cookie-secret/.erlang.cookie /var/lib/rabbitmq/.erlang.cookie && chmod 600 /var/lib/rabbitmq/.erlang.cookie ; cp /tmp/rabbitmq-plugins/enabled_plugins /operator/enabled_plugins ; echo '[default]' > /var/lib/rabbitmq/.rabbitmqadmin.conf && sed -e 's/default_user/username/' -e 's/default_pass/password/' /tmp/default_user.conf >> /var/lib/rabbitmq/.rabbitmqadmin.conf && chmod 600 /var/lib/rabbitmq/.rabbitmqadmin.conf ; sleep 30],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:plugins-conf,ReadOnly:false,MountPath:/tmp/rabbitmq-plugins/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-erlang-cookie,ReadOnly:false,MountPath:/var/lib/rabbitmq/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:erlang-cookie-secret,ReadOnly:false,MountPath:/tmp/erlang-cookie-secret/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-plugins,ReadOnly:false,MountPath:/operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:persistence,ReadOnly:false,MountPath:/var/lib/rabbitmq/mnesia/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-confd,ReadOnly:false,MountPath:/tmp/default_user.conf,SubPath:default_user.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fm6kj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-server-0_openstack(42b40fb0-d2c9-4bc2-a13f-4c099b244ced): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 09 09:25:18 crc 
kubenswrapper[4792]: E0309 09:25:18.637843 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/rabbitmq-server-0" podUID="42b40fb0-d2c9-4bc2-a13f-4c099b244ced" Mar 09 09:25:18 crc kubenswrapper[4792]: E0309 09:25:18.640242 4792 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-mariadb@sha256:763d1f1e8a1cf877c151c59609960fd2fa29e7e50001f8818122a2d51878befa" Mar 09 09:25:18 crc kubenswrapper[4792]: E0309 09:25:18.640366 4792 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:mysql-bootstrap,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:763d1f1e8a1cf877c151c59609960fd2fa29e7e50001f8818122a2d51878befa,Command:[bash /var/lib/operator-scripts/mysql_bootstrap.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:True,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:mysql-db,ReadOnly:false,MountPath:/var/lib/mysql,SubPath:mysql,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-default,ReadOnly:true,MountPath:/var/lib/config-data/default,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-generated,ReadOnly:false,MountPath:/var/lib/config-data/generated,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:operator-scripts,ReadOnly:true,MountPath:/var/lib/operator-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kolla-config,ReadOnly:true,MountPath:/var/lib/kolla/con
fig_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-7nlxd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openstack-galera-0_openstack(7dd0ce66-42bf-4c00-8e99-3c58defcc87f): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 09 09:25:18 crc kubenswrapper[4792]: E0309 09:25:18.641494 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/openstack-galera-0" podUID="7dd0ce66-42bf-4c00-8e99-3c58defcc87f" Mar 09 09:25:18 crc kubenswrapper[4792]: I0309 09:25:18.911888 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-kj9d8" event={"ID":"438d928b-7565-4fe1-a005-2c6402835edf","Type":"ContainerStarted","Data":"c6fb741a76b7ac91460d55adbdeee8101518dc08a72c9c2c226975551eff9453"} Mar 09 09:25:18 crc kubenswrapper[4792]: E0309 09:25:18.913883 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/podified-antelope-centos9/openstack-mariadb@sha256:763d1f1e8a1cf877c151c59609960fd2fa29e7e50001f8818122a2d51878befa\\\"\"" pod="openstack/openstack-galera-0" podUID="7dd0ce66-42bf-4c00-8e99-3c58defcc87f" Mar 09 09:25:18 crc kubenswrapper[4792]: E0309 09:25:18.915735 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-rabbitmq@sha256:9e7397d61095b02a8c1deb24bca874bc0032aa18019f12d53e0eda8998b85447\\\"\"" pod="openstack/rabbitmq-server-0" podUID="42b40fb0-d2c9-4bc2-a13f-4c099b244ced" Mar 09 09:25:23 crc kubenswrapper[4792]: E0309 09:25:23.717881 4792 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-mariadb@sha256:763d1f1e8a1cf877c151c59609960fd2fa29e7e50001f8818122a2d51878befa" Mar 09 09:25:23 crc kubenswrapper[4792]: E0309 09:25:23.718474 4792 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:mysql-bootstrap,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:763d1f1e8a1cf877c151c59609960fd2fa29e7e50001f8818122a2d51878befa,Command:[bash 
/var/lib/operator-scripts/mysql_bootstrap.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:True,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:mysql-db,ReadOnly:false,MountPath:/var/lib/mysql,SubPath:mysql,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-default,ReadOnly:true,MountPath:/var/lib/config-data/default,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-generated,ReadOnly:false,MountPath:/var/lib/config-data/generated,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:operator-scripts,ReadOnly:true,MountPath:/var/lib/operator-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kolla-config,ReadOnly:true,MountPath:/var/lib/kolla/config_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-cw7jc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
openstack-cell1-galera-0_openstack(1731fe55-4bf2-4410-85f9-58124ed652c9): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 09 09:25:23 crc kubenswrapper[4792]: E0309 09:25:23.719930 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/openstack-cell1-galera-0" podUID="1731fe55-4bf2-4410-85f9-58124ed652c9" Mar 09 09:25:23 crc kubenswrapper[4792]: E0309 09:25:23.963662 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-mariadb@sha256:763d1f1e8a1cf877c151c59609960fd2fa29e7e50001f8818122a2d51878befa\\\"\"" pod="openstack/openstack-cell1-galera-0" podUID="1731fe55-4bf2-4410-85f9-58124ed652c9" Mar 09 09:25:24 crc kubenswrapper[4792]: E0309 09:25:24.525659 4792 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-memcached@sha256:f434d78bf81ef3a2087c435011ff995697fc8e53555ba27c2b8d2425e38bda44" Mar 09 09:25:24 crc kubenswrapper[4792]: E0309 09:25:24.525937 4792 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:memcached,Image:quay.io/podified-antelope-centos9/openstack-memcached@sha256:f434d78bf81ef3a2087c435011ff995697fc8e53555ba27c2b8d2425e38bda44,Command:[/usr/bin/dumb-init -- 
/usr/local/bin/kolla_start],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:memcached,HostPort:0,ContainerPort:11211,Protocol:TCP,HostIP:,},ContainerPort{Name:memcached-tls,HostPort:0,ContainerPort:11212,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:POD_IPS,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIPs,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:CONFIG_HASH,Value:nbh596hbbh57bhf4h56fh578h7fhf7h59fh57h564h654h684h594h66fh586h555h5bch667h5dfh58bhddh697hffh79h587h689h699h666hbdh5d4q,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/src,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kolla-config,ReadOnly:true,MountPath:/var/lib/kolla/config_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:memcached-tls-certs,ReadOnly:true,MountPath:/var/lib/config-data/tls/certs/memcached.crt,SubPath:tls.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:memcached-tls-certs,ReadOnly:true,MountPath:/var/lib/config-data/tls/private/memcached.key,SubPath:tls.key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-sksvd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 11211 
},Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 11211 },Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42457,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42457,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod memcached-0_openstack(22afdfd4-ea58-4efb-b316-bcb40c906952): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 09 09:25:24 crc kubenswrapper[4792]: E0309 09:25:24.527131 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"memcached\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/memcached-0" podUID="22afdfd4-ea58-4efb-b316-bcb40c906952" Mar 09 09:25:24 crc kubenswrapper[4792]: E0309 09:25:24.968852 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"memcached\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-memcached@sha256:f434d78bf81ef3a2087c435011ff995697fc8e53555ba27c2b8d2425e38bda44\\\"\"" pod="openstack/memcached-0" podUID="22afdfd4-ea58-4efb-b316-bcb40c906952" Mar 09 09:25:25 crc kubenswrapper[4792]: I0309 
09:25:25.429716 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 09 09:25:25 crc kubenswrapper[4792]: W0309 09:25:25.802049 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod02a198ef_826d_49cf_a6c5_134da45ad28b.slice/crio-c320caeea93bd14d207b2e7cc8e89c7a7885e4ac730a72d1e690b88281123fef WatchSource:0}: Error finding container c320caeea93bd14d207b2e7cc8e89c7a7885e4ac730a72d1e690b88281123fef: Status 404 returned error can't find the container with id c320caeea93bd14d207b2e7cc8e89c7a7885e4ac730a72d1e690b88281123fef Mar 09 09:25:25 crc kubenswrapper[4792]: E0309 09:25:25.842554 4792 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:944833b50342d462c10637342bc85197a8cf099a3650df12e23854dde99af514" Mar 09 09:25:25 crc kubenswrapper[4792]: E0309 09:25:25.842695 4792 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:944833b50342d462c10637342bc85197a8cf099a3650df12e23854dde99af514,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jfxrh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-589db6c89c-2522t_openstack(a3964aa2-1832-4f88-973b-7cad0595f31f): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 09 09:25:25 crc kubenswrapper[4792]: E0309 09:25:25.844030 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack/dnsmasq-dns-589db6c89c-2522t" podUID="a3964aa2-1832-4f88-973b-7cad0595f31f" Mar 09 09:25:25 crc kubenswrapper[4792]: E0309 09:25:25.868639 4792 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:944833b50342d462c10637342bc85197a8cf099a3650df12e23854dde99af514" Mar 09 09:25:25 crc kubenswrapper[4792]: E0309 09:25:25.868805 4792 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:944833b50342d462c10637342bc85197a8cf099a3650df12e23854dde99af514,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n659h4h664hbh658h587h67ch89h587h8fh679hc6hf9h55fh644h5d5h698h68dh5cdh5ffh669h54ch9h689hb8hd4h5bfhd8h5d7h5fh665h574q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wszxb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},
},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-7c47bcb9f9-vvmxj_openstack(0acc36a0-f564-4683-a8d5-d76348b1cf3f): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 09 09:25:25 crc kubenswrapper[4792]: E0309 09:25:25.869744 4792 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:944833b50342d462c10637342bc85197a8cf099a3650df12e23854dde99af514" Mar 09 09:25:25 crc kubenswrapper[4792]: E0309 09:25:25.869923 4792 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:944833b50342d462c10637342bc85197a8cf099a3650df12e23854dde99af514,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-c9k8r,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-86bbd886cf-n4gp5_openstack(12b1d30a-41a4-4b18-a997-bc273133d3ab): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 09 09:25:25 crc kubenswrapper[4792]: E0309 09:25:25.869984 4792 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-7c47bcb9f9-vvmxj" podUID="0acc36a0-f564-4683-a8d5-d76348b1cf3f" Mar 09 09:25:25 crc kubenswrapper[4792]: E0309 09:25:25.871622 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-86bbd886cf-n4gp5" podUID="12b1d30a-41a4-4b18-a997-bc273133d3ab" Mar 09 09:25:25 crc kubenswrapper[4792]: E0309 09:25:25.945183 4792 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:944833b50342d462c10637342bc85197a8cf099a3650df12e23854dde99af514" Mar 09 09:25:25 crc kubenswrapper[4792]: E0309 09:25:25.945615 4792 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:944833b50342d462c10637342bc85197a8cf099a3650df12e23854dde99af514,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nfdh5dfhb6h64h676hc4h78h97h669h54chfbh696hb5h54bh5d4h6bh64h644h677h584h5cbh698h9dh5bbh5f8h5b8hcdh644h5c7h694hbfh589q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vlc8t,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78cb4465c9-wbtzh_openstack(b750127e-0708-4867-8cba-b31296edae00): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 09 09:25:25 crc kubenswrapper[4792]: E0309 09:25:25.947721 4792 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78cb4465c9-wbtzh" podUID="b750127e-0708-4867-8cba-b31296edae00" Mar 09 09:25:25 crc kubenswrapper[4792]: I0309 09:25:25.980520 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"02a198ef-826d-49cf-a6c5-134da45ad28b","Type":"ContainerStarted","Data":"c320caeea93bd14d207b2e7cc8e89c7a7885e4ac730a72d1e690b88281123fef"} Mar 09 09:25:25 crc kubenswrapper[4792]: E0309 09:25:25.982845 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:944833b50342d462c10637342bc85197a8cf099a3650df12e23854dde99af514\\\"\"" pod="openstack/dnsmasq-dns-7c47bcb9f9-vvmxj" podUID="0acc36a0-f564-4683-a8d5-d76348b1cf3f" Mar 09 09:25:25 crc kubenswrapper[4792]: E0309 09:25:25.983154 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:944833b50342d462c10637342bc85197a8cf099a3650df12e23854dde99af514\\\"\"" pod="openstack/dnsmasq-dns-78cb4465c9-wbtzh" podUID="b750127e-0708-4867-8cba-b31296edae00" Mar 09 09:25:26 crc kubenswrapper[4792]: I0309 09:25:26.383835 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 09 09:25:26 crc kubenswrapper[4792]: I0309 09:25:26.454815 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 09 09:25:26 crc kubenswrapper[4792]: I0309 09:25:26.566895 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-gw65t"] Mar 09 09:25:28 crc kubenswrapper[4792]: W0309 09:25:28.078492 4792 manager.go:1169] 
Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb99fdd60_0b01_4b3e_ad0b_0f32f7427f48.slice/crio-5212e35ac0e6006f5472acaa430a56d8174f28732718cdc64d1e32a7c6bafc9e WatchSource:0}: Error finding container 5212e35ac0e6006f5472acaa430a56d8174f28732718cdc64d1e32a7c6bafc9e: Status 404 returned error can't find the container with id 5212e35ac0e6006f5472acaa430a56d8174f28732718cdc64d1e32a7c6bafc9e Mar 09 09:25:28 crc kubenswrapper[4792]: W0309 09:25:28.093400 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2fd40118_2613_4e01_a557_f7fc5f24e07c.slice/crio-a5d07733ecd9cb6fbc3c29cfc424e416ca0fe6f2dbf0cceccbc5e0ac669f0135 WatchSource:0}: Error finding container a5d07733ecd9cb6fbc3c29cfc424e416ca0fe6f2dbf0cceccbc5e0ac669f0135: Status 404 returned error can't find the container with id a5d07733ecd9cb6fbc3c29cfc424e416ca0fe6f2dbf0cceccbc5e0ac669f0135 Mar 09 09:25:28 crc kubenswrapper[4792]: I0309 09:25:28.166299 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-589db6c89c-2522t" Mar 09 09:25:28 crc kubenswrapper[4792]: I0309 09:25:28.252118 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86bbd886cf-n4gp5" Mar 09 09:25:28 crc kubenswrapper[4792]: I0309 09:25:28.289697 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jfxrh\" (UniqueName: \"kubernetes.io/projected/a3964aa2-1832-4f88-973b-7cad0595f31f-kube-api-access-jfxrh\") pod \"a3964aa2-1832-4f88-973b-7cad0595f31f\" (UID: \"a3964aa2-1832-4f88-973b-7cad0595f31f\") " Mar 09 09:25:28 crc kubenswrapper[4792]: I0309 09:25:28.289770 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c9k8r\" (UniqueName: \"kubernetes.io/projected/12b1d30a-41a4-4b18-a997-bc273133d3ab-kube-api-access-c9k8r\") pod \"12b1d30a-41a4-4b18-a997-bc273133d3ab\" (UID: \"12b1d30a-41a4-4b18-a997-bc273133d3ab\") " Mar 09 09:25:28 crc kubenswrapper[4792]: I0309 09:25:28.289795 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a3964aa2-1832-4f88-973b-7cad0595f31f-config\") pod \"a3964aa2-1832-4f88-973b-7cad0595f31f\" (UID: \"a3964aa2-1832-4f88-973b-7cad0595f31f\") " Mar 09 09:25:28 crc kubenswrapper[4792]: I0309 09:25:28.289937 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/12b1d30a-41a4-4b18-a997-bc273133d3ab-config\") pod \"12b1d30a-41a4-4b18-a997-bc273133d3ab\" (UID: \"12b1d30a-41a4-4b18-a997-bc273133d3ab\") " Mar 09 09:25:28 crc kubenswrapper[4792]: I0309 09:25:28.289959 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/12b1d30a-41a4-4b18-a997-bc273133d3ab-dns-svc\") pod \"12b1d30a-41a4-4b18-a997-bc273133d3ab\" (UID: \"12b1d30a-41a4-4b18-a997-bc273133d3ab\") " Mar 09 09:25:28 crc kubenswrapper[4792]: I0309 09:25:28.291309 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/a3964aa2-1832-4f88-973b-7cad0595f31f-config" (OuterVolumeSpecName: "config") pod "a3964aa2-1832-4f88-973b-7cad0595f31f" (UID: "a3964aa2-1832-4f88-973b-7cad0595f31f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:25:28 crc kubenswrapper[4792]: I0309 09:25:28.292908 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/12b1d30a-41a4-4b18-a997-bc273133d3ab-config" (OuterVolumeSpecName: "config") pod "12b1d30a-41a4-4b18-a997-bc273133d3ab" (UID: "12b1d30a-41a4-4b18-a997-bc273133d3ab"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:25:28 crc kubenswrapper[4792]: I0309 09:25:28.293673 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/12b1d30a-41a4-4b18-a997-bc273133d3ab-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "12b1d30a-41a4-4b18-a997-bc273133d3ab" (UID: "12b1d30a-41a4-4b18-a997-bc273133d3ab"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:25:28 crc kubenswrapper[4792]: I0309 09:25:28.294508 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a3964aa2-1832-4f88-973b-7cad0595f31f-kube-api-access-jfxrh" (OuterVolumeSpecName: "kube-api-access-jfxrh") pod "a3964aa2-1832-4f88-973b-7cad0595f31f" (UID: "a3964aa2-1832-4f88-973b-7cad0595f31f"). InnerVolumeSpecName "kube-api-access-jfxrh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:25:28 crc kubenswrapper[4792]: I0309 09:25:28.295648 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/12b1d30a-41a4-4b18-a997-bc273133d3ab-kube-api-access-c9k8r" (OuterVolumeSpecName: "kube-api-access-c9k8r") pod "12b1d30a-41a4-4b18-a997-bc273133d3ab" (UID: "12b1d30a-41a4-4b18-a997-bc273133d3ab"). InnerVolumeSpecName "kube-api-access-c9k8r". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:25:28 crc kubenswrapper[4792]: I0309 09:25:28.392641 4792 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/12b1d30a-41a4-4b18-a997-bc273133d3ab-config\") on node \"crc\" DevicePath \"\"" Mar 09 09:25:28 crc kubenswrapper[4792]: I0309 09:25:28.392687 4792 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/12b1d30a-41a4-4b18-a997-bc273133d3ab-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 09 09:25:28 crc kubenswrapper[4792]: I0309 09:25:28.392701 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jfxrh\" (UniqueName: \"kubernetes.io/projected/a3964aa2-1832-4f88-973b-7cad0595f31f-kube-api-access-jfxrh\") on node \"crc\" DevicePath \"\"" Mar 09 09:25:28 crc kubenswrapper[4792]: I0309 09:25:28.392711 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c9k8r\" (UniqueName: \"kubernetes.io/projected/12b1d30a-41a4-4b18-a997-bc273133d3ab-kube-api-access-c9k8r\") on node \"crc\" DevicePath \"\"" Mar 09 09:25:28 crc kubenswrapper[4792]: I0309 09:25:28.392721 4792 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a3964aa2-1832-4f88-973b-7cad0595f31f-config\") on node \"crc\" DevicePath \"\"" Mar 09 09:25:29 crc kubenswrapper[4792]: I0309 09:25:29.021961 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86bbd886cf-n4gp5" event={"ID":"12b1d30a-41a4-4b18-a997-bc273133d3ab","Type":"ContainerDied","Data":"bd4359596d9822f360fd4d49e03f93d4eacbae5e00975f2dd8ba468aecb1a9ad"} Mar 09 09:25:29 crc kubenswrapper[4792]: I0309 09:25:29.021990 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86bbd886cf-n4gp5" Mar 09 09:25:29 crc kubenswrapper[4792]: I0309 09:25:29.024168 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"b03cd03b-4cb1-4f61-8c17-e9f9b2986e8d","Type":"ContainerStarted","Data":"943f255292d634f95fab9cfa91dfa16cf2dee232ee8164544178b2da4eec4c84"} Mar 09 09:25:29 crc kubenswrapper[4792]: I0309 09:25:29.025421 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-gw65t" event={"ID":"2fd40118-2613-4e01-a557-f7fc5f24e07c","Type":"ContainerStarted","Data":"a5d07733ecd9cb6fbc3c29cfc424e416ca0fe6f2dbf0cceccbc5e0ac669f0135"} Mar 09 09:25:29 crc kubenswrapper[4792]: I0309 09:25:29.032449 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-kj9d8" event={"ID":"438d928b-7565-4fe1-a005-2c6402835edf","Type":"ContainerStarted","Data":"42fa2481af17ec09160358b51708ed3b14aaa9a654a75eba70a0c616ec66930a"} Mar 09 09:25:29 crc kubenswrapper[4792]: I0309 09:25:29.033928 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-kj9d8" Mar 09 09:25:29 crc kubenswrapper[4792]: I0309 09:25:29.041279 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-589db6c89c-2522t" event={"ID":"a3964aa2-1832-4f88-973b-7cad0595f31f","Type":"ContainerDied","Data":"cca6af35a2199ca947e080a5436e0422d80bcc8673125630b216ab7628ad2bba"} Mar 09 09:25:29 crc kubenswrapper[4792]: I0309 09:25:29.041375 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-589db6c89c-2522t" Mar 09 09:25:29 crc kubenswrapper[4792]: I0309 09:25:29.056395 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"b99fdd60-0b01-4b3e-ad0b-0f32f7427f48","Type":"ContainerStarted","Data":"5212e35ac0e6006f5472acaa430a56d8174f28732718cdc64d1e32a7c6bafc9e"} Mar 09 09:25:29 crc kubenswrapper[4792]: I0309 09:25:29.073952 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-kj9d8" podStartSLOduration=18.523866393 podStartE2EDuration="28.073928549s" podCreationTimestamp="2026-03-09 09:25:01 +0000 UTC" firstStartedPulling="2026-03-09 09:25:18.609629656 +0000 UTC m=+1083.639830408" lastFinishedPulling="2026-03-09 09:25:28.159691812 +0000 UTC m=+1093.189892564" observedRunningTime="2026-03-09 09:25:29.068767978 +0000 UTC m=+1094.098968730" watchObservedRunningTime="2026-03-09 09:25:29.073928549 +0000 UTC m=+1094.104129301" Mar 09 09:25:29 crc kubenswrapper[4792]: I0309 09:25:29.151843 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86bbd886cf-n4gp5"] Mar 09 09:25:29 crc kubenswrapper[4792]: I0309 09:25:29.159865 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-86bbd886cf-n4gp5"] Mar 09 09:25:29 crc kubenswrapper[4792]: I0309 09:25:29.195106 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-589db6c89c-2522t"] Mar 09 09:25:29 crc kubenswrapper[4792]: I0309 09:25:29.202257 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-589db6c89c-2522t"] Mar 09 09:25:29 crc kubenswrapper[4792]: I0309 09:25:29.675457 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="12b1d30a-41a4-4b18-a997-bc273133d3ab" path="/var/lib/kubelet/pods/12b1d30a-41a4-4b18-a997-bc273133d3ab/volumes" Mar 09 09:25:29 crc kubenswrapper[4792]: I0309 09:25:29.676196 4792 kubelet_volumes.go:163] "Cleaned 
up orphaned pod volumes dir" podUID="a3964aa2-1832-4f88-973b-7cad0595f31f" path="/var/lib/kubelet/pods/a3964aa2-1832-4f88-973b-7cad0595f31f/volumes" Mar 09 09:25:30 crc kubenswrapper[4792]: I0309 09:25:30.066462 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"02a198ef-826d-49cf-a6c5-134da45ad28b","Type":"ContainerStarted","Data":"1c3dfd6c8f7f724e0c6bb59b406f1fe7175317643fdb9c98bd56ad68f0e7361a"} Mar 09 09:25:31 crc kubenswrapper[4792]: I0309 09:25:31.077224 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"b99fdd60-0b01-4b3e-ad0b-0f32f7427f48","Type":"ContainerStarted","Data":"9ea979f7bd49f386d5b644bb613621aa904ae6500edc103c541d052348ae8cb3"} Mar 09 09:25:31 crc kubenswrapper[4792]: I0309 09:25:31.080106 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-gw65t" event={"ID":"2fd40118-2613-4e01-a557-f7fc5f24e07c","Type":"ContainerStarted","Data":"4307824fd83ee5060b2a6c0a5fd1e45c5c9f1f5a51af0f7301dd2274731a86ed"} Mar 09 09:25:32 crc kubenswrapper[4792]: I0309 09:25:32.093162 4792 generic.go:334] "Generic (PLEG): container finished" podID="2fd40118-2613-4e01-a557-f7fc5f24e07c" containerID="4307824fd83ee5060b2a6c0a5fd1e45c5c9f1f5a51af0f7301dd2274731a86ed" exitCode=0 Mar 09 09:25:32 crc kubenswrapper[4792]: I0309 09:25:32.093547 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-gw65t" event={"ID":"2fd40118-2613-4e01-a557-f7fc5f24e07c","Type":"ContainerDied","Data":"4307824fd83ee5060b2a6c0a5fd1e45c5c9f1f5a51af0f7301dd2274731a86ed"} Mar 09 09:25:32 crc kubenswrapper[4792]: I0309 09:25:32.101809 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"0ee86e97-a22c-4089-9ce4-363cb0571173","Type":"ContainerStarted","Data":"dbd17cbb8b429cdcb0b12d986092a0771430752ae9708cfa2b6450eb12120d9f"} Mar 09 09:25:35 crc kubenswrapper[4792]: I0309 
09:25:35.152815 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"7dd0ce66-42bf-4c00-8e99-3c58defcc87f","Type":"ContainerStarted","Data":"a4b6b1f3dabf727ecb3fb305f0c2810817260cfc3c1e46a2b8a3f867d47d782a"} Mar 09 09:25:35 crc kubenswrapper[4792]: I0309 09:25:35.159404 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"02a198ef-826d-49cf-a6c5-134da45ad28b","Type":"ContainerStarted","Data":"2a28c0a6ec67f7c4cfdc48e35dc515f671328fe64b49e648ed13d7e8ad2084a9"} Mar 09 09:25:35 crc kubenswrapper[4792]: I0309 09:25:35.162227 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"b99fdd60-0b01-4b3e-ad0b-0f32f7427f48","Type":"ContainerStarted","Data":"d10bd12da6225f906d0e933688d1b26fb6482794789539e878cf3de36c9968d8"} Mar 09 09:25:35 crc kubenswrapper[4792]: I0309 09:25:35.164600 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"b03cd03b-4cb1-4f61-8c17-e9f9b2986e8d","Type":"ContainerStarted","Data":"18970ff52b5e8d8ebbafed712093856d88a669424e4f1698ec7d17801b83624b"} Mar 09 09:25:35 crc kubenswrapper[4792]: I0309 09:25:35.165165 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Mar 09 09:25:35 crc kubenswrapper[4792]: I0309 09:25:35.167428 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-gw65t" event={"ID":"2fd40118-2613-4e01-a557-f7fc5f24e07c","Type":"ContainerStarted","Data":"9e377750379268e9cb101e9c0bf114bf90b97f013e9e8f2738edcae17d12611b"} Mar 09 09:25:35 crc kubenswrapper[4792]: I0309 09:25:35.169254 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"1731fe55-4bf2-4410-85f9-58124ed652c9","Type":"ContainerStarted","Data":"ce431acc6f1c575753094c3f8910d2b41a69810cc62994d40a4dc0697c9b0f97"} Mar 09 09:25:35 crc 
kubenswrapper[4792]: I0309 09:25:35.213957 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=24.670334852 podStartE2EDuration="31.213942357s" podCreationTimestamp="2026-03-09 09:25:04 +0000 UTC" firstStartedPulling="2026-03-09 09:25:28.081484166 +0000 UTC m=+1093.111684918" lastFinishedPulling="2026-03-09 09:25:34.625091671 +0000 UTC m=+1099.655292423" observedRunningTime="2026-03-09 09:25:35.210123516 +0000 UTC m=+1100.240324278" watchObservedRunningTime="2026-03-09 09:25:35.213942357 +0000 UTC m=+1100.244143109" Mar 09 09:25:35 crc kubenswrapper[4792]: I0309 09:25:35.255897 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=30.76669616 podStartE2EDuration="37.255875945s" podCreationTimestamp="2026-03-09 09:24:58 +0000 UTC" firstStartedPulling="2026-03-09 09:25:28.079018194 +0000 UTC m=+1093.109218946" lastFinishedPulling="2026-03-09 09:25:34.568197979 +0000 UTC m=+1099.598398731" observedRunningTime="2026-03-09 09:25:35.251862328 +0000 UTC m=+1100.282063080" watchObservedRunningTime="2026-03-09 09:25:35.255875945 +0000 UTC m=+1100.286076697" Mar 09 09:25:35 crc kubenswrapper[4792]: I0309 09:25:35.259089 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=26.440695362 podStartE2EDuration="35.259079797s" podCreationTimestamp="2026-03-09 09:25:00 +0000 UTC" firstStartedPulling="2026-03-09 09:25:25.810102504 +0000 UTC m=+1090.840303256" lastFinishedPulling="2026-03-09 09:25:34.628486939 +0000 UTC m=+1099.658687691" observedRunningTime="2026-03-09 09:25:35.236498872 +0000 UTC m=+1100.266699624" watchObservedRunningTime="2026-03-09 09:25:35.259079797 +0000 UTC m=+1100.289280549" Mar 09 09:25:35 crc kubenswrapper[4792]: I0309 09:25:35.290363 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openstack/ovsdbserver-nb-0" Mar 09 09:25:35 crc kubenswrapper[4792]: I0309 09:25:35.331315 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Mar 09 09:25:35 crc kubenswrapper[4792]: I0309 09:25:35.901659 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Mar 09 09:25:35 crc kubenswrapper[4792]: I0309 09:25:35.902058 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Mar 09 09:25:35 crc kubenswrapper[4792]: I0309 09:25:35.939987 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Mar 09 09:25:36 crc kubenswrapper[4792]: I0309 09:25:36.181517 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-gw65t" event={"ID":"2fd40118-2613-4e01-a557-f7fc5f24e07c","Type":"ContainerStarted","Data":"d04d6c26d39e20d0d67bd91007f33b8b93d5c7751bdbdc4f39ed33abb746d33f"} Mar 09 09:25:36 crc kubenswrapper[4792]: I0309 09:25:36.181560 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-gw65t" Mar 09 09:25:36 crc kubenswrapper[4792]: I0309 09:25:36.181598 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-gw65t" Mar 09 09:25:36 crc kubenswrapper[4792]: I0309 09:25:36.193761 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"42b40fb0-d2c9-4bc2-a13f-4c099b244ced","Type":"ContainerStarted","Data":"12ef7a7568725de4169d980eaebeaae0632c46d8f4718c7352b6c167ad607668"} Mar 09 09:25:36 crc kubenswrapper[4792]: I0309 09:25:36.194858 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Mar 09 09:25:36 crc kubenswrapper[4792]: I0309 09:25:36.210126 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/ovn-controller-ovs-gw65t" podStartSLOduration=33.187760476 podStartE2EDuration="35.210104637s" podCreationTimestamp="2026-03-09 09:25:01 +0000 UTC" firstStartedPulling="2026-03-09 09:25:28.104023375 +0000 UTC m=+1093.134224127" lastFinishedPulling="2026-03-09 09:25:30.126367536 +0000 UTC m=+1095.156568288" observedRunningTime="2026-03-09 09:25:36.204242647 +0000 UTC m=+1101.234443389" watchObservedRunningTime="2026-03-09 09:25:36.210104637 +0000 UTC m=+1101.240305389" Mar 09 09:25:36 crc kubenswrapper[4792]: I0309 09:25:36.287283 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Mar 09 09:25:36 crc kubenswrapper[4792]: I0309 09:25:36.322312 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Mar 09 09:25:36 crc kubenswrapper[4792]: I0309 09:25:36.532418 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78cb4465c9-wbtzh"] Mar 09 09:25:36 crc kubenswrapper[4792]: I0309 09:25:36.631454 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-795cf8b45c-gkmjw"] Mar 09 09:25:36 crc kubenswrapper[4792]: I0309 09:25:36.632769 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-795cf8b45c-gkmjw" Mar 09 09:25:36 crc kubenswrapper[4792]: I0309 09:25:36.634812 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-mxwdc"] Mar 09 09:25:36 crc kubenswrapper[4792]: I0309 09:25:36.635984 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-mxwdc" Mar 09 09:25:36 crc kubenswrapper[4792]: I0309 09:25:36.636539 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Mar 09 09:25:36 crc kubenswrapper[4792]: I0309 09:25:36.642656 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Mar 09 09:25:36 crc kubenswrapper[4792]: I0309 09:25:36.656049 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ee2b0e26-39f3-4fb1-b1b8-d3b1cd6a27de-dns-svc\") pod \"dnsmasq-dns-795cf8b45c-gkmjw\" (UID: \"ee2b0e26-39f3-4fb1-b1b8-d3b1cd6a27de\") " pod="openstack/dnsmasq-dns-795cf8b45c-gkmjw" Mar 09 09:25:36 crc kubenswrapper[4792]: I0309 09:25:36.656116 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5bj8l\" (UniqueName: \"kubernetes.io/projected/9b94bbb1-5f6b-40c1-96b1-66a228166d91-kube-api-access-5bj8l\") pod \"ovn-controller-metrics-mxwdc\" (UID: \"9b94bbb1-5f6b-40c1-96b1-66a228166d91\") " pod="openstack/ovn-controller-metrics-mxwdc" Mar 09 09:25:36 crc kubenswrapper[4792]: I0309 09:25:36.656169 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/9b94bbb1-5f6b-40c1-96b1-66a228166d91-ovs-rundir\") pod \"ovn-controller-metrics-mxwdc\" (UID: \"9b94bbb1-5f6b-40c1-96b1-66a228166d91\") " pod="openstack/ovn-controller-metrics-mxwdc" Mar 09 09:25:36 crc kubenswrapper[4792]: I0309 09:25:36.656189 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b94bbb1-5f6b-40c1-96b1-66a228166d91-combined-ca-bundle\") pod \"ovn-controller-metrics-mxwdc\" (UID: \"9b94bbb1-5f6b-40c1-96b1-66a228166d91\") " 
pod="openstack/ovn-controller-metrics-mxwdc" Mar 09 09:25:36 crc kubenswrapper[4792]: I0309 09:25:36.656210 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b94bbb1-5f6b-40c1-96b1-66a228166d91-config\") pod \"ovn-controller-metrics-mxwdc\" (UID: \"9b94bbb1-5f6b-40c1-96b1-66a228166d91\") " pod="openstack/ovn-controller-metrics-mxwdc" Mar 09 09:25:36 crc kubenswrapper[4792]: I0309 09:25:36.656239 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ee2b0e26-39f3-4fb1-b1b8-d3b1cd6a27de-ovsdbserver-sb\") pod \"dnsmasq-dns-795cf8b45c-gkmjw\" (UID: \"ee2b0e26-39f3-4fb1-b1b8-d3b1cd6a27de\") " pod="openstack/dnsmasq-dns-795cf8b45c-gkmjw" Mar 09 09:25:36 crc kubenswrapper[4792]: I0309 09:25:36.656262 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k2j5f\" (UniqueName: \"kubernetes.io/projected/ee2b0e26-39f3-4fb1-b1b8-d3b1cd6a27de-kube-api-access-k2j5f\") pod \"dnsmasq-dns-795cf8b45c-gkmjw\" (UID: \"ee2b0e26-39f3-4fb1-b1b8-d3b1cd6a27de\") " pod="openstack/dnsmasq-dns-795cf8b45c-gkmjw" Mar 09 09:25:36 crc kubenswrapper[4792]: I0309 09:25:36.656332 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/9b94bbb1-5f6b-40c1-96b1-66a228166d91-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-mxwdc\" (UID: \"9b94bbb1-5f6b-40c1-96b1-66a228166d91\") " pod="openstack/ovn-controller-metrics-mxwdc" Mar 09 09:25:36 crc kubenswrapper[4792]: I0309 09:25:36.656350 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/9b94bbb1-5f6b-40c1-96b1-66a228166d91-ovn-rundir\") pod \"ovn-controller-metrics-mxwdc\" 
(UID: \"9b94bbb1-5f6b-40c1-96b1-66a228166d91\") " pod="openstack/ovn-controller-metrics-mxwdc" Mar 09 09:25:36 crc kubenswrapper[4792]: I0309 09:25:36.656408 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee2b0e26-39f3-4fb1-b1b8-d3b1cd6a27de-config\") pod \"dnsmasq-dns-795cf8b45c-gkmjw\" (UID: \"ee2b0e26-39f3-4fb1-b1b8-d3b1cd6a27de\") " pod="openstack/dnsmasq-dns-795cf8b45c-gkmjw" Mar 09 09:25:36 crc kubenswrapper[4792]: I0309 09:25:36.656554 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-mxwdc"] Mar 09 09:25:36 crc kubenswrapper[4792]: I0309 09:25:36.689827 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-795cf8b45c-gkmjw"] Mar 09 09:25:36 crc kubenswrapper[4792]: I0309 09:25:36.757894 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee2b0e26-39f3-4fb1-b1b8-d3b1cd6a27de-config\") pod \"dnsmasq-dns-795cf8b45c-gkmjw\" (UID: \"ee2b0e26-39f3-4fb1-b1b8-d3b1cd6a27de\") " pod="openstack/dnsmasq-dns-795cf8b45c-gkmjw" Mar 09 09:25:36 crc kubenswrapper[4792]: I0309 09:25:36.757955 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ee2b0e26-39f3-4fb1-b1b8-d3b1cd6a27de-dns-svc\") pod \"dnsmasq-dns-795cf8b45c-gkmjw\" (UID: \"ee2b0e26-39f3-4fb1-b1b8-d3b1cd6a27de\") " pod="openstack/dnsmasq-dns-795cf8b45c-gkmjw" Mar 09 09:25:36 crc kubenswrapper[4792]: I0309 09:25:36.757981 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5bj8l\" (UniqueName: \"kubernetes.io/projected/9b94bbb1-5f6b-40c1-96b1-66a228166d91-kube-api-access-5bj8l\") pod \"ovn-controller-metrics-mxwdc\" (UID: \"9b94bbb1-5f6b-40c1-96b1-66a228166d91\") " pod="openstack/ovn-controller-metrics-mxwdc" Mar 09 09:25:36 crc 
kubenswrapper[4792]: I0309 09:25:36.758049 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/9b94bbb1-5f6b-40c1-96b1-66a228166d91-ovs-rundir\") pod \"ovn-controller-metrics-mxwdc\" (UID: \"9b94bbb1-5f6b-40c1-96b1-66a228166d91\") " pod="openstack/ovn-controller-metrics-mxwdc" Mar 09 09:25:36 crc kubenswrapper[4792]: I0309 09:25:36.758086 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b94bbb1-5f6b-40c1-96b1-66a228166d91-combined-ca-bundle\") pod \"ovn-controller-metrics-mxwdc\" (UID: \"9b94bbb1-5f6b-40c1-96b1-66a228166d91\") " pod="openstack/ovn-controller-metrics-mxwdc" Mar 09 09:25:36 crc kubenswrapper[4792]: I0309 09:25:36.758111 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b94bbb1-5f6b-40c1-96b1-66a228166d91-config\") pod \"ovn-controller-metrics-mxwdc\" (UID: \"9b94bbb1-5f6b-40c1-96b1-66a228166d91\") " pod="openstack/ovn-controller-metrics-mxwdc" Mar 09 09:25:36 crc kubenswrapper[4792]: I0309 09:25:36.758142 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ee2b0e26-39f3-4fb1-b1b8-d3b1cd6a27de-ovsdbserver-sb\") pod \"dnsmasq-dns-795cf8b45c-gkmjw\" (UID: \"ee2b0e26-39f3-4fb1-b1b8-d3b1cd6a27de\") " pod="openstack/dnsmasq-dns-795cf8b45c-gkmjw" Mar 09 09:25:36 crc kubenswrapper[4792]: I0309 09:25:36.758161 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k2j5f\" (UniqueName: \"kubernetes.io/projected/ee2b0e26-39f3-4fb1-b1b8-d3b1cd6a27de-kube-api-access-k2j5f\") pod \"dnsmasq-dns-795cf8b45c-gkmjw\" (UID: \"ee2b0e26-39f3-4fb1-b1b8-d3b1cd6a27de\") " pod="openstack/dnsmasq-dns-795cf8b45c-gkmjw" Mar 09 09:25:36 crc kubenswrapper[4792]: I0309 09:25:36.758188 4792 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/9b94bbb1-5f6b-40c1-96b1-66a228166d91-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-mxwdc\" (UID: \"9b94bbb1-5f6b-40c1-96b1-66a228166d91\") " pod="openstack/ovn-controller-metrics-mxwdc" Mar 09 09:25:36 crc kubenswrapper[4792]: I0309 09:25:36.758206 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/9b94bbb1-5f6b-40c1-96b1-66a228166d91-ovn-rundir\") pod \"ovn-controller-metrics-mxwdc\" (UID: \"9b94bbb1-5f6b-40c1-96b1-66a228166d91\") " pod="openstack/ovn-controller-metrics-mxwdc" Mar 09 09:25:36 crc kubenswrapper[4792]: I0309 09:25:36.758542 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/9b94bbb1-5f6b-40c1-96b1-66a228166d91-ovn-rundir\") pod \"ovn-controller-metrics-mxwdc\" (UID: \"9b94bbb1-5f6b-40c1-96b1-66a228166d91\") " pod="openstack/ovn-controller-metrics-mxwdc" Mar 09 09:25:36 crc kubenswrapper[4792]: I0309 09:25:36.759440 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee2b0e26-39f3-4fb1-b1b8-d3b1cd6a27de-config\") pod \"dnsmasq-dns-795cf8b45c-gkmjw\" (UID: \"ee2b0e26-39f3-4fb1-b1b8-d3b1cd6a27de\") " pod="openstack/dnsmasq-dns-795cf8b45c-gkmjw" Mar 09 09:25:36 crc kubenswrapper[4792]: I0309 09:25:36.760001 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ee2b0e26-39f3-4fb1-b1b8-d3b1cd6a27de-dns-svc\") pod \"dnsmasq-dns-795cf8b45c-gkmjw\" (UID: \"ee2b0e26-39f3-4fb1-b1b8-d3b1cd6a27de\") " pod="openstack/dnsmasq-dns-795cf8b45c-gkmjw" Mar 09 09:25:36 crc kubenswrapper[4792]: I0309 09:25:36.768149 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: 
\"kubernetes.io/host-path/9b94bbb1-5f6b-40c1-96b1-66a228166d91-ovs-rundir\") pod \"ovn-controller-metrics-mxwdc\" (UID: \"9b94bbb1-5f6b-40c1-96b1-66a228166d91\") " pod="openstack/ovn-controller-metrics-mxwdc" Mar 09 09:25:36 crc kubenswrapper[4792]: I0309 09:25:36.768830 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ee2b0e26-39f3-4fb1-b1b8-d3b1cd6a27de-ovsdbserver-sb\") pod \"dnsmasq-dns-795cf8b45c-gkmjw\" (UID: \"ee2b0e26-39f3-4fb1-b1b8-d3b1cd6a27de\") " pod="openstack/dnsmasq-dns-795cf8b45c-gkmjw" Mar 09 09:25:36 crc kubenswrapper[4792]: I0309 09:25:36.769766 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b94bbb1-5f6b-40c1-96b1-66a228166d91-config\") pod \"ovn-controller-metrics-mxwdc\" (UID: \"9b94bbb1-5f6b-40c1-96b1-66a228166d91\") " pod="openstack/ovn-controller-metrics-mxwdc" Mar 09 09:25:36 crc kubenswrapper[4792]: I0309 09:25:36.782043 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b94bbb1-5f6b-40c1-96b1-66a228166d91-combined-ca-bundle\") pod \"ovn-controller-metrics-mxwdc\" (UID: \"9b94bbb1-5f6b-40c1-96b1-66a228166d91\") " pod="openstack/ovn-controller-metrics-mxwdc" Mar 09 09:25:36 crc kubenswrapper[4792]: I0309 09:25:36.789559 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/9b94bbb1-5f6b-40c1-96b1-66a228166d91-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-mxwdc\" (UID: \"9b94bbb1-5f6b-40c1-96b1-66a228166d91\") " pod="openstack/ovn-controller-metrics-mxwdc" Mar 09 09:25:36 crc kubenswrapper[4792]: I0309 09:25:36.818748 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k2j5f\" (UniqueName: \"kubernetes.io/projected/ee2b0e26-39f3-4fb1-b1b8-d3b1cd6a27de-kube-api-access-k2j5f\") 
pod \"dnsmasq-dns-795cf8b45c-gkmjw\" (UID: \"ee2b0e26-39f3-4fb1-b1b8-d3b1cd6a27de\") " pod="openstack/dnsmasq-dns-795cf8b45c-gkmjw" Mar 09 09:25:36 crc kubenswrapper[4792]: I0309 09:25:36.831695 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5bj8l\" (UniqueName: \"kubernetes.io/projected/9b94bbb1-5f6b-40c1-96b1-66a228166d91-kube-api-access-5bj8l\") pod \"ovn-controller-metrics-mxwdc\" (UID: \"9b94bbb1-5f6b-40c1-96b1-66a228166d91\") " pod="openstack/ovn-controller-metrics-mxwdc" Mar 09 09:25:36 crc kubenswrapper[4792]: I0309 09:25:36.951764 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7c47bcb9f9-vvmxj"] Mar 09 09:25:36 crc kubenswrapper[4792]: I0309 09:25:36.981575 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-795cf8b45c-gkmjw" Mar 09 09:25:36 crc kubenswrapper[4792]: I0309 09:25:36.989519 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Mar 09 09:25:36 crc kubenswrapper[4792]: I0309 09:25:36.991740 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Mar 09 09:25:37 crc kubenswrapper[4792]: I0309 09:25:37.016356 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-mxwdc" Mar 09 09:25:37 crc kubenswrapper[4792]: I0309 09:25:37.039672 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Mar 09 09:25:37 crc kubenswrapper[4792]: I0309 09:25:37.059007 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Mar 09 09:25:37 crc kubenswrapper[4792]: I0309 09:25:37.063819 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/58b87887-c8d6-4658-9f0e-3d94f414c14c-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"58b87887-c8d6-4658-9f0e-3d94f414c14c\") " pod="openstack/ovn-northd-0" Mar 09 09:25:37 crc kubenswrapper[4792]: I0309 09:25:37.063858 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5h2c2\" (UniqueName: \"kubernetes.io/projected/58b87887-c8d6-4658-9f0e-3d94f414c14c-kube-api-access-5h2c2\") pod \"ovn-northd-0\" (UID: \"58b87887-c8d6-4658-9f0e-3d94f414c14c\") " pod="openstack/ovn-northd-0" Mar 09 09:25:37 crc kubenswrapper[4792]: I0309 09:25:37.063884 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/58b87887-c8d6-4658-9f0e-3d94f414c14c-scripts\") pod \"ovn-northd-0\" (UID: \"58b87887-c8d6-4658-9f0e-3d94f414c14c\") " pod="openstack/ovn-northd-0" Mar 09 09:25:37 crc kubenswrapper[4792]: I0309 09:25:37.063919 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58b87887-c8d6-4658-9f0e-3d94f414c14c-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"58b87887-c8d6-4658-9f0e-3d94f414c14c\") " pod="openstack/ovn-northd-0" Mar 09 09:25:37 crc kubenswrapper[4792]: I0309 09:25:37.063950 4792 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/58b87887-c8d6-4658-9f0e-3d94f414c14c-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"58b87887-c8d6-4658-9f0e-3d94f414c14c\") " pod="openstack/ovn-northd-0" Mar 09 09:25:37 crc kubenswrapper[4792]: I0309 09:25:37.063986 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/58b87887-c8d6-4658-9f0e-3d94f414c14c-config\") pod \"ovn-northd-0\" (UID: \"58b87887-c8d6-4658-9f0e-3d94f414c14c\") " pod="openstack/ovn-northd-0" Mar 09 09:25:37 crc kubenswrapper[4792]: I0309 09:25:37.064008 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/58b87887-c8d6-4658-9f0e-3d94f414c14c-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"58b87887-c8d6-4658-9f0e-3d94f414c14c\") " pod="openstack/ovn-northd-0" Mar 09 09:25:37 crc kubenswrapper[4792]: I0309 09:25:37.077421 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Mar 09 09:25:37 crc kubenswrapper[4792]: I0309 09:25:37.078308 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-svmwb" Mar 09 09:25:37 crc kubenswrapper[4792]: I0309 09:25:37.078438 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Mar 09 09:25:37 crc kubenswrapper[4792]: I0309 09:25:37.165276 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/58b87887-c8d6-4658-9f0e-3d94f414c14c-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"58b87887-c8d6-4658-9f0e-3d94f414c14c\") " pod="openstack/ovn-northd-0" Mar 09 09:25:37 crc kubenswrapper[4792]: I0309 09:25:37.165360 4792 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/58b87887-c8d6-4658-9f0e-3d94f414c14c-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"58b87887-c8d6-4658-9f0e-3d94f414c14c\") " pod="openstack/ovn-northd-0" Mar 09 09:25:37 crc kubenswrapper[4792]: I0309 09:25:37.165382 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5h2c2\" (UniqueName: \"kubernetes.io/projected/58b87887-c8d6-4658-9f0e-3d94f414c14c-kube-api-access-5h2c2\") pod \"ovn-northd-0\" (UID: \"58b87887-c8d6-4658-9f0e-3d94f414c14c\") " pod="openstack/ovn-northd-0" Mar 09 09:25:37 crc kubenswrapper[4792]: I0309 09:25:37.165410 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/58b87887-c8d6-4658-9f0e-3d94f414c14c-scripts\") pod \"ovn-northd-0\" (UID: \"58b87887-c8d6-4658-9f0e-3d94f414c14c\") " pod="openstack/ovn-northd-0" Mar 09 09:25:37 crc kubenswrapper[4792]: I0309 09:25:37.165452 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58b87887-c8d6-4658-9f0e-3d94f414c14c-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"58b87887-c8d6-4658-9f0e-3d94f414c14c\") " pod="openstack/ovn-northd-0" Mar 09 09:25:37 crc kubenswrapper[4792]: I0309 09:25:37.165507 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/58b87887-c8d6-4658-9f0e-3d94f414c14c-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"58b87887-c8d6-4658-9f0e-3d94f414c14c\") " pod="openstack/ovn-northd-0" Mar 09 09:25:37 crc kubenswrapper[4792]: I0309 09:25:37.165551 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/58b87887-c8d6-4658-9f0e-3d94f414c14c-config\") pod \"ovn-northd-0\" (UID: 
\"58b87887-c8d6-4658-9f0e-3d94f414c14c\") " pod="openstack/ovn-northd-0" Mar 09 09:25:37 crc kubenswrapper[4792]: I0309 09:25:37.166640 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/58b87887-c8d6-4658-9f0e-3d94f414c14c-config\") pod \"ovn-northd-0\" (UID: \"58b87887-c8d6-4658-9f0e-3d94f414c14c\") " pod="openstack/ovn-northd-0" Mar 09 09:25:37 crc kubenswrapper[4792]: I0309 09:25:37.167417 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/58b87887-c8d6-4658-9f0e-3d94f414c14c-scripts\") pod \"ovn-northd-0\" (UID: \"58b87887-c8d6-4658-9f0e-3d94f414c14c\") " pod="openstack/ovn-northd-0" Mar 09 09:25:37 crc kubenswrapper[4792]: I0309 09:25:37.168275 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/58b87887-c8d6-4658-9f0e-3d94f414c14c-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"58b87887-c8d6-4658-9f0e-3d94f414c14c\") " pod="openstack/ovn-northd-0" Mar 09 09:25:37 crc kubenswrapper[4792]: I0309 09:25:37.171481 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58b87887-c8d6-4658-9f0e-3d94f414c14c-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"58b87887-c8d6-4658-9f0e-3d94f414c14c\") " pod="openstack/ovn-northd-0" Mar 09 09:25:37 crc kubenswrapper[4792]: I0309 09:25:37.179081 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/58b87887-c8d6-4658-9f0e-3d94f414c14c-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"58b87887-c8d6-4658-9f0e-3d94f414c14c\") " pod="openstack/ovn-northd-0" Mar 09 09:25:37 crc kubenswrapper[4792]: I0309 09:25:37.182416 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/58b87887-c8d6-4658-9f0e-3d94f414c14c-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"58b87887-c8d6-4658-9f0e-3d94f414c14c\") " pod="openstack/ovn-northd-0" Mar 09 09:25:37 crc kubenswrapper[4792]: I0309 09:25:37.222671 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5h2c2\" (UniqueName: \"kubernetes.io/projected/58b87887-c8d6-4658-9f0e-3d94f414c14c-kube-api-access-5h2c2\") pod \"ovn-northd-0\" (UID: \"58b87887-c8d6-4658-9f0e-3d94f414c14c\") " pod="openstack/ovn-northd-0" Mar 09 09:25:37 crc kubenswrapper[4792]: I0309 09:25:37.243299 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"22afdfd4-ea58-4efb-b316-bcb40c906952","Type":"ContainerStarted","Data":"a0e526d4be42765f3a9f73eb869c77337d3da22403a3ed7ea563f8dfb9cc2c8f"} Mar 09 09:25:37 crc kubenswrapper[4792]: I0309 09:25:37.243339 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7b57d9888c-wjgqq"] Mar 09 09:25:37 crc kubenswrapper[4792]: I0309 09:25:37.245434 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7b57d9888c-wjgqq" Mar 09 09:25:37 crc kubenswrapper[4792]: I0309 09:25:37.250335 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Mar 09 09:25:37 crc kubenswrapper[4792]: I0309 09:25:37.376118 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c980fc97-3e6c-4f70-a130-b0a10bd053e3-ovsdbserver-nb\") pod \"dnsmasq-dns-7b57d9888c-wjgqq\" (UID: \"c980fc97-3e6c-4f70-a130-b0a10bd053e3\") " pod="openstack/dnsmasq-dns-7b57d9888c-wjgqq" Mar 09 09:25:37 crc kubenswrapper[4792]: I0309 09:25:37.376619 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s4n97\" (UniqueName: \"kubernetes.io/projected/c980fc97-3e6c-4f70-a130-b0a10bd053e3-kube-api-access-s4n97\") pod \"dnsmasq-dns-7b57d9888c-wjgqq\" (UID: \"c980fc97-3e6c-4f70-a130-b0a10bd053e3\") " pod="openstack/dnsmasq-dns-7b57d9888c-wjgqq" Mar 09 09:25:37 crc kubenswrapper[4792]: I0309 09:25:37.376743 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c980fc97-3e6c-4f70-a130-b0a10bd053e3-dns-svc\") pod \"dnsmasq-dns-7b57d9888c-wjgqq\" (UID: \"c980fc97-3e6c-4f70-a130-b0a10bd053e3\") " pod="openstack/dnsmasq-dns-7b57d9888c-wjgqq" Mar 09 09:25:37 crc kubenswrapper[4792]: I0309 09:25:37.376769 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c980fc97-3e6c-4f70-a130-b0a10bd053e3-ovsdbserver-sb\") pod \"dnsmasq-dns-7b57d9888c-wjgqq\" (UID: \"c980fc97-3e6c-4f70-a130-b0a10bd053e3\") " pod="openstack/dnsmasq-dns-7b57d9888c-wjgqq" Mar 09 09:25:37 crc kubenswrapper[4792]: I0309 09:25:37.376832 4792 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c980fc97-3e6c-4f70-a130-b0a10bd053e3-config\") pod \"dnsmasq-dns-7b57d9888c-wjgqq\" (UID: \"c980fc97-3e6c-4f70-a130-b0a10bd053e3\") " pod="openstack/dnsmasq-dns-7b57d9888c-wjgqq" Mar 09 09:25:37 crc kubenswrapper[4792]: I0309 09:25:37.396542 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7b57d9888c-wjgqq"] Mar 09 09:25:37 crc kubenswrapper[4792]: I0309 09:25:37.400242 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=10.159899882 podStartE2EDuration="41.400227299s" podCreationTimestamp="2026-03-09 09:24:56 +0000 UTC" firstStartedPulling="2026-03-09 09:25:04.89362738 +0000 UTC m=+1069.923828132" lastFinishedPulling="2026-03-09 09:25:36.133954797 +0000 UTC m=+1101.164155549" observedRunningTime="2026-03-09 09:25:37.305495548 +0000 UTC m=+1102.335696310" watchObservedRunningTime="2026-03-09 09:25:37.400227299 +0000 UTC m=+1102.430428051" Mar 09 09:25:37 crc kubenswrapper[4792]: I0309 09:25:37.478411 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c980fc97-3e6c-4f70-a130-b0a10bd053e3-config\") pod \"dnsmasq-dns-7b57d9888c-wjgqq\" (UID: \"c980fc97-3e6c-4f70-a130-b0a10bd053e3\") " pod="openstack/dnsmasq-dns-7b57d9888c-wjgqq" Mar 09 09:25:37 crc kubenswrapper[4792]: I0309 09:25:37.478480 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c980fc97-3e6c-4f70-a130-b0a10bd053e3-ovsdbserver-nb\") pod \"dnsmasq-dns-7b57d9888c-wjgqq\" (UID: \"c980fc97-3e6c-4f70-a130-b0a10bd053e3\") " pod="openstack/dnsmasq-dns-7b57d9888c-wjgqq" Mar 09 09:25:37 crc kubenswrapper[4792]: I0309 09:25:37.478544 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-s4n97\" (UniqueName: \"kubernetes.io/projected/c980fc97-3e6c-4f70-a130-b0a10bd053e3-kube-api-access-s4n97\") pod \"dnsmasq-dns-7b57d9888c-wjgqq\" (UID: \"c980fc97-3e6c-4f70-a130-b0a10bd053e3\") " pod="openstack/dnsmasq-dns-7b57d9888c-wjgqq" Mar 09 09:25:37 crc kubenswrapper[4792]: I0309 09:25:37.478588 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c980fc97-3e6c-4f70-a130-b0a10bd053e3-dns-svc\") pod \"dnsmasq-dns-7b57d9888c-wjgqq\" (UID: \"c980fc97-3e6c-4f70-a130-b0a10bd053e3\") " pod="openstack/dnsmasq-dns-7b57d9888c-wjgqq" Mar 09 09:25:37 crc kubenswrapper[4792]: I0309 09:25:37.478605 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c980fc97-3e6c-4f70-a130-b0a10bd053e3-ovsdbserver-sb\") pod \"dnsmasq-dns-7b57d9888c-wjgqq\" (UID: \"c980fc97-3e6c-4f70-a130-b0a10bd053e3\") " pod="openstack/dnsmasq-dns-7b57d9888c-wjgqq" Mar 09 09:25:37 crc kubenswrapper[4792]: I0309 09:25:37.479570 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c980fc97-3e6c-4f70-a130-b0a10bd053e3-ovsdbserver-sb\") pod \"dnsmasq-dns-7b57d9888c-wjgqq\" (UID: \"c980fc97-3e6c-4f70-a130-b0a10bd053e3\") " pod="openstack/dnsmasq-dns-7b57d9888c-wjgqq" Mar 09 09:25:37 crc kubenswrapper[4792]: I0309 09:25:37.479581 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c980fc97-3e6c-4f70-a130-b0a10bd053e3-ovsdbserver-nb\") pod \"dnsmasq-dns-7b57d9888c-wjgqq\" (UID: \"c980fc97-3e6c-4f70-a130-b0a10bd053e3\") " pod="openstack/dnsmasq-dns-7b57d9888c-wjgqq" Mar 09 09:25:37 crc kubenswrapper[4792]: I0309 09:25:37.480646 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/c980fc97-3e6c-4f70-a130-b0a10bd053e3-config\") pod \"dnsmasq-dns-7b57d9888c-wjgqq\" (UID: \"c980fc97-3e6c-4f70-a130-b0a10bd053e3\") " pod="openstack/dnsmasq-dns-7b57d9888c-wjgqq" Mar 09 09:25:37 crc kubenswrapper[4792]: I0309 09:25:37.482432 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Mar 09 09:25:37 crc kubenswrapper[4792]: I0309 09:25:37.483036 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c980fc97-3e6c-4f70-a130-b0a10bd053e3-dns-svc\") pod \"dnsmasq-dns-7b57d9888c-wjgqq\" (UID: \"c980fc97-3e6c-4f70-a130-b0a10bd053e3\") " pod="openstack/dnsmasq-dns-7b57d9888c-wjgqq" Mar 09 09:25:37 crc kubenswrapper[4792]: I0309 09:25:37.526008 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s4n97\" (UniqueName: \"kubernetes.io/projected/c980fc97-3e6c-4f70-a130-b0a10bd053e3-kube-api-access-s4n97\") pod \"dnsmasq-dns-7b57d9888c-wjgqq\" (UID: \"c980fc97-3e6c-4f70-a130-b0a10bd053e3\") " pod="openstack/dnsmasq-dns-7b57d9888c-wjgqq" Mar 09 09:25:37 crc kubenswrapper[4792]: I0309 09:25:37.562120 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78cb4465c9-wbtzh" Mar 09 09:25:37 crc kubenswrapper[4792]: I0309 09:25:37.635949 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7b57d9888c-wjgqq" Mar 09 09:25:37 crc kubenswrapper[4792]: I0309 09:25:37.682657 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b750127e-0708-4867-8cba-b31296edae00-config\") pod \"b750127e-0708-4867-8cba-b31296edae00\" (UID: \"b750127e-0708-4867-8cba-b31296edae00\") " Mar 09 09:25:37 crc kubenswrapper[4792]: I0309 09:25:37.682718 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b750127e-0708-4867-8cba-b31296edae00-dns-svc\") pod \"b750127e-0708-4867-8cba-b31296edae00\" (UID: \"b750127e-0708-4867-8cba-b31296edae00\") " Mar 09 09:25:37 crc kubenswrapper[4792]: I0309 09:25:37.682759 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vlc8t\" (UniqueName: \"kubernetes.io/projected/b750127e-0708-4867-8cba-b31296edae00-kube-api-access-vlc8t\") pod \"b750127e-0708-4867-8cba-b31296edae00\" (UID: \"b750127e-0708-4867-8cba-b31296edae00\") " Mar 09 09:25:37 crc kubenswrapper[4792]: I0309 09:25:37.685556 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b750127e-0708-4867-8cba-b31296edae00-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b750127e-0708-4867-8cba-b31296edae00" (UID: "b750127e-0708-4867-8cba-b31296edae00"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:25:37 crc kubenswrapper[4792]: I0309 09:25:37.686562 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b750127e-0708-4867-8cba-b31296edae00-config" (OuterVolumeSpecName: "config") pod "b750127e-0708-4867-8cba-b31296edae00" (UID: "b750127e-0708-4867-8cba-b31296edae00"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:25:37 crc kubenswrapper[4792]: I0309 09:25:37.688584 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b750127e-0708-4867-8cba-b31296edae00-kube-api-access-vlc8t" (OuterVolumeSpecName: "kube-api-access-vlc8t") pod "b750127e-0708-4867-8cba-b31296edae00" (UID: "b750127e-0708-4867-8cba-b31296edae00"). InnerVolumeSpecName "kube-api-access-vlc8t". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:25:37 crc kubenswrapper[4792]: I0309 09:25:37.786907 4792 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b750127e-0708-4867-8cba-b31296edae00-config\") on node \"crc\" DevicePath \"\"" Mar 09 09:25:37 crc kubenswrapper[4792]: I0309 09:25:37.786958 4792 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b750127e-0708-4867-8cba-b31296edae00-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 09 09:25:37 crc kubenswrapper[4792]: I0309 09:25:37.786970 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vlc8t\" (UniqueName: \"kubernetes.io/projected/b750127e-0708-4867-8cba-b31296edae00-kube-api-access-vlc8t\") on node \"crc\" DevicePath \"\"" Mar 09 09:25:37 crc kubenswrapper[4792]: I0309 09:25:37.903552 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7c47bcb9f9-vvmxj" Mar 09 09:25:38 crc kubenswrapper[4792]: I0309 09:25:38.043850 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0acc36a0-f564-4683-a8d5-d76348b1cf3f-dns-svc\") pod \"0acc36a0-f564-4683-a8d5-d76348b1cf3f\" (UID: \"0acc36a0-f564-4683-a8d5-d76348b1cf3f\") " Mar 09 09:25:38 crc kubenswrapper[4792]: I0309 09:25:38.043967 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wszxb\" (UniqueName: \"kubernetes.io/projected/0acc36a0-f564-4683-a8d5-d76348b1cf3f-kube-api-access-wszxb\") pod \"0acc36a0-f564-4683-a8d5-d76348b1cf3f\" (UID: \"0acc36a0-f564-4683-a8d5-d76348b1cf3f\") " Mar 09 09:25:38 crc kubenswrapper[4792]: I0309 09:25:38.043983 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0acc36a0-f564-4683-a8d5-d76348b1cf3f-config\") pod \"0acc36a0-f564-4683-a8d5-d76348b1cf3f\" (UID: \"0acc36a0-f564-4683-a8d5-d76348b1cf3f\") " Mar 09 09:25:38 crc kubenswrapper[4792]: I0309 09:25:38.044981 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0acc36a0-f564-4683-a8d5-d76348b1cf3f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "0acc36a0-f564-4683-a8d5-d76348b1cf3f" (UID: "0acc36a0-f564-4683-a8d5-d76348b1cf3f"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:25:38 crc kubenswrapper[4792]: I0309 09:25:38.045603 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0acc36a0-f564-4683-a8d5-d76348b1cf3f-config" (OuterVolumeSpecName: "config") pod "0acc36a0-f564-4683-a8d5-d76348b1cf3f" (UID: "0acc36a0-f564-4683-a8d5-d76348b1cf3f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:25:38 crc kubenswrapper[4792]: I0309 09:25:38.066479 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0acc36a0-f564-4683-a8d5-d76348b1cf3f-kube-api-access-wszxb" (OuterVolumeSpecName: "kube-api-access-wszxb") pod "0acc36a0-f564-4683-a8d5-d76348b1cf3f" (UID: "0acc36a0-f564-4683-a8d5-d76348b1cf3f"). InnerVolumeSpecName "kube-api-access-wszxb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:25:38 crc kubenswrapper[4792]: I0309 09:25:38.141952 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-mxwdc"] Mar 09 09:25:38 crc kubenswrapper[4792]: I0309 09:25:38.145464 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wszxb\" (UniqueName: \"kubernetes.io/projected/0acc36a0-f564-4683-a8d5-d76348b1cf3f-kube-api-access-wszxb\") on node \"crc\" DevicePath \"\"" Mar 09 09:25:38 crc kubenswrapper[4792]: I0309 09:25:38.145517 4792 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0acc36a0-f564-4683-a8d5-d76348b1cf3f-config\") on node \"crc\" DevicePath \"\"" Mar 09 09:25:38 crc kubenswrapper[4792]: I0309 09:25:38.145527 4792 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0acc36a0-f564-4683-a8d5-d76348b1cf3f-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 09 09:25:38 crc kubenswrapper[4792]: I0309 09:25:38.162012 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-795cf8b45c-gkmjw"] Mar 09 09:25:38 crc kubenswrapper[4792]: I0309 09:25:38.261286 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78cb4465c9-wbtzh" event={"ID":"b750127e-0708-4867-8cba-b31296edae00","Type":"ContainerDied","Data":"dc5d3a28135d82dfe0b6378840df5f0c938f3ea74925957163ae9ddada17a129"} Mar 09 09:25:38 crc kubenswrapper[4792]: I0309 
09:25:38.261538 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78cb4465c9-wbtzh" Mar 09 09:25:38 crc kubenswrapper[4792]: I0309 09:25:38.271020 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-795cf8b45c-gkmjw" event={"ID":"ee2b0e26-39f3-4fb1-b1b8-d3b1cd6a27de","Type":"ContainerStarted","Data":"91d22efacee71d6d840e8294e5752980e412414eaaf640ea7702b1e989983f8a"} Mar 09 09:25:38 crc kubenswrapper[4792]: I0309 09:25:38.274897 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-mxwdc" event={"ID":"9b94bbb1-5f6b-40c1-96b1-66a228166d91","Type":"ContainerStarted","Data":"044875addf4bee0e33d030a095377cf5db2daaefd71be0ff25422bcbe23c7fc5"} Mar 09 09:25:38 crc kubenswrapper[4792]: I0309 09:25:38.276205 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c47bcb9f9-vvmxj" event={"ID":"0acc36a0-f564-4683-a8d5-d76348b1cf3f","Type":"ContainerDied","Data":"e7d85deacd14b93c3ab238292c1cdd06af0a418eabc2cc6bfad0f4f408a75f98"} Mar 09 09:25:38 crc kubenswrapper[4792]: I0309 09:25:38.276249 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7c47bcb9f9-vvmxj" Mar 09 09:25:38 crc kubenswrapper[4792]: I0309 09:25:38.281533 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Mar 09 09:25:38 crc kubenswrapper[4792]: I0309 09:25:38.383196 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78cb4465c9-wbtzh"] Mar 09 09:25:38 crc kubenswrapper[4792]: I0309 09:25:38.406135 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78cb4465c9-wbtzh"] Mar 09 09:25:38 crc kubenswrapper[4792]: I0309 09:25:38.436922 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7c47bcb9f9-vvmxj"] Mar 09 09:25:38 crc kubenswrapper[4792]: I0309 09:25:38.457413 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7c47bcb9f9-vvmxj"] Mar 09 09:25:38 crc kubenswrapper[4792]: I0309 09:25:38.492040 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7b57d9888c-wjgqq"] Mar 09 09:25:39 crc kubenswrapper[4792]: I0309 09:25:39.287984 4792 generic.go:334] "Generic (PLEG): container finished" podID="c980fc97-3e6c-4f70-a130-b0a10bd053e3" containerID="1a5c0513666f40950180974ad50af5b21979a98a35b4c1230d956eb267c0232e" exitCode=0 Mar 09 09:25:39 crc kubenswrapper[4792]: I0309 09:25:39.288385 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b57d9888c-wjgqq" event={"ID":"c980fc97-3e6c-4f70-a130-b0a10bd053e3","Type":"ContainerDied","Data":"1a5c0513666f40950180974ad50af5b21979a98a35b4c1230d956eb267c0232e"} Mar 09 09:25:39 crc kubenswrapper[4792]: I0309 09:25:39.288416 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b57d9888c-wjgqq" event={"ID":"c980fc97-3e6c-4f70-a130-b0a10bd053e3","Type":"ContainerStarted","Data":"8f0d7cadd3ba6560fceb3ad98ce9bf3b3172395fe8af056c31795972c5332ce5"} Mar 09 09:25:39 crc kubenswrapper[4792]: I0309 09:25:39.292771 4792 
generic.go:334] "Generic (PLEG): container finished" podID="ee2b0e26-39f3-4fb1-b1b8-d3b1cd6a27de" containerID="9bd9580be66d68e2e19b6e325d6ee174ff3635f01de04b33d731477f5b15e9b1" exitCode=0 Mar 09 09:25:39 crc kubenswrapper[4792]: I0309 09:25:39.292834 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-795cf8b45c-gkmjw" event={"ID":"ee2b0e26-39f3-4fb1-b1b8-d3b1cd6a27de","Type":"ContainerDied","Data":"9bd9580be66d68e2e19b6e325d6ee174ff3635f01de04b33d731477f5b15e9b1"} Mar 09 09:25:39 crc kubenswrapper[4792]: I0309 09:25:39.294739 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-mxwdc" event={"ID":"9b94bbb1-5f6b-40c1-96b1-66a228166d91","Type":"ContainerStarted","Data":"ae450c7d3635c6d2031805d75d79d91dd5da498bceb5a33187149887d0092468"} Mar 09 09:25:39 crc kubenswrapper[4792]: I0309 09:25:39.296578 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"58b87887-c8d6-4658-9f0e-3d94f414c14c","Type":"ContainerStarted","Data":"259d73d2781532c1e0ae524839dcbd66e7b98343011ae72ad35f4f75c33afe60"} Mar 09 09:25:39 crc kubenswrapper[4792]: I0309 09:25:39.333850 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-mxwdc" podStartSLOduration=3.333832176 podStartE2EDuration="3.333832176s" podCreationTimestamp="2026-03-09 09:25:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:25:39.329562192 +0000 UTC m=+1104.359762954" watchObservedRunningTime="2026-03-09 09:25:39.333832176 +0000 UTC m=+1104.364032928" Mar 09 09:25:39 crc kubenswrapper[4792]: I0309 09:25:39.675341 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0acc36a0-f564-4683-a8d5-d76348b1cf3f" path="/var/lib/kubelet/pods/0acc36a0-f564-4683-a8d5-d76348b1cf3f/volumes" Mar 09 09:25:39 crc kubenswrapper[4792]: I0309 09:25:39.675699 
4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b750127e-0708-4867-8cba-b31296edae00" path="/var/lib/kubelet/pods/b750127e-0708-4867-8cba-b31296edae00/volumes" Mar 09 09:25:40 crc kubenswrapper[4792]: I0309 09:25:40.308331 4792 generic.go:334] "Generic (PLEG): container finished" podID="1731fe55-4bf2-4410-85f9-58124ed652c9" containerID="ce431acc6f1c575753094c3f8910d2b41a69810cc62994d40a4dc0697c9b0f97" exitCode=0 Mar 09 09:25:40 crc kubenswrapper[4792]: I0309 09:25:40.308550 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"1731fe55-4bf2-4410-85f9-58124ed652c9","Type":"ContainerDied","Data":"ce431acc6f1c575753094c3f8910d2b41a69810cc62994d40a4dc0697c9b0f97"} Mar 09 09:25:40 crc kubenswrapper[4792]: I0309 09:25:40.310772 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-795cf8b45c-gkmjw" event={"ID":"ee2b0e26-39f3-4fb1-b1b8-d3b1cd6a27de","Type":"ContainerStarted","Data":"421766db483ea88af46c41ea0b0d12593264b468b700d1ff3330181c9cbc6334"} Mar 09 09:25:40 crc kubenswrapper[4792]: I0309 09:25:40.311398 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-795cf8b45c-gkmjw" Mar 09 09:25:40 crc kubenswrapper[4792]: I0309 09:25:40.321134 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b57d9888c-wjgqq" event={"ID":"c980fc97-3e6c-4f70-a130-b0a10bd053e3","Type":"ContainerStarted","Data":"da89f09900708c8b929bf1df242e9501a75fbfd45831a3736a1c38d34d0dbdac"} Mar 09 09:25:40 crc kubenswrapper[4792]: I0309 09:25:40.321628 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7b57d9888c-wjgqq" Mar 09 09:25:40 crc kubenswrapper[4792]: I0309 09:25:40.325640 4792 generic.go:334] "Generic (PLEG): container finished" podID="7dd0ce66-42bf-4c00-8e99-3c58defcc87f" containerID="a4b6b1f3dabf727ecb3fb305f0c2810817260cfc3c1e46a2b8a3f867d47d782a" exitCode=0 Mar 09 
09:25:40 crc kubenswrapper[4792]: I0309 09:25:40.325789 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"7dd0ce66-42bf-4c00-8e99-3c58defcc87f","Type":"ContainerDied","Data":"a4b6b1f3dabf727ecb3fb305f0c2810817260cfc3c1e46a2b8a3f867d47d782a"} Mar 09 09:25:40 crc kubenswrapper[4792]: I0309 09:25:40.362814 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"58b87887-c8d6-4658-9f0e-3d94f414c14c","Type":"ContainerStarted","Data":"c5af40c6e460334a1cdf8bcb60a9b491b932c19a3ddafa0ccb5f1e8aa75e70d2"} Mar 09 09:25:40 crc kubenswrapper[4792]: I0309 09:25:40.413148 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-795cf8b45c-gkmjw" podStartSLOduration=3.964513385 podStartE2EDuration="4.404243432s" podCreationTimestamp="2026-03-09 09:25:36 +0000 UTC" firstStartedPulling="2026-03-09 09:25:38.153835647 +0000 UTC m=+1103.184036399" lastFinishedPulling="2026-03-09 09:25:38.593565694 +0000 UTC m=+1103.623766446" observedRunningTime="2026-03-09 09:25:40.390329968 +0000 UTC m=+1105.420530730" watchObservedRunningTime="2026-03-09 09:25:40.404243432 +0000 UTC m=+1105.434444184" Mar 09 09:25:40 crc kubenswrapper[4792]: I0309 09:25:40.430543 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7b57d9888c-wjgqq" podStartSLOduration=3.020185511 podStartE2EDuration="3.430524014s" podCreationTimestamp="2026-03-09 09:25:37 +0000 UTC" firstStartedPulling="2026-03-09 09:25:38.463500298 +0000 UTC m=+1103.493701050" lastFinishedPulling="2026-03-09 09:25:38.873838801 +0000 UTC m=+1103.904039553" observedRunningTime="2026-03-09 09:25:40.422387529 +0000 UTC m=+1105.452588281" watchObservedRunningTime="2026-03-09 09:25:40.430524014 +0000 UTC m=+1105.460724766" Mar 09 09:25:40 crc kubenswrapper[4792]: E0309 09:25:40.508901 4792 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" 
err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1731fe55_4bf2_4410_85f9_58124ed652c9.slice/crio-conmon-ce431acc6f1c575753094c3f8910d2b41a69810cc62994d40a4dc0697c9b0f97.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1731fe55_4bf2_4410_85f9_58124ed652c9.slice/crio-ce431acc6f1c575753094c3f8910d2b41a69810cc62994d40a4dc0697c9b0f97.scope\": RecentStats: unable to find data in memory cache]" Mar 09 09:25:41 crc kubenswrapper[4792]: I0309 09:25:41.370380 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"1731fe55-4bf2-4410-85f9-58124ed652c9","Type":"ContainerStarted","Data":"5e2ac6924e2d2b8f4541451f9228b27727d4ef02dc9f3d8de937d899557a2c9d"} Mar 09 09:25:41 crc kubenswrapper[4792]: I0309 09:25:41.372291 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"7dd0ce66-42bf-4c00-8e99-3c58defcc87f","Type":"ContainerStarted","Data":"e46873415979711a87391d1ec0b40dd1ee25d74b07008d0fd6e1b1274b274b2d"} Mar 09 09:25:41 crc kubenswrapper[4792]: I0309 09:25:41.374407 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"58b87887-c8d6-4658-9f0e-3d94f414c14c","Type":"ContainerStarted","Data":"43eb53ed7ad0b62f4eac94b944878fc8868abc5609ffc7f2457f8fd1fe005dc6"} Mar 09 09:25:41 crc kubenswrapper[4792]: I0309 09:25:41.396659 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=-9223371989.458136 podStartE2EDuration="47.396640363s" podCreationTimestamp="2026-03-09 09:24:54 +0000 UTC" firstStartedPulling="2026-03-09 09:24:56.934219753 +0000 UTC m=+1061.964420505" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:25:41.390294579 +0000 UTC m=+1106.420495351" watchObservedRunningTime="2026-03-09 09:25:41.396640363 
+0000 UTC m=+1106.426841115" Mar 09 09:25:41 crc kubenswrapper[4792]: I0309 09:25:41.422349 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=8.942395929 podStartE2EDuration="47.422331079s" podCreationTimestamp="2026-03-09 09:24:54 +0000 UTC" firstStartedPulling="2026-03-09 09:24:56.174483193 +0000 UTC m=+1061.204683945" lastFinishedPulling="2026-03-09 09:25:34.654418343 +0000 UTC m=+1099.684619095" observedRunningTime="2026-03-09 09:25:41.409994511 +0000 UTC m=+1106.440195263" watchObservedRunningTime="2026-03-09 09:25:41.422331079 +0000 UTC m=+1106.452531831" Mar 09 09:25:41 crc kubenswrapper[4792]: I0309 09:25:41.437953 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=3.967308416 podStartE2EDuration="5.437931122s" podCreationTimestamp="2026-03-09 09:25:36 +0000 UTC" firstStartedPulling="2026-03-09 09:25:38.291088062 +0000 UTC m=+1103.321288814" lastFinishedPulling="2026-03-09 09:25:39.761710768 +0000 UTC m=+1104.791911520" observedRunningTime="2026-03-09 09:25:41.434648747 +0000 UTC m=+1106.464849499" watchObservedRunningTime="2026-03-09 09:25:41.437931122 +0000 UTC m=+1106.468131874" Mar 09 09:25:41 crc kubenswrapper[4792]: I0309 09:25:41.687303 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Mar 09 09:25:41 crc kubenswrapper[4792]: I0309 09:25:41.689674 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Mar 09 09:25:42 crc kubenswrapper[4792]: I0309 09:25:42.380531 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Mar 09 09:25:45 crc kubenswrapper[4792]: I0309 09:25:45.525121 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Mar 09 09:25:45 crc kubenswrapper[4792]: I0309 09:25:45.525175 4792 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Mar 09 09:25:46 crc kubenswrapper[4792]: I0309 09:25:46.307820 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Mar 09 09:25:46 crc kubenswrapper[4792]: I0309 09:25:46.308597 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Mar 09 09:25:46 crc kubenswrapper[4792]: I0309 09:25:46.984254 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-795cf8b45c-gkmjw" Mar 09 09:25:47 crc kubenswrapper[4792]: I0309 09:25:47.638344 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7b57d9888c-wjgqq" Mar 09 09:25:47 crc kubenswrapper[4792]: I0309 09:25:47.701720 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-795cf8b45c-gkmjw"] Mar 09 09:25:47 crc kubenswrapper[4792]: I0309 09:25:47.702228 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-795cf8b45c-gkmjw" podUID="ee2b0e26-39f3-4fb1-b1b8-d3b1cd6a27de" containerName="dnsmasq-dns" containerID="cri-o://421766db483ea88af46c41ea0b0d12593264b468b700d1ff3330181c9cbc6334" gracePeriod=10 Mar 09 09:25:48 crc kubenswrapper[4792]: I0309 09:25:48.142192 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Mar 09 09:25:48 crc kubenswrapper[4792]: I0309 09:25:48.251279 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-795cf8b45c-gkmjw" Mar 09 09:25:48 crc kubenswrapper[4792]: I0309 09:25:48.259933 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Mar 09 09:25:48 crc kubenswrapper[4792]: I0309 09:25:48.422902 4792 generic.go:334] "Generic (PLEG): container finished" podID="ee2b0e26-39f3-4fb1-b1b8-d3b1cd6a27de" containerID="421766db483ea88af46c41ea0b0d12593264b468b700d1ff3330181c9cbc6334" exitCode=0 Mar 09 09:25:48 crc kubenswrapper[4792]: I0309 09:25:48.422992 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-795cf8b45c-gkmjw" Mar 09 09:25:48 crc kubenswrapper[4792]: I0309 09:25:48.422975 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-795cf8b45c-gkmjw" event={"ID":"ee2b0e26-39f3-4fb1-b1b8-d3b1cd6a27de","Type":"ContainerDied","Data":"421766db483ea88af46c41ea0b0d12593264b468b700d1ff3330181c9cbc6334"} Mar 09 09:25:48 crc kubenswrapper[4792]: I0309 09:25:48.423081 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-795cf8b45c-gkmjw" event={"ID":"ee2b0e26-39f3-4fb1-b1b8-d3b1cd6a27de","Type":"ContainerDied","Data":"91d22efacee71d6d840e8294e5752980e412414eaaf640ea7702b1e989983f8a"} Mar 09 09:25:48 crc kubenswrapper[4792]: I0309 09:25:48.423104 4792 scope.go:117] "RemoveContainer" containerID="421766db483ea88af46c41ea0b0d12593264b468b700d1ff3330181c9cbc6334" Mar 09 09:25:48 crc kubenswrapper[4792]: I0309 09:25:48.438005 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ee2b0e26-39f3-4fb1-b1b8-d3b1cd6a27de-dns-svc\") pod \"ee2b0e26-39f3-4fb1-b1b8-d3b1cd6a27de\" (UID: \"ee2b0e26-39f3-4fb1-b1b8-d3b1cd6a27de\") " Mar 09 09:25:48 crc kubenswrapper[4792]: I0309 09:25:48.438043 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" 
(UniqueName: \"kubernetes.io/configmap/ee2b0e26-39f3-4fb1-b1b8-d3b1cd6a27de-ovsdbserver-sb\") pod \"ee2b0e26-39f3-4fb1-b1b8-d3b1cd6a27de\" (UID: \"ee2b0e26-39f3-4fb1-b1b8-d3b1cd6a27de\") " Mar 09 09:25:48 crc kubenswrapper[4792]: I0309 09:25:48.438189 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee2b0e26-39f3-4fb1-b1b8-d3b1cd6a27de-config\") pod \"ee2b0e26-39f3-4fb1-b1b8-d3b1cd6a27de\" (UID: \"ee2b0e26-39f3-4fb1-b1b8-d3b1cd6a27de\") " Mar 09 09:25:48 crc kubenswrapper[4792]: I0309 09:25:48.438332 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k2j5f\" (UniqueName: \"kubernetes.io/projected/ee2b0e26-39f3-4fb1-b1b8-d3b1cd6a27de-kube-api-access-k2j5f\") pod \"ee2b0e26-39f3-4fb1-b1b8-d3b1cd6a27de\" (UID: \"ee2b0e26-39f3-4fb1-b1b8-d3b1cd6a27de\") " Mar 09 09:25:48 crc kubenswrapper[4792]: I0309 09:25:48.442964 4792 scope.go:117] "RemoveContainer" containerID="9bd9580be66d68e2e19b6e325d6ee174ff3635f01de04b33d731477f5b15e9b1" Mar 09 09:25:48 crc kubenswrapper[4792]: I0309 09:25:48.446392 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee2b0e26-39f3-4fb1-b1b8-d3b1cd6a27de-kube-api-access-k2j5f" (OuterVolumeSpecName: "kube-api-access-k2j5f") pod "ee2b0e26-39f3-4fb1-b1b8-d3b1cd6a27de" (UID: "ee2b0e26-39f3-4fb1-b1b8-d3b1cd6a27de"). InnerVolumeSpecName "kube-api-access-k2j5f". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:25:48 crc kubenswrapper[4792]: I0309 09:25:48.487410 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ee2b0e26-39f3-4fb1-b1b8-d3b1cd6a27de-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ee2b0e26-39f3-4fb1-b1b8-d3b1cd6a27de" (UID: "ee2b0e26-39f3-4fb1-b1b8-d3b1cd6a27de"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:25:48 crc kubenswrapper[4792]: I0309 09:25:48.500104 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ee2b0e26-39f3-4fb1-b1b8-d3b1cd6a27de-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ee2b0e26-39f3-4fb1-b1b8-d3b1cd6a27de" (UID: "ee2b0e26-39f3-4fb1-b1b8-d3b1cd6a27de"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:25:48 crc kubenswrapper[4792]: I0309 09:25:48.505200 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ee2b0e26-39f3-4fb1-b1b8-d3b1cd6a27de-config" (OuterVolumeSpecName: "config") pod "ee2b0e26-39f3-4fb1-b1b8-d3b1cd6a27de" (UID: "ee2b0e26-39f3-4fb1-b1b8-d3b1cd6a27de"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:25:48 crc kubenswrapper[4792]: I0309 09:25:48.531563 4792 scope.go:117] "RemoveContainer" containerID="421766db483ea88af46c41ea0b0d12593264b468b700d1ff3330181c9cbc6334" Mar 09 09:25:48 crc kubenswrapper[4792]: E0309 09:25:48.532285 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"421766db483ea88af46c41ea0b0d12593264b468b700d1ff3330181c9cbc6334\": container with ID starting with 421766db483ea88af46c41ea0b0d12593264b468b700d1ff3330181c9cbc6334 not found: ID does not exist" containerID="421766db483ea88af46c41ea0b0d12593264b468b700d1ff3330181c9cbc6334" Mar 09 09:25:48 crc kubenswrapper[4792]: I0309 09:25:48.532344 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"421766db483ea88af46c41ea0b0d12593264b468b700d1ff3330181c9cbc6334"} err="failed to get container status \"421766db483ea88af46c41ea0b0d12593264b468b700d1ff3330181c9cbc6334\": rpc error: code = NotFound desc = could not find container 
\"421766db483ea88af46c41ea0b0d12593264b468b700d1ff3330181c9cbc6334\": container with ID starting with 421766db483ea88af46c41ea0b0d12593264b468b700d1ff3330181c9cbc6334 not found: ID does not exist" Mar 09 09:25:48 crc kubenswrapper[4792]: I0309 09:25:48.532374 4792 scope.go:117] "RemoveContainer" containerID="9bd9580be66d68e2e19b6e325d6ee174ff3635f01de04b33d731477f5b15e9b1" Mar 09 09:25:48 crc kubenswrapper[4792]: E0309 09:25:48.532722 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9bd9580be66d68e2e19b6e325d6ee174ff3635f01de04b33d731477f5b15e9b1\": container with ID starting with 9bd9580be66d68e2e19b6e325d6ee174ff3635f01de04b33d731477f5b15e9b1 not found: ID does not exist" containerID="9bd9580be66d68e2e19b6e325d6ee174ff3635f01de04b33d731477f5b15e9b1" Mar 09 09:25:48 crc kubenswrapper[4792]: I0309 09:25:48.532749 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9bd9580be66d68e2e19b6e325d6ee174ff3635f01de04b33d731477f5b15e9b1"} err="failed to get container status \"9bd9580be66d68e2e19b6e325d6ee174ff3635f01de04b33d731477f5b15e9b1\": rpc error: code = NotFound desc = could not find container \"9bd9580be66d68e2e19b6e325d6ee174ff3635f01de04b33d731477f5b15e9b1\": container with ID starting with 9bd9580be66d68e2e19b6e325d6ee174ff3635f01de04b33d731477f5b15e9b1 not found: ID does not exist" Mar 09 09:25:48 crc kubenswrapper[4792]: I0309 09:25:48.540425 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k2j5f\" (UniqueName: \"kubernetes.io/projected/ee2b0e26-39f3-4fb1-b1b8-d3b1cd6a27de-kube-api-access-k2j5f\") on node \"crc\" DevicePath \"\"" Mar 09 09:25:48 crc kubenswrapper[4792]: I0309 09:25:48.540470 4792 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ee2b0e26-39f3-4fb1-b1b8-d3b1cd6a27de-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 09 09:25:48 crc 
kubenswrapper[4792]: I0309 09:25:48.540484 4792 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ee2b0e26-39f3-4fb1-b1b8-d3b1cd6a27de-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 09 09:25:48 crc kubenswrapper[4792]: I0309 09:25:48.540494 4792 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee2b0e26-39f3-4fb1-b1b8-d3b1cd6a27de-config\") on node \"crc\" DevicePath \"\"" Mar 09 09:25:48 crc kubenswrapper[4792]: I0309 09:25:48.759531 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-795cf8b45c-gkmjw"] Mar 09 09:25:48 crc kubenswrapper[4792]: I0309 09:25:48.766461 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-795cf8b45c-gkmjw"] Mar 09 09:25:48 crc kubenswrapper[4792]: I0309 09:25:48.859031 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Mar 09 09:25:49 crc kubenswrapper[4792]: I0309 09:25:49.672274 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ee2b0e26-39f3-4fb1-b1b8-d3b1cd6a27de" path="/var/lib/kubelet/pods/ee2b0e26-39f3-4fb1-b1b8-d3b1cd6a27de/volumes" Mar 09 09:25:50 crc kubenswrapper[4792]: I0309 09:25:50.546955 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Mar 09 09:25:50 crc kubenswrapper[4792]: I0309 09:25:50.633523 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Mar 09 09:25:51 crc kubenswrapper[4792]: I0309 09:25:51.938513 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-a9dd-account-create-update-4b996"] Mar 09 09:25:51 crc kubenswrapper[4792]: E0309 09:25:51.938830 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee2b0e26-39f3-4fb1-b1b8-d3b1cd6a27de" containerName="dnsmasq-dns" Mar 09 09:25:51 
crc kubenswrapper[4792]: I0309 09:25:51.938846 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee2b0e26-39f3-4fb1-b1b8-d3b1cd6a27de" containerName="dnsmasq-dns" Mar 09 09:25:51 crc kubenswrapper[4792]: E0309 09:25:51.938904 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee2b0e26-39f3-4fb1-b1b8-d3b1cd6a27de" containerName="init" Mar 09 09:25:51 crc kubenswrapper[4792]: I0309 09:25:51.938917 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee2b0e26-39f3-4fb1-b1b8-d3b1cd6a27de" containerName="init" Mar 09 09:25:51 crc kubenswrapper[4792]: I0309 09:25:51.939111 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee2b0e26-39f3-4fb1-b1b8-d3b1cd6a27de" containerName="dnsmasq-dns" Mar 09 09:25:51 crc kubenswrapper[4792]: I0309 09:25:51.939686 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-a9dd-account-create-update-4b996" Mar 09 09:25:51 crc kubenswrapper[4792]: I0309 09:25:51.942042 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Mar 09 09:25:51 crc kubenswrapper[4792]: I0309 09:25:51.956681 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-a9dd-account-create-update-4b996"] Mar 09 09:25:51 crc kubenswrapper[4792]: I0309 09:25:51.983132 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-8rmrl"] Mar 09 09:25:51 crc kubenswrapper[4792]: I0309 09:25:51.984328 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-8rmrl" Mar 09 09:25:52 crc kubenswrapper[4792]: I0309 09:25:52.009428 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-8rmrl"] Mar 09 09:25:52 crc kubenswrapper[4792]: I0309 09:25:52.141104 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f020a8c2-5769-444a-96ee-21af21655306-operator-scripts\") pod \"glance-db-create-8rmrl\" (UID: \"f020a8c2-5769-444a-96ee-21af21655306\") " pod="openstack/glance-db-create-8rmrl" Mar 09 09:25:52 crc kubenswrapper[4792]: I0309 09:25:52.141173 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d3c50ddf-5317-4bd4-9fb2-c3a69a41b3d8-operator-scripts\") pod \"glance-a9dd-account-create-update-4b996\" (UID: \"d3c50ddf-5317-4bd4-9fb2-c3a69a41b3d8\") " pod="openstack/glance-a9dd-account-create-update-4b996" Mar 09 09:25:52 crc kubenswrapper[4792]: I0309 09:25:52.141191 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dnmps\" (UniqueName: \"kubernetes.io/projected/d3c50ddf-5317-4bd4-9fb2-c3a69a41b3d8-kube-api-access-dnmps\") pod \"glance-a9dd-account-create-update-4b996\" (UID: \"d3c50ddf-5317-4bd4-9fb2-c3a69a41b3d8\") " pod="openstack/glance-a9dd-account-create-update-4b996" Mar 09 09:25:52 crc kubenswrapper[4792]: I0309 09:25:52.141209 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w2jpt\" (UniqueName: \"kubernetes.io/projected/f020a8c2-5769-444a-96ee-21af21655306-kube-api-access-w2jpt\") pod \"glance-db-create-8rmrl\" (UID: \"f020a8c2-5769-444a-96ee-21af21655306\") " pod="openstack/glance-db-create-8rmrl" Mar 09 09:25:52 crc kubenswrapper[4792]: I0309 09:25:52.242621 4792 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d3c50ddf-5317-4bd4-9fb2-c3a69a41b3d8-operator-scripts\") pod \"glance-a9dd-account-create-update-4b996\" (UID: \"d3c50ddf-5317-4bd4-9fb2-c3a69a41b3d8\") " pod="openstack/glance-a9dd-account-create-update-4b996" Mar 09 09:25:52 crc kubenswrapper[4792]: I0309 09:25:52.242688 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dnmps\" (UniqueName: \"kubernetes.io/projected/d3c50ddf-5317-4bd4-9fb2-c3a69a41b3d8-kube-api-access-dnmps\") pod \"glance-a9dd-account-create-update-4b996\" (UID: \"d3c50ddf-5317-4bd4-9fb2-c3a69a41b3d8\") " pod="openstack/glance-a9dd-account-create-update-4b996" Mar 09 09:25:52 crc kubenswrapper[4792]: I0309 09:25:52.242728 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w2jpt\" (UniqueName: \"kubernetes.io/projected/f020a8c2-5769-444a-96ee-21af21655306-kube-api-access-w2jpt\") pod \"glance-db-create-8rmrl\" (UID: \"f020a8c2-5769-444a-96ee-21af21655306\") " pod="openstack/glance-db-create-8rmrl" Mar 09 09:25:52 crc kubenswrapper[4792]: I0309 09:25:52.242989 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f020a8c2-5769-444a-96ee-21af21655306-operator-scripts\") pod \"glance-db-create-8rmrl\" (UID: \"f020a8c2-5769-444a-96ee-21af21655306\") " pod="openstack/glance-db-create-8rmrl" Mar 09 09:25:52 crc kubenswrapper[4792]: I0309 09:25:52.243426 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d3c50ddf-5317-4bd4-9fb2-c3a69a41b3d8-operator-scripts\") pod \"glance-a9dd-account-create-update-4b996\" (UID: \"d3c50ddf-5317-4bd4-9fb2-c3a69a41b3d8\") " pod="openstack/glance-a9dd-account-create-update-4b996" Mar 09 09:25:52 crc kubenswrapper[4792]: I0309 
09:25:52.243628 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f020a8c2-5769-444a-96ee-21af21655306-operator-scripts\") pod \"glance-db-create-8rmrl\" (UID: \"f020a8c2-5769-444a-96ee-21af21655306\") " pod="openstack/glance-db-create-8rmrl" Mar 09 09:25:52 crc kubenswrapper[4792]: I0309 09:25:52.260820 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w2jpt\" (UniqueName: \"kubernetes.io/projected/f020a8c2-5769-444a-96ee-21af21655306-kube-api-access-w2jpt\") pod \"glance-db-create-8rmrl\" (UID: \"f020a8c2-5769-444a-96ee-21af21655306\") " pod="openstack/glance-db-create-8rmrl" Mar 09 09:25:52 crc kubenswrapper[4792]: I0309 09:25:52.275558 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dnmps\" (UniqueName: \"kubernetes.io/projected/d3c50ddf-5317-4bd4-9fb2-c3a69a41b3d8-kube-api-access-dnmps\") pod \"glance-a9dd-account-create-update-4b996\" (UID: \"d3c50ddf-5317-4bd4-9fb2-c3a69a41b3d8\") " pod="openstack/glance-a9dd-account-create-update-4b996" Mar 09 09:25:52 crc kubenswrapper[4792]: I0309 09:25:52.306539 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-8rmrl" Mar 09 09:25:52 crc kubenswrapper[4792]: I0309 09:25:52.563437 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-a9dd-account-create-update-4b996" Mar 09 09:25:52 crc kubenswrapper[4792]: I0309 09:25:52.770910 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-8rmrl"] Mar 09 09:25:53 crc kubenswrapper[4792]: I0309 09:25:53.035601 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-a9dd-account-create-update-4b996"] Mar 09 09:25:53 crc kubenswrapper[4792]: I0309 09:25:53.460199 4792 generic.go:334] "Generic (PLEG): container finished" podID="f020a8c2-5769-444a-96ee-21af21655306" containerID="434f86a5bfe02916bbcdda6126e7faef064706b34062cc52af17a905e7d4557c" exitCode=0 Mar 09 09:25:53 crc kubenswrapper[4792]: I0309 09:25:53.460323 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-8rmrl" event={"ID":"f020a8c2-5769-444a-96ee-21af21655306","Type":"ContainerDied","Data":"434f86a5bfe02916bbcdda6126e7faef064706b34062cc52af17a905e7d4557c"} Mar 09 09:25:53 crc kubenswrapper[4792]: I0309 09:25:53.460567 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-8rmrl" event={"ID":"f020a8c2-5769-444a-96ee-21af21655306","Type":"ContainerStarted","Data":"71b428d8192066ae74c60c0e22ff8e74d216b211dd3516ddbf0657dd0e6b919c"} Mar 09 09:25:53 crc kubenswrapper[4792]: I0309 09:25:53.462722 4792 generic.go:334] "Generic (PLEG): container finished" podID="d3c50ddf-5317-4bd4-9fb2-c3a69a41b3d8" containerID="5fd8b27c01a11def787b3e7d0415a5f4c7b3f3fe4c170c7692b8cac930f3e052" exitCode=0 Mar 09 09:25:53 crc kubenswrapper[4792]: I0309 09:25:53.462787 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-a9dd-account-create-update-4b996" event={"ID":"d3c50ddf-5317-4bd4-9fb2-c3a69a41b3d8","Type":"ContainerDied","Data":"5fd8b27c01a11def787b3e7d0415a5f4c7b3f3fe4c170c7692b8cac930f3e052"} Mar 09 09:25:53 crc kubenswrapper[4792]: I0309 09:25:53.462985 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/glance-a9dd-account-create-update-4b996" event={"ID":"d3c50ddf-5317-4bd4-9fb2-c3a69a41b3d8","Type":"ContainerStarted","Data":"8025735ad8958b4d089225451d50e20d71efa345f2f623b7de67b80aac15b380"} Mar 09 09:25:53 crc kubenswrapper[4792]: I0309 09:25:53.991912 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-75plx"] Mar 09 09:25:53 crc kubenswrapper[4792]: I0309 09:25:53.993842 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-75plx" Mar 09 09:25:53 crc kubenswrapper[4792]: I0309 09:25:53.998276 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Mar 09 09:25:54 crc kubenswrapper[4792]: I0309 09:25:54.032807 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-75plx"] Mar 09 09:25:54 crc kubenswrapper[4792]: I0309 09:25:54.177877 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4rz9p\" (UniqueName: \"kubernetes.io/projected/6e714f5e-5cca-4397-8a64-244ecfad9907-kube-api-access-4rz9p\") pod \"root-account-create-update-75plx\" (UID: \"6e714f5e-5cca-4397-8a64-244ecfad9907\") " pod="openstack/root-account-create-update-75plx" Mar 09 09:25:54 crc kubenswrapper[4792]: I0309 09:25:54.178094 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6e714f5e-5cca-4397-8a64-244ecfad9907-operator-scripts\") pod \"root-account-create-update-75plx\" (UID: \"6e714f5e-5cca-4397-8a64-244ecfad9907\") " pod="openstack/root-account-create-update-75plx" Mar 09 09:25:54 crc kubenswrapper[4792]: I0309 09:25:54.279523 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4rz9p\" (UniqueName: 
\"kubernetes.io/projected/6e714f5e-5cca-4397-8a64-244ecfad9907-kube-api-access-4rz9p\") pod \"root-account-create-update-75plx\" (UID: \"6e714f5e-5cca-4397-8a64-244ecfad9907\") " pod="openstack/root-account-create-update-75plx" Mar 09 09:25:54 crc kubenswrapper[4792]: I0309 09:25:54.279651 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6e714f5e-5cca-4397-8a64-244ecfad9907-operator-scripts\") pod \"root-account-create-update-75plx\" (UID: \"6e714f5e-5cca-4397-8a64-244ecfad9907\") " pod="openstack/root-account-create-update-75plx" Mar 09 09:25:54 crc kubenswrapper[4792]: I0309 09:25:54.282416 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6e714f5e-5cca-4397-8a64-244ecfad9907-operator-scripts\") pod \"root-account-create-update-75plx\" (UID: \"6e714f5e-5cca-4397-8a64-244ecfad9907\") " pod="openstack/root-account-create-update-75plx" Mar 09 09:25:54 crc kubenswrapper[4792]: I0309 09:25:54.308666 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4rz9p\" (UniqueName: \"kubernetes.io/projected/6e714f5e-5cca-4397-8a64-244ecfad9907-kube-api-access-4rz9p\") pod \"root-account-create-update-75plx\" (UID: \"6e714f5e-5cca-4397-8a64-244ecfad9907\") " pod="openstack/root-account-create-update-75plx" Mar 09 09:25:54 crc kubenswrapper[4792]: I0309 09:25:54.311479 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-75plx" Mar 09 09:25:54 crc kubenswrapper[4792]: I0309 09:25:54.758835 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-75plx"] Mar 09 09:25:54 crc kubenswrapper[4792]: W0309 09:25:54.791232 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6e714f5e_5cca_4397_8a64_244ecfad9907.slice/crio-540eb0c104fb69f2bc23448e544742d0814880fc557770a7c734c2448d743623 WatchSource:0}: Error finding container 540eb0c104fb69f2bc23448e544742d0814880fc557770a7c734c2448d743623: Status 404 returned error can't find the container with id 540eb0c104fb69f2bc23448e544742d0814880fc557770a7c734c2448d743623 Mar 09 09:25:54 crc kubenswrapper[4792]: I0309 09:25:54.839629 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-8rmrl" Mar 09 09:25:54 crc kubenswrapper[4792]: I0309 09:25:54.922298 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-a9dd-account-create-update-4b996" Mar 09 09:25:55 crc kubenswrapper[4792]: I0309 09:25:55.000419 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w2jpt\" (UniqueName: \"kubernetes.io/projected/f020a8c2-5769-444a-96ee-21af21655306-kube-api-access-w2jpt\") pod \"f020a8c2-5769-444a-96ee-21af21655306\" (UID: \"f020a8c2-5769-444a-96ee-21af21655306\") " Mar 09 09:25:55 crc kubenswrapper[4792]: I0309 09:25:55.000617 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f020a8c2-5769-444a-96ee-21af21655306-operator-scripts\") pod \"f020a8c2-5769-444a-96ee-21af21655306\" (UID: \"f020a8c2-5769-444a-96ee-21af21655306\") " Mar 09 09:25:55 crc kubenswrapper[4792]: I0309 09:25:55.001846 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f020a8c2-5769-444a-96ee-21af21655306-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f020a8c2-5769-444a-96ee-21af21655306" (UID: "f020a8c2-5769-444a-96ee-21af21655306"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:25:55 crc kubenswrapper[4792]: I0309 09:25:55.009992 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f020a8c2-5769-444a-96ee-21af21655306-kube-api-access-w2jpt" (OuterVolumeSpecName: "kube-api-access-w2jpt") pod "f020a8c2-5769-444a-96ee-21af21655306" (UID: "f020a8c2-5769-444a-96ee-21af21655306"). InnerVolumeSpecName "kube-api-access-w2jpt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:25:55 crc kubenswrapper[4792]: I0309 09:25:55.102211 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dnmps\" (UniqueName: \"kubernetes.io/projected/d3c50ddf-5317-4bd4-9fb2-c3a69a41b3d8-kube-api-access-dnmps\") pod \"d3c50ddf-5317-4bd4-9fb2-c3a69a41b3d8\" (UID: \"d3c50ddf-5317-4bd4-9fb2-c3a69a41b3d8\") " Mar 09 09:25:55 crc kubenswrapper[4792]: I0309 09:25:55.102352 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d3c50ddf-5317-4bd4-9fb2-c3a69a41b3d8-operator-scripts\") pod \"d3c50ddf-5317-4bd4-9fb2-c3a69a41b3d8\" (UID: \"d3c50ddf-5317-4bd4-9fb2-c3a69a41b3d8\") " Mar 09 09:25:55 crc kubenswrapper[4792]: I0309 09:25:55.102782 4792 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f020a8c2-5769-444a-96ee-21af21655306-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 09:25:55 crc kubenswrapper[4792]: I0309 09:25:55.102802 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w2jpt\" (UniqueName: \"kubernetes.io/projected/f020a8c2-5769-444a-96ee-21af21655306-kube-api-access-w2jpt\") on node \"crc\" DevicePath \"\"" Mar 09 09:25:55 crc kubenswrapper[4792]: I0309 09:25:55.102828 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d3c50ddf-5317-4bd4-9fb2-c3a69a41b3d8-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d3c50ddf-5317-4bd4-9fb2-c3a69a41b3d8" (UID: "d3c50ddf-5317-4bd4-9fb2-c3a69a41b3d8"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:25:55 crc kubenswrapper[4792]: I0309 09:25:55.105300 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3c50ddf-5317-4bd4-9fb2-c3a69a41b3d8-kube-api-access-dnmps" (OuterVolumeSpecName: "kube-api-access-dnmps") pod "d3c50ddf-5317-4bd4-9fb2-c3a69a41b3d8" (UID: "d3c50ddf-5317-4bd4-9fb2-c3a69a41b3d8"). InnerVolumeSpecName "kube-api-access-dnmps". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:25:55 crc kubenswrapper[4792]: I0309 09:25:55.204566 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dnmps\" (UniqueName: \"kubernetes.io/projected/d3c50ddf-5317-4bd4-9fb2-c3a69a41b3d8-kube-api-access-dnmps\") on node \"crc\" DevicePath \"\"" Mar 09 09:25:55 crc kubenswrapper[4792]: I0309 09:25:55.204611 4792 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d3c50ddf-5317-4bd4-9fb2-c3a69a41b3d8-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 09:25:55 crc kubenswrapper[4792]: I0309 09:25:55.478283 4792 generic.go:334] "Generic (PLEG): container finished" podID="6e714f5e-5cca-4397-8a64-244ecfad9907" containerID="62d10d444eeac48b6552bf53278c005981de8c2ea7b744b584cf6191830ab056" exitCode=0 Mar 09 09:25:55 crc kubenswrapper[4792]: I0309 09:25:55.478390 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-75plx" event={"ID":"6e714f5e-5cca-4397-8a64-244ecfad9907","Type":"ContainerDied","Data":"62d10d444eeac48b6552bf53278c005981de8c2ea7b744b584cf6191830ab056"} Mar 09 09:25:55 crc kubenswrapper[4792]: I0309 09:25:55.478699 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-75plx" event={"ID":"6e714f5e-5cca-4397-8a64-244ecfad9907","Type":"ContainerStarted","Data":"540eb0c104fb69f2bc23448e544742d0814880fc557770a7c734c2448d743623"} Mar 09 09:25:55 crc 
kubenswrapper[4792]: I0309 09:25:55.480537 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-a9dd-account-create-update-4b996" event={"ID":"d3c50ddf-5317-4bd4-9fb2-c3a69a41b3d8","Type":"ContainerDied","Data":"8025735ad8958b4d089225451d50e20d71efa345f2f623b7de67b80aac15b380"} Mar 09 09:25:55 crc kubenswrapper[4792]: I0309 09:25:55.480585 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8025735ad8958b4d089225451d50e20d71efa345f2f623b7de67b80aac15b380" Mar 09 09:25:55 crc kubenswrapper[4792]: I0309 09:25:55.480548 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-a9dd-account-create-update-4b996" Mar 09 09:25:55 crc kubenswrapper[4792]: I0309 09:25:55.481986 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-8rmrl" event={"ID":"f020a8c2-5769-444a-96ee-21af21655306","Type":"ContainerDied","Data":"71b428d8192066ae74c60c0e22ff8e74d216b211dd3516ddbf0657dd0e6b919c"} Mar 09 09:25:55 crc kubenswrapper[4792]: I0309 09:25:55.482016 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-8rmrl" Mar 09 09:25:55 crc kubenswrapper[4792]: I0309 09:25:55.482025 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="71b428d8192066ae74c60c0e22ff8e74d216b211dd3516ddbf0657dd0e6b919c" Mar 09 09:25:56 crc kubenswrapper[4792]: I0309 09:25:56.804050 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-75plx" Mar 09 09:25:56 crc kubenswrapper[4792]: I0309 09:25:56.830825 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4rz9p\" (UniqueName: \"kubernetes.io/projected/6e714f5e-5cca-4397-8a64-244ecfad9907-kube-api-access-4rz9p\") pod \"6e714f5e-5cca-4397-8a64-244ecfad9907\" (UID: \"6e714f5e-5cca-4397-8a64-244ecfad9907\") " Mar 09 09:25:56 crc kubenswrapper[4792]: I0309 09:25:56.830905 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6e714f5e-5cca-4397-8a64-244ecfad9907-operator-scripts\") pod \"6e714f5e-5cca-4397-8a64-244ecfad9907\" (UID: \"6e714f5e-5cca-4397-8a64-244ecfad9907\") " Mar 09 09:25:56 crc kubenswrapper[4792]: I0309 09:25:56.831864 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6e714f5e-5cca-4397-8a64-244ecfad9907-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6e714f5e-5cca-4397-8a64-244ecfad9907" (UID: "6e714f5e-5cca-4397-8a64-244ecfad9907"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:25:56 crc kubenswrapper[4792]: I0309 09:25:56.832054 4792 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6e714f5e-5cca-4397-8a64-244ecfad9907-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 09:25:56 crc kubenswrapper[4792]: I0309 09:25:56.836920 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e714f5e-5cca-4397-8a64-244ecfad9907-kube-api-access-4rz9p" (OuterVolumeSpecName: "kube-api-access-4rz9p") pod "6e714f5e-5cca-4397-8a64-244ecfad9907" (UID: "6e714f5e-5cca-4397-8a64-244ecfad9907"). InnerVolumeSpecName "kube-api-access-4rz9p". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:25:56 crc kubenswrapper[4792]: I0309 09:25:56.933966 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4rz9p\" (UniqueName: \"kubernetes.io/projected/6e714f5e-5cca-4397-8a64-244ecfad9907-kube-api-access-4rz9p\") on node \"crc\" DevicePath \"\"" Mar 09 09:25:57 crc kubenswrapper[4792]: I0309 09:25:57.183378 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-tknqj"] Mar 09 09:25:57 crc kubenswrapper[4792]: E0309 09:25:57.184507 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3c50ddf-5317-4bd4-9fb2-c3a69a41b3d8" containerName="mariadb-account-create-update" Mar 09 09:25:57 crc kubenswrapper[4792]: I0309 09:25:57.184607 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3c50ddf-5317-4bd4-9fb2-c3a69a41b3d8" containerName="mariadb-account-create-update" Mar 09 09:25:57 crc kubenswrapper[4792]: E0309 09:25:57.184707 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f020a8c2-5769-444a-96ee-21af21655306" containerName="mariadb-database-create" Mar 09 09:25:57 crc kubenswrapper[4792]: I0309 09:25:57.184762 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="f020a8c2-5769-444a-96ee-21af21655306" containerName="mariadb-database-create" Mar 09 09:25:57 crc kubenswrapper[4792]: E0309 09:25:57.184819 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e714f5e-5cca-4397-8a64-244ecfad9907" containerName="mariadb-account-create-update" Mar 09 09:25:57 crc kubenswrapper[4792]: I0309 09:25:57.184883 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e714f5e-5cca-4397-8a64-244ecfad9907" containerName="mariadb-account-create-update" Mar 09 09:25:57 crc kubenswrapper[4792]: I0309 09:25:57.185079 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="f020a8c2-5769-444a-96ee-21af21655306" containerName="mariadb-database-create" Mar 09 09:25:57 crc 
kubenswrapper[4792]: I0309 09:25:57.185161 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e714f5e-5cca-4397-8a64-244ecfad9907" containerName="mariadb-account-create-update" Mar 09 09:25:57 crc kubenswrapper[4792]: I0309 09:25:57.185231 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3c50ddf-5317-4bd4-9fb2-c3a69a41b3d8" containerName="mariadb-account-create-update" Mar 09 09:25:57 crc kubenswrapper[4792]: I0309 09:25:57.185749 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-tknqj" Mar 09 09:25:57 crc kubenswrapper[4792]: I0309 09:25:57.188212 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Mar 09 09:25:57 crc kubenswrapper[4792]: I0309 09:25:57.188620 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-t8kmn" Mar 09 09:25:57 crc kubenswrapper[4792]: I0309 09:25:57.211021 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-tknqj"] Mar 09 09:25:57 crc kubenswrapper[4792]: I0309 09:25:57.340630 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/efc711fe-a152-4902-af60-09a6bed9344a-config-data\") pod \"glance-db-sync-tknqj\" (UID: \"efc711fe-a152-4902-af60-09a6bed9344a\") " pod="openstack/glance-db-sync-tknqj" Mar 09 09:25:57 crc kubenswrapper[4792]: I0309 09:25:57.340764 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wjw6p\" (UniqueName: \"kubernetes.io/projected/efc711fe-a152-4902-af60-09a6bed9344a-kube-api-access-wjw6p\") pod \"glance-db-sync-tknqj\" (UID: \"efc711fe-a152-4902-af60-09a6bed9344a\") " pod="openstack/glance-db-sync-tknqj" Mar 09 09:25:57 crc kubenswrapper[4792]: I0309 09:25:57.340877 4792 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/efc711fe-a152-4902-af60-09a6bed9344a-db-sync-config-data\") pod \"glance-db-sync-tknqj\" (UID: \"efc711fe-a152-4902-af60-09a6bed9344a\") " pod="openstack/glance-db-sync-tknqj" Mar 09 09:25:57 crc kubenswrapper[4792]: I0309 09:25:57.340962 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efc711fe-a152-4902-af60-09a6bed9344a-combined-ca-bundle\") pod \"glance-db-sync-tknqj\" (UID: \"efc711fe-a152-4902-af60-09a6bed9344a\") " pod="openstack/glance-db-sync-tknqj" Mar 09 09:25:57 crc kubenswrapper[4792]: I0309 09:25:57.444106 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/efc711fe-a152-4902-af60-09a6bed9344a-config-data\") pod \"glance-db-sync-tknqj\" (UID: \"efc711fe-a152-4902-af60-09a6bed9344a\") " pod="openstack/glance-db-sync-tknqj" Mar 09 09:25:57 crc kubenswrapper[4792]: I0309 09:25:57.444166 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wjw6p\" (UniqueName: \"kubernetes.io/projected/efc711fe-a152-4902-af60-09a6bed9344a-kube-api-access-wjw6p\") pod \"glance-db-sync-tknqj\" (UID: \"efc711fe-a152-4902-af60-09a6bed9344a\") " pod="openstack/glance-db-sync-tknqj" Mar 09 09:25:57 crc kubenswrapper[4792]: I0309 09:25:57.444209 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/efc711fe-a152-4902-af60-09a6bed9344a-db-sync-config-data\") pod \"glance-db-sync-tknqj\" (UID: \"efc711fe-a152-4902-af60-09a6bed9344a\") " pod="openstack/glance-db-sync-tknqj" Mar 09 09:25:57 crc kubenswrapper[4792]: I0309 09:25:57.444239 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/efc711fe-a152-4902-af60-09a6bed9344a-combined-ca-bundle\") pod \"glance-db-sync-tknqj\" (UID: \"efc711fe-a152-4902-af60-09a6bed9344a\") " pod="openstack/glance-db-sync-tknqj" Mar 09 09:25:57 crc kubenswrapper[4792]: I0309 09:25:57.450832 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/efc711fe-a152-4902-af60-09a6bed9344a-config-data\") pod \"glance-db-sync-tknqj\" (UID: \"efc711fe-a152-4902-af60-09a6bed9344a\") " pod="openstack/glance-db-sync-tknqj" Mar 09 09:25:57 crc kubenswrapper[4792]: I0309 09:25:57.450834 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/efc711fe-a152-4902-af60-09a6bed9344a-db-sync-config-data\") pod \"glance-db-sync-tknqj\" (UID: \"efc711fe-a152-4902-af60-09a6bed9344a\") " pod="openstack/glance-db-sync-tknqj" Mar 09 09:25:57 crc kubenswrapper[4792]: I0309 09:25:57.464153 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efc711fe-a152-4902-af60-09a6bed9344a-combined-ca-bundle\") pod \"glance-db-sync-tknqj\" (UID: \"efc711fe-a152-4902-af60-09a6bed9344a\") " pod="openstack/glance-db-sync-tknqj" Mar 09 09:25:57 crc kubenswrapper[4792]: I0309 09:25:57.468863 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wjw6p\" (UniqueName: \"kubernetes.io/projected/efc711fe-a152-4902-af60-09a6bed9344a-kube-api-access-wjw6p\") pod \"glance-db-sync-tknqj\" (UID: \"efc711fe-a152-4902-af60-09a6bed9344a\") " pod="openstack/glance-db-sync-tknqj" Mar 09 09:25:57 crc kubenswrapper[4792]: I0309 09:25:57.497632 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-75plx" 
event={"ID":"6e714f5e-5cca-4397-8a64-244ecfad9907","Type":"ContainerDied","Data":"540eb0c104fb69f2bc23448e544742d0814880fc557770a7c734c2448d743623"} Mar 09 09:25:57 crc kubenswrapper[4792]: I0309 09:25:57.497671 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="540eb0c104fb69f2bc23448e544742d0814880fc557770a7c734c2448d743623" Mar 09 09:25:57 crc kubenswrapper[4792]: I0309 09:25:57.497738 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-75plx" Mar 09 09:25:57 crc kubenswrapper[4792]: I0309 09:25:57.524107 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-tknqj" Mar 09 09:25:57 crc kubenswrapper[4792]: I0309 09:25:57.547185 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Mar 09 09:25:57 crc kubenswrapper[4792]: I0309 09:25:57.637147 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-s8k8r"] Mar 09 09:25:57 crc kubenswrapper[4792]: I0309 09:25:57.644222 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-s8k8r" Mar 09 09:25:57 crc kubenswrapper[4792]: I0309 09:25:57.650233 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-s8k8r"] Mar 09 09:25:57 crc kubenswrapper[4792]: I0309 09:25:57.737700 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-eaf0-account-create-update-b2b57"] Mar 09 09:25:57 crc kubenswrapper[4792]: I0309 09:25:57.738705 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-eaf0-account-create-update-b2b57" Mar 09 09:25:57 crc kubenswrapper[4792]: I0309 09:25:57.740780 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Mar 09 09:25:57 crc kubenswrapper[4792]: I0309 09:25:57.748456 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-eaf0-account-create-update-b2b57"] Mar 09 09:25:57 crc kubenswrapper[4792]: I0309 09:25:57.751449 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0f813716-691b-4f49-bbb6-e0486b2d2b31-operator-scripts\") pod \"keystone-db-create-s8k8r\" (UID: \"0f813716-691b-4f49-bbb6-e0486b2d2b31\") " pod="openstack/keystone-db-create-s8k8r" Mar 09 09:25:57 crc kubenswrapper[4792]: I0309 09:25:57.751524 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5s2mg\" (UniqueName: \"kubernetes.io/projected/0e9e2cd8-39ac-46f3-aef1-3c4a81b4a144-kube-api-access-5s2mg\") pod \"keystone-eaf0-account-create-update-b2b57\" (UID: \"0e9e2cd8-39ac-46f3-aef1-3c4a81b4a144\") " pod="openstack/keystone-eaf0-account-create-update-b2b57" Mar 09 09:25:57 crc kubenswrapper[4792]: I0309 09:25:57.751551 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rdrgq\" (UniqueName: \"kubernetes.io/projected/0f813716-691b-4f49-bbb6-e0486b2d2b31-kube-api-access-rdrgq\") pod \"keystone-db-create-s8k8r\" (UID: \"0f813716-691b-4f49-bbb6-e0486b2d2b31\") " pod="openstack/keystone-db-create-s8k8r" Mar 09 09:25:57 crc kubenswrapper[4792]: I0309 09:25:57.751612 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0e9e2cd8-39ac-46f3-aef1-3c4a81b4a144-operator-scripts\") pod 
\"keystone-eaf0-account-create-update-b2b57\" (UID: \"0e9e2cd8-39ac-46f3-aef1-3c4a81b4a144\") " pod="openstack/keystone-eaf0-account-create-update-b2b57"
Mar 09 09:25:57 crc kubenswrapper[4792]: I0309 09:25:57.852509 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0f813716-691b-4f49-bbb6-e0486b2d2b31-operator-scripts\") pod \"keystone-db-create-s8k8r\" (UID: \"0f813716-691b-4f49-bbb6-e0486b2d2b31\") " pod="openstack/keystone-db-create-s8k8r"
Mar 09 09:25:57 crc kubenswrapper[4792]: I0309 09:25:57.852812 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5s2mg\" (UniqueName: \"kubernetes.io/projected/0e9e2cd8-39ac-46f3-aef1-3c4a81b4a144-kube-api-access-5s2mg\") pod \"keystone-eaf0-account-create-update-b2b57\" (UID: \"0e9e2cd8-39ac-46f3-aef1-3c4a81b4a144\") " pod="openstack/keystone-eaf0-account-create-update-b2b57"
Mar 09 09:25:57 crc kubenswrapper[4792]: I0309 09:25:57.852831 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdrgq\" (UniqueName: \"kubernetes.io/projected/0f813716-691b-4f49-bbb6-e0486b2d2b31-kube-api-access-rdrgq\") pod \"keystone-db-create-s8k8r\" (UID: \"0f813716-691b-4f49-bbb6-e0486b2d2b31\") " pod="openstack/keystone-db-create-s8k8r"
Mar 09 09:25:57 crc kubenswrapper[4792]: I0309 09:25:57.852879 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0e9e2cd8-39ac-46f3-aef1-3c4a81b4a144-operator-scripts\") pod \"keystone-eaf0-account-create-update-b2b57\" (UID: \"0e9e2cd8-39ac-46f3-aef1-3c4a81b4a144\") " pod="openstack/keystone-eaf0-account-create-update-b2b57"
Mar 09 09:25:57 crc kubenswrapper[4792]: I0309 09:25:57.853438 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0f813716-691b-4f49-bbb6-e0486b2d2b31-operator-scripts\") pod \"keystone-db-create-s8k8r\" (UID: \"0f813716-691b-4f49-bbb6-e0486b2d2b31\") " pod="openstack/keystone-db-create-s8k8r"
Mar 09 09:25:57 crc kubenswrapper[4792]: I0309 09:25:57.853645 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0e9e2cd8-39ac-46f3-aef1-3c4a81b4a144-operator-scripts\") pod \"keystone-eaf0-account-create-update-b2b57\" (UID: \"0e9e2cd8-39ac-46f3-aef1-3c4a81b4a144\") " pod="openstack/keystone-eaf0-account-create-update-b2b57"
Mar 09 09:25:57 crc kubenswrapper[4792]: I0309 09:25:57.859374 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-zzfgc"]
Mar 09 09:25:57 crc kubenswrapper[4792]: I0309 09:25:57.860935 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-zzfgc"
Mar 09 09:25:57 crc kubenswrapper[4792]: I0309 09:25:57.877475 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdrgq\" (UniqueName: \"kubernetes.io/projected/0f813716-691b-4f49-bbb6-e0486b2d2b31-kube-api-access-rdrgq\") pod \"keystone-db-create-s8k8r\" (UID: \"0f813716-691b-4f49-bbb6-e0486b2d2b31\") " pod="openstack/keystone-db-create-s8k8r"
Mar 09 09:25:57 crc kubenswrapper[4792]: I0309 09:25:57.878959 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-zzfgc"]
Mar 09 09:25:57 crc kubenswrapper[4792]: I0309 09:25:57.899874 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5s2mg\" (UniqueName: \"kubernetes.io/projected/0e9e2cd8-39ac-46f3-aef1-3c4a81b4a144-kube-api-access-5s2mg\") pod \"keystone-eaf0-account-create-update-b2b57\" (UID: \"0e9e2cd8-39ac-46f3-aef1-3c4a81b4a144\") " pod="openstack/keystone-eaf0-account-create-update-b2b57"
Mar 09 09:25:57 crc kubenswrapper[4792]: I0309 09:25:57.945908 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-71f9-account-create-update-5wltv"]
Mar 09 09:25:57 crc kubenswrapper[4792]: I0309 09:25:57.946922 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-71f9-account-create-update-5wltv"
Mar 09 09:25:57 crc kubenswrapper[4792]: I0309 09:25:57.948698 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret"
Mar 09 09:25:57 crc kubenswrapper[4792]: I0309 09:25:57.952751 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-71f9-account-create-update-5wltv"]
Mar 09 09:25:57 crc kubenswrapper[4792]: I0309 09:25:57.954890 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/856fa1f5-ef4a-4122-9a28-cabfe353eaeb-operator-scripts\") pod \"placement-db-create-zzfgc\" (UID: \"856fa1f5-ef4a-4122-9a28-cabfe353eaeb\") " pod="openstack/placement-db-create-zzfgc"
Mar 09 09:25:57 crc kubenswrapper[4792]: I0309 09:25:57.955054 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tbjpp\" (UniqueName: \"kubernetes.io/projected/856fa1f5-ef4a-4122-9a28-cabfe353eaeb-kube-api-access-tbjpp\") pod \"placement-db-create-zzfgc\" (UID: \"856fa1f5-ef4a-4122-9a28-cabfe353eaeb\") " pod="openstack/placement-db-create-zzfgc"
Mar 09 09:25:57 crc kubenswrapper[4792]: I0309 09:25:57.955214 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cd58f0df-aee5-44be-ae2a-33033696a043-operator-scripts\") pod \"placement-71f9-account-create-update-5wltv\" (UID: \"cd58f0df-aee5-44be-ae2a-33033696a043\") " pod="openstack/placement-71f9-account-create-update-5wltv"
Mar 09 09:25:57 crc kubenswrapper[4792]: I0309 09:25:57.955304 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vnbdq\" (UniqueName: \"kubernetes.io/projected/cd58f0df-aee5-44be-ae2a-33033696a043-kube-api-access-vnbdq\") pod \"placement-71f9-account-create-update-5wltv\" (UID: \"cd58f0df-aee5-44be-ae2a-33033696a043\") " pod="openstack/placement-71f9-account-create-update-5wltv"
Mar 09 09:25:57 crc kubenswrapper[4792]: I0309 09:25:57.966019 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-s8k8r"
Mar 09 09:25:58 crc kubenswrapper[4792]: I0309 09:25:58.056625 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tbjpp\" (UniqueName: \"kubernetes.io/projected/856fa1f5-ef4a-4122-9a28-cabfe353eaeb-kube-api-access-tbjpp\") pod \"placement-db-create-zzfgc\" (UID: \"856fa1f5-ef4a-4122-9a28-cabfe353eaeb\") " pod="openstack/placement-db-create-zzfgc"
Mar 09 09:25:58 crc kubenswrapper[4792]: I0309 09:25:58.056696 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cd58f0df-aee5-44be-ae2a-33033696a043-operator-scripts\") pod \"placement-71f9-account-create-update-5wltv\" (UID: \"cd58f0df-aee5-44be-ae2a-33033696a043\") " pod="openstack/placement-71f9-account-create-update-5wltv"
Mar 09 09:25:58 crc kubenswrapper[4792]: I0309 09:25:58.056731 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vnbdq\" (UniqueName: \"kubernetes.io/projected/cd58f0df-aee5-44be-ae2a-33033696a043-kube-api-access-vnbdq\") pod \"placement-71f9-account-create-update-5wltv\" (UID: \"cd58f0df-aee5-44be-ae2a-33033696a043\") " pod="openstack/placement-71f9-account-create-update-5wltv"
Mar 09 09:25:58 crc kubenswrapper[4792]: I0309 09:25:58.056788 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/856fa1f5-ef4a-4122-9a28-cabfe353eaeb-operator-scripts\") pod \"placement-db-create-zzfgc\" (UID: \"856fa1f5-ef4a-4122-9a28-cabfe353eaeb\") " pod="openstack/placement-db-create-zzfgc"
Mar 09 09:25:58 crc kubenswrapper[4792]: I0309 09:25:58.057606 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/856fa1f5-ef4a-4122-9a28-cabfe353eaeb-operator-scripts\") pod \"placement-db-create-zzfgc\" (UID: \"856fa1f5-ef4a-4122-9a28-cabfe353eaeb\") " pod="openstack/placement-db-create-zzfgc"
Mar 09 09:25:58 crc kubenswrapper[4792]: I0309 09:25:58.058404 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cd58f0df-aee5-44be-ae2a-33033696a043-operator-scripts\") pod \"placement-71f9-account-create-update-5wltv\" (UID: \"cd58f0df-aee5-44be-ae2a-33033696a043\") " pod="openstack/placement-71f9-account-create-update-5wltv"
Mar 09 09:25:58 crc kubenswrapper[4792]: I0309 09:25:58.081287 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vnbdq\" (UniqueName: \"kubernetes.io/projected/cd58f0df-aee5-44be-ae2a-33033696a043-kube-api-access-vnbdq\") pod \"placement-71f9-account-create-update-5wltv\" (UID: \"cd58f0df-aee5-44be-ae2a-33033696a043\") " pod="openstack/placement-71f9-account-create-update-5wltv"
Mar 09 09:25:58 crc kubenswrapper[4792]: I0309 09:25:58.087181 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-eaf0-account-create-update-b2b57"
Mar 09 09:25:58 crc kubenswrapper[4792]: I0309 09:25:58.095953 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tbjpp\" (UniqueName: \"kubernetes.io/projected/856fa1f5-ef4a-4122-9a28-cabfe353eaeb-kube-api-access-tbjpp\") pod \"placement-db-create-zzfgc\" (UID: \"856fa1f5-ef4a-4122-9a28-cabfe353eaeb\") " pod="openstack/placement-db-create-zzfgc"
Mar 09 09:25:58 crc kubenswrapper[4792]: I0309 09:25:58.202975 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-tknqj"]
Mar 09 09:25:58 crc kubenswrapper[4792]: I0309 09:25:58.211785 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-s8k8r"]
Mar 09 09:25:58 crc kubenswrapper[4792]: I0309 09:25:58.219951 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-zzfgc"
Mar 09 09:25:58 crc kubenswrapper[4792]: I0309 09:25:58.267027 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-71f9-account-create-update-5wltv"
Mar 09 09:25:58 crc kubenswrapper[4792]: I0309 09:25:58.507271 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-s8k8r" event={"ID":"0f813716-691b-4f49-bbb6-e0486b2d2b31","Type":"ContainerStarted","Data":"e690e01c1bda4fa80fa320895f2ddbaa3a88f6ac109e0fae86eab32e4b37c482"}
Mar 09 09:25:58 crc kubenswrapper[4792]: I0309 09:25:58.507571 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-s8k8r" event={"ID":"0f813716-691b-4f49-bbb6-e0486b2d2b31","Type":"ContainerStarted","Data":"678c961893220ddaeed11eb63646dccd9603b03907713dc83759f2e8d5ef6f92"}
Mar 09 09:25:58 crc kubenswrapper[4792]: I0309 09:25:58.510360 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-tknqj" event={"ID":"efc711fe-a152-4902-af60-09a6bed9344a","Type":"ContainerStarted","Data":"5d945ad85f932e4efa5ef1bf405013d1afd8579f0a5d3648277e5a832c91fd79"}
Mar 09 09:25:58 crc kubenswrapper[4792]: I0309 09:25:58.531859 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-create-s8k8r" podStartSLOduration=1.531840103 podStartE2EDuration="1.531840103s" podCreationTimestamp="2026-03-09 09:25:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:25:58.529206146 +0000 UTC m=+1123.559406898" watchObservedRunningTime="2026-03-09 09:25:58.531840103 +0000 UTC m=+1123.562040855"
Mar 09 09:25:58 crc kubenswrapper[4792]: W0309 09:25:58.557119 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0e9e2cd8_39ac_46f3_aef1_3c4a81b4a144.slice/crio-5b617ab93ed22b8d94a0ee960a0c716284f424edba73fb6ccc85378593b6696c WatchSource:0}: Error finding container 5b617ab93ed22b8d94a0ee960a0c716284f424edba73fb6ccc85378593b6696c: Status 404 returned error can't find the container with id 5b617ab93ed22b8d94a0ee960a0c716284f424edba73fb6ccc85378593b6696c
Mar 09 09:25:58 crc kubenswrapper[4792]: I0309 09:25:58.572860 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-eaf0-account-create-update-b2b57"]
Mar 09 09:25:58 crc kubenswrapper[4792]: I0309 09:25:58.598225 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-71f9-account-create-update-5wltv"]
Mar 09 09:25:58 crc kubenswrapper[4792]: I0309 09:25:58.720973 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-zzfgc"]
Mar 09 09:25:58 crc kubenswrapper[4792]: W0309 09:25:58.750877 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod856fa1f5_ef4a_4122_9a28_cabfe353eaeb.slice/crio-9904115b11e246e3767df40133fe82c683f61d230e424780bf2a13e620c3030d WatchSource:0}: Error finding container 9904115b11e246e3767df40133fe82c683f61d230e424780bf2a13e620c3030d: Status 404 returned error can't find the container with id 9904115b11e246e3767df40133fe82c683f61d230e424780bf2a13e620c3030d
Mar 09 09:25:59 crc kubenswrapper[4792]: I0309 09:25:59.522811 4792 generic.go:334] "Generic (PLEG): container finished" podID="cd58f0df-aee5-44be-ae2a-33033696a043" containerID="1faa98db718e8b2c1afb8d6384f790c449345fd1741e2bfac48a559c6b9e1c5a" exitCode=0
Mar 09 09:25:59 crc kubenswrapper[4792]: I0309 09:25:59.522905 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-71f9-account-create-update-5wltv" event={"ID":"cd58f0df-aee5-44be-ae2a-33033696a043","Type":"ContainerDied","Data":"1faa98db718e8b2c1afb8d6384f790c449345fd1741e2bfac48a559c6b9e1c5a"}
Mar 09 09:25:59 crc kubenswrapper[4792]: I0309 09:25:59.522932 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-71f9-account-create-update-5wltv" event={"ID":"cd58f0df-aee5-44be-ae2a-33033696a043","Type":"ContainerStarted","Data":"c9c1c13e07f0fb729210f4772760849dd828a08132c5915e665c752e9bf875d8"}
Mar 09 09:25:59 crc kubenswrapper[4792]: I0309 09:25:59.527778 4792 generic.go:334] "Generic (PLEG): container finished" podID="0e9e2cd8-39ac-46f3-aef1-3c4a81b4a144" containerID="4924ec075293347d1f95edfef61d222d2858972f95097e529d02087fead3cdb3" exitCode=0
Mar 09 09:25:59 crc kubenswrapper[4792]: I0309 09:25:59.527842 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-eaf0-account-create-update-b2b57" event={"ID":"0e9e2cd8-39ac-46f3-aef1-3c4a81b4a144","Type":"ContainerDied","Data":"4924ec075293347d1f95edfef61d222d2858972f95097e529d02087fead3cdb3"}
Mar 09 09:25:59 crc kubenswrapper[4792]: I0309 09:25:59.527894 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-eaf0-account-create-update-b2b57" event={"ID":"0e9e2cd8-39ac-46f3-aef1-3c4a81b4a144","Type":"ContainerStarted","Data":"5b617ab93ed22b8d94a0ee960a0c716284f424edba73fb6ccc85378593b6696c"}
Mar 09 09:25:59 crc kubenswrapper[4792]: I0309 09:25:59.529318 4792 generic.go:334] "Generic (PLEG): container finished" podID="856fa1f5-ef4a-4122-9a28-cabfe353eaeb" containerID="0aad0fe7657f7fc5d17dac4f6c6fbf72129ee4b6a2e989da744288b54468e4ba" exitCode=0
Mar 09 09:25:59 crc kubenswrapper[4792]: I0309 09:25:59.529388 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-zzfgc" event={"ID":"856fa1f5-ef4a-4122-9a28-cabfe353eaeb","Type":"ContainerDied","Data":"0aad0fe7657f7fc5d17dac4f6c6fbf72129ee4b6a2e989da744288b54468e4ba"}
Mar 09 09:25:59 crc kubenswrapper[4792]: I0309 09:25:59.529409 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-zzfgc" event={"ID":"856fa1f5-ef4a-4122-9a28-cabfe353eaeb","Type":"ContainerStarted","Data":"9904115b11e246e3767df40133fe82c683f61d230e424780bf2a13e620c3030d"}
Mar 09 09:25:59 crc kubenswrapper[4792]: I0309 09:25:59.531511 4792 generic.go:334] "Generic (PLEG): container finished" podID="0f813716-691b-4f49-bbb6-e0486b2d2b31" containerID="e690e01c1bda4fa80fa320895f2ddbaa3a88f6ac109e0fae86eab32e4b37c482" exitCode=0
Mar 09 09:25:59 crc kubenswrapper[4792]: I0309 09:25:59.531545 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-s8k8r" event={"ID":"0f813716-691b-4f49-bbb6-e0486b2d2b31","Type":"ContainerDied","Data":"e690e01c1bda4fa80fa320895f2ddbaa3a88f6ac109e0fae86eab32e4b37c482"}
Mar 09 09:25:59 crc kubenswrapper[4792]: I0309 09:25:59.918471 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-75plx"]
Mar 09 09:25:59 crc kubenswrapper[4792]: I0309 09:25:59.924567 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-75plx"]
Mar 09 09:26:00 crc kubenswrapper[4792]: I0309 09:26:00.131327 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550806-qqlhx"]
Mar 09 09:26:00 crc kubenswrapper[4792]: I0309 09:26:00.132398 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550806-qqlhx"
Mar 09 09:26:00 crc kubenswrapper[4792]: I0309 09:26:00.135640 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fwclj"
Mar 09 09:26:00 crc kubenswrapper[4792]: I0309 09:26:00.135835 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 09 09:26:00 crc kubenswrapper[4792]: I0309 09:26:00.135962 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 09 09:26:00 crc kubenswrapper[4792]: I0309 09:26:00.139613 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550806-qqlhx"]
Mar 09 09:26:00 crc kubenswrapper[4792]: I0309 09:26:00.194696 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k58zf\" (UniqueName: \"kubernetes.io/projected/9dcbe3d4-bd2e-4a4b-864a-77b49c015354-kube-api-access-k58zf\") pod \"auto-csr-approver-29550806-qqlhx\" (UID: \"9dcbe3d4-bd2e-4a4b-864a-77b49c015354\") " pod="openshift-infra/auto-csr-approver-29550806-qqlhx"
Mar 09 09:26:00 crc kubenswrapper[4792]: I0309 09:26:00.296430 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k58zf\" (UniqueName: \"kubernetes.io/projected/9dcbe3d4-bd2e-4a4b-864a-77b49c015354-kube-api-access-k58zf\") pod \"auto-csr-approver-29550806-qqlhx\" (UID: \"9dcbe3d4-bd2e-4a4b-864a-77b49c015354\") " pod="openshift-infra/auto-csr-approver-29550806-qqlhx"
Mar 09 09:26:00 crc kubenswrapper[4792]: I0309 09:26:00.321572 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k58zf\" (UniqueName: \"kubernetes.io/projected/9dcbe3d4-bd2e-4a4b-864a-77b49c015354-kube-api-access-k58zf\") pod \"auto-csr-approver-29550806-qqlhx\" (UID: \"9dcbe3d4-bd2e-4a4b-864a-77b49c015354\") " pod="openshift-infra/auto-csr-approver-29550806-qqlhx"
Mar 09 09:26:00 crc kubenswrapper[4792]: I0309 09:26:00.454641 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550806-qqlhx"
Mar 09 09:26:01 crc kubenswrapper[4792]: I0309 09:26:01.040497 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-s8k8r"
Mar 09 09:26:01 crc kubenswrapper[4792]: I0309 09:26:01.047800 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-zzfgc"
Mar 09 09:26:01 crc kubenswrapper[4792]: I0309 09:26:01.115992 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rdrgq\" (UniqueName: \"kubernetes.io/projected/0f813716-691b-4f49-bbb6-e0486b2d2b31-kube-api-access-rdrgq\") pod \"0f813716-691b-4f49-bbb6-e0486b2d2b31\" (UID: \"0f813716-691b-4f49-bbb6-e0486b2d2b31\") "
Mar 09 09:26:01 crc kubenswrapper[4792]: I0309 09:26:01.116103 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0f813716-691b-4f49-bbb6-e0486b2d2b31-operator-scripts\") pod \"0f813716-691b-4f49-bbb6-e0486b2d2b31\" (UID: \"0f813716-691b-4f49-bbb6-e0486b2d2b31\") "
Mar 09 09:26:01 crc kubenswrapper[4792]: I0309 09:26:01.116185 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/856fa1f5-ef4a-4122-9a28-cabfe353eaeb-operator-scripts\") pod \"856fa1f5-ef4a-4122-9a28-cabfe353eaeb\" (UID: \"856fa1f5-ef4a-4122-9a28-cabfe353eaeb\") "
Mar 09 09:26:01 crc kubenswrapper[4792]: I0309 09:26:01.116246 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tbjpp\" (UniqueName: \"kubernetes.io/projected/856fa1f5-ef4a-4122-9a28-cabfe353eaeb-kube-api-access-tbjpp\") pod \"856fa1f5-ef4a-4122-9a28-cabfe353eaeb\" (UID: \"856fa1f5-ef4a-4122-9a28-cabfe353eaeb\") "
Mar 09 09:26:01 crc kubenswrapper[4792]: I0309 09:26:01.125371 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f813716-691b-4f49-bbb6-e0486b2d2b31-kube-api-access-rdrgq" (OuterVolumeSpecName: "kube-api-access-rdrgq") pod "0f813716-691b-4f49-bbb6-e0486b2d2b31" (UID: "0f813716-691b-4f49-bbb6-e0486b2d2b31"). InnerVolumeSpecName "kube-api-access-rdrgq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:26:01 crc kubenswrapper[4792]: I0309 09:26:01.126822 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/856fa1f5-ef4a-4122-9a28-cabfe353eaeb-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "856fa1f5-ef4a-4122-9a28-cabfe353eaeb" (UID: "856fa1f5-ef4a-4122-9a28-cabfe353eaeb"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:26:01 crc kubenswrapper[4792]: I0309 09:26:01.128039 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/856fa1f5-ef4a-4122-9a28-cabfe353eaeb-kube-api-access-tbjpp" (OuterVolumeSpecName: "kube-api-access-tbjpp") pod "856fa1f5-ef4a-4122-9a28-cabfe353eaeb" (UID: "856fa1f5-ef4a-4122-9a28-cabfe353eaeb"). InnerVolumeSpecName "kube-api-access-tbjpp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:26:01 crc kubenswrapper[4792]: I0309 09:26:01.141275 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550806-qqlhx"]
Mar 09 09:26:01 crc kubenswrapper[4792]: I0309 09:26:01.145435 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0f813716-691b-4f49-bbb6-e0486b2d2b31-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0f813716-691b-4f49-bbb6-e0486b2d2b31" (UID: "0f813716-691b-4f49-bbb6-e0486b2d2b31"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:26:01 crc kubenswrapper[4792]: W0309 09:26:01.161646 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9dcbe3d4_bd2e_4a4b_864a_77b49c015354.slice/crio-96a48cbe0a554a0317290ba4427a4b299a5534f8020bda0ba6ef227154467092 WatchSource:0}: Error finding container 96a48cbe0a554a0317290ba4427a4b299a5534f8020bda0ba6ef227154467092: Status 404 returned error can't find the container with id 96a48cbe0a554a0317290ba4427a4b299a5534f8020bda0ba6ef227154467092
Mar 09 09:26:01 crc kubenswrapper[4792]: I0309 09:26:01.192885 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-71f9-account-create-update-5wltv"
Mar 09 09:26:01 crc kubenswrapper[4792]: I0309 09:26:01.203704 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-eaf0-account-create-update-b2b57"
Mar 09 09:26:01 crc kubenswrapper[4792]: I0309 09:26:01.217440 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vnbdq\" (UniqueName: \"kubernetes.io/projected/cd58f0df-aee5-44be-ae2a-33033696a043-kube-api-access-vnbdq\") pod \"cd58f0df-aee5-44be-ae2a-33033696a043\" (UID: \"cd58f0df-aee5-44be-ae2a-33033696a043\") "
Mar 09 09:26:01 crc kubenswrapper[4792]: I0309 09:26:01.217477 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0e9e2cd8-39ac-46f3-aef1-3c4a81b4a144-operator-scripts\") pod \"0e9e2cd8-39ac-46f3-aef1-3c4a81b4a144\" (UID: \"0e9e2cd8-39ac-46f3-aef1-3c4a81b4a144\") "
Mar 09 09:26:01 crc kubenswrapper[4792]: I0309 09:26:01.217536 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cd58f0df-aee5-44be-ae2a-33033696a043-operator-scripts\") pod \"cd58f0df-aee5-44be-ae2a-33033696a043\" (UID: \"cd58f0df-aee5-44be-ae2a-33033696a043\") "
Mar 09 09:26:01 crc kubenswrapper[4792]: I0309 09:26:01.217699 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5s2mg\" (UniqueName: \"kubernetes.io/projected/0e9e2cd8-39ac-46f3-aef1-3c4a81b4a144-kube-api-access-5s2mg\") pod \"0e9e2cd8-39ac-46f3-aef1-3c4a81b4a144\" (UID: \"0e9e2cd8-39ac-46f3-aef1-3c4a81b4a144\") "
Mar 09 09:26:01 crc kubenswrapper[4792]: I0309 09:26:01.218016 4792 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/856fa1f5-ef4a-4122-9a28-cabfe353eaeb-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 09 09:26:01 crc kubenswrapper[4792]: I0309 09:26:01.218034 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tbjpp\" (UniqueName: \"kubernetes.io/projected/856fa1f5-ef4a-4122-9a28-cabfe353eaeb-kube-api-access-tbjpp\") on node \"crc\" DevicePath \"\""
Mar 09 09:26:01 crc kubenswrapper[4792]: I0309 09:26:01.218044 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rdrgq\" (UniqueName: \"kubernetes.io/projected/0f813716-691b-4f49-bbb6-e0486b2d2b31-kube-api-access-rdrgq\") on node \"crc\" DevicePath \"\""
Mar 09 09:26:01 crc kubenswrapper[4792]: I0309 09:26:01.218054 4792 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0f813716-691b-4f49-bbb6-e0486b2d2b31-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 09 09:26:01 crc kubenswrapper[4792]: I0309 09:26:01.220666 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cd58f0df-aee5-44be-ae2a-33033696a043-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "cd58f0df-aee5-44be-ae2a-33033696a043" (UID: "cd58f0df-aee5-44be-ae2a-33033696a043"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:26:01 crc kubenswrapper[4792]: I0309 09:26:01.221062 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0e9e2cd8-39ac-46f3-aef1-3c4a81b4a144-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0e9e2cd8-39ac-46f3-aef1-3c4a81b4a144" (UID: "0e9e2cd8-39ac-46f3-aef1-3c4a81b4a144"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:26:01 crc kubenswrapper[4792]: I0309 09:26:01.240281 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd58f0df-aee5-44be-ae2a-33033696a043-kube-api-access-vnbdq" (OuterVolumeSpecName: "kube-api-access-vnbdq") pod "cd58f0df-aee5-44be-ae2a-33033696a043" (UID: "cd58f0df-aee5-44be-ae2a-33033696a043"). InnerVolumeSpecName "kube-api-access-vnbdq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:26:01 crc kubenswrapper[4792]: I0309 09:26:01.240364 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e9e2cd8-39ac-46f3-aef1-3c4a81b4a144-kube-api-access-5s2mg" (OuterVolumeSpecName: "kube-api-access-5s2mg") pod "0e9e2cd8-39ac-46f3-aef1-3c4a81b4a144" (UID: "0e9e2cd8-39ac-46f3-aef1-3c4a81b4a144"). InnerVolumeSpecName "kube-api-access-5s2mg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:26:01 crc kubenswrapper[4792]: I0309 09:26:01.319485 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5s2mg\" (UniqueName: \"kubernetes.io/projected/0e9e2cd8-39ac-46f3-aef1-3c4a81b4a144-kube-api-access-5s2mg\") on node \"crc\" DevicePath \"\""
Mar 09 09:26:01 crc kubenswrapper[4792]: I0309 09:26:01.319517 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vnbdq\" (UniqueName: \"kubernetes.io/projected/cd58f0df-aee5-44be-ae2a-33033696a043-kube-api-access-vnbdq\") on node \"crc\" DevicePath \"\""
Mar 09 09:26:01 crc kubenswrapper[4792]: I0309 09:26:01.319529 4792 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0e9e2cd8-39ac-46f3-aef1-3c4a81b4a144-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 09 09:26:01 crc kubenswrapper[4792]: I0309 09:26:01.319538 4792 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cd58f0df-aee5-44be-ae2a-33033696a043-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 09 09:26:01 crc kubenswrapper[4792]: I0309 09:26:01.564553 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-zzfgc"
Mar 09 09:26:01 crc kubenswrapper[4792]: I0309 09:26:01.565111 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-zzfgc" event={"ID":"856fa1f5-ef4a-4122-9a28-cabfe353eaeb","Type":"ContainerDied","Data":"9904115b11e246e3767df40133fe82c683f61d230e424780bf2a13e620c3030d"}
Mar 09 09:26:01 crc kubenswrapper[4792]: I0309 09:26:01.565145 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9904115b11e246e3767df40133fe82c683f61d230e424780bf2a13e620c3030d"
Mar 09 09:26:01 crc kubenswrapper[4792]: I0309 09:26:01.566443 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550806-qqlhx" event={"ID":"9dcbe3d4-bd2e-4a4b-864a-77b49c015354","Type":"ContainerStarted","Data":"96a48cbe0a554a0317290ba4427a4b299a5534f8020bda0ba6ef227154467092"}
Mar 09 09:26:01 crc kubenswrapper[4792]: I0309 09:26:01.568952 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-s8k8r" event={"ID":"0f813716-691b-4f49-bbb6-e0486b2d2b31","Type":"ContainerDied","Data":"678c961893220ddaeed11eb63646dccd9603b03907713dc83759f2e8d5ef6f92"}
Mar 09 09:26:01 crc kubenswrapper[4792]: I0309 09:26:01.568979 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="678c961893220ddaeed11eb63646dccd9603b03907713dc83759f2e8d5ef6f92"
Mar 09 09:26:01 crc kubenswrapper[4792]: I0309 09:26:01.569060 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-s8k8r"
Mar 09 09:26:01 crc kubenswrapper[4792]: I0309 09:26:01.572341 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-71f9-account-create-update-5wltv" event={"ID":"cd58f0df-aee5-44be-ae2a-33033696a043","Type":"ContainerDied","Data":"c9c1c13e07f0fb729210f4772760849dd828a08132c5915e665c752e9bf875d8"}
Mar 09 09:26:01 crc kubenswrapper[4792]: I0309 09:26:01.572502 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c9c1c13e07f0fb729210f4772760849dd828a08132c5915e665c752e9bf875d8"
Mar 09 09:26:01 crc kubenswrapper[4792]: I0309 09:26:01.572829 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-71f9-account-create-update-5wltv"
Mar 09 09:26:01 crc kubenswrapper[4792]: I0309 09:26:01.580395 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-eaf0-account-create-update-b2b57" event={"ID":"0e9e2cd8-39ac-46f3-aef1-3c4a81b4a144","Type":"ContainerDied","Data":"5b617ab93ed22b8d94a0ee960a0c716284f424edba73fb6ccc85378593b6696c"}
Mar 09 09:26:01 crc kubenswrapper[4792]: I0309 09:26:01.580441 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5b617ab93ed22b8d94a0ee960a0c716284f424edba73fb6ccc85378593b6696c"
Mar 09 09:26:01 crc kubenswrapper[4792]: I0309 09:26:01.580507 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-eaf0-account-create-update-b2b57"
Mar 09 09:26:01 crc kubenswrapper[4792]: I0309 09:26:01.675660 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6e714f5e-5cca-4397-8a64-244ecfad9907" path="/var/lib/kubelet/pods/6e714f5e-5cca-4397-8a64-244ecfad9907/volumes"
Mar 09 09:26:02 crc kubenswrapper[4792]: I0309 09:26:02.007667 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-kj9d8" podUID="438d928b-7565-4fe1-a005-2c6402835edf" containerName="ovn-controller" probeResult="failure" output=<
Mar 09 09:26:02 crc kubenswrapper[4792]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status
Mar 09 09:26:02 crc kubenswrapper[4792]: >
Mar 09 09:26:02 crc kubenswrapper[4792]: I0309 09:26:02.592127 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550806-qqlhx" event={"ID":"9dcbe3d4-bd2e-4a4b-864a-77b49c015354","Type":"ContainerStarted","Data":"1e30ee26135f2ea4c44b7eccafc3705bb1758ba9a51c12a37263b7f72efd60c8"}
Mar 09 09:26:02 crc kubenswrapper[4792]: I0309 09:26:02.611663 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29550806-qqlhx" podStartSLOduration=1.76407475 podStartE2EDuration="2.611646707s" podCreationTimestamp="2026-03-09 09:26:00 +0000 UTC" firstStartedPulling="2026-03-09 09:26:01.168283333 +0000 UTC m=+1126.198484085" lastFinishedPulling="2026-03-09 09:26:02.01585529 +0000 UTC m=+1127.046056042" observedRunningTime="2026-03-09 09:26:02.606614241 +0000 UTC m=+1127.636815033" watchObservedRunningTime="2026-03-09 09:26:02.611646707 +0000 UTC m=+1127.641847459"
Mar 09 09:26:03 crc kubenswrapper[4792]: I0309 09:26:03.600394 4792 generic.go:334] "Generic (PLEG): container finished" podID="9dcbe3d4-bd2e-4a4b-864a-77b49c015354" containerID="1e30ee26135f2ea4c44b7eccafc3705bb1758ba9a51c12a37263b7f72efd60c8" exitCode=0
Mar 09 09:26:03 crc kubenswrapper[4792]: I0309 09:26:03.600478 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550806-qqlhx" event={"ID":"9dcbe3d4-bd2e-4a4b-864a-77b49c015354","Type":"ContainerDied","Data":"1e30ee26135f2ea4c44b7eccafc3705bb1758ba9a51c12a37263b7f72efd60c8"}
Mar 09 09:26:04 crc kubenswrapper[4792]: I0309 09:26:04.608879 4792 generic.go:334] "Generic (PLEG): container finished" podID="0ee86e97-a22c-4089-9ce4-363cb0571173" containerID="dbd17cbb8b429cdcb0b12d986092a0771430752ae9708cfa2b6450eb12120d9f" exitCode=0
Mar 09 09:26:04 crc kubenswrapper[4792]: I0309 09:26:04.608982 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"0ee86e97-a22c-4089-9ce4-363cb0571173","Type":"ContainerDied","Data":"dbd17cbb8b429cdcb0b12d986092a0771430752ae9708cfa2b6450eb12120d9f"}
Mar 09 09:26:04 crc kubenswrapper[4792]: I0309 09:26:04.921559 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-zknkp"]
Mar 09 09:26:04 crc kubenswrapper[4792]: E0309 09:26:04.922349 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd58f0df-aee5-44be-ae2a-33033696a043" containerName="mariadb-account-create-update"
Mar 09 09:26:04 crc kubenswrapper[4792]: I0309 09:26:04.922449 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd58f0df-aee5-44be-ae2a-33033696a043" containerName="mariadb-account-create-update"
Mar 09 09:26:04 crc kubenswrapper[4792]: E0309 09:26:04.922640 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e9e2cd8-39ac-46f3-aef1-3c4a81b4a144" containerName="mariadb-account-create-update"
Mar 09 09:26:04 crc kubenswrapper[4792]: I0309 09:26:04.922723 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e9e2cd8-39ac-46f3-aef1-3c4a81b4a144" containerName="mariadb-account-create-update"
Mar 09 09:26:04 crc kubenswrapper[4792]: E0309 09:26:04.922844 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="856fa1f5-ef4a-4122-9a28-cabfe353eaeb" containerName="mariadb-database-create"
Mar 09 09:26:04 crc kubenswrapper[4792]: I0309 09:26:04.922923 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="856fa1f5-ef4a-4122-9a28-cabfe353eaeb" containerName="mariadb-database-create"
Mar 09 09:26:04 crc kubenswrapper[4792]: E0309 09:26:04.923004 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f813716-691b-4f49-bbb6-e0486b2d2b31" containerName="mariadb-database-create"
Mar 09 09:26:04 crc kubenswrapper[4792]: I0309 09:26:04.923130 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f813716-691b-4f49-bbb6-e0486b2d2b31" containerName="mariadb-database-create"
Mar 09 09:26:04 crc kubenswrapper[4792]: I0309 09:26:04.923498 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e9e2cd8-39ac-46f3-aef1-3c4a81b4a144" containerName="mariadb-account-create-update"
Mar 09 09:26:04 crc kubenswrapper[4792]: I0309 09:26:04.923597 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="856fa1f5-ef4a-4122-9a28-cabfe353eaeb" containerName="mariadb-database-create"
Mar 09 09:26:04 crc kubenswrapper[4792]: I0309 09:26:04.923699 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd58f0df-aee5-44be-ae2a-33033696a043" containerName="mariadb-account-create-update"
Mar 09 09:26:04 crc kubenswrapper[4792]: I0309 09:26:04.923780 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f813716-691b-4f49-bbb6-e0486b2d2b31" containerName="mariadb-database-create"
Mar 09 09:26:04 crc kubenswrapper[4792]: I0309 09:26:04.924456 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-zknkp"
Mar 09 09:26:04 crc kubenswrapper[4792]: I0309 09:26:04.929338 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret"
Mar 09 09:26:04 crc kubenswrapper[4792]: I0309 09:26:04.940966 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-zknkp"]
Mar 09 09:26:05 crc kubenswrapper[4792]: I0309 09:26:05.019084 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/baba995c-6694-4351-9a7d-3078f73581e8-operator-scripts\") pod \"root-account-create-update-zknkp\" (UID: \"baba995c-6694-4351-9a7d-3078f73581e8\") " pod="openstack/root-account-create-update-zknkp"
Mar 09 09:26:05 crc kubenswrapper[4792]: I0309 09:26:05.019131 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6lplh\" (UniqueName: \"kubernetes.io/projected/baba995c-6694-4351-9a7d-3078f73581e8-kube-api-access-6lplh\") pod \"root-account-create-update-zknkp\" (UID: \"baba995c-6694-4351-9a7d-3078f73581e8\") " pod="openstack/root-account-create-update-zknkp"
Mar 09 09:26:05 crc kubenswrapper[4792]: I0309 09:26:05.120958 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/baba995c-6694-4351-9a7d-3078f73581e8-operator-scripts\") pod \"root-account-create-update-zknkp\" (UID: \"baba995c-6694-4351-9a7d-3078f73581e8\") " pod="openstack/root-account-create-update-zknkp"
Mar 09 09:26:05 crc kubenswrapper[4792]: I0309 09:26:05.121007 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6lplh\" (UniqueName: \"kubernetes.io/projected/baba995c-6694-4351-9a7d-3078f73581e8-kube-api-access-6lplh\") pod \"root-account-create-update-zknkp\" (UID:
\"baba995c-6694-4351-9a7d-3078f73581e8\") " pod="openstack/root-account-create-update-zknkp" Mar 09 09:26:05 crc kubenswrapper[4792]: I0309 09:26:05.121848 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/baba995c-6694-4351-9a7d-3078f73581e8-operator-scripts\") pod \"root-account-create-update-zknkp\" (UID: \"baba995c-6694-4351-9a7d-3078f73581e8\") " pod="openstack/root-account-create-update-zknkp" Mar 09 09:26:05 crc kubenswrapper[4792]: I0309 09:26:05.141633 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6lplh\" (UniqueName: \"kubernetes.io/projected/baba995c-6694-4351-9a7d-3078f73581e8-kube-api-access-6lplh\") pod \"root-account-create-update-zknkp\" (UID: \"baba995c-6694-4351-9a7d-3078f73581e8\") " pod="openstack/root-account-create-update-zknkp" Mar 09 09:26:05 crc kubenswrapper[4792]: I0309 09:26:05.248241 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-zknkp" Mar 09 09:26:07 crc kubenswrapper[4792]: I0309 09:26:07.021914 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-kj9d8" podUID="438d928b-7565-4fe1-a005-2c6402835edf" containerName="ovn-controller" probeResult="failure" output=< Mar 09 09:26:07 crc kubenswrapper[4792]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Mar 09 09:26:07 crc kubenswrapper[4792]: > Mar 09 09:26:07 crc kubenswrapper[4792]: I0309 09:26:07.150741 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-gw65t" Mar 09 09:26:07 crc kubenswrapper[4792]: I0309 09:26:07.151922 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-gw65t" Mar 09 09:26:07 crc kubenswrapper[4792]: I0309 09:26:07.384659 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-kj9d8-config-lfvzz"] Mar 09 09:26:07 crc kubenswrapper[4792]: I0309 09:26:07.386018 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-kj9d8-config-lfvzz" Mar 09 09:26:07 crc kubenswrapper[4792]: I0309 09:26:07.392422 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Mar 09 09:26:07 crc kubenswrapper[4792]: I0309 09:26:07.395710 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-kj9d8-config-lfvzz"] Mar 09 09:26:07 crc kubenswrapper[4792]: I0309 09:26:07.481558 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/919656cd-c4d9-48d1-8189-c9ed706c450e-additional-scripts\") pod \"ovn-controller-kj9d8-config-lfvzz\" (UID: \"919656cd-c4d9-48d1-8189-c9ed706c450e\") " pod="openstack/ovn-controller-kj9d8-config-lfvzz" Mar 09 09:26:07 crc kubenswrapper[4792]: I0309 09:26:07.481619 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/919656cd-c4d9-48d1-8189-c9ed706c450e-var-log-ovn\") pod \"ovn-controller-kj9d8-config-lfvzz\" (UID: \"919656cd-c4d9-48d1-8189-c9ed706c450e\") " pod="openstack/ovn-controller-kj9d8-config-lfvzz" Mar 09 09:26:07 crc kubenswrapper[4792]: I0309 09:26:07.481659 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4rx28\" (UniqueName: \"kubernetes.io/projected/919656cd-c4d9-48d1-8189-c9ed706c450e-kube-api-access-4rx28\") pod \"ovn-controller-kj9d8-config-lfvzz\" (UID: \"919656cd-c4d9-48d1-8189-c9ed706c450e\") " pod="openstack/ovn-controller-kj9d8-config-lfvzz" Mar 09 09:26:07 crc kubenswrapper[4792]: I0309 09:26:07.481712 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/919656cd-c4d9-48d1-8189-c9ed706c450e-scripts\") pod \"ovn-controller-kj9d8-config-lfvzz\" (UID: 
\"919656cd-c4d9-48d1-8189-c9ed706c450e\") " pod="openstack/ovn-controller-kj9d8-config-lfvzz" Mar 09 09:26:07 crc kubenswrapper[4792]: I0309 09:26:07.481761 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/919656cd-c4d9-48d1-8189-c9ed706c450e-var-run\") pod \"ovn-controller-kj9d8-config-lfvzz\" (UID: \"919656cd-c4d9-48d1-8189-c9ed706c450e\") " pod="openstack/ovn-controller-kj9d8-config-lfvzz" Mar 09 09:26:07 crc kubenswrapper[4792]: I0309 09:26:07.481798 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/919656cd-c4d9-48d1-8189-c9ed706c450e-var-run-ovn\") pod \"ovn-controller-kj9d8-config-lfvzz\" (UID: \"919656cd-c4d9-48d1-8189-c9ed706c450e\") " pod="openstack/ovn-controller-kj9d8-config-lfvzz" Mar 09 09:26:07 crc kubenswrapper[4792]: I0309 09:26:07.583155 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/919656cd-c4d9-48d1-8189-c9ed706c450e-additional-scripts\") pod \"ovn-controller-kj9d8-config-lfvzz\" (UID: \"919656cd-c4d9-48d1-8189-c9ed706c450e\") " pod="openstack/ovn-controller-kj9d8-config-lfvzz" Mar 09 09:26:07 crc kubenswrapper[4792]: I0309 09:26:07.583209 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/919656cd-c4d9-48d1-8189-c9ed706c450e-var-log-ovn\") pod \"ovn-controller-kj9d8-config-lfvzz\" (UID: \"919656cd-c4d9-48d1-8189-c9ed706c450e\") " pod="openstack/ovn-controller-kj9d8-config-lfvzz" Mar 09 09:26:07 crc kubenswrapper[4792]: I0309 09:26:07.583248 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4rx28\" (UniqueName: \"kubernetes.io/projected/919656cd-c4d9-48d1-8189-c9ed706c450e-kube-api-access-4rx28\") pod 
\"ovn-controller-kj9d8-config-lfvzz\" (UID: \"919656cd-c4d9-48d1-8189-c9ed706c450e\") " pod="openstack/ovn-controller-kj9d8-config-lfvzz" Mar 09 09:26:07 crc kubenswrapper[4792]: I0309 09:26:07.583279 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/919656cd-c4d9-48d1-8189-c9ed706c450e-scripts\") pod \"ovn-controller-kj9d8-config-lfvzz\" (UID: \"919656cd-c4d9-48d1-8189-c9ed706c450e\") " pod="openstack/ovn-controller-kj9d8-config-lfvzz" Mar 09 09:26:07 crc kubenswrapper[4792]: I0309 09:26:07.583302 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/919656cd-c4d9-48d1-8189-c9ed706c450e-var-run\") pod \"ovn-controller-kj9d8-config-lfvzz\" (UID: \"919656cd-c4d9-48d1-8189-c9ed706c450e\") " pod="openstack/ovn-controller-kj9d8-config-lfvzz" Mar 09 09:26:07 crc kubenswrapper[4792]: I0309 09:26:07.583329 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/919656cd-c4d9-48d1-8189-c9ed706c450e-var-run-ovn\") pod \"ovn-controller-kj9d8-config-lfvzz\" (UID: \"919656cd-c4d9-48d1-8189-c9ed706c450e\") " pod="openstack/ovn-controller-kj9d8-config-lfvzz" Mar 09 09:26:07 crc kubenswrapper[4792]: I0309 09:26:07.583628 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/919656cd-c4d9-48d1-8189-c9ed706c450e-var-run-ovn\") pod \"ovn-controller-kj9d8-config-lfvzz\" (UID: \"919656cd-c4d9-48d1-8189-c9ed706c450e\") " pod="openstack/ovn-controller-kj9d8-config-lfvzz" Mar 09 09:26:07 crc kubenswrapper[4792]: I0309 09:26:07.584281 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/919656cd-c4d9-48d1-8189-c9ed706c450e-var-log-ovn\") pod \"ovn-controller-kj9d8-config-lfvzz\" (UID: 
\"919656cd-c4d9-48d1-8189-c9ed706c450e\") " pod="openstack/ovn-controller-kj9d8-config-lfvzz" Mar 09 09:26:07 crc kubenswrapper[4792]: I0309 09:26:07.584338 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/919656cd-c4d9-48d1-8189-c9ed706c450e-var-run\") pod \"ovn-controller-kj9d8-config-lfvzz\" (UID: \"919656cd-c4d9-48d1-8189-c9ed706c450e\") " pod="openstack/ovn-controller-kj9d8-config-lfvzz" Mar 09 09:26:07 crc kubenswrapper[4792]: I0309 09:26:07.584548 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/919656cd-c4d9-48d1-8189-c9ed706c450e-additional-scripts\") pod \"ovn-controller-kj9d8-config-lfvzz\" (UID: \"919656cd-c4d9-48d1-8189-c9ed706c450e\") " pod="openstack/ovn-controller-kj9d8-config-lfvzz" Mar 09 09:26:07 crc kubenswrapper[4792]: I0309 09:26:07.585698 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/919656cd-c4d9-48d1-8189-c9ed706c450e-scripts\") pod \"ovn-controller-kj9d8-config-lfvzz\" (UID: \"919656cd-c4d9-48d1-8189-c9ed706c450e\") " pod="openstack/ovn-controller-kj9d8-config-lfvzz" Mar 09 09:26:07 crc kubenswrapper[4792]: I0309 09:26:07.632006 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4rx28\" (UniqueName: \"kubernetes.io/projected/919656cd-c4d9-48d1-8189-c9ed706c450e-kube-api-access-4rx28\") pod \"ovn-controller-kj9d8-config-lfvzz\" (UID: \"919656cd-c4d9-48d1-8189-c9ed706c450e\") " pod="openstack/ovn-controller-kj9d8-config-lfvzz" Mar 09 09:26:07 crc kubenswrapper[4792]: I0309 09:26:07.720589 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-kj9d8-config-lfvzz" Mar 09 09:26:08 crc kubenswrapper[4792]: I0309 09:26:08.653880 4792 generic.go:334] "Generic (PLEG): container finished" podID="42b40fb0-d2c9-4bc2-a13f-4c099b244ced" containerID="12ef7a7568725de4169d980eaebeaae0632c46d8f4718c7352b6c167ad607668" exitCode=0 Mar 09 09:26:08 crc kubenswrapper[4792]: I0309 09:26:08.653922 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"42b40fb0-d2c9-4bc2-a13f-4c099b244ced","Type":"ContainerDied","Data":"12ef7a7568725de4169d980eaebeaae0632c46d8f4718c7352b6c167ad607668"} Mar 09 09:26:11 crc kubenswrapper[4792]: I0309 09:26:11.201767 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550806-qqlhx" Mar 09 09:26:11 crc kubenswrapper[4792]: I0309 09:26:11.363664 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k58zf\" (UniqueName: \"kubernetes.io/projected/9dcbe3d4-bd2e-4a4b-864a-77b49c015354-kube-api-access-k58zf\") pod \"9dcbe3d4-bd2e-4a4b-864a-77b49c015354\" (UID: \"9dcbe3d4-bd2e-4a4b-864a-77b49c015354\") " Mar 09 09:26:11 crc kubenswrapper[4792]: I0309 09:26:11.369965 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9dcbe3d4-bd2e-4a4b-864a-77b49c015354-kube-api-access-k58zf" (OuterVolumeSpecName: "kube-api-access-k58zf") pod "9dcbe3d4-bd2e-4a4b-864a-77b49c015354" (UID: "9dcbe3d4-bd2e-4a4b-864a-77b49c015354"). InnerVolumeSpecName "kube-api-access-k58zf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:26:11 crc kubenswrapper[4792]: I0309 09:26:11.467880 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k58zf\" (UniqueName: \"kubernetes.io/projected/9dcbe3d4-bd2e-4a4b-864a-77b49c015354-kube-api-access-k58zf\") on node \"crc\" DevicePath \"\"" Mar 09 09:26:11 crc kubenswrapper[4792]: I0309 09:26:11.527683 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-kj9d8-config-lfvzz"] Mar 09 09:26:11 crc kubenswrapper[4792]: I0309 09:26:11.559467 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-zknkp"] Mar 09 09:26:11 crc kubenswrapper[4792]: W0309 09:26:11.573265 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbaba995c_6694_4351_9a7d_3078f73581e8.slice/crio-f94fceb825477dc038b0a552e5fda72d58d566cdace12278f3a2034f0881ffdb WatchSource:0}: Error finding container f94fceb825477dc038b0a552e5fda72d58d566cdace12278f3a2034f0881ffdb: Status 404 returned error can't find the container with id f94fceb825477dc038b0a552e5fda72d58d566cdace12278f3a2034f0881ffdb Mar 09 09:26:11 crc kubenswrapper[4792]: I0309 09:26:11.678427 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-zknkp" event={"ID":"baba995c-6694-4351-9a7d-3078f73581e8","Type":"ContainerStarted","Data":"f94fceb825477dc038b0a552e5fda72d58d566cdace12278f3a2034f0881ffdb"} Mar 09 09:26:11 crc kubenswrapper[4792]: I0309 09:26:11.682698 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"42b40fb0-d2c9-4bc2-a13f-4c099b244ced","Type":"ContainerStarted","Data":"6147ea0cd9e780b17277a374eaf8973eb9f7669cf3ea52c694431c1760b967fb"} Mar 09 09:26:11 crc kubenswrapper[4792]: I0309 09:26:11.683757 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/rabbitmq-server-0" Mar 09 09:26:11 crc kubenswrapper[4792]: I0309 09:26:11.690389 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-kj9d8-config-lfvzz" event={"ID":"919656cd-c4d9-48d1-8189-c9ed706c450e","Type":"ContainerStarted","Data":"bca2319269cad335d0780f89efab13f1c4387efb8bd528f7753553dde163c7b1"} Mar 09 09:26:11 crc kubenswrapper[4792]: I0309 09:26:11.694634 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550806-qqlhx" event={"ID":"9dcbe3d4-bd2e-4a4b-864a-77b49c015354","Type":"ContainerDied","Data":"96a48cbe0a554a0317290ba4427a4b299a5534f8020bda0ba6ef227154467092"} Mar 09 09:26:11 crc kubenswrapper[4792]: I0309 09:26:11.694672 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="96a48cbe0a554a0317290ba4427a4b299a5534f8020bda0ba6ef227154467092" Mar 09 09:26:11 crc kubenswrapper[4792]: I0309 09:26:11.694737 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550806-qqlhx" Mar 09 09:26:11 crc kubenswrapper[4792]: I0309 09:26:11.699310 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"0ee86e97-a22c-4089-9ce4-363cb0571173","Type":"ContainerStarted","Data":"de75c85de9aabbef9dad206b8b8770b3869d0bbae34d43ac2ac1e1373e6245fe"} Mar 09 09:26:11 crc kubenswrapper[4792]: I0309 09:26:11.700437 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Mar 09 09:26:11 crc kubenswrapper[4792]: I0309 09:26:11.710450 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=-9223371957.144344 podStartE2EDuration="1m19.710431583s" podCreationTimestamp="2026-03-09 09:24:52 +0000 UTC" firstStartedPulling="2026-03-09 09:24:54.922182213 +0000 UTC m=+1059.952382965" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:26:11.710109584 +0000 UTC m=+1136.740310346" watchObservedRunningTime="2026-03-09 09:26:11.710431583 +0000 UTC m=+1136.740632335" Mar 09 09:26:11 crc kubenswrapper[4792]: I0309 09:26:11.758872 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=44.08958213 podStartE2EDuration="1m19.758851569s" podCreationTimestamp="2026-03-09 09:24:52 +0000 UTC" firstStartedPulling="2026-03-09 09:24:54.457504189 +0000 UTC m=+1059.487704941" lastFinishedPulling="2026-03-09 09:25:30.126773628 +0000 UTC m=+1095.156974380" observedRunningTime="2026-03-09 09:26:11.746871201 +0000 UTC m=+1136.777071963" watchObservedRunningTime="2026-03-09 09:26:11.758851569 +0000 UTC m=+1136.789052331" Mar 09 09:26:12 crc kubenswrapper[4792]: I0309 09:26:12.021531 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-kj9d8" Mar 09 09:26:12 crc kubenswrapper[4792]: 
I0309 09:26:12.283267 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550800-99fr8"] Mar 09 09:26:12 crc kubenswrapper[4792]: I0309 09:26:12.290105 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550800-99fr8"] Mar 09 09:26:12 crc kubenswrapper[4792]: I0309 09:26:12.707101 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-tknqj" event={"ID":"efc711fe-a152-4902-af60-09a6bed9344a","Type":"ContainerStarted","Data":"2e3d6b7392ac5cc440f301bb1dfbdd7f36a6534e039d0779198434011c366841"} Mar 09 09:26:12 crc kubenswrapper[4792]: I0309 09:26:12.709059 4792 generic.go:334] "Generic (PLEG): container finished" podID="919656cd-c4d9-48d1-8189-c9ed706c450e" containerID="e96c99b444d6d824d7e4089cdf64b274e71d120f365bbbd16da758c9854e606d" exitCode=0 Mar 09 09:26:12 crc kubenswrapper[4792]: I0309 09:26:12.709124 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-kj9d8-config-lfvzz" event={"ID":"919656cd-c4d9-48d1-8189-c9ed706c450e","Type":"ContainerDied","Data":"e96c99b444d6d824d7e4089cdf64b274e71d120f365bbbd16da758c9854e606d"} Mar 09 09:26:12 crc kubenswrapper[4792]: I0309 09:26:12.710449 4792 generic.go:334] "Generic (PLEG): container finished" podID="baba995c-6694-4351-9a7d-3078f73581e8" containerID="1626f0438cbc7e0d86aae5e1aa7304448648a0c8624c8d7f1a59e46cc1c74e68" exitCode=0 Mar 09 09:26:12 crc kubenswrapper[4792]: I0309 09:26:12.711188 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-zknkp" event={"ID":"baba995c-6694-4351-9a7d-3078f73581e8","Type":"ContainerDied","Data":"1626f0438cbc7e0d86aae5e1aa7304448648a0c8624c8d7f1a59e46cc1c74e68"} Mar 09 09:26:12 crc kubenswrapper[4792]: I0309 09:26:12.728530 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-tknqj" podStartSLOduration=2.856680295 podStartE2EDuration="15.72850999s" 
podCreationTimestamp="2026-03-09 09:25:57 +0000 UTC" firstStartedPulling="2026-03-09 09:25:58.226287771 +0000 UTC m=+1123.256488523" lastFinishedPulling="2026-03-09 09:26:11.098117466 +0000 UTC m=+1136.128318218" observedRunningTime="2026-03-09 09:26:12.72335711 +0000 UTC m=+1137.753557862" watchObservedRunningTime="2026-03-09 09:26:12.72850999 +0000 UTC m=+1137.758710742" Mar 09 09:26:13 crc kubenswrapper[4792]: I0309 09:26:13.673988 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1058fa4d-1b24-4f5a-b99b-0f4b2b6f041d" path="/var/lib/kubelet/pods/1058fa4d-1b24-4f5a-b99b-0f4b2b6f041d/volumes" Mar 09 09:26:14 crc kubenswrapper[4792]: I0309 09:26:14.131619 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-zknkp" Mar 09 09:26:14 crc kubenswrapper[4792]: I0309 09:26:14.138304 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-kj9d8-config-lfvzz" Mar 09 09:26:14 crc kubenswrapper[4792]: I0309 09:26:14.217355 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/919656cd-c4d9-48d1-8189-c9ed706c450e-var-run\") pod \"919656cd-c4d9-48d1-8189-c9ed706c450e\" (UID: \"919656cd-c4d9-48d1-8189-c9ed706c450e\") " Mar 09 09:26:14 crc kubenswrapper[4792]: I0309 09:26:14.217452 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/919656cd-c4d9-48d1-8189-c9ed706c450e-var-log-ovn\") pod \"919656cd-c4d9-48d1-8189-c9ed706c450e\" (UID: \"919656cd-c4d9-48d1-8189-c9ed706c450e\") " Mar 09 09:26:14 crc kubenswrapper[4792]: I0309 09:26:14.217448 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/919656cd-c4d9-48d1-8189-c9ed706c450e-var-run" (OuterVolumeSpecName: "var-run") pod "919656cd-c4d9-48d1-8189-c9ed706c450e" (UID: 
"919656cd-c4d9-48d1-8189-c9ed706c450e"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 09:26:14 crc kubenswrapper[4792]: I0309 09:26:14.217484 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/919656cd-c4d9-48d1-8189-c9ed706c450e-additional-scripts\") pod \"919656cd-c4d9-48d1-8189-c9ed706c450e\" (UID: \"919656cd-c4d9-48d1-8189-c9ed706c450e\") " Mar 09 09:26:14 crc kubenswrapper[4792]: I0309 09:26:14.217505 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6lplh\" (UniqueName: \"kubernetes.io/projected/baba995c-6694-4351-9a7d-3078f73581e8-kube-api-access-6lplh\") pod \"baba995c-6694-4351-9a7d-3078f73581e8\" (UID: \"baba995c-6694-4351-9a7d-3078f73581e8\") " Mar 09 09:26:14 crc kubenswrapper[4792]: I0309 09:26:14.217511 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/919656cd-c4d9-48d1-8189-c9ed706c450e-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "919656cd-c4d9-48d1-8189-c9ed706c450e" (UID: "919656cd-c4d9-48d1-8189-c9ed706c450e"). InnerVolumeSpecName "var-log-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 09:26:14 crc kubenswrapper[4792]: I0309 09:26:14.217577 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/919656cd-c4d9-48d1-8189-c9ed706c450e-scripts\") pod \"919656cd-c4d9-48d1-8189-c9ed706c450e\" (UID: \"919656cd-c4d9-48d1-8189-c9ed706c450e\") " Mar 09 09:26:14 crc kubenswrapper[4792]: I0309 09:26:14.217599 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4rx28\" (UniqueName: \"kubernetes.io/projected/919656cd-c4d9-48d1-8189-c9ed706c450e-kube-api-access-4rx28\") pod \"919656cd-c4d9-48d1-8189-c9ed706c450e\" (UID: \"919656cd-c4d9-48d1-8189-c9ed706c450e\") " Mar 09 09:26:14 crc kubenswrapper[4792]: I0309 09:26:14.217622 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/baba995c-6694-4351-9a7d-3078f73581e8-operator-scripts\") pod \"baba995c-6694-4351-9a7d-3078f73581e8\" (UID: \"baba995c-6694-4351-9a7d-3078f73581e8\") " Mar 09 09:26:14 crc kubenswrapper[4792]: I0309 09:26:14.217690 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/919656cd-c4d9-48d1-8189-c9ed706c450e-var-run-ovn\") pod \"919656cd-c4d9-48d1-8189-c9ed706c450e\" (UID: \"919656cd-c4d9-48d1-8189-c9ed706c450e\") " Mar 09 09:26:14 crc kubenswrapper[4792]: I0309 09:26:14.217968 4792 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/919656cd-c4d9-48d1-8189-c9ed706c450e-var-run\") on node \"crc\" DevicePath \"\"" Mar 09 09:26:14 crc kubenswrapper[4792]: I0309 09:26:14.217981 4792 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/919656cd-c4d9-48d1-8189-c9ed706c450e-var-log-ovn\") on node \"crc\" DevicePath \"\"" Mar 09 
09:26:14 crc kubenswrapper[4792]: I0309 09:26:14.218037 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/919656cd-c4d9-48d1-8189-c9ed706c450e-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "919656cd-c4d9-48d1-8189-c9ed706c450e" (UID: "919656cd-c4d9-48d1-8189-c9ed706c450e"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 09:26:14 crc kubenswrapper[4792]: I0309 09:26:14.218280 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/919656cd-c4d9-48d1-8189-c9ed706c450e-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "919656cd-c4d9-48d1-8189-c9ed706c450e" (UID: "919656cd-c4d9-48d1-8189-c9ed706c450e"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:26:14 crc kubenswrapper[4792]: I0309 09:26:14.218951 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/baba995c-6694-4351-9a7d-3078f73581e8-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "baba995c-6694-4351-9a7d-3078f73581e8" (UID: "baba995c-6694-4351-9a7d-3078f73581e8"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:26:14 crc kubenswrapper[4792]: I0309 09:26:14.219253 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/919656cd-c4d9-48d1-8189-c9ed706c450e-scripts" (OuterVolumeSpecName: "scripts") pod "919656cd-c4d9-48d1-8189-c9ed706c450e" (UID: "919656cd-c4d9-48d1-8189-c9ed706c450e"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:26:14 crc kubenswrapper[4792]: I0309 09:26:14.224739 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/919656cd-c4d9-48d1-8189-c9ed706c450e-kube-api-access-4rx28" (OuterVolumeSpecName: "kube-api-access-4rx28") pod "919656cd-c4d9-48d1-8189-c9ed706c450e" (UID: "919656cd-c4d9-48d1-8189-c9ed706c450e"). InnerVolumeSpecName "kube-api-access-4rx28". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:26:14 crc kubenswrapper[4792]: I0309 09:26:14.243791 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/baba995c-6694-4351-9a7d-3078f73581e8-kube-api-access-6lplh" (OuterVolumeSpecName: "kube-api-access-6lplh") pod "baba995c-6694-4351-9a7d-3078f73581e8" (UID: "baba995c-6694-4351-9a7d-3078f73581e8"). InnerVolumeSpecName "kube-api-access-6lplh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:26:14 crc kubenswrapper[4792]: I0309 09:26:14.321233 4792 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/919656cd-c4d9-48d1-8189-c9ed706c450e-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 09:26:14 crc kubenswrapper[4792]: I0309 09:26:14.321271 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4rx28\" (UniqueName: \"kubernetes.io/projected/919656cd-c4d9-48d1-8189-c9ed706c450e-kube-api-access-4rx28\") on node \"crc\" DevicePath \"\"" Mar 09 09:26:14 crc kubenswrapper[4792]: I0309 09:26:14.321285 4792 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/baba995c-6694-4351-9a7d-3078f73581e8-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 09:26:14 crc kubenswrapper[4792]: I0309 09:26:14.321294 4792 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/919656cd-c4d9-48d1-8189-c9ed706c450e-var-run-ovn\") on node \"crc\" DevicePath \"\""
Mar 09 09:26:14 crc kubenswrapper[4792]: I0309 09:26:14.321303 4792 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/919656cd-c4d9-48d1-8189-c9ed706c450e-additional-scripts\") on node \"crc\" DevicePath \"\""
Mar 09 09:26:14 crc kubenswrapper[4792]: I0309 09:26:14.321313 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6lplh\" (UniqueName: \"kubernetes.io/projected/baba995c-6694-4351-9a7d-3078f73581e8-kube-api-access-6lplh\") on node \"crc\" DevicePath \"\""
Mar 09 09:26:14 crc kubenswrapper[4792]: I0309 09:26:14.729712 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-zknkp"
Mar 09 09:26:14 crc kubenswrapper[4792]: I0309 09:26:14.729917 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-zknkp" event={"ID":"baba995c-6694-4351-9a7d-3078f73581e8","Type":"ContainerDied","Data":"f94fceb825477dc038b0a552e5fda72d58d566cdace12278f3a2034f0881ffdb"}
Mar 09 09:26:14 crc kubenswrapper[4792]: I0309 09:26:14.729961 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f94fceb825477dc038b0a552e5fda72d58d566cdace12278f3a2034f0881ffdb"
Mar 09 09:26:14 crc kubenswrapper[4792]: I0309 09:26:14.731396 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-kj9d8-config-lfvzz" event={"ID":"919656cd-c4d9-48d1-8189-c9ed706c450e","Type":"ContainerDied","Data":"bca2319269cad335d0780f89efab13f1c4387efb8bd528f7753553dde163c7b1"}
Mar 09 09:26:14 crc kubenswrapper[4792]: I0309 09:26:14.731434 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bca2319269cad335d0780f89efab13f1c4387efb8bd528f7753553dde163c7b1"
Mar 09 09:26:14 crc kubenswrapper[4792]: I0309 09:26:14.731464 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-kj9d8-config-lfvzz"
Mar 09 09:26:15 crc kubenswrapper[4792]: I0309 09:26:15.242631 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-kj9d8-config-lfvzz"]
Mar 09 09:26:15 crc kubenswrapper[4792]: I0309 09:26:15.249604 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-kj9d8-config-lfvzz"]
Mar 09 09:26:15 crc kubenswrapper[4792]: I0309 09:26:15.673846 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="919656cd-c4d9-48d1-8189-c9ed706c450e" path="/var/lib/kubelet/pods/919656cd-c4d9-48d1-8189-c9ed706c450e/volumes"
Mar 09 09:26:19 crc kubenswrapper[4792]: I0309 09:26:19.769925 4792 generic.go:334] "Generic (PLEG): container finished" podID="efc711fe-a152-4902-af60-09a6bed9344a" containerID="2e3d6b7392ac5cc440f301bb1dfbdd7f36a6534e039d0779198434011c366841" exitCode=0
Mar 09 09:26:19 crc kubenswrapper[4792]: I0309 09:26:19.770021 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-tknqj" event={"ID":"efc711fe-a152-4902-af60-09a6bed9344a","Type":"ContainerDied","Data":"2e3d6b7392ac5cc440f301bb1dfbdd7f36a6534e039d0779198434011c366841"}
Mar 09 09:26:21 crc kubenswrapper[4792]: I0309 09:26:21.210249 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-tknqj"
Mar 09 09:26:21 crc kubenswrapper[4792]: I0309 09:26:21.327802 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/efc711fe-a152-4902-af60-09a6bed9344a-db-sync-config-data\") pod \"efc711fe-a152-4902-af60-09a6bed9344a\" (UID: \"efc711fe-a152-4902-af60-09a6bed9344a\") "
Mar 09 09:26:21 crc kubenswrapper[4792]: I0309 09:26:21.328209 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efc711fe-a152-4902-af60-09a6bed9344a-combined-ca-bundle\") pod \"efc711fe-a152-4902-af60-09a6bed9344a\" (UID: \"efc711fe-a152-4902-af60-09a6bed9344a\") "
Mar 09 09:26:21 crc kubenswrapper[4792]: I0309 09:26:21.328307 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/efc711fe-a152-4902-af60-09a6bed9344a-config-data\") pod \"efc711fe-a152-4902-af60-09a6bed9344a\" (UID: \"efc711fe-a152-4902-af60-09a6bed9344a\") "
Mar 09 09:26:21 crc kubenswrapper[4792]: I0309 09:26:21.328447 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wjw6p\" (UniqueName: \"kubernetes.io/projected/efc711fe-a152-4902-af60-09a6bed9344a-kube-api-access-wjw6p\") pod \"efc711fe-a152-4902-af60-09a6bed9344a\" (UID: \"efc711fe-a152-4902-af60-09a6bed9344a\") "
Mar 09 09:26:21 crc kubenswrapper[4792]: I0309 09:26:21.336857 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efc711fe-a152-4902-af60-09a6bed9344a-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "efc711fe-a152-4902-af60-09a6bed9344a" (UID: "efc711fe-a152-4902-af60-09a6bed9344a"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:26:21 crc kubenswrapper[4792]: I0309 09:26:21.343334 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efc711fe-a152-4902-af60-09a6bed9344a-kube-api-access-wjw6p" (OuterVolumeSpecName: "kube-api-access-wjw6p") pod "efc711fe-a152-4902-af60-09a6bed9344a" (UID: "efc711fe-a152-4902-af60-09a6bed9344a"). InnerVolumeSpecName "kube-api-access-wjw6p". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:26:21 crc kubenswrapper[4792]: I0309 09:26:21.365360 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efc711fe-a152-4902-af60-09a6bed9344a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "efc711fe-a152-4902-af60-09a6bed9344a" (UID: "efc711fe-a152-4902-af60-09a6bed9344a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:26:21 crc kubenswrapper[4792]: I0309 09:26:21.382368 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efc711fe-a152-4902-af60-09a6bed9344a-config-data" (OuterVolumeSpecName: "config-data") pod "efc711fe-a152-4902-af60-09a6bed9344a" (UID: "efc711fe-a152-4902-af60-09a6bed9344a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:26:21 crc kubenswrapper[4792]: I0309 09:26:21.429891 4792 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/efc711fe-a152-4902-af60-09a6bed9344a-db-sync-config-data\") on node \"crc\" DevicePath \"\""
Mar 09 09:26:21 crc kubenswrapper[4792]: I0309 09:26:21.429941 4792 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efc711fe-a152-4902-af60-09a6bed9344a-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 09 09:26:21 crc kubenswrapper[4792]: I0309 09:26:21.429956 4792 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/efc711fe-a152-4902-af60-09a6bed9344a-config-data\") on node \"crc\" DevicePath \"\""
Mar 09 09:26:21 crc kubenswrapper[4792]: I0309 09:26:21.429969 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wjw6p\" (UniqueName: \"kubernetes.io/projected/efc711fe-a152-4902-af60-09a6bed9344a-kube-api-access-wjw6p\") on node \"crc\" DevicePath \"\""
Mar 09 09:26:21 crc kubenswrapper[4792]: I0309 09:26:21.789760 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-tknqj" event={"ID":"efc711fe-a152-4902-af60-09a6bed9344a","Type":"ContainerDied","Data":"5d945ad85f932e4efa5ef1bf405013d1afd8579f0a5d3648277e5a832c91fd79"}
Mar 09 09:26:21 crc kubenswrapper[4792]: I0309 09:26:21.790095 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5d945ad85f932e4efa5ef1bf405013d1afd8579f0a5d3648277e5a832c91fd79"
Mar 09 09:26:21 crc kubenswrapper[4792]: I0309 09:26:21.789801 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-tknqj"
Mar 09 09:26:22 crc kubenswrapper[4792]: I0309 09:26:22.319814 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6958f867f9-hkbdp"]
Mar 09 09:26:22 crc kubenswrapper[4792]: E0309 09:26:22.333246 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="919656cd-c4d9-48d1-8189-c9ed706c450e" containerName="ovn-config"
Mar 09 09:26:22 crc kubenswrapper[4792]: I0309 09:26:22.333283 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="919656cd-c4d9-48d1-8189-c9ed706c450e" containerName="ovn-config"
Mar 09 09:26:22 crc kubenswrapper[4792]: E0309 09:26:22.333309 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="baba995c-6694-4351-9a7d-3078f73581e8" containerName="mariadb-account-create-update"
Mar 09 09:26:22 crc kubenswrapper[4792]: I0309 09:26:22.333319 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="baba995c-6694-4351-9a7d-3078f73581e8" containerName="mariadb-account-create-update"
Mar 09 09:26:22 crc kubenswrapper[4792]: E0309 09:26:22.333353 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9dcbe3d4-bd2e-4a4b-864a-77b49c015354" containerName="oc"
Mar 09 09:26:22 crc kubenswrapper[4792]: I0309 09:26:22.333360 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="9dcbe3d4-bd2e-4a4b-864a-77b49c015354" containerName="oc"
Mar 09 09:26:22 crc kubenswrapper[4792]: E0309 09:26:22.333373 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="efc711fe-a152-4902-af60-09a6bed9344a" containerName="glance-db-sync"
Mar 09 09:26:22 crc kubenswrapper[4792]: I0309 09:26:22.333380 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="efc711fe-a152-4902-af60-09a6bed9344a" containerName="glance-db-sync"
Mar 09 09:26:22 crc kubenswrapper[4792]: I0309 09:26:22.333626 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="baba995c-6694-4351-9a7d-3078f73581e8" containerName="mariadb-account-create-update"
Mar 09 09:26:22 crc kubenswrapper[4792]: I0309 09:26:22.333643 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="9dcbe3d4-bd2e-4a4b-864a-77b49c015354" containerName="oc"
Mar 09 09:26:22 crc kubenswrapper[4792]: I0309 09:26:22.333653 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="efc711fe-a152-4902-af60-09a6bed9344a" containerName="glance-db-sync"
Mar 09 09:26:22 crc kubenswrapper[4792]: I0309 09:26:22.333661 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="919656cd-c4d9-48d1-8189-c9ed706c450e" containerName="ovn-config"
Mar 09 09:26:22 crc kubenswrapper[4792]: I0309 09:26:22.334528 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6958f867f9-hkbdp"
Mar 09 09:26:22 crc kubenswrapper[4792]: I0309 09:26:22.342601 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6958f867f9-hkbdp"]
Mar 09 09:26:22 crc kubenswrapper[4792]: I0309 09:26:22.450566 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/62d7f343-f742-4613-83b6-a4baf01d94c7-ovsdbserver-nb\") pod \"dnsmasq-dns-6958f867f9-hkbdp\" (UID: \"62d7f343-f742-4613-83b6-a4baf01d94c7\") " pod="openstack/dnsmasq-dns-6958f867f9-hkbdp"
Mar 09 09:26:22 crc kubenswrapper[4792]: I0309 09:26:22.450915 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/62d7f343-f742-4613-83b6-a4baf01d94c7-config\") pod \"dnsmasq-dns-6958f867f9-hkbdp\" (UID: \"62d7f343-f742-4613-83b6-a4baf01d94c7\") " pod="openstack/dnsmasq-dns-6958f867f9-hkbdp"
Mar 09 09:26:22 crc kubenswrapper[4792]: I0309 09:26:22.450945 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/62d7f343-f742-4613-83b6-a4baf01d94c7-ovsdbserver-sb\") pod \"dnsmasq-dns-6958f867f9-hkbdp\" (UID: \"62d7f343-f742-4613-83b6-a4baf01d94c7\") " pod="openstack/dnsmasq-dns-6958f867f9-hkbdp"
Mar 09 09:26:22 crc kubenswrapper[4792]: I0309 09:26:22.450981 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/62d7f343-f742-4613-83b6-a4baf01d94c7-dns-svc\") pod \"dnsmasq-dns-6958f867f9-hkbdp\" (UID: \"62d7f343-f742-4613-83b6-a4baf01d94c7\") " pod="openstack/dnsmasq-dns-6958f867f9-hkbdp"
Mar 09 09:26:22 crc kubenswrapper[4792]: I0309 09:26:22.450998 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vh2tt\" (UniqueName: \"kubernetes.io/projected/62d7f343-f742-4613-83b6-a4baf01d94c7-kube-api-access-vh2tt\") pod \"dnsmasq-dns-6958f867f9-hkbdp\" (UID: \"62d7f343-f742-4613-83b6-a4baf01d94c7\") " pod="openstack/dnsmasq-dns-6958f867f9-hkbdp"
Mar 09 09:26:22 crc kubenswrapper[4792]: I0309 09:26:22.552674 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/62d7f343-f742-4613-83b6-a4baf01d94c7-ovsdbserver-sb\") pod \"dnsmasq-dns-6958f867f9-hkbdp\" (UID: \"62d7f343-f742-4613-83b6-a4baf01d94c7\") " pod="openstack/dnsmasq-dns-6958f867f9-hkbdp"
Mar 09 09:26:22 crc kubenswrapper[4792]: I0309 09:26:22.552758 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/62d7f343-f742-4613-83b6-a4baf01d94c7-dns-svc\") pod \"dnsmasq-dns-6958f867f9-hkbdp\" (UID: \"62d7f343-f742-4613-83b6-a4baf01d94c7\") " pod="openstack/dnsmasq-dns-6958f867f9-hkbdp"
Mar 09 09:26:22 crc kubenswrapper[4792]: I0309 09:26:22.552778 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vh2tt\" (UniqueName: \"kubernetes.io/projected/62d7f343-f742-4613-83b6-a4baf01d94c7-kube-api-access-vh2tt\") pod \"dnsmasq-dns-6958f867f9-hkbdp\" (UID: \"62d7f343-f742-4613-83b6-a4baf01d94c7\") " pod="openstack/dnsmasq-dns-6958f867f9-hkbdp"
Mar 09 09:26:22 crc kubenswrapper[4792]: I0309 09:26:22.552917 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/62d7f343-f742-4613-83b6-a4baf01d94c7-ovsdbserver-nb\") pod \"dnsmasq-dns-6958f867f9-hkbdp\" (UID: \"62d7f343-f742-4613-83b6-a4baf01d94c7\") " pod="openstack/dnsmasq-dns-6958f867f9-hkbdp"
Mar 09 09:26:22 crc kubenswrapper[4792]: I0309 09:26:22.552945 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/62d7f343-f742-4613-83b6-a4baf01d94c7-config\") pod \"dnsmasq-dns-6958f867f9-hkbdp\" (UID: \"62d7f343-f742-4613-83b6-a4baf01d94c7\") " pod="openstack/dnsmasq-dns-6958f867f9-hkbdp"
Mar 09 09:26:22 crc kubenswrapper[4792]: I0309 09:26:22.553735 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/62d7f343-f742-4613-83b6-a4baf01d94c7-dns-svc\") pod \"dnsmasq-dns-6958f867f9-hkbdp\" (UID: \"62d7f343-f742-4613-83b6-a4baf01d94c7\") " pod="openstack/dnsmasq-dns-6958f867f9-hkbdp"
Mar 09 09:26:22 crc kubenswrapper[4792]: I0309 09:26:22.553742 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/62d7f343-f742-4613-83b6-a4baf01d94c7-config\") pod \"dnsmasq-dns-6958f867f9-hkbdp\" (UID: \"62d7f343-f742-4613-83b6-a4baf01d94c7\") " pod="openstack/dnsmasq-dns-6958f867f9-hkbdp"
Mar 09 09:26:22 crc kubenswrapper[4792]: I0309 09:26:22.553743 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/62d7f343-f742-4613-83b6-a4baf01d94c7-ovsdbserver-sb\") pod \"dnsmasq-dns-6958f867f9-hkbdp\" (UID: \"62d7f343-f742-4613-83b6-a4baf01d94c7\") " pod="openstack/dnsmasq-dns-6958f867f9-hkbdp"
Mar 09 09:26:22 crc kubenswrapper[4792]: I0309 09:26:22.554319 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/62d7f343-f742-4613-83b6-a4baf01d94c7-ovsdbserver-nb\") pod \"dnsmasq-dns-6958f867f9-hkbdp\" (UID: \"62d7f343-f742-4613-83b6-a4baf01d94c7\") " pod="openstack/dnsmasq-dns-6958f867f9-hkbdp"
Mar 09 09:26:22 crc kubenswrapper[4792]: I0309 09:26:22.577945 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vh2tt\" (UniqueName: \"kubernetes.io/projected/62d7f343-f742-4613-83b6-a4baf01d94c7-kube-api-access-vh2tt\") pod \"dnsmasq-dns-6958f867f9-hkbdp\" (UID: \"62d7f343-f742-4613-83b6-a4baf01d94c7\") " pod="openstack/dnsmasq-dns-6958f867f9-hkbdp"
Mar 09 09:26:22 crc kubenswrapper[4792]: I0309 09:26:22.651789 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6958f867f9-hkbdp"
Mar 09 09:26:23 crc kubenswrapper[4792]: I0309 09:26:23.136748 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6958f867f9-hkbdp"]
Mar 09 09:26:23 crc kubenswrapper[4792]: I0309 09:26:23.823500 4792 generic.go:334] "Generic (PLEG): container finished" podID="62d7f343-f742-4613-83b6-a4baf01d94c7" containerID="a493238163ffaeefdd24f981531859f3b8d3597f9c8b7c54ab8347b3cd1ec8d8" exitCode=0
Mar 09 09:26:23 crc kubenswrapper[4792]: I0309 09:26:23.823682 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6958f867f9-hkbdp" event={"ID":"62d7f343-f742-4613-83b6-a4baf01d94c7","Type":"ContainerDied","Data":"a493238163ffaeefdd24f981531859f3b8d3597f9c8b7c54ab8347b3cd1ec8d8"}
Mar 09 09:26:23 crc kubenswrapper[4792]: I0309 09:26:23.824036 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6958f867f9-hkbdp" event={"ID":"62d7f343-f742-4613-83b6-a4baf01d94c7","Type":"ContainerStarted","Data":"1dcdb92679fdbc1f97a33033454be0d821d3b07fe5b3eff2a75bda68e2fb055e"}
Mar 09 09:26:23 crc kubenswrapper[4792]: I0309 09:26:23.844361 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0"
Mar 09 09:26:24 crc kubenswrapper[4792]: I0309 09:26:24.278388 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0"
Mar 09 09:26:24 crc kubenswrapper[4792]: I0309 09:26:24.830950 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6958f867f9-hkbdp" event={"ID":"62d7f343-f742-4613-83b6-a4baf01d94c7","Type":"ContainerStarted","Data":"c38b09a235359396f3b9abbafbb9d101aa2bc7a0898468ee23e944a8679c30e0"}
Mar 09 09:26:24 crc kubenswrapper[4792]: I0309 09:26:24.832000 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6958f867f9-hkbdp"
Mar 09 09:26:24 crc kubenswrapper[4792]: I0309 09:26:24.849312 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6958f867f9-hkbdp" podStartSLOduration=2.849294819 podStartE2EDuration="2.849294819s" podCreationTimestamp="2026-03-09 09:26:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:26:24.848216008 +0000 UTC m=+1149.878416790" watchObservedRunningTime="2026-03-09 09:26:24.849294819 +0000 UTC m=+1149.879495571"
Mar 09 09:26:26 crc kubenswrapper[4792]: I0309 09:26:26.264441 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-hwgng"]
Mar 09 09:26:26 crc kubenswrapper[4792]: I0309 09:26:26.265501 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-hwgng"
Mar 09 09:26:26 crc kubenswrapper[4792]: I0309 09:26:26.330317 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-88rpv\" (UniqueName: \"kubernetes.io/projected/dda26326-39c5-4d44-9b41-7b9cf206b4ac-kube-api-access-88rpv\") pod \"cinder-db-create-hwgng\" (UID: \"dda26326-39c5-4d44-9b41-7b9cf206b4ac\") " pod="openstack/cinder-db-create-hwgng"
Mar 09 09:26:26 crc kubenswrapper[4792]: I0309 09:26:26.330631 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dda26326-39c5-4d44-9b41-7b9cf206b4ac-operator-scripts\") pod \"cinder-db-create-hwgng\" (UID: \"dda26326-39c5-4d44-9b41-7b9cf206b4ac\") " pod="openstack/cinder-db-create-hwgng"
Mar 09 09:26:26 crc kubenswrapper[4792]: I0309 09:26:26.347584 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-hwgng"]
Mar 09 09:26:26 crc kubenswrapper[4792]: I0309 09:26:26.431856 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-88rpv\" (UniqueName: \"kubernetes.io/projected/dda26326-39c5-4d44-9b41-7b9cf206b4ac-kube-api-access-88rpv\") pod \"cinder-db-create-hwgng\" (UID: \"dda26326-39c5-4d44-9b41-7b9cf206b4ac\") " pod="openstack/cinder-db-create-hwgng"
Mar 09 09:26:26 crc kubenswrapper[4792]: I0309 09:26:26.431945 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dda26326-39c5-4d44-9b41-7b9cf206b4ac-operator-scripts\") pod \"cinder-db-create-hwgng\" (UID: \"dda26326-39c5-4d44-9b41-7b9cf206b4ac\") " pod="openstack/cinder-db-create-hwgng"
Mar 09 09:26:26 crc kubenswrapper[4792]: I0309 09:26:26.432673 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dda26326-39c5-4d44-9b41-7b9cf206b4ac-operator-scripts\") pod \"cinder-db-create-hwgng\" (UID: \"dda26326-39c5-4d44-9b41-7b9cf206b4ac\") " pod="openstack/cinder-db-create-hwgng"
Mar 09 09:26:26 crc kubenswrapper[4792]: I0309 09:26:26.457582 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-8a45-account-create-update-hss2q"]
Mar 09 09:26:26 crc kubenswrapper[4792]: I0309 09:26:26.462964 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-8a45-account-create-update-hss2q"
Mar 09 09:26:26 crc kubenswrapper[4792]: I0309 09:26:26.472135 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-88rpv\" (UniqueName: \"kubernetes.io/projected/dda26326-39c5-4d44-9b41-7b9cf206b4ac-kube-api-access-88rpv\") pod \"cinder-db-create-hwgng\" (UID: \"dda26326-39c5-4d44-9b41-7b9cf206b4ac\") " pod="openstack/cinder-db-create-hwgng"
Mar 09 09:26:26 crc kubenswrapper[4792]: I0309 09:26:26.472313 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret"
Mar 09 09:26:26 crc kubenswrapper[4792]: I0309 09:26:26.483420 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-8a45-account-create-update-hss2q"]
Mar 09 09:26:26 crc kubenswrapper[4792]: I0309 09:26:26.532874 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ngc2b\" (UniqueName: \"kubernetes.io/projected/7ba4128e-9cd3-4ec9-a955-230ca12f4fed-kube-api-access-ngc2b\") pod \"cinder-8a45-account-create-update-hss2q\" (UID: \"7ba4128e-9cd3-4ec9-a955-230ca12f4fed\") " pod="openstack/cinder-8a45-account-create-update-hss2q"
Mar 09 09:26:26 crc kubenswrapper[4792]: I0309 09:26:26.532969 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7ba4128e-9cd3-4ec9-a955-230ca12f4fed-operator-scripts\") pod \"cinder-8a45-account-create-update-hss2q\" (UID: \"7ba4128e-9cd3-4ec9-a955-230ca12f4fed\") " pod="openstack/cinder-8a45-account-create-update-hss2q"
Mar 09 09:26:26 crc kubenswrapper[4792]: I0309 09:26:26.586397 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-hwgng"
Mar 09 09:26:26 crc kubenswrapper[4792]: I0309 09:26:26.617999 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-kcl6t"]
Mar 09 09:26:26 crc kubenswrapper[4792]: I0309 09:26:26.620429 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-kcl6t"
Mar 09 09:26:26 crc kubenswrapper[4792]: I0309 09:26:26.656803 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b9dr8\" (UniqueName: \"kubernetes.io/projected/21c7264c-a882-4413-8e6a-c42ca77b7153-kube-api-access-b9dr8\") pod \"barbican-db-create-kcl6t\" (UID: \"21c7264c-a882-4413-8e6a-c42ca77b7153\") " pod="openstack/barbican-db-create-kcl6t"
Mar 09 09:26:26 crc kubenswrapper[4792]: I0309 09:26:26.656863 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7ba4128e-9cd3-4ec9-a955-230ca12f4fed-operator-scripts\") pod \"cinder-8a45-account-create-update-hss2q\" (UID: \"7ba4128e-9cd3-4ec9-a955-230ca12f4fed\") " pod="openstack/cinder-8a45-account-create-update-hss2q"
Mar 09 09:26:26 crc kubenswrapper[4792]: I0309 09:26:26.656943 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/21c7264c-a882-4413-8e6a-c42ca77b7153-operator-scripts\") pod \"barbican-db-create-kcl6t\" (UID: \"21c7264c-a882-4413-8e6a-c42ca77b7153\") " pod="openstack/barbican-db-create-kcl6t"
Mar 09 09:26:26 crc kubenswrapper[4792]: I0309 09:26:26.672463 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-kcl6t"]
Mar 09 09:26:26 crc kubenswrapper[4792]: I0309 09:26:26.673487 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7ba4128e-9cd3-4ec9-a955-230ca12f4fed-operator-scripts\") pod \"cinder-8a45-account-create-update-hss2q\" (UID: \"7ba4128e-9cd3-4ec9-a955-230ca12f4fed\") " pod="openstack/cinder-8a45-account-create-update-hss2q"
Mar 09 09:26:26 crc kubenswrapper[4792]: I0309 09:26:26.690797 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ngc2b\" (UniqueName: \"kubernetes.io/projected/7ba4128e-9cd3-4ec9-a955-230ca12f4fed-kube-api-access-ngc2b\") pod \"cinder-8a45-account-create-update-hss2q\" (UID: \"7ba4128e-9cd3-4ec9-a955-230ca12f4fed\") " pod="openstack/cinder-8a45-account-create-update-hss2q"
Mar 09 09:26:26 crc kubenswrapper[4792]: I0309 09:26:26.713283 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-9w295"]
Mar 09 09:26:26 crc kubenswrapper[4792]: I0309 09:26:26.725509 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-a92c-account-create-update-5547j"]
Mar 09 09:26:26 crc kubenswrapper[4792]: I0309 09:26:26.726741 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-a92c-account-create-update-5547j"
Mar 09 09:26:26 crc kubenswrapper[4792]: I0309 09:26:26.725632 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-9w295"
Mar 09 09:26:26 crc kubenswrapper[4792]: I0309 09:26:26.736752 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret"
Mar 09 09:26:26 crc kubenswrapper[4792]: I0309 09:26:26.756577 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-9w295"]
Mar 09 09:26:26 crc kubenswrapper[4792]: I0309 09:26:26.757750 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ngc2b\" (UniqueName: \"kubernetes.io/projected/7ba4128e-9cd3-4ec9-a955-230ca12f4fed-kube-api-access-ngc2b\") pod \"cinder-8a45-account-create-update-hss2q\" (UID: \"7ba4128e-9cd3-4ec9-a955-230ca12f4fed\") " pod="openstack/cinder-8a45-account-create-update-hss2q"
Mar 09 09:26:26 crc kubenswrapper[4792]: I0309 09:26:26.770919 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-a92c-account-create-update-5547j"]
Mar 09 09:26:26 crc kubenswrapper[4792]: I0309 09:26:26.801915 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ff5b2f92-52e4-4f31-a3a5-75f94a5cac77-operator-scripts\") pod \"neutron-db-create-9w295\" (UID: \"ff5b2f92-52e4-4f31-a3a5-75f94a5cac77\") " pod="openstack/neutron-db-create-9w295"
Mar 09 09:26:26 crc kubenswrapper[4792]: I0309 09:26:26.802119 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/21c7264c-a882-4413-8e6a-c42ca77b7153-operator-scripts\") pod \"barbican-db-create-kcl6t\" (UID: \"21c7264c-a882-4413-8e6a-c42ca77b7153\") " pod="openstack/barbican-db-create-kcl6t"
Mar 09 09:26:26 crc kubenswrapper[4792]: I0309 09:26:26.802187 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2mb28\" (UniqueName: \"kubernetes.io/projected/ff5b2f92-52e4-4f31-a3a5-75f94a5cac77-kube-api-access-2mb28\") pod \"neutron-db-create-9w295\" (UID: \"ff5b2f92-52e4-4f31-a3a5-75f94a5cac77\") " pod="openstack/neutron-db-create-9w295"
Mar 09 09:26:26 crc kubenswrapper[4792]: I0309 09:26:26.802235 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ced6ea9f-bc40-4e2d-a7ba-6f1ef808548c-operator-scripts\") pod \"barbican-a92c-account-create-update-5547j\" (UID: \"ced6ea9f-bc40-4e2d-a7ba-6f1ef808548c\") " pod="openstack/barbican-a92c-account-create-update-5547j"
Mar 09 09:26:26 crc kubenswrapper[4792]: I0309 09:26:26.802285 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nbdbc\" (UniqueName: \"kubernetes.io/projected/ced6ea9f-bc40-4e2d-a7ba-6f1ef808548c-kube-api-access-nbdbc\") pod \"barbican-a92c-account-create-update-5547j\" (UID: \"ced6ea9f-bc40-4e2d-a7ba-6f1ef808548c\") " pod="openstack/barbican-a92c-account-create-update-5547j"
Mar 09 09:26:26 crc kubenswrapper[4792]: I0309 09:26:26.802368 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b9dr8\" (UniqueName: \"kubernetes.io/projected/21c7264c-a882-4413-8e6a-c42ca77b7153-kube-api-access-b9dr8\") pod \"barbican-db-create-kcl6t\" (UID: \"21c7264c-a882-4413-8e6a-c42ca77b7153\") " pod="openstack/barbican-db-create-kcl6t"
Mar 09 09:26:26 crc kubenswrapper[4792]: I0309 09:26:26.803113 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/21c7264c-a882-4413-8e6a-c42ca77b7153-operator-scripts\") pod \"barbican-db-create-kcl6t\" (UID: \"21c7264c-a882-4413-8e6a-c42ca77b7153\") " pod="openstack/barbican-db-create-kcl6t"
Mar 09 09:26:26 crc kubenswrapper[4792]: I0309 09:26:26.831517 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b9dr8\" (UniqueName: \"kubernetes.io/projected/21c7264c-a882-4413-8e6a-c42ca77b7153-kube-api-access-b9dr8\") pod \"barbican-db-create-kcl6t\" (UID: \"21c7264c-a882-4413-8e6a-c42ca77b7153\") " pod="openstack/barbican-db-create-kcl6t"
Mar 09 09:26:26 crc kubenswrapper[4792]: I0309 09:26:26.855252 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-8a45-account-create-update-hss2q"
Mar 09 09:26:26 crc kubenswrapper[4792]: I0309 09:26:26.860651 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-97cb-account-create-update-kfnzz"]
Mar 09 09:26:26 crc kubenswrapper[4792]: I0309 09:26:26.861612 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-97cb-account-create-update-kfnzz"
Mar 09 09:26:26 crc kubenswrapper[4792]: I0309 09:26:26.864796 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret"
Mar 09 09:26:26 crc kubenswrapper[4792]: I0309 09:26:26.906596 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ff5b2f92-52e4-4f31-a3a5-75f94a5cac77-operator-scripts\") pod \"neutron-db-create-9w295\" (UID: \"ff5b2f92-52e4-4f31-a3a5-75f94a5cac77\") " pod="openstack/neutron-db-create-9w295"
Mar 09 09:26:26 crc kubenswrapper[4792]: I0309 09:26:26.906970 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2mb28\" (UniqueName: \"kubernetes.io/projected/ff5b2f92-52e4-4f31-a3a5-75f94a5cac77-kube-api-access-2mb28\") pod \"neutron-db-create-9w295\" (UID: \"ff5b2f92-52e4-4f31-a3a5-75f94a5cac77\") " pod="openstack/neutron-db-create-9w295"
Mar 09 09:26:26 crc kubenswrapper[4792]: I0309 09:26:26.906996 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ced6ea9f-bc40-4e2d-a7ba-6f1ef808548c-operator-scripts\") pod \"barbican-a92c-account-create-update-5547j\" (UID: \"ced6ea9f-bc40-4e2d-a7ba-6f1ef808548c\") " pod="openstack/barbican-a92c-account-create-update-5547j"
Mar 09 09:26:26 crc kubenswrapper[4792]: I0309 09:26:26.907036 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nbdbc\" (UniqueName: \"kubernetes.io/projected/ced6ea9f-bc40-4e2d-a7ba-6f1ef808548c-kube-api-access-nbdbc\") pod \"barbican-a92c-account-create-update-5547j\" (UID: \"ced6ea9f-bc40-4e2d-a7ba-6f1ef808548c\") " pod="openstack/barbican-a92c-account-create-update-5547j"
Mar 09 09:26:26 crc kubenswrapper[4792]: I0309 09:26:26.908280 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ff5b2f92-52e4-4f31-a3a5-75f94a5cac77-operator-scripts\") pod \"neutron-db-create-9w295\" (UID: \"ff5b2f92-52e4-4f31-a3a5-75f94a5cac77\") " pod="openstack/neutron-db-create-9w295"
Mar 09 09:26:26 crc kubenswrapper[4792]: I0309 09:26:26.917214 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ced6ea9f-bc40-4e2d-a7ba-6f1ef808548c-operator-scripts\") pod \"barbican-a92c-account-create-update-5547j\" (UID: \"ced6ea9f-bc40-4e2d-a7ba-6f1ef808548c\") " pod="openstack/barbican-a92c-account-create-update-5547j"
Mar 09 09:26:26 crc kubenswrapper[4792]: I0309 09:26:26.921439 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-97cb-account-create-update-kfnzz"]
Mar 09 09:26:26 crc kubenswrapper[4792]: I0309 09:26:26.937830 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-267hg"]
Mar 09 09:26:26 crc kubenswrapper[4792]: I0309 09:26:26.939314 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-267hg"
Mar 09 09:26:26 crc kubenswrapper[4792]: I0309 09:26:26.946659 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Mar 09 09:26:26 crc kubenswrapper[4792]: I0309 09:26:26.946854 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Mar 09 09:26:26 crc kubenswrapper[4792]: I0309 09:26:26.946856 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Mar 09 09:26:26 crc kubenswrapper[4792]: I0309 09:26:26.947127 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-cgfmm"
Mar 09 09:26:26 crc kubenswrapper[4792]: I0309 09:26:26.948133 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nbdbc\" (UniqueName: \"kubernetes.io/projected/ced6ea9f-bc40-4e2d-a7ba-6f1ef808548c-kube-api-access-nbdbc\") pod \"barbican-a92c-account-create-update-5547j\" (UID: \"ced6ea9f-bc40-4e2d-a7ba-6f1ef808548c\") " pod="openstack/barbican-a92c-account-create-update-5547j"
Mar 09 09:26:26 crc kubenswrapper[4792]: I0309 09:26:26.977490 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2mb28\" (UniqueName: \"kubernetes.io/projected/ff5b2f92-52e4-4f31-a3a5-75f94a5cac77-kube-api-access-2mb28\") pod \"neutron-db-create-9w295\" (UID: \"ff5b2f92-52e4-4f31-a3a5-75f94a5cac77\") " pod="openstack/neutron-db-create-9w295"
Mar 09 09:26:27 crc kubenswrapper[4792]: I0309 09:26:27.005495 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-267hg"]
Mar 09 09:26:27 crc kubenswrapper[4792]: I0309 09:26:27.008058 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3dfbc031-90cc-42ac-b543-febe1469f699-combined-ca-bundle\") pod \"keystone-db-sync-267hg\" (UID: \"3dfbc031-90cc-42ac-b543-febe1469f699\") " pod="openstack/keystone-db-sync-267hg"
Mar 09 09:26:27 crc kubenswrapper[4792]: I0309 09:26:27.008243 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bvznd\" (UniqueName: \"kubernetes.io/projected/3dfbc031-90cc-42ac-b543-febe1469f699-kube-api-access-bvznd\") pod \"keystone-db-sync-267hg\" (UID: \"3dfbc031-90cc-42ac-b543-febe1469f699\") " pod="openstack/keystone-db-sync-267hg"
Mar 09 09:26:27 crc kubenswrapper[4792]: I0309 09:26:27.008336 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/93a03de8-9516-4678-aafa-e4ce0b5c6e2b-operator-scripts\") pod \"neutron-97cb-account-create-update-kfnzz\" (UID: \"93a03de8-9516-4678-aafa-e4ce0b5c6e2b\") " pod="openstack/neutron-97cb-account-create-update-kfnzz"
Mar 09 09:26:27 crc kubenswrapper[4792]: I0309 09:26:27.008551 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3dfbc031-90cc-42ac-b543-febe1469f699-config-data\") pod \"keystone-db-sync-267hg\" (UID: \"3dfbc031-90cc-42ac-b543-febe1469f699\") " pod="openstack/keystone-db-sync-267hg"
Mar 09 09:26:27 crc kubenswrapper[4792]: I0309 09:26:27.008695 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6pc5w\" (UniqueName: \"kubernetes.io/projected/93a03de8-9516-4678-aafa-e4ce0b5c6e2b-kube-api-access-6pc5w\") pod \"neutron-97cb-account-create-update-kfnzz\" (UID: \"93a03de8-9516-4678-aafa-e4ce0b5c6e2b\") " pod="openstack/neutron-97cb-account-create-update-kfnzz"
Mar 09 09:26:27 crc kubenswrapper[4792]: I0309 09:26:27.107191 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-kcl6t" Mar 09 09:26:27 crc kubenswrapper[4792]: I0309 09:26:27.110334 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6pc5w\" (UniqueName: \"kubernetes.io/projected/93a03de8-9516-4678-aafa-e4ce0b5c6e2b-kube-api-access-6pc5w\") pod \"neutron-97cb-account-create-update-kfnzz\" (UID: \"93a03de8-9516-4678-aafa-e4ce0b5c6e2b\") " pod="openstack/neutron-97cb-account-create-update-kfnzz" Mar 09 09:26:27 crc kubenswrapper[4792]: I0309 09:26:27.110396 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3dfbc031-90cc-42ac-b543-febe1469f699-combined-ca-bundle\") pod \"keystone-db-sync-267hg\" (UID: \"3dfbc031-90cc-42ac-b543-febe1469f699\") " pod="openstack/keystone-db-sync-267hg" Mar 09 09:26:27 crc kubenswrapper[4792]: I0309 09:26:27.110419 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bvznd\" (UniqueName: \"kubernetes.io/projected/3dfbc031-90cc-42ac-b543-febe1469f699-kube-api-access-bvznd\") pod \"keystone-db-sync-267hg\" (UID: \"3dfbc031-90cc-42ac-b543-febe1469f699\") " pod="openstack/keystone-db-sync-267hg" Mar 09 09:26:27 crc kubenswrapper[4792]: I0309 09:26:27.110444 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/93a03de8-9516-4678-aafa-e4ce0b5c6e2b-operator-scripts\") pod \"neutron-97cb-account-create-update-kfnzz\" (UID: \"93a03de8-9516-4678-aafa-e4ce0b5c6e2b\") " pod="openstack/neutron-97cb-account-create-update-kfnzz" Mar 09 09:26:27 crc kubenswrapper[4792]: I0309 09:26:27.111390 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3dfbc031-90cc-42ac-b543-febe1469f699-config-data\") pod \"keystone-db-sync-267hg\" (UID: 
\"3dfbc031-90cc-42ac-b543-febe1469f699\") " pod="openstack/keystone-db-sync-267hg" Mar 09 09:26:27 crc kubenswrapper[4792]: I0309 09:26:27.111405 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/93a03de8-9516-4678-aafa-e4ce0b5c6e2b-operator-scripts\") pod \"neutron-97cb-account-create-update-kfnzz\" (UID: \"93a03de8-9516-4678-aafa-e4ce0b5c6e2b\") " pod="openstack/neutron-97cb-account-create-update-kfnzz" Mar 09 09:26:27 crc kubenswrapper[4792]: I0309 09:26:27.120618 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3dfbc031-90cc-42ac-b543-febe1469f699-combined-ca-bundle\") pod \"keystone-db-sync-267hg\" (UID: \"3dfbc031-90cc-42ac-b543-febe1469f699\") " pod="openstack/keystone-db-sync-267hg" Mar 09 09:26:27 crc kubenswrapper[4792]: I0309 09:26:27.128861 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3dfbc031-90cc-42ac-b543-febe1469f699-config-data\") pod \"keystone-db-sync-267hg\" (UID: \"3dfbc031-90cc-42ac-b543-febe1469f699\") " pod="openstack/keystone-db-sync-267hg" Mar 09 09:26:27 crc kubenswrapper[4792]: I0309 09:26:27.135920 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bvznd\" (UniqueName: \"kubernetes.io/projected/3dfbc031-90cc-42ac-b543-febe1469f699-kube-api-access-bvznd\") pod \"keystone-db-sync-267hg\" (UID: \"3dfbc031-90cc-42ac-b543-febe1469f699\") " pod="openstack/keystone-db-sync-267hg" Mar 09 09:26:27 crc kubenswrapper[4792]: I0309 09:26:27.139098 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6pc5w\" (UniqueName: \"kubernetes.io/projected/93a03de8-9516-4678-aafa-e4ce0b5c6e2b-kube-api-access-6pc5w\") pod \"neutron-97cb-account-create-update-kfnzz\" (UID: \"93a03de8-9516-4678-aafa-e4ce0b5c6e2b\") " 
pod="openstack/neutron-97cb-account-create-update-kfnzz" Mar 09 09:26:27 crc kubenswrapper[4792]: I0309 09:26:27.181421 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-hwgng"] Mar 09 09:26:27 crc kubenswrapper[4792]: I0309 09:26:27.190592 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-a92c-account-create-update-5547j" Mar 09 09:26:27 crc kubenswrapper[4792]: I0309 09:26:27.243323 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-9w295" Mar 09 09:26:27 crc kubenswrapper[4792]: I0309 09:26:27.267471 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-97cb-account-create-update-kfnzz" Mar 09 09:26:27 crc kubenswrapper[4792]: I0309 09:26:27.286890 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-267hg" Mar 09 09:26:27 crc kubenswrapper[4792]: I0309 09:26:27.610402 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-8a45-account-create-update-hss2q"] Mar 09 09:26:27 crc kubenswrapper[4792]: I0309 09:26:27.813449 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-kcl6t"] Mar 09 09:26:27 crc kubenswrapper[4792]: I0309 09:26:27.903622 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-hwgng" event={"ID":"dda26326-39c5-4d44-9b41-7b9cf206b4ac","Type":"ContainerStarted","Data":"1ce2f0989d353fbf89243b76a683bd93a6057b364bb790ff3c931ade1fc5a8c5"} Mar 09 09:26:27 crc kubenswrapper[4792]: I0309 09:26:27.903916 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-hwgng" event={"ID":"dda26326-39c5-4d44-9b41-7b9cf206b4ac","Type":"ContainerStarted","Data":"b03f240b39a51e0ccf48559a1f8497f5fe76a8395268ea78722d05154f86c624"} Mar 09 09:26:27 crc kubenswrapper[4792]: I0309 09:26:27.906714 4792 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-kcl6t" event={"ID":"21c7264c-a882-4413-8e6a-c42ca77b7153","Type":"ContainerStarted","Data":"2bc847c82d4e73c8f3d1d9ca1e352e77cb3b53f163d460ffc960848853569ffa"} Mar 09 09:26:27 crc kubenswrapper[4792]: I0309 09:26:27.908234 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-8a45-account-create-update-hss2q" event={"ID":"7ba4128e-9cd3-4ec9-a955-230ca12f4fed","Type":"ContainerStarted","Data":"c64a09d698d754dfcf6f00b6ad6b9e64c5395f5b974f71e4d457784130f66b99"} Mar 09 09:26:27 crc kubenswrapper[4792]: I0309 09:26:27.938335 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-create-hwgng" podStartSLOduration=1.93832007 podStartE2EDuration="1.93832007s" podCreationTimestamp="2026-03-09 09:26:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:26:27.933878881 +0000 UTC m=+1152.964079633" watchObservedRunningTime="2026-03-09 09:26:27.93832007 +0000 UTC m=+1152.968520822" Mar 09 09:26:27 crc kubenswrapper[4792]: I0309 09:26:27.965905 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-a92c-account-create-update-5547j"] Mar 09 09:26:28 crc kubenswrapper[4792]: I0309 09:26:28.111105 4792 scope.go:117] "RemoveContainer" containerID="f319a389f728f78767a6ee2c87b636f3af093985e69ff0e0fe771ff633407050" Mar 09 09:26:28 crc kubenswrapper[4792]: I0309 09:26:28.141797 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-9w295"] Mar 09 09:26:28 crc kubenswrapper[4792]: I0309 09:26:28.171301 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-267hg"] Mar 09 09:26:28 crc kubenswrapper[4792]: W0309 09:26:28.198854 4792 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3dfbc031_90cc_42ac_b543_febe1469f699.slice/crio-b4366093e3b0ca77425eb386f91854bc163a13f323eed682f2cc36a9107bb7ca WatchSource:0}: Error finding container b4366093e3b0ca77425eb386f91854bc163a13f323eed682f2cc36a9107bb7ca: Status 404 returned error can't find the container with id b4366093e3b0ca77425eb386f91854bc163a13f323eed682f2cc36a9107bb7ca Mar 09 09:26:28 crc kubenswrapper[4792]: W0309 09:26:28.200171 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podff5b2f92_52e4_4f31_a3a5_75f94a5cac77.slice/crio-d62e4ff612f53f3fa16bf79a98a9384dfea17eab4bf71dd9c92965b2dfffb01a WatchSource:0}: Error finding container d62e4ff612f53f3fa16bf79a98a9384dfea17eab4bf71dd9c92965b2dfffb01a: Status 404 returned error can't find the container with id d62e4ff612f53f3fa16bf79a98a9384dfea17eab4bf71dd9c92965b2dfffb01a Mar 09 09:26:28 crc kubenswrapper[4792]: E0309 09:26:28.291452 4792 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.200:34648->38.102.83.200:45325: write tcp 38.102.83.200:34648->38.102.83.200:45325: write: broken pipe Mar 09 09:26:28 crc kubenswrapper[4792]: I0309 09:26:28.306579 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-97cb-account-create-update-kfnzz"] Mar 09 09:26:28 crc kubenswrapper[4792]: W0309 09:26:28.317288 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod93a03de8_9516_4678_aafa_e4ce0b5c6e2b.slice/crio-442a2fd30390ad933531da50072703bb5362ec14153b058ef92b8b7892d3d829 WatchSource:0}: Error finding container 442a2fd30390ad933531da50072703bb5362ec14153b058ef92b8b7892d3d829: Status 404 returned error can't find the container with id 442a2fd30390ad933531da50072703bb5362ec14153b058ef92b8b7892d3d829 Mar 09 09:26:28 crc kubenswrapper[4792]: I0309 09:26:28.917436 4792 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-97cb-account-create-update-kfnzz" event={"ID":"93a03de8-9516-4678-aafa-e4ce0b5c6e2b","Type":"ContainerStarted","Data":"442a2fd30390ad933531da50072703bb5362ec14153b058ef92b8b7892d3d829"} Mar 09 09:26:28 crc kubenswrapper[4792]: I0309 09:26:28.919049 4792 generic.go:334] "Generic (PLEG): container finished" podID="dda26326-39c5-4d44-9b41-7b9cf206b4ac" containerID="1ce2f0989d353fbf89243b76a683bd93a6057b364bb790ff3c931ade1fc5a8c5" exitCode=0 Mar 09 09:26:28 crc kubenswrapper[4792]: I0309 09:26:28.919121 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-hwgng" event={"ID":"dda26326-39c5-4d44-9b41-7b9cf206b4ac","Type":"ContainerDied","Data":"1ce2f0989d353fbf89243b76a683bd93a6057b364bb790ff3c931ade1fc5a8c5"} Mar 09 09:26:28 crc kubenswrapper[4792]: I0309 09:26:28.920511 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-9w295" event={"ID":"ff5b2f92-52e4-4f31-a3a5-75f94a5cac77","Type":"ContainerStarted","Data":"d62e4ff612f53f3fa16bf79a98a9384dfea17eab4bf71dd9c92965b2dfffb01a"} Mar 09 09:26:28 crc kubenswrapper[4792]: I0309 09:26:28.923144 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-a92c-account-create-update-5547j" event={"ID":"ced6ea9f-bc40-4e2d-a7ba-6f1ef808548c","Type":"ContainerStarted","Data":"7ae5c0352e1ac16233ac8e3e72c996410d92c1437a5d891406e5519144546f9a"} Mar 09 09:26:28 crc kubenswrapper[4792]: I0309 09:26:28.924384 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-267hg" event={"ID":"3dfbc031-90cc-42ac-b543-febe1469f699","Type":"ContainerStarted","Data":"b4366093e3b0ca77425eb386f91854bc163a13f323eed682f2cc36a9107bb7ca"} Mar 09 09:26:29 crc kubenswrapper[4792]: I0309 09:26:29.935787 4792 generic.go:334] "Generic (PLEG): container finished" podID="21c7264c-a882-4413-8e6a-c42ca77b7153" 
containerID="c92ddaa7c82d4d86f3f797b790568d0792df8e9869f6b5a008140042de6024ca" exitCode=0 Mar 09 09:26:29 crc kubenswrapper[4792]: I0309 09:26:29.935863 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-kcl6t" event={"ID":"21c7264c-a882-4413-8e6a-c42ca77b7153","Type":"ContainerDied","Data":"c92ddaa7c82d4d86f3f797b790568d0792df8e9869f6b5a008140042de6024ca"} Mar 09 09:26:29 crc kubenswrapper[4792]: I0309 09:26:29.938583 4792 generic.go:334] "Generic (PLEG): container finished" podID="ff5b2f92-52e4-4f31-a3a5-75f94a5cac77" containerID="3979e6dbc46490911e92e4ca52d70831161c415e9a0d2b8d7030d57ba6b3b7b5" exitCode=0 Mar 09 09:26:29 crc kubenswrapper[4792]: I0309 09:26:29.938700 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-9w295" event={"ID":"ff5b2f92-52e4-4f31-a3a5-75f94a5cac77","Type":"ContainerDied","Data":"3979e6dbc46490911e92e4ca52d70831161c415e9a0d2b8d7030d57ba6b3b7b5"} Mar 09 09:26:29 crc kubenswrapper[4792]: I0309 09:26:29.940677 4792 generic.go:334] "Generic (PLEG): container finished" podID="ced6ea9f-bc40-4e2d-a7ba-6f1ef808548c" containerID="7cd96cf84084a79b039ba22f8c9a2a17dfae2180410aa03d3bb80fc4d555f917" exitCode=0 Mar 09 09:26:29 crc kubenswrapper[4792]: I0309 09:26:29.940740 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-a92c-account-create-update-5547j" event={"ID":"ced6ea9f-bc40-4e2d-a7ba-6f1ef808548c","Type":"ContainerDied","Data":"7cd96cf84084a79b039ba22f8c9a2a17dfae2180410aa03d3bb80fc4d555f917"} Mar 09 09:26:29 crc kubenswrapper[4792]: I0309 09:26:29.942045 4792 generic.go:334] "Generic (PLEG): container finished" podID="7ba4128e-9cd3-4ec9-a955-230ca12f4fed" containerID="b24252fb970fc808687181defaae9df9662e3593786a5d40acd86f1a852d7188" exitCode=0 Mar 09 09:26:29 crc kubenswrapper[4792]: I0309 09:26:29.942091 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-8a45-account-create-update-hss2q" 
event={"ID":"7ba4128e-9cd3-4ec9-a955-230ca12f4fed","Type":"ContainerDied","Data":"b24252fb970fc808687181defaae9df9662e3593786a5d40acd86f1a852d7188"} Mar 09 09:26:29 crc kubenswrapper[4792]: I0309 09:26:29.943682 4792 generic.go:334] "Generic (PLEG): container finished" podID="93a03de8-9516-4678-aafa-e4ce0b5c6e2b" containerID="9b851087551136349181e1004ac0ab3cd58c966eb6ba358b55755d0b33558b72" exitCode=0 Mar 09 09:26:29 crc kubenswrapper[4792]: I0309 09:26:29.943777 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-97cb-account-create-update-kfnzz" event={"ID":"93a03de8-9516-4678-aafa-e4ce0b5c6e2b","Type":"ContainerDied","Data":"9b851087551136349181e1004ac0ab3cd58c966eb6ba358b55755d0b33558b72"} Mar 09 09:26:30 crc kubenswrapper[4792]: I0309 09:26:30.285155 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-hwgng" Mar 09 09:26:30 crc kubenswrapper[4792]: I0309 09:26:30.465165 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dda26326-39c5-4d44-9b41-7b9cf206b4ac-operator-scripts\") pod \"dda26326-39c5-4d44-9b41-7b9cf206b4ac\" (UID: \"dda26326-39c5-4d44-9b41-7b9cf206b4ac\") " Mar 09 09:26:30 crc kubenswrapper[4792]: I0309 09:26:30.465243 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-88rpv\" (UniqueName: \"kubernetes.io/projected/dda26326-39c5-4d44-9b41-7b9cf206b4ac-kube-api-access-88rpv\") pod \"dda26326-39c5-4d44-9b41-7b9cf206b4ac\" (UID: \"dda26326-39c5-4d44-9b41-7b9cf206b4ac\") " Mar 09 09:26:30 crc kubenswrapper[4792]: I0309 09:26:30.466287 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dda26326-39c5-4d44-9b41-7b9cf206b4ac-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "dda26326-39c5-4d44-9b41-7b9cf206b4ac" (UID: "dda26326-39c5-4d44-9b41-7b9cf206b4ac"). 
InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:26:30 crc kubenswrapper[4792]: I0309 09:26:30.471880 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dda26326-39c5-4d44-9b41-7b9cf206b4ac-kube-api-access-88rpv" (OuterVolumeSpecName: "kube-api-access-88rpv") pod "dda26326-39c5-4d44-9b41-7b9cf206b4ac" (UID: "dda26326-39c5-4d44-9b41-7b9cf206b4ac"). InnerVolumeSpecName "kube-api-access-88rpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:26:30 crc kubenswrapper[4792]: I0309 09:26:30.566826 4792 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dda26326-39c5-4d44-9b41-7b9cf206b4ac-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 09:26:30 crc kubenswrapper[4792]: I0309 09:26:30.566865 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-88rpv\" (UniqueName: \"kubernetes.io/projected/dda26326-39c5-4d44-9b41-7b9cf206b4ac-kube-api-access-88rpv\") on node \"crc\" DevicePath \"\"" Mar 09 09:26:30 crc kubenswrapper[4792]: I0309 09:26:30.954828 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-hwgng" Mar 09 09:26:30 crc kubenswrapper[4792]: I0309 09:26:30.955027 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-hwgng" event={"ID":"dda26326-39c5-4d44-9b41-7b9cf206b4ac","Type":"ContainerDied","Data":"b03f240b39a51e0ccf48559a1f8497f5fe76a8395268ea78722d05154f86c624"} Mar 09 09:26:30 crc kubenswrapper[4792]: I0309 09:26:30.955329 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b03f240b39a51e0ccf48559a1f8497f5fe76a8395268ea78722d05154f86c624" Mar 09 09:26:32 crc kubenswrapper[4792]: I0309 09:26:32.653610 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6958f867f9-hkbdp" Mar 09 09:26:32 crc kubenswrapper[4792]: I0309 09:26:32.750944 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7b57d9888c-wjgqq"] Mar 09 09:26:32 crc kubenswrapper[4792]: I0309 09:26:32.752986 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7b57d9888c-wjgqq" podUID="c980fc97-3e6c-4f70-a130-b0a10bd053e3" containerName="dnsmasq-dns" containerID="cri-o://da89f09900708c8b929bf1df242e9501a75fbfd45831a3736a1c38d34d0dbdac" gracePeriod=10 Mar 09 09:26:32 crc kubenswrapper[4792]: I0309 09:26:32.994779 4792 generic.go:334] "Generic (PLEG): container finished" podID="c980fc97-3e6c-4f70-a130-b0a10bd053e3" containerID="da89f09900708c8b929bf1df242e9501a75fbfd45831a3736a1c38d34d0dbdac" exitCode=0 Mar 09 09:26:32 crc kubenswrapper[4792]: I0309 09:26:32.994824 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b57d9888c-wjgqq" event={"ID":"c980fc97-3e6c-4f70-a130-b0a10bd053e3","Type":"ContainerDied","Data":"da89f09900708c8b929bf1df242e9501a75fbfd45831a3736a1c38d34d0dbdac"} Mar 09 09:26:33 crc kubenswrapper[4792]: I0309 09:26:33.856738 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-9w295" Mar 09 09:26:33 crc kubenswrapper[4792]: I0309 09:26:33.883218 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-8a45-account-create-update-hss2q" Mar 09 09:26:33 crc kubenswrapper[4792]: I0309 09:26:33.943897 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngc2b\" (UniqueName: \"kubernetes.io/projected/7ba4128e-9cd3-4ec9-a955-230ca12f4fed-kube-api-access-ngc2b\") pod \"7ba4128e-9cd3-4ec9-a955-230ca12f4fed\" (UID: \"7ba4128e-9cd3-4ec9-a955-230ca12f4fed\") " Mar 09 09:26:33 crc kubenswrapper[4792]: I0309 09:26:33.943953 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2mb28\" (UniqueName: \"kubernetes.io/projected/ff5b2f92-52e4-4f31-a3a5-75f94a5cac77-kube-api-access-2mb28\") pod \"ff5b2f92-52e4-4f31-a3a5-75f94a5cac77\" (UID: \"ff5b2f92-52e4-4f31-a3a5-75f94a5cac77\") " Mar 09 09:26:33 crc kubenswrapper[4792]: I0309 09:26:33.944013 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ff5b2f92-52e4-4f31-a3a5-75f94a5cac77-operator-scripts\") pod \"ff5b2f92-52e4-4f31-a3a5-75f94a5cac77\" (UID: \"ff5b2f92-52e4-4f31-a3a5-75f94a5cac77\") " Mar 09 09:26:33 crc kubenswrapper[4792]: I0309 09:26:33.944051 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7ba4128e-9cd3-4ec9-a955-230ca12f4fed-operator-scripts\") pod \"7ba4128e-9cd3-4ec9-a955-230ca12f4fed\" (UID: \"7ba4128e-9cd3-4ec9-a955-230ca12f4fed\") " Mar 09 09:26:33 crc kubenswrapper[4792]: I0309 09:26:33.944448 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-a92c-account-create-update-5547j" Mar 09 09:26:33 crc kubenswrapper[4792]: I0309 09:26:33.944917 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7ba4128e-9cd3-4ec9-a955-230ca12f4fed-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7ba4128e-9cd3-4ec9-a955-230ca12f4fed" (UID: "7ba4128e-9cd3-4ec9-a955-230ca12f4fed"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:26:33 crc kubenswrapper[4792]: I0309 09:26:33.946336 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ff5b2f92-52e4-4f31-a3a5-75f94a5cac77-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ff5b2f92-52e4-4f31-a3a5-75f94a5cac77" (UID: "ff5b2f92-52e4-4f31-a3a5-75f94a5cac77"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:26:33 crc kubenswrapper[4792]: I0309 09:26:33.955454 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff5b2f92-52e4-4f31-a3a5-75f94a5cac77-kube-api-access-2mb28" (OuterVolumeSpecName: "kube-api-access-2mb28") pod "ff5b2f92-52e4-4f31-a3a5-75f94a5cac77" (UID: "ff5b2f92-52e4-4f31-a3a5-75f94a5cac77"). InnerVolumeSpecName "kube-api-access-2mb28". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:26:33 crc kubenswrapper[4792]: I0309 09:26:33.957690 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-97cb-account-create-update-kfnzz" Mar 09 09:26:33 crc kubenswrapper[4792]: I0309 09:26:33.958057 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-kcl6t" Mar 09 09:26:33 crc kubenswrapper[4792]: I0309 09:26:33.963333 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ba4128e-9cd3-4ec9-a955-230ca12f4fed-kube-api-access-ngc2b" (OuterVolumeSpecName: "kube-api-access-ngc2b") pod "7ba4128e-9cd3-4ec9-a955-230ca12f4fed" (UID: "7ba4128e-9cd3-4ec9-a955-230ca12f4fed"). InnerVolumeSpecName "kube-api-access-ngc2b". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:26:34 crc kubenswrapper[4792]: I0309 09:26:34.008989 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7b57d9888c-wjgqq" Mar 09 09:26:34 crc kubenswrapper[4792]: I0309 09:26:34.022348 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-kcl6t" event={"ID":"21c7264c-a882-4413-8e6a-c42ca77b7153","Type":"ContainerDied","Data":"2bc847c82d4e73c8f3d1d9ca1e352e77cb3b53f163d460ffc960848853569ffa"} Mar 09 09:26:34 crc kubenswrapper[4792]: I0309 09:26:34.022396 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2bc847c82d4e73c8f3d1d9ca1e352e77cb3b53f163d460ffc960848853569ffa" Mar 09 09:26:34 crc kubenswrapper[4792]: I0309 09:26:34.022464 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-kcl6t" Mar 09 09:26:34 crc kubenswrapper[4792]: I0309 09:26:34.024908 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-9w295" event={"ID":"ff5b2f92-52e4-4f31-a3a5-75f94a5cac77","Type":"ContainerDied","Data":"d62e4ff612f53f3fa16bf79a98a9384dfea17eab4bf71dd9c92965b2dfffb01a"} Mar 09 09:26:34 crc kubenswrapper[4792]: I0309 09:26:34.024930 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d62e4ff612f53f3fa16bf79a98a9384dfea17eab4bf71dd9c92965b2dfffb01a" Mar 09 09:26:34 crc kubenswrapper[4792]: I0309 09:26:34.024961 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-9w295" Mar 09 09:26:34 crc kubenswrapper[4792]: I0309 09:26:34.031311 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-a92c-account-create-update-5547j" event={"ID":"ced6ea9f-bc40-4e2d-a7ba-6f1ef808548c","Type":"ContainerDied","Data":"7ae5c0352e1ac16233ac8e3e72c996410d92c1437a5d891406e5519144546f9a"} Mar 09 09:26:34 crc kubenswrapper[4792]: I0309 09:26:34.031359 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7ae5c0352e1ac16233ac8e3e72c996410d92c1437a5d891406e5519144546f9a" Mar 09 09:26:34 crc kubenswrapper[4792]: I0309 09:26:34.031447 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-a92c-account-create-update-5547j" Mar 09 09:26:34 crc kubenswrapper[4792]: I0309 09:26:34.047285 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c980fc97-3e6c-4f70-a130-b0a10bd053e3-dns-svc\") pod \"c980fc97-3e6c-4f70-a130-b0a10bd053e3\" (UID: \"c980fc97-3e6c-4f70-a130-b0a10bd053e3\") " Mar 09 09:26:34 crc kubenswrapper[4792]: I0309 09:26:34.047406 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b9dr8\" (UniqueName: \"kubernetes.io/projected/21c7264c-a882-4413-8e6a-c42ca77b7153-kube-api-access-b9dr8\") pod \"21c7264c-a882-4413-8e6a-c42ca77b7153\" (UID: \"21c7264c-a882-4413-8e6a-c42ca77b7153\") " Mar 09 09:26:34 crc kubenswrapper[4792]: I0309 09:26:34.047450 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c980fc97-3e6c-4f70-a130-b0a10bd053e3-ovsdbserver-nb\") pod \"c980fc97-3e6c-4f70-a130-b0a10bd053e3\" (UID: \"c980fc97-3e6c-4f70-a130-b0a10bd053e3\") " Mar 09 09:26:34 crc kubenswrapper[4792]: I0309 09:26:34.047508 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ced6ea9f-bc40-4e2d-a7ba-6f1ef808548c-operator-scripts\") pod \"ced6ea9f-bc40-4e2d-a7ba-6f1ef808548c\" (UID: \"ced6ea9f-bc40-4e2d-a7ba-6f1ef808548c\") " Mar 09 09:26:34 crc kubenswrapper[4792]: I0309 09:26:34.047536 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nbdbc\" (UniqueName: \"kubernetes.io/projected/ced6ea9f-bc40-4e2d-a7ba-6f1ef808548c-kube-api-access-nbdbc\") pod \"ced6ea9f-bc40-4e2d-a7ba-6f1ef808548c\" (UID: \"ced6ea9f-bc40-4e2d-a7ba-6f1ef808548c\") " Mar 09 09:26:34 crc kubenswrapper[4792]: I0309 09:26:34.047564 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c980fc97-3e6c-4f70-a130-b0a10bd053e3-config\") pod \"c980fc97-3e6c-4f70-a130-b0a10bd053e3\" (UID: \"c980fc97-3e6c-4f70-a130-b0a10bd053e3\") " Mar 09 09:26:34 crc kubenswrapper[4792]: I0309 09:26:34.047596 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n97\" (UniqueName: \"kubernetes.io/projected/c980fc97-3e6c-4f70-a130-b0a10bd053e3-kube-api-access-s4n97\") pod \"c980fc97-3e6c-4f70-a130-b0a10bd053e3\" (UID: \"c980fc97-3e6c-4f70-a130-b0a10bd053e3\") " Mar 09 09:26:34 crc kubenswrapper[4792]: I0309 09:26:34.047639 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/93a03de8-9516-4678-aafa-e4ce0b5c6e2b-operator-scripts\") pod \"93a03de8-9516-4678-aafa-e4ce0b5c6e2b\" (UID: \"93a03de8-9516-4678-aafa-e4ce0b5c6e2b\") " Mar 09 09:26:34 crc kubenswrapper[4792]: I0309 09:26:34.047681 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/21c7264c-a882-4413-8e6a-c42ca77b7153-operator-scripts\") pod \"21c7264c-a882-4413-8e6a-c42ca77b7153\" (UID: \"21c7264c-a882-4413-8e6a-c42ca77b7153\") " Mar 09 09:26:34 crc kubenswrapper[4792]: I0309 09:26:34.047707 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6pc5w\" (UniqueName: \"kubernetes.io/projected/93a03de8-9516-4678-aafa-e4ce0b5c6e2b-kube-api-access-6pc5w\") pod \"93a03de8-9516-4678-aafa-e4ce0b5c6e2b\" (UID: \"93a03de8-9516-4678-aafa-e4ce0b5c6e2b\") " Mar 09 09:26:34 crc kubenswrapper[4792]: I0309 09:26:34.047765 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c980fc97-3e6c-4f70-a130-b0a10bd053e3-ovsdbserver-sb\") pod \"c980fc97-3e6c-4f70-a130-b0a10bd053e3\" (UID: 
\"c980fc97-3e6c-4f70-a130-b0a10bd053e3\") " Mar 09 09:26:34 crc kubenswrapper[4792]: I0309 09:26:34.048136 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngc2b\" (UniqueName: \"kubernetes.io/projected/7ba4128e-9cd3-4ec9-a955-230ca12f4fed-kube-api-access-ngc2b\") on node \"crc\" DevicePath \"\"" Mar 09 09:26:34 crc kubenswrapper[4792]: I0309 09:26:34.048156 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2mb28\" (UniqueName: \"kubernetes.io/projected/ff5b2f92-52e4-4f31-a3a5-75f94a5cac77-kube-api-access-2mb28\") on node \"crc\" DevicePath \"\"" Mar 09 09:26:34 crc kubenswrapper[4792]: I0309 09:26:34.048168 4792 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ff5b2f92-52e4-4f31-a3a5-75f94a5cac77-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 09:26:34 crc kubenswrapper[4792]: I0309 09:26:34.048181 4792 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7ba4128e-9cd3-4ec9-a955-230ca12f4fed-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 09:26:34 crc kubenswrapper[4792]: I0309 09:26:34.051472 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-8a45-account-create-update-hss2q" event={"ID":"7ba4128e-9cd3-4ec9-a955-230ca12f4fed","Type":"ContainerDied","Data":"c64a09d698d754dfcf6f00b6ad6b9e64c5395f5b974f71e4d457784130f66b99"} Mar 09 09:26:34 crc kubenswrapper[4792]: I0309 09:26:34.051519 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c64a09d698d754dfcf6f00b6ad6b9e64c5395f5b974f71e4d457784130f66b99" Mar 09 09:26:34 crc kubenswrapper[4792]: I0309 09:26:34.051601 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-8a45-account-create-update-hss2q" Mar 09 09:26:34 crc kubenswrapper[4792]: I0309 09:26:34.064970 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ced6ea9f-bc40-4e2d-a7ba-6f1ef808548c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ced6ea9f-bc40-4e2d-a7ba-6f1ef808548c" (UID: "ced6ea9f-bc40-4e2d-a7ba-6f1ef808548c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:26:34 crc kubenswrapper[4792]: I0309 09:26:34.069906 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/93a03de8-9516-4678-aafa-e4ce0b5c6e2b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "93a03de8-9516-4678-aafa-e4ce0b5c6e2b" (UID: "93a03de8-9516-4678-aafa-e4ce0b5c6e2b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:26:34 crc kubenswrapper[4792]: I0309 09:26:34.072313 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/21c7264c-a882-4413-8e6a-c42ca77b7153-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "21c7264c-a882-4413-8e6a-c42ca77b7153" (UID: "21c7264c-a882-4413-8e6a-c42ca77b7153"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:26:34 crc kubenswrapper[4792]: I0309 09:26:34.072326 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/93a03de8-9516-4678-aafa-e4ce0b5c6e2b-kube-api-access-6pc5w" (OuterVolumeSpecName: "kube-api-access-6pc5w") pod "93a03de8-9516-4678-aafa-e4ce0b5c6e2b" (UID: "93a03de8-9516-4678-aafa-e4ce0b5c6e2b"). InnerVolumeSpecName "kube-api-access-6pc5w". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:26:34 crc kubenswrapper[4792]: I0309 09:26:34.075758 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b57d9888c-wjgqq" event={"ID":"c980fc97-3e6c-4f70-a130-b0a10bd053e3","Type":"ContainerDied","Data":"8f0d7cadd3ba6560fceb3ad98ce9bf3b3172395fe8af056c31795972c5332ce5"} Mar 09 09:26:34 crc kubenswrapper[4792]: I0309 09:26:34.075802 4792 scope.go:117] "RemoveContainer" containerID="da89f09900708c8b929bf1df242e9501a75fbfd45831a3736a1c38d34d0dbdac" Mar 09 09:26:34 crc kubenswrapper[4792]: I0309 09:26:34.075940 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7b57d9888c-wjgqq" Mar 09 09:26:34 crc kubenswrapper[4792]: I0309 09:26:34.082414 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-97cb-account-create-update-kfnzz" event={"ID":"93a03de8-9516-4678-aafa-e4ce0b5c6e2b","Type":"ContainerDied","Data":"442a2fd30390ad933531da50072703bb5362ec14153b058ef92b8b7892d3d829"} Mar 09 09:26:34 crc kubenswrapper[4792]: I0309 09:26:34.082621 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="442a2fd30390ad933531da50072703bb5362ec14153b058ef92b8b7892d3d829" Mar 09 09:26:34 crc kubenswrapper[4792]: I0309 09:26:34.082679 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-97cb-account-create-update-kfnzz" Mar 09 09:26:34 crc kubenswrapper[4792]: I0309 09:26:34.088138 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ced6ea9f-bc40-4e2d-a7ba-6f1ef808548c-kube-api-access-nbdbc" (OuterVolumeSpecName: "kube-api-access-nbdbc") pod "ced6ea9f-bc40-4e2d-a7ba-6f1ef808548c" (UID: "ced6ea9f-bc40-4e2d-a7ba-6f1ef808548c"). InnerVolumeSpecName "kube-api-access-nbdbc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:26:34 crc kubenswrapper[4792]: I0309 09:26:34.088297 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c980fc97-3e6c-4f70-a130-b0a10bd053e3-kube-api-access-s4n97" (OuterVolumeSpecName: "kube-api-access-s4n97") pod "c980fc97-3e6c-4f70-a130-b0a10bd053e3" (UID: "c980fc97-3e6c-4f70-a130-b0a10bd053e3"). InnerVolumeSpecName "kube-api-access-s4n97". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:26:34 crc kubenswrapper[4792]: I0309 09:26:34.114836 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/21c7264c-a882-4413-8e6a-c42ca77b7153-kube-api-access-b9dr8" (OuterVolumeSpecName: "kube-api-access-b9dr8") pod "21c7264c-a882-4413-8e6a-c42ca77b7153" (UID: "21c7264c-a882-4413-8e6a-c42ca77b7153"). InnerVolumeSpecName "kube-api-access-b9dr8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:26:34 crc kubenswrapper[4792]: I0309 09:26:34.123859 4792 scope.go:117] "RemoveContainer" containerID="1a5c0513666f40950180974ad50af5b21979a98a35b4c1230d956eb267c0232e" Mar 09 09:26:34 crc kubenswrapper[4792]: I0309 09:26:34.143561 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c980fc97-3e6c-4f70-a130-b0a10bd053e3-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c980fc97-3e6c-4f70-a130-b0a10bd053e3" (UID: "c980fc97-3e6c-4f70-a130-b0a10bd053e3"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:26:34 crc kubenswrapper[4792]: I0309 09:26:34.145975 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c980fc97-3e6c-4f70-a130-b0a10bd053e3-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "c980fc97-3e6c-4f70-a130-b0a10bd053e3" (UID: "c980fc97-3e6c-4f70-a130-b0a10bd053e3"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:26:34 crc kubenswrapper[4792]: I0309 09:26:34.146303 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c980fc97-3e6c-4f70-a130-b0a10bd053e3-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "c980fc97-3e6c-4f70-a130-b0a10bd053e3" (UID: "c980fc97-3e6c-4f70-a130-b0a10bd053e3"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:26:34 crc kubenswrapper[4792]: I0309 09:26:34.150509 4792 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c980fc97-3e6c-4f70-a130-b0a10bd053e3-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 09 09:26:34 crc kubenswrapper[4792]: I0309 09:26:34.150536 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b9dr8\" (UniqueName: \"kubernetes.io/projected/21c7264c-a882-4413-8e6a-c42ca77b7153-kube-api-access-b9dr8\") on node \"crc\" DevicePath \"\"" Mar 09 09:26:34 crc kubenswrapper[4792]: I0309 09:26:34.150549 4792 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c980fc97-3e6c-4f70-a130-b0a10bd053e3-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 09 09:26:34 crc kubenswrapper[4792]: I0309 09:26:34.150559 4792 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ced6ea9f-bc40-4e2d-a7ba-6f1ef808548c-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 09:26:34 crc kubenswrapper[4792]: I0309 09:26:34.150567 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nbdbc\" (UniqueName: \"kubernetes.io/projected/ced6ea9f-bc40-4e2d-a7ba-6f1ef808548c-kube-api-access-nbdbc\") on node \"crc\" DevicePath \"\"" Mar 09 09:26:34 crc kubenswrapper[4792]: I0309 09:26:34.150577 4792 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-s4n97\" (UniqueName: \"kubernetes.io/projected/c980fc97-3e6c-4f70-a130-b0a10bd053e3-kube-api-access-s4n97\") on node \"crc\" DevicePath \"\"" Mar 09 09:26:34 crc kubenswrapper[4792]: I0309 09:26:34.150585 4792 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/93a03de8-9516-4678-aafa-e4ce0b5c6e2b-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 09:26:34 crc kubenswrapper[4792]: I0309 09:26:34.150593 4792 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/21c7264c-a882-4413-8e6a-c42ca77b7153-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 09:26:34 crc kubenswrapper[4792]: I0309 09:26:34.150601 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6pc5w\" (UniqueName: \"kubernetes.io/projected/93a03de8-9516-4678-aafa-e4ce0b5c6e2b-kube-api-access-6pc5w\") on node \"crc\" DevicePath \"\"" Mar 09 09:26:34 crc kubenswrapper[4792]: I0309 09:26:34.150598 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c980fc97-3e6c-4f70-a130-b0a10bd053e3-config" (OuterVolumeSpecName: "config") pod "c980fc97-3e6c-4f70-a130-b0a10bd053e3" (UID: "c980fc97-3e6c-4f70-a130-b0a10bd053e3"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:26:34 crc kubenswrapper[4792]: I0309 09:26:34.150609 4792 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c980fc97-3e6c-4f70-a130-b0a10bd053e3-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 09 09:26:34 crc kubenswrapper[4792]: I0309 09:26:34.252061 4792 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c980fc97-3e6c-4f70-a130-b0a10bd053e3-config\") on node \"crc\" DevicePath \"\"" Mar 09 09:26:34 crc kubenswrapper[4792]: I0309 09:26:34.406029 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7b57d9888c-wjgqq"] Mar 09 09:26:34 crc kubenswrapper[4792]: I0309 09:26:34.414399 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7b57d9888c-wjgqq"] Mar 09 09:26:35 crc kubenswrapper[4792]: I0309 09:26:35.092161 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-267hg" event={"ID":"3dfbc031-90cc-42ac-b543-febe1469f699","Type":"ContainerStarted","Data":"77d22be3d55c6ddd33e6d3b099584d93b83a1675c5c974830e1becbc6abb866e"} Mar 09 09:26:35 crc kubenswrapper[4792]: I0309 09:26:35.113603 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-267hg" podStartSLOduration=3.608840099 podStartE2EDuration="9.113585003s" podCreationTimestamp="2026-03-09 09:26:26 +0000 UTC" firstStartedPulling="2026-03-09 09:26:28.207470354 +0000 UTC m=+1153.237671106" lastFinishedPulling="2026-03-09 09:26:33.712215258 +0000 UTC m=+1158.742416010" observedRunningTime="2026-03-09 09:26:35.109315889 +0000 UTC m=+1160.139516641" watchObservedRunningTime="2026-03-09 09:26:35.113585003 +0000 UTC m=+1160.143785755" Mar 09 09:26:35 crc kubenswrapper[4792]: I0309 09:26:35.671892 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="c980fc97-3e6c-4f70-a130-b0a10bd053e3" path="/var/lib/kubelet/pods/c980fc97-3e6c-4f70-a130-b0a10bd053e3/volumes" Mar 09 09:26:38 crc kubenswrapper[4792]: I0309 09:26:38.120037 4792 generic.go:334] "Generic (PLEG): container finished" podID="3dfbc031-90cc-42ac-b543-febe1469f699" containerID="77d22be3d55c6ddd33e6d3b099584d93b83a1675c5c974830e1becbc6abb866e" exitCode=0 Mar 09 09:26:38 crc kubenswrapper[4792]: I0309 09:26:38.120374 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-267hg" event={"ID":"3dfbc031-90cc-42ac-b543-febe1469f699","Type":"ContainerDied","Data":"77d22be3d55c6ddd33e6d3b099584d93b83a1675c5c974830e1becbc6abb866e"} Mar 09 09:26:39 crc kubenswrapper[4792]: I0309 09:26:39.464249 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-267hg" Mar 09 09:26:39 crc kubenswrapper[4792]: I0309 09:26:39.636232 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bvznd\" (UniqueName: \"kubernetes.io/projected/3dfbc031-90cc-42ac-b543-febe1469f699-kube-api-access-bvznd\") pod \"3dfbc031-90cc-42ac-b543-febe1469f699\" (UID: \"3dfbc031-90cc-42ac-b543-febe1469f699\") " Mar 09 09:26:39 crc kubenswrapper[4792]: I0309 09:26:39.636335 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3dfbc031-90cc-42ac-b543-febe1469f699-config-data\") pod \"3dfbc031-90cc-42ac-b543-febe1469f699\" (UID: \"3dfbc031-90cc-42ac-b543-febe1469f699\") " Mar 09 09:26:39 crc kubenswrapper[4792]: I0309 09:26:39.636401 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3dfbc031-90cc-42ac-b543-febe1469f699-combined-ca-bundle\") pod \"3dfbc031-90cc-42ac-b543-febe1469f699\" (UID: \"3dfbc031-90cc-42ac-b543-febe1469f699\") " Mar 09 09:26:39 crc kubenswrapper[4792]: I0309 
09:26:39.641957 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3dfbc031-90cc-42ac-b543-febe1469f699-kube-api-access-bvznd" (OuterVolumeSpecName: "kube-api-access-bvznd") pod "3dfbc031-90cc-42ac-b543-febe1469f699" (UID: "3dfbc031-90cc-42ac-b543-febe1469f699"). InnerVolumeSpecName "kube-api-access-bvznd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:26:39 crc kubenswrapper[4792]: I0309 09:26:39.670209 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3dfbc031-90cc-42ac-b543-febe1469f699-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3dfbc031-90cc-42ac-b543-febe1469f699" (UID: "3dfbc031-90cc-42ac-b543-febe1469f699"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:26:39 crc kubenswrapper[4792]: I0309 09:26:39.679947 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3dfbc031-90cc-42ac-b543-febe1469f699-config-data" (OuterVolumeSpecName: "config-data") pod "3dfbc031-90cc-42ac-b543-febe1469f699" (UID: "3dfbc031-90cc-42ac-b543-febe1469f699"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:26:39 crc kubenswrapper[4792]: I0309 09:26:39.737834 4792 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3dfbc031-90cc-42ac-b543-febe1469f699-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 09:26:39 crc kubenswrapper[4792]: I0309 09:26:39.737866 4792 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3dfbc031-90cc-42ac-b543-febe1469f699-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 09:26:39 crc kubenswrapper[4792]: I0309 09:26:39.737876 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bvznd\" (UniqueName: \"kubernetes.io/projected/3dfbc031-90cc-42ac-b543-febe1469f699-kube-api-access-bvznd\") on node \"crc\" DevicePath \"\"" Mar 09 09:26:40 crc kubenswrapper[4792]: I0309 09:26:40.136215 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-267hg" event={"ID":"3dfbc031-90cc-42ac-b543-febe1469f699","Type":"ContainerDied","Data":"b4366093e3b0ca77425eb386f91854bc163a13f323eed682f2cc36a9107bb7ca"} Mar 09 09:26:40 crc kubenswrapper[4792]: I0309 09:26:40.136265 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b4366093e3b0ca77425eb386f91854bc163a13f323eed682f2cc36a9107bb7ca" Mar 09 09:26:40 crc kubenswrapper[4792]: I0309 09:26:40.136297 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-267hg" Mar 09 09:26:40 crc kubenswrapper[4792]: I0309 09:26:40.425415 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7796644bc-cqms6"] Mar 09 09:26:40 crc kubenswrapper[4792]: E0309 09:26:40.426053 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c980fc97-3e6c-4f70-a130-b0a10bd053e3" containerName="dnsmasq-dns" Mar 09 09:26:40 crc kubenswrapper[4792]: I0309 09:26:40.426123 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="c980fc97-3e6c-4f70-a130-b0a10bd053e3" containerName="dnsmasq-dns" Mar 09 09:26:40 crc kubenswrapper[4792]: E0309 09:26:40.426142 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21c7264c-a882-4413-8e6a-c42ca77b7153" containerName="mariadb-database-create" Mar 09 09:26:40 crc kubenswrapper[4792]: I0309 09:26:40.426149 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="21c7264c-a882-4413-8e6a-c42ca77b7153" containerName="mariadb-database-create" Mar 09 09:26:40 crc kubenswrapper[4792]: E0309 09:26:40.426157 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ced6ea9f-bc40-4e2d-a7ba-6f1ef808548c" containerName="mariadb-account-create-update" Mar 09 09:26:40 crc kubenswrapper[4792]: I0309 09:26:40.426164 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="ced6ea9f-bc40-4e2d-a7ba-6f1ef808548c" containerName="mariadb-account-create-update" Mar 09 09:26:40 crc kubenswrapper[4792]: E0309 09:26:40.426179 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dda26326-39c5-4d44-9b41-7b9cf206b4ac" containerName="mariadb-database-create" Mar 09 09:26:40 crc kubenswrapper[4792]: I0309 09:26:40.426186 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="dda26326-39c5-4d44-9b41-7b9cf206b4ac" containerName="mariadb-database-create" Mar 09 09:26:40 crc kubenswrapper[4792]: E0309 09:26:40.426208 4792 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="ff5b2f92-52e4-4f31-a3a5-75f94a5cac77" containerName="mariadb-database-create" Mar 09 09:26:40 crc kubenswrapper[4792]: I0309 09:26:40.426214 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff5b2f92-52e4-4f31-a3a5-75f94a5cac77" containerName="mariadb-database-create" Mar 09 09:26:40 crc kubenswrapper[4792]: E0309 09:26:40.426222 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3dfbc031-90cc-42ac-b543-febe1469f699" containerName="keystone-db-sync" Mar 09 09:26:40 crc kubenswrapper[4792]: I0309 09:26:40.426229 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="3dfbc031-90cc-42ac-b543-febe1469f699" containerName="keystone-db-sync" Mar 09 09:26:40 crc kubenswrapper[4792]: E0309 09:26:40.426244 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93a03de8-9516-4678-aafa-e4ce0b5c6e2b" containerName="mariadb-account-create-update" Mar 09 09:26:40 crc kubenswrapper[4792]: I0309 09:26:40.426252 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="93a03de8-9516-4678-aafa-e4ce0b5c6e2b" containerName="mariadb-account-create-update" Mar 09 09:26:40 crc kubenswrapper[4792]: E0309 09:26:40.426266 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c980fc97-3e6c-4f70-a130-b0a10bd053e3" containerName="init" Mar 09 09:26:40 crc kubenswrapper[4792]: I0309 09:26:40.426272 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="c980fc97-3e6c-4f70-a130-b0a10bd053e3" containerName="init" Mar 09 09:26:40 crc kubenswrapper[4792]: E0309 09:26:40.426290 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ba4128e-9cd3-4ec9-a955-230ca12f4fed" containerName="mariadb-account-create-update" Mar 09 09:26:40 crc kubenswrapper[4792]: I0309 09:26:40.426297 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ba4128e-9cd3-4ec9-a955-230ca12f4fed" containerName="mariadb-account-create-update" Mar 09 09:26:40 crc kubenswrapper[4792]: I0309 09:26:40.426447 4792 
memory_manager.go:354] "RemoveStaleState removing state" podUID="7ba4128e-9cd3-4ec9-a955-230ca12f4fed" containerName="mariadb-account-create-update" Mar 09 09:26:40 crc kubenswrapper[4792]: I0309 09:26:40.426458 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="dda26326-39c5-4d44-9b41-7b9cf206b4ac" containerName="mariadb-database-create" Mar 09 09:26:40 crc kubenswrapper[4792]: I0309 09:26:40.426469 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="c980fc97-3e6c-4f70-a130-b0a10bd053e3" containerName="dnsmasq-dns" Mar 09 09:26:40 crc kubenswrapper[4792]: I0309 09:26:40.426480 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff5b2f92-52e4-4f31-a3a5-75f94a5cac77" containerName="mariadb-database-create" Mar 09 09:26:40 crc kubenswrapper[4792]: I0309 09:26:40.426493 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="93a03de8-9516-4678-aafa-e4ce0b5c6e2b" containerName="mariadb-account-create-update" Mar 09 09:26:40 crc kubenswrapper[4792]: I0309 09:26:40.426507 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="21c7264c-a882-4413-8e6a-c42ca77b7153" containerName="mariadb-database-create" Mar 09 09:26:40 crc kubenswrapper[4792]: I0309 09:26:40.426516 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="ced6ea9f-bc40-4e2d-a7ba-6f1ef808548c" containerName="mariadb-account-create-update" Mar 09 09:26:40 crc kubenswrapper[4792]: I0309 09:26:40.426524 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="3dfbc031-90cc-42ac-b543-febe1469f699" containerName="keystone-db-sync" Mar 09 09:26:40 crc kubenswrapper[4792]: I0309 09:26:40.427454 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7796644bc-cqms6" Mar 09 09:26:40 crc kubenswrapper[4792]: I0309 09:26:40.453517 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-kkglm"] Mar 09 09:26:40 crc kubenswrapper[4792]: I0309 09:26:40.454588 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-kkglm" Mar 09 09:26:40 crc kubenswrapper[4792]: I0309 09:26:40.462133 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7796644bc-cqms6"] Mar 09 09:26:40 crc kubenswrapper[4792]: I0309 09:26:40.464426 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 09 09:26:40 crc kubenswrapper[4792]: I0309 09:26:40.464639 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 09 09:26:40 crc kubenswrapper[4792]: I0309 09:26:40.464747 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 09 09:26:40 crc kubenswrapper[4792]: I0309 09:26:40.464872 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-cgfmm" Mar 09 09:26:40 crc kubenswrapper[4792]: I0309 09:26:40.464983 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 09 09:26:40 crc kubenswrapper[4792]: I0309 09:26:40.503047 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-kkglm"] Mar 09 09:26:40 crc kubenswrapper[4792]: I0309 09:26:40.549687 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/75224229-25dc-4546-a8b1-9eb3185a8d09-config\") pod \"dnsmasq-dns-7796644bc-cqms6\" (UID: \"75224229-25dc-4546-a8b1-9eb3185a8d09\") " pod="openstack/dnsmasq-dns-7796644bc-cqms6" Mar 09 09:26:40 crc kubenswrapper[4792]: I0309 09:26:40.549862 4792 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/75224229-25dc-4546-a8b1-9eb3185a8d09-ovsdbserver-sb\") pod \"dnsmasq-dns-7796644bc-cqms6\" (UID: \"75224229-25dc-4546-a8b1-9eb3185a8d09\") " pod="openstack/dnsmasq-dns-7796644bc-cqms6" Mar 09 09:26:40 crc kubenswrapper[4792]: I0309 09:26:40.549912 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tckj8\" (UniqueName: \"kubernetes.io/projected/75224229-25dc-4546-a8b1-9eb3185a8d09-kube-api-access-tckj8\") pod \"dnsmasq-dns-7796644bc-cqms6\" (UID: \"75224229-25dc-4546-a8b1-9eb3185a8d09\") " pod="openstack/dnsmasq-dns-7796644bc-cqms6" Mar 09 09:26:40 crc kubenswrapper[4792]: I0309 09:26:40.550051 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/75224229-25dc-4546-a8b1-9eb3185a8d09-ovsdbserver-nb\") pod \"dnsmasq-dns-7796644bc-cqms6\" (UID: \"75224229-25dc-4546-a8b1-9eb3185a8d09\") " pod="openstack/dnsmasq-dns-7796644bc-cqms6" Mar 09 09:26:40 crc kubenswrapper[4792]: I0309 09:26:40.550376 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/75224229-25dc-4546-a8b1-9eb3185a8d09-dns-svc\") pod \"dnsmasq-dns-7796644bc-cqms6\" (UID: \"75224229-25dc-4546-a8b1-9eb3185a8d09\") " pod="openstack/dnsmasq-dns-7796644bc-cqms6" Mar 09 09:26:40 crc kubenswrapper[4792]: I0309 09:26:40.651882 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/75224229-25dc-4546-a8b1-9eb3185a8d09-dns-svc\") pod \"dnsmasq-dns-7796644bc-cqms6\" (UID: \"75224229-25dc-4546-a8b1-9eb3185a8d09\") " pod="openstack/dnsmasq-dns-7796644bc-cqms6" Mar 09 09:26:40 crc kubenswrapper[4792]: I0309 09:26:40.651932 4792 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/75224229-25dc-4546-a8b1-9eb3185a8d09-config\") pod \"dnsmasq-dns-7796644bc-cqms6\" (UID: \"75224229-25dc-4546-a8b1-9eb3185a8d09\") " pod="openstack/dnsmasq-dns-7796644bc-cqms6" Mar 09 09:26:40 crc kubenswrapper[4792]: I0309 09:26:40.651979 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/801f33bc-ee66-4276-baf0-d374120ed464-config-data\") pod \"keystone-bootstrap-kkglm\" (UID: \"801f33bc-ee66-4276-baf0-d374120ed464\") " pod="openstack/keystone-bootstrap-kkglm" Mar 09 09:26:40 crc kubenswrapper[4792]: I0309 09:26:40.652011 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/75224229-25dc-4546-a8b1-9eb3185a8d09-ovsdbserver-sb\") pod \"dnsmasq-dns-7796644bc-cqms6\" (UID: \"75224229-25dc-4546-a8b1-9eb3185a8d09\") " pod="openstack/dnsmasq-dns-7796644bc-cqms6" Mar 09 09:26:40 crc kubenswrapper[4792]: I0309 09:26:40.652036 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tckj8\" (UniqueName: \"kubernetes.io/projected/75224229-25dc-4546-a8b1-9eb3185a8d09-kube-api-access-tckj8\") pod \"dnsmasq-dns-7796644bc-cqms6\" (UID: \"75224229-25dc-4546-a8b1-9eb3185a8d09\") " pod="openstack/dnsmasq-dns-7796644bc-cqms6" Mar 09 09:26:40 crc kubenswrapper[4792]: I0309 09:26:40.652058 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/801f33bc-ee66-4276-baf0-d374120ed464-combined-ca-bundle\") pod \"keystone-bootstrap-kkglm\" (UID: \"801f33bc-ee66-4276-baf0-d374120ed464\") " pod="openstack/keystone-bootstrap-kkglm" Mar 09 09:26:40 crc kubenswrapper[4792]: I0309 09:26:40.652126 4792 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/801f33bc-ee66-4276-baf0-d374120ed464-credential-keys\") pod \"keystone-bootstrap-kkglm\" (UID: \"801f33bc-ee66-4276-baf0-d374120ed464\") " pod="openstack/keystone-bootstrap-kkglm" Mar 09 09:26:40 crc kubenswrapper[4792]: I0309 09:26:40.652151 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/75224229-25dc-4546-a8b1-9eb3185a8d09-ovsdbserver-nb\") pod \"dnsmasq-dns-7796644bc-cqms6\" (UID: \"75224229-25dc-4546-a8b1-9eb3185a8d09\") " pod="openstack/dnsmasq-dns-7796644bc-cqms6" Mar 09 09:26:40 crc kubenswrapper[4792]: I0309 09:26:40.652175 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/801f33bc-ee66-4276-baf0-d374120ed464-scripts\") pod \"keystone-bootstrap-kkglm\" (UID: \"801f33bc-ee66-4276-baf0-d374120ed464\") " pod="openstack/keystone-bootstrap-kkglm" Mar 09 09:26:40 crc kubenswrapper[4792]: I0309 09:26:40.652222 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/801f33bc-ee66-4276-baf0-d374120ed464-fernet-keys\") pod \"keystone-bootstrap-kkglm\" (UID: \"801f33bc-ee66-4276-baf0-d374120ed464\") " pod="openstack/keystone-bootstrap-kkglm" Mar 09 09:26:40 crc kubenswrapper[4792]: I0309 09:26:40.652246 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hncvt\" (UniqueName: \"kubernetes.io/projected/801f33bc-ee66-4276-baf0-d374120ed464-kube-api-access-hncvt\") pod \"keystone-bootstrap-kkglm\" (UID: \"801f33bc-ee66-4276-baf0-d374120ed464\") " pod="openstack/keystone-bootstrap-kkglm" Mar 09 09:26:40 crc kubenswrapper[4792]: I0309 09:26:40.654386 4792 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/75224229-25dc-4546-a8b1-9eb3185a8d09-ovsdbserver-nb\") pod \"dnsmasq-dns-7796644bc-cqms6\" (UID: \"75224229-25dc-4546-a8b1-9eb3185a8d09\") " pod="openstack/dnsmasq-dns-7796644bc-cqms6" Mar 09 09:26:40 crc kubenswrapper[4792]: I0309 09:26:40.654418 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/75224229-25dc-4546-a8b1-9eb3185a8d09-config\") pod \"dnsmasq-dns-7796644bc-cqms6\" (UID: \"75224229-25dc-4546-a8b1-9eb3185a8d09\") " pod="openstack/dnsmasq-dns-7796644bc-cqms6" Mar 09 09:26:40 crc kubenswrapper[4792]: I0309 09:26:40.654417 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/75224229-25dc-4546-a8b1-9eb3185a8d09-ovsdbserver-sb\") pod \"dnsmasq-dns-7796644bc-cqms6\" (UID: \"75224229-25dc-4546-a8b1-9eb3185a8d09\") " pod="openstack/dnsmasq-dns-7796644bc-cqms6" Mar 09 09:26:40 crc kubenswrapper[4792]: I0309 09:26:40.654665 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/75224229-25dc-4546-a8b1-9eb3185a8d09-dns-svc\") pod \"dnsmasq-dns-7796644bc-cqms6\" (UID: \"75224229-25dc-4546-a8b1-9eb3185a8d09\") " pod="openstack/dnsmasq-dns-7796644bc-cqms6" Mar 09 09:26:40 crc kubenswrapper[4792]: I0309 09:26:40.688926 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tckj8\" (UniqueName: \"kubernetes.io/projected/75224229-25dc-4546-a8b1-9eb3185a8d09-kube-api-access-tckj8\") pod \"dnsmasq-dns-7796644bc-cqms6\" (UID: \"75224229-25dc-4546-a8b1-9eb3185a8d09\") " pod="openstack/dnsmasq-dns-7796644bc-cqms6" Mar 09 09:26:40 crc kubenswrapper[4792]: I0309 09:26:40.750752 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7796644bc-cqms6" Mar 09 09:26:40 crc kubenswrapper[4792]: I0309 09:26:40.753518 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/801f33bc-ee66-4276-baf0-d374120ed464-config-data\") pod \"keystone-bootstrap-kkglm\" (UID: \"801f33bc-ee66-4276-baf0-d374120ed464\") " pod="openstack/keystone-bootstrap-kkglm" Mar 09 09:26:40 crc kubenswrapper[4792]: I0309 09:26:40.753590 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/801f33bc-ee66-4276-baf0-d374120ed464-combined-ca-bundle\") pod \"keystone-bootstrap-kkglm\" (UID: \"801f33bc-ee66-4276-baf0-d374120ed464\") " pod="openstack/keystone-bootstrap-kkglm" Mar 09 09:26:40 crc kubenswrapper[4792]: I0309 09:26:40.753617 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/801f33bc-ee66-4276-baf0-d374120ed464-credential-keys\") pod \"keystone-bootstrap-kkglm\" (UID: \"801f33bc-ee66-4276-baf0-d374120ed464\") " pod="openstack/keystone-bootstrap-kkglm" Mar 09 09:26:40 crc kubenswrapper[4792]: I0309 09:26:40.753642 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/801f33bc-ee66-4276-baf0-d374120ed464-scripts\") pod \"keystone-bootstrap-kkglm\" (UID: \"801f33bc-ee66-4276-baf0-d374120ed464\") " pod="openstack/keystone-bootstrap-kkglm" Mar 09 09:26:40 crc kubenswrapper[4792]: I0309 09:26:40.753709 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/801f33bc-ee66-4276-baf0-d374120ed464-fernet-keys\") pod \"keystone-bootstrap-kkglm\" (UID: \"801f33bc-ee66-4276-baf0-d374120ed464\") " pod="openstack/keystone-bootstrap-kkglm" Mar 09 09:26:40 crc kubenswrapper[4792]: I0309 
09:26:40.753754 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hncvt\" (UniqueName: \"kubernetes.io/projected/801f33bc-ee66-4276-baf0-d374120ed464-kube-api-access-hncvt\") pod \"keystone-bootstrap-kkglm\" (UID: \"801f33bc-ee66-4276-baf0-d374120ed464\") " pod="openstack/keystone-bootstrap-kkglm" Mar 09 09:26:40 crc kubenswrapper[4792]: I0309 09:26:40.757163 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/801f33bc-ee66-4276-baf0-d374120ed464-scripts\") pod \"keystone-bootstrap-kkglm\" (UID: \"801f33bc-ee66-4276-baf0-d374120ed464\") " pod="openstack/keystone-bootstrap-kkglm" Mar 09 09:26:40 crc kubenswrapper[4792]: I0309 09:26:40.760143 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/801f33bc-ee66-4276-baf0-d374120ed464-fernet-keys\") pod \"keystone-bootstrap-kkglm\" (UID: \"801f33bc-ee66-4276-baf0-d374120ed464\") " pod="openstack/keystone-bootstrap-kkglm" Mar 09 09:26:40 crc kubenswrapper[4792]: I0309 09:26:40.760948 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/801f33bc-ee66-4276-baf0-d374120ed464-combined-ca-bundle\") pod \"keystone-bootstrap-kkglm\" (UID: \"801f33bc-ee66-4276-baf0-d374120ed464\") " pod="openstack/keystone-bootstrap-kkglm" Mar 09 09:26:40 crc kubenswrapper[4792]: I0309 09:26:40.763761 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/801f33bc-ee66-4276-baf0-d374120ed464-credential-keys\") pod \"keystone-bootstrap-kkglm\" (UID: \"801f33bc-ee66-4276-baf0-d374120ed464\") " pod="openstack/keystone-bootstrap-kkglm" Mar 09 09:26:40 crc kubenswrapper[4792]: I0309 09:26:40.764026 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/801f33bc-ee66-4276-baf0-d374120ed464-config-data\") pod \"keystone-bootstrap-kkglm\" (UID: \"801f33bc-ee66-4276-baf0-d374120ed464\") " pod="openstack/keystone-bootstrap-kkglm" Mar 09 09:26:40 crc kubenswrapper[4792]: I0309 09:26:40.843739 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hncvt\" (UniqueName: \"kubernetes.io/projected/801f33bc-ee66-4276-baf0-d374120ed464-kube-api-access-hncvt\") pod \"keystone-bootstrap-kkglm\" (UID: \"801f33bc-ee66-4276-baf0-d374120ed464\") " pod="openstack/keystone-bootstrap-kkglm" Mar 09 09:26:40 crc kubenswrapper[4792]: I0309 09:26:40.856943 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 09 09:26:40 crc kubenswrapper[4792]: I0309 09:26:40.860869 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 09 09:26:40 crc kubenswrapper[4792]: I0309 09:26:40.876585 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 09 09:26:40 crc kubenswrapper[4792]: I0309 09:26:40.902496 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 09 09:26:40 crc kubenswrapper[4792]: I0309 09:26:40.925126 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-gkkjx"] Mar 09 09:26:40 crc kubenswrapper[4792]: I0309 09:26:40.926199 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-gkkjx" Mar 09 09:26:40 crc kubenswrapper[4792]: I0309 09:26:40.930691 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Mar 09 09:26:40 crc kubenswrapper[4792]: I0309 09:26:40.930913 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-spxwj" Mar 09 09:26:40 crc kubenswrapper[4792]: I0309 09:26:40.931051 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Mar 09 09:26:40 crc kubenswrapper[4792]: I0309 09:26:40.939350 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 09 09:26:41 crc kubenswrapper[4792]: I0309 09:26:41.052347 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-gkkjx"] Mar 09 09:26:41 crc kubenswrapper[4792]: I0309 09:26:41.059536 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6f4bcabb-7f34-423b-a653-bd785eba0978-db-sync-config-data\") pod \"cinder-db-sync-gkkjx\" (UID: \"6f4bcabb-7f34-423b-a653-bd785eba0978\") " pod="openstack/cinder-db-sync-gkkjx" Mar 09 09:26:41 crc kubenswrapper[4792]: I0309 09:26:41.059615 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f4bcabb-7f34-423b-a653-bd785eba0978-combined-ca-bundle\") pod \"cinder-db-sync-gkkjx\" (UID: \"6f4bcabb-7f34-423b-a653-bd785eba0978\") " pod="openstack/cinder-db-sync-gkkjx" Mar 09 09:26:41 crc kubenswrapper[4792]: I0309 09:26:41.059636 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9cee33f2-4999-4736-965e-3f5eae090e14-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: 
\"9cee33f2-4999-4736-965e-3f5eae090e14\") " pod="openstack/ceilometer-0" Mar 09 09:26:41 crc kubenswrapper[4792]: I0309 09:26:41.059661 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9cee33f2-4999-4736-965e-3f5eae090e14-config-data\") pod \"ceilometer-0\" (UID: \"9cee33f2-4999-4736-965e-3f5eae090e14\") " pod="openstack/ceilometer-0" Mar 09 09:26:41 crc kubenswrapper[4792]: I0309 09:26:41.059684 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9cee33f2-4999-4736-965e-3f5eae090e14-log-httpd\") pod \"ceilometer-0\" (UID: \"9cee33f2-4999-4736-965e-3f5eae090e14\") " pod="openstack/ceilometer-0" Mar 09 09:26:41 crc kubenswrapper[4792]: I0309 09:26:41.059724 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9cee33f2-4999-4736-965e-3f5eae090e14-run-httpd\") pod \"ceilometer-0\" (UID: \"9cee33f2-4999-4736-965e-3f5eae090e14\") " pod="openstack/ceilometer-0" Mar 09 09:26:41 crc kubenswrapper[4792]: I0309 09:26:41.059765 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qc6wl\" (UniqueName: \"kubernetes.io/projected/6f4bcabb-7f34-423b-a653-bd785eba0978-kube-api-access-qc6wl\") pod \"cinder-db-sync-gkkjx\" (UID: \"6f4bcabb-7f34-423b-a653-bd785eba0978\") " pod="openstack/cinder-db-sync-gkkjx" Mar 09 09:26:41 crc kubenswrapper[4792]: I0309 09:26:41.059804 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6f4bcabb-7f34-423b-a653-bd785eba0978-scripts\") pod \"cinder-db-sync-gkkjx\" (UID: \"6f4bcabb-7f34-423b-a653-bd785eba0978\") " pod="openstack/cinder-db-sync-gkkjx" Mar 09 09:26:41 crc kubenswrapper[4792]: I0309 
09:26:41.059820 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-99zb2\" (UniqueName: \"kubernetes.io/projected/9cee33f2-4999-4736-965e-3f5eae090e14-kube-api-access-99zb2\") pod \"ceilometer-0\" (UID: \"9cee33f2-4999-4736-965e-3f5eae090e14\") " pod="openstack/ceilometer-0" Mar 09 09:26:41 crc kubenswrapper[4792]: I0309 09:26:41.059839 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9cee33f2-4999-4736-965e-3f5eae090e14-scripts\") pod \"ceilometer-0\" (UID: \"9cee33f2-4999-4736-965e-3f5eae090e14\") " pod="openstack/ceilometer-0" Mar 09 09:26:41 crc kubenswrapper[4792]: I0309 09:26:41.059860 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6f4bcabb-7f34-423b-a653-bd785eba0978-etc-machine-id\") pod \"cinder-db-sync-gkkjx\" (UID: \"6f4bcabb-7f34-423b-a653-bd785eba0978\") " pod="openstack/cinder-db-sync-gkkjx" Mar 09 09:26:41 crc kubenswrapper[4792]: I0309 09:26:41.059883 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f4bcabb-7f34-423b-a653-bd785eba0978-config-data\") pod \"cinder-db-sync-gkkjx\" (UID: \"6f4bcabb-7f34-423b-a653-bd785eba0978\") " pod="openstack/cinder-db-sync-gkkjx" Mar 09 09:26:41 crc kubenswrapper[4792]: I0309 09:26:41.059905 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9cee33f2-4999-4736-965e-3f5eae090e14-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9cee33f2-4999-4736-965e-3f5eae090e14\") " pod="openstack/ceilometer-0" Mar 09 09:26:41 crc kubenswrapper[4792]: I0309 09:26:41.068349 4792 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/neutron-db-sync-k9lg4"] Mar 09 09:26:41 crc kubenswrapper[4792]: I0309 09:26:41.082460 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-k9lg4" Mar 09 09:26:41 crc kubenswrapper[4792]: I0309 09:26:41.086521 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-kkglm" Mar 09 09:26:41 crc kubenswrapper[4792]: I0309 09:26:41.092024 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Mar 09 09:26:41 crc kubenswrapper[4792]: I0309 09:26:41.092199 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-snjvg" Mar 09 09:26:41 crc kubenswrapper[4792]: I0309 09:26:41.092205 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Mar 09 09:26:41 crc kubenswrapper[4792]: I0309 09:26:41.165844 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f4bcabb-7f34-423b-a653-bd785eba0978-combined-ca-bundle\") pod \"cinder-db-sync-gkkjx\" (UID: \"6f4bcabb-7f34-423b-a653-bd785eba0978\") " pod="openstack/cinder-db-sync-gkkjx" Mar 09 09:26:41 crc kubenswrapper[4792]: I0309 09:26:41.165908 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9cee33f2-4999-4736-965e-3f5eae090e14-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9cee33f2-4999-4736-965e-3f5eae090e14\") " pod="openstack/ceilometer-0" Mar 09 09:26:41 crc kubenswrapper[4792]: I0309 09:26:41.175688 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-k9lg4"] Mar 09 09:26:41 crc kubenswrapper[4792]: I0309 09:26:41.177219 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/9cee33f2-4999-4736-965e-3f5eae090e14-config-data\") pod \"ceilometer-0\" (UID: \"9cee33f2-4999-4736-965e-3f5eae090e14\") " pod="openstack/ceilometer-0" Mar 09 09:26:41 crc kubenswrapper[4792]: I0309 09:26:41.177316 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9cee33f2-4999-4736-965e-3f5eae090e14-log-httpd\") pod \"ceilometer-0\" (UID: \"9cee33f2-4999-4736-965e-3f5eae090e14\") " pod="openstack/ceilometer-0" Mar 09 09:26:41 crc kubenswrapper[4792]: I0309 09:26:41.177380 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9cee33f2-4999-4736-965e-3f5eae090e14-run-httpd\") pod \"ceilometer-0\" (UID: \"9cee33f2-4999-4736-965e-3f5eae090e14\") " pod="openstack/ceilometer-0" Mar 09 09:26:41 crc kubenswrapper[4792]: I0309 09:26:41.177414 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qc6wl\" (UniqueName: \"kubernetes.io/projected/6f4bcabb-7f34-423b-a653-bd785eba0978-kube-api-access-qc6wl\") pod \"cinder-db-sync-gkkjx\" (UID: \"6f4bcabb-7f34-423b-a653-bd785eba0978\") " pod="openstack/cinder-db-sync-gkkjx" Mar 09 09:26:41 crc kubenswrapper[4792]: I0309 09:26:41.177508 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6f4bcabb-7f34-423b-a653-bd785eba0978-scripts\") pod \"cinder-db-sync-gkkjx\" (UID: \"6f4bcabb-7f34-423b-a653-bd785eba0978\") " pod="openstack/cinder-db-sync-gkkjx" Mar 09 09:26:41 crc kubenswrapper[4792]: I0309 09:26:41.177542 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-99zb2\" (UniqueName: \"kubernetes.io/projected/9cee33f2-4999-4736-965e-3f5eae090e14-kube-api-access-99zb2\") pod \"ceilometer-0\" (UID: \"9cee33f2-4999-4736-965e-3f5eae090e14\") " pod="openstack/ceilometer-0" Mar 09 09:26:41 
crc kubenswrapper[4792]: I0309 09:26:41.177585 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9cee33f2-4999-4736-965e-3f5eae090e14-scripts\") pod \"ceilometer-0\" (UID: \"9cee33f2-4999-4736-965e-3f5eae090e14\") " pod="openstack/ceilometer-0" Mar 09 09:26:41 crc kubenswrapper[4792]: I0309 09:26:41.177633 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6f4bcabb-7f34-423b-a653-bd785eba0978-etc-machine-id\") pod \"cinder-db-sync-gkkjx\" (UID: \"6f4bcabb-7f34-423b-a653-bd785eba0978\") " pod="openstack/cinder-db-sync-gkkjx" Mar 09 09:26:41 crc kubenswrapper[4792]: I0309 09:26:41.177664 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f4bcabb-7f34-423b-a653-bd785eba0978-config-data\") pod \"cinder-db-sync-gkkjx\" (UID: \"6f4bcabb-7f34-423b-a653-bd785eba0978\") " pod="openstack/cinder-db-sync-gkkjx" Mar 09 09:26:41 crc kubenswrapper[4792]: I0309 09:26:41.177700 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9cee33f2-4999-4736-965e-3f5eae090e14-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9cee33f2-4999-4736-965e-3f5eae090e14\") " pod="openstack/ceilometer-0" Mar 09 09:26:41 crc kubenswrapper[4792]: I0309 09:26:41.200763 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6f4bcabb-7f34-423b-a653-bd785eba0978-db-sync-config-data\") pod \"cinder-db-sync-gkkjx\" (UID: \"6f4bcabb-7f34-423b-a653-bd785eba0978\") " pod="openstack/cinder-db-sync-gkkjx" Mar 09 09:26:41 crc kubenswrapper[4792]: I0309 09:26:41.202824 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/9cee33f2-4999-4736-965e-3f5eae090e14-log-httpd\") pod \"ceilometer-0\" (UID: \"9cee33f2-4999-4736-965e-3f5eae090e14\") " pod="openstack/ceilometer-0" Mar 09 09:26:41 crc kubenswrapper[4792]: I0309 09:26:41.204707 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9cee33f2-4999-4736-965e-3f5eae090e14-run-httpd\") pod \"ceilometer-0\" (UID: \"9cee33f2-4999-4736-965e-3f5eae090e14\") " pod="openstack/ceilometer-0" Mar 09 09:26:41 crc kubenswrapper[4792]: I0309 09:26:41.205521 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6f4bcabb-7f34-423b-a653-bd785eba0978-etc-machine-id\") pod \"cinder-db-sync-gkkjx\" (UID: \"6f4bcabb-7f34-423b-a653-bd785eba0978\") " pod="openstack/cinder-db-sync-gkkjx" Mar 09 09:26:41 crc kubenswrapper[4792]: I0309 09:26:41.208334 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9cee33f2-4999-4736-965e-3f5eae090e14-config-data\") pod \"ceilometer-0\" (UID: \"9cee33f2-4999-4736-965e-3f5eae090e14\") " pod="openstack/ceilometer-0" Mar 09 09:26:41 crc kubenswrapper[4792]: I0309 09:26:41.215742 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6f4bcabb-7f34-423b-a653-bd785eba0978-scripts\") pod \"cinder-db-sync-gkkjx\" (UID: \"6f4bcabb-7f34-423b-a653-bd785eba0978\") " pod="openstack/cinder-db-sync-gkkjx" Mar 09 09:26:41 crc kubenswrapper[4792]: I0309 09:26:41.216519 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9cee33f2-4999-4736-965e-3f5eae090e14-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9cee33f2-4999-4736-965e-3f5eae090e14\") " pod="openstack/ceilometer-0" Mar 09 09:26:41 crc kubenswrapper[4792]: I0309 09:26:41.217084 4792 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9cee33f2-4999-4736-965e-3f5eae090e14-scripts\") pod \"ceilometer-0\" (UID: \"9cee33f2-4999-4736-965e-3f5eae090e14\") " pod="openstack/ceilometer-0" Mar 09 09:26:41 crc kubenswrapper[4792]: I0309 09:26:41.217323 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f4bcabb-7f34-423b-a653-bd785eba0978-config-data\") pod \"cinder-db-sync-gkkjx\" (UID: \"6f4bcabb-7f34-423b-a653-bd785eba0978\") " pod="openstack/cinder-db-sync-gkkjx" Mar 09 09:26:41 crc kubenswrapper[4792]: I0309 09:26:41.220033 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6f4bcabb-7f34-423b-a653-bd785eba0978-db-sync-config-data\") pod \"cinder-db-sync-gkkjx\" (UID: \"6f4bcabb-7f34-423b-a653-bd785eba0978\") " pod="openstack/cinder-db-sync-gkkjx" Mar 09 09:26:41 crc kubenswrapper[4792]: I0309 09:26:41.226236 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f4bcabb-7f34-423b-a653-bd785eba0978-combined-ca-bundle\") pod \"cinder-db-sync-gkkjx\" (UID: \"6f4bcabb-7f34-423b-a653-bd785eba0978\") " pod="openstack/cinder-db-sync-gkkjx" Mar 09 09:26:41 crc kubenswrapper[4792]: I0309 09:26:41.229517 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9cee33f2-4999-4736-965e-3f5eae090e14-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9cee33f2-4999-4736-965e-3f5eae090e14\") " pod="openstack/ceilometer-0" Mar 09 09:26:41 crc kubenswrapper[4792]: I0309 09:26:41.270013 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-99zb2\" (UniqueName: \"kubernetes.io/projected/9cee33f2-4999-4736-965e-3f5eae090e14-kube-api-access-99zb2\") pod \"ceilometer-0\" 
(UID: \"9cee33f2-4999-4736-965e-3f5eae090e14\") " pod="openstack/ceilometer-0" Mar 09 09:26:41 crc kubenswrapper[4792]: I0309 09:26:41.270450 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qc6wl\" (UniqueName: \"kubernetes.io/projected/6f4bcabb-7f34-423b-a653-bd785eba0978-kube-api-access-qc6wl\") pod \"cinder-db-sync-gkkjx\" (UID: \"6f4bcabb-7f34-423b-a653-bd785eba0978\") " pod="openstack/cinder-db-sync-gkkjx" Mar 09 09:26:41 crc kubenswrapper[4792]: I0309 09:26:41.289188 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7796644bc-cqms6"] Mar 09 09:26:41 crc kubenswrapper[4792]: I0309 09:26:41.304708 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n7x8v\" (UniqueName: \"kubernetes.io/projected/bf438875-5be3-489e-8626-da673a088bef-kube-api-access-n7x8v\") pod \"neutron-db-sync-k9lg4\" (UID: \"bf438875-5be3-489e-8626-da673a088bef\") " pod="openstack/neutron-db-sync-k9lg4" Mar 09 09:26:41 crc kubenswrapper[4792]: I0309 09:26:41.304798 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/bf438875-5be3-489e-8626-da673a088bef-config\") pod \"neutron-db-sync-k9lg4\" (UID: \"bf438875-5be3-489e-8626-da673a088bef\") " pod="openstack/neutron-db-sync-k9lg4" Mar 09 09:26:41 crc kubenswrapper[4792]: I0309 09:26:41.304863 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf438875-5be3-489e-8626-da673a088bef-combined-ca-bundle\") pod \"neutron-db-sync-k9lg4\" (UID: \"bf438875-5be3-489e-8626-da673a088bef\") " pod="openstack/neutron-db-sync-k9lg4" Mar 09 09:26:41 crc kubenswrapper[4792]: I0309 09:26:41.308387 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-xxwfp"] Mar 09 09:26:41 crc kubenswrapper[4792]: 
I0309 09:26:41.309519 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-xxwfp" Mar 09 09:26:41 crc kubenswrapper[4792]: I0309 09:26:41.313667 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Mar 09 09:26:41 crc kubenswrapper[4792]: I0309 09:26:41.323496 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-5vsw4" Mar 09 09:26:41 crc kubenswrapper[4792]: I0309 09:26:41.348678 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-rwg59"] Mar 09 09:26:41 crc kubenswrapper[4792]: I0309 09:26:41.349837 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-rwg59" Mar 09 09:26:41 crc kubenswrapper[4792]: I0309 09:26:41.357720 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Mar 09 09:26:41 crc kubenswrapper[4792]: I0309 09:26:41.358162 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-zfs4f" Mar 09 09:26:41 crc kubenswrapper[4792]: I0309 09:26:41.358375 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Mar 09 09:26:41 crc kubenswrapper[4792]: I0309 09:26:41.366518 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-xxwfp"] Mar 09 09:26:41 crc kubenswrapper[4792]: I0309 09:26:41.380205 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-rwg59"] Mar 09 09:26:41 crc kubenswrapper[4792]: I0309 09:26:41.397588 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-589bbb667-cpwzn"] Mar 09 09:26:41 crc kubenswrapper[4792]: I0309 09:26:41.398898 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-589bbb667-cpwzn" Mar 09 09:26:41 crc kubenswrapper[4792]: I0309 09:26:41.411968 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf438875-5be3-489e-8626-da673a088bef-combined-ca-bundle\") pod \"neutron-db-sync-k9lg4\" (UID: \"bf438875-5be3-489e-8626-da673a088bef\") " pod="openstack/neutron-db-sync-k9lg4" Mar 09 09:26:41 crc kubenswrapper[4792]: I0309 09:26:41.412150 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n7x8v\" (UniqueName: \"kubernetes.io/projected/bf438875-5be3-489e-8626-da673a088bef-kube-api-access-n7x8v\") pod \"neutron-db-sync-k9lg4\" (UID: \"bf438875-5be3-489e-8626-da673a088bef\") " pod="openstack/neutron-db-sync-k9lg4" Mar 09 09:26:41 crc kubenswrapper[4792]: I0309 09:26:41.412208 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/bf438875-5be3-489e-8626-da673a088bef-config\") pod \"neutron-db-sync-k9lg4\" (UID: \"bf438875-5be3-489e-8626-da673a088bef\") " pod="openstack/neutron-db-sync-k9lg4" Mar 09 09:26:41 crc kubenswrapper[4792]: I0309 09:26:41.419234 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-589bbb667-cpwzn"] Mar 09 09:26:41 crc kubenswrapper[4792]: I0309 09:26:41.419585 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf438875-5be3-489e-8626-da673a088bef-combined-ca-bundle\") pod \"neutron-db-sync-k9lg4\" (UID: \"bf438875-5be3-489e-8626-da673a088bef\") " pod="openstack/neutron-db-sync-k9lg4" Mar 09 09:26:41 crc kubenswrapper[4792]: I0309 09:26:41.426530 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/bf438875-5be3-489e-8626-da673a088bef-config\") pod \"neutron-db-sync-k9lg4\" (UID: 
\"bf438875-5be3-489e-8626-da673a088bef\") " pod="openstack/neutron-db-sync-k9lg4" Mar 09 09:26:41 crc kubenswrapper[4792]: I0309 09:26:41.454530 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n7x8v\" (UniqueName: \"kubernetes.io/projected/bf438875-5be3-489e-8626-da673a088bef-kube-api-access-n7x8v\") pod \"neutron-db-sync-k9lg4\" (UID: \"bf438875-5be3-489e-8626-da673a088bef\") " pod="openstack/neutron-db-sync-k9lg4" Mar 09 09:26:41 crc kubenswrapper[4792]: I0309 09:26:41.492917 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 09 09:26:41 crc kubenswrapper[4792]: I0309 09:26:41.518415 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77fab42c-43ad-48e7-bbe0-d3698e3aea96-combined-ca-bundle\") pod \"placement-db-sync-rwg59\" (UID: \"77fab42c-43ad-48e7-bbe0-d3698e3aea96\") " pod="openstack/placement-db-sync-rwg59" Mar 09 09:26:41 crc kubenswrapper[4792]: I0309 09:26:41.518474 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/27b008e2-cac2-436f-9a6e-9e6415743f97-ovsdbserver-nb\") pod \"dnsmasq-dns-589bbb667-cpwzn\" (UID: \"27b008e2-cac2-436f-9a6e-9e6415743f97\") " pod="openstack/dnsmasq-dns-589bbb667-cpwzn" Mar 09 09:26:41 crc kubenswrapper[4792]: I0309 09:26:41.518516 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f4bcq\" (UniqueName: \"kubernetes.io/projected/77fab42c-43ad-48e7-bbe0-d3698e3aea96-kube-api-access-f4bcq\") pod \"placement-db-sync-rwg59\" (UID: \"77fab42c-43ad-48e7-bbe0-d3698e3aea96\") " pod="openstack/placement-db-sync-rwg59" Mar 09 09:26:41 crc kubenswrapper[4792]: I0309 09:26:41.518546 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-9rh64\" (UniqueName: \"kubernetes.io/projected/d0e4182c-e32b-443d-85ea-0e5737c3fd1e-kube-api-access-9rh64\") pod \"barbican-db-sync-xxwfp\" (UID: \"d0e4182c-e32b-443d-85ea-0e5737c3fd1e\") " pod="openstack/barbican-db-sync-xxwfp" Mar 09 09:26:41 crc kubenswrapper[4792]: I0309 09:26:41.518582 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/27b008e2-cac2-436f-9a6e-9e6415743f97-config\") pod \"dnsmasq-dns-589bbb667-cpwzn\" (UID: \"27b008e2-cac2-436f-9a6e-9e6415743f97\") " pod="openstack/dnsmasq-dns-589bbb667-cpwzn" Mar 09 09:26:41 crc kubenswrapper[4792]: I0309 09:26:41.518622 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/77fab42c-43ad-48e7-bbe0-d3698e3aea96-config-data\") pod \"placement-db-sync-rwg59\" (UID: \"77fab42c-43ad-48e7-bbe0-d3698e3aea96\") " pod="openstack/placement-db-sync-rwg59" Mar 09 09:26:41 crc kubenswrapper[4792]: I0309 09:26:41.518690 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/27b008e2-cac2-436f-9a6e-9e6415743f97-ovsdbserver-sb\") pod \"dnsmasq-dns-589bbb667-cpwzn\" (UID: \"27b008e2-cac2-436f-9a6e-9e6415743f97\") " pod="openstack/dnsmasq-dns-589bbb667-cpwzn" Mar 09 09:26:41 crc kubenswrapper[4792]: I0309 09:26:41.518721 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/77fab42c-43ad-48e7-bbe0-d3698e3aea96-logs\") pod \"placement-db-sync-rwg59\" (UID: \"77fab42c-43ad-48e7-bbe0-d3698e3aea96\") " pod="openstack/placement-db-sync-rwg59" Mar 09 09:26:41 crc kubenswrapper[4792]: I0309 09:26:41.518753 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d0e4182c-e32b-443d-85ea-0e5737c3fd1e-db-sync-config-data\") pod \"barbican-db-sync-xxwfp\" (UID: \"d0e4182c-e32b-443d-85ea-0e5737c3fd1e\") " pod="openstack/barbican-db-sync-xxwfp" Mar 09 09:26:41 crc kubenswrapper[4792]: I0309 09:26:41.518784 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0e4182c-e32b-443d-85ea-0e5737c3fd1e-combined-ca-bundle\") pod \"barbican-db-sync-xxwfp\" (UID: \"d0e4182c-e32b-443d-85ea-0e5737c3fd1e\") " pod="openstack/barbican-db-sync-xxwfp" Mar 09 09:26:41 crc kubenswrapper[4792]: I0309 09:26:41.518821 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n542r\" (UniqueName: \"kubernetes.io/projected/27b008e2-cac2-436f-9a6e-9e6415743f97-kube-api-access-n542r\") pod \"dnsmasq-dns-589bbb667-cpwzn\" (UID: \"27b008e2-cac2-436f-9a6e-9e6415743f97\") " pod="openstack/dnsmasq-dns-589bbb667-cpwzn" Mar 09 09:26:41 crc kubenswrapper[4792]: I0309 09:26:41.518861 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/27b008e2-cac2-436f-9a6e-9e6415743f97-dns-svc\") pod \"dnsmasq-dns-589bbb667-cpwzn\" (UID: \"27b008e2-cac2-436f-9a6e-9e6415743f97\") " pod="openstack/dnsmasq-dns-589bbb667-cpwzn" Mar 09 09:26:41 crc kubenswrapper[4792]: I0309 09:26:41.518902 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/77fab42c-43ad-48e7-bbe0-d3698e3aea96-scripts\") pod \"placement-db-sync-rwg59\" (UID: \"77fab42c-43ad-48e7-bbe0-d3698e3aea96\") " pod="openstack/placement-db-sync-rwg59" Mar 09 09:26:41 crc kubenswrapper[4792]: I0309 09:26:41.531304 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-k9lg4" Mar 09 09:26:41 crc kubenswrapper[4792]: I0309 09:26:41.567135 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-gkkjx" Mar 09 09:26:41 crc kubenswrapper[4792]: I0309 09:26:41.620902 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77fab42c-43ad-48e7-bbe0-d3698e3aea96-combined-ca-bundle\") pod \"placement-db-sync-rwg59\" (UID: \"77fab42c-43ad-48e7-bbe0-d3698e3aea96\") " pod="openstack/placement-db-sync-rwg59" Mar 09 09:26:41 crc kubenswrapper[4792]: I0309 09:26:41.621274 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/27b008e2-cac2-436f-9a6e-9e6415743f97-ovsdbserver-nb\") pod \"dnsmasq-dns-589bbb667-cpwzn\" (UID: \"27b008e2-cac2-436f-9a6e-9e6415743f97\") " pod="openstack/dnsmasq-dns-589bbb667-cpwzn" Mar 09 09:26:41 crc kubenswrapper[4792]: I0309 09:26:41.621304 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f4bcq\" (UniqueName: \"kubernetes.io/projected/77fab42c-43ad-48e7-bbe0-d3698e3aea96-kube-api-access-f4bcq\") pod \"placement-db-sync-rwg59\" (UID: \"77fab42c-43ad-48e7-bbe0-d3698e3aea96\") " pod="openstack/placement-db-sync-rwg59" Mar 09 09:26:41 crc kubenswrapper[4792]: I0309 09:26:41.621324 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9rh64\" (UniqueName: \"kubernetes.io/projected/d0e4182c-e32b-443d-85ea-0e5737c3fd1e-kube-api-access-9rh64\") pod \"barbican-db-sync-xxwfp\" (UID: \"d0e4182c-e32b-443d-85ea-0e5737c3fd1e\") " pod="openstack/barbican-db-sync-xxwfp" Mar 09 09:26:41 crc kubenswrapper[4792]: I0309 09:26:41.621346 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/27b008e2-cac2-436f-9a6e-9e6415743f97-config\") pod \"dnsmasq-dns-589bbb667-cpwzn\" (UID: \"27b008e2-cac2-436f-9a6e-9e6415743f97\") " pod="openstack/dnsmasq-dns-589bbb667-cpwzn" Mar 09 09:26:41 crc kubenswrapper[4792]: I0309 09:26:41.621372 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/77fab42c-43ad-48e7-bbe0-d3698e3aea96-config-data\") pod \"placement-db-sync-rwg59\" (UID: \"77fab42c-43ad-48e7-bbe0-d3698e3aea96\") " pod="openstack/placement-db-sync-rwg59" Mar 09 09:26:41 crc kubenswrapper[4792]: I0309 09:26:41.621398 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/27b008e2-cac2-436f-9a6e-9e6415743f97-ovsdbserver-sb\") pod \"dnsmasq-dns-589bbb667-cpwzn\" (UID: \"27b008e2-cac2-436f-9a6e-9e6415743f97\") " pod="openstack/dnsmasq-dns-589bbb667-cpwzn" Mar 09 09:26:41 crc kubenswrapper[4792]: I0309 09:26:41.621419 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/77fab42c-43ad-48e7-bbe0-d3698e3aea96-logs\") pod \"placement-db-sync-rwg59\" (UID: \"77fab42c-43ad-48e7-bbe0-d3698e3aea96\") " pod="openstack/placement-db-sync-rwg59" Mar 09 09:26:41 crc kubenswrapper[4792]: I0309 09:26:41.621445 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d0e4182c-e32b-443d-85ea-0e5737c3fd1e-db-sync-config-data\") pod \"barbican-db-sync-xxwfp\" (UID: \"d0e4182c-e32b-443d-85ea-0e5737c3fd1e\") " pod="openstack/barbican-db-sync-xxwfp" Mar 09 09:26:41 crc kubenswrapper[4792]: I0309 09:26:41.621466 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0e4182c-e32b-443d-85ea-0e5737c3fd1e-combined-ca-bundle\") pod \"barbican-db-sync-xxwfp\" 
(UID: \"d0e4182c-e32b-443d-85ea-0e5737c3fd1e\") " pod="openstack/barbican-db-sync-xxwfp" Mar 09 09:26:41 crc kubenswrapper[4792]: I0309 09:26:41.621489 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n542r\" (UniqueName: \"kubernetes.io/projected/27b008e2-cac2-436f-9a6e-9e6415743f97-kube-api-access-n542r\") pod \"dnsmasq-dns-589bbb667-cpwzn\" (UID: \"27b008e2-cac2-436f-9a6e-9e6415743f97\") " pod="openstack/dnsmasq-dns-589bbb667-cpwzn" Mar 09 09:26:41 crc kubenswrapper[4792]: I0309 09:26:41.621511 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/27b008e2-cac2-436f-9a6e-9e6415743f97-dns-svc\") pod \"dnsmasq-dns-589bbb667-cpwzn\" (UID: \"27b008e2-cac2-436f-9a6e-9e6415743f97\") " pod="openstack/dnsmasq-dns-589bbb667-cpwzn" Mar 09 09:26:41 crc kubenswrapper[4792]: I0309 09:26:41.621540 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/77fab42c-43ad-48e7-bbe0-d3698e3aea96-scripts\") pod \"placement-db-sync-rwg59\" (UID: \"77fab42c-43ad-48e7-bbe0-d3698e3aea96\") " pod="openstack/placement-db-sync-rwg59" Mar 09 09:26:41 crc kubenswrapper[4792]: I0309 09:26:41.622918 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/27b008e2-cac2-436f-9a6e-9e6415743f97-ovsdbserver-sb\") pod \"dnsmasq-dns-589bbb667-cpwzn\" (UID: \"27b008e2-cac2-436f-9a6e-9e6415743f97\") " pod="openstack/dnsmasq-dns-589bbb667-cpwzn" Mar 09 09:26:41 crc kubenswrapper[4792]: I0309 09:26:41.623599 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/27b008e2-cac2-436f-9a6e-9e6415743f97-config\") pod \"dnsmasq-dns-589bbb667-cpwzn\" (UID: \"27b008e2-cac2-436f-9a6e-9e6415743f97\") " pod="openstack/dnsmasq-dns-589bbb667-cpwzn" Mar 09 09:26:41 crc 
kubenswrapper[4792]: I0309 09:26:41.624063 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/27b008e2-cac2-436f-9a6e-9e6415743f97-ovsdbserver-nb\") pod \"dnsmasq-dns-589bbb667-cpwzn\" (UID: \"27b008e2-cac2-436f-9a6e-9e6415743f97\") " pod="openstack/dnsmasq-dns-589bbb667-cpwzn" Mar 09 09:26:41 crc kubenswrapper[4792]: I0309 09:26:41.624763 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/27b008e2-cac2-436f-9a6e-9e6415743f97-dns-svc\") pod \"dnsmasq-dns-589bbb667-cpwzn\" (UID: \"27b008e2-cac2-436f-9a6e-9e6415743f97\") " pod="openstack/dnsmasq-dns-589bbb667-cpwzn" Mar 09 09:26:41 crc kubenswrapper[4792]: I0309 09:26:41.625264 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/77fab42c-43ad-48e7-bbe0-d3698e3aea96-logs\") pod \"placement-db-sync-rwg59\" (UID: \"77fab42c-43ad-48e7-bbe0-d3698e3aea96\") " pod="openstack/placement-db-sync-rwg59" Mar 09 09:26:41 crc kubenswrapper[4792]: I0309 09:26:41.632913 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/77fab42c-43ad-48e7-bbe0-d3698e3aea96-scripts\") pod \"placement-db-sync-rwg59\" (UID: \"77fab42c-43ad-48e7-bbe0-d3698e3aea96\") " pod="openstack/placement-db-sync-rwg59" Mar 09 09:26:41 crc kubenswrapper[4792]: I0309 09:26:41.633364 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d0e4182c-e32b-443d-85ea-0e5737c3fd1e-db-sync-config-data\") pod \"barbican-db-sync-xxwfp\" (UID: \"d0e4182c-e32b-443d-85ea-0e5737c3fd1e\") " pod="openstack/barbican-db-sync-xxwfp" Mar 09 09:26:41 crc kubenswrapper[4792]: I0309 09:26:41.633626 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/77fab42c-43ad-48e7-bbe0-d3698e3aea96-config-data\") pod \"placement-db-sync-rwg59\" (UID: \"77fab42c-43ad-48e7-bbe0-d3698e3aea96\") " pod="openstack/placement-db-sync-rwg59" Mar 09 09:26:41 crc kubenswrapper[4792]: I0309 09:26:41.643970 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f4bcq\" (UniqueName: \"kubernetes.io/projected/77fab42c-43ad-48e7-bbe0-d3698e3aea96-kube-api-access-f4bcq\") pod \"placement-db-sync-rwg59\" (UID: \"77fab42c-43ad-48e7-bbe0-d3698e3aea96\") " pod="openstack/placement-db-sync-rwg59" Mar 09 09:26:41 crc kubenswrapper[4792]: I0309 09:26:41.644645 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0e4182c-e32b-443d-85ea-0e5737c3fd1e-combined-ca-bundle\") pod \"barbican-db-sync-xxwfp\" (UID: \"d0e4182c-e32b-443d-85ea-0e5737c3fd1e\") " pod="openstack/barbican-db-sync-xxwfp" Mar 09 09:26:41 crc kubenswrapper[4792]: I0309 09:26:41.644657 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n542r\" (UniqueName: \"kubernetes.io/projected/27b008e2-cac2-436f-9a6e-9e6415743f97-kube-api-access-n542r\") pod \"dnsmasq-dns-589bbb667-cpwzn\" (UID: \"27b008e2-cac2-436f-9a6e-9e6415743f97\") " pod="openstack/dnsmasq-dns-589bbb667-cpwzn" Mar 09 09:26:41 crc kubenswrapper[4792]: I0309 09:26:41.648703 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77fab42c-43ad-48e7-bbe0-d3698e3aea96-combined-ca-bundle\") pod \"placement-db-sync-rwg59\" (UID: \"77fab42c-43ad-48e7-bbe0-d3698e3aea96\") " pod="openstack/placement-db-sync-rwg59" Mar 09 09:26:41 crc kubenswrapper[4792]: I0309 09:26:41.651429 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9rh64\" (UniqueName: \"kubernetes.io/projected/d0e4182c-e32b-443d-85ea-0e5737c3fd1e-kube-api-access-9rh64\") pod 
\"barbican-db-sync-xxwfp\" (UID: \"d0e4182c-e32b-443d-85ea-0e5737c3fd1e\") " pod="openstack/barbican-db-sync-xxwfp" Mar 09 09:26:41 crc kubenswrapper[4792]: I0309 09:26:41.692738 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-rwg59" Mar 09 09:26:41 crc kubenswrapper[4792]: I0309 09:26:41.736793 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-589bbb667-cpwzn" Mar 09 09:26:41 crc kubenswrapper[4792]: I0309 09:26:41.748672 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7796644bc-cqms6"] Mar 09 09:26:41 crc kubenswrapper[4792]: I0309 09:26:41.942332 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-xxwfp" Mar 09 09:26:41 crc kubenswrapper[4792]: I0309 09:26:41.944936 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-kkglm"] Mar 09 09:26:42 crc kubenswrapper[4792]: I0309 09:26:42.265518 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7796644bc-cqms6" event={"ID":"75224229-25dc-4546-a8b1-9eb3185a8d09","Type":"ContainerStarted","Data":"4a1f6846c0acc74a7edf9be124ba07ceb68c597d7904102b24fe04ff96f822db"} Mar 09 09:26:42 crc kubenswrapper[4792]: I0309 09:26:42.303788 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-kkglm" event={"ID":"801f33bc-ee66-4276-baf0-d374120ed464","Type":"ContainerStarted","Data":"eeda2d82067191b3ea5bdd7b60efb947b56ed2348ea33cdc9b72b7bf629b93b9"} Mar 09 09:26:42 crc kubenswrapper[4792]: I0309 09:26:42.322136 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-k9lg4"] Mar 09 09:26:42 crc kubenswrapper[4792]: I0309 09:26:42.349402 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 09 09:26:42 crc kubenswrapper[4792]: I0309 09:26:42.413160 4792 provider.go:102] 
Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 09 09:26:42 crc kubenswrapper[4792]: I0309 09:26:42.677155 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-589bbb667-cpwzn"] Mar 09 09:26:42 crc kubenswrapper[4792]: I0309 09:26:42.788342 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-rwg59"] Mar 09 09:26:42 crc kubenswrapper[4792]: I0309 09:26:42.837037 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-gkkjx"] Mar 09 09:26:42 crc kubenswrapper[4792]: I0309 09:26:42.989972 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-xxwfp"] Mar 09 09:26:43 crc kubenswrapper[4792]: W0309 09:26:43.017967 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd0e4182c_e32b_443d_85ea_0e5737c3fd1e.slice/crio-ea95dfc4a96c29327dd0df03fa1af617d2384dc8d6c39354c570a4138a4a19d0 WatchSource:0}: Error finding container ea95dfc4a96c29327dd0df03fa1af617d2384dc8d6c39354c570a4138a4a19d0: Status 404 returned error can't find the container with id ea95dfc4a96c29327dd0df03fa1af617d2384dc8d6c39354c570a4138a4a19d0 Mar 09 09:26:43 crc kubenswrapper[4792]: I0309 09:26:43.214539 4792 patch_prober.go:28] interesting pod/machine-config-daemon-97tth container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 09:26:43 crc kubenswrapper[4792]: I0309 09:26:43.214623 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Mar 09 09:26:43 crc kubenswrapper[4792]: I0309 09:26:43.290988 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 09 09:26:43 crc kubenswrapper[4792]: I0309 09:26:43.315831 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9cee33f2-4999-4736-965e-3f5eae090e14","Type":"ContainerStarted","Data":"a7d0b6214e676557bf8cda17343c9dc5066cea995ab1e7a782015bb17a3d782d"} Mar 09 09:26:43 crc kubenswrapper[4792]: I0309 09:26:43.324159 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-xxwfp" event={"ID":"d0e4182c-e32b-443d-85ea-0e5737c3fd1e","Type":"ContainerStarted","Data":"ea95dfc4a96c29327dd0df03fa1af617d2384dc8d6c39354c570a4138a4a19d0"} Mar 09 09:26:43 crc kubenswrapper[4792]: I0309 09:26:43.328321 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-rwg59" event={"ID":"77fab42c-43ad-48e7-bbe0-d3698e3aea96","Type":"ContainerStarted","Data":"a1e5ccc03d6d9c630475dab94327b1be2f378f42924d41eeca616379af7e34ee"} Mar 09 09:26:43 crc kubenswrapper[4792]: I0309 09:26:43.331253 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-k9lg4" event={"ID":"bf438875-5be3-489e-8626-da673a088bef","Type":"ContainerStarted","Data":"61a61a611bc9aa8ff5b2a1f63fad4a53679b2714420f1ba9bcab5a31648848a4"} Mar 09 09:26:43 crc kubenswrapper[4792]: I0309 09:26:43.331311 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-k9lg4" event={"ID":"bf438875-5be3-489e-8626-da673a088bef","Type":"ContainerStarted","Data":"df691a7c6767be2f0dd8dd63a690938bd797e87c1000d051688568f23aa09129"} Mar 09 09:26:43 crc kubenswrapper[4792]: I0309 09:26:43.340481 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-kkglm" 
event={"ID":"801f33bc-ee66-4276-baf0-d374120ed464","Type":"ContainerStarted","Data":"ac6c9779f534b4a3049af4bbb6f14ce0c56b7f97c973afb2a2e643eeefe7c69d"} Mar 09 09:26:43 crc kubenswrapper[4792]: I0309 09:26:43.343231 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-gkkjx" event={"ID":"6f4bcabb-7f34-423b-a653-bd785eba0978","Type":"ContainerStarted","Data":"98804b54ab5b27f2e5c42cbb7b9294079232fb0e3a93fa03dcc99decffd5149e"} Mar 09 09:26:43 crc kubenswrapper[4792]: I0309 09:26:43.345546 4792 generic.go:334] "Generic (PLEG): container finished" podID="27b008e2-cac2-436f-9a6e-9e6415743f97" containerID="226c2c0d091990acf5a05d9f32bbe23603796d24fcbbc14e29dc58fbea2d5407" exitCode=0 Mar 09 09:26:43 crc kubenswrapper[4792]: I0309 09:26:43.345624 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-589bbb667-cpwzn" event={"ID":"27b008e2-cac2-436f-9a6e-9e6415743f97","Type":"ContainerDied","Data":"226c2c0d091990acf5a05d9f32bbe23603796d24fcbbc14e29dc58fbea2d5407"} Mar 09 09:26:43 crc kubenswrapper[4792]: I0309 09:26:43.345647 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-589bbb667-cpwzn" event={"ID":"27b008e2-cac2-436f-9a6e-9e6415743f97","Type":"ContainerStarted","Data":"10db91051169a792b3ade40e82bdd70fcec59a477c2266c5beae12a084c1fcef"} Mar 09 09:26:43 crc kubenswrapper[4792]: I0309 09:26:43.356377 4792 generic.go:334] "Generic (PLEG): container finished" podID="75224229-25dc-4546-a8b1-9eb3185a8d09" containerID="82013f97b51b8a9a94893c4f4559d31b5b17dde8119f9a6aebf09ca2eba0e35a" exitCode=0 Mar 09 09:26:43 crc kubenswrapper[4792]: I0309 09:26:43.356577 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7796644bc-cqms6" event={"ID":"75224229-25dc-4546-a8b1-9eb3185a8d09","Type":"ContainerDied","Data":"82013f97b51b8a9a94893c4f4559d31b5b17dde8119f9a6aebf09ca2eba0e35a"} Mar 09 09:26:43 crc kubenswrapper[4792]: I0309 09:26:43.380900 4792 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-k9lg4" podStartSLOduration=3.380878569 podStartE2EDuration="3.380878569s" podCreationTimestamp="2026-03-09 09:26:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:26:43.355403959 +0000 UTC m=+1168.385604711" watchObservedRunningTime="2026-03-09 09:26:43.380878569 +0000 UTC m=+1168.411079321" Mar 09 09:26:43 crc kubenswrapper[4792]: I0309 09:26:43.424450 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-kkglm" podStartSLOduration=3.424420133 podStartE2EDuration="3.424420133s" podCreationTimestamp="2026-03-09 09:26:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:26:43.419054617 +0000 UTC m=+1168.449255369" watchObservedRunningTime="2026-03-09 09:26:43.424420133 +0000 UTC m=+1168.454620885" Mar 09 09:26:43 crc kubenswrapper[4792]: I0309 09:26:43.937630 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7796644bc-cqms6" Mar 09 09:26:44 crc kubenswrapper[4792]: I0309 09:26:44.011510 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/75224229-25dc-4546-a8b1-9eb3185a8d09-ovsdbserver-nb\") pod \"75224229-25dc-4546-a8b1-9eb3185a8d09\" (UID: \"75224229-25dc-4546-a8b1-9eb3185a8d09\") " Mar 09 09:26:44 crc kubenswrapper[4792]: I0309 09:26:44.011572 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/75224229-25dc-4546-a8b1-9eb3185a8d09-dns-svc\") pod \"75224229-25dc-4546-a8b1-9eb3185a8d09\" (UID: \"75224229-25dc-4546-a8b1-9eb3185a8d09\") " Mar 09 09:26:44 crc kubenswrapper[4792]: I0309 09:26:44.011734 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tckj8\" (UniqueName: \"kubernetes.io/projected/75224229-25dc-4546-a8b1-9eb3185a8d09-kube-api-access-tckj8\") pod \"75224229-25dc-4546-a8b1-9eb3185a8d09\" (UID: \"75224229-25dc-4546-a8b1-9eb3185a8d09\") " Mar 09 09:26:44 crc kubenswrapper[4792]: I0309 09:26:44.011797 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/75224229-25dc-4546-a8b1-9eb3185a8d09-config\") pod \"75224229-25dc-4546-a8b1-9eb3185a8d09\" (UID: \"75224229-25dc-4546-a8b1-9eb3185a8d09\") " Mar 09 09:26:44 crc kubenswrapper[4792]: I0309 09:26:44.011861 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/75224229-25dc-4546-a8b1-9eb3185a8d09-ovsdbserver-sb\") pod \"75224229-25dc-4546-a8b1-9eb3185a8d09\" (UID: \"75224229-25dc-4546-a8b1-9eb3185a8d09\") " Mar 09 09:26:44 crc kubenswrapper[4792]: I0309 09:26:44.026358 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/75224229-25dc-4546-a8b1-9eb3185a8d09-kube-api-access-tckj8" (OuterVolumeSpecName: "kube-api-access-tckj8") pod "75224229-25dc-4546-a8b1-9eb3185a8d09" (UID: "75224229-25dc-4546-a8b1-9eb3185a8d09"). InnerVolumeSpecName "kube-api-access-tckj8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:26:44 crc kubenswrapper[4792]: I0309 09:26:44.057929 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/75224229-25dc-4546-a8b1-9eb3185a8d09-config" (OuterVolumeSpecName: "config") pod "75224229-25dc-4546-a8b1-9eb3185a8d09" (UID: "75224229-25dc-4546-a8b1-9eb3185a8d09"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:26:44 crc kubenswrapper[4792]: I0309 09:26:44.083536 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/75224229-25dc-4546-a8b1-9eb3185a8d09-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "75224229-25dc-4546-a8b1-9eb3185a8d09" (UID: "75224229-25dc-4546-a8b1-9eb3185a8d09"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:26:44 crc kubenswrapper[4792]: I0309 09:26:44.092766 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/75224229-25dc-4546-a8b1-9eb3185a8d09-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "75224229-25dc-4546-a8b1-9eb3185a8d09" (UID: "75224229-25dc-4546-a8b1-9eb3185a8d09"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:26:44 crc kubenswrapper[4792]: I0309 09:26:44.104570 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/75224229-25dc-4546-a8b1-9eb3185a8d09-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "75224229-25dc-4546-a8b1-9eb3185a8d09" (UID: "75224229-25dc-4546-a8b1-9eb3185a8d09"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:26:44 crc kubenswrapper[4792]: I0309 09:26:44.114355 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tckj8\" (UniqueName: \"kubernetes.io/projected/75224229-25dc-4546-a8b1-9eb3185a8d09-kube-api-access-tckj8\") on node \"crc\" DevicePath \"\"" Mar 09 09:26:44 crc kubenswrapper[4792]: I0309 09:26:44.114403 4792 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/75224229-25dc-4546-a8b1-9eb3185a8d09-config\") on node \"crc\" DevicePath \"\"" Mar 09 09:26:44 crc kubenswrapper[4792]: I0309 09:26:44.114416 4792 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/75224229-25dc-4546-a8b1-9eb3185a8d09-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 09 09:26:44 crc kubenswrapper[4792]: I0309 09:26:44.114427 4792 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/75224229-25dc-4546-a8b1-9eb3185a8d09-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 09 09:26:44 crc kubenswrapper[4792]: I0309 09:26:44.114438 4792 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/75224229-25dc-4546-a8b1-9eb3185a8d09-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 09 09:26:44 crc kubenswrapper[4792]: I0309 09:26:44.373748 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-589bbb667-cpwzn" event={"ID":"27b008e2-cac2-436f-9a6e-9e6415743f97","Type":"ContainerStarted","Data":"f066ceb81069fd0fbc810a5d1352dd8740d5909b2c8b4ba1fc8cc1f9104f593a"} Mar 09 09:26:44 crc kubenswrapper[4792]: I0309 09:26:44.373891 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-589bbb667-cpwzn" Mar 09 09:26:44 crc kubenswrapper[4792]: I0309 09:26:44.376352 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-7796644bc-cqms6" event={"ID":"75224229-25dc-4546-a8b1-9eb3185a8d09","Type":"ContainerDied","Data":"4a1f6846c0acc74a7edf9be124ba07ceb68c597d7904102b24fe04ff96f822db"} Mar 09 09:26:44 crc kubenswrapper[4792]: I0309 09:26:44.376401 4792 scope.go:117] "RemoveContainer" containerID="82013f97b51b8a9a94893c4f4559d31b5b17dde8119f9a6aebf09ca2eba0e35a" Mar 09 09:26:44 crc kubenswrapper[4792]: I0309 09:26:44.376552 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7796644bc-cqms6" Mar 09 09:26:44 crc kubenswrapper[4792]: I0309 09:26:44.407215 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-589bbb667-cpwzn" podStartSLOduration=3.407198635 podStartE2EDuration="3.407198635s" podCreationTimestamp="2026-03-09 09:26:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:26:44.403619251 +0000 UTC m=+1169.433820003" watchObservedRunningTime="2026-03-09 09:26:44.407198635 +0000 UTC m=+1169.437399387" Mar 09 09:26:44 crc kubenswrapper[4792]: I0309 09:26:44.503947 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7796644bc-cqms6"] Mar 09 09:26:44 crc kubenswrapper[4792]: I0309 09:26:44.531508 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7796644bc-cqms6"] Mar 09 09:26:45 crc kubenswrapper[4792]: I0309 09:26:45.676777 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="75224229-25dc-4546-a8b1-9eb3185a8d09" path="/var/lib/kubelet/pods/75224229-25dc-4546-a8b1-9eb3185a8d09/volumes" Mar 09 09:26:47 crc kubenswrapper[4792]: I0309 09:26:47.411011 4792 generic.go:334] "Generic (PLEG): container finished" podID="801f33bc-ee66-4276-baf0-d374120ed464" containerID="ac6c9779f534b4a3049af4bbb6f14ce0c56b7f97c973afb2a2e643eeefe7c69d" exitCode=0 Mar 09 09:26:47 crc 
kubenswrapper[4792]: I0309 09:26:47.411102 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-kkglm" event={"ID":"801f33bc-ee66-4276-baf0-d374120ed464","Type":"ContainerDied","Data":"ac6c9779f534b4a3049af4bbb6f14ce0c56b7f97c973afb2a2e643eeefe7c69d"} Mar 09 09:26:51 crc kubenswrapper[4792]: I0309 09:26:51.739758 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-589bbb667-cpwzn" Mar 09 09:26:51 crc kubenswrapper[4792]: I0309 09:26:51.798782 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6958f867f9-hkbdp"] Mar 09 09:26:51 crc kubenswrapper[4792]: I0309 09:26:51.799088 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6958f867f9-hkbdp" podUID="62d7f343-f742-4613-83b6-a4baf01d94c7" containerName="dnsmasq-dns" containerID="cri-o://c38b09a235359396f3b9abbafbb9d101aa2bc7a0898468ee23e944a8679c30e0" gracePeriod=10 Mar 09 09:26:52 crc kubenswrapper[4792]: I0309 09:26:52.466791 4792 generic.go:334] "Generic (PLEG): container finished" podID="62d7f343-f742-4613-83b6-a4baf01d94c7" containerID="c38b09a235359396f3b9abbafbb9d101aa2bc7a0898468ee23e944a8679c30e0" exitCode=0 Mar 09 09:26:52 crc kubenswrapper[4792]: I0309 09:26:52.466977 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6958f867f9-hkbdp" event={"ID":"62d7f343-f742-4613-83b6-a4baf01d94c7","Type":"ContainerDied","Data":"c38b09a235359396f3b9abbafbb9d101aa2bc7a0898468ee23e944a8679c30e0"} Mar 09 09:26:52 crc kubenswrapper[4792]: I0309 09:26:52.652706 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-6958f867f9-hkbdp" podUID="62d7f343-f742-4613-83b6-a4baf01d94c7" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.125:5353: connect: connection refused" Mar 09 09:26:57 crc kubenswrapper[4792]: I0309 09:26:57.653108 4792 prober.go:107] "Probe failed" 
probeType="Readiness" pod="openstack/dnsmasq-dns-6958f867f9-hkbdp" podUID="62d7f343-f742-4613-83b6-a4baf01d94c7" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.125:5353: connect: connection refused" Mar 09 09:27:02 crc kubenswrapper[4792]: I0309 09:27:02.652290 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-6958f867f9-hkbdp" podUID="62d7f343-f742-4613-83b6-a4baf01d94c7" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.125:5353: connect: connection refused" Mar 09 09:27:02 crc kubenswrapper[4792]: I0309 09:27:02.652904 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6958f867f9-hkbdp" Mar 09 09:27:03 crc kubenswrapper[4792]: I0309 09:27:03.479646 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-kkglm" Mar 09 09:27:03 crc kubenswrapper[4792]: I0309 09:27:03.584175 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/801f33bc-ee66-4276-baf0-d374120ed464-credential-keys\") pod \"801f33bc-ee66-4276-baf0-d374120ed464\" (UID: \"801f33bc-ee66-4276-baf0-d374120ed464\") " Mar 09 09:27:03 crc kubenswrapper[4792]: I0309 09:27:03.584224 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/801f33bc-ee66-4276-baf0-d374120ed464-scripts\") pod \"801f33bc-ee66-4276-baf0-d374120ed464\" (UID: \"801f33bc-ee66-4276-baf0-d374120ed464\") " Mar 09 09:27:03 crc kubenswrapper[4792]: I0309 09:27:03.584261 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/801f33bc-ee66-4276-baf0-d374120ed464-fernet-keys\") pod \"801f33bc-ee66-4276-baf0-d374120ed464\" (UID: \"801f33bc-ee66-4276-baf0-d374120ed464\") " Mar 09 09:27:03 crc kubenswrapper[4792]: I0309 
09:27:03.584357 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/801f33bc-ee66-4276-baf0-d374120ed464-config-data\") pod \"801f33bc-ee66-4276-baf0-d374120ed464\" (UID: \"801f33bc-ee66-4276-baf0-d374120ed464\") " Mar 09 09:27:03 crc kubenswrapper[4792]: I0309 09:27:03.584435 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/801f33bc-ee66-4276-baf0-d374120ed464-combined-ca-bundle\") pod \"801f33bc-ee66-4276-baf0-d374120ed464\" (UID: \"801f33bc-ee66-4276-baf0-d374120ed464\") " Mar 09 09:27:03 crc kubenswrapper[4792]: I0309 09:27:03.584454 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hncvt\" (UniqueName: \"kubernetes.io/projected/801f33bc-ee66-4276-baf0-d374120ed464-kube-api-access-hncvt\") pod \"801f33bc-ee66-4276-baf0-d374120ed464\" (UID: \"801f33bc-ee66-4276-baf0-d374120ed464\") " Mar 09 09:27:03 crc kubenswrapper[4792]: I0309 09:27:03.593531 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/801f33bc-ee66-4276-baf0-d374120ed464-scripts" (OuterVolumeSpecName: "scripts") pod "801f33bc-ee66-4276-baf0-d374120ed464" (UID: "801f33bc-ee66-4276-baf0-d374120ed464"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:27:03 crc kubenswrapper[4792]: I0309 09:27:03.594621 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/801f33bc-ee66-4276-baf0-d374120ed464-kube-api-access-hncvt" (OuterVolumeSpecName: "kube-api-access-hncvt") pod "801f33bc-ee66-4276-baf0-d374120ed464" (UID: "801f33bc-ee66-4276-baf0-d374120ed464"). InnerVolumeSpecName "kube-api-access-hncvt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:27:03 crc kubenswrapper[4792]: I0309 09:27:03.596987 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/801f33bc-ee66-4276-baf0-d374120ed464-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "801f33bc-ee66-4276-baf0-d374120ed464" (UID: "801f33bc-ee66-4276-baf0-d374120ed464"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:27:03 crc kubenswrapper[4792]: I0309 09:27:03.602417 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-kkglm" event={"ID":"801f33bc-ee66-4276-baf0-d374120ed464","Type":"ContainerDied","Data":"eeda2d82067191b3ea5bdd7b60efb947b56ed2348ea33cdc9b72b7bf629b93b9"} Mar 09 09:27:03 crc kubenswrapper[4792]: I0309 09:27:03.602461 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eeda2d82067191b3ea5bdd7b60efb947b56ed2348ea33cdc9b72b7bf629b93b9" Mar 09 09:27:03 crc kubenswrapper[4792]: I0309 09:27:03.604216 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-kkglm" Mar 09 09:27:03 crc kubenswrapper[4792]: I0309 09:27:03.623164 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/801f33bc-ee66-4276-baf0-d374120ed464-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "801f33bc-ee66-4276-baf0-d374120ed464" (UID: "801f33bc-ee66-4276-baf0-d374120ed464"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:27:03 crc kubenswrapper[4792]: I0309 09:27:03.641746 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/801f33bc-ee66-4276-baf0-d374120ed464-config-data" (OuterVolumeSpecName: "config-data") pod "801f33bc-ee66-4276-baf0-d374120ed464" (UID: "801f33bc-ee66-4276-baf0-d374120ed464"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:27:03 crc kubenswrapper[4792]: I0309 09:27:03.655641 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/801f33bc-ee66-4276-baf0-d374120ed464-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "801f33bc-ee66-4276-baf0-d374120ed464" (UID: "801f33bc-ee66-4276-baf0-d374120ed464"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:27:03 crc kubenswrapper[4792]: I0309 09:27:03.687110 4792 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/801f33bc-ee66-4276-baf0-d374120ed464-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 09 09:27:03 crc kubenswrapper[4792]: I0309 09:27:03.687148 4792 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/801f33bc-ee66-4276-baf0-d374120ed464-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 09:27:03 crc kubenswrapper[4792]: I0309 09:27:03.687159 4792 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/801f33bc-ee66-4276-baf0-d374120ed464-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 09 09:27:03 crc kubenswrapper[4792]: I0309 09:27:03.687169 4792 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/801f33bc-ee66-4276-baf0-d374120ed464-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 09:27:03 crc kubenswrapper[4792]: I0309 09:27:03.687181 4792 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/801f33bc-ee66-4276-baf0-d374120ed464-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 09:27:03 crc kubenswrapper[4792]: I0309 09:27:03.687194 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hncvt\" 
(UniqueName: \"kubernetes.io/projected/801f33bc-ee66-4276-baf0-d374120ed464-kube-api-access-hncvt\") on node \"crc\" DevicePath \"\"" Mar 09 09:27:04 crc kubenswrapper[4792]: I0309 09:27:04.578417 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-kkglm"] Mar 09 09:27:04 crc kubenswrapper[4792]: I0309 09:27:04.586675 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-kkglm"] Mar 09 09:27:04 crc kubenswrapper[4792]: I0309 09:27:04.686190 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-fcfwq"] Mar 09 09:27:04 crc kubenswrapper[4792]: E0309 09:27:04.686543 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75224229-25dc-4546-a8b1-9eb3185a8d09" containerName="init" Mar 09 09:27:04 crc kubenswrapper[4792]: I0309 09:27:04.686557 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="75224229-25dc-4546-a8b1-9eb3185a8d09" containerName="init" Mar 09 09:27:04 crc kubenswrapper[4792]: E0309 09:27:04.686570 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="801f33bc-ee66-4276-baf0-d374120ed464" containerName="keystone-bootstrap" Mar 09 09:27:04 crc kubenswrapper[4792]: I0309 09:27:04.686577 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="801f33bc-ee66-4276-baf0-d374120ed464" containerName="keystone-bootstrap" Mar 09 09:27:04 crc kubenswrapper[4792]: I0309 09:27:04.686721 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="75224229-25dc-4546-a8b1-9eb3185a8d09" containerName="init" Mar 09 09:27:04 crc kubenswrapper[4792]: I0309 09:27:04.686739 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="801f33bc-ee66-4276-baf0-d374120ed464" containerName="keystone-bootstrap" Mar 09 09:27:04 crc kubenswrapper[4792]: I0309 09:27:04.687292 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-fcfwq" Mar 09 09:27:04 crc kubenswrapper[4792]: I0309 09:27:04.694613 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-cgfmm" Mar 09 09:27:04 crc kubenswrapper[4792]: I0309 09:27:04.695105 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 09 09:27:04 crc kubenswrapper[4792]: I0309 09:27:04.695367 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 09 09:27:04 crc kubenswrapper[4792]: I0309 09:27:04.695533 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 09 09:27:04 crc kubenswrapper[4792]: I0309 09:27:04.695687 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 09 09:27:04 crc kubenswrapper[4792]: I0309 09:27:04.707584 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-fcfwq"] Mar 09 09:27:04 crc kubenswrapper[4792]: I0309 09:27:04.719726 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wtsfc\" (UniqueName: \"kubernetes.io/projected/90f2a896-0e4a-4a08-b6be-16ba96a75fcf-kube-api-access-wtsfc\") pod \"keystone-bootstrap-fcfwq\" (UID: \"90f2a896-0e4a-4a08-b6be-16ba96a75fcf\") " pod="openstack/keystone-bootstrap-fcfwq" Mar 09 09:27:04 crc kubenswrapper[4792]: I0309 09:27:04.719780 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90f2a896-0e4a-4a08-b6be-16ba96a75fcf-combined-ca-bundle\") pod \"keystone-bootstrap-fcfwq\" (UID: \"90f2a896-0e4a-4a08-b6be-16ba96a75fcf\") " pod="openstack/keystone-bootstrap-fcfwq" Mar 09 09:27:04 crc kubenswrapper[4792]: I0309 09:27:04.719852 4792 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/90f2a896-0e4a-4a08-b6be-16ba96a75fcf-credential-keys\") pod \"keystone-bootstrap-fcfwq\" (UID: \"90f2a896-0e4a-4a08-b6be-16ba96a75fcf\") " pod="openstack/keystone-bootstrap-fcfwq" Mar 09 09:27:04 crc kubenswrapper[4792]: I0309 09:27:04.719889 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90f2a896-0e4a-4a08-b6be-16ba96a75fcf-config-data\") pod \"keystone-bootstrap-fcfwq\" (UID: \"90f2a896-0e4a-4a08-b6be-16ba96a75fcf\") " pod="openstack/keystone-bootstrap-fcfwq" Mar 09 09:27:04 crc kubenswrapper[4792]: I0309 09:27:04.719953 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/90f2a896-0e4a-4a08-b6be-16ba96a75fcf-scripts\") pod \"keystone-bootstrap-fcfwq\" (UID: \"90f2a896-0e4a-4a08-b6be-16ba96a75fcf\") " pod="openstack/keystone-bootstrap-fcfwq" Mar 09 09:27:04 crc kubenswrapper[4792]: I0309 09:27:04.720023 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/90f2a896-0e4a-4a08-b6be-16ba96a75fcf-fernet-keys\") pod \"keystone-bootstrap-fcfwq\" (UID: \"90f2a896-0e4a-4a08-b6be-16ba96a75fcf\") " pod="openstack/keystone-bootstrap-fcfwq" Mar 09 09:27:04 crc kubenswrapper[4792]: E0309 09:27:04.782122 4792 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:44ed1ca84e17bd0f004cfbdc3c0827d767daba52abb8e83e076bfd0e6c02f838" Mar 09 09:27:04 crc kubenswrapper[4792]: E0309 09:27:04.782578 4792 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:44ed1ca84e17bd0f004cfbdc3c0827d767daba52abb8e83e076bfd0e6c02f838,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qc6wl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Ca
pabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-gkkjx_openstack(6f4bcabb-7f34-423b-a653-bd785eba0978): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 09 09:27:04 crc kubenswrapper[4792]: E0309 09:27:04.784500 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-gkkjx" podUID="6f4bcabb-7f34-423b-a653-bd785eba0978" Mar 09 09:27:04 crc kubenswrapper[4792]: I0309 09:27:04.822480 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/90f2a896-0e4a-4a08-b6be-16ba96a75fcf-credential-keys\") pod \"keystone-bootstrap-fcfwq\" (UID: \"90f2a896-0e4a-4a08-b6be-16ba96a75fcf\") " pod="openstack/keystone-bootstrap-fcfwq" Mar 09 09:27:04 crc kubenswrapper[4792]: I0309 09:27:04.822586 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90f2a896-0e4a-4a08-b6be-16ba96a75fcf-config-data\") pod \"keystone-bootstrap-fcfwq\" (UID: \"90f2a896-0e4a-4a08-b6be-16ba96a75fcf\") " pod="openstack/keystone-bootstrap-fcfwq" Mar 09 09:27:04 crc kubenswrapper[4792]: I0309 09:27:04.822856 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/90f2a896-0e4a-4a08-b6be-16ba96a75fcf-scripts\") pod \"keystone-bootstrap-fcfwq\" 
(UID: \"90f2a896-0e4a-4a08-b6be-16ba96a75fcf\") " pod="openstack/keystone-bootstrap-fcfwq" Mar 09 09:27:04 crc kubenswrapper[4792]: I0309 09:27:04.823939 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/90f2a896-0e4a-4a08-b6be-16ba96a75fcf-fernet-keys\") pod \"keystone-bootstrap-fcfwq\" (UID: \"90f2a896-0e4a-4a08-b6be-16ba96a75fcf\") " pod="openstack/keystone-bootstrap-fcfwq" Mar 09 09:27:04 crc kubenswrapper[4792]: I0309 09:27:04.824096 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wtsfc\" (UniqueName: \"kubernetes.io/projected/90f2a896-0e4a-4a08-b6be-16ba96a75fcf-kube-api-access-wtsfc\") pod \"keystone-bootstrap-fcfwq\" (UID: \"90f2a896-0e4a-4a08-b6be-16ba96a75fcf\") " pod="openstack/keystone-bootstrap-fcfwq" Mar 09 09:27:04 crc kubenswrapper[4792]: I0309 09:27:04.824146 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90f2a896-0e4a-4a08-b6be-16ba96a75fcf-combined-ca-bundle\") pod \"keystone-bootstrap-fcfwq\" (UID: \"90f2a896-0e4a-4a08-b6be-16ba96a75fcf\") " pod="openstack/keystone-bootstrap-fcfwq" Mar 09 09:27:04 crc kubenswrapper[4792]: I0309 09:27:04.828712 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/90f2a896-0e4a-4a08-b6be-16ba96a75fcf-scripts\") pod \"keystone-bootstrap-fcfwq\" (UID: \"90f2a896-0e4a-4a08-b6be-16ba96a75fcf\") " pod="openstack/keystone-bootstrap-fcfwq" Mar 09 09:27:04 crc kubenswrapper[4792]: I0309 09:27:04.828897 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/90f2a896-0e4a-4a08-b6be-16ba96a75fcf-fernet-keys\") pod \"keystone-bootstrap-fcfwq\" (UID: \"90f2a896-0e4a-4a08-b6be-16ba96a75fcf\") " pod="openstack/keystone-bootstrap-fcfwq" Mar 09 09:27:04 crc 
kubenswrapper[4792]: I0309 09:27:04.830016 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90f2a896-0e4a-4a08-b6be-16ba96a75fcf-combined-ca-bundle\") pod \"keystone-bootstrap-fcfwq\" (UID: \"90f2a896-0e4a-4a08-b6be-16ba96a75fcf\") " pod="openstack/keystone-bootstrap-fcfwq" Mar 09 09:27:04 crc kubenswrapper[4792]: I0309 09:27:04.842791 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/90f2a896-0e4a-4a08-b6be-16ba96a75fcf-credential-keys\") pod \"keystone-bootstrap-fcfwq\" (UID: \"90f2a896-0e4a-4a08-b6be-16ba96a75fcf\") " pod="openstack/keystone-bootstrap-fcfwq" Mar 09 09:27:04 crc kubenswrapper[4792]: I0309 09:27:04.843250 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90f2a896-0e4a-4a08-b6be-16ba96a75fcf-config-data\") pod \"keystone-bootstrap-fcfwq\" (UID: \"90f2a896-0e4a-4a08-b6be-16ba96a75fcf\") " pod="openstack/keystone-bootstrap-fcfwq" Mar 09 09:27:04 crc kubenswrapper[4792]: I0309 09:27:04.843899 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wtsfc\" (UniqueName: \"kubernetes.io/projected/90f2a896-0e4a-4a08-b6be-16ba96a75fcf-kube-api-access-wtsfc\") pod \"keystone-bootstrap-fcfwq\" (UID: \"90f2a896-0e4a-4a08-b6be-16ba96a75fcf\") " pod="openstack/keystone-bootstrap-fcfwq" Mar 09 09:27:05 crc kubenswrapper[4792]: I0309 09:27:05.012617 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-fcfwq" Mar 09 09:27:05 crc kubenswrapper[4792]: E0309 09:27:05.290841 4792 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api@sha256:2c52c0f4b4baa15796eb284522adff7fa9e5c85a2d77c2e47ef4afdf8e4a7c7d" Mar 09 09:27:05 crc kubenswrapper[4792]: E0309 09:27:05.290995 4792 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api@sha256:2c52c0f4b4baa15796eb284522adff7fa9e5c85a2d77c2e47ef4afdf8e4a7c7d,Command:[/bin/bash],Args:[-c barbican-manage db upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9rh64,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile
:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-xxwfp_openstack(d0e4182c-e32b-443d-85ea-0e5737c3fd1e): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 09 09:27:05 crc kubenswrapper[4792]: E0309 09:27:05.292123 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-xxwfp" podUID="d0e4182c-e32b-443d-85ea-0e5737c3fd1e" Mar 09 09:27:05 crc kubenswrapper[4792]: I0309 09:27:05.440910 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6958f867f9-hkbdp" Mar 09 09:27:05 crc kubenswrapper[4792]: I0309 09:27:05.451964 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/62d7f343-f742-4613-83b6-a4baf01d94c7-ovsdbserver-sb\") pod \"62d7f343-f742-4613-83b6-a4baf01d94c7\" (UID: \"62d7f343-f742-4613-83b6-a4baf01d94c7\") " Mar 09 09:27:05 crc kubenswrapper[4792]: I0309 09:27:05.452527 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/62d7f343-f742-4613-83b6-a4baf01d94c7-config\") pod \"62d7f343-f742-4613-83b6-a4baf01d94c7\" (UID: \"62d7f343-f742-4613-83b6-a4baf01d94c7\") " Mar 09 09:27:05 crc kubenswrapper[4792]: I0309 09:27:05.553430 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/62d7f343-f742-4613-83b6-a4baf01d94c7-ovsdbserver-nb\") pod \"62d7f343-f742-4613-83b6-a4baf01d94c7\" (UID: \"62d7f343-f742-4613-83b6-a4baf01d94c7\") " Mar 
09 09:27:05 crc kubenswrapper[4792]: I0309 09:27:05.553775 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/62d7f343-f742-4613-83b6-a4baf01d94c7-dns-svc\") pod \"62d7f343-f742-4613-83b6-a4baf01d94c7\" (UID: \"62d7f343-f742-4613-83b6-a4baf01d94c7\") " Mar 09 09:27:05 crc kubenswrapper[4792]: I0309 09:27:05.553832 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vh2tt\" (UniqueName: \"kubernetes.io/projected/62d7f343-f742-4613-83b6-a4baf01d94c7-kube-api-access-vh2tt\") pod \"62d7f343-f742-4613-83b6-a4baf01d94c7\" (UID: \"62d7f343-f742-4613-83b6-a4baf01d94c7\") " Mar 09 09:27:05 crc kubenswrapper[4792]: I0309 09:27:05.574482 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/62d7f343-f742-4613-83b6-a4baf01d94c7-kube-api-access-vh2tt" (OuterVolumeSpecName: "kube-api-access-vh2tt") pod "62d7f343-f742-4613-83b6-a4baf01d94c7" (UID: "62d7f343-f742-4613-83b6-a4baf01d94c7"). InnerVolumeSpecName "kube-api-access-vh2tt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:27:05 crc kubenswrapper[4792]: I0309 09:27:05.584673 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/62d7f343-f742-4613-83b6-a4baf01d94c7-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "62d7f343-f742-4613-83b6-a4baf01d94c7" (UID: "62d7f343-f742-4613-83b6-a4baf01d94c7"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:27:05 crc kubenswrapper[4792]: I0309 09:27:05.599211 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/62d7f343-f742-4613-83b6-a4baf01d94c7-config" (OuterVolumeSpecName: "config") pod "62d7f343-f742-4613-83b6-a4baf01d94c7" (UID: "62d7f343-f742-4613-83b6-a4baf01d94c7"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:27:05 crc kubenswrapper[4792]: I0309 09:27:05.631127 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9cee33f2-4999-4736-965e-3f5eae090e14","Type":"ContainerStarted","Data":"6ed260b81fa4eecf1111f1318337d8dac3f19ef0e3602bfa7d40018fcc911f6e"} Mar 09 09:27:05 crc kubenswrapper[4792]: I0309 09:27:05.638463 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6958f867f9-hkbdp" event={"ID":"62d7f343-f742-4613-83b6-a4baf01d94c7","Type":"ContainerDied","Data":"1dcdb92679fdbc1f97a33033454be0d821d3b07fe5b3eff2a75bda68e2fb055e"} Mar 09 09:27:05 crc kubenswrapper[4792]: I0309 09:27:05.638473 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6958f867f9-hkbdp" Mar 09 09:27:05 crc kubenswrapper[4792]: I0309 09:27:05.638541 4792 scope.go:117] "RemoveContainer" containerID="c38b09a235359396f3b9abbafbb9d101aa2bc7a0898468ee23e944a8679c30e0" Mar 09 09:27:05 crc kubenswrapper[4792]: I0309 09:27:05.641329 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-rwg59" event={"ID":"77fab42c-43ad-48e7-bbe0-d3698e3aea96","Type":"ContainerStarted","Data":"5808485cd53cce072aa9d3ac5b5d61bc5aa0dd2076fbbbf7beccf8213d4211c2"} Mar 09 09:27:05 crc kubenswrapper[4792]: E0309 09:27:05.643744 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:44ed1ca84e17bd0f004cfbdc3c0827d767daba52abb8e83e076bfd0e6c02f838\\\"\"" pod="openstack/cinder-db-sync-gkkjx" podUID="6f4bcabb-7f34-423b-a653-bd785eba0978" Mar 09 09:27:05 crc kubenswrapper[4792]: E0309 09:27:05.643991 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: 
\"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api@sha256:2c52c0f4b4baa15796eb284522adff7fa9e5c85a2d77c2e47ef4afdf8e4a7c7d\\\"\"" pod="openstack/barbican-db-sync-xxwfp" podUID="d0e4182c-e32b-443d-85ea-0e5737c3fd1e" Mar 09 09:27:05 crc kubenswrapper[4792]: I0309 09:27:05.648651 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/62d7f343-f742-4613-83b6-a4baf01d94c7-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "62d7f343-f742-4613-83b6-a4baf01d94c7" (UID: "62d7f343-f742-4613-83b6-a4baf01d94c7"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:27:05 crc kubenswrapper[4792]: I0309 09:27:05.650132 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/62d7f343-f742-4613-83b6-a4baf01d94c7-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "62d7f343-f742-4613-83b6-a4baf01d94c7" (UID: "62d7f343-f742-4613-83b6-a4baf01d94c7"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:27:05 crc kubenswrapper[4792]: I0309 09:27:05.656441 4792 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/62d7f343-f742-4613-83b6-a4baf01d94c7-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 09 09:27:05 crc kubenswrapper[4792]: I0309 09:27:05.656479 4792 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/62d7f343-f742-4613-83b6-a4baf01d94c7-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 09 09:27:05 crc kubenswrapper[4792]: I0309 09:27:05.656493 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vh2tt\" (UniqueName: \"kubernetes.io/projected/62d7f343-f742-4613-83b6-a4baf01d94c7-kube-api-access-vh2tt\") on node \"crc\" DevicePath \"\"" Mar 09 09:27:05 crc kubenswrapper[4792]: I0309 09:27:05.656504 4792 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/62d7f343-f742-4613-83b6-a4baf01d94c7-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 09 09:27:05 crc kubenswrapper[4792]: I0309 09:27:05.656515 4792 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/62d7f343-f742-4613-83b6-a4baf01d94c7-config\") on node \"crc\" DevicePath \"\"" Mar 09 09:27:05 crc kubenswrapper[4792]: I0309 09:27:05.692816 4792 scope.go:117] "RemoveContainer" containerID="a493238163ffaeefdd24f981531859f3b8d3597f9c8b7c54ab8347b3cd1ec8d8" Mar 09 09:27:05 crc kubenswrapper[4792]: I0309 09:27:05.700231 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="801f33bc-ee66-4276-baf0-d374120ed464" path="/var/lib/kubelet/pods/801f33bc-ee66-4276-baf0-d374120ed464/volumes" Mar 09 09:27:05 crc kubenswrapper[4792]: I0309 09:27:05.725950 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-rwg59" 
podStartSLOduration=2.218667279 podStartE2EDuration="24.725898339s" podCreationTimestamp="2026-03-09 09:26:41 +0000 UTC" firstStartedPulling="2026-03-09 09:26:42.793003161 +0000 UTC m=+1167.823203913" lastFinishedPulling="2026-03-09 09:27:05.300234221 +0000 UTC m=+1190.330434973" observedRunningTime="2026-03-09 09:27:05.694666532 +0000 UTC m=+1190.724867284" watchObservedRunningTime="2026-03-09 09:27:05.725898339 +0000 UTC m=+1190.756099091" Mar 09 09:27:05 crc kubenswrapper[4792]: I0309 09:27:05.785457 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-fcfwq"] Mar 09 09:27:05 crc kubenswrapper[4792]: W0309 09:27:05.787946 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod90f2a896_0e4a_4a08_b6be_16ba96a75fcf.slice/crio-5a1c35524bf751110abe78933ec6a94c8a372ff40231adfefde5db1c26c24491 WatchSource:0}: Error finding container 5a1c35524bf751110abe78933ec6a94c8a372ff40231adfefde5db1c26c24491: Status 404 returned error can't find the container with id 5a1c35524bf751110abe78933ec6a94c8a372ff40231adfefde5db1c26c24491 Mar 09 09:27:05 crc kubenswrapper[4792]: I0309 09:27:05.987417 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6958f867f9-hkbdp"] Mar 09 09:27:05 crc kubenswrapper[4792]: I0309 09:27:05.998420 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6958f867f9-hkbdp"] Mar 09 09:27:06 crc kubenswrapper[4792]: I0309 09:27:06.652615 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-fcfwq" event={"ID":"90f2a896-0e4a-4a08-b6be-16ba96a75fcf","Type":"ContainerStarted","Data":"6a30ed979dd2c1330f83e0f87a1fb178c826aabe2ece0d8af575e24bfbe65594"} Mar 09 09:27:06 crc kubenswrapper[4792]: I0309 09:27:06.653012 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-fcfwq" 
event={"ID":"90f2a896-0e4a-4a08-b6be-16ba96a75fcf","Type":"ContainerStarted","Data":"5a1c35524bf751110abe78933ec6a94c8a372ff40231adfefde5db1c26c24491"} Mar 09 09:27:06 crc kubenswrapper[4792]: I0309 09:27:06.675899 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-fcfwq" podStartSLOduration=2.675879009 podStartE2EDuration="2.675879009s" podCreationTimestamp="2026-03-09 09:27:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:27:06.671934845 +0000 UTC m=+1191.702135607" watchObservedRunningTime="2026-03-09 09:27:06.675879009 +0000 UTC m=+1191.706079771" Mar 09 09:27:07 crc kubenswrapper[4792]: I0309 09:27:07.672149 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="62d7f343-f742-4613-83b6-a4baf01d94c7" path="/var/lib/kubelet/pods/62d7f343-f742-4613-83b6-a4baf01d94c7/volumes" Mar 09 09:27:07 crc kubenswrapper[4792]: I0309 09:27:07.673165 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9cee33f2-4999-4736-965e-3f5eae090e14","Type":"ContainerStarted","Data":"45dc51a7fcbaafcdef79c0fb213dfea359f9ba0d8f0413168282ad54c40a8d28"} Mar 09 09:27:11 crc kubenswrapper[4792]: I0309 09:27:11.706627 4792 generic.go:334] "Generic (PLEG): container finished" podID="77fab42c-43ad-48e7-bbe0-d3698e3aea96" containerID="5808485cd53cce072aa9d3ac5b5d61bc5aa0dd2076fbbbf7beccf8213d4211c2" exitCode=0 Mar 09 09:27:11 crc kubenswrapper[4792]: I0309 09:27:11.706716 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-rwg59" event={"ID":"77fab42c-43ad-48e7-bbe0-d3698e3aea96","Type":"ContainerDied","Data":"5808485cd53cce072aa9d3ac5b5d61bc5aa0dd2076fbbbf7beccf8213d4211c2"} Mar 09 09:27:12 crc kubenswrapper[4792]: I0309 09:27:12.717391 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"9cee33f2-4999-4736-965e-3f5eae090e14","Type":"ContainerStarted","Data":"14f2ceffbc3dff050abc449151ec442615efd4ae5b2dfe03dd08b701d6fd619d"} Mar 09 09:27:12 crc kubenswrapper[4792]: I0309 09:27:12.719540 4792 generic.go:334] "Generic (PLEG): container finished" podID="90f2a896-0e4a-4a08-b6be-16ba96a75fcf" containerID="6a30ed979dd2c1330f83e0f87a1fb178c826aabe2ece0d8af575e24bfbe65594" exitCode=0 Mar 09 09:27:12 crc kubenswrapper[4792]: I0309 09:27:12.719617 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-fcfwq" event={"ID":"90f2a896-0e4a-4a08-b6be-16ba96a75fcf","Type":"ContainerDied","Data":"6a30ed979dd2c1330f83e0f87a1fb178c826aabe2ece0d8af575e24bfbe65594"} Mar 09 09:27:12 crc kubenswrapper[4792]: I0309 09:27:12.721133 4792 generic.go:334] "Generic (PLEG): container finished" podID="bf438875-5be3-489e-8626-da673a088bef" containerID="61a61a611bc9aa8ff5b2a1f63fad4a53679b2714420f1ba9bcab5a31648848a4" exitCode=0 Mar 09 09:27:12 crc kubenswrapper[4792]: I0309 09:27:12.721282 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-k9lg4" event={"ID":"bf438875-5be3-489e-8626-da673a088bef","Type":"ContainerDied","Data":"61a61a611bc9aa8ff5b2a1f63fad4a53679b2714420f1ba9bcab5a31648848a4"} Mar 09 09:27:13 crc kubenswrapper[4792]: I0309 09:27:13.087719 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-rwg59" Mar 09 09:27:13 crc kubenswrapper[4792]: I0309 09:27:13.214378 4792 patch_prober.go:28] interesting pod/machine-config-daemon-97tth container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 09:27:13 crc kubenswrapper[4792]: I0309 09:27:13.214428 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 09:27:13 crc kubenswrapper[4792]: I0309 09:27:13.279684 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77fab42c-43ad-48e7-bbe0-d3698e3aea96-combined-ca-bundle\") pod \"77fab42c-43ad-48e7-bbe0-d3698e3aea96\" (UID: \"77fab42c-43ad-48e7-bbe0-d3698e3aea96\") " Mar 09 09:27:13 crc kubenswrapper[4792]: I0309 09:27:13.279743 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/77fab42c-43ad-48e7-bbe0-d3698e3aea96-config-data\") pod \"77fab42c-43ad-48e7-bbe0-d3698e3aea96\" (UID: \"77fab42c-43ad-48e7-bbe0-d3698e3aea96\") " Mar 09 09:27:13 crc kubenswrapper[4792]: I0309 09:27:13.279774 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/77fab42c-43ad-48e7-bbe0-d3698e3aea96-scripts\") pod \"77fab42c-43ad-48e7-bbe0-d3698e3aea96\" (UID: \"77fab42c-43ad-48e7-bbe0-d3698e3aea96\") " Mar 09 09:27:13 crc kubenswrapper[4792]: I0309 09:27:13.279801 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-f4bcq\" (UniqueName: \"kubernetes.io/projected/77fab42c-43ad-48e7-bbe0-d3698e3aea96-kube-api-access-f4bcq\") pod \"77fab42c-43ad-48e7-bbe0-d3698e3aea96\" (UID: \"77fab42c-43ad-48e7-bbe0-d3698e3aea96\") " Mar 09 09:27:13 crc kubenswrapper[4792]: I0309 09:27:13.279936 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/77fab42c-43ad-48e7-bbe0-d3698e3aea96-logs\") pod \"77fab42c-43ad-48e7-bbe0-d3698e3aea96\" (UID: \"77fab42c-43ad-48e7-bbe0-d3698e3aea96\") " Mar 09 09:27:13 crc kubenswrapper[4792]: I0309 09:27:13.280540 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/77fab42c-43ad-48e7-bbe0-d3698e3aea96-logs" (OuterVolumeSpecName: "logs") pod "77fab42c-43ad-48e7-bbe0-d3698e3aea96" (UID: "77fab42c-43ad-48e7-bbe0-d3698e3aea96"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:27:13 crc kubenswrapper[4792]: I0309 09:27:13.281861 4792 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/77fab42c-43ad-48e7-bbe0-d3698e3aea96-logs\") on node \"crc\" DevicePath \"\"" Mar 09 09:27:13 crc kubenswrapper[4792]: I0309 09:27:13.284788 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77fab42c-43ad-48e7-bbe0-d3698e3aea96-scripts" (OuterVolumeSpecName: "scripts") pod "77fab42c-43ad-48e7-bbe0-d3698e3aea96" (UID: "77fab42c-43ad-48e7-bbe0-d3698e3aea96"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:27:13 crc kubenswrapper[4792]: I0309 09:27:13.287685 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/77fab42c-43ad-48e7-bbe0-d3698e3aea96-kube-api-access-f4bcq" (OuterVolumeSpecName: "kube-api-access-f4bcq") pod "77fab42c-43ad-48e7-bbe0-d3698e3aea96" (UID: "77fab42c-43ad-48e7-bbe0-d3698e3aea96"). 
InnerVolumeSpecName "kube-api-access-f4bcq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:27:13 crc kubenswrapper[4792]: I0309 09:27:13.303367 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77fab42c-43ad-48e7-bbe0-d3698e3aea96-config-data" (OuterVolumeSpecName: "config-data") pod "77fab42c-43ad-48e7-bbe0-d3698e3aea96" (UID: "77fab42c-43ad-48e7-bbe0-d3698e3aea96"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:27:13 crc kubenswrapper[4792]: I0309 09:27:13.323613 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77fab42c-43ad-48e7-bbe0-d3698e3aea96-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "77fab42c-43ad-48e7-bbe0-d3698e3aea96" (UID: "77fab42c-43ad-48e7-bbe0-d3698e3aea96"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:27:13 crc kubenswrapper[4792]: I0309 09:27:13.383042 4792 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77fab42c-43ad-48e7-bbe0-d3698e3aea96-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 09:27:13 crc kubenswrapper[4792]: I0309 09:27:13.383091 4792 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/77fab42c-43ad-48e7-bbe0-d3698e3aea96-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 09:27:13 crc kubenswrapper[4792]: I0309 09:27:13.383100 4792 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/77fab42c-43ad-48e7-bbe0-d3698e3aea96-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 09:27:13 crc kubenswrapper[4792]: I0309 09:27:13.383108 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f4bcq\" (UniqueName: 
\"kubernetes.io/projected/77fab42c-43ad-48e7-bbe0-d3698e3aea96-kube-api-access-f4bcq\") on node \"crc\" DevicePath \"\"" Mar 09 09:27:13 crc kubenswrapper[4792]: I0309 09:27:13.730322 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-rwg59" event={"ID":"77fab42c-43ad-48e7-bbe0-d3698e3aea96","Type":"ContainerDied","Data":"a1e5ccc03d6d9c630475dab94327b1be2f378f42924d41eeca616379af7e34ee"} Mar 09 09:27:13 crc kubenswrapper[4792]: I0309 09:27:13.730676 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a1e5ccc03d6d9c630475dab94327b1be2f378f42924d41eeca616379af7e34ee" Mar 09 09:27:13 crc kubenswrapper[4792]: I0309 09:27:13.730444 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-rwg59" Mar 09 09:27:13 crc kubenswrapper[4792]: I0309 09:27:13.838969 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-7b89dcf9c6-x4cgw"] Mar 09 09:27:13 crc kubenswrapper[4792]: E0309 09:27:13.839370 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77fab42c-43ad-48e7-bbe0-d3698e3aea96" containerName="placement-db-sync" Mar 09 09:27:13 crc kubenswrapper[4792]: I0309 09:27:13.839384 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="77fab42c-43ad-48e7-bbe0-d3698e3aea96" containerName="placement-db-sync" Mar 09 09:27:13 crc kubenswrapper[4792]: E0309 09:27:13.839395 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62d7f343-f742-4613-83b6-a4baf01d94c7" containerName="init" Mar 09 09:27:13 crc kubenswrapper[4792]: I0309 09:27:13.839401 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="62d7f343-f742-4613-83b6-a4baf01d94c7" containerName="init" Mar 09 09:27:13 crc kubenswrapper[4792]: E0309 09:27:13.839411 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62d7f343-f742-4613-83b6-a4baf01d94c7" containerName="dnsmasq-dns" Mar 09 09:27:13 crc 
kubenswrapper[4792]: I0309 09:27:13.839417 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="62d7f343-f742-4613-83b6-a4baf01d94c7" containerName="dnsmasq-dns" Mar 09 09:27:13 crc kubenswrapper[4792]: I0309 09:27:13.839559 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="77fab42c-43ad-48e7-bbe0-d3698e3aea96" containerName="placement-db-sync" Mar 09 09:27:13 crc kubenswrapper[4792]: I0309 09:27:13.839590 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="62d7f343-f742-4613-83b6-a4baf01d94c7" containerName="dnsmasq-dns" Mar 09 09:27:13 crc kubenswrapper[4792]: I0309 09:27:13.840425 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-7b89dcf9c6-x4cgw" Mar 09 09:27:13 crc kubenswrapper[4792]: I0309 09:27:13.846862 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Mar 09 09:27:13 crc kubenswrapper[4792]: I0309 09:27:13.847030 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Mar 09 09:27:13 crc kubenswrapper[4792]: I0309 09:27:13.847205 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Mar 09 09:27:13 crc kubenswrapper[4792]: I0309 09:27:13.847983 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Mar 09 09:27:13 crc kubenswrapper[4792]: I0309 09:27:13.849079 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-zfs4f" Mar 09 09:27:13 crc kubenswrapper[4792]: I0309 09:27:13.882234 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-7b89dcf9c6-x4cgw"] Mar 09 09:27:13 crc kubenswrapper[4792]: I0309 09:27:13.995273 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/ecc21e88-9435-44e8-9fae-4838ae5e46ce-scripts\") pod \"placement-7b89dcf9c6-x4cgw\" (UID: \"ecc21e88-9435-44e8-9fae-4838ae5e46ce\") " pod="openstack/placement-7b89dcf9c6-x4cgw" Mar 09 09:27:13 crc kubenswrapper[4792]: I0309 09:27:13.995346 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ecc21e88-9435-44e8-9fae-4838ae5e46ce-public-tls-certs\") pod \"placement-7b89dcf9c6-x4cgw\" (UID: \"ecc21e88-9435-44e8-9fae-4838ae5e46ce\") " pod="openstack/placement-7b89dcf9c6-x4cgw" Mar 09 09:27:13 crc kubenswrapper[4792]: I0309 09:27:13.995368 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ecc21e88-9435-44e8-9fae-4838ae5e46ce-config-data\") pod \"placement-7b89dcf9c6-x4cgw\" (UID: \"ecc21e88-9435-44e8-9fae-4838ae5e46ce\") " pod="openstack/placement-7b89dcf9c6-x4cgw" Mar 09 09:27:13 crc kubenswrapper[4792]: I0309 09:27:13.995427 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ecc21e88-9435-44e8-9fae-4838ae5e46ce-combined-ca-bundle\") pod \"placement-7b89dcf9c6-x4cgw\" (UID: \"ecc21e88-9435-44e8-9fae-4838ae5e46ce\") " pod="openstack/placement-7b89dcf9c6-x4cgw" Mar 09 09:27:13 crc kubenswrapper[4792]: I0309 09:27:13.995482 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ecc21e88-9435-44e8-9fae-4838ae5e46ce-internal-tls-certs\") pod \"placement-7b89dcf9c6-x4cgw\" (UID: \"ecc21e88-9435-44e8-9fae-4838ae5e46ce\") " pod="openstack/placement-7b89dcf9c6-x4cgw" Mar 09 09:27:13 crc kubenswrapper[4792]: I0309 09:27:13.995529 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-6rvjj\" (UniqueName: \"kubernetes.io/projected/ecc21e88-9435-44e8-9fae-4838ae5e46ce-kube-api-access-6rvjj\") pod \"placement-7b89dcf9c6-x4cgw\" (UID: \"ecc21e88-9435-44e8-9fae-4838ae5e46ce\") " pod="openstack/placement-7b89dcf9c6-x4cgw" Mar 09 09:27:13 crc kubenswrapper[4792]: I0309 09:27:13.995623 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ecc21e88-9435-44e8-9fae-4838ae5e46ce-logs\") pod \"placement-7b89dcf9c6-x4cgw\" (UID: \"ecc21e88-9435-44e8-9fae-4838ae5e46ce\") " pod="openstack/placement-7b89dcf9c6-x4cgw" Mar 09 09:27:14 crc kubenswrapper[4792]: I0309 09:27:14.097669 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ecc21e88-9435-44e8-9fae-4838ae5e46ce-logs\") pod \"placement-7b89dcf9c6-x4cgw\" (UID: \"ecc21e88-9435-44e8-9fae-4838ae5e46ce\") " pod="openstack/placement-7b89dcf9c6-x4cgw" Mar 09 09:27:14 crc kubenswrapper[4792]: I0309 09:27:14.097780 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ecc21e88-9435-44e8-9fae-4838ae5e46ce-scripts\") pod \"placement-7b89dcf9c6-x4cgw\" (UID: \"ecc21e88-9435-44e8-9fae-4838ae5e46ce\") " pod="openstack/placement-7b89dcf9c6-x4cgw" Mar 09 09:27:14 crc kubenswrapper[4792]: I0309 09:27:14.097850 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ecc21e88-9435-44e8-9fae-4838ae5e46ce-public-tls-certs\") pod \"placement-7b89dcf9c6-x4cgw\" (UID: \"ecc21e88-9435-44e8-9fae-4838ae5e46ce\") " pod="openstack/placement-7b89dcf9c6-x4cgw" Mar 09 09:27:14 crc kubenswrapper[4792]: I0309 09:27:14.097885 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/ecc21e88-9435-44e8-9fae-4838ae5e46ce-config-data\") pod \"placement-7b89dcf9c6-x4cgw\" (UID: \"ecc21e88-9435-44e8-9fae-4838ae5e46ce\") " pod="openstack/placement-7b89dcf9c6-x4cgw" Mar 09 09:27:14 crc kubenswrapper[4792]: I0309 09:27:14.097931 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ecc21e88-9435-44e8-9fae-4838ae5e46ce-combined-ca-bundle\") pod \"placement-7b89dcf9c6-x4cgw\" (UID: \"ecc21e88-9435-44e8-9fae-4838ae5e46ce\") " pod="openstack/placement-7b89dcf9c6-x4cgw" Mar 09 09:27:14 crc kubenswrapper[4792]: I0309 09:27:14.097973 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ecc21e88-9435-44e8-9fae-4838ae5e46ce-internal-tls-certs\") pod \"placement-7b89dcf9c6-x4cgw\" (UID: \"ecc21e88-9435-44e8-9fae-4838ae5e46ce\") " pod="openstack/placement-7b89dcf9c6-x4cgw" Mar 09 09:27:14 crc kubenswrapper[4792]: I0309 09:27:14.098437 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ecc21e88-9435-44e8-9fae-4838ae5e46ce-logs\") pod \"placement-7b89dcf9c6-x4cgw\" (UID: \"ecc21e88-9435-44e8-9fae-4838ae5e46ce\") " pod="openstack/placement-7b89dcf9c6-x4cgw" Mar 09 09:27:14 crc kubenswrapper[4792]: I0309 09:27:14.099262 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6rvjj\" (UniqueName: \"kubernetes.io/projected/ecc21e88-9435-44e8-9fae-4838ae5e46ce-kube-api-access-6rvjj\") pod \"placement-7b89dcf9c6-x4cgw\" (UID: \"ecc21e88-9435-44e8-9fae-4838ae5e46ce\") " pod="openstack/placement-7b89dcf9c6-x4cgw" Mar 09 09:27:14 crc kubenswrapper[4792]: I0309 09:27:14.112823 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ecc21e88-9435-44e8-9fae-4838ae5e46ce-scripts\") pod 
\"placement-7b89dcf9c6-x4cgw\" (UID: \"ecc21e88-9435-44e8-9fae-4838ae5e46ce\") " pod="openstack/placement-7b89dcf9c6-x4cgw" Mar 09 09:27:14 crc kubenswrapper[4792]: I0309 09:27:14.113681 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ecc21e88-9435-44e8-9fae-4838ae5e46ce-combined-ca-bundle\") pod \"placement-7b89dcf9c6-x4cgw\" (UID: \"ecc21e88-9435-44e8-9fae-4838ae5e46ce\") " pod="openstack/placement-7b89dcf9c6-x4cgw" Mar 09 09:27:14 crc kubenswrapper[4792]: I0309 09:27:14.113738 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ecc21e88-9435-44e8-9fae-4838ae5e46ce-public-tls-certs\") pod \"placement-7b89dcf9c6-x4cgw\" (UID: \"ecc21e88-9435-44e8-9fae-4838ae5e46ce\") " pod="openstack/placement-7b89dcf9c6-x4cgw" Mar 09 09:27:14 crc kubenswrapper[4792]: I0309 09:27:14.113921 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ecc21e88-9435-44e8-9fae-4838ae5e46ce-config-data\") pod \"placement-7b89dcf9c6-x4cgw\" (UID: \"ecc21e88-9435-44e8-9fae-4838ae5e46ce\") " pod="openstack/placement-7b89dcf9c6-x4cgw" Mar 09 09:27:14 crc kubenswrapper[4792]: I0309 09:27:14.114340 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ecc21e88-9435-44e8-9fae-4838ae5e46ce-internal-tls-certs\") pod \"placement-7b89dcf9c6-x4cgw\" (UID: \"ecc21e88-9435-44e8-9fae-4838ae5e46ce\") " pod="openstack/placement-7b89dcf9c6-x4cgw" Mar 09 09:27:14 crc kubenswrapper[4792]: I0309 09:27:14.123771 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6rvjj\" (UniqueName: \"kubernetes.io/projected/ecc21e88-9435-44e8-9fae-4838ae5e46ce-kube-api-access-6rvjj\") pod \"placement-7b89dcf9c6-x4cgw\" (UID: \"ecc21e88-9435-44e8-9fae-4838ae5e46ce\") " 
pod="openstack/placement-7b89dcf9c6-x4cgw" Mar 09 09:27:14 crc kubenswrapper[4792]: I0309 09:27:14.133276 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-fcfwq" Mar 09 09:27:14 crc kubenswrapper[4792]: I0309 09:27:14.168364 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-k9lg4" Mar 09 09:27:14 crc kubenswrapper[4792]: I0309 09:27:14.174690 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-7b89dcf9c6-x4cgw" Mar 09 09:27:14 crc kubenswrapper[4792]: I0309 09:27:14.302632 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf438875-5be3-489e-8626-da673a088bef-combined-ca-bundle\") pod \"bf438875-5be3-489e-8626-da673a088bef\" (UID: \"bf438875-5be3-489e-8626-da673a088bef\") " Mar 09 09:27:14 crc kubenswrapper[4792]: I0309 09:27:14.302841 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/90f2a896-0e4a-4a08-b6be-16ba96a75fcf-credential-keys\") pod \"90f2a896-0e4a-4a08-b6be-16ba96a75fcf\" (UID: \"90f2a896-0e4a-4a08-b6be-16ba96a75fcf\") " Mar 09 09:27:14 crc kubenswrapper[4792]: I0309 09:27:14.302941 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/bf438875-5be3-489e-8626-da673a088bef-config\") pod \"bf438875-5be3-489e-8626-da673a088bef\" (UID: \"bf438875-5be3-489e-8626-da673a088bef\") " Mar 09 09:27:14 crc kubenswrapper[4792]: I0309 09:27:14.302975 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n7x8v\" (UniqueName: \"kubernetes.io/projected/bf438875-5be3-489e-8626-da673a088bef-kube-api-access-n7x8v\") pod \"bf438875-5be3-489e-8626-da673a088bef\" (UID: 
\"bf438875-5be3-489e-8626-da673a088bef\") " Mar 09 09:27:14 crc kubenswrapper[4792]: I0309 09:27:14.303038 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90f2a896-0e4a-4a08-b6be-16ba96a75fcf-combined-ca-bundle\") pod \"90f2a896-0e4a-4a08-b6be-16ba96a75fcf\" (UID: \"90f2a896-0e4a-4a08-b6be-16ba96a75fcf\") " Mar 09 09:27:14 crc kubenswrapper[4792]: I0309 09:27:14.303120 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/90f2a896-0e4a-4a08-b6be-16ba96a75fcf-fernet-keys\") pod \"90f2a896-0e4a-4a08-b6be-16ba96a75fcf\" (UID: \"90f2a896-0e4a-4a08-b6be-16ba96a75fcf\") " Mar 09 09:27:14 crc kubenswrapper[4792]: I0309 09:27:14.303154 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/90f2a896-0e4a-4a08-b6be-16ba96a75fcf-scripts\") pod \"90f2a896-0e4a-4a08-b6be-16ba96a75fcf\" (UID: \"90f2a896-0e4a-4a08-b6be-16ba96a75fcf\") " Mar 09 09:27:14 crc kubenswrapper[4792]: I0309 09:27:14.303224 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90f2a896-0e4a-4a08-b6be-16ba96a75fcf-config-data\") pod \"90f2a896-0e4a-4a08-b6be-16ba96a75fcf\" (UID: \"90f2a896-0e4a-4a08-b6be-16ba96a75fcf\") " Mar 09 09:27:14 crc kubenswrapper[4792]: I0309 09:27:14.303251 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wtsfc\" (UniqueName: \"kubernetes.io/projected/90f2a896-0e4a-4a08-b6be-16ba96a75fcf-kube-api-access-wtsfc\") pod \"90f2a896-0e4a-4a08-b6be-16ba96a75fcf\" (UID: \"90f2a896-0e4a-4a08-b6be-16ba96a75fcf\") " Mar 09 09:27:14 crc kubenswrapper[4792]: I0309 09:27:14.313459 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/90f2a896-0e4a-4a08-b6be-16ba96a75fcf-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "90f2a896-0e4a-4a08-b6be-16ba96a75fcf" (UID: "90f2a896-0e4a-4a08-b6be-16ba96a75fcf"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:27:14 crc kubenswrapper[4792]: I0309 09:27:14.316446 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90f2a896-0e4a-4a08-b6be-16ba96a75fcf-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "90f2a896-0e4a-4a08-b6be-16ba96a75fcf" (UID: "90f2a896-0e4a-4a08-b6be-16ba96a75fcf"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:27:14 crc kubenswrapper[4792]: I0309 09:27:14.316512 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf438875-5be3-489e-8626-da673a088bef-kube-api-access-n7x8v" (OuterVolumeSpecName: "kube-api-access-n7x8v") pod "bf438875-5be3-489e-8626-da673a088bef" (UID: "bf438875-5be3-489e-8626-da673a088bef"). InnerVolumeSpecName "kube-api-access-n7x8v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:27:14 crc kubenswrapper[4792]: I0309 09:27:14.316591 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90f2a896-0e4a-4a08-b6be-16ba96a75fcf-kube-api-access-wtsfc" (OuterVolumeSpecName: "kube-api-access-wtsfc") pod "90f2a896-0e4a-4a08-b6be-16ba96a75fcf" (UID: "90f2a896-0e4a-4a08-b6be-16ba96a75fcf"). InnerVolumeSpecName "kube-api-access-wtsfc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:27:14 crc kubenswrapper[4792]: I0309 09:27:14.323994 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90f2a896-0e4a-4a08-b6be-16ba96a75fcf-scripts" (OuterVolumeSpecName: "scripts") pod "90f2a896-0e4a-4a08-b6be-16ba96a75fcf" (UID: "90f2a896-0e4a-4a08-b6be-16ba96a75fcf"). 
InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:27:14 crc kubenswrapper[4792]: I0309 09:27:14.374197 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90f2a896-0e4a-4a08-b6be-16ba96a75fcf-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "90f2a896-0e4a-4a08-b6be-16ba96a75fcf" (UID: "90f2a896-0e4a-4a08-b6be-16ba96a75fcf"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:27:14 crc kubenswrapper[4792]: I0309 09:27:14.368751 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf438875-5be3-489e-8626-da673a088bef-config" (OuterVolumeSpecName: "config") pod "bf438875-5be3-489e-8626-da673a088bef" (UID: "bf438875-5be3-489e-8626-da673a088bef"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:27:14 crc kubenswrapper[4792]: I0309 09:27:14.377125 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf438875-5be3-489e-8626-da673a088bef-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bf438875-5be3-489e-8626-da673a088bef" (UID: "bf438875-5be3-489e-8626-da673a088bef"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:27:14 crc kubenswrapper[4792]: I0309 09:27:14.378518 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90f2a896-0e4a-4a08-b6be-16ba96a75fcf-config-data" (OuterVolumeSpecName: "config-data") pod "90f2a896-0e4a-4a08-b6be-16ba96a75fcf" (UID: "90f2a896-0e4a-4a08-b6be-16ba96a75fcf"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:27:14 crc kubenswrapper[4792]: I0309 09:27:14.411947 4792 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/bf438875-5be3-489e-8626-da673a088bef-config\") on node \"crc\" DevicePath \"\"" Mar 09 09:27:14 crc kubenswrapper[4792]: I0309 09:27:14.411992 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n7x8v\" (UniqueName: \"kubernetes.io/projected/bf438875-5be3-489e-8626-da673a088bef-kube-api-access-n7x8v\") on node \"crc\" DevicePath \"\"" Mar 09 09:27:14 crc kubenswrapper[4792]: I0309 09:27:14.412006 4792 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90f2a896-0e4a-4a08-b6be-16ba96a75fcf-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 09:27:14 crc kubenswrapper[4792]: I0309 09:27:14.412019 4792 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/90f2a896-0e4a-4a08-b6be-16ba96a75fcf-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 09 09:27:14 crc kubenswrapper[4792]: I0309 09:27:14.412030 4792 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/90f2a896-0e4a-4a08-b6be-16ba96a75fcf-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 09:27:14 crc kubenswrapper[4792]: I0309 09:27:14.412042 4792 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90f2a896-0e4a-4a08-b6be-16ba96a75fcf-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 09:27:14 crc kubenswrapper[4792]: I0309 09:27:14.412054 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wtsfc\" (UniqueName: \"kubernetes.io/projected/90f2a896-0e4a-4a08-b6be-16ba96a75fcf-kube-api-access-wtsfc\") on node \"crc\" DevicePath \"\"" Mar 09 09:27:14 crc kubenswrapper[4792]: I0309 09:27:14.412082 4792 
reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf438875-5be3-489e-8626-da673a088bef-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 09:27:14 crc kubenswrapper[4792]: I0309 09:27:14.412094 4792 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/90f2a896-0e4a-4a08-b6be-16ba96a75fcf-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 09 09:27:14 crc kubenswrapper[4792]: I0309 09:27:14.687675 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-7b89dcf9c6-x4cgw"] Mar 09 09:27:14 crc kubenswrapper[4792]: W0309 09:27:14.702675 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podecc21e88_9435_44e8_9fae_4838ae5e46ce.slice/crio-de44807cdd13b5a9e4b86d076d4d2049fa9ae0044927489327ef71a9f9176c60 WatchSource:0}: Error finding container de44807cdd13b5a9e4b86d076d4d2049fa9ae0044927489327ef71a9f9176c60: Status 404 returned error can't find the container with id de44807cdd13b5a9e4b86d076d4d2049fa9ae0044927489327ef71a9f9176c60 Mar 09 09:27:14 crc kubenswrapper[4792]: I0309 09:27:14.746430 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7b89dcf9c6-x4cgw" event={"ID":"ecc21e88-9435-44e8-9fae-4838ae5e46ce","Type":"ContainerStarted","Data":"de44807cdd13b5a9e4b86d076d4d2049fa9ae0044927489327ef71a9f9176c60"} Mar 09 09:27:14 crc kubenswrapper[4792]: I0309 09:27:14.752825 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-k9lg4"
Mar 09 09:27:14 crc kubenswrapper[4792]: I0309 09:27:14.754402 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-k9lg4" event={"ID":"bf438875-5be3-489e-8626-da673a088bef","Type":"ContainerDied","Data":"df691a7c6767be2f0dd8dd63a690938bd797e87c1000d051688568f23aa09129"}
Mar 09 09:27:14 crc kubenswrapper[4792]: I0309 09:27:14.754471 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="df691a7c6767be2f0dd8dd63a690938bd797e87c1000d051688568f23aa09129"
Mar 09 09:27:14 crc kubenswrapper[4792]: I0309 09:27:14.755809 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-fcfwq" event={"ID":"90f2a896-0e4a-4a08-b6be-16ba96a75fcf","Type":"ContainerDied","Data":"5a1c35524bf751110abe78933ec6a94c8a372ff40231adfefde5db1c26c24491"}
Mar 09 09:27:14 crc kubenswrapper[4792]: I0309 09:27:14.755848 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5a1c35524bf751110abe78933ec6a94c8a372ff40231adfefde5db1c26c24491"
Mar 09 09:27:14 crc kubenswrapper[4792]: I0309 09:27:14.755904 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-fcfwq"
Mar 09 09:27:14 crc kubenswrapper[4792]: I0309 09:27:14.943558 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-5b488b889c-ks9th"]
Mar 09 09:27:14 crc kubenswrapper[4792]: E0309 09:27:14.943879 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90f2a896-0e4a-4a08-b6be-16ba96a75fcf" containerName="keystone-bootstrap"
Mar 09 09:27:14 crc kubenswrapper[4792]: I0309 09:27:14.943892 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="90f2a896-0e4a-4a08-b6be-16ba96a75fcf" containerName="keystone-bootstrap"
Mar 09 09:27:14 crc kubenswrapper[4792]: E0309 09:27:14.943916 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf438875-5be3-489e-8626-da673a088bef" containerName="neutron-db-sync"
Mar 09 09:27:14 crc kubenswrapper[4792]: I0309 09:27:14.943923 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf438875-5be3-489e-8626-da673a088bef" containerName="neutron-db-sync"
Mar 09 09:27:14 crc kubenswrapper[4792]: I0309 09:27:14.944097 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf438875-5be3-489e-8626-da673a088bef" containerName="neutron-db-sync"
Mar 09 09:27:14 crc kubenswrapper[4792]: I0309 09:27:14.944113 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="90f2a896-0e4a-4a08-b6be-16ba96a75fcf" containerName="keystone-bootstrap"
Mar 09 09:27:14 crc kubenswrapper[4792]: I0309 09:27:14.944622 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-5b488b889c-ks9th"
Mar 09 09:27:14 crc kubenswrapper[4792]: I0309 09:27:14.948899 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc"
Mar 09 09:27:14 crc kubenswrapper[4792]: I0309 09:27:14.949143 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-cgfmm"
Mar 09 09:27:14 crc kubenswrapper[4792]: I0309 09:27:14.949291 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Mar 09 09:27:14 crc kubenswrapper[4792]: I0309 09:27:14.949401 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Mar 09 09:27:14 crc kubenswrapper[4792]: I0309 09:27:14.949680 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc"
Mar 09 09:27:14 crc kubenswrapper[4792]: I0309 09:27:14.949814 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Mar 09 09:27:14 crc kubenswrapper[4792]: I0309 09:27:14.969948 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-5b488b889c-ks9th"]
Mar 09 09:27:15 crc kubenswrapper[4792]: I0309 09:27:15.045854 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-76fc8568c7-xztqr"]
Mar 09 09:27:15 crc kubenswrapper[4792]: I0309 09:27:15.048831 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-76fc8568c7-xztqr"
Mar 09 09:27:15 crc kubenswrapper[4792]: I0309 09:27:15.086685 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-76fc8568c7-xztqr"]
Mar 09 09:27:15 crc kubenswrapper[4792]: I0309 09:27:15.123953 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/063f2c66-7712-4aff-a002-fccc2821c91a-credential-keys\") pod \"keystone-5b488b889c-ks9th\" (UID: \"063f2c66-7712-4aff-a002-fccc2821c91a\") " pod="openstack/keystone-5b488b889c-ks9th"
Mar 09 09:27:15 crc kubenswrapper[4792]: I0309 09:27:15.124018 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/063f2c66-7712-4aff-a002-fccc2821c91a-scripts\") pod \"keystone-5b488b889c-ks9th\" (UID: \"063f2c66-7712-4aff-a002-fccc2821c91a\") " pod="openstack/keystone-5b488b889c-ks9th"
Mar 09 09:27:15 crc kubenswrapper[4792]: I0309 09:27:15.124132 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qp6rt\" (UniqueName: \"kubernetes.io/projected/063f2c66-7712-4aff-a002-fccc2821c91a-kube-api-access-qp6rt\") pod \"keystone-5b488b889c-ks9th\" (UID: \"063f2c66-7712-4aff-a002-fccc2821c91a\") " pod="openstack/keystone-5b488b889c-ks9th"
Mar 09 09:27:15 crc kubenswrapper[4792]: I0309 09:27:15.124185 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/063f2c66-7712-4aff-a002-fccc2821c91a-internal-tls-certs\") pod \"keystone-5b488b889c-ks9th\" (UID: \"063f2c66-7712-4aff-a002-fccc2821c91a\") " pod="openstack/keystone-5b488b889c-ks9th"
Mar 09 09:27:15 crc kubenswrapper[4792]: I0309 09:27:15.124209 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/063f2c66-7712-4aff-a002-fccc2821c91a-fernet-keys\") pod \"keystone-5b488b889c-ks9th\" (UID: \"063f2c66-7712-4aff-a002-fccc2821c91a\") " pod="openstack/keystone-5b488b889c-ks9th"
Mar 09 09:27:15 crc kubenswrapper[4792]: I0309 09:27:15.124242 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/063f2c66-7712-4aff-a002-fccc2821c91a-public-tls-certs\") pod \"keystone-5b488b889c-ks9th\" (UID: \"063f2c66-7712-4aff-a002-fccc2821c91a\") " pod="openstack/keystone-5b488b889c-ks9th"
Mar 09 09:27:15 crc kubenswrapper[4792]: I0309 09:27:15.124264 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/063f2c66-7712-4aff-a002-fccc2821c91a-combined-ca-bundle\") pod \"keystone-5b488b889c-ks9th\" (UID: \"063f2c66-7712-4aff-a002-fccc2821c91a\") " pod="openstack/keystone-5b488b889c-ks9th"
Mar 09 09:27:15 crc kubenswrapper[4792]: I0309 09:27:15.124289 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/063f2c66-7712-4aff-a002-fccc2821c91a-config-data\") pod \"keystone-5b488b889c-ks9th\" (UID: \"063f2c66-7712-4aff-a002-fccc2821c91a\") " pod="openstack/keystone-5b488b889c-ks9th"
Mar 09 09:27:15 crc kubenswrapper[4792]: I0309 09:27:15.185297 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-7f75f4656d-dmz5s"]
Mar 09 09:27:15 crc kubenswrapper[4792]: I0309 09:27:15.197558 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7f75f4656d-dmz5s"
Mar 09 09:27:15 crc kubenswrapper[4792]: I0309 09:27:15.226491 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config"
Mar 09 09:27:15 crc kubenswrapper[4792]: I0309 09:27:15.228314 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-snjvg"
Mar 09 09:27:15 crc kubenswrapper[4792]: I0309 09:27:15.228568 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs"
Mar 09 09:27:15 crc kubenswrapper[4792]: I0309 09:27:15.229182 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config"
Mar 09 09:27:15 crc kubenswrapper[4792]: I0309 09:27:15.231937 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m8t6n\" (UniqueName: \"kubernetes.io/projected/2a8c93ed-5a50-446f-8cb2-73b098410e0a-kube-api-access-m8t6n\") pod \"dnsmasq-dns-76fc8568c7-xztqr\" (UID: \"2a8c93ed-5a50-446f-8cb2-73b098410e0a\") " pod="openstack/dnsmasq-dns-76fc8568c7-xztqr"
Mar 09 09:27:15 crc kubenswrapper[4792]: I0309 09:27:15.232056 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qp6rt\" (UniqueName: \"kubernetes.io/projected/063f2c66-7712-4aff-a002-fccc2821c91a-kube-api-access-qp6rt\") pod \"keystone-5b488b889c-ks9th\" (UID: \"063f2c66-7712-4aff-a002-fccc2821c91a\") " pod="openstack/keystone-5b488b889c-ks9th"
Mar 09 09:27:15 crc kubenswrapper[4792]: I0309 09:27:15.232185 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/063f2c66-7712-4aff-a002-fccc2821c91a-internal-tls-certs\") pod \"keystone-5b488b889c-ks9th\" (UID: \"063f2c66-7712-4aff-a002-fccc2821c91a\") " pod="openstack/keystone-5b488b889c-ks9th"
Mar 09 09:27:15 crc kubenswrapper[4792]: I0309 09:27:15.232231 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/063f2c66-7712-4aff-a002-fccc2821c91a-fernet-keys\") pod \"keystone-5b488b889c-ks9th\" (UID: \"063f2c66-7712-4aff-a002-fccc2821c91a\") " pod="openstack/keystone-5b488b889c-ks9th"
Mar 09 09:27:15 crc kubenswrapper[4792]: I0309 09:27:15.232308 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/063f2c66-7712-4aff-a002-fccc2821c91a-public-tls-certs\") pod \"keystone-5b488b889c-ks9th\" (UID: \"063f2c66-7712-4aff-a002-fccc2821c91a\") " pod="openstack/keystone-5b488b889c-ks9th"
Mar 09 09:27:15 crc kubenswrapper[4792]: I0309 09:27:15.232353 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/063f2c66-7712-4aff-a002-fccc2821c91a-combined-ca-bundle\") pod \"keystone-5b488b889c-ks9th\" (UID: \"063f2c66-7712-4aff-a002-fccc2821c91a\") " pod="openstack/keystone-5b488b889c-ks9th"
Mar 09 09:27:15 crc kubenswrapper[4792]: I0309 09:27:15.232417 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/063f2c66-7712-4aff-a002-fccc2821c91a-config-data\") pod \"keystone-5b488b889c-ks9th\" (UID: \"063f2c66-7712-4aff-a002-fccc2821c91a\") " pod="openstack/keystone-5b488b889c-ks9th"
Mar 09 09:27:15 crc kubenswrapper[4792]: I0309 09:27:15.232460 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2a8c93ed-5a50-446f-8cb2-73b098410e0a-ovsdbserver-nb\") pod \"dnsmasq-dns-76fc8568c7-xztqr\" (UID: \"2a8c93ed-5a50-446f-8cb2-73b098410e0a\") " pod="openstack/dnsmasq-dns-76fc8568c7-xztqr"
Mar 09 09:27:15 crc kubenswrapper[4792]: I0309 09:27:15.232497 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/063f2c66-7712-4aff-a002-fccc2821c91a-credential-keys\") pod \"keystone-5b488b889c-ks9th\" (UID: \"063f2c66-7712-4aff-a002-fccc2821c91a\") " pod="openstack/keystone-5b488b889c-ks9th"
Mar 09 09:27:15 crc kubenswrapper[4792]: I0309 09:27:15.232532 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2a8c93ed-5a50-446f-8cb2-73b098410e0a-dns-svc\") pod \"dnsmasq-dns-76fc8568c7-xztqr\" (UID: \"2a8c93ed-5a50-446f-8cb2-73b098410e0a\") " pod="openstack/dnsmasq-dns-76fc8568c7-xztqr"
Mar 09 09:27:15 crc kubenswrapper[4792]: I0309 09:27:15.232580 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2a8c93ed-5a50-446f-8cb2-73b098410e0a-ovsdbserver-sb\") pod \"dnsmasq-dns-76fc8568c7-xztqr\" (UID: \"2a8c93ed-5a50-446f-8cb2-73b098410e0a\") " pod="openstack/dnsmasq-dns-76fc8568c7-xztqr"
Mar 09 09:27:15 crc kubenswrapper[4792]: I0309 09:27:15.232610 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2a8c93ed-5a50-446f-8cb2-73b098410e0a-config\") pod \"dnsmasq-dns-76fc8568c7-xztqr\" (UID: \"2a8c93ed-5a50-446f-8cb2-73b098410e0a\") " pod="openstack/dnsmasq-dns-76fc8568c7-xztqr"
Mar 09 09:27:15 crc kubenswrapper[4792]: I0309 09:27:15.232638 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/063f2c66-7712-4aff-a002-fccc2821c91a-scripts\") pod \"keystone-5b488b889c-ks9th\" (UID: \"063f2c66-7712-4aff-a002-fccc2821c91a\") " pod="openstack/keystone-5b488b889c-ks9th"
Mar 09 09:27:15 crc kubenswrapper[4792]: I0309 09:27:15.250361 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7f75f4656d-dmz5s"]
Mar 09 09:27:15 crc kubenswrapper[4792]: I0309 09:27:15.253559 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/063f2c66-7712-4aff-a002-fccc2821c91a-fernet-keys\") pod \"keystone-5b488b889c-ks9th\" (UID: \"063f2c66-7712-4aff-a002-fccc2821c91a\") " pod="openstack/keystone-5b488b889c-ks9th"
Mar 09 09:27:15 crc kubenswrapper[4792]: I0309 09:27:15.254138 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/063f2c66-7712-4aff-a002-fccc2821c91a-public-tls-certs\") pod \"keystone-5b488b889c-ks9th\" (UID: \"063f2c66-7712-4aff-a002-fccc2821c91a\") " pod="openstack/keystone-5b488b889c-ks9th"
Mar 09 09:27:15 crc kubenswrapper[4792]: I0309 09:27:15.255735 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/063f2c66-7712-4aff-a002-fccc2821c91a-combined-ca-bundle\") pod \"keystone-5b488b889c-ks9th\" (UID: \"063f2c66-7712-4aff-a002-fccc2821c91a\") " pod="openstack/keystone-5b488b889c-ks9th"
Mar 09 09:27:15 crc kubenswrapper[4792]: I0309 09:27:15.256262 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/063f2c66-7712-4aff-a002-fccc2821c91a-internal-tls-certs\") pod \"keystone-5b488b889c-ks9th\" (UID: \"063f2c66-7712-4aff-a002-fccc2821c91a\") " pod="openstack/keystone-5b488b889c-ks9th"
Mar 09 09:27:15 crc kubenswrapper[4792]: I0309 09:27:15.256580 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/063f2c66-7712-4aff-a002-fccc2821c91a-scripts\") pod \"keystone-5b488b889c-ks9th\" (UID: \"063f2c66-7712-4aff-a002-fccc2821c91a\") " pod="openstack/keystone-5b488b889c-ks9th"
Mar 09 09:27:15 crc kubenswrapper[4792]: I0309 09:27:15.264366 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qp6rt\" (UniqueName: \"kubernetes.io/projected/063f2c66-7712-4aff-a002-fccc2821c91a-kube-api-access-qp6rt\") pod \"keystone-5b488b889c-ks9th\" (UID: \"063f2c66-7712-4aff-a002-fccc2821c91a\") " pod="openstack/keystone-5b488b889c-ks9th"
Mar 09 09:27:15 crc kubenswrapper[4792]: I0309 09:27:15.265139 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/063f2c66-7712-4aff-a002-fccc2821c91a-config-data\") pod \"keystone-5b488b889c-ks9th\" (UID: \"063f2c66-7712-4aff-a002-fccc2821c91a\") " pod="openstack/keystone-5b488b889c-ks9th"
Mar 09 09:27:15 crc kubenswrapper[4792]: I0309 09:27:15.268610 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/063f2c66-7712-4aff-a002-fccc2821c91a-credential-keys\") pod \"keystone-5b488b889c-ks9th\" (UID: \"063f2c66-7712-4aff-a002-fccc2821c91a\") " pod="openstack/keystone-5b488b889c-ks9th"
Mar 09 09:27:15 crc kubenswrapper[4792]: I0309 09:27:15.281011 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-5b488b889c-ks9th"
Mar 09 09:27:15 crc kubenswrapper[4792]: I0309 09:27:15.335642 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a445589-e577-4db2-a287-c4c378a13030-combined-ca-bundle\") pod \"neutron-7f75f4656d-dmz5s\" (UID: \"4a445589-e577-4db2-a287-c4c378a13030\") " pod="openstack/neutron-7f75f4656d-dmz5s"
Mar 09 09:27:15 crc kubenswrapper[4792]: I0309 09:27:15.335697 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m8t6n\" (UniqueName: \"kubernetes.io/projected/2a8c93ed-5a50-446f-8cb2-73b098410e0a-kube-api-access-m8t6n\") pod \"dnsmasq-dns-76fc8568c7-xztqr\" (UID: \"2a8c93ed-5a50-446f-8cb2-73b098410e0a\") " pod="openstack/dnsmasq-dns-76fc8568c7-xztqr"
Mar 09 09:27:15 crc kubenswrapper[4792]: I0309 09:27:15.335792 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4a445589-e577-4db2-a287-c4c378a13030-ovndb-tls-certs\") pod \"neutron-7f75f4656d-dmz5s\" (UID: \"4a445589-e577-4db2-a287-c4c378a13030\") " pod="openstack/neutron-7f75f4656d-dmz5s"
Mar 09 09:27:15 crc kubenswrapper[4792]: I0309 09:27:15.335824 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/4a445589-e577-4db2-a287-c4c378a13030-config\") pod \"neutron-7f75f4656d-dmz5s\" (UID: \"4a445589-e577-4db2-a287-c4c378a13030\") " pod="openstack/neutron-7f75f4656d-dmz5s"
Mar 09 09:27:15 crc kubenswrapper[4792]: I0309 09:27:15.335841 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/4a445589-e577-4db2-a287-c4c378a13030-httpd-config\") pod \"neutron-7f75f4656d-dmz5s\" (UID: \"4a445589-e577-4db2-a287-c4c378a13030\") " pod="openstack/neutron-7f75f4656d-dmz5s"
Mar 09 09:27:15 crc kubenswrapper[4792]: I0309 09:27:15.335886 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2a8c93ed-5a50-446f-8cb2-73b098410e0a-ovsdbserver-nb\") pod \"dnsmasq-dns-76fc8568c7-xztqr\" (UID: \"2a8c93ed-5a50-446f-8cb2-73b098410e0a\") " pod="openstack/dnsmasq-dns-76fc8568c7-xztqr"
Mar 09 09:27:15 crc kubenswrapper[4792]: I0309 09:27:15.335908 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2a8c93ed-5a50-446f-8cb2-73b098410e0a-dns-svc\") pod \"dnsmasq-dns-76fc8568c7-xztqr\" (UID: \"2a8c93ed-5a50-446f-8cb2-73b098410e0a\") " pod="openstack/dnsmasq-dns-76fc8568c7-xztqr"
Mar 09 09:27:15 crc kubenswrapper[4792]: I0309 09:27:15.335928 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2a8c93ed-5a50-446f-8cb2-73b098410e0a-ovsdbserver-sb\") pod \"dnsmasq-dns-76fc8568c7-xztqr\" (UID: \"2a8c93ed-5a50-446f-8cb2-73b098410e0a\") " pod="openstack/dnsmasq-dns-76fc8568c7-xztqr"
Mar 09 09:27:15 crc kubenswrapper[4792]: I0309 09:27:15.335946 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2a8c93ed-5a50-446f-8cb2-73b098410e0a-config\") pod \"dnsmasq-dns-76fc8568c7-xztqr\" (UID: \"2a8c93ed-5a50-446f-8cb2-73b098410e0a\") " pod="openstack/dnsmasq-dns-76fc8568c7-xztqr"
Mar 09 09:27:15 crc kubenswrapper[4792]: I0309 09:27:15.335963 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qv8wm\" (UniqueName: \"kubernetes.io/projected/4a445589-e577-4db2-a287-c4c378a13030-kube-api-access-qv8wm\") pod \"neutron-7f75f4656d-dmz5s\" (UID: \"4a445589-e577-4db2-a287-c4c378a13030\") " pod="openstack/neutron-7f75f4656d-dmz5s"
Mar 09 09:27:15 crc kubenswrapper[4792]: I0309 09:27:15.338436 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2a8c93ed-5a50-446f-8cb2-73b098410e0a-ovsdbserver-nb\") pod \"dnsmasq-dns-76fc8568c7-xztqr\" (UID: \"2a8c93ed-5a50-446f-8cb2-73b098410e0a\") " pod="openstack/dnsmasq-dns-76fc8568c7-xztqr"
Mar 09 09:27:15 crc kubenswrapper[4792]: I0309 09:27:15.338735 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2a8c93ed-5a50-446f-8cb2-73b098410e0a-ovsdbserver-sb\") pod \"dnsmasq-dns-76fc8568c7-xztqr\" (UID: \"2a8c93ed-5a50-446f-8cb2-73b098410e0a\") " pod="openstack/dnsmasq-dns-76fc8568c7-xztqr"
Mar 09 09:27:15 crc kubenswrapper[4792]: I0309 09:27:15.342013 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2a8c93ed-5a50-446f-8cb2-73b098410e0a-config\") pod \"dnsmasq-dns-76fc8568c7-xztqr\" (UID: \"2a8c93ed-5a50-446f-8cb2-73b098410e0a\") " pod="openstack/dnsmasq-dns-76fc8568c7-xztqr"
Mar 09 09:27:15 crc kubenswrapper[4792]: I0309 09:27:15.343860 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2a8c93ed-5a50-446f-8cb2-73b098410e0a-dns-svc\") pod \"dnsmasq-dns-76fc8568c7-xztqr\" (UID: \"2a8c93ed-5a50-446f-8cb2-73b098410e0a\") " pod="openstack/dnsmasq-dns-76fc8568c7-xztqr"
Mar 09 09:27:15 crc kubenswrapper[4792]: I0309 09:27:15.355636 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m8t6n\" (UniqueName: \"kubernetes.io/projected/2a8c93ed-5a50-446f-8cb2-73b098410e0a-kube-api-access-m8t6n\") pod \"dnsmasq-dns-76fc8568c7-xztqr\" (UID: \"2a8c93ed-5a50-446f-8cb2-73b098410e0a\") " pod="openstack/dnsmasq-dns-76fc8568c7-xztqr"
Mar 09 09:27:15 crc kubenswrapper[4792]: I0309 09:27:15.380122 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-76fc8568c7-xztqr"
Mar 09 09:27:15 crc kubenswrapper[4792]: I0309 09:27:15.442204 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4a445589-e577-4db2-a287-c4c378a13030-ovndb-tls-certs\") pod \"neutron-7f75f4656d-dmz5s\" (UID: \"4a445589-e577-4db2-a287-c4c378a13030\") " pod="openstack/neutron-7f75f4656d-dmz5s"
Mar 09 09:27:15 crc kubenswrapper[4792]: I0309 09:27:15.442286 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/4a445589-e577-4db2-a287-c4c378a13030-config\") pod \"neutron-7f75f4656d-dmz5s\" (UID: \"4a445589-e577-4db2-a287-c4c378a13030\") " pod="openstack/neutron-7f75f4656d-dmz5s"
Mar 09 09:27:15 crc kubenswrapper[4792]: I0309 09:27:15.442311 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/4a445589-e577-4db2-a287-c4c378a13030-httpd-config\") pod \"neutron-7f75f4656d-dmz5s\" (UID: \"4a445589-e577-4db2-a287-c4c378a13030\") " pod="openstack/neutron-7f75f4656d-dmz5s"
Mar 09 09:27:15 crc kubenswrapper[4792]: I0309 09:27:15.442394 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qv8wm\" (UniqueName: \"kubernetes.io/projected/4a445589-e577-4db2-a287-c4c378a13030-kube-api-access-qv8wm\") pod \"neutron-7f75f4656d-dmz5s\" (UID: \"4a445589-e577-4db2-a287-c4c378a13030\") " pod="openstack/neutron-7f75f4656d-dmz5s"
Mar 09 09:27:15 crc kubenswrapper[4792]: I0309 09:27:15.442424 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a445589-e577-4db2-a287-c4c378a13030-combined-ca-bundle\") pod \"neutron-7f75f4656d-dmz5s\" (UID: \"4a445589-e577-4db2-a287-c4c378a13030\") " pod="openstack/neutron-7f75f4656d-dmz5s"
Mar 09 09:27:15 crc kubenswrapper[4792]: I0309 09:27:15.454016 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/4a445589-e577-4db2-a287-c4c378a13030-config\") pod \"neutron-7f75f4656d-dmz5s\" (UID: \"4a445589-e577-4db2-a287-c4c378a13030\") " pod="openstack/neutron-7f75f4656d-dmz5s"
Mar 09 09:27:15 crc kubenswrapper[4792]: I0309 09:27:15.456004 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-5c87787d8d-6grm6"]
Mar 09 09:27:15 crc kubenswrapper[4792]: I0309 09:27:15.457575 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5c87787d8d-6grm6"
Mar 09 09:27:15 crc kubenswrapper[4792]: I0309 09:27:15.469650 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/4a445589-e577-4db2-a287-c4c378a13030-httpd-config\") pod \"neutron-7f75f4656d-dmz5s\" (UID: \"4a445589-e577-4db2-a287-c4c378a13030\") " pod="openstack/neutron-7f75f4656d-dmz5s"
Mar 09 09:27:15 crc kubenswrapper[4792]: I0309 09:27:15.470234 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a445589-e577-4db2-a287-c4c378a13030-combined-ca-bundle\") pod \"neutron-7f75f4656d-dmz5s\" (UID: \"4a445589-e577-4db2-a287-c4c378a13030\") " pod="openstack/neutron-7f75f4656d-dmz5s"
Mar 09 09:27:15 crc kubenswrapper[4792]: I0309 09:27:15.479026 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5c87787d8d-6grm6"]
Mar 09 09:27:15 crc kubenswrapper[4792]: I0309 09:27:15.482851 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4a445589-e577-4db2-a287-c4c378a13030-ovndb-tls-certs\") pod \"neutron-7f75f4656d-dmz5s\" (UID: \"4a445589-e577-4db2-a287-c4c378a13030\") " pod="openstack/neutron-7f75f4656d-dmz5s"
Mar 09 09:27:15 crc kubenswrapper[4792]: I0309 09:27:15.487909 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qv8wm\" (UniqueName: \"kubernetes.io/projected/4a445589-e577-4db2-a287-c4c378a13030-kube-api-access-qv8wm\") pod \"neutron-7f75f4656d-dmz5s\" (UID: \"4a445589-e577-4db2-a287-c4c378a13030\") " pod="openstack/neutron-7f75f4656d-dmz5s"
Mar 09 09:27:15 crc kubenswrapper[4792]: I0309 09:27:15.545974 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/892ac2f9-60af-4080-98e4-c4100f33dbc1-httpd-config\") pod \"neutron-5c87787d8d-6grm6\" (UID: \"892ac2f9-60af-4080-98e4-c4100f33dbc1\") " pod="openstack/neutron-5c87787d8d-6grm6"
Mar 09 09:27:15 crc kubenswrapper[4792]: I0309 09:27:15.546525 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/892ac2f9-60af-4080-98e4-c4100f33dbc1-combined-ca-bundle\") pod \"neutron-5c87787d8d-6grm6\" (UID: \"892ac2f9-60af-4080-98e4-c4100f33dbc1\") " pod="openstack/neutron-5c87787d8d-6grm6"
Mar 09 09:27:15 crc kubenswrapper[4792]: I0309 09:27:15.546572 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d9w6r\" (UniqueName: \"kubernetes.io/projected/892ac2f9-60af-4080-98e4-c4100f33dbc1-kube-api-access-d9w6r\") pod \"neutron-5c87787d8d-6grm6\" (UID: \"892ac2f9-60af-4080-98e4-c4100f33dbc1\") " pod="openstack/neutron-5c87787d8d-6grm6"
Mar 09 09:27:15 crc kubenswrapper[4792]: I0309 09:27:15.546607 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/892ac2f9-60af-4080-98e4-c4100f33dbc1-config\") pod \"neutron-5c87787d8d-6grm6\" (UID: \"892ac2f9-60af-4080-98e4-c4100f33dbc1\") " pod="openstack/neutron-5c87787d8d-6grm6"
Mar 09 09:27:15 crc kubenswrapper[4792]: I0309 09:27:15.546734 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/892ac2f9-60af-4080-98e4-c4100f33dbc1-ovndb-tls-certs\") pod \"neutron-5c87787d8d-6grm6\" (UID: \"892ac2f9-60af-4080-98e4-c4100f33dbc1\") " pod="openstack/neutron-5c87787d8d-6grm6"
Mar 09 09:27:15 crc kubenswrapper[4792]: I0309 09:27:15.549545 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7f75f4656d-dmz5s"
Mar 09 09:27:15 crc kubenswrapper[4792]: I0309 09:27:15.647869 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/892ac2f9-60af-4080-98e4-c4100f33dbc1-httpd-config\") pod \"neutron-5c87787d8d-6grm6\" (UID: \"892ac2f9-60af-4080-98e4-c4100f33dbc1\") " pod="openstack/neutron-5c87787d8d-6grm6"
Mar 09 09:27:15 crc kubenswrapper[4792]: I0309 09:27:15.647976 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/892ac2f9-60af-4080-98e4-c4100f33dbc1-combined-ca-bundle\") pod \"neutron-5c87787d8d-6grm6\" (UID: \"892ac2f9-60af-4080-98e4-c4100f33dbc1\") " pod="openstack/neutron-5c87787d8d-6grm6"
Mar 09 09:27:15 crc kubenswrapper[4792]: I0309 09:27:15.648003 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d9w6r\" (UniqueName: \"kubernetes.io/projected/892ac2f9-60af-4080-98e4-c4100f33dbc1-kube-api-access-d9w6r\") pod \"neutron-5c87787d8d-6grm6\" (UID: \"892ac2f9-60af-4080-98e4-c4100f33dbc1\") " pod="openstack/neutron-5c87787d8d-6grm6"
Mar 09 09:27:15 crc kubenswrapper[4792]: I0309 09:27:15.648023 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/892ac2f9-60af-4080-98e4-c4100f33dbc1-config\") pod \"neutron-5c87787d8d-6grm6\" (UID: \"892ac2f9-60af-4080-98e4-c4100f33dbc1\") " pod="openstack/neutron-5c87787d8d-6grm6"
Mar 09 09:27:15 crc kubenswrapper[4792]: I0309 09:27:15.648059 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/892ac2f9-60af-4080-98e4-c4100f33dbc1-ovndb-tls-certs\") pod \"neutron-5c87787d8d-6grm6\" (UID: \"892ac2f9-60af-4080-98e4-c4100f33dbc1\") " pod="openstack/neutron-5c87787d8d-6grm6"
Mar 09 09:27:15 crc kubenswrapper[4792]: I0309 09:27:15.679401 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/892ac2f9-60af-4080-98e4-c4100f33dbc1-combined-ca-bundle\") pod \"neutron-5c87787d8d-6grm6\" (UID: \"892ac2f9-60af-4080-98e4-c4100f33dbc1\") " pod="openstack/neutron-5c87787d8d-6grm6"
Mar 09 09:27:15 crc kubenswrapper[4792]: I0309 09:27:15.680968 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/892ac2f9-60af-4080-98e4-c4100f33dbc1-ovndb-tls-certs\") pod \"neutron-5c87787d8d-6grm6\" (UID: \"892ac2f9-60af-4080-98e4-c4100f33dbc1\") " pod="openstack/neutron-5c87787d8d-6grm6"
Mar 09 09:27:15 crc kubenswrapper[4792]: I0309 09:27:15.689693 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/892ac2f9-60af-4080-98e4-c4100f33dbc1-config\") pod \"neutron-5c87787d8d-6grm6\" (UID: \"892ac2f9-60af-4080-98e4-c4100f33dbc1\") " pod="openstack/neutron-5c87787d8d-6grm6"
Mar 09 09:27:15 crc kubenswrapper[4792]: I0309 09:27:15.692504 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/892ac2f9-60af-4080-98e4-c4100f33dbc1-httpd-config\") pod \"neutron-5c87787d8d-6grm6\" (UID: \"892ac2f9-60af-4080-98e4-c4100f33dbc1\") " pod="openstack/neutron-5c87787d8d-6grm6"
Mar 09 09:27:15 crc kubenswrapper[4792]: I0309 09:27:15.719390 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d9w6r\" (UniqueName: \"kubernetes.io/projected/892ac2f9-60af-4080-98e4-c4100f33dbc1-kube-api-access-d9w6r\") pod \"neutron-5c87787d8d-6grm6\" (UID: \"892ac2f9-60af-4080-98e4-c4100f33dbc1\") " pod="openstack/neutron-5c87787d8d-6grm6"
Mar 09 09:27:15 crc kubenswrapper[4792]: I0309 09:27:15.795397 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7b89dcf9c6-x4cgw" event={"ID":"ecc21e88-9435-44e8-9fae-4838ae5e46ce","Type":"ContainerStarted","Data":"02eafe46662cfecf14424d3472d3bced8851ad529264bdac4f5ef7f030a9f962"}
Mar 09 09:27:15 crc kubenswrapper[4792]: I0309 09:27:15.795439 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7b89dcf9c6-x4cgw" event={"ID":"ecc21e88-9435-44e8-9fae-4838ae5e46ce","Type":"ContainerStarted","Data":"70031f87f3291e5b64587bd61d4148f9975d8f1294b6e77ac12d0bc16de291bc"}
Mar 09 09:27:15 crc kubenswrapper[4792]: I0309 09:27:15.796487 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-7b89dcf9c6-x4cgw"
Mar 09 09:27:15 crc kubenswrapper[4792]: I0309 09:27:15.796511 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-7b89dcf9c6-x4cgw"
Mar 09 09:27:15 crc kubenswrapper[4792]: I0309 09:27:15.801855 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5c87787d8d-6grm6"
Mar 09 09:27:15 crc kubenswrapper[4792]: I0309 09:27:15.838754 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-7b89dcf9c6-x4cgw" podStartSLOduration=2.838736225 podStartE2EDuration="2.838736225s" podCreationTimestamp="2026-03-09 09:27:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:27:15.821592888 +0000 UTC m=+1200.851793650" watchObservedRunningTime="2026-03-09 09:27:15.838736225 +0000 UTC m=+1200.868936977"
Mar 09 09:27:16 crc kubenswrapper[4792]: I0309 09:27:16.023880 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-5b488b889c-ks9th"]
Mar 09 09:27:16 crc kubenswrapper[4792]: W0309 09:27:16.044875 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod063f2c66_7712_4aff_a002_fccc2821c91a.slice/crio-81090f8fab00e70a98671463e8a9842b6d4a1cdb042d4ed3a95fafb9f04edc83 WatchSource:0}: Error finding container 81090f8fab00e70a98671463e8a9842b6d4a1cdb042d4ed3a95fafb9f04edc83: Status 404 returned error can't find the container with id 81090f8fab00e70a98671463e8a9842b6d4a1cdb042d4ed3a95fafb9f04edc83
Mar 09 09:27:16 crc kubenswrapper[4792]: I0309 09:27:16.174610 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-76fc8568c7-xztqr"]
Mar 09 09:27:16 crc kubenswrapper[4792]: W0309 09:27:16.205944 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2a8c93ed_5a50_446f_8cb2_73b098410e0a.slice/crio-12ed6f6e1aa5fd05ea7a8fcba6cbe27448a62ec3a86bfdfd04d08152dd7f35fc WatchSource:0}: Error finding container 12ed6f6e1aa5fd05ea7a8fcba6cbe27448a62ec3a86bfdfd04d08152dd7f35fc: Status 404 returned error can't find the container with id 12ed6f6e1aa5fd05ea7a8fcba6cbe27448a62ec3a86bfdfd04d08152dd7f35fc
Mar 09 09:27:16 crc kubenswrapper[4792]: I0309 09:27:16.550787 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7f75f4656d-dmz5s"]
Mar 09 09:27:16 crc kubenswrapper[4792]: I0309 09:27:16.668972 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-6d6ff87dd6-6wzmx"]
Mar 09 09:27:16 crc kubenswrapper[4792]: I0309 09:27:16.670898 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-6d6ff87dd6-6wzmx"
Mar 09 09:27:16 crc kubenswrapper[4792]: I0309 09:27:16.699929 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-6d6ff87dd6-6wzmx"]
Mar 09 09:27:16 crc kubenswrapper[4792]: I0309 09:27:16.796084 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/52f7c11a-3099-487b-9126-fd90d1db1aaa-logs\") pod \"placement-6d6ff87dd6-6wzmx\" (UID: \"52f7c11a-3099-487b-9126-fd90d1db1aaa\") " pod="openstack/placement-6d6ff87dd6-6wzmx"
Mar 09 09:27:16 crc kubenswrapper[4792]: I0309 09:27:16.796583 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/52f7c11a-3099-487b-9126-fd90d1db1aaa-scripts\") pod \"placement-6d6ff87dd6-6wzmx\" (UID: \"52f7c11a-3099-487b-9126-fd90d1db1aaa\") " pod="openstack/placement-6d6ff87dd6-6wzmx"
Mar 09 09:27:16 crc kubenswrapper[4792]: I0309 09:27:16.796612 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/52f7c11a-3099-487b-9126-fd90d1db1aaa-internal-tls-certs\") pod \"placement-6d6ff87dd6-6wzmx\" (UID: \"52f7c11a-3099-487b-9126-fd90d1db1aaa\") " pod="openstack/placement-6d6ff87dd6-6wzmx"
Mar 09 09:27:16 crc kubenswrapper[4792]: I0309 09:27:16.796647 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52f7c11a-3099-487b-9126-fd90d1db1aaa-combined-ca-bundle\") pod \"placement-6d6ff87dd6-6wzmx\" (UID: \"52f7c11a-3099-487b-9126-fd90d1db1aaa\") " pod="openstack/placement-6d6ff87dd6-6wzmx"
Mar 09 09:27:16 crc kubenswrapper[4792]: I0309 09:27:16.796728 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r69vd\" (UniqueName: \"kubernetes.io/projected/52f7c11a-3099-487b-9126-fd90d1db1aaa-kube-api-access-r69vd\") pod \"placement-6d6ff87dd6-6wzmx\" (UID: \"52f7c11a-3099-487b-9126-fd90d1db1aaa\") " pod="openstack/placement-6d6ff87dd6-6wzmx"
Mar 09 09:27:16 crc kubenswrapper[4792]: I0309 09:27:16.796758 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/52f7c11a-3099-487b-9126-fd90d1db1aaa-public-tls-certs\") pod \"placement-6d6ff87dd6-6wzmx\" (UID: \"52f7c11a-3099-487b-9126-fd90d1db1aaa\") " pod="openstack/placement-6d6ff87dd6-6wzmx"
Mar 09 09:27:16 crc kubenswrapper[4792]: I0309 09:27:16.796810 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52f7c11a-3099-487b-9126-fd90d1db1aaa-config-data\") pod \"placement-6d6ff87dd6-6wzmx\" (UID: \"52f7c11a-3099-487b-9126-fd90d1db1aaa\") " pod="openstack/placement-6d6ff87dd6-6wzmx"
Mar 09 09:27:16 crc kubenswrapper[4792]: I0309 09:27:16.839591 4792 generic.go:334] "Generic (PLEG): container finished" podID="2a8c93ed-5a50-446f-8cb2-73b098410e0a" containerID="08d1751037b8abb64aedc3b550c81014a71df916e8d6d562e28acfaf9d4b07d9" exitCode=0
Mar 09 09:27:16 crc kubenswrapper[4792]: I0309 09:27:16.839683 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76fc8568c7-xztqr"
event={"ID":"2a8c93ed-5a50-446f-8cb2-73b098410e0a","Type":"ContainerDied","Data":"08d1751037b8abb64aedc3b550c81014a71df916e8d6d562e28acfaf9d4b07d9"} Mar 09 09:27:16 crc kubenswrapper[4792]: I0309 09:27:16.839708 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76fc8568c7-xztqr" event={"ID":"2a8c93ed-5a50-446f-8cb2-73b098410e0a","Type":"ContainerStarted","Data":"12ed6f6e1aa5fd05ea7a8fcba6cbe27448a62ec3a86bfdfd04d08152dd7f35fc"} Mar 09 09:27:16 crc kubenswrapper[4792]: I0309 09:27:16.850966 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5b488b889c-ks9th" event={"ID":"063f2c66-7712-4aff-a002-fccc2821c91a","Type":"ContainerStarted","Data":"7916b90520b66cb275730540f6c58b3312e9e1f60c58009faf04122aed4a0446"} Mar 09 09:27:16 crc kubenswrapper[4792]: I0309 09:27:16.851015 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5b488b889c-ks9th" event={"ID":"063f2c66-7712-4aff-a002-fccc2821c91a","Type":"ContainerStarted","Data":"81090f8fab00e70a98671463e8a9842b6d4a1cdb042d4ed3a95fafb9f04edc83"} Mar 09 09:27:16 crc kubenswrapper[4792]: I0309 09:27:16.852604 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-5b488b889c-ks9th" Mar 09 09:27:16 crc kubenswrapper[4792]: I0309 09:27:16.865925 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7f75f4656d-dmz5s" event={"ID":"4a445589-e577-4db2-a287-c4c378a13030","Type":"ContainerStarted","Data":"7219fd819421db554ed2048138ef5e1d552e4486a049eb027b18480106a62459"} Mar 09 09:27:16 crc kubenswrapper[4792]: I0309 09:27:16.898165 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52f7c11a-3099-487b-9126-fd90d1db1aaa-config-data\") pod \"placement-6d6ff87dd6-6wzmx\" (UID: \"52f7c11a-3099-487b-9126-fd90d1db1aaa\") " pod="openstack/placement-6d6ff87dd6-6wzmx" Mar 09 09:27:16 crc kubenswrapper[4792]: I0309 
09:27:16.898254 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/52f7c11a-3099-487b-9126-fd90d1db1aaa-logs\") pod \"placement-6d6ff87dd6-6wzmx\" (UID: \"52f7c11a-3099-487b-9126-fd90d1db1aaa\") " pod="openstack/placement-6d6ff87dd6-6wzmx" Mar 09 09:27:16 crc kubenswrapper[4792]: I0309 09:27:16.898320 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/52f7c11a-3099-487b-9126-fd90d1db1aaa-scripts\") pod \"placement-6d6ff87dd6-6wzmx\" (UID: \"52f7c11a-3099-487b-9126-fd90d1db1aaa\") " pod="openstack/placement-6d6ff87dd6-6wzmx" Mar 09 09:27:16 crc kubenswrapper[4792]: I0309 09:27:16.898339 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/52f7c11a-3099-487b-9126-fd90d1db1aaa-internal-tls-certs\") pod \"placement-6d6ff87dd6-6wzmx\" (UID: \"52f7c11a-3099-487b-9126-fd90d1db1aaa\") " pod="openstack/placement-6d6ff87dd6-6wzmx" Mar 09 09:27:16 crc kubenswrapper[4792]: I0309 09:27:16.898391 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52f7c11a-3099-487b-9126-fd90d1db1aaa-combined-ca-bundle\") pod \"placement-6d6ff87dd6-6wzmx\" (UID: \"52f7c11a-3099-487b-9126-fd90d1db1aaa\") " pod="openstack/placement-6d6ff87dd6-6wzmx" Mar 09 09:27:16 crc kubenswrapper[4792]: I0309 09:27:16.898424 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r69vd\" (UniqueName: \"kubernetes.io/projected/52f7c11a-3099-487b-9126-fd90d1db1aaa-kube-api-access-r69vd\") pod \"placement-6d6ff87dd6-6wzmx\" (UID: \"52f7c11a-3099-487b-9126-fd90d1db1aaa\") " pod="openstack/placement-6d6ff87dd6-6wzmx" Mar 09 09:27:16 crc kubenswrapper[4792]: I0309 09:27:16.898448 4792 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/52f7c11a-3099-487b-9126-fd90d1db1aaa-public-tls-certs\") pod \"placement-6d6ff87dd6-6wzmx\" (UID: \"52f7c11a-3099-487b-9126-fd90d1db1aaa\") " pod="openstack/placement-6d6ff87dd6-6wzmx" Mar 09 09:27:16 crc kubenswrapper[4792]: I0309 09:27:16.898779 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/52f7c11a-3099-487b-9126-fd90d1db1aaa-logs\") pod \"placement-6d6ff87dd6-6wzmx\" (UID: \"52f7c11a-3099-487b-9126-fd90d1db1aaa\") " pod="openstack/placement-6d6ff87dd6-6wzmx" Mar 09 09:27:16 crc kubenswrapper[4792]: I0309 09:27:16.902478 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/52f7c11a-3099-487b-9126-fd90d1db1aaa-public-tls-certs\") pod \"placement-6d6ff87dd6-6wzmx\" (UID: \"52f7c11a-3099-487b-9126-fd90d1db1aaa\") " pod="openstack/placement-6d6ff87dd6-6wzmx" Mar 09 09:27:16 crc kubenswrapper[4792]: I0309 09:27:16.902558 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/52f7c11a-3099-487b-9126-fd90d1db1aaa-internal-tls-certs\") pod \"placement-6d6ff87dd6-6wzmx\" (UID: \"52f7c11a-3099-487b-9126-fd90d1db1aaa\") " pod="openstack/placement-6d6ff87dd6-6wzmx" Mar 09 09:27:16 crc kubenswrapper[4792]: I0309 09:27:16.903655 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52f7c11a-3099-487b-9126-fd90d1db1aaa-config-data\") pod \"placement-6d6ff87dd6-6wzmx\" (UID: \"52f7c11a-3099-487b-9126-fd90d1db1aaa\") " pod="openstack/placement-6d6ff87dd6-6wzmx" Mar 09 09:27:16 crc kubenswrapper[4792]: I0309 09:27:16.910017 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/52f7c11a-3099-487b-9126-fd90d1db1aaa-scripts\") pod 
\"placement-6d6ff87dd6-6wzmx\" (UID: \"52f7c11a-3099-487b-9126-fd90d1db1aaa\") " pod="openstack/placement-6d6ff87dd6-6wzmx" Mar 09 09:27:16 crc kubenswrapper[4792]: I0309 09:27:16.915890 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-5b488b889c-ks9th" podStartSLOduration=2.915872787 podStartE2EDuration="2.915872787s" podCreationTimestamp="2026-03-09 09:27:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:27:16.904318741 +0000 UTC m=+1201.934519503" watchObservedRunningTime="2026-03-09 09:27:16.915872787 +0000 UTC m=+1201.946073529" Mar 09 09:27:16 crc kubenswrapper[4792]: I0309 09:27:16.922708 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52f7c11a-3099-487b-9126-fd90d1db1aaa-combined-ca-bundle\") pod \"placement-6d6ff87dd6-6wzmx\" (UID: \"52f7c11a-3099-487b-9126-fd90d1db1aaa\") " pod="openstack/placement-6d6ff87dd6-6wzmx" Mar 09 09:27:16 crc kubenswrapper[4792]: I0309 09:27:16.922963 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r69vd\" (UniqueName: \"kubernetes.io/projected/52f7c11a-3099-487b-9126-fd90d1db1aaa-kube-api-access-r69vd\") pod \"placement-6d6ff87dd6-6wzmx\" (UID: \"52f7c11a-3099-487b-9126-fd90d1db1aaa\") " pod="openstack/placement-6d6ff87dd6-6wzmx" Mar 09 09:27:17 crc kubenswrapper[4792]: I0309 09:27:17.074580 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-6d6ff87dd6-6wzmx" Mar 09 09:27:17 crc kubenswrapper[4792]: I0309 09:27:17.254558 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5c87787d8d-6grm6"] Mar 09 09:27:17 crc kubenswrapper[4792]: I0309 09:27:17.897707 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7f75f4656d-dmz5s" event={"ID":"4a445589-e577-4db2-a287-c4c378a13030","Type":"ContainerStarted","Data":"3364f4b684f8772e3c518e60ae97f01621a6ae4977b748ac643f0e5ef03810d7"} Mar 09 09:27:17 crc kubenswrapper[4792]: I0309 09:27:17.901429 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7f75f4656d-dmz5s" event={"ID":"4a445589-e577-4db2-a287-c4c378a13030","Type":"ContainerStarted","Data":"d8e4f48a3f8d48b098af90832eb2aed9d3856576ca12155e6cda43293deec58b"} Mar 09 09:27:17 crc kubenswrapper[4792]: I0309 09:27:17.902841 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-7f75f4656d-dmz5s" Mar 09 09:27:17 crc kubenswrapper[4792]: I0309 09:27:17.917088 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-6d6ff87dd6-6wzmx"] Mar 09 09:27:17 crc kubenswrapper[4792]: I0309 09:27:17.921272 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5c87787d8d-6grm6" event={"ID":"892ac2f9-60af-4080-98e4-c4100f33dbc1","Type":"ContainerStarted","Data":"fab24a861a59e0830f0752b4396c28e0c1512ff77fcc9d01804f79d24424b223"} Mar 09 09:27:17 crc kubenswrapper[4792]: I0309 09:27:17.921426 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5c87787d8d-6grm6" event={"ID":"892ac2f9-60af-4080-98e4-c4100f33dbc1","Type":"ContainerStarted","Data":"b7a90ccc2edaff544fc0887adb11913ff45233812d8ddfb5ad4e340ee573044d"} Mar 09 09:27:17 crc kubenswrapper[4792]: W0309 09:27:17.933968 4792 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod52f7c11a_3099_487b_9126_fd90d1db1aaa.slice/crio-7125e387dd91f276ee2bd2dd1453301f01ab09b6640e497dfd028718aba731e7 WatchSource:0}: Error finding container 7125e387dd91f276ee2bd2dd1453301f01ab09b6640e497dfd028718aba731e7: Status 404 returned error can't find the container with id 7125e387dd91f276ee2bd2dd1453301f01ab09b6640e497dfd028718aba731e7 Mar 09 09:27:17 crc kubenswrapper[4792]: I0309 09:27:17.957269 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76fc8568c7-xztqr" event={"ID":"2a8c93ed-5a50-446f-8cb2-73b098410e0a","Type":"ContainerStarted","Data":"fc589c016607b7d81e8972e1eba759e6a5c8ab52332a5f87845fdfa7117fac26"} Mar 09 09:27:17 crc kubenswrapper[4792]: I0309 09:27:17.958173 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-76fc8568c7-xztqr" Mar 09 09:27:17 crc kubenswrapper[4792]: I0309 09:27:17.965362 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-7f75f4656d-dmz5s" podStartSLOduration=2.965348066 podStartE2EDuration="2.965348066s" podCreationTimestamp="2026-03-09 09:27:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:27:17.94932822 +0000 UTC m=+1202.979528972" watchObservedRunningTime="2026-03-09 09:27:17.965348066 +0000 UTC m=+1202.995548818" Mar 09 09:27:17 crc kubenswrapper[4792]: I0309 09:27:17.996940 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-76fc8568c7-xztqr" podStartSLOduration=2.996921292 podStartE2EDuration="2.996921292s" podCreationTimestamp="2026-03-09 09:27:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:27:17.99411269 +0000 UTC m=+1203.024313532" watchObservedRunningTime="2026-03-09 
09:27:17.996921292 +0000 UTC m=+1203.027122044" Mar 09 09:27:18 crc kubenswrapper[4792]: I0309 09:27:18.184930 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-5c87787d8d-6grm6"] Mar 09 09:27:18 crc kubenswrapper[4792]: I0309 09:27:18.233861 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-6cbbcf5c8f-spnsr"] Mar 09 09:27:18 crc kubenswrapper[4792]: I0309 09:27:18.236616 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6cbbcf5c8f-spnsr" Mar 09 09:27:18 crc kubenswrapper[4792]: I0309 09:27:18.238369 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Mar 09 09:27:18 crc kubenswrapper[4792]: I0309 09:27:18.238916 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Mar 09 09:27:18 crc kubenswrapper[4792]: I0309 09:27:18.255035 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6cbbcf5c8f-spnsr"] Mar 09 09:27:18 crc kubenswrapper[4792]: I0309 09:27:18.330966 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4ad68345-e440-498d-a525-014a7db81ea6-internal-tls-certs\") pod \"neutron-6cbbcf5c8f-spnsr\" (UID: \"4ad68345-e440-498d-a525-014a7db81ea6\") " pod="openstack/neutron-6cbbcf5c8f-spnsr" Mar 09 09:27:18 crc kubenswrapper[4792]: I0309 09:27:18.331024 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/4ad68345-e440-498d-a525-014a7db81ea6-config\") pod \"neutron-6cbbcf5c8f-spnsr\" (UID: \"4ad68345-e440-498d-a525-014a7db81ea6\") " pod="openstack/neutron-6cbbcf5c8f-spnsr" Mar 09 09:27:18 crc kubenswrapper[4792]: I0309 09:27:18.331088 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4ad68345-e440-498d-a525-014a7db81ea6-ovndb-tls-certs\") pod \"neutron-6cbbcf5c8f-spnsr\" (UID: \"4ad68345-e440-498d-a525-014a7db81ea6\") " pod="openstack/neutron-6cbbcf5c8f-spnsr" Mar 09 09:27:18 crc kubenswrapper[4792]: I0309 09:27:18.331135 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4ad68345-e440-498d-a525-014a7db81ea6-public-tls-certs\") pod \"neutron-6cbbcf5c8f-spnsr\" (UID: \"4ad68345-e440-498d-a525-014a7db81ea6\") " pod="openstack/neutron-6cbbcf5c8f-spnsr" Mar 09 09:27:18 crc kubenswrapper[4792]: I0309 09:27:18.331222 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/4ad68345-e440-498d-a525-014a7db81ea6-httpd-config\") pod \"neutron-6cbbcf5c8f-spnsr\" (UID: \"4ad68345-e440-498d-a525-014a7db81ea6\") " pod="openstack/neutron-6cbbcf5c8f-spnsr" Mar 09 09:27:18 crc kubenswrapper[4792]: I0309 09:27:18.331240 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ad68345-e440-498d-a525-014a7db81ea6-combined-ca-bundle\") pod \"neutron-6cbbcf5c8f-spnsr\" (UID: \"4ad68345-e440-498d-a525-014a7db81ea6\") " pod="openstack/neutron-6cbbcf5c8f-spnsr" Mar 09 09:27:18 crc kubenswrapper[4792]: I0309 09:27:18.331259 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bnpwj\" (UniqueName: \"kubernetes.io/projected/4ad68345-e440-498d-a525-014a7db81ea6-kube-api-access-bnpwj\") pod \"neutron-6cbbcf5c8f-spnsr\" (UID: \"4ad68345-e440-498d-a525-014a7db81ea6\") " pod="openstack/neutron-6cbbcf5c8f-spnsr" Mar 09 09:27:18 crc kubenswrapper[4792]: I0309 09:27:18.432692 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4ad68345-e440-498d-a525-014a7db81ea6-ovndb-tls-certs\") pod \"neutron-6cbbcf5c8f-spnsr\" (UID: \"4ad68345-e440-498d-a525-014a7db81ea6\") " pod="openstack/neutron-6cbbcf5c8f-spnsr" Mar 09 09:27:18 crc kubenswrapper[4792]: I0309 09:27:18.432928 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4ad68345-e440-498d-a525-014a7db81ea6-public-tls-certs\") pod \"neutron-6cbbcf5c8f-spnsr\" (UID: \"4ad68345-e440-498d-a525-014a7db81ea6\") " pod="openstack/neutron-6cbbcf5c8f-spnsr" Mar 09 09:27:18 crc kubenswrapper[4792]: I0309 09:27:18.433076 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/4ad68345-e440-498d-a525-014a7db81ea6-httpd-config\") pod \"neutron-6cbbcf5c8f-spnsr\" (UID: \"4ad68345-e440-498d-a525-014a7db81ea6\") " pod="openstack/neutron-6cbbcf5c8f-spnsr" Mar 09 09:27:18 crc kubenswrapper[4792]: I0309 09:27:18.433159 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ad68345-e440-498d-a525-014a7db81ea6-combined-ca-bundle\") pod \"neutron-6cbbcf5c8f-spnsr\" (UID: \"4ad68345-e440-498d-a525-014a7db81ea6\") " pod="openstack/neutron-6cbbcf5c8f-spnsr" Mar 09 09:27:18 crc kubenswrapper[4792]: I0309 09:27:18.433639 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bnpwj\" (UniqueName: \"kubernetes.io/projected/4ad68345-e440-498d-a525-014a7db81ea6-kube-api-access-bnpwj\") pod \"neutron-6cbbcf5c8f-spnsr\" (UID: \"4ad68345-e440-498d-a525-014a7db81ea6\") " pod="openstack/neutron-6cbbcf5c8f-spnsr" Mar 09 09:27:18 crc kubenswrapper[4792]: I0309 09:27:18.433865 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/4ad68345-e440-498d-a525-014a7db81ea6-internal-tls-certs\") pod \"neutron-6cbbcf5c8f-spnsr\" (UID: \"4ad68345-e440-498d-a525-014a7db81ea6\") " pod="openstack/neutron-6cbbcf5c8f-spnsr" Mar 09 09:27:18 crc kubenswrapper[4792]: I0309 09:27:18.433978 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/4ad68345-e440-498d-a525-014a7db81ea6-config\") pod \"neutron-6cbbcf5c8f-spnsr\" (UID: \"4ad68345-e440-498d-a525-014a7db81ea6\") " pod="openstack/neutron-6cbbcf5c8f-spnsr" Mar 09 09:27:18 crc kubenswrapper[4792]: I0309 09:27:18.441934 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4ad68345-e440-498d-a525-014a7db81ea6-ovndb-tls-certs\") pod \"neutron-6cbbcf5c8f-spnsr\" (UID: \"4ad68345-e440-498d-a525-014a7db81ea6\") " pod="openstack/neutron-6cbbcf5c8f-spnsr" Mar 09 09:27:18 crc kubenswrapper[4792]: I0309 09:27:18.443159 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/4ad68345-e440-498d-a525-014a7db81ea6-config\") pod \"neutron-6cbbcf5c8f-spnsr\" (UID: \"4ad68345-e440-498d-a525-014a7db81ea6\") " pod="openstack/neutron-6cbbcf5c8f-spnsr" Mar 09 09:27:18 crc kubenswrapper[4792]: I0309 09:27:18.444355 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/4ad68345-e440-498d-a525-014a7db81ea6-httpd-config\") pod \"neutron-6cbbcf5c8f-spnsr\" (UID: \"4ad68345-e440-498d-a525-014a7db81ea6\") " pod="openstack/neutron-6cbbcf5c8f-spnsr" Mar 09 09:27:18 crc kubenswrapper[4792]: I0309 09:27:18.445116 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ad68345-e440-498d-a525-014a7db81ea6-combined-ca-bundle\") pod \"neutron-6cbbcf5c8f-spnsr\" (UID: \"4ad68345-e440-498d-a525-014a7db81ea6\") " 
pod="openstack/neutron-6cbbcf5c8f-spnsr" Mar 09 09:27:18 crc kubenswrapper[4792]: I0309 09:27:18.448559 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4ad68345-e440-498d-a525-014a7db81ea6-public-tls-certs\") pod \"neutron-6cbbcf5c8f-spnsr\" (UID: \"4ad68345-e440-498d-a525-014a7db81ea6\") " pod="openstack/neutron-6cbbcf5c8f-spnsr" Mar 09 09:27:18 crc kubenswrapper[4792]: I0309 09:27:18.452934 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4ad68345-e440-498d-a525-014a7db81ea6-internal-tls-certs\") pod \"neutron-6cbbcf5c8f-spnsr\" (UID: \"4ad68345-e440-498d-a525-014a7db81ea6\") " pod="openstack/neutron-6cbbcf5c8f-spnsr" Mar 09 09:27:18 crc kubenswrapper[4792]: I0309 09:27:18.457103 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bnpwj\" (UniqueName: \"kubernetes.io/projected/4ad68345-e440-498d-a525-014a7db81ea6-kube-api-access-bnpwj\") pod \"neutron-6cbbcf5c8f-spnsr\" (UID: \"4ad68345-e440-498d-a525-014a7db81ea6\") " pod="openstack/neutron-6cbbcf5c8f-spnsr" Mar 09 09:27:18 crc kubenswrapper[4792]: I0309 09:27:18.556113 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-6cbbcf5c8f-spnsr" Mar 09 09:27:18 crc kubenswrapper[4792]: I0309 09:27:18.969536 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6d6ff87dd6-6wzmx" event={"ID":"52f7c11a-3099-487b-9126-fd90d1db1aaa","Type":"ContainerStarted","Data":"6f305061036b2107b4cdca603022996a4e7b177e22d0e32c612ed7fd7c3a9499"} Mar 09 09:27:18 crc kubenswrapper[4792]: I0309 09:27:18.969582 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6d6ff87dd6-6wzmx" event={"ID":"52f7c11a-3099-487b-9126-fd90d1db1aaa","Type":"ContainerStarted","Data":"f6a92d4f0a618069585e46a42151d7df74788c69bc864a26e4f4cc87dcf85f9a"} Mar 09 09:27:18 crc kubenswrapper[4792]: I0309 09:27:18.969596 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6d6ff87dd6-6wzmx" event={"ID":"52f7c11a-3099-487b-9126-fd90d1db1aaa","Type":"ContainerStarted","Data":"7125e387dd91f276ee2bd2dd1453301f01ab09b6640e497dfd028718aba731e7"} Mar 09 09:27:18 crc kubenswrapper[4792]: I0309 09:27:18.970857 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-6d6ff87dd6-6wzmx" Mar 09 09:27:18 crc kubenswrapper[4792]: I0309 09:27:18.970881 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-6d6ff87dd6-6wzmx" Mar 09 09:27:18 crc kubenswrapper[4792]: I0309 09:27:18.985794 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5c87787d8d-6grm6" event={"ID":"892ac2f9-60af-4080-98e4-c4100f33dbc1","Type":"ContainerStarted","Data":"5a60069321f9fa357fc9d21c6049df3e4802d3e33056fcac025afbc3369afb2c"} Mar 09 09:27:18 crc kubenswrapper[4792]: I0309 09:27:18.986584 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-5c87787d8d-6grm6" Mar 09 09:27:19 crc kubenswrapper[4792]: I0309 09:27:19.006904 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/placement-6d6ff87dd6-6wzmx" podStartSLOduration=3.006883934 podStartE2EDuration="3.006883934s" podCreationTimestamp="2026-03-09 09:27:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:27:18.985971536 +0000 UTC m=+1204.016172288" watchObservedRunningTime="2026-03-09 09:27:19.006883934 +0000 UTC m=+1204.037084686" Mar 09 09:27:19 crc kubenswrapper[4792]: I0309 09:27:19.024221 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-5c87787d8d-6grm6" podStartSLOduration=4.024199136 podStartE2EDuration="4.024199136s" podCreationTimestamp="2026-03-09 09:27:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:27:19.019295614 +0000 UTC m=+1204.049496376" watchObservedRunningTime="2026-03-09 09:27:19.024199136 +0000 UTC m=+1204.054399888" Mar 09 09:27:19 crc kubenswrapper[4792]: I0309 09:27:19.193453 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6cbbcf5c8f-spnsr"] Mar 09 09:27:20 crc kubenswrapper[4792]: I0309 09:27:20.005665 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6cbbcf5c8f-spnsr" event={"ID":"4ad68345-e440-498d-a525-014a7db81ea6","Type":"ContainerStarted","Data":"651cc612ea727dbb5f73abaa44609749fe77448f4aae62bc6301fe4835bcba8a"} Mar 09 09:27:20 crc kubenswrapper[4792]: I0309 09:27:20.005975 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6cbbcf5c8f-spnsr" event={"ID":"4ad68345-e440-498d-a525-014a7db81ea6","Type":"ContainerStarted","Data":"d0bb7d6e9a2496de61b2d01a8f60e39957157e7d0d2b54f6bc6b7af7271b6e11"} Mar 09 09:27:20 crc kubenswrapper[4792]: I0309 09:27:20.005987 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6cbbcf5c8f-spnsr" 
event={"ID":"4ad68345-e440-498d-a525-014a7db81ea6","Type":"ContainerStarted","Data":"b5068f4558ddcce8ee2d5ca5fa9cac86a041f6d0e43366762d9ecf6bf311532b"}
Mar 09 09:27:20 crc kubenswrapper[4792]: I0309 09:27:20.006583 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-5c87787d8d-6grm6" podUID="892ac2f9-60af-4080-98e4-c4100f33dbc1" containerName="neutron-httpd" containerID="cri-o://5a60069321f9fa357fc9d21c6049df3e4802d3e33056fcac025afbc3369afb2c" gracePeriod=30
Mar 09 09:27:20 crc kubenswrapper[4792]: I0309 09:27:20.006583 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-5c87787d8d-6grm6" podUID="892ac2f9-60af-4080-98e4-c4100f33dbc1" containerName="neutron-api" containerID="cri-o://fab24a861a59e0830f0752b4396c28e0c1512ff77fcc9d01804f79d24424b223" gracePeriod=30
Mar 09 09:27:20 crc kubenswrapper[4792]: I0309 09:27:20.039210 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-6cbbcf5c8f-spnsr" podStartSLOduration=2.039186873 podStartE2EDuration="2.039186873s" podCreationTimestamp="2026-03-09 09:27:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:27:20.026036202 +0000 UTC m=+1205.056236954" watchObservedRunningTime="2026-03-09 09:27:20.039186873 +0000 UTC m=+1205.069387635"
Mar 09 09:27:21 crc kubenswrapper[4792]: I0309 09:27:21.020014 4792 generic.go:334] "Generic (PLEG): container finished" podID="892ac2f9-60af-4080-98e4-c4100f33dbc1" containerID="5a60069321f9fa357fc9d21c6049df3e4802d3e33056fcac025afbc3369afb2c" exitCode=0
Mar 09 09:27:21 crc kubenswrapper[4792]: I0309 09:27:21.021095 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5c87787d8d-6grm6" event={"ID":"892ac2f9-60af-4080-98e4-c4100f33dbc1","Type":"ContainerDied","Data":"5a60069321f9fa357fc9d21c6049df3e4802d3e33056fcac025afbc3369afb2c"}
Mar 09 09:27:21 crc kubenswrapper[4792]: I0309 09:27:21.021134 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-6cbbcf5c8f-spnsr"
Mar 09 09:27:25 crc kubenswrapper[4792]: I0309 09:27:25.382336 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-76fc8568c7-xztqr"
Mar 09 09:27:25 crc kubenswrapper[4792]: I0309 09:27:25.459386 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-589bbb667-cpwzn"]
Mar 09 09:27:25 crc kubenswrapper[4792]: I0309 09:27:25.459618 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-589bbb667-cpwzn" podUID="27b008e2-cac2-436f-9a6e-9e6415743f97" containerName="dnsmasq-dns" containerID="cri-o://f066ceb81069fd0fbc810a5d1352dd8740d5909b2c8b4ba1fc8cc1f9104f593a" gracePeriod=10
Mar 09 09:27:26 crc kubenswrapper[4792]: I0309 09:27:26.017368 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-589bbb667-cpwzn"
Mar 09 09:27:26 crc kubenswrapper[4792]: I0309 09:27:26.095122 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/27b008e2-cac2-436f-9a6e-9e6415743f97-dns-svc\") pod \"27b008e2-cac2-436f-9a6e-9e6415743f97\" (UID: \"27b008e2-cac2-436f-9a6e-9e6415743f97\") "
Mar 09 09:27:26 crc kubenswrapper[4792]: I0309 09:27:26.095174 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/27b008e2-cac2-436f-9a6e-9e6415743f97-config\") pod \"27b008e2-cac2-436f-9a6e-9e6415743f97\" (UID: \"27b008e2-cac2-436f-9a6e-9e6415743f97\") "
Mar 09 09:27:26 crc kubenswrapper[4792]: I0309 09:27:26.095201 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/27b008e2-cac2-436f-9a6e-9e6415743f97-ovsdbserver-sb\") pod \"27b008e2-cac2-436f-9a6e-9e6415743f97\" (UID: \"27b008e2-cac2-436f-9a6e-9e6415743f97\") "
Mar 09 09:27:26 crc kubenswrapper[4792]: I0309 09:27:26.095307 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n542r\" (UniqueName: \"kubernetes.io/projected/27b008e2-cac2-436f-9a6e-9e6415743f97-kube-api-access-n542r\") pod \"27b008e2-cac2-436f-9a6e-9e6415743f97\" (UID: \"27b008e2-cac2-436f-9a6e-9e6415743f97\") "
Mar 09 09:27:26 crc kubenswrapper[4792]: I0309 09:27:26.095402 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/27b008e2-cac2-436f-9a6e-9e6415743f97-ovsdbserver-nb\") pod \"27b008e2-cac2-436f-9a6e-9e6415743f97\" (UID: \"27b008e2-cac2-436f-9a6e-9e6415743f97\") "
Mar 09 09:27:26 crc kubenswrapper[4792]: I0309 09:27:26.104933 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/27b008e2-cac2-436f-9a6e-9e6415743f97-kube-api-access-n542r" (OuterVolumeSpecName: "kube-api-access-n542r") pod "27b008e2-cac2-436f-9a6e-9e6415743f97" (UID: "27b008e2-cac2-436f-9a6e-9e6415743f97"). InnerVolumeSpecName "kube-api-access-n542r". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:27:26 crc kubenswrapper[4792]: I0309 09:27:26.112469 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-xxwfp" event={"ID":"d0e4182c-e32b-443d-85ea-0e5737c3fd1e","Type":"ContainerStarted","Data":"b6991cc23ca69c37a721663b4fc8f9dbe832d06c036627daa7ea233a1170dd0c"}
Mar 09 09:27:26 crc kubenswrapper[4792]: I0309 09:27:26.117751 4792 generic.go:334] "Generic (PLEG): container finished" podID="27b008e2-cac2-436f-9a6e-9e6415743f97" containerID="f066ceb81069fd0fbc810a5d1352dd8740d5909b2c8b4ba1fc8cc1f9104f593a" exitCode=0
Mar 09 09:27:26 crc kubenswrapper[4792]: I0309 09:27:26.117806 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-589bbb667-cpwzn" event={"ID":"27b008e2-cac2-436f-9a6e-9e6415743f97","Type":"ContainerDied","Data":"f066ceb81069fd0fbc810a5d1352dd8740d5909b2c8b4ba1fc8cc1f9104f593a"}
Mar 09 09:27:26 crc kubenswrapper[4792]: I0309 09:27:26.117828 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-589bbb667-cpwzn" event={"ID":"27b008e2-cac2-436f-9a6e-9e6415743f97","Type":"ContainerDied","Data":"10db91051169a792b3ade40e82bdd70fcec59a477c2266c5beae12a084c1fcef"}
Mar 09 09:27:26 crc kubenswrapper[4792]: I0309 09:27:26.117845 4792 scope.go:117] "RemoveContainer" containerID="f066ceb81069fd0fbc810a5d1352dd8740d5909b2c8b4ba1fc8cc1f9104f593a"
Mar 09 09:27:26 crc kubenswrapper[4792]: I0309 09:27:26.117942 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-589bbb667-cpwzn"
Mar 09 09:27:26 crc kubenswrapper[4792]: I0309 09:27:26.139032 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9cee33f2-4999-4736-965e-3f5eae090e14" containerName="ceilometer-central-agent" containerID="cri-o://6ed260b81fa4eecf1111f1318337d8dac3f19ef0e3602bfa7d40018fcc911f6e" gracePeriod=30
Mar 09 09:27:26 crc kubenswrapper[4792]: I0309 09:27:26.139201 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9cee33f2-4999-4736-965e-3f5eae090e14" containerName="ceilometer-notification-agent" containerID="cri-o://45dc51a7fcbaafcdef79c0fb213dfea359f9ba0d8f0413168282ad54c40a8d28" gracePeriod=30
Mar 09 09:27:26 crc kubenswrapper[4792]: I0309 09:27:26.139208 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9cee33f2-4999-4736-965e-3f5eae090e14" containerName="sg-core" containerID="cri-o://14f2ceffbc3dff050abc449151ec442615efd4ae5b2dfe03dd08b701d6fd619d" gracePeriod=30
Mar 09 09:27:26 crc kubenswrapper[4792]: I0309 09:27:26.139337 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9cee33f2-4999-4736-965e-3f5eae090e14","Type":"ContainerStarted","Data":"c96a19ff2f59602ea56ae7911d4e876641fc0a43b0a5c70f885180482fe6f265"}
Mar 09 09:27:26 crc kubenswrapper[4792]: I0309 09:27:26.139401 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Mar 09 09:27:26 crc kubenswrapper[4792]: I0309 09:27:26.139420 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9cee33f2-4999-4736-965e-3f5eae090e14" containerName="proxy-httpd" containerID="cri-o://c96a19ff2f59602ea56ae7911d4e876641fc0a43b0a5c70f885180482fe6f265" gracePeriod=30
Mar 09 09:27:26 crc kubenswrapper[4792]: I0309 09:27:26.174290 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/27b008e2-cac2-436f-9a6e-9e6415743f97-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "27b008e2-cac2-436f-9a6e-9e6415743f97" (UID: "27b008e2-cac2-436f-9a6e-9e6415743f97"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:27:26 crc kubenswrapper[4792]: I0309 09:27:26.177697 4792 scope.go:117] "RemoveContainer" containerID="226c2c0d091990acf5a05d9f32bbe23603796d24fcbbc14e29dc58fbea2d5407"
Mar 09 09:27:26 crc kubenswrapper[4792]: I0309 09:27:26.181288 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/27b008e2-cac2-436f-9a6e-9e6415743f97-config" (OuterVolumeSpecName: "config") pod "27b008e2-cac2-436f-9a6e-9e6415743f97" (UID: "27b008e2-cac2-436f-9a6e-9e6415743f97"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:27:26 crc kubenswrapper[4792]: I0309 09:27:26.181670 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-xxwfp" podStartSLOduration=2.822665194 podStartE2EDuration="45.181627591s" podCreationTimestamp="2026-03-09 09:26:41 +0000 UTC" firstStartedPulling="2026-03-09 09:26:43.025548522 +0000 UTC m=+1168.055749274" lastFinishedPulling="2026-03-09 09:27:25.384510919 +0000 UTC m=+1210.414711671" observedRunningTime="2026-03-09 09:27:26.136322886 +0000 UTC m=+1211.166523638" watchObservedRunningTime="2026-03-09 09:27:26.181627591 +0000 UTC m=+1211.211828343"
Mar 09 09:27:26 crc kubenswrapper[4792]: I0309 09:27:26.183000 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.107258445 podStartE2EDuration="46.182992421s" podCreationTimestamp="2026-03-09 09:26:40 +0000 UTC" firstStartedPulling="2026-03-09 09:26:42.412929807 +0000 UTC m=+1167.443130559" lastFinishedPulling="2026-03-09 09:27:25.488663783 +0000 UTC m=+1210.518864535" observedRunningTime="2026-03-09 09:27:26.16745548 +0000 UTC m=+1211.197656232" watchObservedRunningTime="2026-03-09 09:27:26.182992421 +0000 UTC m=+1211.213193173"
Mar 09 09:27:26 crc kubenswrapper[4792]: I0309 09:27:26.187916 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/27b008e2-cac2-436f-9a6e-9e6415743f97-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "27b008e2-cac2-436f-9a6e-9e6415743f97" (UID: "27b008e2-cac2-436f-9a6e-9e6415743f97"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:27:26 crc kubenswrapper[4792]: I0309 09:27:26.199948 4792 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/27b008e2-cac2-436f-9a6e-9e6415743f97-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 09 09:27:26 crc kubenswrapper[4792]: I0309 09:27:26.200019 4792 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/27b008e2-cac2-436f-9a6e-9e6415743f97-config\") on node \"crc\" DevicePath \"\""
Mar 09 09:27:26 crc kubenswrapper[4792]: I0309 09:27:26.200032 4792 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/27b008e2-cac2-436f-9a6e-9e6415743f97-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Mar 09 09:27:26 crc kubenswrapper[4792]: I0309 09:27:26.200045 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n542r\" (UniqueName: \"kubernetes.io/projected/27b008e2-cac2-436f-9a6e-9e6415743f97-kube-api-access-n542r\") on node \"crc\" DevicePath \"\""
Mar 09 09:27:26 crc kubenswrapper[4792]: I0309 09:27:26.201318 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/27b008e2-cac2-436f-9a6e-9e6415743f97-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "27b008e2-cac2-436f-9a6e-9e6415743f97" (UID: "27b008e2-cac2-436f-9a6e-9e6415743f97"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:27:26 crc kubenswrapper[4792]: I0309 09:27:26.223222 4792 scope.go:117] "RemoveContainer" containerID="f066ceb81069fd0fbc810a5d1352dd8740d5909b2c8b4ba1fc8cc1f9104f593a"
Mar 09 09:27:26 crc kubenswrapper[4792]: E0309 09:27:26.224043 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f066ceb81069fd0fbc810a5d1352dd8740d5909b2c8b4ba1fc8cc1f9104f593a\": container with ID starting with f066ceb81069fd0fbc810a5d1352dd8740d5909b2c8b4ba1fc8cc1f9104f593a not found: ID does not exist" containerID="f066ceb81069fd0fbc810a5d1352dd8740d5909b2c8b4ba1fc8cc1f9104f593a"
Mar 09 09:27:26 crc kubenswrapper[4792]: I0309 09:27:26.224394 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f066ceb81069fd0fbc810a5d1352dd8740d5909b2c8b4ba1fc8cc1f9104f593a"} err="failed to get container status \"f066ceb81069fd0fbc810a5d1352dd8740d5909b2c8b4ba1fc8cc1f9104f593a\": rpc error: code = NotFound desc = could not find container \"f066ceb81069fd0fbc810a5d1352dd8740d5909b2c8b4ba1fc8cc1f9104f593a\": container with ID starting with f066ceb81069fd0fbc810a5d1352dd8740d5909b2c8b4ba1fc8cc1f9104f593a not found: ID does not exist"
Mar 09 09:27:26 crc kubenswrapper[4792]: I0309 09:27:26.224464 4792 scope.go:117] "RemoveContainer" containerID="226c2c0d091990acf5a05d9f32bbe23603796d24fcbbc14e29dc58fbea2d5407"
Mar 09 09:27:26 crc kubenswrapper[4792]: E0309 09:27:26.228492 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"226c2c0d091990acf5a05d9f32bbe23603796d24fcbbc14e29dc58fbea2d5407\": container with ID starting with 226c2c0d091990acf5a05d9f32bbe23603796d24fcbbc14e29dc58fbea2d5407 not found: ID does not exist" containerID="226c2c0d091990acf5a05d9f32bbe23603796d24fcbbc14e29dc58fbea2d5407"
Mar 09 09:27:26 crc kubenswrapper[4792]: I0309 09:27:26.228545 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"226c2c0d091990acf5a05d9f32bbe23603796d24fcbbc14e29dc58fbea2d5407"} err="failed to get container status \"226c2c0d091990acf5a05d9f32bbe23603796d24fcbbc14e29dc58fbea2d5407\": rpc error: code = NotFound desc = could not find container \"226c2c0d091990acf5a05d9f32bbe23603796d24fcbbc14e29dc58fbea2d5407\": container with ID starting with 226c2c0d091990acf5a05d9f32bbe23603796d24fcbbc14e29dc58fbea2d5407 not found: ID does not exist"
Mar 09 09:27:26 crc kubenswrapper[4792]: I0309 09:27:26.301722 4792 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/27b008e2-cac2-436f-9a6e-9e6415743f97-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Mar 09 09:27:26 crc kubenswrapper[4792]: I0309 09:27:26.501333 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-589bbb667-cpwzn"]
Mar 09 09:27:26 crc kubenswrapper[4792]: I0309 09:27:26.511382 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-589bbb667-cpwzn"]
Mar 09 09:27:27 crc kubenswrapper[4792]: I0309 09:27:27.151383 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-gkkjx" event={"ID":"6f4bcabb-7f34-423b-a653-bd785eba0978","Type":"ContainerStarted","Data":"a1179b82779e6892111d8b44fd02a95030fabdd17e9af21a4abe98e7fa23b850"}
Mar 09 09:27:27 crc kubenswrapper[4792]: I0309 09:27:27.158265 4792 generic.go:334] "Generic (PLEG): container finished" podID="9cee33f2-4999-4736-965e-3f5eae090e14" containerID="c96a19ff2f59602ea56ae7911d4e876641fc0a43b0a5c70f885180482fe6f265" exitCode=0
Mar 09 09:27:27 crc kubenswrapper[4792]: I0309 09:27:27.158301 4792 generic.go:334] "Generic (PLEG): container finished" podID="9cee33f2-4999-4736-965e-3f5eae090e14" containerID="14f2ceffbc3dff050abc449151ec442615efd4ae5b2dfe03dd08b701d6fd619d" exitCode=2
Mar 09 09:27:27 crc kubenswrapper[4792]: I0309 09:27:27.158312 4792 generic.go:334] "Generic (PLEG): container finished" podID="9cee33f2-4999-4736-965e-3f5eae090e14" containerID="45dc51a7fcbaafcdef79c0fb213dfea359f9ba0d8f0413168282ad54c40a8d28" exitCode=0
Mar 09 09:27:27 crc kubenswrapper[4792]: I0309 09:27:27.158319 4792 generic.go:334] "Generic (PLEG): container finished" podID="9cee33f2-4999-4736-965e-3f5eae090e14" containerID="6ed260b81fa4eecf1111f1318337d8dac3f19ef0e3602bfa7d40018fcc911f6e" exitCode=0
Mar 09 09:27:27 crc kubenswrapper[4792]: I0309 09:27:27.158340 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9cee33f2-4999-4736-965e-3f5eae090e14","Type":"ContainerDied","Data":"c96a19ff2f59602ea56ae7911d4e876641fc0a43b0a5c70f885180482fe6f265"}
Mar 09 09:27:27 crc kubenswrapper[4792]: I0309 09:27:27.158367 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9cee33f2-4999-4736-965e-3f5eae090e14","Type":"ContainerDied","Data":"14f2ceffbc3dff050abc449151ec442615efd4ae5b2dfe03dd08b701d6fd619d"}
Mar 09 09:27:27 crc kubenswrapper[4792]: I0309 09:27:27.158381 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9cee33f2-4999-4736-965e-3f5eae090e14","Type":"ContainerDied","Data":"45dc51a7fcbaafcdef79c0fb213dfea359f9ba0d8f0413168282ad54c40a8d28"}
Mar 09 09:27:27 crc kubenswrapper[4792]: I0309 09:27:27.158394 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9cee33f2-4999-4736-965e-3f5eae090e14","Type":"ContainerDied","Data":"6ed260b81fa4eecf1111f1318337d8dac3f19ef0e3602bfa7d40018fcc911f6e"}
Mar 09 09:27:27 crc kubenswrapper[4792]: I0309 09:27:27.158405 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9cee33f2-4999-4736-965e-3f5eae090e14","Type":"ContainerDied","Data":"a7d0b6214e676557bf8cda17343c9dc5066cea995ab1e7a782015bb17a3d782d"}
Mar 09 09:27:27 crc kubenswrapper[4792]: I0309 09:27:27.158418 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a7d0b6214e676557bf8cda17343c9dc5066cea995ab1e7a782015bb17a3d782d"
Mar 09 09:27:27 crc kubenswrapper[4792]: I0309 09:27:27.178437 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-gkkjx" podStartSLOduration=4.5927422700000005 podStartE2EDuration="47.178418869s" podCreationTimestamp="2026-03-09 09:26:40 +0000 UTC" firstStartedPulling="2026-03-09 09:26:42.804364361 +0000 UTC m=+1167.834565113" lastFinishedPulling="2026-03-09 09:27:25.39004097 +0000 UTC m=+1210.420241712" observedRunningTime="2026-03-09 09:27:27.173032933 +0000 UTC m=+1212.203233715" watchObservedRunningTime="2026-03-09 09:27:27.178418869 +0000 UTC m=+1212.208619621"
Mar 09 09:27:27 crc kubenswrapper[4792]: I0309 09:27:27.212597 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 09 09:27:27 crc kubenswrapper[4792]: I0309 09:27:27.317549 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9cee33f2-4999-4736-965e-3f5eae090e14-log-httpd\") pod \"9cee33f2-4999-4736-965e-3f5eae090e14\" (UID: \"9cee33f2-4999-4736-965e-3f5eae090e14\") "
Mar 09 09:27:27 crc kubenswrapper[4792]: I0309 09:27:27.317669 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9cee33f2-4999-4736-965e-3f5eae090e14-combined-ca-bundle\") pod \"9cee33f2-4999-4736-965e-3f5eae090e14\" (UID: \"9cee33f2-4999-4736-965e-3f5eae090e14\") "
Mar 09 09:27:27 crc kubenswrapper[4792]: I0309 09:27:27.317703 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9cee33f2-4999-4736-965e-3f5eae090e14-config-data\") pod \"9cee33f2-4999-4736-965e-3f5eae090e14\" (UID: \"9cee33f2-4999-4736-965e-3f5eae090e14\") "
Mar 09 09:27:27 crc kubenswrapper[4792]: I0309 09:27:27.317731 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9cee33f2-4999-4736-965e-3f5eae090e14-sg-core-conf-yaml\") pod \"9cee33f2-4999-4736-965e-3f5eae090e14\" (UID: \"9cee33f2-4999-4736-965e-3f5eae090e14\") "
Mar 09 09:27:27 crc kubenswrapper[4792]: I0309 09:27:27.317766 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-99zb2\" (UniqueName: \"kubernetes.io/projected/9cee33f2-4999-4736-965e-3f5eae090e14-kube-api-access-99zb2\") pod \"9cee33f2-4999-4736-965e-3f5eae090e14\" (UID: \"9cee33f2-4999-4736-965e-3f5eae090e14\") "
Mar 09 09:27:27 crc kubenswrapper[4792]: I0309 09:27:27.317877 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9cee33f2-4999-4736-965e-3f5eae090e14-scripts\") pod \"9cee33f2-4999-4736-965e-3f5eae090e14\" (UID: \"9cee33f2-4999-4736-965e-3f5eae090e14\") "
Mar 09 09:27:27 crc kubenswrapper[4792]: I0309 09:27:27.317920 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9cee33f2-4999-4736-965e-3f5eae090e14-run-httpd\") pod \"9cee33f2-4999-4736-965e-3f5eae090e14\" (UID: \"9cee33f2-4999-4736-965e-3f5eae090e14\") "
Mar 09 09:27:27 crc kubenswrapper[4792]: I0309 09:27:27.318521 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9cee33f2-4999-4736-965e-3f5eae090e14-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "9cee33f2-4999-4736-965e-3f5eae090e14" (UID: "9cee33f2-4999-4736-965e-3f5eae090e14"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 09:27:27 crc kubenswrapper[4792]: I0309 09:27:27.319713 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9cee33f2-4999-4736-965e-3f5eae090e14-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "9cee33f2-4999-4736-965e-3f5eae090e14" (UID: "9cee33f2-4999-4736-965e-3f5eae090e14"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 09:27:27 crc kubenswrapper[4792]: I0309 09:27:27.324384 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9cee33f2-4999-4736-965e-3f5eae090e14-kube-api-access-99zb2" (OuterVolumeSpecName: "kube-api-access-99zb2") pod "9cee33f2-4999-4736-965e-3f5eae090e14" (UID: "9cee33f2-4999-4736-965e-3f5eae090e14"). InnerVolumeSpecName "kube-api-access-99zb2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:27:27 crc kubenswrapper[4792]: I0309 09:27:27.328602 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9cee33f2-4999-4736-965e-3f5eae090e14-scripts" (OuterVolumeSpecName: "scripts") pod "9cee33f2-4999-4736-965e-3f5eae090e14" (UID: "9cee33f2-4999-4736-965e-3f5eae090e14"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:27:27 crc kubenswrapper[4792]: I0309 09:27:27.353861 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9cee33f2-4999-4736-965e-3f5eae090e14-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "9cee33f2-4999-4736-965e-3f5eae090e14" (UID: "9cee33f2-4999-4736-965e-3f5eae090e14"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:27:27 crc kubenswrapper[4792]: I0309 09:27:27.418050 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9cee33f2-4999-4736-965e-3f5eae090e14-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9cee33f2-4999-4736-965e-3f5eae090e14" (UID: "9cee33f2-4999-4736-965e-3f5eae090e14"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:27:27 crc kubenswrapper[4792]: I0309 09:27:27.420514 4792 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9cee33f2-4999-4736-965e-3f5eae090e14-scripts\") on node \"crc\" DevicePath \"\""
Mar 09 09:27:27 crc kubenswrapper[4792]: I0309 09:27:27.420547 4792 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9cee33f2-4999-4736-965e-3f5eae090e14-run-httpd\") on node \"crc\" DevicePath \"\""
Mar 09 09:27:27 crc kubenswrapper[4792]: I0309 09:27:27.420559 4792 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9cee33f2-4999-4736-965e-3f5eae090e14-log-httpd\") on node \"crc\" DevicePath \"\""
Mar 09 09:27:27 crc kubenswrapper[4792]: I0309 09:27:27.420569 4792 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9cee33f2-4999-4736-965e-3f5eae090e14-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 09 09:27:27 crc kubenswrapper[4792]: I0309 09:27:27.420579 4792 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9cee33f2-4999-4736-965e-3f5eae090e14-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Mar 09 09:27:27 crc kubenswrapper[4792]: I0309 09:27:27.420587 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-99zb2\" (UniqueName: \"kubernetes.io/projected/9cee33f2-4999-4736-965e-3f5eae090e14-kube-api-access-99zb2\") on node \"crc\" DevicePath \"\""
Mar 09 09:27:27 crc kubenswrapper[4792]: I0309 09:27:27.443061 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9cee33f2-4999-4736-965e-3f5eae090e14-config-data" (OuterVolumeSpecName: "config-data") pod "9cee33f2-4999-4736-965e-3f5eae090e14" (UID: "9cee33f2-4999-4736-965e-3f5eae090e14"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:27:27 crc kubenswrapper[4792]: I0309 09:27:27.522648 4792 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9cee33f2-4999-4736-965e-3f5eae090e14-config-data\") on node \"crc\" DevicePath \"\""
Mar 09 09:27:27 crc kubenswrapper[4792]: I0309 09:27:27.676995 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="27b008e2-cac2-436f-9a6e-9e6415743f97" path="/var/lib/kubelet/pods/27b008e2-cac2-436f-9a6e-9e6415743f97/volumes"
Mar 09 09:27:28 crc kubenswrapper[4792]: I0309 09:27:28.166112 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 09 09:27:28 crc kubenswrapper[4792]: I0309 09:27:28.195175 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Mar 09 09:27:28 crc kubenswrapper[4792]: I0309 09:27:28.207864 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Mar 09 09:27:28 crc kubenswrapper[4792]: I0309 09:27:28.219238 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Mar 09 09:27:28 crc kubenswrapper[4792]: E0309 09:27:28.219551 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9cee33f2-4999-4736-965e-3f5eae090e14" containerName="proxy-httpd"
Mar 09 09:27:28 crc kubenswrapper[4792]: I0309 09:27:28.219569 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="9cee33f2-4999-4736-965e-3f5eae090e14" containerName="proxy-httpd"
Mar 09 09:27:28 crc kubenswrapper[4792]: E0309 09:27:28.219581 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9cee33f2-4999-4736-965e-3f5eae090e14" containerName="sg-core"
Mar 09 09:27:28 crc kubenswrapper[4792]: I0309 09:27:28.219587 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="9cee33f2-4999-4736-965e-3f5eae090e14" containerName="sg-core"
Mar 09 09:27:28 crc kubenswrapper[4792]: E0309 09:27:28.219598 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27b008e2-cac2-436f-9a6e-9e6415743f97" containerName="dnsmasq-dns"
Mar 09 09:27:28 crc kubenswrapper[4792]: I0309 09:27:28.219603 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="27b008e2-cac2-436f-9a6e-9e6415743f97" containerName="dnsmasq-dns"
Mar 09 09:27:28 crc kubenswrapper[4792]: E0309 09:27:28.219622 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27b008e2-cac2-436f-9a6e-9e6415743f97" containerName="init"
Mar 09 09:27:28 crc kubenswrapper[4792]: I0309 09:27:28.219628 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="27b008e2-cac2-436f-9a6e-9e6415743f97" containerName="init"
Mar 09 09:27:28 crc kubenswrapper[4792]: E0309 09:27:28.219635 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9cee33f2-4999-4736-965e-3f5eae090e14" containerName="ceilometer-central-agent"
Mar 09 09:27:28 crc kubenswrapper[4792]: I0309 09:27:28.219641 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="9cee33f2-4999-4736-965e-3f5eae090e14" containerName="ceilometer-central-agent"
Mar 09 09:27:28 crc kubenswrapper[4792]: E0309 09:27:28.219658 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9cee33f2-4999-4736-965e-3f5eae090e14" containerName="ceilometer-notification-agent"
Mar 09 09:27:28 crc kubenswrapper[4792]: I0309 09:27:28.219664 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="9cee33f2-4999-4736-965e-3f5eae090e14" containerName="ceilometer-notification-agent"
Mar 09 09:27:28 crc kubenswrapper[4792]: I0309 09:27:28.219813 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="9cee33f2-4999-4736-965e-3f5eae090e14" containerName="sg-core"
Mar 09 09:27:28 crc kubenswrapper[4792]: I0309 09:27:28.219821 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="9cee33f2-4999-4736-965e-3f5eae090e14" containerName="proxy-httpd"
Mar 09 09:27:28 crc kubenswrapper[4792]: I0309 09:27:28.219832 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="27b008e2-cac2-436f-9a6e-9e6415743f97" containerName="dnsmasq-dns"
Mar 09 09:27:28 crc kubenswrapper[4792]: I0309 09:27:28.219856 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="9cee33f2-4999-4736-965e-3f5eae090e14" containerName="ceilometer-notification-agent"
Mar 09 09:27:28 crc kubenswrapper[4792]: I0309 09:27:28.219864 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="9cee33f2-4999-4736-965e-3f5eae090e14" containerName="ceilometer-central-agent"
Mar 09 09:27:28 crc kubenswrapper[4792]: I0309 09:27:28.221234 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 09 09:27:28 crc kubenswrapper[4792]: I0309 09:27:28.224930 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Mar 09 09:27:28 crc kubenswrapper[4792]: I0309 09:27:28.225219 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Mar 09 09:27:28 crc kubenswrapper[4792]: I0309 09:27:28.252471 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 09 09:27:28 crc kubenswrapper[4792]: I0309 09:27:28.335404 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b07b74f-754c-43a7-97fb-a0fccb9c5df4-config-data\") pod \"ceilometer-0\" (UID: \"6b07b74f-754c-43a7-97fb-a0fccb9c5df4\") " pod="openstack/ceilometer-0"
Mar 09 09:27:28 crc kubenswrapper[4792]: I0309 09:27:28.335504 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f9bzn\" (UniqueName: \"kubernetes.io/projected/6b07b74f-754c-43a7-97fb-a0fccb9c5df4-kube-api-access-f9bzn\") pod \"ceilometer-0\" (UID: \"6b07b74f-754c-43a7-97fb-a0fccb9c5df4\") " pod="openstack/ceilometer-0"
Mar 09 09:27:28 crc kubenswrapper[4792]: I0309 09:27:28.335529 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6b07b74f-754c-43a7-97fb-a0fccb9c5df4-scripts\") pod \"ceilometer-0\" (UID: \"6b07b74f-754c-43a7-97fb-a0fccb9c5df4\") " pod="openstack/ceilometer-0"
Mar 09 09:27:28 crc kubenswrapper[4792]: I0309 09:27:28.335564 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b07b74f-754c-43a7-97fb-a0fccb9c5df4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6b07b74f-754c-43a7-97fb-a0fccb9c5df4\") " pod="openstack/ceilometer-0"
Mar 09 09:27:28 crc kubenswrapper[4792]: I0309 09:27:28.335591 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6b07b74f-754c-43a7-97fb-a0fccb9c5df4-run-httpd\") pod \"ceilometer-0\" (UID: \"6b07b74f-754c-43a7-97fb-a0fccb9c5df4\") " pod="openstack/ceilometer-0"
Mar 09 09:27:28 crc kubenswrapper[4792]: I0309 09:27:28.335614 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6b07b74f-754c-43a7-97fb-a0fccb9c5df4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6b07b74f-754c-43a7-97fb-a0fccb9c5df4\") " pod="openstack/ceilometer-0"
Mar 09 09:27:28 crc kubenswrapper[4792]: I0309 09:27:28.335651 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6b07b74f-754c-43a7-97fb-a0fccb9c5df4-log-httpd\") pod \"ceilometer-0\" (UID: \"6b07b74f-754c-43a7-97fb-a0fccb9c5df4\") " pod="openstack/ceilometer-0"
Mar 09 09:27:28 crc kubenswrapper[4792]: I0309 09:27:28.437973 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b07b74f-754c-43a7-97fb-a0fccb9c5df4-config-data\") pod \"ceilometer-0\" (UID: \"6b07b74f-754c-43a7-97fb-a0fccb9c5df4\") " pod="openstack/ceilometer-0"
Mar 09 09:27:28 crc kubenswrapper[4792]: I0309 09:27:28.438116 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f9bzn\" (UniqueName: \"kubernetes.io/projected/6b07b74f-754c-43a7-97fb-a0fccb9c5df4-kube-api-access-f9bzn\") pod \"ceilometer-0\" (UID: \"6b07b74f-754c-43a7-97fb-a0fccb9c5df4\") " pod="openstack/ceilometer-0"
Mar 09 09:27:28 crc kubenswrapper[4792]: I0309 09:27:28.438151 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6b07b74f-754c-43a7-97fb-a0fccb9c5df4-scripts\") pod \"ceilometer-0\" (UID: \"6b07b74f-754c-43a7-97fb-a0fccb9c5df4\") " pod="openstack/ceilometer-0"
Mar 09 09:27:28 crc kubenswrapper[4792]: I0309 09:27:28.438197 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b07b74f-754c-43a7-97fb-a0fccb9c5df4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6b07b74f-754c-43a7-97fb-a0fccb9c5df4\") " pod="openstack/ceilometer-0"
Mar 09 09:27:28 crc kubenswrapper[4792]: I0309 09:27:28.438235 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6b07b74f-754c-43a7-97fb-a0fccb9c5df4-run-httpd\") pod \"ceilometer-0\" (UID: \"6b07b74f-754c-43a7-97fb-a0fccb9c5df4\") " pod="openstack/ceilometer-0"
Mar 09 09:27:28 crc kubenswrapper[4792]: I0309 09:27:28.438280 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6b07b74f-754c-43a7-97fb-a0fccb9c5df4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6b07b74f-754c-43a7-97fb-a0fccb9c5df4\") " pod="openstack/ceilometer-0"
Mar 09 09:27:28 crc kubenswrapper[4792]: I0309 09:27:28.438334 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6b07b74f-754c-43a7-97fb-a0fccb9c5df4-log-httpd\") pod \"ceilometer-0\" (UID: \"6b07b74f-754c-43a7-97fb-a0fccb9c5df4\") " pod="openstack/ceilometer-0"
Mar 09 09:27:28 crc kubenswrapper[4792]: I0309 09:27:28.439693 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6b07b74f-754c-43a7-97fb-a0fccb9c5df4-log-httpd\") pod \"ceilometer-0\" (UID: \"6b07b74f-754c-43a7-97fb-a0fccb9c5df4\") " pod="openstack/ceilometer-0"
Mar 09 09:27:28 crc kubenswrapper[4792]: I0309 09:27:28.440614 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6b07b74f-754c-43a7-97fb-a0fccb9c5df4-run-httpd\") pod \"ceilometer-0\" (UID: \"6b07b74f-754c-43a7-97fb-a0fccb9c5df4\") " pod="openstack/ceilometer-0"
Mar 09 09:27:28 crc kubenswrapper[4792]: I0309 09:27:28.445860 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6b07b74f-754c-43a7-97fb-a0fccb9c5df4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6b07b74f-754c-43a7-97fb-a0fccb9c5df4\") " pod="openstack/ceilometer-0"
Mar 09 09:27:28 crc kubenswrapper[4792]: I0309 09:27:28.447841 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b07b74f-754c-43a7-97fb-a0fccb9c5df4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6b07b74f-754c-43a7-97fb-a0fccb9c5df4\") " pod="openstack/ceilometer-0"
Mar 09 09:27:28 crc kubenswrapper[4792]: I0309 09:27:28.457216 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6b07b74f-754c-43a7-97fb-a0fccb9c5df4-scripts\") pod \"ceilometer-0\" (UID: \"6b07b74f-754c-43a7-97fb-a0fccb9c5df4\") " pod="openstack/ceilometer-0"
Mar 09 09:27:28 crc kubenswrapper[4792]: I0309 09:27:28.464641 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f9bzn\" (UniqueName: \"kubernetes.io/projected/6b07b74f-754c-43a7-97fb-a0fccb9c5df4-kube-api-access-f9bzn\") pod \"ceilometer-0\" (UID: \"6b07b74f-754c-43a7-97fb-a0fccb9c5df4\") " pod="openstack/ceilometer-0"
Mar 09 09:27:28 crc kubenswrapper[4792]: I0309 09:27:28.465645 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b07b74f-754c-43a7-97fb-a0fccb9c5df4-config-data\") pod \"ceilometer-0\" (UID: \"6b07b74f-754c-43a7-97fb-a0fccb9c5df4\") " pod="openstack/ceilometer-0"
Mar 09 09:27:28 crc kubenswrapper[4792]: I0309 09:27:28.538987 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 09 09:27:29 crc kubenswrapper[4792]: I0309 09:27:29.238371 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 09 09:27:29 crc kubenswrapper[4792]: W0309 09:27:29.244880 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6b07b74f_754c_43a7_97fb_a0fccb9c5df4.slice/crio-a7b45eeca0866f76a1fd2e490d769df545bf1cba304aa94880eafd14b6aa64ef WatchSource:0}: Error finding container a7b45eeca0866f76a1fd2e490d769df545bf1cba304aa94880eafd14b6aa64ef: Status 404 returned error can't find the container with id a7b45eeca0866f76a1fd2e490d769df545bf1cba304aa94880eafd14b6aa64ef
Mar 09 09:27:29 crc kubenswrapper[4792]: I0309 09:27:29.675698 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9cee33f2-4999-4736-965e-3f5eae090e14" path="/var/lib/kubelet/pods/9cee33f2-4999-4736-965e-3f5eae090e14/volumes"
Mar 09
crc kubenswrapper[4792]: I0309 09:27:30.190057 4792 generic.go:334] "Generic (PLEG): container finished" podID="d0e4182c-e32b-443d-85ea-0e5737c3fd1e" containerID="b6991cc23ca69c37a721663b4fc8f9dbe832d06c036627daa7ea233a1170dd0c" exitCode=0 Mar 09 09:27:30 crc kubenswrapper[4792]: I0309 09:27:30.190443 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-xxwfp" event={"ID":"d0e4182c-e32b-443d-85ea-0e5737c3fd1e","Type":"ContainerDied","Data":"b6991cc23ca69c37a721663b4fc8f9dbe832d06c036627daa7ea233a1170dd0c"} Mar 09 09:27:30 crc kubenswrapper[4792]: I0309 09:27:30.192917 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6b07b74f-754c-43a7-97fb-a0fccb9c5df4","Type":"ContainerStarted","Data":"26de45986757fcb8064c19b2c93d3222296aad85238631a200c492e107a4ad75"} Mar 09 09:27:30 crc kubenswrapper[4792]: I0309 09:27:30.192966 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6b07b74f-754c-43a7-97fb-a0fccb9c5df4","Type":"ContainerStarted","Data":"a7b45eeca0866f76a1fd2e490d769df545bf1cba304aa94880eafd14b6aa64ef"} Mar 09 09:27:31 crc kubenswrapper[4792]: I0309 09:27:31.202115 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6b07b74f-754c-43a7-97fb-a0fccb9c5df4","Type":"ContainerStarted","Data":"a1defc8ff6c6464e89e8fd4d8f496ae6d26672089dc614aae2dbc712eec7e8b0"} Mar 09 09:27:31 crc kubenswrapper[4792]: I0309 09:27:31.538687 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-xxwfp" Mar 09 09:27:31 crc kubenswrapper[4792]: I0309 09:27:31.594207 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0e4182c-e32b-443d-85ea-0e5737c3fd1e-combined-ca-bundle\") pod \"d0e4182c-e32b-443d-85ea-0e5737c3fd1e\" (UID: \"d0e4182c-e32b-443d-85ea-0e5737c3fd1e\") " Mar 09 09:27:31 crc kubenswrapper[4792]: I0309 09:27:31.594311 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9rh64\" (UniqueName: \"kubernetes.io/projected/d0e4182c-e32b-443d-85ea-0e5737c3fd1e-kube-api-access-9rh64\") pod \"d0e4182c-e32b-443d-85ea-0e5737c3fd1e\" (UID: \"d0e4182c-e32b-443d-85ea-0e5737c3fd1e\") " Mar 09 09:27:31 crc kubenswrapper[4792]: I0309 09:27:31.594339 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d0e4182c-e32b-443d-85ea-0e5737c3fd1e-db-sync-config-data\") pod \"d0e4182c-e32b-443d-85ea-0e5737c3fd1e\" (UID: \"d0e4182c-e32b-443d-85ea-0e5737c3fd1e\") " Mar 09 09:27:31 crc kubenswrapper[4792]: I0309 09:27:31.602444 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d0e4182c-e32b-443d-85ea-0e5737c3fd1e-kube-api-access-9rh64" (OuterVolumeSpecName: "kube-api-access-9rh64") pod "d0e4182c-e32b-443d-85ea-0e5737c3fd1e" (UID: "d0e4182c-e32b-443d-85ea-0e5737c3fd1e"). InnerVolumeSpecName "kube-api-access-9rh64". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:27:31 crc kubenswrapper[4792]: I0309 09:27:31.608813 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0e4182c-e32b-443d-85ea-0e5737c3fd1e-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "d0e4182c-e32b-443d-85ea-0e5737c3fd1e" (UID: "d0e4182c-e32b-443d-85ea-0e5737c3fd1e"). 
InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:27:31 crc kubenswrapper[4792]: I0309 09:27:31.621682 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0e4182c-e32b-443d-85ea-0e5737c3fd1e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d0e4182c-e32b-443d-85ea-0e5737c3fd1e" (UID: "d0e4182c-e32b-443d-85ea-0e5737c3fd1e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:27:31 crc kubenswrapper[4792]: I0309 09:27:31.698083 4792 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0e4182c-e32b-443d-85ea-0e5737c3fd1e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 09:27:31 crc kubenswrapper[4792]: I0309 09:27:31.698116 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9rh64\" (UniqueName: \"kubernetes.io/projected/d0e4182c-e32b-443d-85ea-0e5737c3fd1e-kube-api-access-9rh64\") on node \"crc\" DevicePath \"\"" Mar 09 09:27:31 crc kubenswrapper[4792]: I0309 09:27:31.698126 4792 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d0e4182c-e32b-443d-85ea-0e5737c3fd1e-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 09:27:32 crc kubenswrapper[4792]: I0309 09:27:32.227850 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-xxwfp" event={"ID":"d0e4182c-e32b-443d-85ea-0e5737c3fd1e","Type":"ContainerDied","Data":"ea95dfc4a96c29327dd0df03fa1af617d2384dc8d6c39354c570a4138a4a19d0"} Mar 09 09:27:32 crc kubenswrapper[4792]: I0309 09:27:32.228178 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ea95dfc4a96c29327dd0df03fa1af617d2384dc8d6c39354c570a4138a4a19d0" Mar 09 09:27:32 crc kubenswrapper[4792]: I0309 09:27:32.228284 4792 util.go:48] "No ready sandbox for 
pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-xxwfp" Mar 09 09:27:32 crc kubenswrapper[4792]: I0309 09:27:32.240667 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6b07b74f-754c-43a7-97fb-a0fccb9c5df4","Type":"ContainerStarted","Data":"2de19365c617c18e8ca70fd82e0b913fa43ed4fa2263c3bfe6c92229347f6be9"} Mar 09 09:27:32 crc kubenswrapper[4792]: I0309 09:27:32.540210 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-6f45884c58-b4trg"] Mar 09 09:27:32 crc kubenswrapper[4792]: E0309 09:27:32.540688 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0e4182c-e32b-443d-85ea-0e5737c3fd1e" containerName="barbican-db-sync" Mar 09 09:27:32 crc kubenswrapper[4792]: I0309 09:27:32.540711 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0e4182c-e32b-443d-85ea-0e5737c3fd1e" containerName="barbican-db-sync" Mar 09 09:27:32 crc kubenswrapper[4792]: I0309 09:27:32.540909 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0e4182c-e32b-443d-85ea-0e5737c3fd1e" containerName="barbican-db-sync" Mar 09 09:27:32 crc kubenswrapper[4792]: I0309 09:27:32.542090 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-6f45884c58-b4trg" Mar 09 09:27:32 crc kubenswrapper[4792]: I0309 09:27:32.555938 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Mar 09 09:27:32 crc kubenswrapper[4792]: I0309 09:27:32.556166 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Mar 09 09:27:32 crc kubenswrapper[4792]: I0309 09:27:32.556279 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-68bdcc9765-czvxc"] Mar 09 09:27:32 crc kubenswrapper[4792]: I0309 09:27:32.556328 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-5vsw4" Mar 09 09:27:32 crc kubenswrapper[4792]: I0309 09:27:32.557575 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-68bdcc9765-czvxc" Mar 09 09:27:32 crc kubenswrapper[4792]: I0309 09:27:32.563361 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Mar 09 09:27:32 crc kubenswrapper[4792]: I0309 09:27:32.594490 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-6f45884c58-b4trg"] Mar 09 09:27:32 crc kubenswrapper[4792]: I0309 09:27:32.604860 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-68bdcc9765-czvxc"] Mar 09 09:27:32 crc kubenswrapper[4792]: I0309 09:27:32.619672 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7997db4c-9ed8-438f-86b3-558a6ed2be44-logs\") pod \"barbican-worker-68bdcc9765-czvxc\" (UID: \"7997db4c-9ed8-438f-86b3-558a6ed2be44\") " pod="openstack/barbican-worker-68bdcc9765-czvxc" Mar 09 09:27:32 crc kubenswrapper[4792]: I0309 09:27:32.619735 4792 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sbfs8\" (UniqueName: \"kubernetes.io/projected/f7122066-5687-409a-9d80-f39f2d96ad84-kube-api-access-sbfs8\") pod \"barbican-keystone-listener-6f45884c58-b4trg\" (UID: \"f7122066-5687-409a-9d80-f39f2d96ad84\") " pod="openstack/barbican-keystone-listener-6f45884c58-b4trg" Mar 09 09:27:32 crc kubenswrapper[4792]: I0309 09:27:32.619794 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7997db4c-9ed8-438f-86b3-558a6ed2be44-combined-ca-bundle\") pod \"barbican-worker-68bdcc9765-czvxc\" (UID: \"7997db4c-9ed8-438f-86b3-558a6ed2be44\") " pod="openstack/barbican-worker-68bdcc9765-czvxc" Mar 09 09:27:32 crc kubenswrapper[4792]: I0309 09:27:32.619826 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7997db4c-9ed8-438f-86b3-558a6ed2be44-config-data-custom\") pod \"barbican-worker-68bdcc9765-czvxc\" (UID: \"7997db4c-9ed8-438f-86b3-558a6ed2be44\") " pod="openstack/barbican-worker-68bdcc9765-czvxc" Mar 09 09:27:32 crc kubenswrapper[4792]: I0309 09:27:32.619868 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7122066-5687-409a-9d80-f39f2d96ad84-config-data\") pod \"barbican-keystone-listener-6f45884c58-b4trg\" (UID: \"f7122066-5687-409a-9d80-f39f2d96ad84\") " pod="openstack/barbican-keystone-listener-6f45884c58-b4trg" Mar 09 09:27:32 crc kubenswrapper[4792]: I0309 09:27:32.619902 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7997db4c-9ed8-438f-86b3-558a6ed2be44-config-data\") pod \"barbican-worker-68bdcc9765-czvxc\" (UID: \"7997db4c-9ed8-438f-86b3-558a6ed2be44\") " 
pod="openstack/barbican-worker-68bdcc9765-czvxc" Mar 09 09:27:32 crc kubenswrapper[4792]: I0309 09:27:32.619949 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7122066-5687-409a-9d80-f39f2d96ad84-combined-ca-bundle\") pod \"barbican-keystone-listener-6f45884c58-b4trg\" (UID: \"f7122066-5687-409a-9d80-f39f2d96ad84\") " pod="openstack/barbican-keystone-listener-6f45884c58-b4trg" Mar 09 09:27:32 crc kubenswrapper[4792]: I0309 09:27:32.619972 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f7122066-5687-409a-9d80-f39f2d96ad84-config-data-custom\") pod \"barbican-keystone-listener-6f45884c58-b4trg\" (UID: \"f7122066-5687-409a-9d80-f39f2d96ad84\") " pod="openstack/barbican-keystone-listener-6f45884c58-b4trg" Mar 09 09:27:32 crc kubenswrapper[4792]: I0309 09:27:32.620002 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2vmpm\" (UniqueName: \"kubernetes.io/projected/7997db4c-9ed8-438f-86b3-558a6ed2be44-kube-api-access-2vmpm\") pod \"barbican-worker-68bdcc9765-czvxc\" (UID: \"7997db4c-9ed8-438f-86b3-558a6ed2be44\") " pod="openstack/barbican-worker-68bdcc9765-czvxc" Mar 09 09:27:32 crc kubenswrapper[4792]: I0309 09:27:32.620028 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f7122066-5687-409a-9d80-f39f2d96ad84-logs\") pod \"barbican-keystone-listener-6f45884c58-b4trg\" (UID: \"f7122066-5687-409a-9d80-f39f2d96ad84\") " pod="openstack/barbican-keystone-listener-6f45884c58-b4trg" Mar 09 09:27:32 crc kubenswrapper[4792]: I0309 09:27:32.690466 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-775688cbd9-v7bqq"] Mar 09 09:27:32 crc kubenswrapper[4792]: I0309 
09:27:32.693371 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-775688cbd9-v7bqq" Mar 09 09:27:32 crc kubenswrapper[4792]: I0309 09:27:32.718378 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-775688cbd9-v7bqq"] Mar 09 09:27:32 crc kubenswrapper[4792]: I0309 09:27:32.721702 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7997db4c-9ed8-438f-86b3-558a6ed2be44-config-data\") pod \"barbican-worker-68bdcc9765-czvxc\" (UID: \"7997db4c-9ed8-438f-86b3-558a6ed2be44\") " pod="openstack/barbican-worker-68bdcc9765-czvxc" Mar 09 09:27:32 crc kubenswrapper[4792]: I0309 09:27:32.721778 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7122066-5687-409a-9d80-f39f2d96ad84-combined-ca-bundle\") pod \"barbican-keystone-listener-6f45884c58-b4trg\" (UID: \"f7122066-5687-409a-9d80-f39f2d96ad84\") " pod="openstack/barbican-keystone-listener-6f45884c58-b4trg" Mar 09 09:27:32 crc kubenswrapper[4792]: I0309 09:27:32.721799 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f7122066-5687-409a-9d80-f39f2d96ad84-config-data-custom\") pod \"barbican-keystone-listener-6f45884c58-b4trg\" (UID: \"f7122066-5687-409a-9d80-f39f2d96ad84\") " pod="openstack/barbican-keystone-listener-6f45884c58-b4trg" Mar 09 09:27:32 crc kubenswrapper[4792]: I0309 09:27:32.721836 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2vmpm\" (UniqueName: \"kubernetes.io/projected/7997db4c-9ed8-438f-86b3-558a6ed2be44-kube-api-access-2vmpm\") pod \"barbican-worker-68bdcc9765-czvxc\" (UID: \"7997db4c-9ed8-438f-86b3-558a6ed2be44\") " pod="openstack/barbican-worker-68bdcc9765-czvxc" Mar 09 09:27:32 crc kubenswrapper[4792]: I0309 
09:27:32.721877 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f7122066-5687-409a-9d80-f39f2d96ad84-logs\") pod \"barbican-keystone-listener-6f45884c58-b4trg\" (UID: \"f7122066-5687-409a-9d80-f39f2d96ad84\") " pod="openstack/barbican-keystone-listener-6f45884c58-b4trg" Mar 09 09:27:32 crc kubenswrapper[4792]: I0309 09:27:32.721964 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sbfs8\" (UniqueName: \"kubernetes.io/projected/f7122066-5687-409a-9d80-f39f2d96ad84-kube-api-access-sbfs8\") pod \"barbican-keystone-listener-6f45884c58-b4trg\" (UID: \"f7122066-5687-409a-9d80-f39f2d96ad84\") " pod="openstack/barbican-keystone-listener-6f45884c58-b4trg" Mar 09 09:27:32 crc kubenswrapper[4792]: I0309 09:27:32.721981 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7997db4c-9ed8-438f-86b3-558a6ed2be44-logs\") pod \"barbican-worker-68bdcc9765-czvxc\" (UID: \"7997db4c-9ed8-438f-86b3-558a6ed2be44\") " pod="openstack/barbican-worker-68bdcc9765-czvxc" Mar 09 09:27:32 crc kubenswrapper[4792]: I0309 09:27:32.722038 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7997db4c-9ed8-438f-86b3-558a6ed2be44-combined-ca-bundle\") pod \"barbican-worker-68bdcc9765-czvxc\" (UID: \"7997db4c-9ed8-438f-86b3-558a6ed2be44\") " pod="openstack/barbican-worker-68bdcc9765-czvxc" Mar 09 09:27:32 crc kubenswrapper[4792]: I0309 09:27:32.722085 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7997db4c-9ed8-438f-86b3-558a6ed2be44-config-data-custom\") pod \"barbican-worker-68bdcc9765-czvxc\" (UID: \"7997db4c-9ed8-438f-86b3-558a6ed2be44\") " pod="openstack/barbican-worker-68bdcc9765-czvxc" Mar 09 09:27:32 crc 
kubenswrapper[4792]: I0309 09:27:32.722178 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7122066-5687-409a-9d80-f39f2d96ad84-config-data\") pod \"barbican-keystone-listener-6f45884c58-b4trg\" (UID: \"f7122066-5687-409a-9d80-f39f2d96ad84\") " pod="openstack/barbican-keystone-listener-6f45884c58-b4trg" Mar 09 09:27:32 crc kubenswrapper[4792]: I0309 09:27:32.723392 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f7122066-5687-409a-9d80-f39f2d96ad84-logs\") pod \"barbican-keystone-listener-6f45884c58-b4trg\" (UID: \"f7122066-5687-409a-9d80-f39f2d96ad84\") " pod="openstack/barbican-keystone-listener-6f45884c58-b4trg" Mar 09 09:27:32 crc kubenswrapper[4792]: I0309 09:27:32.725719 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7997db4c-9ed8-438f-86b3-558a6ed2be44-logs\") pod \"barbican-worker-68bdcc9765-czvxc\" (UID: \"7997db4c-9ed8-438f-86b3-558a6ed2be44\") " pod="openstack/barbican-worker-68bdcc9765-czvxc" Mar 09 09:27:32 crc kubenswrapper[4792]: I0309 09:27:32.739780 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7122066-5687-409a-9d80-f39f2d96ad84-config-data\") pod \"barbican-keystone-listener-6f45884c58-b4trg\" (UID: \"f7122066-5687-409a-9d80-f39f2d96ad84\") " pod="openstack/barbican-keystone-listener-6f45884c58-b4trg" Mar 09 09:27:32 crc kubenswrapper[4792]: I0309 09:27:32.740654 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7122066-5687-409a-9d80-f39f2d96ad84-combined-ca-bundle\") pod \"barbican-keystone-listener-6f45884c58-b4trg\" (UID: \"f7122066-5687-409a-9d80-f39f2d96ad84\") " pod="openstack/barbican-keystone-listener-6f45884c58-b4trg" Mar 09 09:27:32 crc 
kubenswrapper[4792]: I0309 09:27:32.746388 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7997db4c-9ed8-438f-86b3-558a6ed2be44-config-data\") pod \"barbican-worker-68bdcc9765-czvxc\" (UID: \"7997db4c-9ed8-438f-86b3-558a6ed2be44\") " pod="openstack/barbican-worker-68bdcc9765-czvxc" Mar 09 09:27:32 crc kubenswrapper[4792]: I0309 09:27:32.750164 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f7122066-5687-409a-9d80-f39f2d96ad84-config-data-custom\") pod \"barbican-keystone-listener-6f45884c58-b4trg\" (UID: \"f7122066-5687-409a-9d80-f39f2d96ad84\") " pod="openstack/barbican-keystone-listener-6f45884c58-b4trg" Mar 09 09:27:32 crc kubenswrapper[4792]: I0309 09:27:32.772214 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7997db4c-9ed8-438f-86b3-558a6ed2be44-combined-ca-bundle\") pod \"barbican-worker-68bdcc9765-czvxc\" (UID: \"7997db4c-9ed8-438f-86b3-558a6ed2be44\") " pod="openstack/barbican-worker-68bdcc9765-czvxc" Mar 09 09:27:32 crc kubenswrapper[4792]: I0309 09:27:32.772929 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2vmpm\" (UniqueName: \"kubernetes.io/projected/7997db4c-9ed8-438f-86b3-558a6ed2be44-kube-api-access-2vmpm\") pod \"barbican-worker-68bdcc9765-czvxc\" (UID: \"7997db4c-9ed8-438f-86b3-558a6ed2be44\") " pod="openstack/barbican-worker-68bdcc9765-czvxc" Mar 09 09:27:32 crc kubenswrapper[4792]: I0309 09:27:32.790993 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sbfs8\" (UniqueName: \"kubernetes.io/projected/f7122066-5687-409a-9d80-f39f2d96ad84-kube-api-access-sbfs8\") pod \"barbican-keystone-listener-6f45884c58-b4trg\" (UID: \"f7122066-5687-409a-9d80-f39f2d96ad84\") " pod="openstack/barbican-keystone-listener-6f45884c58-b4trg" 
Mar 09 09:27:32 crc kubenswrapper[4792]: I0309 09:27:32.799619 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7997db4c-9ed8-438f-86b3-558a6ed2be44-config-data-custom\") pod \"barbican-worker-68bdcc9765-czvxc\" (UID: \"7997db4c-9ed8-438f-86b3-558a6ed2be44\") " pod="openstack/barbican-worker-68bdcc9765-czvxc" Mar 09 09:27:32 crc kubenswrapper[4792]: I0309 09:27:32.830159 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cb47898d-02c0-4123-8d41-eb2b31580531-ovsdbserver-sb\") pod \"dnsmasq-dns-775688cbd9-v7bqq\" (UID: \"cb47898d-02c0-4123-8d41-eb2b31580531\") " pod="openstack/dnsmasq-dns-775688cbd9-v7bqq" Mar 09 09:27:32 crc kubenswrapper[4792]: I0309 09:27:32.830210 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cb47898d-02c0-4123-8d41-eb2b31580531-ovsdbserver-nb\") pod \"dnsmasq-dns-775688cbd9-v7bqq\" (UID: \"cb47898d-02c0-4123-8d41-eb2b31580531\") " pod="openstack/dnsmasq-dns-775688cbd9-v7bqq" Mar 09 09:27:32 crc kubenswrapper[4792]: I0309 09:27:32.830248 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cb47898d-02c0-4123-8d41-eb2b31580531-config\") pod \"dnsmasq-dns-775688cbd9-v7bqq\" (UID: \"cb47898d-02c0-4123-8d41-eb2b31580531\") " pod="openstack/dnsmasq-dns-775688cbd9-v7bqq" Mar 09 09:27:32 crc kubenswrapper[4792]: I0309 09:27:32.830290 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6ch2h\" (UniqueName: \"kubernetes.io/projected/cb47898d-02c0-4123-8d41-eb2b31580531-kube-api-access-6ch2h\") pod \"dnsmasq-dns-775688cbd9-v7bqq\" (UID: \"cb47898d-02c0-4123-8d41-eb2b31580531\") " 
pod="openstack/dnsmasq-dns-775688cbd9-v7bqq" Mar 09 09:27:32 crc kubenswrapper[4792]: I0309 09:27:32.830326 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cb47898d-02c0-4123-8d41-eb2b31580531-dns-svc\") pod \"dnsmasq-dns-775688cbd9-v7bqq\" (UID: \"cb47898d-02c0-4123-8d41-eb2b31580531\") " pod="openstack/dnsmasq-dns-775688cbd9-v7bqq" Mar 09 09:27:32 crc kubenswrapper[4792]: I0309 09:27:32.889276 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-755d469bb8-fhbxg"] Mar 09 09:27:32 crc kubenswrapper[4792]: I0309 09:27:32.890982 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-755d469bb8-fhbxg" Mar 09 09:27:32 crc kubenswrapper[4792]: I0309 09:27:32.897697 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Mar 09 09:27:32 crc kubenswrapper[4792]: I0309 09:27:32.907165 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-755d469bb8-fhbxg"] Mar 09 09:27:32 crc kubenswrapper[4792]: I0309 09:27:32.936992 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-6f45884c58-b4trg" Mar 09 09:27:32 crc kubenswrapper[4792]: I0309 09:27:32.946562 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6ch2h\" (UniqueName: \"kubernetes.io/projected/cb47898d-02c0-4123-8d41-eb2b31580531-kube-api-access-6ch2h\") pod \"dnsmasq-dns-775688cbd9-v7bqq\" (UID: \"cb47898d-02c0-4123-8d41-eb2b31580531\") " pod="openstack/dnsmasq-dns-775688cbd9-v7bqq" Mar 09 09:27:32 crc kubenswrapper[4792]: I0309 09:27:32.946612 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/00e48c59-681a-4940-a088-c66078a15bb3-config-data-custom\") pod \"barbican-api-755d469bb8-fhbxg\" (UID: \"00e48c59-681a-4940-a088-c66078a15bb3\") " pod="openstack/barbican-api-755d469bb8-fhbxg" Mar 09 09:27:32 crc kubenswrapper[4792]: I0309 09:27:32.946654 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cb47898d-02c0-4123-8d41-eb2b31580531-dns-svc\") pod \"dnsmasq-dns-775688cbd9-v7bqq\" (UID: \"cb47898d-02c0-4123-8d41-eb2b31580531\") " pod="openstack/dnsmasq-dns-775688cbd9-v7bqq" Mar 09 09:27:32 crc kubenswrapper[4792]: I0309 09:27:32.946689 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00e48c59-681a-4940-a088-c66078a15bb3-combined-ca-bundle\") pod \"barbican-api-755d469bb8-fhbxg\" (UID: \"00e48c59-681a-4940-a088-c66078a15bb3\") " pod="openstack/barbican-api-755d469bb8-fhbxg" Mar 09 09:27:32 crc kubenswrapper[4792]: I0309 09:27:32.946713 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kkzhd\" (UniqueName: \"kubernetes.io/projected/00e48c59-681a-4940-a088-c66078a15bb3-kube-api-access-kkzhd\") pod 
\"barbican-api-755d469bb8-fhbxg\" (UID: \"00e48c59-681a-4940-a088-c66078a15bb3\") " pod="openstack/barbican-api-755d469bb8-fhbxg" Mar 09 09:27:32 crc kubenswrapper[4792]: I0309 09:27:32.946730 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/00e48c59-681a-4940-a088-c66078a15bb3-config-data\") pod \"barbican-api-755d469bb8-fhbxg\" (UID: \"00e48c59-681a-4940-a088-c66078a15bb3\") " pod="openstack/barbican-api-755d469bb8-fhbxg" Mar 09 09:27:32 crc kubenswrapper[4792]: I0309 09:27:32.946787 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cb47898d-02c0-4123-8d41-eb2b31580531-ovsdbserver-sb\") pod \"dnsmasq-dns-775688cbd9-v7bqq\" (UID: \"cb47898d-02c0-4123-8d41-eb2b31580531\") " pod="openstack/dnsmasq-dns-775688cbd9-v7bqq" Mar 09 09:27:32 crc kubenswrapper[4792]: I0309 09:27:32.946811 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cb47898d-02c0-4123-8d41-eb2b31580531-ovsdbserver-nb\") pod \"dnsmasq-dns-775688cbd9-v7bqq\" (UID: \"cb47898d-02c0-4123-8d41-eb2b31580531\") " pod="openstack/dnsmasq-dns-775688cbd9-v7bqq" Mar 09 09:27:32 crc kubenswrapper[4792]: I0309 09:27:32.946845 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/00e48c59-681a-4940-a088-c66078a15bb3-logs\") pod \"barbican-api-755d469bb8-fhbxg\" (UID: \"00e48c59-681a-4940-a088-c66078a15bb3\") " pod="openstack/barbican-api-755d469bb8-fhbxg" Mar 09 09:27:32 crc kubenswrapper[4792]: I0309 09:27:32.946865 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cb47898d-02c0-4123-8d41-eb2b31580531-config\") pod \"dnsmasq-dns-775688cbd9-v7bqq\" (UID: 
\"cb47898d-02c0-4123-8d41-eb2b31580531\") " pod="openstack/dnsmasq-dns-775688cbd9-v7bqq" Mar 09 09:27:32 crc kubenswrapper[4792]: I0309 09:27:32.947749 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cb47898d-02c0-4123-8d41-eb2b31580531-config\") pod \"dnsmasq-dns-775688cbd9-v7bqq\" (UID: \"cb47898d-02c0-4123-8d41-eb2b31580531\") " pod="openstack/dnsmasq-dns-775688cbd9-v7bqq" Mar 09 09:27:32 crc kubenswrapper[4792]: I0309 09:27:32.948629 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cb47898d-02c0-4123-8d41-eb2b31580531-dns-svc\") pod \"dnsmasq-dns-775688cbd9-v7bqq\" (UID: \"cb47898d-02c0-4123-8d41-eb2b31580531\") " pod="openstack/dnsmasq-dns-775688cbd9-v7bqq" Mar 09 09:27:32 crc kubenswrapper[4792]: I0309 09:27:32.949171 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cb47898d-02c0-4123-8d41-eb2b31580531-ovsdbserver-sb\") pod \"dnsmasq-dns-775688cbd9-v7bqq\" (UID: \"cb47898d-02c0-4123-8d41-eb2b31580531\") " pod="openstack/dnsmasq-dns-775688cbd9-v7bqq" Mar 09 09:27:32 crc kubenswrapper[4792]: I0309 09:27:32.949720 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cb47898d-02c0-4123-8d41-eb2b31580531-ovsdbserver-nb\") pod \"dnsmasq-dns-775688cbd9-v7bqq\" (UID: \"cb47898d-02c0-4123-8d41-eb2b31580531\") " pod="openstack/dnsmasq-dns-775688cbd9-v7bqq" Mar 09 09:27:32 crc kubenswrapper[4792]: I0309 09:27:32.956483 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-68bdcc9765-czvxc" Mar 09 09:27:32 crc kubenswrapper[4792]: I0309 09:27:32.983899 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6ch2h\" (UniqueName: \"kubernetes.io/projected/cb47898d-02c0-4123-8d41-eb2b31580531-kube-api-access-6ch2h\") pod \"dnsmasq-dns-775688cbd9-v7bqq\" (UID: \"cb47898d-02c0-4123-8d41-eb2b31580531\") " pod="openstack/dnsmasq-dns-775688cbd9-v7bqq" Mar 09 09:27:33 crc kubenswrapper[4792]: I0309 09:27:33.049002 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00e48c59-681a-4940-a088-c66078a15bb3-combined-ca-bundle\") pod \"barbican-api-755d469bb8-fhbxg\" (UID: \"00e48c59-681a-4940-a088-c66078a15bb3\") " pod="openstack/barbican-api-755d469bb8-fhbxg" Mar 09 09:27:33 crc kubenswrapper[4792]: I0309 09:27:33.049348 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kkzhd\" (UniqueName: \"kubernetes.io/projected/00e48c59-681a-4940-a088-c66078a15bb3-kube-api-access-kkzhd\") pod \"barbican-api-755d469bb8-fhbxg\" (UID: \"00e48c59-681a-4940-a088-c66078a15bb3\") " pod="openstack/barbican-api-755d469bb8-fhbxg" Mar 09 09:27:33 crc kubenswrapper[4792]: I0309 09:27:33.049370 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/00e48c59-681a-4940-a088-c66078a15bb3-config-data\") pod \"barbican-api-755d469bb8-fhbxg\" (UID: \"00e48c59-681a-4940-a088-c66078a15bb3\") " pod="openstack/barbican-api-755d469bb8-fhbxg" Mar 09 09:27:33 crc kubenswrapper[4792]: I0309 09:27:33.049455 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/00e48c59-681a-4940-a088-c66078a15bb3-logs\") pod \"barbican-api-755d469bb8-fhbxg\" (UID: \"00e48c59-681a-4940-a088-c66078a15bb3\") " 
pod="openstack/barbican-api-755d469bb8-fhbxg" Mar 09 09:27:33 crc kubenswrapper[4792]: I0309 09:27:33.049503 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/00e48c59-681a-4940-a088-c66078a15bb3-config-data-custom\") pod \"barbican-api-755d469bb8-fhbxg\" (UID: \"00e48c59-681a-4940-a088-c66078a15bb3\") " pod="openstack/barbican-api-755d469bb8-fhbxg" Mar 09 09:27:33 crc kubenswrapper[4792]: I0309 09:27:33.051259 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/00e48c59-681a-4940-a088-c66078a15bb3-logs\") pod \"barbican-api-755d469bb8-fhbxg\" (UID: \"00e48c59-681a-4940-a088-c66078a15bb3\") " pod="openstack/barbican-api-755d469bb8-fhbxg" Mar 09 09:27:33 crc kubenswrapper[4792]: I0309 09:27:33.057273 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/00e48c59-681a-4940-a088-c66078a15bb3-config-data-custom\") pod \"barbican-api-755d469bb8-fhbxg\" (UID: \"00e48c59-681a-4940-a088-c66078a15bb3\") " pod="openstack/barbican-api-755d469bb8-fhbxg" Mar 09 09:27:33 crc kubenswrapper[4792]: I0309 09:27:33.058774 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/00e48c59-681a-4940-a088-c66078a15bb3-config-data\") pod \"barbican-api-755d469bb8-fhbxg\" (UID: \"00e48c59-681a-4940-a088-c66078a15bb3\") " pod="openstack/barbican-api-755d469bb8-fhbxg" Mar 09 09:27:33 crc kubenswrapper[4792]: I0309 09:27:33.064311 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00e48c59-681a-4940-a088-c66078a15bb3-combined-ca-bundle\") pod \"barbican-api-755d469bb8-fhbxg\" (UID: \"00e48c59-681a-4940-a088-c66078a15bb3\") " pod="openstack/barbican-api-755d469bb8-fhbxg" Mar 09 09:27:33 crc kubenswrapper[4792]: 
I0309 09:27:33.087515 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kkzhd\" (UniqueName: \"kubernetes.io/projected/00e48c59-681a-4940-a088-c66078a15bb3-kube-api-access-kkzhd\") pod \"barbican-api-755d469bb8-fhbxg\" (UID: \"00e48c59-681a-4940-a088-c66078a15bb3\") " pod="openstack/barbican-api-755d469bb8-fhbxg" Mar 09 09:27:33 crc kubenswrapper[4792]: I0309 09:27:33.199480 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-775688cbd9-v7bqq" Mar 09 09:27:33 crc kubenswrapper[4792]: I0309 09:27:33.271996 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6b07b74f-754c-43a7-97fb-a0fccb9c5df4","Type":"ContainerStarted","Data":"84e0df3f63ac0b463918173b2faafdefac74af5fb2dcfe9081d17d537acbc6ff"} Mar 09 09:27:33 crc kubenswrapper[4792]: I0309 09:27:33.273330 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 09 09:27:33 crc kubenswrapper[4792]: I0309 09:27:33.313678 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.829612459 podStartE2EDuration="5.313655748s" podCreationTimestamp="2026-03-09 09:27:28 +0000 UTC" firstStartedPulling="2026-03-09 09:27:29.248155988 +0000 UTC m=+1214.278356740" lastFinishedPulling="2026-03-09 09:27:32.732199277 +0000 UTC m=+1217.762400029" observedRunningTime="2026-03-09 09:27:33.303475652 +0000 UTC m=+1218.333676424" watchObservedRunningTime="2026-03-09 09:27:33.313655748 +0000 UTC m=+1218.343856500" Mar 09 09:27:33 crc kubenswrapper[4792]: I0309 09:27:33.322384 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-755d469bb8-fhbxg" Mar 09 09:27:33 crc kubenswrapper[4792]: I0309 09:27:33.533095 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-6f45884c58-b4trg"] Mar 09 09:27:33 crc kubenswrapper[4792]: E0309 09:27:33.575803 4792 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6f4bcabb_7f34_423b_a653_bd785eba0978.slice/crio-conmon-a1179b82779e6892111d8b44fd02a95030fabdd17e9af21a4abe98e7fa23b850.scope\": RecentStats: unable to find data in memory cache]" Mar 09 09:27:33 crc kubenswrapper[4792]: I0309 09:27:33.715501 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-68bdcc9765-czvxc"] Mar 09 09:27:33 crc kubenswrapper[4792]: I0309 09:27:33.825898 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-755d469bb8-fhbxg"] Mar 09 09:27:33 crc kubenswrapper[4792]: W0309 09:27:33.991421 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcb47898d_02c0_4123_8d41_eb2b31580531.slice/crio-f47a67104bd98e21fb8199f6e90303a4bce36b8fa7139f91774135af32714e2d WatchSource:0}: Error finding container f47a67104bd98e21fb8199f6e90303a4bce36b8fa7139f91774135af32714e2d: Status 404 returned error can't find the container with id f47a67104bd98e21fb8199f6e90303a4bce36b8fa7139f91774135af32714e2d Mar 09 09:27:33 crc kubenswrapper[4792]: I0309 09:27:33.996676 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-775688cbd9-v7bqq"] Mar 09 09:27:34 crc kubenswrapper[4792]: I0309 09:27:34.286620 4792 generic.go:334] "Generic (PLEG): container finished" podID="6f4bcabb-7f34-423b-a653-bd785eba0978" containerID="a1179b82779e6892111d8b44fd02a95030fabdd17e9af21a4abe98e7fa23b850" exitCode=0 Mar 09 09:27:34 crc 
kubenswrapper[4792]: I0309 09:27:34.286680 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-gkkjx" event={"ID":"6f4bcabb-7f34-423b-a653-bd785eba0978","Type":"ContainerDied","Data":"a1179b82779e6892111d8b44fd02a95030fabdd17e9af21a4abe98e7fa23b850"} Mar 09 09:27:34 crc kubenswrapper[4792]: I0309 09:27:34.296604 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-755d469bb8-fhbxg" event={"ID":"00e48c59-681a-4940-a088-c66078a15bb3","Type":"ContainerStarted","Data":"e23c1ade70b8647a0748f490afbcf6c7ed21a136db3f14025a113637646e446e"} Mar 09 09:27:34 crc kubenswrapper[4792]: I0309 09:27:34.296649 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-755d469bb8-fhbxg" event={"ID":"00e48c59-681a-4940-a088-c66078a15bb3","Type":"ContainerStarted","Data":"8f61f3604571733d087e67699948ee5771395b61a67f15ab18c9793f3b69bf65"} Mar 09 09:27:34 crc kubenswrapper[4792]: I0309 09:27:34.299353 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6f45884c58-b4trg" event={"ID":"f7122066-5687-409a-9d80-f39f2d96ad84","Type":"ContainerStarted","Data":"21dc97cb8672cffb1e0413043c1c5c44256846e9efdfedb7c459e6613f97d0d7"} Mar 09 09:27:34 crc kubenswrapper[4792]: I0309 09:27:34.307863 4792 generic.go:334] "Generic (PLEG): container finished" podID="cb47898d-02c0-4123-8d41-eb2b31580531" containerID="a3bc4033441b0ab4f53c8221377d25e47a630587ba8eb39d163c2dc3b42a6556" exitCode=0 Mar 09 09:27:34 crc kubenswrapper[4792]: I0309 09:27:34.307928 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-775688cbd9-v7bqq" event={"ID":"cb47898d-02c0-4123-8d41-eb2b31580531","Type":"ContainerDied","Data":"a3bc4033441b0ab4f53c8221377d25e47a630587ba8eb39d163c2dc3b42a6556"} Mar 09 09:27:34 crc kubenswrapper[4792]: I0309 09:27:34.307960 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-775688cbd9-v7bqq" 
event={"ID":"cb47898d-02c0-4123-8d41-eb2b31580531","Type":"ContainerStarted","Data":"f47a67104bd98e21fb8199f6e90303a4bce36b8fa7139f91774135af32714e2d"} Mar 09 09:27:34 crc kubenswrapper[4792]: I0309 09:27:34.311817 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-68bdcc9765-czvxc" event={"ID":"7997db4c-9ed8-438f-86b3-558a6ed2be44","Type":"ContainerStarted","Data":"6370ff9770b7b96c998fdccfd428830bf9d20e7273df36928dfa496ee48f27af"} Mar 09 09:27:35 crc kubenswrapper[4792]: I0309 09:27:35.331547 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-775688cbd9-v7bqq" event={"ID":"cb47898d-02c0-4123-8d41-eb2b31580531","Type":"ContainerStarted","Data":"2d2169367019d7143c545a41fee348f30f2a6cab68ec66d87d9e6f68988b3bb2"} Mar 09 09:27:35 crc kubenswrapper[4792]: I0309 09:27:35.331874 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-775688cbd9-v7bqq" Mar 09 09:27:35 crc kubenswrapper[4792]: I0309 09:27:35.344143 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-755d469bb8-fhbxg" event={"ID":"00e48c59-681a-4940-a088-c66078a15bb3","Type":"ContainerStarted","Data":"f24152846dc01f72d7db598e8a5a5ddcae301b7e3a4c186283f7fd5a61a395e1"} Mar 09 09:27:35 crc kubenswrapper[4792]: I0309 09:27:35.344216 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-755d469bb8-fhbxg" Mar 09 09:27:35 crc kubenswrapper[4792]: I0309 09:27:35.344505 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-755d469bb8-fhbxg" Mar 09 09:27:35 crc kubenswrapper[4792]: I0309 09:27:35.364215 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-775688cbd9-v7bqq" podStartSLOduration=3.364197469 podStartE2EDuration="3.364197469s" podCreationTimestamp="2026-03-09 09:27:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:27:35.352398527 +0000 UTC m=+1220.382599279" watchObservedRunningTime="2026-03-09 09:27:35.364197469 +0000 UTC m=+1220.394398221" Mar 09 09:27:35 crc kubenswrapper[4792]: I0309 09:27:35.375686 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-755d469bb8-fhbxg" podStartSLOduration=3.3756720319999998 podStartE2EDuration="3.375672032s" podCreationTimestamp="2026-03-09 09:27:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:27:35.369986507 +0000 UTC m=+1220.400187279" watchObservedRunningTime="2026-03-09 09:27:35.375672032 +0000 UTC m=+1220.405872784" Mar 09 09:27:35 crc kubenswrapper[4792]: I0309 09:27:35.810415 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-78f7c77b76-nw94r"] Mar 09 09:27:35 crc kubenswrapper[4792]: I0309 09:27:35.812038 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-78f7c77b76-nw94r" Mar 09 09:27:35 crc kubenswrapper[4792]: I0309 09:27:35.814496 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Mar 09 09:27:35 crc kubenswrapper[4792]: I0309 09:27:35.814751 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Mar 09 09:27:35 crc kubenswrapper[4792]: I0309 09:27:35.836223 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-78f7c77b76-nw94r"] Mar 09 09:27:35 crc kubenswrapper[4792]: I0309 09:27:35.917981 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b4c9d79-a45e-457d-be41-ea8535f122c6-combined-ca-bundle\") pod \"barbican-api-78f7c77b76-nw94r\" (UID: \"2b4c9d79-a45e-457d-be41-ea8535f122c6\") " pod="openstack/barbican-api-78f7c77b76-nw94r" Mar 09 09:27:35 crc kubenswrapper[4792]: I0309 09:27:35.918357 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wnhj2\" (UniqueName: \"kubernetes.io/projected/2b4c9d79-a45e-457d-be41-ea8535f122c6-kube-api-access-wnhj2\") pod \"barbican-api-78f7c77b76-nw94r\" (UID: \"2b4c9d79-a45e-457d-be41-ea8535f122c6\") " pod="openstack/barbican-api-78f7c77b76-nw94r" Mar 09 09:27:35 crc kubenswrapper[4792]: I0309 09:27:35.918378 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b4c9d79-a45e-457d-be41-ea8535f122c6-internal-tls-certs\") pod \"barbican-api-78f7c77b76-nw94r\" (UID: \"2b4c9d79-a45e-457d-be41-ea8535f122c6\") " pod="openstack/barbican-api-78f7c77b76-nw94r" Mar 09 09:27:35 crc kubenswrapper[4792]: I0309 09:27:35.918432 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2b4c9d79-a45e-457d-be41-ea8535f122c6-config-data-custom\") pod \"barbican-api-78f7c77b76-nw94r\" (UID: \"2b4c9d79-a45e-457d-be41-ea8535f122c6\") " pod="openstack/barbican-api-78f7c77b76-nw94r" Mar 09 09:27:35 crc kubenswrapper[4792]: I0309 09:27:35.918452 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2b4c9d79-a45e-457d-be41-ea8535f122c6-logs\") pod \"barbican-api-78f7c77b76-nw94r\" (UID: \"2b4c9d79-a45e-457d-be41-ea8535f122c6\") " pod="openstack/barbican-api-78f7c77b76-nw94r" Mar 09 09:27:35 crc kubenswrapper[4792]: I0309 09:27:35.918491 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b4c9d79-a45e-457d-be41-ea8535f122c6-config-data\") pod \"barbican-api-78f7c77b76-nw94r\" (UID: \"2b4c9d79-a45e-457d-be41-ea8535f122c6\") " pod="openstack/barbican-api-78f7c77b76-nw94r" Mar 09 09:27:35 crc kubenswrapper[4792]: I0309 09:27:35.918534 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b4c9d79-a45e-457d-be41-ea8535f122c6-public-tls-certs\") pod \"barbican-api-78f7c77b76-nw94r\" (UID: \"2b4c9d79-a45e-457d-be41-ea8535f122c6\") " pod="openstack/barbican-api-78f7c77b76-nw94r" Mar 09 09:27:36 crc kubenswrapper[4792]: I0309 09:27:36.019842 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b4c9d79-a45e-457d-be41-ea8535f122c6-combined-ca-bundle\") pod \"barbican-api-78f7c77b76-nw94r\" (UID: \"2b4c9d79-a45e-457d-be41-ea8535f122c6\") " pod="openstack/barbican-api-78f7c77b76-nw94r" Mar 09 09:27:36 crc kubenswrapper[4792]: I0309 09:27:36.019898 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-wnhj2\" (UniqueName: \"kubernetes.io/projected/2b4c9d79-a45e-457d-be41-ea8535f122c6-kube-api-access-wnhj2\") pod \"barbican-api-78f7c77b76-nw94r\" (UID: \"2b4c9d79-a45e-457d-be41-ea8535f122c6\") " pod="openstack/barbican-api-78f7c77b76-nw94r" Mar 09 09:27:36 crc kubenswrapper[4792]: I0309 09:27:36.019924 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b4c9d79-a45e-457d-be41-ea8535f122c6-internal-tls-certs\") pod \"barbican-api-78f7c77b76-nw94r\" (UID: \"2b4c9d79-a45e-457d-be41-ea8535f122c6\") " pod="openstack/barbican-api-78f7c77b76-nw94r" Mar 09 09:27:36 crc kubenswrapper[4792]: I0309 09:27:36.019980 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2b4c9d79-a45e-457d-be41-ea8535f122c6-config-data-custom\") pod \"barbican-api-78f7c77b76-nw94r\" (UID: \"2b4c9d79-a45e-457d-be41-ea8535f122c6\") " pod="openstack/barbican-api-78f7c77b76-nw94r" Mar 09 09:27:36 crc kubenswrapper[4792]: I0309 09:27:36.020002 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2b4c9d79-a45e-457d-be41-ea8535f122c6-logs\") pod \"barbican-api-78f7c77b76-nw94r\" (UID: \"2b4c9d79-a45e-457d-be41-ea8535f122c6\") " pod="openstack/barbican-api-78f7c77b76-nw94r" Mar 09 09:27:36 crc kubenswrapper[4792]: I0309 09:27:36.020051 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b4c9d79-a45e-457d-be41-ea8535f122c6-config-data\") pod \"barbican-api-78f7c77b76-nw94r\" (UID: \"2b4c9d79-a45e-457d-be41-ea8535f122c6\") " pod="openstack/barbican-api-78f7c77b76-nw94r" Mar 09 09:27:36 crc kubenswrapper[4792]: I0309 09:27:36.020144 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/2b4c9d79-a45e-457d-be41-ea8535f122c6-public-tls-certs\") pod \"barbican-api-78f7c77b76-nw94r\" (UID: \"2b4c9d79-a45e-457d-be41-ea8535f122c6\") " pod="openstack/barbican-api-78f7c77b76-nw94r" Mar 09 09:27:36 crc kubenswrapper[4792]: I0309 09:27:36.025895 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2b4c9d79-a45e-457d-be41-ea8535f122c6-logs\") pod \"barbican-api-78f7c77b76-nw94r\" (UID: \"2b4c9d79-a45e-457d-be41-ea8535f122c6\") " pod="openstack/barbican-api-78f7c77b76-nw94r" Mar 09 09:27:36 crc kubenswrapper[4792]: I0309 09:27:36.027687 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b4c9d79-a45e-457d-be41-ea8535f122c6-internal-tls-certs\") pod \"barbican-api-78f7c77b76-nw94r\" (UID: \"2b4c9d79-a45e-457d-be41-ea8535f122c6\") " pod="openstack/barbican-api-78f7c77b76-nw94r" Mar 09 09:27:36 crc kubenswrapper[4792]: I0309 09:27:36.028763 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2b4c9d79-a45e-457d-be41-ea8535f122c6-config-data-custom\") pod \"barbican-api-78f7c77b76-nw94r\" (UID: \"2b4c9d79-a45e-457d-be41-ea8535f122c6\") " pod="openstack/barbican-api-78f7c77b76-nw94r" Mar 09 09:27:36 crc kubenswrapper[4792]: I0309 09:27:36.030536 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b4c9d79-a45e-457d-be41-ea8535f122c6-public-tls-certs\") pod \"barbican-api-78f7c77b76-nw94r\" (UID: \"2b4c9d79-a45e-457d-be41-ea8535f122c6\") " pod="openstack/barbican-api-78f7c77b76-nw94r" Mar 09 09:27:36 crc kubenswrapper[4792]: I0309 09:27:36.031559 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b4c9d79-a45e-457d-be41-ea8535f122c6-config-data\") pod 
\"barbican-api-78f7c77b76-nw94r\" (UID: \"2b4c9d79-a45e-457d-be41-ea8535f122c6\") " pod="openstack/barbican-api-78f7c77b76-nw94r" Mar 09 09:27:36 crc kubenswrapper[4792]: I0309 09:27:36.032798 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b4c9d79-a45e-457d-be41-ea8535f122c6-combined-ca-bundle\") pod \"barbican-api-78f7c77b76-nw94r\" (UID: \"2b4c9d79-a45e-457d-be41-ea8535f122c6\") " pod="openstack/barbican-api-78f7c77b76-nw94r" Mar 09 09:27:36 crc kubenswrapper[4792]: I0309 09:27:36.042765 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wnhj2\" (UniqueName: \"kubernetes.io/projected/2b4c9d79-a45e-457d-be41-ea8535f122c6-kube-api-access-wnhj2\") pod \"barbican-api-78f7c77b76-nw94r\" (UID: \"2b4c9d79-a45e-457d-be41-ea8535f122c6\") " pod="openstack/barbican-api-78f7c77b76-nw94r" Mar 09 09:27:36 crc kubenswrapper[4792]: I0309 09:27:36.117309 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-gkkjx" Mar 09 09:27:36 crc kubenswrapper[4792]: I0309 09:27:36.134528 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-78f7c77b76-nw94r" Mar 09 09:27:36 crc kubenswrapper[4792]: I0309 09:27:36.224623 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f4bcabb-7f34-423b-a653-bd785eba0978-config-data\") pod \"6f4bcabb-7f34-423b-a653-bd785eba0978\" (UID: \"6f4bcabb-7f34-423b-a653-bd785eba0978\") " Mar 09 09:27:36 crc kubenswrapper[4792]: I0309 09:27:36.224706 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6f4bcabb-7f34-423b-a653-bd785eba0978-etc-machine-id\") pod \"6f4bcabb-7f34-423b-a653-bd785eba0978\" (UID: \"6f4bcabb-7f34-423b-a653-bd785eba0978\") " Mar 09 09:27:36 crc kubenswrapper[4792]: I0309 09:27:36.224765 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6f4bcabb-7f34-423b-a653-bd785eba0978-scripts\") pod \"6f4bcabb-7f34-423b-a653-bd785eba0978\" (UID: \"6f4bcabb-7f34-423b-a653-bd785eba0978\") " Mar 09 09:27:36 crc kubenswrapper[4792]: I0309 09:27:36.224882 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f4bcabb-7f34-423b-a653-bd785eba0978-combined-ca-bundle\") pod \"6f4bcabb-7f34-423b-a653-bd785eba0978\" (UID: \"6f4bcabb-7f34-423b-a653-bd785eba0978\") " Mar 09 09:27:36 crc kubenswrapper[4792]: I0309 09:27:36.224930 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qc6wl\" (UniqueName: \"kubernetes.io/projected/6f4bcabb-7f34-423b-a653-bd785eba0978-kube-api-access-qc6wl\") pod \"6f4bcabb-7f34-423b-a653-bd785eba0978\" (UID: \"6f4bcabb-7f34-423b-a653-bd785eba0978\") " Mar 09 09:27:36 crc kubenswrapper[4792]: I0309 09:27:36.225013 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6f4bcabb-7f34-423b-a653-bd785eba0978-db-sync-config-data\") pod \"6f4bcabb-7f34-423b-a653-bd785eba0978\" (UID: \"6f4bcabb-7f34-423b-a653-bd785eba0978\") " Mar 09 09:27:36 crc kubenswrapper[4792]: I0309 09:27:36.227668 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6f4bcabb-7f34-423b-a653-bd785eba0978-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "6f4bcabb-7f34-423b-a653-bd785eba0978" (UID: "6f4bcabb-7f34-423b-a653-bd785eba0978"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 09:27:36 crc kubenswrapper[4792]: I0309 09:27:36.237112 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f4bcabb-7f34-423b-a653-bd785eba0978-kube-api-access-qc6wl" (OuterVolumeSpecName: "kube-api-access-qc6wl") pod "6f4bcabb-7f34-423b-a653-bd785eba0978" (UID: "6f4bcabb-7f34-423b-a653-bd785eba0978"). InnerVolumeSpecName "kube-api-access-qc6wl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:27:36 crc kubenswrapper[4792]: I0309 09:27:36.239296 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f4bcabb-7f34-423b-a653-bd785eba0978-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "6f4bcabb-7f34-423b-a653-bd785eba0978" (UID: "6f4bcabb-7f34-423b-a653-bd785eba0978"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:27:36 crc kubenswrapper[4792]: I0309 09:27:36.240295 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f4bcabb-7f34-423b-a653-bd785eba0978-scripts" (OuterVolumeSpecName: "scripts") pod "6f4bcabb-7f34-423b-a653-bd785eba0978" (UID: "6f4bcabb-7f34-423b-a653-bd785eba0978"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:27:36 crc kubenswrapper[4792]: I0309 09:27:36.290293 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f4bcabb-7f34-423b-a653-bd785eba0978-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6f4bcabb-7f34-423b-a653-bd785eba0978" (UID: "6f4bcabb-7f34-423b-a653-bd785eba0978"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:27:36 crc kubenswrapper[4792]: I0309 09:27:36.327199 4792 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f4bcabb-7f34-423b-a653-bd785eba0978-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 09:27:36 crc kubenswrapper[4792]: I0309 09:27:36.327227 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qc6wl\" (UniqueName: \"kubernetes.io/projected/6f4bcabb-7f34-423b-a653-bd785eba0978-kube-api-access-qc6wl\") on node \"crc\" DevicePath \"\"" Mar 09 09:27:36 crc kubenswrapper[4792]: I0309 09:27:36.327239 4792 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6f4bcabb-7f34-423b-a653-bd785eba0978-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 09:27:36 crc kubenswrapper[4792]: I0309 09:27:36.327379 4792 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6f4bcabb-7f34-423b-a653-bd785eba0978-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 09 09:27:36 crc kubenswrapper[4792]: I0309 09:27:36.327398 4792 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6f4bcabb-7f34-423b-a653-bd785eba0978-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 09:27:36 crc kubenswrapper[4792]: I0309 09:27:36.354571 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/6f4bcabb-7f34-423b-a653-bd785eba0978-config-data" (OuterVolumeSpecName: "config-data") pod "6f4bcabb-7f34-423b-a653-bd785eba0978" (UID: "6f4bcabb-7f34-423b-a653-bd785eba0978"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:27:36 crc kubenswrapper[4792]: I0309 09:27:36.367488 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-gkkjx" Mar 09 09:27:36 crc kubenswrapper[4792]: I0309 09:27:36.372183 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-gkkjx" event={"ID":"6f4bcabb-7f34-423b-a653-bd785eba0978","Type":"ContainerDied","Data":"98804b54ab5b27f2e5c42cbb7b9294079232fb0e3a93fa03dcc99decffd5149e"} Mar 09 09:27:36 crc kubenswrapper[4792]: I0309 09:27:36.372234 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="98804b54ab5b27f2e5c42cbb7b9294079232fb0e3a93fa03dcc99decffd5149e" Mar 09 09:27:36 crc kubenswrapper[4792]: I0309 09:27:36.429610 4792 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f4bcabb-7f34-423b-a653-bd785eba0978-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 09:27:36 crc kubenswrapper[4792]: I0309 09:27:36.641307 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Mar 09 09:27:36 crc kubenswrapper[4792]: E0309 09:27:36.642102 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f4bcabb-7f34-423b-a653-bd785eba0978" containerName="cinder-db-sync" Mar 09 09:27:36 crc kubenswrapper[4792]: I0309 09:27:36.642124 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f4bcabb-7f34-423b-a653-bd785eba0978" containerName="cinder-db-sync" Mar 09 09:27:36 crc kubenswrapper[4792]: I0309 09:27:36.642347 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f4bcabb-7f34-423b-a653-bd785eba0978" containerName="cinder-db-sync" 
Mar 09 09:27:36 crc kubenswrapper[4792]: I0309 09:27:36.648740 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 09 09:27:36 crc kubenswrapper[4792]: I0309 09:27:36.653422 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Mar 09 09:27:36 crc kubenswrapper[4792]: I0309 09:27:36.664262 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 09 09:27:36 crc kubenswrapper[4792]: I0309 09:27:36.665434 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Mar 09 09:27:36 crc kubenswrapper[4792]: I0309 09:27:36.665719 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Mar 09 09:27:36 crc kubenswrapper[4792]: I0309 09:27:36.670469 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-spxwj" Mar 09 09:27:36 crc kubenswrapper[4792]: I0309 09:27:36.737191 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/93e32556-b0df-482d-bb59-090a953b3d26-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"93e32556-b0df-482d-bb59-090a953b3d26\") " pod="openstack/cinder-scheduler-0" Mar 09 09:27:36 crc kubenswrapper[4792]: I0309 09:27:36.737436 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/93e32556-b0df-482d-bb59-090a953b3d26-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"93e32556-b0df-482d-bb59-090a953b3d26\") " pod="openstack/cinder-scheduler-0" Mar 09 09:27:36 crc kubenswrapper[4792]: I0309 09:27:36.737701 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/93e32556-b0df-482d-bb59-090a953b3d26-config-data\") pod \"cinder-scheduler-0\" (UID: \"93e32556-b0df-482d-bb59-090a953b3d26\") " pod="openstack/cinder-scheduler-0" Mar 09 09:27:36 crc kubenswrapper[4792]: I0309 09:27:36.737860 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zxh8g\" (UniqueName: \"kubernetes.io/projected/93e32556-b0df-482d-bb59-090a953b3d26-kube-api-access-zxh8g\") pod \"cinder-scheduler-0\" (UID: \"93e32556-b0df-482d-bb59-090a953b3d26\") " pod="openstack/cinder-scheduler-0" Mar 09 09:27:36 crc kubenswrapper[4792]: I0309 09:27:36.737962 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93e32556-b0df-482d-bb59-090a953b3d26-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"93e32556-b0df-482d-bb59-090a953b3d26\") " pod="openstack/cinder-scheduler-0" Mar 09 09:27:36 crc kubenswrapper[4792]: I0309 09:27:36.738189 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/93e32556-b0df-482d-bb59-090a953b3d26-scripts\") pod \"cinder-scheduler-0\" (UID: \"93e32556-b0df-482d-bb59-090a953b3d26\") " pod="openstack/cinder-scheduler-0" Mar 09 09:27:36 crc kubenswrapper[4792]: I0309 09:27:36.761295 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-775688cbd9-v7bqq"] Mar 09 09:27:36 crc kubenswrapper[4792]: I0309 09:27:36.838127 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7675674687-7sdtd"] Mar 09 09:27:36 crc kubenswrapper[4792]: I0309 09:27:36.839713 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7675674687-7sdtd" Mar 09 09:27:36 crc kubenswrapper[4792]: I0309 09:27:36.847061 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/93e32556-b0df-482d-bb59-090a953b3d26-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"93e32556-b0df-482d-bb59-090a953b3d26\") " pod="openstack/cinder-scheduler-0" Mar 09 09:27:36 crc kubenswrapper[4792]: I0309 09:27:36.847299 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/93e32556-b0df-482d-bb59-090a953b3d26-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"93e32556-b0df-482d-bb59-090a953b3d26\") " pod="openstack/cinder-scheduler-0" Mar 09 09:27:36 crc kubenswrapper[4792]: I0309 09:27:36.847359 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93e32556-b0df-482d-bb59-090a953b3d26-config-data\") pod \"cinder-scheduler-0\" (UID: \"93e32556-b0df-482d-bb59-090a953b3d26\") " pod="openstack/cinder-scheduler-0" Mar 09 09:27:36 crc kubenswrapper[4792]: I0309 09:27:36.847404 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zxh8g\" (UniqueName: \"kubernetes.io/projected/93e32556-b0df-482d-bb59-090a953b3d26-kube-api-access-zxh8g\") pod \"cinder-scheduler-0\" (UID: \"93e32556-b0df-482d-bb59-090a953b3d26\") " pod="openstack/cinder-scheduler-0" Mar 09 09:27:36 crc kubenswrapper[4792]: I0309 09:27:36.847424 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93e32556-b0df-482d-bb59-090a953b3d26-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"93e32556-b0df-482d-bb59-090a953b3d26\") " pod="openstack/cinder-scheduler-0" Mar 09 09:27:36 crc kubenswrapper[4792]: I0309 09:27:36.847468 4792 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/93e32556-b0df-482d-bb59-090a953b3d26-scripts\") pod \"cinder-scheduler-0\" (UID: \"93e32556-b0df-482d-bb59-090a953b3d26\") " pod="openstack/cinder-scheduler-0" Mar 09 09:27:36 crc kubenswrapper[4792]: I0309 09:27:36.849396 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/93e32556-b0df-482d-bb59-090a953b3d26-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"93e32556-b0df-482d-bb59-090a953b3d26\") " pod="openstack/cinder-scheduler-0" Mar 09 09:27:36 crc kubenswrapper[4792]: I0309 09:27:36.857218 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/93e32556-b0df-482d-bb59-090a953b3d26-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"93e32556-b0df-482d-bb59-090a953b3d26\") " pod="openstack/cinder-scheduler-0" Mar 09 09:27:36 crc kubenswrapper[4792]: I0309 09:27:36.862610 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-78f7c77b76-nw94r"] Mar 09 09:27:36 crc kubenswrapper[4792]: I0309 09:27:36.871538 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/93e32556-b0df-482d-bb59-090a953b3d26-scripts\") pod \"cinder-scheduler-0\" (UID: \"93e32556-b0df-482d-bb59-090a953b3d26\") " pod="openstack/cinder-scheduler-0" Mar 09 09:27:36 crc kubenswrapper[4792]: I0309 09:27:36.886178 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93e32556-b0df-482d-bb59-090a953b3d26-config-data\") pod \"cinder-scheduler-0\" (UID: \"93e32556-b0df-482d-bb59-090a953b3d26\") " pod="openstack/cinder-scheduler-0" Mar 09 09:27:36 crc kubenswrapper[4792]: I0309 09:27:36.894657 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/dnsmasq-dns-7675674687-7sdtd"] Mar 09 09:27:36 crc kubenswrapper[4792]: I0309 09:27:36.916980 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93e32556-b0df-482d-bb59-090a953b3d26-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"93e32556-b0df-482d-bb59-090a953b3d26\") " pod="openstack/cinder-scheduler-0" Mar 09 09:27:36 crc kubenswrapper[4792]: I0309 09:27:36.921474 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zxh8g\" (UniqueName: \"kubernetes.io/projected/93e32556-b0df-482d-bb59-090a953b3d26-kube-api-access-zxh8g\") pod \"cinder-scheduler-0\" (UID: \"93e32556-b0df-482d-bb59-090a953b3d26\") " pod="openstack/cinder-scheduler-0" Mar 09 09:27:36 crc kubenswrapper[4792]: I0309 09:27:36.954563 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/143ee92a-26eb-464c-9fb5-6adbe64b31e5-ovsdbserver-sb\") pod \"dnsmasq-dns-7675674687-7sdtd\" (UID: \"143ee92a-26eb-464c-9fb5-6adbe64b31e5\") " pod="openstack/dnsmasq-dns-7675674687-7sdtd" Mar 09 09:27:36 crc kubenswrapper[4792]: I0309 09:27:36.954638 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/143ee92a-26eb-464c-9fb5-6adbe64b31e5-ovsdbserver-nb\") pod \"dnsmasq-dns-7675674687-7sdtd\" (UID: \"143ee92a-26eb-464c-9fb5-6adbe64b31e5\") " pod="openstack/dnsmasq-dns-7675674687-7sdtd" Mar 09 09:27:36 crc kubenswrapper[4792]: I0309 09:27:36.954661 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zrphw\" (UniqueName: \"kubernetes.io/projected/143ee92a-26eb-464c-9fb5-6adbe64b31e5-kube-api-access-zrphw\") pod \"dnsmasq-dns-7675674687-7sdtd\" (UID: \"143ee92a-26eb-464c-9fb5-6adbe64b31e5\") " 
pod="openstack/dnsmasq-dns-7675674687-7sdtd" Mar 09 09:27:36 crc kubenswrapper[4792]: I0309 09:27:36.954731 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/143ee92a-26eb-464c-9fb5-6adbe64b31e5-dns-svc\") pod \"dnsmasq-dns-7675674687-7sdtd\" (UID: \"143ee92a-26eb-464c-9fb5-6adbe64b31e5\") " pod="openstack/dnsmasq-dns-7675674687-7sdtd" Mar 09 09:27:36 crc kubenswrapper[4792]: I0309 09:27:36.954926 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/143ee92a-26eb-464c-9fb5-6adbe64b31e5-config\") pod \"dnsmasq-dns-7675674687-7sdtd\" (UID: \"143ee92a-26eb-464c-9fb5-6adbe64b31e5\") " pod="openstack/dnsmasq-dns-7675674687-7sdtd" Mar 09 09:27:36 crc kubenswrapper[4792]: I0309 09:27:36.987823 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Mar 09 09:27:36 crc kubenswrapper[4792]: I0309 09:27:36.994568 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 09 09:27:36 crc kubenswrapper[4792]: I0309 09:27:36.995796 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 09 09:27:37 crc kubenswrapper[4792]: I0309 09:27:37.000685 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Mar 09 09:27:37 crc kubenswrapper[4792]: I0309 09:27:37.014023 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 09 09:27:37 crc kubenswrapper[4792]: I0309 09:27:37.056411 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n2nq2\" (UniqueName: \"kubernetes.io/projected/87c80ea2-7204-4631-bb42-e3c982cf60ec-kube-api-access-n2nq2\") pod \"cinder-api-0\" (UID: \"87c80ea2-7204-4631-bb42-e3c982cf60ec\") " pod="openstack/cinder-api-0" Mar 09 09:27:37 crc kubenswrapper[4792]: I0309 09:27:37.056472 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/143ee92a-26eb-464c-9fb5-6adbe64b31e5-ovsdbserver-sb\") pod \"dnsmasq-dns-7675674687-7sdtd\" (UID: \"143ee92a-26eb-464c-9fb5-6adbe64b31e5\") " pod="openstack/dnsmasq-dns-7675674687-7sdtd" Mar 09 09:27:37 crc kubenswrapper[4792]: I0309 09:27:37.056511 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/143ee92a-26eb-464c-9fb5-6adbe64b31e5-ovsdbserver-nb\") pod \"dnsmasq-dns-7675674687-7sdtd\" (UID: \"143ee92a-26eb-464c-9fb5-6adbe64b31e5\") " pod="openstack/dnsmasq-dns-7675674687-7sdtd" Mar 09 09:27:37 crc kubenswrapper[4792]: I0309 09:27:37.056533 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zrphw\" (UniqueName: \"kubernetes.io/projected/143ee92a-26eb-464c-9fb5-6adbe64b31e5-kube-api-access-zrphw\") pod \"dnsmasq-dns-7675674687-7sdtd\" (UID: \"143ee92a-26eb-464c-9fb5-6adbe64b31e5\") " pod="openstack/dnsmasq-dns-7675674687-7sdtd" Mar 09 09:27:37 crc kubenswrapper[4792]: I0309 
09:27:37.056612 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/143ee92a-26eb-464c-9fb5-6adbe64b31e5-dns-svc\") pod \"dnsmasq-dns-7675674687-7sdtd\" (UID: \"143ee92a-26eb-464c-9fb5-6adbe64b31e5\") " pod="openstack/dnsmasq-dns-7675674687-7sdtd" Mar 09 09:27:37 crc kubenswrapper[4792]: I0309 09:27:37.056636 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87c80ea2-7204-4631-bb42-e3c982cf60ec-config-data\") pod \"cinder-api-0\" (UID: \"87c80ea2-7204-4631-bb42-e3c982cf60ec\") " pod="openstack/cinder-api-0" Mar 09 09:27:37 crc kubenswrapper[4792]: I0309 09:27:37.056661 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/87c80ea2-7204-4631-bb42-e3c982cf60ec-etc-machine-id\") pod \"cinder-api-0\" (UID: \"87c80ea2-7204-4631-bb42-e3c982cf60ec\") " pod="openstack/cinder-api-0" Mar 09 09:27:37 crc kubenswrapper[4792]: I0309 09:27:37.056690 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/87c80ea2-7204-4631-bb42-e3c982cf60ec-scripts\") pod \"cinder-api-0\" (UID: \"87c80ea2-7204-4631-bb42-e3c982cf60ec\") " pod="openstack/cinder-api-0" Mar 09 09:27:37 crc kubenswrapper[4792]: I0309 09:27:37.056708 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/87c80ea2-7204-4631-bb42-e3c982cf60ec-config-data-custom\") pod \"cinder-api-0\" (UID: \"87c80ea2-7204-4631-bb42-e3c982cf60ec\") " pod="openstack/cinder-api-0" Mar 09 09:27:37 crc kubenswrapper[4792]: I0309 09:27:37.056729 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/87c80ea2-7204-4631-bb42-e3c982cf60ec-logs\") pod \"cinder-api-0\" (UID: \"87c80ea2-7204-4631-bb42-e3c982cf60ec\") " pod="openstack/cinder-api-0" Mar 09 09:27:37 crc kubenswrapper[4792]: I0309 09:27:37.056755 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87c80ea2-7204-4631-bb42-e3c982cf60ec-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"87c80ea2-7204-4631-bb42-e3c982cf60ec\") " pod="openstack/cinder-api-0" Mar 09 09:27:37 crc kubenswrapper[4792]: I0309 09:27:37.056837 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/143ee92a-26eb-464c-9fb5-6adbe64b31e5-config\") pod \"dnsmasq-dns-7675674687-7sdtd\" (UID: \"143ee92a-26eb-464c-9fb5-6adbe64b31e5\") " pod="openstack/dnsmasq-dns-7675674687-7sdtd" Mar 09 09:27:37 crc kubenswrapper[4792]: I0309 09:27:37.057701 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/143ee92a-26eb-464c-9fb5-6adbe64b31e5-config\") pod \"dnsmasq-dns-7675674687-7sdtd\" (UID: \"143ee92a-26eb-464c-9fb5-6adbe64b31e5\") " pod="openstack/dnsmasq-dns-7675674687-7sdtd" Mar 09 09:27:37 crc kubenswrapper[4792]: I0309 09:27:37.058700 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/143ee92a-26eb-464c-9fb5-6adbe64b31e5-ovsdbserver-nb\") pod \"dnsmasq-dns-7675674687-7sdtd\" (UID: \"143ee92a-26eb-464c-9fb5-6adbe64b31e5\") " pod="openstack/dnsmasq-dns-7675674687-7sdtd" Mar 09 09:27:37 crc kubenswrapper[4792]: I0309 09:27:37.059701 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/143ee92a-26eb-464c-9fb5-6adbe64b31e5-dns-svc\") pod \"dnsmasq-dns-7675674687-7sdtd\" (UID: \"143ee92a-26eb-464c-9fb5-6adbe64b31e5\") " 
pod="openstack/dnsmasq-dns-7675674687-7sdtd" Mar 09 09:27:37 crc kubenswrapper[4792]: I0309 09:27:37.072427 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/143ee92a-26eb-464c-9fb5-6adbe64b31e5-ovsdbserver-sb\") pod \"dnsmasq-dns-7675674687-7sdtd\" (UID: \"143ee92a-26eb-464c-9fb5-6adbe64b31e5\") " pod="openstack/dnsmasq-dns-7675674687-7sdtd" Mar 09 09:27:37 crc kubenswrapper[4792]: I0309 09:27:37.087205 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zrphw\" (UniqueName: \"kubernetes.io/projected/143ee92a-26eb-464c-9fb5-6adbe64b31e5-kube-api-access-zrphw\") pod \"dnsmasq-dns-7675674687-7sdtd\" (UID: \"143ee92a-26eb-464c-9fb5-6adbe64b31e5\") " pod="openstack/dnsmasq-dns-7675674687-7sdtd" Mar 09 09:27:37 crc kubenswrapper[4792]: I0309 09:27:37.158435 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87c80ea2-7204-4631-bb42-e3c982cf60ec-config-data\") pod \"cinder-api-0\" (UID: \"87c80ea2-7204-4631-bb42-e3c982cf60ec\") " pod="openstack/cinder-api-0" Mar 09 09:27:37 crc kubenswrapper[4792]: I0309 09:27:37.158498 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/87c80ea2-7204-4631-bb42-e3c982cf60ec-etc-machine-id\") pod \"cinder-api-0\" (UID: \"87c80ea2-7204-4631-bb42-e3c982cf60ec\") " pod="openstack/cinder-api-0" Mar 09 09:27:37 crc kubenswrapper[4792]: I0309 09:27:37.158545 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/87c80ea2-7204-4631-bb42-e3c982cf60ec-scripts\") pod \"cinder-api-0\" (UID: \"87c80ea2-7204-4631-bb42-e3c982cf60ec\") " pod="openstack/cinder-api-0" Mar 09 09:27:37 crc kubenswrapper[4792]: I0309 09:27:37.158573 4792 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/87c80ea2-7204-4631-bb42-e3c982cf60ec-config-data-custom\") pod \"cinder-api-0\" (UID: \"87c80ea2-7204-4631-bb42-e3c982cf60ec\") " pod="openstack/cinder-api-0" Mar 09 09:27:37 crc kubenswrapper[4792]: I0309 09:27:37.158595 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/87c80ea2-7204-4631-bb42-e3c982cf60ec-logs\") pod \"cinder-api-0\" (UID: \"87c80ea2-7204-4631-bb42-e3c982cf60ec\") " pod="openstack/cinder-api-0" Mar 09 09:27:37 crc kubenswrapper[4792]: I0309 09:27:37.158624 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87c80ea2-7204-4631-bb42-e3c982cf60ec-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"87c80ea2-7204-4631-bb42-e3c982cf60ec\") " pod="openstack/cinder-api-0" Mar 09 09:27:37 crc kubenswrapper[4792]: I0309 09:27:37.158749 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n2nq2\" (UniqueName: \"kubernetes.io/projected/87c80ea2-7204-4631-bb42-e3c982cf60ec-kube-api-access-n2nq2\") pod \"cinder-api-0\" (UID: \"87c80ea2-7204-4631-bb42-e3c982cf60ec\") " pod="openstack/cinder-api-0" Mar 09 09:27:37 crc kubenswrapper[4792]: I0309 09:27:37.160198 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/87c80ea2-7204-4631-bb42-e3c982cf60ec-logs\") pod \"cinder-api-0\" (UID: \"87c80ea2-7204-4631-bb42-e3c982cf60ec\") " pod="openstack/cinder-api-0" Mar 09 09:27:37 crc kubenswrapper[4792]: I0309 09:27:37.160310 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/87c80ea2-7204-4631-bb42-e3c982cf60ec-etc-machine-id\") pod \"cinder-api-0\" (UID: \"87c80ea2-7204-4631-bb42-e3c982cf60ec\") " pod="openstack/cinder-api-0" Mar 09 09:27:37 crc 
kubenswrapper[4792]: I0309 09:27:37.169624 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7675674687-7sdtd" Mar 09 09:27:37 crc kubenswrapper[4792]: I0309 09:27:37.179718 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/87c80ea2-7204-4631-bb42-e3c982cf60ec-config-data-custom\") pod \"cinder-api-0\" (UID: \"87c80ea2-7204-4631-bb42-e3c982cf60ec\") " pod="openstack/cinder-api-0" Mar 09 09:27:37 crc kubenswrapper[4792]: I0309 09:27:37.180098 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87c80ea2-7204-4631-bb42-e3c982cf60ec-config-data\") pod \"cinder-api-0\" (UID: \"87c80ea2-7204-4631-bb42-e3c982cf60ec\") " pod="openstack/cinder-api-0" Mar 09 09:27:37 crc kubenswrapper[4792]: I0309 09:27:37.189419 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87c80ea2-7204-4631-bb42-e3c982cf60ec-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"87c80ea2-7204-4631-bb42-e3c982cf60ec\") " pod="openstack/cinder-api-0" Mar 09 09:27:37 crc kubenswrapper[4792]: I0309 09:27:37.193509 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/87c80ea2-7204-4631-bb42-e3c982cf60ec-scripts\") pod \"cinder-api-0\" (UID: \"87c80ea2-7204-4631-bb42-e3c982cf60ec\") " pod="openstack/cinder-api-0" Mar 09 09:27:37 crc kubenswrapper[4792]: I0309 09:27:37.207088 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n2nq2\" (UniqueName: \"kubernetes.io/projected/87c80ea2-7204-4631-bb42-e3c982cf60ec-kube-api-access-n2nq2\") pod \"cinder-api-0\" (UID: \"87c80ea2-7204-4631-bb42-e3c982cf60ec\") " pod="openstack/cinder-api-0" Mar 09 09:27:37 crc kubenswrapper[4792]: I0309 09:27:37.408910 4792 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 09 09:27:37 crc kubenswrapper[4792]: I0309 09:27:37.443044 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-78f7c77b76-nw94r" event={"ID":"2b4c9d79-a45e-457d-be41-ea8535f122c6","Type":"ContainerStarted","Data":"faea23a071f917e4ce17243e223d4dadba4f432b5f99066d418fde165dde96ba"} Mar 09 09:27:37 crc kubenswrapper[4792]: I0309 09:27:37.445703 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6f45884c58-b4trg" event={"ID":"f7122066-5687-409a-9d80-f39f2d96ad84","Type":"ContainerStarted","Data":"78b5ca83d6d730000ca59c059ee781f7662f5bb66a134e8dbfeb87a5f5b89e6e"} Mar 09 09:27:37 crc kubenswrapper[4792]: I0309 09:27:37.468820 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-68bdcc9765-czvxc" event={"ID":"7997db4c-9ed8-438f-86b3-558a6ed2be44","Type":"ContainerStarted","Data":"b5449eb2a8662a82ff06488b6b13042f40275f4afc514c9daf327aa5e2d0e663"} Mar 09 09:27:37 crc kubenswrapper[4792]: I0309 09:27:37.468868 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-68bdcc9765-czvxc" event={"ID":"7997db4c-9ed8-438f-86b3-558a6ed2be44","Type":"ContainerStarted","Data":"8cb1dffa7fa127083c01df4970052a138e0a23b22c541298c9702274c2b1bd29"} Mar 09 09:27:37 crc kubenswrapper[4792]: I0309 09:27:37.469033 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-775688cbd9-v7bqq" podUID="cb47898d-02c0-4123-8d41-eb2b31580531" containerName="dnsmasq-dns" containerID="cri-o://2d2169367019d7143c545a41fee348f30f2a6cab68ec66d87d9e6f68988b3bb2" gracePeriod=10 Mar 09 09:27:37 crc kubenswrapper[4792]: I0309 09:27:37.502449 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-68bdcc9765-czvxc" podStartSLOduration=3.164764979 podStartE2EDuration="5.502425736s" 
podCreationTimestamp="2026-03-09 09:27:32 +0000 UTC" firstStartedPulling="2026-03-09 09:27:33.714904647 +0000 UTC m=+1218.745105399" lastFinishedPulling="2026-03-09 09:27:36.052565404 +0000 UTC m=+1221.082766156" observedRunningTime="2026-03-09 09:27:37.496530545 +0000 UTC m=+1222.526731307" watchObservedRunningTime="2026-03-09 09:27:37.502425736 +0000 UTC m=+1222.532626488" Mar 09 09:27:37 crc kubenswrapper[4792]: I0309 09:27:37.752786 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 09 09:27:38 crc kubenswrapper[4792]: I0309 09:27:38.074494 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7675674687-7sdtd"] Mar 09 09:27:38 crc kubenswrapper[4792]: I0309 09:27:38.126586 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 09 09:27:38 crc kubenswrapper[4792]: I0309 09:27:38.264252 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-775688cbd9-v7bqq" Mar 09 09:27:38 crc kubenswrapper[4792]: I0309 09:27:38.296670 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cb47898d-02c0-4123-8d41-eb2b31580531-ovsdbserver-sb\") pod \"cb47898d-02c0-4123-8d41-eb2b31580531\" (UID: \"cb47898d-02c0-4123-8d41-eb2b31580531\") " Mar 09 09:27:38 crc kubenswrapper[4792]: I0309 09:27:38.296750 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ch2h\" (UniqueName: \"kubernetes.io/projected/cb47898d-02c0-4123-8d41-eb2b31580531-kube-api-access-6ch2h\") pod \"cb47898d-02c0-4123-8d41-eb2b31580531\" (UID: \"cb47898d-02c0-4123-8d41-eb2b31580531\") " Mar 09 09:27:38 crc kubenswrapper[4792]: I0309 09:27:38.296861 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/cb47898d-02c0-4123-8d41-eb2b31580531-ovsdbserver-nb\") pod \"cb47898d-02c0-4123-8d41-eb2b31580531\" (UID: \"cb47898d-02c0-4123-8d41-eb2b31580531\") " Mar 09 09:27:38 crc kubenswrapper[4792]: I0309 09:27:38.296894 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cb47898d-02c0-4123-8d41-eb2b31580531-dns-svc\") pod \"cb47898d-02c0-4123-8d41-eb2b31580531\" (UID: \"cb47898d-02c0-4123-8d41-eb2b31580531\") " Mar 09 09:27:38 crc kubenswrapper[4792]: I0309 09:27:38.296943 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cb47898d-02c0-4123-8d41-eb2b31580531-config\") pod \"cb47898d-02c0-4123-8d41-eb2b31580531\" (UID: \"cb47898d-02c0-4123-8d41-eb2b31580531\") " Mar 09 09:27:38 crc kubenswrapper[4792]: I0309 09:27:38.318757 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb47898d-02c0-4123-8d41-eb2b31580531-kube-api-access-6ch2h" (OuterVolumeSpecName: "kube-api-access-6ch2h") pod "cb47898d-02c0-4123-8d41-eb2b31580531" (UID: "cb47898d-02c0-4123-8d41-eb2b31580531"). InnerVolumeSpecName "kube-api-access-6ch2h". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:27:38 crc kubenswrapper[4792]: I0309 09:27:38.407630 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ch2h\" (UniqueName: \"kubernetes.io/projected/cb47898d-02c0-4123-8d41-eb2b31580531-kube-api-access-6ch2h\") on node \"crc\" DevicePath \"\"" Mar 09 09:27:38 crc kubenswrapper[4792]: I0309 09:27:38.480838 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cb47898d-02c0-4123-8d41-eb2b31580531-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "cb47898d-02c0-4123-8d41-eb2b31580531" (UID: "cb47898d-02c0-4123-8d41-eb2b31580531"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:27:38 crc kubenswrapper[4792]: I0309 09:27:38.487645 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cb47898d-02c0-4123-8d41-eb2b31580531-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "cb47898d-02c0-4123-8d41-eb2b31580531" (UID: "cb47898d-02c0-4123-8d41-eb2b31580531"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:27:38 crc kubenswrapper[4792]: I0309 09:27:38.509218 4792 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cb47898d-02c0-4123-8d41-eb2b31580531-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 09 09:27:38 crc kubenswrapper[4792]: I0309 09:27:38.509251 4792 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cb47898d-02c0-4123-8d41-eb2b31580531-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 09 09:27:38 crc kubenswrapper[4792]: I0309 09:27:38.509726 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"93e32556-b0df-482d-bb59-090a953b3d26","Type":"ContainerStarted","Data":"31477fabccbc6a7118d828a23d24ba7eabc685103cc5853239a3bd4ff50cda55"} Mar 09 09:27:38 crc kubenswrapper[4792]: I0309 09:27:38.510278 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cb47898d-02c0-4123-8d41-eb2b31580531-config" (OuterVolumeSpecName: "config") pod "cb47898d-02c0-4123-8d41-eb2b31580531" (UID: "cb47898d-02c0-4123-8d41-eb2b31580531"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:27:38 crc kubenswrapper[4792]: I0309 09:27:38.550129 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6f45884c58-b4trg" event={"ID":"f7122066-5687-409a-9d80-f39f2d96ad84","Type":"ContainerStarted","Data":"cf04bc04121dd853a64074842a56fe32d5e8776ad285cb5796ea4860740a26d8"} Mar 09 09:27:38 crc kubenswrapper[4792]: I0309 09:27:38.551459 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cb47898d-02c0-4123-8d41-eb2b31580531-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "cb47898d-02c0-4123-8d41-eb2b31580531" (UID: "cb47898d-02c0-4123-8d41-eb2b31580531"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:27:38 crc kubenswrapper[4792]: I0309 09:27:38.565351 4792 generic.go:334] "Generic (PLEG): container finished" podID="cb47898d-02c0-4123-8d41-eb2b31580531" containerID="2d2169367019d7143c545a41fee348f30f2a6cab68ec66d87d9e6f68988b3bb2" exitCode=0 Mar 09 09:27:38 crc kubenswrapper[4792]: I0309 09:27:38.565487 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-775688cbd9-v7bqq" Mar 09 09:27:38 crc kubenswrapper[4792]: I0309 09:27:38.565914 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-775688cbd9-v7bqq" event={"ID":"cb47898d-02c0-4123-8d41-eb2b31580531","Type":"ContainerDied","Data":"2d2169367019d7143c545a41fee348f30f2a6cab68ec66d87d9e6f68988b3bb2"} Mar 09 09:27:38 crc kubenswrapper[4792]: I0309 09:27:38.565953 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-775688cbd9-v7bqq" event={"ID":"cb47898d-02c0-4123-8d41-eb2b31580531","Type":"ContainerDied","Data":"f47a67104bd98e21fb8199f6e90303a4bce36b8fa7139f91774135af32714e2d"} Mar 09 09:27:38 crc kubenswrapper[4792]: I0309 09:27:38.566021 4792 scope.go:117] "RemoveContainer" containerID="2d2169367019d7143c545a41fee348f30f2a6cab68ec66d87d9e6f68988b3bb2" Mar 09 09:27:38 crc kubenswrapper[4792]: I0309 09:27:38.569421 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"87c80ea2-7204-4631-bb42-e3c982cf60ec","Type":"ContainerStarted","Data":"06c08886cebb86a1d873574141539ff7676fa63690e3e9b5262b947b7628c52a"} Mar 09 09:27:38 crc kubenswrapper[4792]: I0309 09:27:38.583002 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-6f45884c58-b4trg" podStartSLOduration=4.111597237 podStartE2EDuration="6.582981037s" podCreationTimestamp="2026-03-09 09:27:32 +0000 UTC" firstStartedPulling="2026-03-09 09:27:33.570911026 +0000 UTC m=+1218.601111768" lastFinishedPulling="2026-03-09 09:27:36.042294816 +0000 UTC m=+1221.072495568" observedRunningTime="2026-03-09 09:27:38.580712221 +0000 UTC m=+1223.610912993" watchObservedRunningTime="2026-03-09 09:27:38.582981037 +0000 UTC m=+1223.613181789" Mar 09 09:27:38 crc kubenswrapper[4792]: I0309 09:27:38.594559 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-78f7c77b76-nw94r" 
event={"ID":"2b4c9d79-a45e-457d-be41-ea8535f122c6","Type":"ContainerStarted","Data":"1344d2eeebbb48d9998fa3bfa7e980822cd3e8ed41d3de3486e848b568254b60"} Mar 09 09:27:38 crc kubenswrapper[4792]: I0309 09:27:38.594600 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-78f7c77b76-nw94r" Mar 09 09:27:38 crc kubenswrapper[4792]: I0309 09:27:38.594610 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-78f7c77b76-nw94r" event={"ID":"2b4c9d79-a45e-457d-be41-ea8535f122c6","Type":"ContainerStarted","Data":"7b44dd62f7e052454b96b2527361b09805c4d1225fde2bde64b83b7a52766292"} Mar 09 09:27:38 crc kubenswrapper[4792]: I0309 09:27:38.594631 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-78f7c77b76-nw94r" Mar 09 09:27:38 crc kubenswrapper[4792]: I0309 09:27:38.620527 4792 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cb47898d-02c0-4123-8d41-eb2b31580531-config\") on node \"crc\" DevicePath \"\"" Mar 09 09:27:38 crc kubenswrapper[4792]: I0309 09:27:38.621377 4792 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cb47898d-02c0-4123-8d41-eb2b31580531-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 09 09:27:38 crc kubenswrapper[4792]: I0309 09:27:38.645052 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7675674687-7sdtd" event={"ID":"143ee92a-26eb-464c-9fb5-6adbe64b31e5","Type":"ContainerStarted","Data":"ca765f0ed135e4ed5f385004a74f853f3b9ce90e6af2993f95ebab0d16c1af36"} Mar 09 09:27:38 crc kubenswrapper[4792]: I0309 09:27:38.682710 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-78f7c77b76-nw94r" podStartSLOduration=3.6826834809999998 podStartE2EDuration="3.682683481s" podCreationTimestamp="2026-03-09 09:27:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 
+0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:27:38.640854847 +0000 UTC m=+1223.671055609" watchObservedRunningTime="2026-03-09 09:27:38.682683481 +0000 UTC m=+1223.712884313" Mar 09 09:27:38 crc kubenswrapper[4792]: I0309 09:27:38.703764 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-775688cbd9-v7bqq"] Mar 09 09:27:38 crc kubenswrapper[4792]: I0309 09:27:38.712451 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-775688cbd9-v7bqq"] Mar 09 09:27:38 crc kubenswrapper[4792]: I0309 09:27:38.730239 4792 scope.go:117] "RemoveContainer" containerID="a3bc4033441b0ab4f53c8221377d25e47a630587ba8eb39d163c2dc3b42a6556" Mar 09 09:27:38 crc kubenswrapper[4792]: I0309 09:27:38.820214 4792 scope.go:117] "RemoveContainer" containerID="2d2169367019d7143c545a41fee348f30f2a6cab68ec66d87d9e6f68988b3bb2" Mar 09 09:27:38 crc kubenswrapper[4792]: E0309 09:27:38.822982 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2d2169367019d7143c545a41fee348f30f2a6cab68ec66d87d9e6f68988b3bb2\": container with ID starting with 2d2169367019d7143c545a41fee348f30f2a6cab68ec66d87d9e6f68988b3bb2 not found: ID does not exist" containerID="2d2169367019d7143c545a41fee348f30f2a6cab68ec66d87d9e6f68988b3bb2" Mar 09 09:27:38 crc kubenswrapper[4792]: I0309 09:27:38.823035 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d2169367019d7143c545a41fee348f30f2a6cab68ec66d87d9e6f68988b3bb2"} err="failed to get container status \"2d2169367019d7143c545a41fee348f30f2a6cab68ec66d87d9e6f68988b3bb2\": rpc error: code = NotFound desc = could not find container \"2d2169367019d7143c545a41fee348f30f2a6cab68ec66d87d9e6f68988b3bb2\": container with ID starting with 2d2169367019d7143c545a41fee348f30f2a6cab68ec66d87d9e6f68988b3bb2 not found: ID does not exist" Mar 09 09:27:38 crc 
kubenswrapper[4792]: I0309 09:27:38.823083 4792 scope.go:117] "RemoveContainer" containerID="a3bc4033441b0ab4f53c8221377d25e47a630587ba8eb39d163c2dc3b42a6556" Mar 09 09:27:38 crc kubenswrapper[4792]: E0309 09:27:38.833261 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a3bc4033441b0ab4f53c8221377d25e47a630587ba8eb39d163c2dc3b42a6556\": container with ID starting with a3bc4033441b0ab4f53c8221377d25e47a630587ba8eb39d163c2dc3b42a6556 not found: ID does not exist" containerID="a3bc4033441b0ab4f53c8221377d25e47a630587ba8eb39d163c2dc3b42a6556" Mar 09 09:27:38 crc kubenswrapper[4792]: I0309 09:27:38.833328 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a3bc4033441b0ab4f53c8221377d25e47a630587ba8eb39d163c2dc3b42a6556"} err="failed to get container status \"a3bc4033441b0ab4f53c8221377d25e47a630587ba8eb39d163c2dc3b42a6556\": rpc error: code = NotFound desc = could not find container \"a3bc4033441b0ab4f53c8221377d25e47a630587ba8eb39d163c2dc3b42a6556\": container with ID starting with a3bc4033441b0ab4f53c8221377d25e47a630587ba8eb39d163c2dc3b42a6556 not found: ID does not exist" Mar 09 09:27:39 crc kubenswrapper[4792]: I0309 09:27:39.332292 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 09 09:27:39 crc kubenswrapper[4792]: I0309 09:27:39.657557 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"87c80ea2-7204-4631-bb42-e3c982cf60ec","Type":"ContainerStarted","Data":"ca0de711b96f591bfb673b34381a3f538c711de3102cb5f7e1e395aa6b7710b2"} Mar 09 09:27:39 crc kubenswrapper[4792]: I0309 09:27:39.669764 4792 generic.go:334] "Generic (PLEG): container finished" podID="143ee92a-26eb-464c-9fb5-6adbe64b31e5" containerID="19f9875412ad1d0ad9ceca08c40385abe736b591a5fbd07cc2f258fe77b56361" exitCode=0 Mar 09 09:27:39 crc kubenswrapper[4792]: I0309 09:27:39.682948 4792 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cb47898d-02c0-4123-8d41-eb2b31580531" path="/var/lib/kubelet/pods/cb47898d-02c0-4123-8d41-eb2b31580531/volumes" Mar 09 09:27:39 crc kubenswrapper[4792]: I0309 09:27:39.684051 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7675674687-7sdtd" event={"ID":"143ee92a-26eb-464c-9fb5-6adbe64b31e5","Type":"ContainerDied","Data":"19f9875412ad1d0ad9ceca08c40385abe736b591a5fbd07cc2f258fe77b56361"} Mar 09 09:27:40 crc kubenswrapper[4792]: I0309 09:27:40.704742 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7675674687-7sdtd" event={"ID":"143ee92a-26eb-464c-9fb5-6adbe64b31e5","Type":"ContainerStarted","Data":"406149531b4ee4071941618d21a0e78eaaf69465357a64e61e63f6fdb4ca46e9"} Mar 09 09:27:40 crc kubenswrapper[4792]: I0309 09:27:40.706269 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7675674687-7sdtd" Mar 09 09:27:40 crc kubenswrapper[4792]: I0309 09:27:40.725143 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"93e32556-b0df-482d-bb59-090a953b3d26","Type":"ContainerStarted","Data":"a446a75161f526f6a75a533bae01b5f70787846e0c11e6cf9d1ef1fe07705ea0"} Mar 09 09:27:40 crc kubenswrapper[4792]: I0309 09:27:40.738642 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7675674687-7sdtd" podStartSLOduration=4.73862083 podStartE2EDuration="4.73862083s" podCreationTimestamp="2026-03-09 09:27:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:27:40.730634548 +0000 UTC m=+1225.760835320" watchObservedRunningTime="2026-03-09 09:27:40.73862083 +0000 UTC m=+1225.768821582" Mar 09 09:27:41 crc kubenswrapper[4792]: I0309 09:27:41.738666 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/cinder-api-0" event={"ID":"87c80ea2-7204-4631-bb42-e3c982cf60ec","Type":"ContainerStarted","Data":"6e672f003741e51916f686f13532cb7a58bc7b864565012e477a1b2a8c638469"} Mar 09 09:27:41 crc kubenswrapper[4792]: I0309 09:27:41.739082 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Mar 09 09:27:41 crc kubenswrapper[4792]: I0309 09:27:41.739100 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="87c80ea2-7204-4631-bb42-e3c982cf60ec" containerName="cinder-api-log" containerID="cri-o://ca0de711b96f591bfb673b34381a3f538c711de3102cb5f7e1e395aa6b7710b2" gracePeriod=30 Mar 09 09:27:41 crc kubenswrapper[4792]: I0309 09:27:41.739234 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="87c80ea2-7204-4631-bb42-e3c982cf60ec" containerName="cinder-api" containerID="cri-o://6e672f003741e51916f686f13532cb7a58bc7b864565012e477a1b2a8c638469" gracePeriod=30 Mar 09 09:27:41 crc kubenswrapper[4792]: I0309 09:27:41.744393 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"93e32556-b0df-482d-bb59-090a953b3d26","Type":"ContainerStarted","Data":"536c1ede202e7025a154fd71d16278b1f2471d11ca7ff4e2abc2527c2d757086"} Mar 09 09:27:41 crc kubenswrapper[4792]: I0309 09:27:41.777176 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=5.77715995 podStartE2EDuration="5.77715995s" podCreationTimestamp="2026-03-09 09:27:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:27:41.776758169 +0000 UTC m=+1226.806958921" watchObservedRunningTime="2026-03-09 09:27:41.77715995 +0000 UTC m=+1226.807360702" Mar 09 09:27:41 crc kubenswrapper[4792]: I0309 09:27:41.801224 4792 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=4.527268073 podStartE2EDuration="5.801061094s" podCreationTimestamp="2026-03-09 09:27:36 +0000 UTC" firstStartedPulling="2026-03-09 09:27:37.804463075 +0000 UTC m=+1222.834663827" lastFinishedPulling="2026-03-09 09:27:39.078256096 +0000 UTC m=+1224.108456848" observedRunningTime="2026-03-09 09:27:41.800413946 +0000 UTC m=+1226.830614698" watchObservedRunningTime="2026-03-09 09:27:41.801061094 +0000 UTC m=+1226.831261846" Mar 09 09:27:41 crc kubenswrapper[4792]: I0309 09:27:41.996850 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Mar 09 09:27:42 crc kubenswrapper[4792]: I0309 09:27:42.472957 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 09 09:27:42 crc kubenswrapper[4792]: I0309 09:27:42.632017 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/87c80ea2-7204-4631-bb42-e3c982cf60ec-scripts\") pod \"87c80ea2-7204-4631-bb42-e3c982cf60ec\" (UID: \"87c80ea2-7204-4631-bb42-e3c982cf60ec\") " Mar 09 09:27:42 crc kubenswrapper[4792]: I0309 09:27:42.632602 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/87c80ea2-7204-4631-bb42-e3c982cf60ec-etc-machine-id\") pod \"87c80ea2-7204-4631-bb42-e3c982cf60ec\" (UID: \"87c80ea2-7204-4631-bb42-e3c982cf60ec\") " Mar 09 09:27:42 crc kubenswrapper[4792]: I0309 09:27:42.632697 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n2nq2\" (UniqueName: \"kubernetes.io/projected/87c80ea2-7204-4631-bb42-e3c982cf60ec-kube-api-access-n2nq2\") pod \"87c80ea2-7204-4631-bb42-e3c982cf60ec\" (UID: \"87c80ea2-7204-4631-bb42-e3c982cf60ec\") " Mar 09 09:27:42 crc kubenswrapper[4792]: I0309 09:27:42.632722 4792 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/87c80ea2-7204-4631-bb42-e3c982cf60ec-config-data-custom\") pod \"87c80ea2-7204-4631-bb42-e3c982cf60ec\" (UID: \"87c80ea2-7204-4631-bb42-e3c982cf60ec\") " Mar 09 09:27:42 crc kubenswrapper[4792]: I0309 09:27:42.632761 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87c80ea2-7204-4631-bb42-e3c982cf60ec-combined-ca-bundle\") pod \"87c80ea2-7204-4631-bb42-e3c982cf60ec\" (UID: \"87c80ea2-7204-4631-bb42-e3c982cf60ec\") " Mar 09 09:27:42 crc kubenswrapper[4792]: I0309 09:27:42.632788 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87c80ea2-7204-4631-bb42-e3c982cf60ec-config-data\") pod \"87c80ea2-7204-4631-bb42-e3c982cf60ec\" (UID: \"87c80ea2-7204-4631-bb42-e3c982cf60ec\") " Mar 09 09:27:42 crc kubenswrapper[4792]: I0309 09:27:42.632833 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/87c80ea2-7204-4631-bb42-e3c982cf60ec-logs\") pod \"87c80ea2-7204-4631-bb42-e3c982cf60ec\" (UID: \"87c80ea2-7204-4631-bb42-e3c982cf60ec\") " Mar 09 09:27:42 crc kubenswrapper[4792]: I0309 09:27:42.634000 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/87c80ea2-7204-4631-bb42-e3c982cf60ec-logs" (OuterVolumeSpecName: "logs") pod "87c80ea2-7204-4631-bb42-e3c982cf60ec" (UID: "87c80ea2-7204-4631-bb42-e3c982cf60ec"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:27:42 crc kubenswrapper[4792]: I0309 09:27:42.634804 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/87c80ea2-7204-4631-bb42-e3c982cf60ec-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "87c80ea2-7204-4631-bb42-e3c982cf60ec" (UID: "87c80ea2-7204-4631-bb42-e3c982cf60ec"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 09:27:42 crc kubenswrapper[4792]: I0309 09:27:42.645200 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87c80ea2-7204-4631-bb42-e3c982cf60ec-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "87c80ea2-7204-4631-bb42-e3c982cf60ec" (UID: "87c80ea2-7204-4631-bb42-e3c982cf60ec"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:27:42 crc kubenswrapper[4792]: I0309 09:27:42.652841 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87c80ea2-7204-4631-bb42-e3c982cf60ec-scripts" (OuterVolumeSpecName: "scripts") pod "87c80ea2-7204-4631-bb42-e3c982cf60ec" (UID: "87c80ea2-7204-4631-bb42-e3c982cf60ec"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:27:42 crc kubenswrapper[4792]: I0309 09:27:42.653317 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87c80ea2-7204-4631-bb42-e3c982cf60ec-kube-api-access-n2nq2" (OuterVolumeSpecName: "kube-api-access-n2nq2") pod "87c80ea2-7204-4631-bb42-e3c982cf60ec" (UID: "87c80ea2-7204-4631-bb42-e3c982cf60ec"). InnerVolumeSpecName "kube-api-access-n2nq2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:27:42 crc kubenswrapper[4792]: I0309 09:27:42.696359 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87c80ea2-7204-4631-bb42-e3c982cf60ec-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "87c80ea2-7204-4631-bb42-e3c982cf60ec" (UID: "87c80ea2-7204-4631-bb42-e3c982cf60ec"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:27:42 crc kubenswrapper[4792]: I0309 09:27:42.701298 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87c80ea2-7204-4631-bb42-e3c982cf60ec-config-data" (OuterVolumeSpecName: "config-data") pod "87c80ea2-7204-4631-bb42-e3c982cf60ec" (UID: "87c80ea2-7204-4631-bb42-e3c982cf60ec"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:27:42 crc kubenswrapper[4792]: I0309 09:27:42.735378 4792 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87c80ea2-7204-4631-bb42-e3c982cf60ec-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 09:27:42 crc kubenswrapper[4792]: I0309 09:27:42.735615 4792 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87c80ea2-7204-4631-bb42-e3c982cf60ec-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 09:27:42 crc kubenswrapper[4792]: I0309 09:27:42.735693 4792 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/87c80ea2-7204-4631-bb42-e3c982cf60ec-logs\") on node \"crc\" DevicePath \"\"" Mar 09 09:27:42 crc kubenswrapper[4792]: I0309 09:27:42.735750 4792 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/87c80ea2-7204-4631-bb42-e3c982cf60ec-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 09:27:42 crc 
kubenswrapper[4792]: I0309 09:27:42.735811 4792 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/87c80ea2-7204-4631-bb42-e3c982cf60ec-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 09 09:27:42 crc kubenswrapper[4792]: I0309 09:27:42.735866 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n2nq2\" (UniqueName: \"kubernetes.io/projected/87c80ea2-7204-4631-bb42-e3c982cf60ec-kube-api-access-n2nq2\") on node \"crc\" DevicePath \"\"" Mar 09 09:27:42 crc kubenswrapper[4792]: I0309 09:27:42.735918 4792 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/87c80ea2-7204-4631-bb42-e3c982cf60ec-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 09 09:27:42 crc kubenswrapper[4792]: I0309 09:27:42.756582 4792 generic.go:334] "Generic (PLEG): container finished" podID="87c80ea2-7204-4631-bb42-e3c982cf60ec" containerID="6e672f003741e51916f686f13532cb7a58bc7b864565012e477a1b2a8c638469" exitCode=0 Mar 09 09:27:42 crc kubenswrapper[4792]: I0309 09:27:42.756614 4792 generic.go:334] "Generic (PLEG): container finished" podID="87c80ea2-7204-4631-bb42-e3c982cf60ec" containerID="ca0de711b96f591bfb673b34381a3f538c711de3102cb5f7e1e395aa6b7710b2" exitCode=143 Mar 09 09:27:42 crc kubenswrapper[4792]: I0309 09:27:42.757634 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 09 09:27:42 crc kubenswrapper[4792]: I0309 09:27:42.759596 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"87c80ea2-7204-4631-bb42-e3c982cf60ec","Type":"ContainerDied","Data":"6e672f003741e51916f686f13532cb7a58bc7b864565012e477a1b2a8c638469"} Mar 09 09:27:42 crc kubenswrapper[4792]: I0309 09:27:42.759646 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"87c80ea2-7204-4631-bb42-e3c982cf60ec","Type":"ContainerDied","Data":"ca0de711b96f591bfb673b34381a3f538c711de3102cb5f7e1e395aa6b7710b2"} Mar 09 09:27:42 crc kubenswrapper[4792]: I0309 09:27:42.759661 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"87c80ea2-7204-4631-bb42-e3c982cf60ec","Type":"ContainerDied","Data":"06c08886cebb86a1d873574141539ff7676fa63690e3e9b5262b947b7628c52a"} Mar 09 09:27:42 crc kubenswrapper[4792]: I0309 09:27:42.759679 4792 scope.go:117] "RemoveContainer" containerID="6e672f003741e51916f686f13532cb7a58bc7b864565012e477a1b2a8c638469" Mar 09 09:27:42 crc kubenswrapper[4792]: I0309 09:27:42.800379 4792 scope.go:117] "RemoveContainer" containerID="ca0de711b96f591bfb673b34381a3f538c711de3102cb5f7e1e395aa6b7710b2" Mar 09 09:27:42 crc kubenswrapper[4792]: I0309 09:27:42.832501 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 09 09:27:42 crc kubenswrapper[4792]: I0309 09:27:42.857287 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Mar 09 09:27:42 crc kubenswrapper[4792]: I0309 09:27:42.866560 4792 scope.go:117] "RemoveContainer" containerID="6e672f003741e51916f686f13532cb7a58bc7b864565012e477a1b2a8c638469" Mar 09 09:27:42 crc kubenswrapper[4792]: E0309 09:27:42.867000 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"6e672f003741e51916f686f13532cb7a58bc7b864565012e477a1b2a8c638469\": container with ID starting with 6e672f003741e51916f686f13532cb7a58bc7b864565012e477a1b2a8c638469 not found: ID does not exist" containerID="6e672f003741e51916f686f13532cb7a58bc7b864565012e477a1b2a8c638469" Mar 09 09:27:42 crc kubenswrapper[4792]: I0309 09:27:42.867030 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e672f003741e51916f686f13532cb7a58bc7b864565012e477a1b2a8c638469"} err="failed to get container status \"6e672f003741e51916f686f13532cb7a58bc7b864565012e477a1b2a8c638469\": rpc error: code = NotFound desc = could not find container \"6e672f003741e51916f686f13532cb7a58bc7b864565012e477a1b2a8c638469\": container with ID starting with 6e672f003741e51916f686f13532cb7a58bc7b864565012e477a1b2a8c638469 not found: ID does not exist" Mar 09 09:27:42 crc kubenswrapper[4792]: I0309 09:27:42.867052 4792 scope.go:117] "RemoveContainer" containerID="ca0de711b96f591bfb673b34381a3f538c711de3102cb5f7e1e395aa6b7710b2" Mar 09 09:27:42 crc kubenswrapper[4792]: E0309 09:27:42.867424 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ca0de711b96f591bfb673b34381a3f538c711de3102cb5f7e1e395aa6b7710b2\": container with ID starting with ca0de711b96f591bfb673b34381a3f538c711de3102cb5f7e1e395aa6b7710b2 not found: ID does not exist" containerID="ca0de711b96f591bfb673b34381a3f538c711de3102cb5f7e1e395aa6b7710b2" Mar 09 09:27:42 crc kubenswrapper[4792]: I0309 09:27:42.867446 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca0de711b96f591bfb673b34381a3f538c711de3102cb5f7e1e395aa6b7710b2"} err="failed to get container status \"ca0de711b96f591bfb673b34381a3f538c711de3102cb5f7e1e395aa6b7710b2\": rpc error: code = NotFound desc = could not find container \"ca0de711b96f591bfb673b34381a3f538c711de3102cb5f7e1e395aa6b7710b2\": container with ID 
starting with ca0de711b96f591bfb673b34381a3f538c711de3102cb5f7e1e395aa6b7710b2 not found: ID does not exist" Mar 09 09:27:42 crc kubenswrapper[4792]: I0309 09:27:42.867462 4792 scope.go:117] "RemoveContainer" containerID="6e672f003741e51916f686f13532cb7a58bc7b864565012e477a1b2a8c638469" Mar 09 09:27:42 crc kubenswrapper[4792]: I0309 09:27:42.867804 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e672f003741e51916f686f13532cb7a58bc7b864565012e477a1b2a8c638469"} err="failed to get container status \"6e672f003741e51916f686f13532cb7a58bc7b864565012e477a1b2a8c638469\": rpc error: code = NotFound desc = could not find container \"6e672f003741e51916f686f13532cb7a58bc7b864565012e477a1b2a8c638469\": container with ID starting with 6e672f003741e51916f686f13532cb7a58bc7b864565012e477a1b2a8c638469 not found: ID does not exist" Mar 09 09:27:42 crc kubenswrapper[4792]: I0309 09:27:42.867826 4792 scope.go:117] "RemoveContainer" containerID="ca0de711b96f591bfb673b34381a3f538c711de3102cb5f7e1e395aa6b7710b2" Mar 09 09:27:42 crc kubenswrapper[4792]: I0309 09:27:42.868304 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca0de711b96f591bfb673b34381a3f538c711de3102cb5f7e1e395aa6b7710b2"} err="failed to get container status \"ca0de711b96f591bfb673b34381a3f538c711de3102cb5f7e1e395aa6b7710b2\": rpc error: code = NotFound desc = could not find container \"ca0de711b96f591bfb673b34381a3f538c711de3102cb5f7e1e395aa6b7710b2\": container with ID starting with ca0de711b96f591bfb673b34381a3f538c711de3102cb5f7e1e395aa6b7710b2 not found: ID does not exist" Mar 09 09:27:42 crc kubenswrapper[4792]: I0309 09:27:42.886914 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Mar 09 09:27:42 crc kubenswrapper[4792]: E0309 09:27:42.887404 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb47898d-02c0-4123-8d41-eb2b31580531" containerName="init" Mar 09 
09:27:42 crc kubenswrapper[4792]: I0309 09:27:42.887429 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb47898d-02c0-4123-8d41-eb2b31580531" containerName="init" Mar 09 09:27:42 crc kubenswrapper[4792]: E0309 09:27:42.887464 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb47898d-02c0-4123-8d41-eb2b31580531" containerName="dnsmasq-dns" Mar 09 09:27:42 crc kubenswrapper[4792]: I0309 09:27:42.887475 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb47898d-02c0-4123-8d41-eb2b31580531" containerName="dnsmasq-dns" Mar 09 09:27:42 crc kubenswrapper[4792]: E0309 09:27:42.887494 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87c80ea2-7204-4631-bb42-e3c982cf60ec" containerName="cinder-api" Mar 09 09:27:42 crc kubenswrapper[4792]: I0309 09:27:42.887502 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="87c80ea2-7204-4631-bb42-e3c982cf60ec" containerName="cinder-api" Mar 09 09:27:42 crc kubenswrapper[4792]: E0309 09:27:42.887519 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87c80ea2-7204-4631-bb42-e3c982cf60ec" containerName="cinder-api-log" Mar 09 09:27:42 crc kubenswrapper[4792]: I0309 09:27:42.887527 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="87c80ea2-7204-4631-bb42-e3c982cf60ec" containerName="cinder-api-log" Mar 09 09:27:42 crc kubenswrapper[4792]: I0309 09:27:42.887768 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb47898d-02c0-4123-8d41-eb2b31580531" containerName="dnsmasq-dns" Mar 09 09:27:42 crc kubenswrapper[4792]: I0309 09:27:42.887806 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="87c80ea2-7204-4631-bb42-e3c982cf60ec" containerName="cinder-api-log" Mar 09 09:27:42 crc kubenswrapper[4792]: I0309 09:27:42.887821 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="87c80ea2-7204-4631-bb42-e3c982cf60ec" containerName="cinder-api" Mar 09 09:27:42 crc kubenswrapper[4792]: I0309 
09:27:42.888972 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 09 09:27:42 crc kubenswrapper[4792]: I0309 09:27:42.893401 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Mar 09 09:27:42 crc kubenswrapper[4792]: I0309 09:27:42.911844 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 09 09:27:42 crc kubenswrapper[4792]: I0309 09:27:42.912876 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Mar 09 09:27:42 crc kubenswrapper[4792]: I0309 09:27:42.913149 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Mar 09 09:27:42 crc kubenswrapper[4792]: I0309 09:27:42.954202 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d0b2c9c1-464f-4058-aa55-ce041668d8a2-config-data\") pod \"cinder-api-0\" (UID: \"d0b2c9c1-464f-4058-aa55-ce041668d8a2\") " pod="openstack/cinder-api-0" Mar 09 09:27:42 crc kubenswrapper[4792]: I0309 09:27:42.954285 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d0b2c9c1-464f-4058-aa55-ce041668d8a2-config-data-custom\") pod \"cinder-api-0\" (UID: \"d0b2c9c1-464f-4058-aa55-ce041668d8a2\") " pod="openstack/cinder-api-0" Mar 09 09:27:42 crc kubenswrapper[4792]: I0309 09:27:42.954306 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-js5xp\" (UniqueName: \"kubernetes.io/projected/d0b2c9c1-464f-4058-aa55-ce041668d8a2-kube-api-access-js5xp\") pod \"cinder-api-0\" (UID: \"d0b2c9c1-464f-4058-aa55-ce041668d8a2\") " pod="openstack/cinder-api-0" Mar 09 09:27:42 crc kubenswrapper[4792]: I0309 09:27:42.954365 4792 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d0b2c9c1-464f-4058-aa55-ce041668d8a2-public-tls-certs\") pod \"cinder-api-0\" (UID: \"d0b2c9c1-464f-4058-aa55-ce041668d8a2\") " pod="openstack/cinder-api-0" Mar 09 09:27:42 crc kubenswrapper[4792]: I0309 09:27:42.954387 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d0b2c9c1-464f-4058-aa55-ce041668d8a2-scripts\") pod \"cinder-api-0\" (UID: \"d0b2c9c1-464f-4058-aa55-ce041668d8a2\") " pod="openstack/cinder-api-0" Mar 09 09:27:42 crc kubenswrapper[4792]: I0309 09:27:42.954421 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d0b2c9c1-464f-4058-aa55-ce041668d8a2-logs\") pod \"cinder-api-0\" (UID: \"d0b2c9c1-464f-4058-aa55-ce041668d8a2\") " pod="openstack/cinder-api-0" Mar 09 09:27:42 crc kubenswrapper[4792]: I0309 09:27:42.954448 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d0b2c9c1-464f-4058-aa55-ce041668d8a2-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"d0b2c9c1-464f-4058-aa55-ce041668d8a2\") " pod="openstack/cinder-api-0" Mar 09 09:27:42 crc kubenswrapper[4792]: I0309 09:27:42.954482 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d0b2c9c1-464f-4058-aa55-ce041668d8a2-etc-machine-id\") pod \"cinder-api-0\" (UID: \"d0b2c9c1-464f-4058-aa55-ce041668d8a2\") " pod="openstack/cinder-api-0" Mar 09 09:27:42 crc kubenswrapper[4792]: I0309 09:27:42.954520 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/d0b2c9c1-464f-4058-aa55-ce041668d8a2-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"d0b2c9c1-464f-4058-aa55-ce041668d8a2\") " pod="openstack/cinder-api-0" Mar 09 09:27:43 crc kubenswrapper[4792]: I0309 09:27:43.056227 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d0b2c9c1-464f-4058-aa55-ce041668d8a2-logs\") pod \"cinder-api-0\" (UID: \"d0b2c9c1-464f-4058-aa55-ce041668d8a2\") " pod="openstack/cinder-api-0" Mar 09 09:27:43 crc kubenswrapper[4792]: I0309 09:27:43.056287 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d0b2c9c1-464f-4058-aa55-ce041668d8a2-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"d0b2c9c1-464f-4058-aa55-ce041668d8a2\") " pod="openstack/cinder-api-0" Mar 09 09:27:43 crc kubenswrapper[4792]: I0309 09:27:43.056330 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d0b2c9c1-464f-4058-aa55-ce041668d8a2-etc-machine-id\") pod \"cinder-api-0\" (UID: \"d0b2c9c1-464f-4058-aa55-ce041668d8a2\") " pod="openstack/cinder-api-0" Mar 09 09:27:43 crc kubenswrapper[4792]: I0309 09:27:43.056366 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0b2c9c1-464f-4058-aa55-ce041668d8a2-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"d0b2c9c1-464f-4058-aa55-ce041668d8a2\") " pod="openstack/cinder-api-0" Mar 09 09:27:43 crc kubenswrapper[4792]: I0309 09:27:43.056432 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d0b2c9c1-464f-4058-aa55-ce041668d8a2-config-data\") pod \"cinder-api-0\" (UID: \"d0b2c9c1-464f-4058-aa55-ce041668d8a2\") " pod="openstack/cinder-api-0" Mar 09 09:27:43 crc kubenswrapper[4792]: I0309 
09:27:43.056485 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d0b2c9c1-464f-4058-aa55-ce041668d8a2-config-data-custom\") pod \"cinder-api-0\" (UID: \"d0b2c9c1-464f-4058-aa55-ce041668d8a2\") " pod="openstack/cinder-api-0" Mar 09 09:27:43 crc kubenswrapper[4792]: I0309 09:27:43.056502 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-js5xp\" (UniqueName: \"kubernetes.io/projected/d0b2c9c1-464f-4058-aa55-ce041668d8a2-kube-api-access-js5xp\") pod \"cinder-api-0\" (UID: \"d0b2c9c1-464f-4058-aa55-ce041668d8a2\") " pod="openstack/cinder-api-0" Mar 09 09:27:43 crc kubenswrapper[4792]: I0309 09:27:43.056579 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d0b2c9c1-464f-4058-aa55-ce041668d8a2-public-tls-certs\") pod \"cinder-api-0\" (UID: \"d0b2c9c1-464f-4058-aa55-ce041668d8a2\") " pod="openstack/cinder-api-0" Mar 09 09:27:43 crc kubenswrapper[4792]: I0309 09:27:43.056598 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d0b2c9c1-464f-4058-aa55-ce041668d8a2-scripts\") pod \"cinder-api-0\" (UID: \"d0b2c9c1-464f-4058-aa55-ce041668d8a2\") " pod="openstack/cinder-api-0" Mar 09 09:27:43 crc kubenswrapper[4792]: I0309 09:27:43.059398 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d0b2c9c1-464f-4058-aa55-ce041668d8a2-etc-machine-id\") pod \"cinder-api-0\" (UID: \"d0b2c9c1-464f-4058-aa55-ce041668d8a2\") " pod="openstack/cinder-api-0" Mar 09 09:27:43 crc kubenswrapper[4792]: I0309 09:27:43.059823 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d0b2c9c1-464f-4058-aa55-ce041668d8a2-logs\") pod \"cinder-api-0\" (UID: 
\"d0b2c9c1-464f-4058-aa55-ce041668d8a2\") " pod="openstack/cinder-api-0" Mar 09 09:27:43 crc kubenswrapper[4792]: I0309 09:27:43.061203 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d0b2c9c1-464f-4058-aa55-ce041668d8a2-scripts\") pod \"cinder-api-0\" (UID: \"d0b2c9c1-464f-4058-aa55-ce041668d8a2\") " pod="openstack/cinder-api-0" Mar 09 09:27:43 crc kubenswrapper[4792]: I0309 09:27:43.071843 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0b2c9c1-464f-4058-aa55-ce041668d8a2-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"d0b2c9c1-464f-4058-aa55-ce041668d8a2\") " pod="openstack/cinder-api-0" Mar 09 09:27:43 crc kubenswrapper[4792]: I0309 09:27:43.072006 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d0b2c9c1-464f-4058-aa55-ce041668d8a2-config-data-custom\") pod \"cinder-api-0\" (UID: \"d0b2c9c1-464f-4058-aa55-ce041668d8a2\") " pod="openstack/cinder-api-0" Mar 09 09:27:43 crc kubenswrapper[4792]: I0309 09:27:43.072381 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d0b2c9c1-464f-4058-aa55-ce041668d8a2-public-tls-certs\") pod \"cinder-api-0\" (UID: \"d0b2c9c1-464f-4058-aa55-ce041668d8a2\") " pod="openstack/cinder-api-0" Mar 09 09:27:43 crc kubenswrapper[4792]: I0309 09:27:43.072451 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d0b2c9c1-464f-4058-aa55-ce041668d8a2-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"d0b2c9c1-464f-4058-aa55-ce041668d8a2\") " pod="openstack/cinder-api-0" Mar 09 09:27:43 crc kubenswrapper[4792]: I0309 09:27:43.073178 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/d0b2c9c1-464f-4058-aa55-ce041668d8a2-config-data\") pod \"cinder-api-0\" (UID: \"d0b2c9c1-464f-4058-aa55-ce041668d8a2\") " pod="openstack/cinder-api-0" Mar 09 09:27:43 crc kubenswrapper[4792]: I0309 09:27:43.084649 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-js5xp\" (UniqueName: \"kubernetes.io/projected/d0b2c9c1-464f-4058-aa55-ce041668d8a2-kube-api-access-js5xp\") pod \"cinder-api-0\" (UID: \"d0b2c9c1-464f-4058-aa55-ce041668d8a2\") " pod="openstack/cinder-api-0" Mar 09 09:27:43 crc kubenswrapper[4792]: I0309 09:27:43.214338 4792 patch_prober.go:28] interesting pod/machine-config-daemon-97tth container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 09:27:43 crc kubenswrapper[4792]: I0309 09:27:43.214396 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 09:27:43 crc kubenswrapper[4792]: I0309 09:27:43.214436 4792 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-97tth" Mar 09 09:27:43 crc kubenswrapper[4792]: I0309 09:27:43.215084 4792 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"338559ddc83aaf62922dd4a2c3548afe39aad0e71765a9e21715a3c207fc6015"} pod="openshift-machine-config-operator/machine-config-daemon-97tth" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 09 09:27:43 crc kubenswrapper[4792]: I0309 09:27:43.215140 4792 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" containerName="machine-config-daemon" containerID="cri-o://338559ddc83aaf62922dd4a2c3548afe39aad0e71765a9e21715a3c207fc6015" gracePeriod=600 Mar 09 09:27:43 crc kubenswrapper[4792]: I0309 09:27:43.252188 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 09 09:27:43 crc kubenswrapper[4792]: I0309 09:27:43.699514 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87c80ea2-7204-4631-bb42-e3c982cf60ec" path="/var/lib/kubelet/pods/87c80ea2-7204-4631-bb42-e3c982cf60ec/volumes" Mar 09 09:27:43 crc kubenswrapper[4792]: I0309 09:27:43.789050 4792 generic.go:334] "Generic (PLEG): container finished" podID="bd11045a-d746-4b42-872c-8b8d1dd2d515" containerID="338559ddc83aaf62922dd4a2c3548afe39aad0e71765a9e21715a3c207fc6015" exitCode=0 Mar 09 09:27:43 crc kubenswrapper[4792]: I0309 09:27:43.789143 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-97tth" event={"ID":"bd11045a-d746-4b42-872c-8b8d1dd2d515","Type":"ContainerDied","Data":"338559ddc83aaf62922dd4a2c3548afe39aad0e71765a9e21715a3c207fc6015"} Mar 09 09:27:43 crc kubenswrapper[4792]: I0309 09:27:43.789175 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-97tth" event={"ID":"bd11045a-d746-4b42-872c-8b8d1dd2d515","Type":"ContainerStarted","Data":"80c12a8064763d9c808b56945a8c97d0c627b1e8c20ccecc1138b2635c8e12bd"} Mar 09 09:27:43 crc kubenswrapper[4792]: I0309 09:27:43.789194 4792 scope.go:117] "RemoveContainer" containerID="db2023e6b3ec28be4276e65d0d9cd090ae22fa8851acb261970e9cecf046c144" Mar 09 09:27:43 crc kubenswrapper[4792]: I0309 09:27:43.989391 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] 
Mar 09 09:27:43 crc kubenswrapper[4792]: W0309 09:27:43.990528 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd0b2c9c1_464f_4058_aa55_ce041668d8a2.slice/crio-2b39e6076b082287c803f3ef576c0ea610ba7f69425cc60c2997e055391ae0f3 WatchSource:0}: Error finding container 2b39e6076b082287c803f3ef576c0ea610ba7f69425cc60c2997e055391ae0f3: Status 404 returned error can't find the container with id 2b39e6076b082287c803f3ef576c0ea610ba7f69425cc60c2997e055391ae0f3 Mar 09 09:27:44 crc kubenswrapper[4792]: I0309 09:27:44.817427 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"d0b2c9c1-464f-4058-aa55-ce041668d8a2","Type":"ContainerStarted","Data":"2b39e6076b082287c803f3ef576c0ea610ba7f69425cc60c2997e055391ae0f3"} Mar 09 09:27:45 crc kubenswrapper[4792]: I0309 09:27:45.588163 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-7f75f4656d-dmz5s" Mar 09 09:27:45 crc kubenswrapper[4792]: I0309 09:27:45.819251 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-5c87787d8d-6grm6" podUID="892ac2f9-60af-4080-98e4-c4100f33dbc1" containerName="neutron-httpd" probeResult="failure" output="Get \"http://10.217.0.146:9696/\": dial tcp 10.217.0.146:9696: connect: connection refused" Mar 09 09:27:45 crc kubenswrapper[4792]: I0309 09:27:45.849646 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"d0b2c9c1-464f-4058-aa55-ce041668d8a2","Type":"ContainerStarted","Data":"32cdc8b1459381717cf6bb3e2ecb74a2cf03820395b305e00076a85148d67c49"} Mar 09 09:27:46 crc kubenswrapper[4792]: I0309 09:27:46.860372 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"d0b2c9c1-464f-4058-aa55-ce041668d8a2","Type":"ContainerStarted","Data":"a9d5b8054f7bdf25daa52f8b538c74d8fb875f4028e6a9cf8235360652335961"} Mar 09 09:27:46 crc 
kubenswrapper[4792]: I0309 09:27:46.861008 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Mar 09 09:27:46 crc kubenswrapper[4792]: I0309 09:27:46.886240 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=4.886221553 podStartE2EDuration="4.886221553s" podCreationTimestamp="2026-03-09 09:27:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:27:46.883045291 +0000 UTC m=+1231.913246053" watchObservedRunningTime="2026-03-09 09:27:46.886221553 +0000 UTC m=+1231.916422305" Mar 09 09:27:47 crc kubenswrapper[4792]: I0309 09:27:47.088368 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-7b89dcf9c6-x4cgw" Mar 09 09:27:47 crc kubenswrapper[4792]: I0309 09:27:47.126862 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-7b89dcf9c6-x4cgw" Mar 09 09:27:47 crc kubenswrapper[4792]: I0309 09:27:47.184269 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7675674687-7sdtd" Mar 09 09:27:47 crc kubenswrapper[4792]: I0309 09:27:47.352318 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-76fc8568c7-xztqr"] Mar 09 09:27:47 crc kubenswrapper[4792]: I0309 09:27:47.352579 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-76fc8568c7-xztqr" podUID="2a8c93ed-5a50-446f-8cb2-73b098410e0a" containerName="dnsmasq-dns" containerID="cri-o://fc589c016607b7d81e8972e1eba759e6a5c8ab52332a5f87845fdfa7117fac26" gracePeriod=10 Mar 09 09:27:47 crc kubenswrapper[4792]: I0309 09:27:47.910474 4792 generic.go:334] "Generic (PLEG): container finished" podID="2a8c93ed-5a50-446f-8cb2-73b098410e0a" 
containerID="fc589c016607b7d81e8972e1eba759e6a5c8ab52332a5f87845fdfa7117fac26" exitCode=0 Mar 09 09:27:47 crc kubenswrapper[4792]: I0309 09:27:47.911839 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76fc8568c7-xztqr" event={"ID":"2a8c93ed-5a50-446f-8cb2-73b098410e0a","Type":"ContainerDied","Data":"fc589c016607b7d81e8972e1eba759e6a5c8ab52332a5f87845fdfa7117fac26"} Mar 09 09:27:48 crc kubenswrapper[4792]: I0309 09:27:48.000338 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-755d469bb8-fhbxg" Mar 09 09:27:48 crc kubenswrapper[4792]: I0309 09:27:48.103860 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-76fc8568c7-xztqr" Mar 09 09:27:48 crc kubenswrapper[4792]: I0309 09:27:48.137966 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m8t6n\" (UniqueName: \"kubernetes.io/projected/2a8c93ed-5a50-446f-8cb2-73b098410e0a-kube-api-access-m8t6n\") pod \"2a8c93ed-5a50-446f-8cb2-73b098410e0a\" (UID: \"2a8c93ed-5a50-446f-8cb2-73b098410e0a\") " Mar 09 09:27:48 crc kubenswrapper[4792]: I0309 09:27:48.138039 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2a8c93ed-5a50-446f-8cb2-73b098410e0a-dns-svc\") pod \"2a8c93ed-5a50-446f-8cb2-73b098410e0a\" (UID: \"2a8c93ed-5a50-446f-8cb2-73b098410e0a\") " Mar 09 09:27:48 crc kubenswrapper[4792]: I0309 09:27:48.138160 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2a8c93ed-5a50-446f-8cb2-73b098410e0a-ovsdbserver-nb\") pod \"2a8c93ed-5a50-446f-8cb2-73b098410e0a\" (UID: \"2a8c93ed-5a50-446f-8cb2-73b098410e0a\") " Mar 09 09:27:48 crc kubenswrapper[4792]: I0309 09:27:48.138204 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/2a8c93ed-5a50-446f-8cb2-73b098410e0a-config\") pod \"2a8c93ed-5a50-446f-8cb2-73b098410e0a\" (UID: \"2a8c93ed-5a50-446f-8cb2-73b098410e0a\") " Mar 09 09:27:48 crc kubenswrapper[4792]: I0309 09:27:48.138226 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2a8c93ed-5a50-446f-8cb2-73b098410e0a-ovsdbserver-sb\") pod \"2a8c93ed-5a50-446f-8cb2-73b098410e0a\" (UID: \"2a8c93ed-5a50-446f-8cb2-73b098410e0a\") " Mar 09 09:27:48 crc kubenswrapper[4792]: I0309 09:27:48.200939 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a8c93ed-5a50-446f-8cb2-73b098410e0a-kube-api-access-m8t6n" (OuterVolumeSpecName: "kube-api-access-m8t6n") pod "2a8c93ed-5a50-446f-8cb2-73b098410e0a" (UID: "2a8c93ed-5a50-446f-8cb2-73b098410e0a"). InnerVolumeSpecName "kube-api-access-m8t6n". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:27:48 crc kubenswrapper[4792]: I0309 09:27:48.243303 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m8t6n\" (UniqueName: \"kubernetes.io/projected/2a8c93ed-5a50-446f-8cb2-73b098410e0a-kube-api-access-m8t6n\") on node \"crc\" DevicePath \"\"" Mar 09 09:27:48 crc kubenswrapper[4792]: I0309 09:27:48.281692 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2a8c93ed-5a50-446f-8cb2-73b098410e0a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "2a8c93ed-5a50-446f-8cb2-73b098410e0a" (UID: "2a8c93ed-5a50-446f-8cb2-73b098410e0a"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:27:48 crc kubenswrapper[4792]: I0309 09:27:48.293421 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2a8c93ed-5a50-446f-8cb2-73b098410e0a-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "2a8c93ed-5a50-446f-8cb2-73b098410e0a" (UID: "2a8c93ed-5a50-446f-8cb2-73b098410e0a"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:27:48 crc kubenswrapper[4792]: I0309 09:27:48.297837 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2a8c93ed-5a50-446f-8cb2-73b098410e0a-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "2a8c93ed-5a50-446f-8cb2-73b098410e0a" (UID: "2a8c93ed-5a50-446f-8cb2-73b098410e0a"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:27:48 crc kubenswrapper[4792]: I0309 09:27:48.314314 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2a8c93ed-5a50-446f-8cb2-73b098410e0a-config" (OuterVolumeSpecName: "config") pod "2a8c93ed-5a50-446f-8cb2-73b098410e0a" (UID: "2a8c93ed-5a50-446f-8cb2-73b098410e0a"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:27:48 crc kubenswrapper[4792]: I0309 09:27:48.345954 4792 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2a8c93ed-5a50-446f-8cb2-73b098410e0a-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 09 09:27:48 crc kubenswrapper[4792]: I0309 09:27:48.346224 4792 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2a8c93ed-5a50-446f-8cb2-73b098410e0a-config\") on node \"crc\" DevicePath \"\"" Mar 09 09:27:48 crc kubenswrapper[4792]: I0309 09:27:48.346298 4792 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2a8c93ed-5a50-446f-8cb2-73b098410e0a-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 09 09:27:48 crc kubenswrapper[4792]: I0309 09:27:48.346368 4792 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2a8c93ed-5a50-446f-8cb2-73b098410e0a-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 09 09:27:48 crc kubenswrapper[4792]: I0309 09:27:48.366888 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-755d469bb8-fhbxg" podUID="00e48c59-681a-4940-a088-c66078a15bb3" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.153:9311/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 09:27:48 crc kubenswrapper[4792]: I0309 09:27:48.378172 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-755d469bb8-fhbxg" Mar 09 09:27:48 crc kubenswrapper[4792]: I0309 09:27:48.545594 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Mar 09 09:27:48 crc kubenswrapper[4792]: I0309 09:27:48.602405 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/neutron-6cbbcf5c8f-spnsr" Mar 09 09:27:48 crc kubenswrapper[4792]: I0309 09:27:48.603942 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 09 09:27:48 crc kubenswrapper[4792]: I0309 09:27:48.669670 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-7f75f4656d-dmz5s"] Mar 09 09:27:48 crc kubenswrapper[4792]: I0309 09:27:48.669881 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-7f75f4656d-dmz5s" podUID="4a445589-e577-4db2-a287-c4c378a13030" containerName="neutron-api" containerID="cri-o://d8e4f48a3f8d48b098af90832eb2aed9d3856576ca12155e6cda43293deec58b" gracePeriod=30 Mar 09 09:27:48 crc kubenswrapper[4792]: I0309 09:27:48.671211 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-7f75f4656d-dmz5s" podUID="4a445589-e577-4db2-a287-c4c378a13030" containerName="neutron-httpd" containerID="cri-o://3364f4b684f8772e3c518e60ae97f01621a6ae4977b748ac643f0e5ef03810d7" gracePeriod=30 Mar 09 09:27:48 crc kubenswrapper[4792]: I0309 09:27:48.924693 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-76fc8568c7-xztqr" Mar 09 09:27:48 crc kubenswrapper[4792]: I0309 09:27:48.927210 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76fc8568c7-xztqr" event={"ID":"2a8c93ed-5a50-446f-8cb2-73b098410e0a","Type":"ContainerDied","Data":"12ed6f6e1aa5fd05ea7a8fcba6cbe27448a62ec3a86bfdfd04d08152dd7f35fc"} Mar 09 09:27:48 crc kubenswrapper[4792]: I0309 09:27:48.927256 4792 scope.go:117] "RemoveContainer" containerID="fc589c016607b7d81e8972e1eba759e6a5c8ab52332a5f87845fdfa7117fac26" Mar 09 09:27:48 crc kubenswrapper[4792]: I0309 09:27:48.927831 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="93e32556-b0df-482d-bb59-090a953b3d26" containerName="cinder-scheduler" containerID="cri-o://a446a75161f526f6a75a533bae01b5f70787846e0c11e6cf9d1ef1fe07705ea0" gracePeriod=30 Mar 09 09:27:48 crc kubenswrapper[4792]: I0309 09:27:48.927932 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="93e32556-b0df-482d-bb59-090a953b3d26" containerName="probe" containerID="cri-o://536c1ede202e7025a154fd71d16278b1f2471d11ca7ff4e2abc2527c2d757086" gracePeriod=30 Mar 09 09:27:48 crc kubenswrapper[4792]: I0309 09:27:48.984549 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-76fc8568c7-xztqr"] Mar 09 09:27:48 crc kubenswrapper[4792]: I0309 09:27:48.989799 4792 scope.go:117] "RemoveContainer" containerID="08d1751037b8abb64aedc3b550c81014a71df916e8d6d562e28acfaf9d4b07d9" Mar 09 09:27:48 crc kubenswrapper[4792]: I0309 09:27:48.992590 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-76fc8568c7-xztqr"] Mar 09 09:27:49 crc kubenswrapper[4792]: I0309 09:27:49.676114 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2a8c93ed-5a50-446f-8cb2-73b098410e0a" 
path="/var/lib/kubelet/pods/2a8c93ed-5a50-446f-8cb2-73b098410e0a/volumes" Mar 09 09:27:49 crc kubenswrapper[4792]: I0309 09:27:49.876135 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-6d6ff87dd6-6wzmx" Mar 09 09:27:49 crc kubenswrapper[4792]: I0309 09:27:49.877457 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-6d6ff87dd6-6wzmx" Mar 09 09:27:49 crc kubenswrapper[4792]: I0309 09:27:49.940470 4792 generic.go:334] "Generic (PLEG): container finished" podID="4a445589-e577-4db2-a287-c4c378a13030" containerID="3364f4b684f8772e3c518e60ae97f01621a6ae4977b748ac643f0e5ef03810d7" exitCode=0 Mar 09 09:27:49 crc kubenswrapper[4792]: I0309 09:27:49.940544 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7f75f4656d-dmz5s" event={"ID":"4a445589-e577-4db2-a287-c4c378a13030","Type":"ContainerDied","Data":"3364f4b684f8772e3c518e60ae97f01621a6ae4977b748ac643f0e5ef03810d7"} Mar 09 09:27:49 crc kubenswrapper[4792]: I0309 09:27:49.974937 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-7b89dcf9c6-x4cgw"] Mar 09 09:27:49 crc kubenswrapper[4792]: I0309 09:27:49.975220 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-7b89dcf9c6-x4cgw" podUID="ecc21e88-9435-44e8-9fae-4838ae5e46ce" containerName="placement-log" containerID="cri-o://70031f87f3291e5b64587bd61d4148f9975d8f1294b6e77ac12d0bc16de291bc" gracePeriod=30 Mar 09 09:27:49 crc kubenswrapper[4792]: I0309 09:27:49.975665 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-7b89dcf9c6-x4cgw" podUID="ecc21e88-9435-44e8-9fae-4838ae5e46ce" containerName="placement-api" containerID="cri-o://02eafe46662cfecf14424d3472d3bced8851ad529264bdac4f5ef7f030a9f962" gracePeriod=30 Mar 09 09:27:50 crc kubenswrapper[4792]: I0309 09:27:50.157057 4792 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openstack/barbican-api-78f7c77b76-nw94r" podUID="2b4c9d79-a45e-457d-be41-ea8535f122c6" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.0.154:9311/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 09 09:27:50 crc kubenswrapper[4792]: I0309 09:27:50.163568 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-78f7c77b76-nw94r" podUID="2b4c9d79-a45e-457d-be41-ea8535f122c6" containerName="barbican-api" probeResult="failure" output="Get \"https://10.217.0.154:9311/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 09:27:50 crc kubenswrapper[4792]: I0309 09:27:50.707593 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-78f7c77b76-nw94r" Mar 09 09:27:51 crc kubenswrapper[4792]: I0309 09:27:51.032828 4792 generic.go:334] "Generic (PLEG): container finished" podID="ecc21e88-9435-44e8-9fae-4838ae5e46ce" containerID="70031f87f3291e5b64587bd61d4148f9975d8f1294b6e77ac12d0bc16de291bc" exitCode=143 Mar 09 09:27:51 crc kubenswrapper[4792]: I0309 09:27:51.033455 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7b89dcf9c6-x4cgw" event={"ID":"ecc21e88-9435-44e8-9fae-4838ae5e46ce","Type":"ContainerDied","Data":"70031f87f3291e5b64587bd61d4148f9975d8f1294b6e77ac12d0bc16de291bc"} Mar 09 09:27:51 crc kubenswrapper[4792]: I0309 09:27:51.041380 4792 generic.go:334] "Generic (PLEG): container finished" podID="93e32556-b0df-482d-bb59-090a953b3d26" containerID="536c1ede202e7025a154fd71d16278b1f2471d11ca7ff4e2abc2527c2d757086" exitCode=0 Mar 09 09:27:51 crc kubenswrapper[4792]: I0309 09:27:51.041677 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"93e32556-b0df-482d-bb59-090a953b3d26","Type":"ContainerDied","Data":"536c1ede202e7025a154fd71d16278b1f2471d11ca7ff4e2abc2527c2d757086"} Mar 09 09:27:51 
crc kubenswrapper[4792]: I0309 09:27:51.080555 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-5c87787d8d-6grm6_892ac2f9-60af-4080-98e4-c4100f33dbc1/neutron-api/0.log" Mar 09 09:27:51 crc kubenswrapper[4792]: I0309 09:27:51.080620 4792 generic.go:334] "Generic (PLEG): container finished" podID="892ac2f9-60af-4080-98e4-c4100f33dbc1" containerID="fab24a861a59e0830f0752b4396c28e0c1512ff77fcc9d01804f79d24424b223" exitCode=137 Mar 09 09:27:51 crc kubenswrapper[4792]: I0309 09:27:51.080656 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5c87787d8d-6grm6" event={"ID":"892ac2f9-60af-4080-98e4-c4100f33dbc1","Type":"ContainerDied","Data":"fab24a861a59e0830f0752b4396c28e0c1512ff77fcc9d01804f79d24424b223"} Mar 09 09:27:51 crc kubenswrapper[4792]: I0309 09:27:51.153581 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-78f7c77b76-nw94r" podUID="2b4c9d79-a45e-457d-be41-ea8535f122c6" containerName="barbican-api" probeResult="failure" output="Get \"https://10.217.0.154:9311/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 09 09:27:51 crc kubenswrapper[4792]: I0309 09:27:51.339093 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-5c87787d8d-6grm6_892ac2f9-60af-4080-98e4-c4100f33dbc1/neutron-api/0.log" Mar 09 09:27:51 crc kubenswrapper[4792]: I0309 09:27:51.339473 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-5c87787d8d-6grm6" Mar 09 09:27:51 crc kubenswrapper[4792]: I0309 09:27:51.393246 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-78f7c77b76-nw94r" Mar 09 09:27:51 crc kubenswrapper[4792]: I0309 09:27:51.448649 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d9w6r\" (UniqueName: \"kubernetes.io/projected/892ac2f9-60af-4080-98e4-c4100f33dbc1-kube-api-access-d9w6r\") pod \"892ac2f9-60af-4080-98e4-c4100f33dbc1\" (UID: \"892ac2f9-60af-4080-98e4-c4100f33dbc1\") " Mar 09 09:27:51 crc kubenswrapper[4792]: I0309 09:27:51.448778 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/892ac2f9-60af-4080-98e4-c4100f33dbc1-httpd-config\") pod \"892ac2f9-60af-4080-98e4-c4100f33dbc1\" (UID: \"892ac2f9-60af-4080-98e4-c4100f33dbc1\") " Mar 09 09:27:51 crc kubenswrapper[4792]: I0309 09:27:51.448811 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/892ac2f9-60af-4080-98e4-c4100f33dbc1-config\") pod \"892ac2f9-60af-4080-98e4-c4100f33dbc1\" (UID: \"892ac2f9-60af-4080-98e4-c4100f33dbc1\") " Mar 09 09:27:51 crc kubenswrapper[4792]: I0309 09:27:51.448846 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/892ac2f9-60af-4080-98e4-c4100f33dbc1-ovndb-tls-certs\") pod \"892ac2f9-60af-4080-98e4-c4100f33dbc1\" (UID: \"892ac2f9-60af-4080-98e4-c4100f33dbc1\") " Mar 09 09:27:51 crc kubenswrapper[4792]: I0309 09:27:51.448868 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/892ac2f9-60af-4080-98e4-c4100f33dbc1-combined-ca-bundle\") pod \"892ac2f9-60af-4080-98e4-c4100f33dbc1\" (UID: 
\"892ac2f9-60af-4080-98e4-c4100f33dbc1\") " Mar 09 09:27:51 crc kubenswrapper[4792]: I0309 09:27:51.479929 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/892ac2f9-60af-4080-98e4-c4100f33dbc1-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "892ac2f9-60af-4080-98e4-c4100f33dbc1" (UID: "892ac2f9-60af-4080-98e4-c4100f33dbc1"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:27:51 crc kubenswrapper[4792]: I0309 09:27:51.481387 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/892ac2f9-60af-4080-98e4-c4100f33dbc1-kube-api-access-d9w6r" (OuterVolumeSpecName: "kube-api-access-d9w6r") pod "892ac2f9-60af-4080-98e4-c4100f33dbc1" (UID: "892ac2f9-60af-4080-98e4-c4100f33dbc1"). InnerVolumeSpecName "kube-api-access-d9w6r". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:27:51 crc kubenswrapper[4792]: I0309 09:27:51.553518 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d9w6r\" (UniqueName: \"kubernetes.io/projected/892ac2f9-60af-4080-98e4-c4100f33dbc1-kube-api-access-d9w6r\") on node \"crc\" DevicePath \"\"" Mar 09 09:27:51 crc kubenswrapper[4792]: I0309 09:27:51.553619 4792 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/892ac2f9-60af-4080-98e4-c4100f33dbc1-httpd-config\") on node \"crc\" DevicePath \"\"" Mar 09 09:27:51 crc kubenswrapper[4792]: I0309 09:27:51.559027 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-755d469bb8-fhbxg"] Mar 09 09:27:51 crc kubenswrapper[4792]: I0309 09:27:51.559300 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-755d469bb8-fhbxg" podUID="00e48c59-681a-4940-a088-c66078a15bb3" containerName="barbican-api-log" containerID="cri-o://e23c1ade70b8647a0748f490afbcf6c7ed21a136db3f14025a113637646e446e" 
gracePeriod=30 Mar 09 09:27:51 crc kubenswrapper[4792]: I0309 09:27:51.559801 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-755d469bb8-fhbxg" podUID="00e48c59-681a-4940-a088-c66078a15bb3" containerName="barbican-api" containerID="cri-o://f24152846dc01f72d7db598e8a5a5ddcae301b7e3a4c186283f7fd5a61a395e1" gracePeriod=30 Mar 09 09:27:51 crc kubenswrapper[4792]: I0309 09:27:51.565530 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/892ac2f9-60af-4080-98e4-c4100f33dbc1-config" (OuterVolumeSpecName: "config") pod "892ac2f9-60af-4080-98e4-c4100f33dbc1" (UID: "892ac2f9-60af-4080-98e4-c4100f33dbc1"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:27:51 crc kubenswrapper[4792]: I0309 09:27:51.571338 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-755d469bb8-fhbxg" podUID="00e48c59-681a-4940-a088-c66078a15bb3" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.153:9311/healthcheck\": EOF" Mar 09 09:27:51 crc kubenswrapper[4792]: I0309 09:27:51.571495 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-755d469bb8-fhbxg" podUID="00e48c59-681a-4940-a088-c66078a15bb3" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.153:9311/healthcheck\": EOF" Mar 09 09:27:51 crc kubenswrapper[4792]: I0309 09:27:51.582010 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/892ac2f9-60af-4080-98e4-c4100f33dbc1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "892ac2f9-60af-4080-98e4-c4100f33dbc1" (UID: "892ac2f9-60af-4080-98e4-c4100f33dbc1"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:27:51 crc kubenswrapper[4792]: I0309 09:27:51.669866 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/892ac2f9-60af-4080-98e4-c4100f33dbc1-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "892ac2f9-60af-4080-98e4-c4100f33dbc1" (UID: "892ac2f9-60af-4080-98e4-c4100f33dbc1"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:27:51 crc kubenswrapper[4792]: I0309 09:27:51.670188 4792 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/892ac2f9-60af-4080-98e4-c4100f33dbc1-config\") on node \"crc\" DevicePath \"\"" Mar 09 09:27:51 crc kubenswrapper[4792]: I0309 09:27:51.670217 4792 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/892ac2f9-60af-4080-98e4-c4100f33dbc1-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 09 09:27:51 crc kubenswrapper[4792]: I0309 09:27:51.670228 4792 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/892ac2f9-60af-4080-98e4-c4100f33dbc1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 09:27:51 crc kubenswrapper[4792]: I0309 09:27:51.684840 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-5b488b889c-ks9th" Mar 09 09:27:51 crc kubenswrapper[4792]: I0309 09:27:51.798586 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 09 09:27:51 crc kubenswrapper[4792]: I0309 09:27:51.874257 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93e32556-b0df-482d-bb59-090a953b3d26-combined-ca-bundle\") pod \"93e32556-b0df-482d-bb59-090a953b3d26\" (UID: \"93e32556-b0df-482d-bb59-090a953b3d26\") " Mar 09 09:27:51 crc kubenswrapper[4792]: I0309 09:27:51.874309 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zxh8g\" (UniqueName: \"kubernetes.io/projected/93e32556-b0df-482d-bb59-090a953b3d26-kube-api-access-zxh8g\") pod \"93e32556-b0df-482d-bb59-090a953b3d26\" (UID: \"93e32556-b0df-482d-bb59-090a953b3d26\") " Mar 09 09:27:51 crc kubenswrapper[4792]: I0309 09:27:51.874369 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/93e32556-b0df-482d-bb59-090a953b3d26-etc-machine-id\") pod \"93e32556-b0df-482d-bb59-090a953b3d26\" (UID: \"93e32556-b0df-482d-bb59-090a953b3d26\") " Mar 09 09:27:51 crc kubenswrapper[4792]: I0309 09:27:51.874518 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/93e32556-b0df-482d-bb59-090a953b3d26-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "93e32556-b0df-482d-bb59-090a953b3d26" (UID: "93e32556-b0df-482d-bb59-090a953b3d26"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 09:27:51 crc kubenswrapper[4792]: I0309 09:27:51.874997 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/93e32556-b0df-482d-bb59-090a953b3d26-scripts\") pod \"93e32556-b0df-482d-bb59-090a953b3d26\" (UID: \"93e32556-b0df-482d-bb59-090a953b3d26\") " Mar 09 09:27:51 crc kubenswrapper[4792]: I0309 09:27:51.875048 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/93e32556-b0df-482d-bb59-090a953b3d26-config-data-custom\") pod \"93e32556-b0df-482d-bb59-090a953b3d26\" (UID: \"93e32556-b0df-482d-bb59-090a953b3d26\") " Mar 09 09:27:51 crc kubenswrapper[4792]: I0309 09:27:51.875153 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93e32556-b0df-482d-bb59-090a953b3d26-config-data\") pod \"93e32556-b0df-482d-bb59-090a953b3d26\" (UID: \"93e32556-b0df-482d-bb59-090a953b3d26\") " Mar 09 09:27:51 crc kubenswrapper[4792]: I0309 09:27:51.875636 4792 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/93e32556-b0df-482d-bb59-090a953b3d26-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 09 09:27:51 crc kubenswrapper[4792]: I0309 09:27:51.881222 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/93e32556-b0df-482d-bb59-090a953b3d26-kube-api-access-zxh8g" (OuterVolumeSpecName: "kube-api-access-zxh8g") pod "93e32556-b0df-482d-bb59-090a953b3d26" (UID: "93e32556-b0df-482d-bb59-090a953b3d26"). InnerVolumeSpecName "kube-api-access-zxh8g". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:27:51 crc kubenswrapper[4792]: I0309 09:27:51.882177 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93e32556-b0df-482d-bb59-090a953b3d26-scripts" (OuterVolumeSpecName: "scripts") pod "93e32556-b0df-482d-bb59-090a953b3d26" (UID: "93e32556-b0df-482d-bb59-090a953b3d26"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:27:51 crc kubenswrapper[4792]: I0309 09:27:51.883189 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93e32556-b0df-482d-bb59-090a953b3d26-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "93e32556-b0df-482d-bb59-090a953b3d26" (UID: "93e32556-b0df-482d-bb59-090a953b3d26"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:27:51 crc kubenswrapper[4792]: I0309 09:27:51.977243 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zxh8g\" (UniqueName: \"kubernetes.io/projected/93e32556-b0df-482d-bb59-090a953b3d26-kube-api-access-zxh8g\") on node \"crc\" DevicePath \"\"" Mar 09 09:27:51 crc kubenswrapper[4792]: I0309 09:27:51.977302 4792 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/93e32556-b0df-482d-bb59-090a953b3d26-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 09:27:51 crc kubenswrapper[4792]: I0309 09:27:51.977315 4792 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/93e32556-b0df-482d-bb59-090a953b3d26-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 09 09:27:51 crc kubenswrapper[4792]: I0309 09:27:51.988240 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93e32556-b0df-482d-bb59-090a953b3d26-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod 
"93e32556-b0df-482d-bb59-090a953b3d26" (UID: "93e32556-b0df-482d-bb59-090a953b3d26"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:27:52 crc kubenswrapper[4792]: I0309 09:27:52.058644 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93e32556-b0df-482d-bb59-090a953b3d26-config-data" (OuterVolumeSpecName: "config-data") pod "93e32556-b0df-482d-bb59-090a953b3d26" (UID: "93e32556-b0df-482d-bb59-090a953b3d26"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:27:52 crc kubenswrapper[4792]: I0309 09:27:52.079254 4792 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93e32556-b0df-482d-bb59-090a953b3d26-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 09:27:52 crc kubenswrapper[4792]: I0309 09:27:52.079302 4792 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93e32556-b0df-482d-bb59-090a953b3d26-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 09:27:52 crc kubenswrapper[4792]: I0309 09:27:52.109274 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-5c87787d8d-6grm6_892ac2f9-60af-4080-98e4-c4100f33dbc1/neutron-api/0.log" Mar 09 09:27:52 crc kubenswrapper[4792]: I0309 09:27:52.109371 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5c87787d8d-6grm6" event={"ID":"892ac2f9-60af-4080-98e4-c4100f33dbc1","Type":"ContainerDied","Data":"b7a90ccc2edaff544fc0887adb11913ff45233812d8ddfb5ad4e340ee573044d"} Mar 09 09:27:52 crc kubenswrapper[4792]: I0309 09:27:52.109415 4792 scope.go:117] "RemoveContainer" containerID="5a60069321f9fa357fc9d21c6049df3e4802d3e33056fcac025afbc3369afb2c" Mar 09 09:27:52 crc kubenswrapper[4792]: I0309 09:27:52.110829 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-5c87787d8d-6grm6" Mar 09 09:27:52 crc kubenswrapper[4792]: I0309 09:27:52.119030 4792 generic.go:334] "Generic (PLEG): container finished" podID="00e48c59-681a-4940-a088-c66078a15bb3" containerID="e23c1ade70b8647a0748f490afbcf6c7ed21a136db3f14025a113637646e446e" exitCode=143 Mar 09 09:27:52 crc kubenswrapper[4792]: I0309 09:27:52.119124 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-755d469bb8-fhbxg" event={"ID":"00e48c59-681a-4940-a088-c66078a15bb3","Type":"ContainerDied","Data":"e23c1ade70b8647a0748f490afbcf6c7ed21a136db3f14025a113637646e446e"} Mar 09 09:27:52 crc kubenswrapper[4792]: I0309 09:27:52.125855 4792 generic.go:334] "Generic (PLEG): container finished" podID="93e32556-b0df-482d-bb59-090a953b3d26" containerID="a446a75161f526f6a75a533bae01b5f70787846e0c11e6cf9d1ef1fe07705ea0" exitCode=0 Mar 09 09:27:52 crc kubenswrapper[4792]: I0309 09:27:52.125908 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"93e32556-b0df-482d-bb59-090a953b3d26","Type":"ContainerDied","Data":"a446a75161f526f6a75a533bae01b5f70787846e0c11e6cf9d1ef1fe07705ea0"} Mar 09 09:27:52 crc kubenswrapper[4792]: I0309 09:27:52.126222 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"93e32556-b0df-482d-bb59-090a953b3d26","Type":"ContainerDied","Data":"31477fabccbc6a7118d828a23d24ba7eabc685103cc5853239a3bd4ff50cda55"} Mar 09 09:27:52 crc kubenswrapper[4792]: I0309 09:27:52.125963 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 09 09:27:52 crc kubenswrapper[4792]: I0309 09:27:52.156692 4792 scope.go:117] "RemoveContainer" containerID="fab24a861a59e0830f0752b4396c28e0c1512ff77fcc9d01804f79d24424b223" Mar 09 09:27:52 crc kubenswrapper[4792]: I0309 09:27:52.159240 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-5c87787d8d-6grm6"] Mar 09 09:27:52 crc kubenswrapper[4792]: I0309 09:27:52.193276 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-5c87787d8d-6grm6"] Mar 09 09:27:52 crc kubenswrapper[4792]: I0309 09:27:52.250585 4792 scope.go:117] "RemoveContainer" containerID="536c1ede202e7025a154fd71d16278b1f2471d11ca7ff4e2abc2527c2d757086" Mar 09 09:27:52 crc kubenswrapper[4792]: I0309 09:27:52.270342 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 09 09:27:52 crc kubenswrapper[4792]: I0309 09:27:52.303655 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 09 09:27:52 crc kubenswrapper[4792]: I0309 09:27:52.308947 4792 scope.go:117] "RemoveContainer" containerID="a446a75161f526f6a75a533bae01b5f70787846e0c11e6cf9d1ef1fe07705ea0" Mar 09 09:27:52 crc kubenswrapper[4792]: I0309 09:27:52.322272 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Mar 09 09:27:52 crc kubenswrapper[4792]: E0309 09:27:52.322704 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="892ac2f9-60af-4080-98e4-c4100f33dbc1" containerName="neutron-api" Mar 09 09:27:52 crc kubenswrapper[4792]: I0309 09:27:52.322725 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="892ac2f9-60af-4080-98e4-c4100f33dbc1" containerName="neutron-api" Mar 09 09:27:52 crc kubenswrapper[4792]: E0309 09:27:52.322748 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93e32556-b0df-482d-bb59-090a953b3d26" containerName="cinder-scheduler" Mar 09 09:27:52 crc 
kubenswrapper[4792]: I0309 09:27:52.322756 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="93e32556-b0df-482d-bb59-090a953b3d26" containerName="cinder-scheduler" Mar 09 09:27:52 crc kubenswrapper[4792]: E0309 09:27:52.322775 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="892ac2f9-60af-4080-98e4-c4100f33dbc1" containerName="neutron-httpd" Mar 09 09:27:52 crc kubenswrapper[4792]: I0309 09:27:52.322782 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="892ac2f9-60af-4080-98e4-c4100f33dbc1" containerName="neutron-httpd" Mar 09 09:27:52 crc kubenswrapper[4792]: E0309 09:27:52.322793 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a8c93ed-5a50-446f-8cb2-73b098410e0a" containerName="dnsmasq-dns" Mar 09 09:27:52 crc kubenswrapper[4792]: I0309 09:27:52.322802 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a8c93ed-5a50-446f-8cb2-73b098410e0a" containerName="dnsmasq-dns" Mar 09 09:27:52 crc kubenswrapper[4792]: E0309 09:27:52.322817 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93e32556-b0df-482d-bb59-090a953b3d26" containerName="probe" Mar 09 09:27:52 crc kubenswrapper[4792]: I0309 09:27:52.322823 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="93e32556-b0df-482d-bb59-090a953b3d26" containerName="probe" Mar 09 09:27:52 crc kubenswrapper[4792]: E0309 09:27:52.322842 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a8c93ed-5a50-446f-8cb2-73b098410e0a" containerName="init" Mar 09 09:27:52 crc kubenswrapper[4792]: I0309 09:27:52.322849 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a8c93ed-5a50-446f-8cb2-73b098410e0a" containerName="init" Mar 09 09:27:52 crc kubenswrapper[4792]: I0309 09:27:52.323007 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="892ac2f9-60af-4080-98e4-c4100f33dbc1" containerName="neutron-api" Mar 09 09:27:52 crc kubenswrapper[4792]: I0309 09:27:52.323023 4792 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="93e32556-b0df-482d-bb59-090a953b3d26" containerName="probe" Mar 09 09:27:52 crc kubenswrapper[4792]: I0309 09:27:52.323037 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="892ac2f9-60af-4080-98e4-c4100f33dbc1" containerName="neutron-httpd" Mar 09 09:27:52 crc kubenswrapper[4792]: I0309 09:27:52.323045 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="93e32556-b0df-482d-bb59-090a953b3d26" containerName="cinder-scheduler" Mar 09 09:27:52 crc kubenswrapper[4792]: I0309 09:27:52.323055 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a8c93ed-5a50-446f-8cb2-73b098410e0a" containerName="dnsmasq-dns" Mar 09 09:27:52 crc kubenswrapper[4792]: I0309 09:27:52.324149 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 09 09:27:52 crc kubenswrapper[4792]: I0309 09:27:52.327522 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Mar 09 09:27:52 crc kubenswrapper[4792]: I0309 09:27:52.357402 4792 scope.go:117] "RemoveContainer" containerID="536c1ede202e7025a154fd71d16278b1f2471d11ca7ff4e2abc2527c2d757086" Mar 09 09:27:52 crc kubenswrapper[4792]: E0309 09:27:52.359518 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"536c1ede202e7025a154fd71d16278b1f2471d11ca7ff4e2abc2527c2d757086\": container with ID starting with 536c1ede202e7025a154fd71d16278b1f2471d11ca7ff4e2abc2527c2d757086 not found: ID does not exist" containerID="536c1ede202e7025a154fd71d16278b1f2471d11ca7ff4e2abc2527c2d757086" Mar 09 09:27:52 crc kubenswrapper[4792]: I0309 09:27:52.359563 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"536c1ede202e7025a154fd71d16278b1f2471d11ca7ff4e2abc2527c2d757086"} err="failed to get container status 
\"536c1ede202e7025a154fd71d16278b1f2471d11ca7ff4e2abc2527c2d757086\": rpc error: code = NotFound desc = could not find container \"536c1ede202e7025a154fd71d16278b1f2471d11ca7ff4e2abc2527c2d757086\": container with ID starting with 536c1ede202e7025a154fd71d16278b1f2471d11ca7ff4e2abc2527c2d757086 not found: ID does not exist" Mar 09 09:27:52 crc kubenswrapper[4792]: I0309 09:27:52.359588 4792 scope.go:117] "RemoveContainer" containerID="a446a75161f526f6a75a533bae01b5f70787846e0c11e6cf9d1ef1fe07705ea0" Mar 09 09:27:52 crc kubenswrapper[4792]: E0309 09:27:52.362583 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a446a75161f526f6a75a533bae01b5f70787846e0c11e6cf9d1ef1fe07705ea0\": container with ID starting with a446a75161f526f6a75a533bae01b5f70787846e0c11e6cf9d1ef1fe07705ea0 not found: ID does not exist" containerID="a446a75161f526f6a75a533bae01b5f70787846e0c11e6cf9d1ef1fe07705ea0" Mar 09 09:27:52 crc kubenswrapper[4792]: I0309 09:27:52.362656 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a446a75161f526f6a75a533bae01b5f70787846e0c11e6cf9d1ef1fe07705ea0"} err="failed to get container status \"a446a75161f526f6a75a533bae01b5f70787846e0c11e6cf9d1ef1fe07705ea0\": rpc error: code = NotFound desc = could not find container \"a446a75161f526f6a75a533bae01b5f70787846e0c11e6cf9d1ef1fe07705ea0\": container with ID starting with a446a75161f526f6a75a533bae01b5f70787846e0c11e6cf9d1ef1fe07705ea0 not found: ID does not exist" Mar 09 09:27:52 crc kubenswrapper[4792]: I0309 09:27:52.362487 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 09 09:27:52 crc kubenswrapper[4792]: I0309 09:27:52.383985 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c8c68c2d-fe77-41af-b4f4-8f83079bf316-config-data-custom\") 
pod \"cinder-scheduler-0\" (UID: \"c8c68c2d-fe77-41af-b4f4-8f83079bf316\") " pod="openstack/cinder-scheduler-0" Mar 09 09:27:52 crc kubenswrapper[4792]: I0309 09:27:52.384047 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c8c68c2d-fe77-41af-b4f4-8f83079bf316-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"c8c68c2d-fe77-41af-b4f4-8f83079bf316\") " pod="openstack/cinder-scheduler-0" Mar 09 09:27:52 crc kubenswrapper[4792]: I0309 09:27:52.384095 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8c68c2d-fe77-41af-b4f4-8f83079bf316-config-data\") pod \"cinder-scheduler-0\" (UID: \"c8c68c2d-fe77-41af-b4f4-8f83079bf316\") " pod="openstack/cinder-scheduler-0" Mar 09 09:27:52 crc kubenswrapper[4792]: I0309 09:27:52.384116 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9dnsn\" (UniqueName: \"kubernetes.io/projected/c8c68c2d-fe77-41af-b4f4-8f83079bf316-kube-api-access-9dnsn\") pod \"cinder-scheduler-0\" (UID: \"c8c68c2d-fe77-41af-b4f4-8f83079bf316\") " pod="openstack/cinder-scheduler-0" Mar 09 09:27:52 crc kubenswrapper[4792]: I0309 09:27:52.384181 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8c68c2d-fe77-41af-b4f4-8f83079bf316-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"c8c68c2d-fe77-41af-b4f4-8f83079bf316\") " pod="openstack/cinder-scheduler-0" Mar 09 09:27:52 crc kubenswrapper[4792]: I0309 09:27:52.384230 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c8c68c2d-fe77-41af-b4f4-8f83079bf316-scripts\") pod \"cinder-scheduler-0\" (UID: 
\"c8c68c2d-fe77-41af-b4f4-8f83079bf316\") " pod="openstack/cinder-scheduler-0" Mar 09 09:27:52 crc kubenswrapper[4792]: I0309 09:27:52.485327 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8c68c2d-fe77-41af-b4f4-8f83079bf316-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"c8c68c2d-fe77-41af-b4f4-8f83079bf316\") " pod="openstack/cinder-scheduler-0" Mar 09 09:27:52 crc kubenswrapper[4792]: I0309 09:27:52.485429 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c8c68c2d-fe77-41af-b4f4-8f83079bf316-scripts\") pod \"cinder-scheduler-0\" (UID: \"c8c68c2d-fe77-41af-b4f4-8f83079bf316\") " pod="openstack/cinder-scheduler-0" Mar 09 09:27:52 crc kubenswrapper[4792]: I0309 09:27:52.485463 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c8c68c2d-fe77-41af-b4f4-8f83079bf316-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"c8c68c2d-fe77-41af-b4f4-8f83079bf316\") " pod="openstack/cinder-scheduler-0" Mar 09 09:27:52 crc kubenswrapper[4792]: I0309 09:27:52.485506 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c8c68c2d-fe77-41af-b4f4-8f83079bf316-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"c8c68c2d-fe77-41af-b4f4-8f83079bf316\") " pod="openstack/cinder-scheduler-0" Mar 09 09:27:52 crc kubenswrapper[4792]: I0309 09:27:52.485552 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8c68c2d-fe77-41af-b4f4-8f83079bf316-config-data\") pod \"cinder-scheduler-0\" (UID: \"c8c68c2d-fe77-41af-b4f4-8f83079bf316\") " pod="openstack/cinder-scheduler-0" Mar 09 09:27:52 crc kubenswrapper[4792]: I0309 09:27:52.485576 4792 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9dnsn\" (UniqueName: \"kubernetes.io/projected/c8c68c2d-fe77-41af-b4f4-8f83079bf316-kube-api-access-9dnsn\") pod \"cinder-scheduler-0\" (UID: \"c8c68c2d-fe77-41af-b4f4-8f83079bf316\") " pod="openstack/cinder-scheduler-0" Mar 09 09:27:52 crc kubenswrapper[4792]: I0309 09:27:52.490036 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c8c68c2d-fe77-41af-b4f4-8f83079bf316-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"c8c68c2d-fe77-41af-b4f4-8f83079bf316\") " pod="openstack/cinder-scheduler-0" Mar 09 09:27:52 crc kubenswrapper[4792]: I0309 09:27:52.492630 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c8c68c2d-fe77-41af-b4f4-8f83079bf316-scripts\") pod \"cinder-scheduler-0\" (UID: \"c8c68c2d-fe77-41af-b4f4-8f83079bf316\") " pod="openstack/cinder-scheduler-0" Mar 09 09:27:52 crc kubenswrapper[4792]: I0309 09:27:52.494643 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8c68c2d-fe77-41af-b4f4-8f83079bf316-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"c8c68c2d-fe77-41af-b4f4-8f83079bf316\") " pod="openstack/cinder-scheduler-0" Mar 09 09:27:52 crc kubenswrapper[4792]: I0309 09:27:52.505751 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9dnsn\" (UniqueName: \"kubernetes.io/projected/c8c68c2d-fe77-41af-b4f4-8f83079bf316-kube-api-access-9dnsn\") pod \"cinder-scheduler-0\" (UID: \"c8c68c2d-fe77-41af-b4f4-8f83079bf316\") " pod="openstack/cinder-scheduler-0" Mar 09 09:27:52 crc kubenswrapper[4792]: I0309 09:27:52.507924 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/c8c68c2d-fe77-41af-b4f4-8f83079bf316-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"c8c68c2d-fe77-41af-b4f4-8f83079bf316\") " pod="openstack/cinder-scheduler-0" Mar 09 09:27:52 crc kubenswrapper[4792]: I0309 09:27:52.516943 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8c68c2d-fe77-41af-b4f4-8f83079bf316-config-data\") pod \"cinder-scheduler-0\" (UID: \"c8c68c2d-fe77-41af-b4f4-8f83079bf316\") " pod="openstack/cinder-scheduler-0" Mar 09 09:27:52 crc kubenswrapper[4792]: I0309 09:27:52.659290 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 09 09:27:53 crc kubenswrapper[4792]: I0309 09:27:53.154911 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Mar 09 09:27:53 crc kubenswrapper[4792]: I0309 09:27:53.162120 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Mar 09 09:27:53 crc kubenswrapper[4792]: I0309 09:27:53.168994 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Mar 09 09:27:53 crc kubenswrapper[4792]: I0309 09:27:53.169440 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Mar 09 09:27:53 crc kubenswrapper[4792]: I0309 09:27:53.170326 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-22zmk" Mar 09 09:27:53 crc kubenswrapper[4792]: I0309 09:27:53.246200 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 09 09:27:53 crc kubenswrapper[4792]: I0309 09:27:53.306823 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b91f677a-9bb7-4884-a46b-f37296e4454f-combined-ca-bundle\") pod 
\"openstackclient\" (UID: \"b91f677a-9bb7-4884-a46b-f37296e4454f\") " pod="openstack/openstackclient" Mar 09 09:27:53 crc kubenswrapper[4792]: I0309 09:27:53.307134 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/b91f677a-9bb7-4884-a46b-f37296e4454f-openstack-config\") pod \"openstackclient\" (UID: \"b91f677a-9bb7-4884-a46b-f37296e4454f\") " pod="openstack/openstackclient" Mar 09 09:27:53 crc kubenswrapper[4792]: I0309 09:27:53.307393 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-scqkv\" (UniqueName: \"kubernetes.io/projected/b91f677a-9bb7-4884-a46b-f37296e4454f-kube-api-access-scqkv\") pod \"openstackclient\" (UID: \"b91f677a-9bb7-4884-a46b-f37296e4454f\") " pod="openstack/openstackclient" Mar 09 09:27:53 crc kubenswrapper[4792]: I0309 09:27:53.307492 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/b91f677a-9bb7-4884-a46b-f37296e4454f-openstack-config-secret\") pod \"openstackclient\" (UID: \"b91f677a-9bb7-4884-a46b-f37296e4454f\") " pod="openstack/openstackclient" Mar 09 09:27:53 crc kubenswrapper[4792]: I0309 09:27:53.315575 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 09 09:27:53 crc kubenswrapper[4792]: I0309 09:27:53.409736 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/b91f677a-9bb7-4884-a46b-f37296e4454f-openstack-config\") pod \"openstackclient\" (UID: \"b91f677a-9bb7-4884-a46b-f37296e4454f\") " pod="openstack/openstackclient" Mar 09 09:27:53 crc kubenswrapper[4792]: I0309 09:27:53.409873 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-scqkv\" (UniqueName: 
\"kubernetes.io/projected/b91f677a-9bb7-4884-a46b-f37296e4454f-kube-api-access-scqkv\") pod \"openstackclient\" (UID: \"b91f677a-9bb7-4884-a46b-f37296e4454f\") " pod="openstack/openstackclient" Mar 09 09:27:53 crc kubenswrapper[4792]: I0309 09:27:53.409905 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/b91f677a-9bb7-4884-a46b-f37296e4454f-openstack-config-secret\") pod \"openstackclient\" (UID: \"b91f677a-9bb7-4884-a46b-f37296e4454f\") " pod="openstack/openstackclient" Mar 09 09:27:53 crc kubenswrapper[4792]: I0309 09:27:53.409953 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b91f677a-9bb7-4884-a46b-f37296e4454f-combined-ca-bundle\") pod \"openstackclient\" (UID: \"b91f677a-9bb7-4884-a46b-f37296e4454f\") " pod="openstack/openstackclient" Mar 09 09:27:53 crc kubenswrapper[4792]: I0309 09:27:53.411059 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/b91f677a-9bb7-4884-a46b-f37296e4454f-openstack-config\") pod \"openstackclient\" (UID: \"b91f677a-9bb7-4884-a46b-f37296e4454f\") " pod="openstack/openstackclient" Mar 09 09:27:53 crc kubenswrapper[4792]: I0309 09:27:53.421376 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b91f677a-9bb7-4884-a46b-f37296e4454f-combined-ca-bundle\") pod \"openstackclient\" (UID: \"b91f677a-9bb7-4884-a46b-f37296e4454f\") " pod="openstack/openstackclient" Mar 09 09:27:53 crc kubenswrapper[4792]: I0309 09:27:53.425598 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/b91f677a-9bb7-4884-a46b-f37296e4454f-openstack-config-secret\") pod \"openstackclient\" (UID: \"b91f677a-9bb7-4884-a46b-f37296e4454f\") " 
pod="openstack/openstackclient" Mar 09 09:27:53 crc kubenswrapper[4792]: I0309 09:27:53.431417 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-scqkv\" (UniqueName: \"kubernetes.io/projected/b91f677a-9bb7-4884-a46b-f37296e4454f-kube-api-access-scqkv\") pod \"openstackclient\" (UID: \"b91f677a-9bb7-4884-a46b-f37296e4454f\") " pod="openstack/openstackclient" Mar 09 09:27:53 crc kubenswrapper[4792]: I0309 09:27:53.500150 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Mar 09 09:27:53 crc kubenswrapper[4792]: I0309 09:27:53.523635 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Mar 09 09:27:53 crc kubenswrapper[4792]: I0309 09:27:53.536607 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Mar 09 09:27:53 crc kubenswrapper[4792]: I0309 09:27:53.573610 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Mar 09 09:27:53 crc kubenswrapper[4792]: I0309 09:27:53.574872 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 09 09:27:53 crc kubenswrapper[4792]: I0309 09:27:53.610433 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 09 09:27:53 crc kubenswrapper[4792]: I0309 09:27:53.687435 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="892ac2f9-60af-4080-98e4-c4100f33dbc1" path="/var/lib/kubelet/pods/892ac2f9-60af-4080-98e4-c4100f33dbc1/volumes" Mar 09 09:27:53 crc kubenswrapper[4792]: I0309 09:27:53.690650 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="93e32556-b0df-482d-bb59-090a953b3d26" path="/var/lib/kubelet/pods/93e32556-b0df-482d-bb59-090a953b3d26/volumes" Mar 09 09:27:53 crc kubenswrapper[4792]: I0309 09:27:53.715178 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/09fc64e5-4201-410d-a764-789e1dc85ac0-openstack-config-secret\") pod \"openstackclient\" (UID: \"09fc64e5-4201-410d-a764-789e1dc85ac0\") " pod="openstack/openstackclient" Mar 09 09:27:53 crc kubenswrapper[4792]: I0309 09:27:53.715277 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09fc64e5-4201-410d-a764-789e1dc85ac0-combined-ca-bundle\") pod \"openstackclient\" (UID: \"09fc64e5-4201-410d-a764-789e1dc85ac0\") " pod="openstack/openstackclient" Mar 09 09:27:53 crc kubenswrapper[4792]: I0309 09:27:53.715314 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qtrmr\" (UniqueName: \"kubernetes.io/projected/09fc64e5-4201-410d-a764-789e1dc85ac0-kube-api-access-qtrmr\") pod \"openstackclient\" (UID: \"09fc64e5-4201-410d-a764-789e1dc85ac0\") " pod="openstack/openstackclient" Mar 09 09:27:53 crc kubenswrapper[4792]: I0309 09:27:53.715354 4792 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/09fc64e5-4201-410d-a764-789e1dc85ac0-openstack-config\") pod \"openstackclient\" (UID: \"09fc64e5-4201-410d-a764-789e1dc85ac0\") " pod="openstack/openstackclient" Mar 09 09:27:53 crc kubenswrapper[4792]: I0309 09:27:53.820684 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/09fc64e5-4201-410d-a764-789e1dc85ac0-openstack-config-secret\") pod \"openstackclient\" (UID: \"09fc64e5-4201-410d-a764-789e1dc85ac0\") " pod="openstack/openstackclient" Mar 09 09:27:53 crc kubenswrapper[4792]: I0309 09:27:53.820826 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09fc64e5-4201-410d-a764-789e1dc85ac0-combined-ca-bundle\") pod \"openstackclient\" (UID: \"09fc64e5-4201-410d-a764-789e1dc85ac0\") " pod="openstack/openstackclient" Mar 09 09:27:53 crc kubenswrapper[4792]: I0309 09:27:53.820867 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qtrmr\" (UniqueName: \"kubernetes.io/projected/09fc64e5-4201-410d-a764-789e1dc85ac0-kube-api-access-qtrmr\") pod \"openstackclient\" (UID: \"09fc64e5-4201-410d-a764-789e1dc85ac0\") " pod="openstack/openstackclient" Mar 09 09:27:53 crc kubenswrapper[4792]: I0309 09:27:53.820911 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/09fc64e5-4201-410d-a764-789e1dc85ac0-openstack-config\") pod \"openstackclient\" (UID: \"09fc64e5-4201-410d-a764-789e1dc85ac0\") " pod="openstack/openstackclient" Mar 09 09:27:53 crc kubenswrapper[4792]: I0309 09:27:53.821969 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: 
\"kubernetes.io/configmap/09fc64e5-4201-410d-a764-789e1dc85ac0-openstack-config\") pod \"openstackclient\" (UID: \"09fc64e5-4201-410d-a764-789e1dc85ac0\") " pod="openstack/openstackclient" Mar 09 09:27:53 crc kubenswrapper[4792]: I0309 09:27:53.828614 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/09fc64e5-4201-410d-a764-789e1dc85ac0-openstack-config-secret\") pod \"openstackclient\" (UID: \"09fc64e5-4201-410d-a764-789e1dc85ac0\") " pod="openstack/openstackclient" Mar 09 09:27:53 crc kubenswrapper[4792]: I0309 09:27:53.829360 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09fc64e5-4201-410d-a764-789e1dc85ac0-combined-ca-bundle\") pod \"openstackclient\" (UID: \"09fc64e5-4201-410d-a764-789e1dc85ac0\") " pod="openstack/openstackclient" Mar 09 09:27:53 crc kubenswrapper[4792]: I0309 09:27:53.840199 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qtrmr\" (UniqueName: \"kubernetes.io/projected/09fc64e5-4201-410d-a764-789e1dc85ac0-kube-api-access-qtrmr\") pod \"openstackclient\" (UID: \"09fc64e5-4201-410d-a764-789e1dc85ac0\") " pod="openstack/openstackclient" Mar 09 09:27:53 crc kubenswrapper[4792]: I0309 09:27:53.945233 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 09 09:27:54 crc kubenswrapper[4792]: E0309 09:27:54.047430 4792 log.go:32] "RunPodSandbox from runtime service failed" err=< Mar 09 09:27:54 crc kubenswrapper[4792]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_openstackclient_openstack_b91f677a-9bb7-4884-a46b-f37296e4454f_0(15cf53b0520e12fe25e5efca054c496cb827e2243a2c0dda66ccf47645374dc7): error adding pod openstack_openstackclient to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"15cf53b0520e12fe25e5efca054c496cb827e2243a2c0dda66ccf47645374dc7" Netns:"/var/run/netns/c0fc95e1-f24c-4272-8391-ffdd8184c639" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openstack;K8S_POD_NAME=openstackclient;K8S_POD_INFRA_CONTAINER_ID=15cf53b0520e12fe25e5efca054c496cb827e2243a2c0dda66ccf47645374dc7;K8S_POD_UID=b91f677a-9bb7-4884-a46b-f37296e4454f" Path:"" ERRORED: error configuring pod [openstack/openstackclient] networking: Multus: [openstack/openstackclient/b91f677a-9bb7-4884-a46b-f37296e4454f]: expected pod UID "b91f677a-9bb7-4884-a46b-f37296e4454f" but got "09fc64e5-4201-410d-a764-789e1dc85ac0" from Kube API Mar 09 09:27:54 crc kubenswrapper[4792]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 09 09:27:54 crc kubenswrapper[4792]: > Mar 09 09:27:54 crc kubenswrapper[4792]: E0309 09:27:54.047518 4792 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Mar 09 09:27:54 crc kubenswrapper[4792]: rpc error: code = Unknown desc = failed to create pod network sandbox 
k8s_openstackclient_openstack_b91f677a-9bb7-4884-a46b-f37296e4454f_0(15cf53b0520e12fe25e5efca054c496cb827e2243a2c0dda66ccf47645374dc7): error adding pod openstack_openstackclient to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"15cf53b0520e12fe25e5efca054c496cb827e2243a2c0dda66ccf47645374dc7" Netns:"/var/run/netns/c0fc95e1-f24c-4272-8391-ffdd8184c639" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openstack;K8S_POD_NAME=openstackclient;K8S_POD_INFRA_CONTAINER_ID=15cf53b0520e12fe25e5efca054c496cb827e2243a2c0dda66ccf47645374dc7;K8S_POD_UID=b91f677a-9bb7-4884-a46b-f37296e4454f" Path:"" ERRORED: error configuring pod [openstack/openstackclient] networking: Multus: [openstack/openstackclient/b91f677a-9bb7-4884-a46b-f37296e4454f]: expected pod UID "b91f677a-9bb7-4884-a46b-f37296e4454f" but got "09fc64e5-4201-410d-a764-789e1dc85ac0" from Kube API Mar 09 09:27:54 crc kubenswrapper[4792]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 09 09:27:54 crc kubenswrapper[4792]: > pod="openstack/openstackclient" Mar 09 09:27:54 crc kubenswrapper[4792]: I0309 09:27:54.073168 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-7b89dcf9c6-x4cgw" Mar 09 09:27:54 crc kubenswrapper[4792]: I0309 09:27:54.124851 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ecc21e88-9435-44e8-9fae-4838ae5e46ce-public-tls-certs\") pod \"ecc21e88-9435-44e8-9fae-4838ae5e46ce\" (UID: \"ecc21e88-9435-44e8-9fae-4838ae5e46ce\") " Mar 09 09:27:54 crc kubenswrapper[4792]: I0309 09:27:54.124898 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ecc21e88-9435-44e8-9fae-4838ae5e46ce-combined-ca-bundle\") pod \"ecc21e88-9435-44e8-9fae-4838ae5e46ce\" (UID: \"ecc21e88-9435-44e8-9fae-4838ae5e46ce\") " Mar 09 09:27:54 crc kubenswrapper[4792]: I0309 09:27:54.124950 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ecc21e88-9435-44e8-9fae-4838ae5e46ce-config-data\") pod \"ecc21e88-9435-44e8-9fae-4838ae5e46ce\" (UID: \"ecc21e88-9435-44e8-9fae-4838ae5e46ce\") " Mar 09 09:27:54 crc kubenswrapper[4792]: I0309 09:27:54.124993 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ecc21e88-9435-44e8-9fae-4838ae5e46ce-logs\") pod \"ecc21e88-9435-44e8-9fae-4838ae5e46ce\" (UID: \"ecc21e88-9435-44e8-9fae-4838ae5e46ce\") " Mar 09 09:27:54 crc kubenswrapper[4792]: I0309 09:27:54.125019 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ecc21e88-9435-44e8-9fae-4838ae5e46ce-scripts\") pod \"ecc21e88-9435-44e8-9fae-4838ae5e46ce\" (UID: \"ecc21e88-9435-44e8-9fae-4838ae5e46ce\") " Mar 09 09:27:54 crc kubenswrapper[4792]: I0309 09:27:54.125051 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6rvjj\" (UniqueName: 
\"kubernetes.io/projected/ecc21e88-9435-44e8-9fae-4838ae5e46ce-kube-api-access-6rvjj\") pod \"ecc21e88-9435-44e8-9fae-4838ae5e46ce\" (UID: \"ecc21e88-9435-44e8-9fae-4838ae5e46ce\") " Mar 09 09:27:54 crc kubenswrapper[4792]: I0309 09:27:54.125158 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ecc21e88-9435-44e8-9fae-4838ae5e46ce-internal-tls-certs\") pod \"ecc21e88-9435-44e8-9fae-4838ae5e46ce\" (UID: \"ecc21e88-9435-44e8-9fae-4838ae5e46ce\") " Mar 09 09:27:54 crc kubenswrapper[4792]: I0309 09:27:54.142353 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ecc21e88-9435-44e8-9fae-4838ae5e46ce-logs" (OuterVolumeSpecName: "logs") pod "ecc21e88-9435-44e8-9fae-4838ae5e46ce" (UID: "ecc21e88-9435-44e8-9fae-4838ae5e46ce"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:27:54 crc kubenswrapper[4792]: I0309 09:27:54.161635 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ecc21e88-9435-44e8-9fae-4838ae5e46ce-scripts" (OuterVolumeSpecName: "scripts") pod "ecc21e88-9435-44e8-9fae-4838ae5e46ce" (UID: "ecc21e88-9435-44e8-9fae-4838ae5e46ce"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:27:54 crc kubenswrapper[4792]: I0309 09:27:54.163375 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ecc21e88-9435-44e8-9fae-4838ae5e46ce-kube-api-access-6rvjj" (OuterVolumeSpecName: "kube-api-access-6rvjj") pod "ecc21e88-9435-44e8-9fae-4838ae5e46ce" (UID: "ecc21e88-9435-44e8-9fae-4838ae5e46ce"). InnerVolumeSpecName "kube-api-access-6rvjj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:27:54 crc kubenswrapper[4792]: I0309 09:27:54.195920 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"c8c68c2d-fe77-41af-b4f4-8f83079bf316","Type":"ContainerStarted","Data":"5ce0e64bd17820dc7743d2e079eac780551aa57c08738bc031d3ff07eb315b54"} Mar 09 09:27:54 crc kubenswrapper[4792]: I0309 09:27:54.202710 4792 generic.go:334] "Generic (PLEG): container finished" podID="ecc21e88-9435-44e8-9fae-4838ae5e46ce" containerID="02eafe46662cfecf14424d3472d3bced8851ad529264bdac4f5ef7f030a9f962" exitCode=0 Mar 09 09:27:54 crc kubenswrapper[4792]: I0309 09:27:54.202791 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Mar 09 09:27:54 crc kubenswrapper[4792]: I0309 09:27:54.203427 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-7b89dcf9c6-x4cgw" Mar 09 09:27:54 crc kubenswrapper[4792]: I0309 09:27:54.203869 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7b89dcf9c6-x4cgw" event={"ID":"ecc21e88-9435-44e8-9fae-4838ae5e46ce","Type":"ContainerDied","Data":"02eafe46662cfecf14424d3472d3bced8851ad529264bdac4f5ef7f030a9f962"} Mar 09 09:27:54 crc kubenswrapper[4792]: I0309 09:27:54.203909 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7b89dcf9c6-x4cgw" event={"ID":"ecc21e88-9435-44e8-9fae-4838ae5e46ce","Type":"ContainerDied","Data":"de44807cdd13b5a9e4b86d076d4d2049fa9ae0044927489327ef71a9f9176c60"} Mar 09 09:27:54 crc kubenswrapper[4792]: I0309 09:27:54.203930 4792 scope.go:117] "RemoveContainer" containerID="02eafe46662cfecf14424d3472d3bced8851ad529264bdac4f5ef7f030a9f962" Mar 09 09:27:54 crc kubenswrapper[4792]: I0309 09:27:54.227465 4792 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ecc21e88-9435-44e8-9fae-4838ae5e46ce-logs\") on node 
\"crc\" DevicePath \"\"" Mar 09 09:27:54 crc kubenswrapper[4792]: I0309 09:27:54.227492 4792 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ecc21e88-9435-44e8-9fae-4838ae5e46ce-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 09:27:54 crc kubenswrapper[4792]: I0309 09:27:54.227505 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6rvjj\" (UniqueName: \"kubernetes.io/projected/ecc21e88-9435-44e8-9fae-4838ae5e46ce-kube-api-access-6rvjj\") on node \"crc\" DevicePath \"\"" Mar 09 09:27:54 crc kubenswrapper[4792]: I0309 09:27:54.292275 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Mar 09 09:27:54 crc kubenswrapper[4792]: I0309 09:27:54.296185 4792 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="b91f677a-9bb7-4884-a46b-f37296e4454f" podUID="09fc64e5-4201-410d-a764-789e1dc85ac0" Mar 09 09:27:54 crc kubenswrapper[4792]: I0309 09:27:54.351260 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ecc21e88-9435-44e8-9fae-4838ae5e46ce-config-data" (OuterVolumeSpecName: "config-data") pod "ecc21e88-9435-44e8-9fae-4838ae5e46ce" (UID: "ecc21e88-9435-44e8-9fae-4838ae5e46ce"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:27:54 crc kubenswrapper[4792]: I0309 09:27:54.394480 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ecc21e88-9435-44e8-9fae-4838ae5e46ce-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ecc21e88-9435-44e8-9fae-4838ae5e46ce" (UID: "ecc21e88-9435-44e8-9fae-4838ae5e46ce"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:27:54 crc kubenswrapper[4792]: I0309 09:27:54.437662 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-scqkv\" (UniqueName: \"kubernetes.io/projected/b91f677a-9bb7-4884-a46b-f37296e4454f-kube-api-access-scqkv\") pod \"b91f677a-9bb7-4884-a46b-f37296e4454f\" (UID: \"b91f677a-9bb7-4884-a46b-f37296e4454f\") " Mar 09 09:27:54 crc kubenswrapper[4792]: I0309 09:27:54.437790 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b91f677a-9bb7-4884-a46b-f37296e4454f-combined-ca-bundle\") pod \"b91f677a-9bb7-4884-a46b-f37296e4454f\" (UID: \"b91f677a-9bb7-4884-a46b-f37296e4454f\") " Mar 09 09:27:54 crc kubenswrapper[4792]: I0309 09:27:54.437840 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/b91f677a-9bb7-4884-a46b-f37296e4454f-openstack-config\") pod \"b91f677a-9bb7-4884-a46b-f37296e4454f\" (UID: \"b91f677a-9bb7-4884-a46b-f37296e4454f\") " Mar 09 09:27:54 crc kubenswrapper[4792]: I0309 09:27:54.437882 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/b91f677a-9bb7-4884-a46b-f37296e4454f-openstack-config-secret\") pod \"b91f677a-9bb7-4884-a46b-f37296e4454f\" (UID: \"b91f677a-9bb7-4884-a46b-f37296e4454f\") " Mar 09 09:27:54 crc kubenswrapper[4792]: I0309 09:27:54.438423 4792 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ecc21e88-9435-44e8-9fae-4838ae5e46ce-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 09:27:54 crc kubenswrapper[4792]: I0309 09:27:54.438436 4792 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/ecc21e88-9435-44e8-9fae-4838ae5e46ce-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 09:27:54 crc kubenswrapper[4792]: I0309 09:27:54.447062 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b91f677a-9bb7-4884-a46b-f37296e4454f-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "b91f677a-9bb7-4884-a46b-f37296e4454f" (UID: "b91f677a-9bb7-4884-a46b-f37296e4454f"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:27:54 crc kubenswrapper[4792]: I0309 09:27:54.513238 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b91f677a-9bb7-4884-a46b-f37296e4454f-kube-api-access-scqkv" (OuterVolumeSpecName: "kube-api-access-scqkv") pod "b91f677a-9bb7-4884-a46b-f37296e4454f" (UID: "b91f677a-9bb7-4884-a46b-f37296e4454f"). InnerVolumeSpecName "kube-api-access-scqkv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:27:54 crc kubenswrapper[4792]: I0309 09:27:54.513351 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b91f677a-9bb7-4884-a46b-f37296e4454f-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "b91f677a-9bb7-4884-a46b-f37296e4454f" (UID: "b91f677a-9bb7-4884-a46b-f37296e4454f"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:27:54 crc kubenswrapper[4792]: I0309 09:27:54.513423 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b91f677a-9bb7-4884-a46b-f37296e4454f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b91f677a-9bb7-4884-a46b-f37296e4454f" (UID: "b91f677a-9bb7-4884-a46b-f37296e4454f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:27:54 crc kubenswrapper[4792]: I0309 09:27:54.513479 4792 scope.go:117] "RemoveContainer" containerID="70031f87f3291e5b64587bd61d4148f9975d8f1294b6e77ac12d0bc16de291bc" Mar 09 09:27:54 crc kubenswrapper[4792]: I0309 09:27:54.520978 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ecc21e88-9435-44e8-9fae-4838ae5e46ce-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "ecc21e88-9435-44e8-9fae-4838ae5e46ce" (UID: "ecc21e88-9435-44e8-9fae-4838ae5e46ce"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:27:54 crc kubenswrapper[4792]: I0309 09:27:54.543358 4792 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ecc21e88-9435-44e8-9fae-4838ae5e46ce-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 09 09:27:54 crc kubenswrapper[4792]: I0309 09:27:54.543411 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-scqkv\" (UniqueName: \"kubernetes.io/projected/b91f677a-9bb7-4884-a46b-f37296e4454f-kube-api-access-scqkv\") on node \"crc\" DevicePath \"\"" Mar 09 09:27:54 crc kubenswrapper[4792]: I0309 09:27:54.543425 4792 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b91f677a-9bb7-4884-a46b-f37296e4454f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 09:27:54 crc kubenswrapper[4792]: I0309 09:27:54.543436 4792 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/b91f677a-9bb7-4884-a46b-f37296e4454f-openstack-config\") on node \"crc\" DevicePath \"\"" Mar 09 09:27:54 crc kubenswrapper[4792]: I0309 09:27:54.543448 4792 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: 
\"kubernetes.io/secret/b91f677a-9bb7-4884-a46b-f37296e4454f-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Mar 09 09:27:54 crc kubenswrapper[4792]: I0309 09:27:54.593113 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ecc21e88-9435-44e8-9fae-4838ae5e46ce-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "ecc21e88-9435-44e8-9fae-4838ae5e46ce" (UID: "ecc21e88-9435-44e8-9fae-4838ae5e46ce"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:27:54 crc kubenswrapper[4792]: I0309 09:27:54.647707 4792 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ecc21e88-9435-44e8-9fae-4838ae5e46ce-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 09 09:27:54 crc kubenswrapper[4792]: I0309 09:27:54.667041 4792 scope.go:117] "RemoveContainer" containerID="02eafe46662cfecf14424d3472d3bced8851ad529264bdac4f5ef7f030a9f962" Mar 09 09:27:54 crc kubenswrapper[4792]: E0309 09:27:54.667756 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"02eafe46662cfecf14424d3472d3bced8851ad529264bdac4f5ef7f030a9f962\": container with ID starting with 02eafe46662cfecf14424d3472d3bced8851ad529264bdac4f5ef7f030a9f962 not found: ID does not exist" containerID="02eafe46662cfecf14424d3472d3bced8851ad529264bdac4f5ef7f030a9f962" Mar 09 09:27:54 crc kubenswrapper[4792]: I0309 09:27:54.667800 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02eafe46662cfecf14424d3472d3bced8851ad529264bdac4f5ef7f030a9f962"} err="failed to get container status \"02eafe46662cfecf14424d3472d3bced8851ad529264bdac4f5ef7f030a9f962\": rpc error: code = NotFound desc = could not find container \"02eafe46662cfecf14424d3472d3bced8851ad529264bdac4f5ef7f030a9f962\": container with ID starting with 
02eafe46662cfecf14424d3472d3bced8851ad529264bdac4f5ef7f030a9f962 not found: ID does not exist" Mar 09 09:27:54 crc kubenswrapper[4792]: I0309 09:27:54.667820 4792 scope.go:117] "RemoveContainer" containerID="70031f87f3291e5b64587bd61d4148f9975d8f1294b6e77ac12d0bc16de291bc" Mar 09 09:27:54 crc kubenswrapper[4792]: E0309 09:27:54.668993 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"70031f87f3291e5b64587bd61d4148f9975d8f1294b6e77ac12d0bc16de291bc\": container with ID starting with 70031f87f3291e5b64587bd61d4148f9975d8f1294b6e77ac12d0bc16de291bc not found: ID does not exist" containerID="70031f87f3291e5b64587bd61d4148f9975d8f1294b6e77ac12d0bc16de291bc" Mar 09 09:27:54 crc kubenswrapper[4792]: I0309 09:27:54.669014 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"70031f87f3291e5b64587bd61d4148f9975d8f1294b6e77ac12d0bc16de291bc"} err="failed to get container status \"70031f87f3291e5b64587bd61d4148f9975d8f1294b6e77ac12d0bc16de291bc\": rpc error: code = NotFound desc = could not find container \"70031f87f3291e5b64587bd61d4148f9975d8f1294b6e77ac12d0bc16de291bc\": container with ID starting with 70031f87f3291e5b64587bd61d4148f9975d8f1294b6e77ac12d0bc16de291bc not found: ID does not exist" Mar 09 09:27:54 crc kubenswrapper[4792]: I0309 09:27:54.813128 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 09 09:27:54 crc kubenswrapper[4792]: I0309 09:27:54.865771 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-7b89dcf9c6-x4cgw"] Mar 09 09:27:54 crc kubenswrapper[4792]: I0309 09:27:54.874429 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-7b89dcf9c6-x4cgw"] Mar 09 09:27:55 crc kubenswrapper[4792]: I0309 09:27:55.214701 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" 
event={"ID":"09fc64e5-4201-410d-a764-789e1dc85ac0","Type":"ContainerStarted","Data":"ece83b75671c3b0bb11a2acd1e879924e7a77bced65ef6481b6853d37f5981cc"} Mar 09 09:27:55 crc kubenswrapper[4792]: I0309 09:27:55.216132 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Mar 09 09:27:55 crc kubenswrapper[4792]: I0309 09:27:55.220668 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"c8c68c2d-fe77-41af-b4f4-8f83079bf316","Type":"ContainerStarted","Data":"3ae5588f1b44df3d07654a99c65272d72574ea2d960ce09a5f22f4ef6b0adb98"} Mar 09 09:27:55 crc kubenswrapper[4792]: I0309 09:27:55.229906 4792 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="b91f677a-9bb7-4884-a46b-f37296e4454f" podUID="09fc64e5-4201-410d-a764-789e1dc85ac0" Mar 09 09:27:55 crc kubenswrapper[4792]: I0309 09:27:55.722309 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b91f677a-9bb7-4884-a46b-f37296e4454f" path="/var/lib/kubelet/pods/b91f677a-9bb7-4884-a46b-f37296e4454f/volumes" Mar 09 09:27:55 crc kubenswrapper[4792]: I0309 09:27:55.723377 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ecc21e88-9435-44e8-9fae-4838ae5e46ce" path="/var/lib/kubelet/pods/ecc21e88-9435-44e8-9fae-4838ae5e46ce/volumes" Mar 09 09:27:56 crc kubenswrapper[4792]: I0309 09:27:56.157128 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-755d469bb8-fhbxg" podUID="00e48c59-681a-4940-a088-c66078a15bb3" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.153:9311/healthcheck\": read tcp 10.217.0.2:39006->10.217.0.153:9311: read: connection reset by peer" Mar 09 09:27:56 crc kubenswrapper[4792]: I0309 09:27:56.157697 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-755d469bb8-fhbxg" 
podUID="00e48c59-681a-4940-a088-c66078a15bb3" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.153:9311/healthcheck\": read tcp 10.217.0.2:38994->10.217.0.153:9311: read: connection reset by peer" Mar 09 09:27:56 crc kubenswrapper[4792]: I0309 09:27:56.229865 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"c8c68c2d-fe77-41af-b4f4-8f83079bf316","Type":"ContainerStarted","Data":"c09257f64fe0284b27b17e062b34e05bec29ae58ebaaacff39d2e22de2e36807"} Mar 09 09:27:56 crc kubenswrapper[4792]: I0309 09:27:56.238556 4792 generic.go:334] "Generic (PLEG): container finished" podID="00e48c59-681a-4940-a088-c66078a15bb3" containerID="f24152846dc01f72d7db598e8a5a5ddcae301b7e3a4c186283f7fd5a61a395e1" exitCode=0 Mar 09 09:27:56 crc kubenswrapper[4792]: I0309 09:27:56.238606 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-755d469bb8-fhbxg" event={"ID":"00e48c59-681a-4940-a088-c66078a15bb3","Type":"ContainerDied","Data":"f24152846dc01f72d7db598e8a5a5ddcae301b7e3a4c186283f7fd5a61a395e1"} Mar 09 09:27:56 crc kubenswrapper[4792]: I0309 09:27:56.259545 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=4.259522275 podStartE2EDuration="4.259522275s" podCreationTimestamp="2026-03-09 09:27:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:27:56.251595875 +0000 UTC m=+1241.281796627" watchObservedRunningTime="2026-03-09 09:27:56.259522275 +0000 UTC m=+1241.289723027" Mar 09 09:27:56 crc kubenswrapper[4792]: I0309 09:27:56.684147 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Mar 09 09:27:56 crc kubenswrapper[4792]: I0309 09:27:56.716002 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-755d469bb8-fhbxg" Mar 09 09:27:56 crc kubenswrapper[4792]: I0309 09:27:56.847828 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/00e48c59-681a-4940-a088-c66078a15bb3-config-data-custom\") pod \"00e48c59-681a-4940-a088-c66078a15bb3\" (UID: \"00e48c59-681a-4940-a088-c66078a15bb3\") " Mar 09 09:27:56 crc kubenswrapper[4792]: I0309 09:27:56.847980 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00e48c59-681a-4940-a088-c66078a15bb3-combined-ca-bundle\") pod \"00e48c59-681a-4940-a088-c66078a15bb3\" (UID: \"00e48c59-681a-4940-a088-c66078a15bb3\") " Mar 09 09:27:56 crc kubenswrapper[4792]: I0309 09:27:56.848020 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kkzhd\" (UniqueName: \"kubernetes.io/projected/00e48c59-681a-4940-a088-c66078a15bb3-kube-api-access-kkzhd\") pod \"00e48c59-681a-4940-a088-c66078a15bb3\" (UID: \"00e48c59-681a-4940-a088-c66078a15bb3\") " Mar 09 09:27:56 crc kubenswrapper[4792]: I0309 09:27:56.848144 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/00e48c59-681a-4940-a088-c66078a15bb3-logs\") pod \"00e48c59-681a-4940-a088-c66078a15bb3\" (UID: \"00e48c59-681a-4940-a088-c66078a15bb3\") " Mar 09 09:27:56 crc kubenswrapper[4792]: I0309 09:27:56.848195 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/00e48c59-681a-4940-a088-c66078a15bb3-config-data\") pod \"00e48c59-681a-4940-a088-c66078a15bb3\" (UID: \"00e48c59-681a-4940-a088-c66078a15bb3\") " Mar 09 09:27:56 crc kubenswrapper[4792]: I0309 09:27:56.850282 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/00e48c59-681a-4940-a088-c66078a15bb3-logs" (OuterVolumeSpecName: "logs") pod "00e48c59-681a-4940-a088-c66078a15bb3" (UID: "00e48c59-681a-4940-a088-c66078a15bb3"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:27:56 crc kubenswrapper[4792]: I0309 09:27:56.868208 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00e48c59-681a-4940-a088-c66078a15bb3-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "00e48c59-681a-4940-a088-c66078a15bb3" (UID: "00e48c59-681a-4940-a088-c66078a15bb3"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:27:56 crc kubenswrapper[4792]: I0309 09:27:56.895443 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/00e48c59-681a-4940-a088-c66078a15bb3-kube-api-access-kkzhd" (OuterVolumeSpecName: "kube-api-access-kkzhd") pod "00e48c59-681a-4940-a088-c66078a15bb3" (UID: "00e48c59-681a-4940-a088-c66078a15bb3"). InnerVolumeSpecName "kube-api-access-kkzhd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:27:56 crc kubenswrapper[4792]: I0309 09:27:56.958133 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kkzhd\" (UniqueName: \"kubernetes.io/projected/00e48c59-681a-4940-a088-c66078a15bb3-kube-api-access-kkzhd\") on node \"crc\" DevicePath \"\"" Mar 09 09:27:56 crc kubenswrapper[4792]: I0309 09:27:56.958173 4792 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/00e48c59-681a-4940-a088-c66078a15bb3-logs\") on node \"crc\" DevicePath \"\"" Mar 09 09:27:56 crc kubenswrapper[4792]: I0309 09:27:56.958185 4792 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/00e48c59-681a-4940-a088-c66078a15bb3-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 09 09:27:56 crc kubenswrapper[4792]: I0309 09:27:56.958400 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00e48c59-681a-4940-a088-c66078a15bb3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "00e48c59-681a-4940-a088-c66078a15bb3" (UID: "00e48c59-681a-4940-a088-c66078a15bb3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:27:57 crc kubenswrapper[4792]: I0309 09:27:57.023928 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00e48c59-681a-4940-a088-c66078a15bb3-config-data" (OuterVolumeSpecName: "config-data") pod "00e48c59-681a-4940-a088-c66078a15bb3" (UID: "00e48c59-681a-4940-a088-c66078a15bb3"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:27:57 crc kubenswrapper[4792]: I0309 09:27:57.060308 4792 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/00e48c59-681a-4940-a088-c66078a15bb3-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 09:27:57 crc kubenswrapper[4792]: I0309 09:27:57.060339 4792 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00e48c59-681a-4940-a088-c66078a15bb3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 09:27:57 crc kubenswrapper[4792]: I0309 09:27:57.261102 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-755d469bb8-fhbxg" Mar 09 09:27:57 crc kubenswrapper[4792]: I0309 09:27:57.261260 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-755d469bb8-fhbxg" event={"ID":"00e48c59-681a-4940-a088-c66078a15bb3","Type":"ContainerDied","Data":"8f61f3604571733d087e67699948ee5771395b61a67f15ab18c9793f3b69bf65"} Mar 09 09:27:57 crc kubenswrapper[4792]: I0309 09:27:57.261315 4792 scope.go:117] "RemoveContainer" containerID="f24152846dc01f72d7db598e8a5a5ddcae301b7e3a4c186283f7fd5a61a395e1" Mar 09 09:27:57 crc kubenswrapper[4792]: I0309 09:27:57.304157 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-755d469bb8-fhbxg"] Mar 09 09:27:57 crc kubenswrapper[4792]: I0309 09:27:57.311772 4792 scope.go:117] "RemoveContainer" containerID="e23c1ade70b8647a0748f490afbcf6c7ed21a136db3f14025a113637646e446e" Mar 09 09:27:57 crc kubenswrapper[4792]: I0309 09:27:57.318418 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-755d469bb8-fhbxg"] Mar 09 09:27:57 crc kubenswrapper[4792]: I0309 09:27:57.660188 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Mar 09 09:27:57 crc kubenswrapper[4792]: I0309 
09:27:57.678399 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="00e48c59-681a-4940-a088-c66078a15bb3" path="/var/lib/kubelet/pods/00e48c59-681a-4940-a088-c66078a15bb3/volumes" Mar 09 09:27:58 crc kubenswrapper[4792]: I0309 09:27:58.561294 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Mar 09 09:27:59 crc kubenswrapper[4792]: I0309 09:27:59.295166 4792 generic.go:334] "Generic (PLEG): container finished" podID="4a445589-e577-4db2-a287-c4c378a13030" containerID="d8e4f48a3f8d48b098af90832eb2aed9d3856576ca12155e6cda43293deec58b" exitCode=0 Mar 09 09:27:59 crc kubenswrapper[4792]: I0309 09:27:59.295495 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7f75f4656d-dmz5s" event={"ID":"4a445589-e577-4db2-a287-c4c378a13030","Type":"ContainerDied","Data":"d8e4f48a3f8d48b098af90832eb2aed9d3856576ca12155e6cda43293deec58b"} Mar 09 09:27:59 crc kubenswrapper[4792]: I0309 09:27:59.415447 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-7f75f4656d-dmz5s" Mar 09 09:27:59 crc kubenswrapper[4792]: I0309 09:27:59.518397 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4a445589-e577-4db2-a287-c4c378a13030-ovndb-tls-certs\") pod \"4a445589-e577-4db2-a287-c4c378a13030\" (UID: \"4a445589-e577-4db2-a287-c4c378a13030\") " Mar 09 09:27:59 crc kubenswrapper[4792]: I0309 09:27:59.518616 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/4a445589-e577-4db2-a287-c4c378a13030-httpd-config\") pod \"4a445589-e577-4db2-a287-c4c378a13030\" (UID: \"4a445589-e577-4db2-a287-c4c378a13030\") " Mar 09 09:27:59 crc kubenswrapper[4792]: I0309 09:27:59.518649 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/4a445589-e577-4db2-a287-c4c378a13030-config\") pod \"4a445589-e577-4db2-a287-c4c378a13030\" (UID: \"4a445589-e577-4db2-a287-c4c378a13030\") " Mar 09 09:27:59 crc kubenswrapper[4792]: I0309 09:27:59.518699 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qv8wm\" (UniqueName: \"kubernetes.io/projected/4a445589-e577-4db2-a287-c4c378a13030-kube-api-access-qv8wm\") pod \"4a445589-e577-4db2-a287-c4c378a13030\" (UID: \"4a445589-e577-4db2-a287-c4c378a13030\") " Mar 09 09:27:59 crc kubenswrapper[4792]: I0309 09:27:59.518754 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a445589-e577-4db2-a287-c4c378a13030-combined-ca-bundle\") pod \"4a445589-e577-4db2-a287-c4c378a13030\" (UID: \"4a445589-e577-4db2-a287-c4c378a13030\") " Mar 09 09:27:59 crc kubenswrapper[4792]: I0309 09:27:59.549438 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/4a445589-e577-4db2-a287-c4c378a13030-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "4a445589-e577-4db2-a287-c4c378a13030" (UID: "4a445589-e577-4db2-a287-c4c378a13030"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:27:59 crc kubenswrapper[4792]: I0309 09:27:59.589265 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a445589-e577-4db2-a287-c4c378a13030-kube-api-access-qv8wm" (OuterVolumeSpecName: "kube-api-access-qv8wm") pod "4a445589-e577-4db2-a287-c4c378a13030" (UID: "4a445589-e577-4db2-a287-c4c378a13030"). InnerVolumeSpecName "kube-api-access-qv8wm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:27:59 crc kubenswrapper[4792]: I0309 09:27:59.623347 4792 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/4a445589-e577-4db2-a287-c4c378a13030-httpd-config\") on node \"crc\" DevicePath \"\"" Mar 09 09:27:59 crc kubenswrapper[4792]: I0309 09:27:59.623387 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qv8wm\" (UniqueName: \"kubernetes.io/projected/4a445589-e577-4db2-a287-c4c378a13030-kube-api-access-qv8wm\") on node \"crc\" DevicePath \"\"" Mar 09 09:27:59 crc kubenswrapper[4792]: I0309 09:27:59.633857 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a445589-e577-4db2-a287-c4c378a13030-config" (OuterVolumeSpecName: "config") pod "4a445589-e577-4db2-a287-c4c378a13030" (UID: "4a445589-e577-4db2-a287-c4c378a13030"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:27:59 crc kubenswrapper[4792]: I0309 09:27:59.690790 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a445589-e577-4db2-a287-c4c378a13030-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4a445589-e577-4db2-a287-c4c378a13030" (UID: "4a445589-e577-4db2-a287-c4c378a13030"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:27:59 crc kubenswrapper[4792]: I0309 09:27:59.692689 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a445589-e577-4db2-a287-c4c378a13030-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "4a445589-e577-4db2-a287-c4c378a13030" (UID: "4a445589-e577-4db2-a287-c4c378a13030"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:27:59 crc kubenswrapper[4792]: I0309 09:27:59.724618 4792 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a445589-e577-4db2-a287-c4c378a13030-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 09:27:59 crc kubenswrapper[4792]: I0309 09:27:59.724658 4792 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4a445589-e577-4db2-a287-c4c378a13030-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 09 09:27:59 crc kubenswrapper[4792]: I0309 09:27:59.724669 4792 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/4a445589-e577-4db2-a287-c4c378a13030-config\") on node \"crc\" DevicePath \"\"" Mar 09 09:28:00 crc kubenswrapper[4792]: I0309 09:28:00.140201 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550808-tq64m"] Mar 09 09:28:00 crc kubenswrapper[4792]: E0309 09:28:00.150950 4792 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="ecc21e88-9435-44e8-9fae-4838ae5e46ce" containerName="placement-log" Mar 09 09:28:00 crc kubenswrapper[4792]: I0309 09:28:00.150986 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="ecc21e88-9435-44e8-9fae-4838ae5e46ce" containerName="placement-log" Mar 09 09:28:00 crc kubenswrapper[4792]: E0309 09:28:00.151007 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a445589-e577-4db2-a287-c4c378a13030" containerName="neutron-api" Mar 09 09:28:00 crc kubenswrapper[4792]: I0309 09:28:00.151015 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a445589-e577-4db2-a287-c4c378a13030" containerName="neutron-api" Mar 09 09:28:00 crc kubenswrapper[4792]: E0309 09:28:00.151025 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a445589-e577-4db2-a287-c4c378a13030" containerName="neutron-httpd" Mar 09 09:28:00 crc kubenswrapper[4792]: I0309 09:28:00.151031 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a445589-e577-4db2-a287-c4c378a13030" containerName="neutron-httpd" Mar 09 09:28:00 crc kubenswrapper[4792]: E0309 09:28:00.151045 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00e48c59-681a-4940-a088-c66078a15bb3" containerName="barbican-api" Mar 09 09:28:00 crc kubenswrapper[4792]: I0309 09:28:00.151053 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="00e48c59-681a-4940-a088-c66078a15bb3" containerName="barbican-api" Mar 09 09:28:00 crc kubenswrapper[4792]: E0309 09:28:00.151085 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00e48c59-681a-4940-a088-c66078a15bb3" containerName="barbican-api-log" Mar 09 09:28:00 crc kubenswrapper[4792]: I0309 09:28:00.151091 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="00e48c59-681a-4940-a088-c66078a15bb3" containerName="barbican-api-log" Mar 09 09:28:00 crc kubenswrapper[4792]: E0309 09:28:00.151103 4792 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="ecc21e88-9435-44e8-9fae-4838ae5e46ce" containerName="placement-api" Mar 09 09:28:00 crc kubenswrapper[4792]: I0309 09:28:00.151109 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="ecc21e88-9435-44e8-9fae-4838ae5e46ce" containerName="placement-api" Mar 09 09:28:00 crc kubenswrapper[4792]: I0309 09:28:00.151266 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="ecc21e88-9435-44e8-9fae-4838ae5e46ce" containerName="placement-api" Mar 09 09:28:00 crc kubenswrapper[4792]: I0309 09:28:00.151281 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a445589-e577-4db2-a287-c4c378a13030" containerName="neutron-api" Mar 09 09:28:00 crc kubenswrapper[4792]: I0309 09:28:00.151298 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="00e48c59-681a-4940-a088-c66078a15bb3" containerName="barbican-api" Mar 09 09:28:00 crc kubenswrapper[4792]: I0309 09:28:00.151307 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="00e48c59-681a-4940-a088-c66078a15bb3" containerName="barbican-api-log" Mar 09 09:28:00 crc kubenswrapper[4792]: I0309 09:28:00.151320 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a445589-e577-4db2-a287-c4c378a13030" containerName="neutron-httpd" Mar 09 09:28:00 crc kubenswrapper[4792]: I0309 09:28:00.151335 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="ecc21e88-9435-44e8-9fae-4838ae5e46ce" containerName="placement-log" Mar 09 09:28:00 crc kubenswrapper[4792]: I0309 09:28:00.151951 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550808-tq64m"] Mar 09 09:28:00 crc kubenswrapper[4792]: I0309 09:28:00.152057 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550808-tq64m" Mar 09 09:28:00 crc kubenswrapper[4792]: I0309 09:28:00.158600 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fwclj" Mar 09 09:28:00 crc kubenswrapper[4792]: I0309 09:28:00.158682 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 09:28:00 crc kubenswrapper[4792]: I0309 09:28:00.158621 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 09:28:00 crc kubenswrapper[4792]: I0309 09:28:00.232329 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pw8rr\" (UniqueName: \"kubernetes.io/projected/a78e893e-c5f1-4524-9d89-39624fbaec63-kube-api-access-pw8rr\") pod \"auto-csr-approver-29550808-tq64m\" (UID: \"a78e893e-c5f1-4524-9d89-39624fbaec63\") " pod="openshift-infra/auto-csr-approver-29550808-tq64m" Mar 09 09:28:00 crc kubenswrapper[4792]: I0309 09:28:00.308759 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7f75f4656d-dmz5s" event={"ID":"4a445589-e577-4db2-a287-c4c378a13030","Type":"ContainerDied","Data":"7219fd819421db554ed2048138ef5e1d552e4486a049eb027b18480106a62459"} Mar 09 09:28:00 crc kubenswrapper[4792]: I0309 09:28:00.308819 4792 scope.go:117] "RemoveContainer" containerID="3364f4b684f8772e3c518e60ae97f01621a6ae4977b748ac643f0e5ef03810d7" Mar 09 09:28:00 crc kubenswrapper[4792]: I0309 09:28:00.309005 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-7f75f4656d-dmz5s" Mar 09 09:28:00 crc kubenswrapper[4792]: I0309 09:28:00.333529 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pw8rr\" (UniqueName: \"kubernetes.io/projected/a78e893e-c5f1-4524-9d89-39624fbaec63-kube-api-access-pw8rr\") pod \"auto-csr-approver-29550808-tq64m\" (UID: \"a78e893e-c5f1-4524-9d89-39624fbaec63\") " pod="openshift-infra/auto-csr-approver-29550808-tq64m" Mar 09 09:28:00 crc kubenswrapper[4792]: I0309 09:28:00.354222 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-7f75f4656d-dmz5s"] Mar 09 09:28:00 crc kubenswrapper[4792]: I0309 09:28:00.362365 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pw8rr\" (UniqueName: \"kubernetes.io/projected/a78e893e-c5f1-4524-9d89-39624fbaec63-kube-api-access-pw8rr\") pod \"auto-csr-approver-29550808-tq64m\" (UID: \"a78e893e-c5f1-4524-9d89-39624fbaec63\") " pod="openshift-infra/auto-csr-approver-29550808-tq64m" Mar 09 09:28:00 crc kubenswrapper[4792]: I0309 09:28:00.368377 4792 scope.go:117] "RemoveContainer" containerID="d8e4f48a3f8d48b098af90832eb2aed9d3856576ca12155e6cda43293deec58b" Mar 09 09:28:00 crc kubenswrapper[4792]: I0309 09:28:00.375483 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-7f75f4656d-dmz5s"] Mar 09 09:28:00 crc kubenswrapper[4792]: I0309 09:28:00.472217 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550808-tq64m" Mar 09 09:28:01 crc kubenswrapper[4792]: I0309 09:28:01.037852 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550808-tq64m"] Mar 09 09:28:01 crc kubenswrapper[4792]: W0309 09:28:01.048275 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda78e893e_c5f1_4524_9d89_39624fbaec63.slice/crio-4e23c91ad32af8cd6b453d1606e7b16049780b24ac2db8370914ed0c0d563c47 WatchSource:0}: Error finding container 4e23c91ad32af8cd6b453d1606e7b16049780b24ac2db8370914ed0c0d563c47: Status 404 returned error can't find the container with id 4e23c91ad32af8cd6b453d1606e7b16049780b24ac2db8370914ed0c0d563c47 Mar 09 09:28:01 crc kubenswrapper[4792]: I0309 09:28:01.321949 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550808-tq64m" event={"ID":"a78e893e-c5f1-4524-9d89-39624fbaec63","Type":"ContainerStarted","Data":"4e23c91ad32af8cd6b453d1606e7b16049780b24ac2db8370914ed0c0d563c47"} Mar 09 09:28:01 crc kubenswrapper[4792]: I0309 09:28:01.676141 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4a445589-e577-4db2-a287-c4c378a13030" path="/var/lib/kubelet/pods/4a445589-e577-4db2-a287-c4c378a13030/volumes" Mar 09 09:28:02 crc kubenswrapper[4792]: I0309 09:28:02.345499 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 09 09:28:02 crc kubenswrapper[4792]: I0309 09:28:02.346049 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6b07b74f-754c-43a7-97fb-a0fccb9c5df4" containerName="ceilometer-central-agent" containerID="cri-o://26de45986757fcb8064c19b2c93d3222296aad85238631a200c492e107a4ad75" gracePeriod=30 Mar 09 09:28:02 crc kubenswrapper[4792]: I0309 09:28:02.346487 4792 kuberuntime_container.go:808] "Killing container with a 
grace period" pod="openstack/ceilometer-0" podUID="6b07b74f-754c-43a7-97fb-a0fccb9c5df4" containerName="proxy-httpd" containerID="cri-o://84e0df3f63ac0b463918173b2faafdefac74af5fb2dcfe9081d17d537acbc6ff" gracePeriod=30 Mar 09 09:28:02 crc kubenswrapper[4792]: I0309 09:28:02.346539 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6b07b74f-754c-43a7-97fb-a0fccb9c5df4" containerName="sg-core" containerID="cri-o://2de19365c617c18e8ca70fd82e0b913fa43ed4fa2263c3bfe6c92229347f6be9" gracePeriod=30 Mar 09 09:28:02 crc kubenswrapper[4792]: I0309 09:28:02.346578 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6b07b74f-754c-43a7-97fb-a0fccb9c5df4" containerName="ceilometer-notification-agent" containerID="cri-o://a1defc8ff6c6464e89e8fd4d8f496ae6d26672089dc614aae2dbc712eec7e8b0" gracePeriod=30 Mar 09 09:28:03 crc kubenswrapper[4792]: I0309 09:28:03.001296 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Mar 09 09:28:03 crc kubenswrapper[4792]: I0309 09:28:03.381284 4792 generic.go:334] "Generic (PLEG): container finished" podID="a78e893e-c5f1-4524-9d89-39624fbaec63" containerID="392bee4f37cc0a3737f5478d6db8de17174299f826a2b1710fcd79aae2bc9cd1" exitCode=0 Mar 09 09:28:03 crc kubenswrapper[4792]: I0309 09:28:03.381390 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550808-tq64m" event={"ID":"a78e893e-c5f1-4524-9d89-39624fbaec63","Type":"ContainerDied","Data":"392bee4f37cc0a3737f5478d6db8de17174299f826a2b1710fcd79aae2bc9cd1"} Mar 09 09:28:03 crc kubenswrapper[4792]: I0309 09:28:03.398367 4792 generic.go:334] "Generic (PLEG): container finished" podID="6b07b74f-754c-43a7-97fb-a0fccb9c5df4" containerID="84e0df3f63ac0b463918173b2faafdefac74af5fb2dcfe9081d17d537acbc6ff" exitCode=0 Mar 09 09:28:03 crc kubenswrapper[4792]: I0309 09:28:03.398404 4792 
generic.go:334] "Generic (PLEG): container finished" podID="6b07b74f-754c-43a7-97fb-a0fccb9c5df4" containerID="2de19365c617c18e8ca70fd82e0b913fa43ed4fa2263c3bfe6c92229347f6be9" exitCode=2 Mar 09 09:28:03 crc kubenswrapper[4792]: I0309 09:28:03.398413 4792 generic.go:334] "Generic (PLEG): container finished" podID="6b07b74f-754c-43a7-97fb-a0fccb9c5df4" containerID="a1defc8ff6c6464e89e8fd4d8f496ae6d26672089dc614aae2dbc712eec7e8b0" exitCode=0 Mar 09 09:28:03 crc kubenswrapper[4792]: I0309 09:28:03.398422 4792 generic.go:334] "Generic (PLEG): container finished" podID="6b07b74f-754c-43a7-97fb-a0fccb9c5df4" containerID="26de45986757fcb8064c19b2c93d3222296aad85238631a200c492e107a4ad75" exitCode=0 Mar 09 09:28:03 crc kubenswrapper[4792]: I0309 09:28:03.398449 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6b07b74f-754c-43a7-97fb-a0fccb9c5df4","Type":"ContainerDied","Data":"84e0df3f63ac0b463918173b2faafdefac74af5fb2dcfe9081d17d537acbc6ff"} Mar 09 09:28:03 crc kubenswrapper[4792]: I0309 09:28:03.398479 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6b07b74f-754c-43a7-97fb-a0fccb9c5df4","Type":"ContainerDied","Data":"2de19365c617c18e8ca70fd82e0b913fa43ed4fa2263c3bfe6c92229347f6be9"} Mar 09 09:28:03 crc kubenswrapper[4792]: I0309 09:28:03.398491 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6b07b74f-754c-43a7-97fb-a0fccb9c5df4","Type":"ContainerDied","Data":"a1defc8ff6c6464e89e8fd4d8f496ae6d26672089dc614aae2dbc712eec7e8b0"} Mar 09 09:28:03 crc kubenswrapper[4792]: I0309 09:28:03.398501 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6b07b74f-754c-43a7-97fb-a0fccb9c5df4","Type":"ContainerDied","Data":"26de45986757fcb8064c19b2c93d3222296aad85238631a200c492e107a4ad75"} Mar 09 09:28:03 crc kubenswrapper[4792]: I0309 09:28:03.462989 4792 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/nova-api-db-create-zrzlq"] Mar 09 09:28:03 crc kubenswrapper[4792]: I0309 09:28:03.470603 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-zrzlq" Mar 09 09:28:03 crc kubenswrapper[4792]: I0309 09:28:03.475511 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-zrzlq"] Mar 09 09:28:03 crc kubenswrapper[4792]: I0309 09:28:03.505295 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7qgvl\" (UniqueName: \"kubernetes.io/projected/3ce1d921-17fb-427e-bc2b-9df3487b0e5b-kube-api-access-7qgvl\") pod \"nova-api-db-create-zrzlq\" (UID: \"3ce1d921-17fb-427e-bc2b-9df3487b0e5b\") " pod="openstack/nova-api-db-create-zrzlq" Mar 09 09:28:03 crc kubenswrapper[4792]: I0309 09:28:03.505641 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3ce1d921-17fb-427e-bc2b-9df3487b0e5b-operator-scripts\") pod \"nova-api-db-create-zrzlq\" (UID: \"3ce1d921-17fb-427e-bc2b-9df3487b0e5b\") " pod="openstack/nova-api-db-create-zrzlq" Mar 09 09:28:03 crc kubenswrapper[4792]: I0309 09:28:03.577191 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-8785d"] Mar 09 09:28:03 crc kubenswrapper[4792]: I0309 09:28:03.578956 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-8785d" Mar 09 09:28:03 crc kubenswrapper[4792]: I0309 09:28:03.597256 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-8785d"] Mar 09 09:28:03 crc kubenswrapper[4792]: I0309 09:28:03.616755 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7qgvl\" (UniqueName: \"kubernetes.io/projected/3ce1d921-17fb-427e-bc2b-9df3487b0e5b-kube-api-access-7qgvl\") pod \"nova-api-db-create-zrzlq\" (UID: \"3ce1d921-17fb-427e-bc2b-9df3487b0e5b\") " pod="openstack/nova-api-db-create-zrzlq" Mar 09 09:28:03 crc kubenswrapper[4792]: I0309 09:28:03.616848 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3ce1d921-17fb-427e-bc2b-9df3487b0e5b-operator-scripts\") pod \"nova-api-db-create-zrzlq\" (UID: \"3ce1d921-17fb-427e-bc2b-9df3487b0e5b\") " pod="openstack/nova-api-db-create-zrzlq" Mar 09 09:28:03 crc kubenswrapper[4792]: I0309 09:28:03.617686 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3ce1d921-17fb-427e-bc2b-9df3487b0e5b-operator-scripts\") pod \"nova-api-db-create-zrzlq\" (UID: \"3ce1d921-17fb-427e-bc2b-9df3487b0e5b\") " pod="openstack/nova-api-db-create-zrzlq" Mar 09 09:28:03 crc kubenswrapper[4792]: I0309 09:28:03.651859 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7qgvl\" (UniqueName: \"kubernetes.io/projected/3ce1d921-17fb-427e-bc2b-9df3487b0e5b-kube-api-access-7qgvl\") pod \"nova-api-db-create-zrzlq\" (UID: \"3ce1d921-17fb-427e-bc2b-9df3487b0e5b\") " pod="openstack/nova-api-db-create-zrzlq" Mar 09 09:28:03 crc kubenswrapper[4792]: I0309 09:28:03.700246 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-5698-account-create-update-twzbh"] Mar 09 09:28:03 crc kubenswrapper[4792]: I0309 
09:28:03.701598 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-5698-account-create-update-twzbh" Mar 09 09:28:03 crc kubenswrapper[4792]: I0309 09:28:03.705838 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Mar 09 09:28:03 crc kubenswrapper[4792]: I0309 09:28:03.720701 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cggft\" (UniqueName: \"kubernetes.io/projected/e5d6bee9-1576-4852-a0ca-f4da5d2930f9-kube-api-access-cggft\") pod \"nova-cell0-db-create-8785d\" (UID: \"e5d6bee9-1576-4852-a0ca-f4da5d2930f9\") " pod="openstack/nova-cell0-db-create-8785d" Mar 09 09:28:03 crc kubenswrapper[4792]: I0309 09:28:03.720807 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e5d6bee9-1576-4852-a0ca-f4da5d2930f9-operator-scripts\") pod \"nova-cell0-db-create-8785d\" (UID: \"e5d6bee9-1576-4852-a0ca-f4da5d2930f9\") " pod="openstack/nova-cell0-db-create-8785d" Mar 09 09:28:03 crc kubenswrapper[4792]: I0309 09:28:03.777195 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-5698-account-create-update-twzbh"] Mar 09 09:28:03 crc kubenswrapper[4792]: I0309 09:28:03.793166 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-zrzlq" Mar 09 09:28:03 crc kubenswrapper[4792]: I0309 09:28:03.803927 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-j85dq"] Mar 09 09:28:03 crc kubenswrapper[4792]: I0309 09:28:03.805084 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-j85dq" Mar 09 09:28:03 crc kubenswrapper[4792]: I0309 09:28:03.813974 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-j85dq"] Mar 09 09:28:03 crc kubenswrapper[4792]: I0309 09:28:03.822506 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qksh5\" (UniqueName: \"kubernetes.io/projected/68422fad-6d4a-4d8a-ac66-b68dd7486525-kube-api-access-qksh5\") pod \"nova-api-5698-account-create-update-twzbh\" (UID: \"68422fad-6d4a-4d8a-ac66-b68dd7486525\") " pod="openstack/nova-api-5698-account-create-update-twzbh" Mar 09 09:28:03 crc kubenswrapper[4792]: I0309 09:28:03.822616 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/68422fad-6d4a-4d8a-ac66-b68dd7486525-operator-scripts\") pod \"nova-api-5698-account-create-update-twzbh\" (UID: \"68422fad-6d4a-4d8a-ac66-b68dd7486525\") " pod="openstack/nova-api-5698-account-create-update-twzbh" Mar 09 09:28:03 crc kubenswrapper[4792]: I0309 09:28:03.822739 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cggft\" (UniqueName: \"kubernetes.io/projected/e5d6bee9-1576-4852-a0ca-f4da5d2930f9-kube-api-access-cggft\") pod \"nova-cell0-db-create-8785d\" (UID: \"e5d6bee9-1576-4852-a0ca-f4da5d2930f9\") " pod="openstack/nova-cell0-db-create-8785d" Mar 09 09:28:03 crc kubenswrapper[4792]: I0309 09:28:03.822931 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e5d6bee9-1576-4852-a0ca-f4da5d2930f9-operator-scripts\") pod \"nova-cell0-db-create-8785d\" (UID: \"e5d6bee9-1576-4852-a0ca-f4da5d2930f9\") " pod="openstack/nova-cell0-db-create-8785d" Mar 09 09:28:03 crc kubenswrapper[4792]: I0309 09:28:03.823789 4792 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e5d6bee9-1576-4852-a0ca-f4da5d2930f9-operator-scripts\") pod \"nova-cell0-db-create-8785d\" (UID: \"e5d6bee9-1576-4852-a0ca-f4da5d2930f9\") " pod="openstack/nova-cell0-db-create-8785d" Mar 09 09:28:03 crc kubenswrapper[4792]: I0309 09:28:03.857900 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cggft\" (UniqueName: \"kubernetes.io/projected/e5d6bee9-1576-4852-a0ca-f4da5d2930f9-kube-api-access-cggft\") pod \"nova-cell0-db-create-8785d\" (UID: \"e5d6bee9-1576-4852-a0ca-f4da5d2930f9\") " pod="openstack/nova-cell0-db-create-8785d" Mar 09 09:28:03 crc kubenswrapper[4792]: I0309 09:28:03.882422 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-fba7-account-create-update-lf5dc"] Mar 09 09:28:03 crc kubenswrapper[4792]: I0309 09:28:03.883920 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-fba7-account-create-update-lf5dc" Mar 09 09:28:03 crc kubenswrapper[4792]: I0309 09:28:03.889323 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Mar 09 09:28:03 crc kubenswrapper[4792]: I0309 09:28:03.900221 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-fba7-account-create-update-lf5dc"] Mar 09 09:28:03 crc kubenswrapper[4792]: I0309 09:28:03.922614 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-8785d" Mar 09 09:28:03 crc kubenswrapper[4792]: I0309 09:28:03.924266 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/68422fad-6d4a-4d8a-ac66-b68dd7486525-operator-scripts\") pod \"nova-api-5698-account-create-update-twzbh\" (UID: \"68422fad-6d4a-4d8a-ac66-b68dd7486525\") " pod="openstack/nova-api-5698-account-create-update-twzbh" Mar 09 09:28:03 crc kubenswrapper[4792]: I0309 09:28:03.924392 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6fh67\" (UniqueName: \"kubernetes.io/projected/1927dc12-81aa-463b-b356-3784c72f3245-kube-api-access-6fh67\") pod \"nova-cell0-fba7-account-create-update-lf5dc\" (UID: \"1927dc12-81aa-463b-b356-3784c72f3245\") " pod="openstack/nova-cell0-fba7-account-create-update-lf5dc" Mar 09 09:28:03 crc kubenswrapper[4792]: I0309 09:28:03.924502 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e6bda2b4-31d6-4059-a494-330f9fa2c1f9-operator-scripts\") pod \"nova-cell1-db-create-j85dq\" (UID: \"e6bda2b4-31d6-4059-a494-330f9fa2c1f9\") " pod="openstack/nova-cell1-db-create-j85dq" Mar 09 09:28:03 crc kubenswrapper[4792]: I0309 09:28:03.924571 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6fhzr\" (UniqueName: \"kubernetes.io/projected/e6bda2b4-31d6-4059-a494-330f9fa2c1f9-kube-api-access-6fhzr\") pod \"nova-cell1-db-create-j85dq\" (UID: \"e6bda2b4-31d6-4059-a494-330f9fa2c1f9\") " pod="openstack/nova-cell1-db-create-j85dq" Mar 09 09:28:03 crc kubenswrapper[4792]: I0309 09:28:03.924652 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/1927dc12-81aa-463b-b356-3784c72f3245-operator-scripts\") pod \"nova-cell0-fba7-account-create-update-lf5dc\" (UID: \"1927dc12-81aa-463b-b356-3784c72f3245\") " pod="openstack/nova-cell0-fba7-account-create-update-lf5dc" Mar 09 09:28:03 crc kubenswrapper[4792]: I0309 09:28:03.924780 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qksh5\" (UniqueName: \"kubernetes.io/projected/68422fad-6d4a-4d8a-ac66-b68dd7486525-kube-api-access-qksh5\") pod \"nova-api-5698-account-create-update-twzbh\" (UID: \"68422fad-6d4a-4d8a-ac66-b68dd7486525\") " pod="openstack/nova-api-5698-account-create-update-twzbh" Mar 09 09:28:03 crc kubenswrapper[4792]: I0309 09:28:03.925858 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/68422fad-6d4a-4d8a-ac66-b68dd7486525-operator-scripts\") pod \"nova-api-5698-account-create-update-twzbh\" (UID: \"68422fad-6d4a-4d8a-ac66-b68dd7486525\") " pod="openstack/nova-api-5698-account-create-update-twzbh" Mar 09 09:28:03 crc kubenswrapper[4792]: I0309 09:28:03.949217 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qksh5\" (UniqueName: \"kubernetes.io/projected/68422fad-6d4a-4d8a-ac66-b68dd7486525-kube-api-access-qksh5\") pod \"nova-api-5698-account-create-update-twzbh\" (UID: \"68422fad-6d4a-4d8a-ac66-b68dd7486525\") " pod="openstack/nova-api-5698-account-create-update-twzbh" Mar 09 09:28:04 crc kubenswrapper[4792]: I0309 09:28:04.027902 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e6bda2b4-31d6-4059-a494-330f9fa2c1f9-operator-scripts\") pod \"nova-cell1-db-create-j85dq\" (UID: \"e6bda2b4-31d6-4059-a494-330f9fa2c1f9\") " pod="openstack/nova-cell1-db-create-j85dq" Mar 09 09:28:04 crc kubenswrapper[4792]: I0309 09:28:04.027949 4792 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-6fhzr\" (UniqueName: \"kubernetes.io/projected/e6bda2b4-31d6-4059-a494-330f9fa2c1f9-kube-api-access-6fhzr\") pod \"nova-cell1-db-create-j85dq\" (UID: \"e6bda2b4-31d6-4059-a494-330f9fa2c1f9\") " pod="openstack/nova-cell1-db-create-j85dq" Mar 09 09:28:04 crc kubenswrapper[4792]: I0309 09:28:04.027979 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1927dc12-81aa-463b-b356-3784c72f3245-operator-scripts\") pod \"nova-cell0-fba7-account-create-update-lf5dc\" (UID: \"1927dc12-81aa-463b-b356-3784c72f3245\") " pod="openstack/nova-cell0-fba7-account-create-update-lf5dc" Mar 09 09:28:04 crc kubenswrapper[4792]: I0309 09:28:04.028143 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6fh67\" (UniqueName: \"kubernetes.io/projected/1927dc12-81aa-463b-b356-3784c72f3245-kube-api-access-6fh67\") pod \"nova-cell0-fba7-account-create-update-lf5dc\" (UID: \"1927dc12-81aa-463b-b356-3784c72f3245\") " pod="openstack/nova-cell0-fba7-account-create-update-lf5dc" Mar 09 09:28:04 crc kubenswrapper[4792]: I0309 09:28:04.029199 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1927dc12-81aa-463b-b356-3784c72f3245-operator-scripts\") pod \"nova-cell0-fba7-account-create-update-lf5dc\" (UID: \"1927dc12-81aa-463b-b356-3784c72f3245\") " pod="openstack/nova-cell0-fba7-account-create-update-lf5dc" Mar 09 09:28:04 crc kubenswrapper[4792]: I0309 09:28:04.030423 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e6bda2b4-31d6-4059-a494-330f9fa2c1f9-operator-scripts\") pod \"nova-cell1-db-create-j85dq\" (UID: \"e6bda2b4-31d6-4059-a494-330f9fa2c1f9\") " pod="openstack/nova-cell1-db-create-j85dq" Mar 09 09:28:04 crc kubenswrapper[4792]: 
I0309 09:28:04.047731 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6fhzr\" (UniqueName: \"kubernetes.io/projected/e6bda2b4-31d6-4059-a494-330f9fa2c1f9-kube-api-access-6fhzr\") pod \"nova-cell1-db-create-j85dq\" (UID: \"e6bda2b4-31d6-4059-a494-330f9fa2c1f9\") " pod="openstack/nova-cell1-db-create-j85dq" Mar 09 09:28:04 crc kubenswrapper[4792]: I0309 09:28:04.050431 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6fh67\" (UniqueName: \"kubernetes.io/projected/1927dc12-81aa-463b-b356-3784c72f3245-kube-api-access-6fh67\") pod \"nova-cell0-fba7-account-create-update-lf5dc\" (UID: \"1927dc12-81aa-463b-b356-3784c72f3245\") " pod="openstack/nova-cell0-fba7-account-create-update-lf5dc" Mar 09 09:28:04 crc kubenswrapper[4792]: I0309 09:28:04.058445 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-5698-account-create-update-twzbh" Mar 09 09:28:04 crc kubenswrapper[4792]: I0309 09:28:04.089702 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-4584-account-create-update-8wfnh"] Mar 09 09:28:04 crc kubenswrapper[4792]: I0309 09:28:04.090999 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-4584-account-create-update-8wfnh" Mar 09 09:28:04 crc kubenswrapper[4792]: I0309 09:28:04.094539 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Mar 09 09:28:04 crc kubenswrapper[4792]: I0309 09:28:04.123569 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-j85dq" Mar 09 09:28:04 crc kubenswrapper[4792]: I0309 09:28:04.125213 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-4584-account-create-update-8wfnh"] Mar 09 09:28:04 crc kubenswrapper[4792]: I0309 09:28:04.130380 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d28e5b5b-e371-4a9d-8725-017aa98ac944-operator-scripts\") pod \"nova-cell1-4584-account-create-update-8wfnh\" (UID: \"d28e5b5b-e371-4a9d-8725-017aa98ac944\") " pod="openstack/nova-cell1-4584-account-create-update-8wfnh" Mar 09 09:28:04 crc kubenswrapper[4792]: I0309 09:28:04.130513 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t8d84\" (UniqueName: \"kubernetes.io/projected/d28e5b5b-e371-4a9d-8725-017aa98ac944-kube-api-access-t8d84\") pod \"nova-cell1-4584-account-create-update-8wfnh\" (UID: \"d28e5b5b-e371-4a9d-8725-017aa98ac944\") " pod="openstack/nova-cell1-4584-account-create-update-8wfnh" Mar 09 09:28:04 crc kubenswrapper[4792]: I0309 09:28:04.232634 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t8d84\" (UniqueName: \"kubernetes.io/projected/d28e5b5b-e371-4a9d-8725-017aa98ac944-kube-api-access-t8d84\") pod \"nova-cell1-4584-account-create-update-8wfnh\" (UID: \"d28e5b5b-e371-4a9d-8725-017aa98ac944\") " pod="openstack/nova-cell1-4584-account-create-update-8wfnh" Mar 09 09:28:04 crc kubenswrapper[4792]: I0309 09:28:04.232791 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d28e5b5b-e371-4a9d-8725-017aa98ac944-operator-scripts\") pod \"nova-cell1-4584-account-create-update-8wfnh\" (UID: \"d28e5b5b-e371-4a9d-8725-017aa98ac944\") " pod="openstack/nova-cell1-4584-account-create-update-8wfnh" 
Mar 09 09:28:04 crc kubenswrapper[4792]: I0309 09:28:04.234815 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d28e5b5b-e371-4a9d-8725-017aa98ac944-operator-scripts\") pod \"nova-cell1-4584-account-create-update-8wfnh\" (UID: \"d28e5b5b-e371-4a9d-8725-017aa98ac944\") " pod="openstack/nova-cell1-4584-account-create-update-8wfnh" Mar 09 09:28:04 crc kubenswrapper[4792]: I0309 09:28:04.242230 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-fba7-account-create-update-lf5dc" Mar 09 09:28:04 crc kubenswrapper[4792]: I0309 09:28:04.260177 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t8d84\" (UniqueName: \"kubernetes.io/projected/d28e5b5b-e371-4a9d-8725-017aa98ac944-kube-api-access-t8d84\") pod \"nova-cell1-4584-account-create-update-8wfnh\" (UID: \"d28e5b5b-e371-4a9d-8725-017aa98ac944\") " pod="openstack/nova-cell1-4584-account-create-update-8wfnh" Mar 09 09:28:04 crc kubenswrapper[4792]: I0309 09:28:04.436232 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-4584-account-create-update-8wfnh" Mar 09 09:28:10 crc kubenswrapper[4792]: I0309 09:28:10.127837 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550808-tq64m" Mar 09 09:28:10 crc kubenswrapper[4792]: I0309 09:28:10.269302 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pw8rr\" (UniqueName: \"kubernetes.io/projected/a78e893e-c5f1-4524-9d89-39624fbaec63-kube-api-access-pw8rr\") pod \"a78e893e-c5f1-4524-9d89-39624fbaec63\" (UID: \"a78e893e-c5f1-4524-9d89-39624fbaec63\") " Mar 09 09:28:10 crc kubenswrapper[4792]: I0309 09:28:10.303419 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a78e893e-c5f1-4524-9d89-39624fbaec63-kube-api-access-pw8rr" (OuterVolumeSpecName: "kube-api-access-pw8rr") pod "a78e893e-c5f1-4524-9d89-39624fbaec63" (UID: "a78e893e-c5f1-4524-9d89-39624fbaec63"). InnerVolumeSpecName "kube-api-access-pw8rr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:28:10 crc kubenswrapper[4792]: I0309 09:28:10.373917 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pw8rr\" (UniqueName: \"kubernetes.io/projected/a78e893e-c5f1-4524-9d89-39624fbaec63-kube-api-access-pw8rr\") on node \"crc\" DevicePath \"\"" Mar 09 09:28:10 crc kubenswrapper[4792]: I0309 09:28:10.479731 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550808-tq64m" event={"ID":"a78e893e-c5f1-4524-9d89-39624fbaec63","Type":"ContainerDied","Data":"4e23c91ad32af8cd6b453d1606e7b16049780b24ac2db8370914ed0c0d563c47"} Mar 09 09:28:10 crc kubenswrapper[4792]: I0309 09:28:10.479771 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4e23c91ad32af8cd6b453d1606e7b16049780b24ac2db8370914ed0c0d563c47" Mar 09 09:28:10 crc kubenswrapper[4792]: I0309 09:28:10.479824 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550808-tq64m" Mar 09 09:28:10 crc kubenswrapper[4792]: I0309 09:28:10.482869 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 09 09:28:10 crc kubenswrapper[4792]: I0309 09:28:10.484767 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6b07b74f-754c-43a7-97fb-a0fccb9c5df4","Type":"ContainerDied","Data":"a7b45eeca0866f76a1fd2e490d769df545bf1cba304aa94880eafd14b6aa64ef"} Mar 09 09:28:10 crc kubenswrapper[4792]: I0309 09:28:10.484824 4792 scope.go:117] "RemoveContainer" containerID="84e0df3f63ac0b463918173b2faafdefac74af5fb2dcfe9081d17d537acbc6ff" Mar 09 09:28:10 crc kubenswrapper[4792]: I0309 09:28:10.548131 4792 scope.go:117] "RemoveContainer" containerID="2de19365c617c18e8ca70fd82e0b913fa43ed4fa2263c3bfe6c92229347f6be9" Mar 09 09:28:10 crc kubenswrapper[4792]: I0309 09:28:10.576721 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b07b74f-754c-43a7-97fb-a0fccb9c5df4-combined-ca-bundle\") pod \"6b07b74f-754c-43a7-97fb-a0fccb9c5df4\" (UID: \"6b07b74f-754c-43a7-97fb-a0fccb9c5df4\") " Mar 09 09:28:10 crc kubenswrapper[4792]: I0309 09:28:10.577124 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6b07b74f-754c-43a7-97fb-a0fccb9c5df4-run-httpd\") pod \"6b07b74f-754c-43a7-97fb-a0fccb9c5df4\" (UID: \"6b07b74f-754c-43a7-97fb-a0fccb9c5df4\") " Mar 09 09:28:10 crc kubenswrapper[4792]: I0309 09:28:10.577174 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f9bzn\" (UniqueName: \"kubernetes.io/projected/6b07b74f-754c-43a7-97fb-a0fccb9c5df4-kube-api-access-f9bzn\") pod \"6b07b74f-754c-43a7-97fb-a0fccb9c5df4\" (UID: \"6b07b74f-754c-43a7-97fb-a0fccb9c5df4\") " Mar 09 09:28:10 crc 
kubenswrapper[4792]: I0309 09:28:10.577270 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b07b74f-754c-43a7-97fb-a0fccb9c5df4-config-data\") pod \"6b07b74f-754c-43a7-97fb-a0fccb9c5df4\" (UID: \"6b07b74f-754c-43a7-97fb-a0fccb9c5df4\") " Mar 09 09:28:10 crc kubenswrapper[4792]: I0309 09:28:10.577370 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6b07b74f-754c-43a7-97fb-a0fccb9c5df4-scripts\") pod \"6b07b74f-754c-43a7-97fb-a0fccb9c5df4\" (UID: \"6b07b74f-754c-43a7-97fb-a0fccb9c5df4\") " Mar 09 09:28:10 crc kubenswrapper[4792]: I0309 09:28:10.577404 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6b07b74f-754c-43a7-97fb-a0fccb9c5df4-log-httpd\") pod \"6b07b74f-754c-43a7-97fb-a0fccb9c5df4\" (UID: \"6b07b74f-754c-43a7-97fb-a0fccb9c5df4\") " Mar 09 09:28:10 crc kubenswrapper[4792]: I0309 09:28:10.577432 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6b07b74f-754c-43a7-97fb-a0fccb9c5df4-sg-core-conf-yaml\") pod \"6b07b74f-754c-43a7-97fb-a0fccb9c5df4\" (UID: \"6b07b74f-754c-43a7-97fb-a0fccb9c5df4\") " Mar 09 09:28:10 crc kubenswrapper[4792]: I0309 09:28:10.586628 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6b07b74f-754c-43a7-97fb-a0fccb9c5df4-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "6b07b74f-754c-43a7-97fb-a0fccb9c5df4" (UID: "6b07b74f-754c-43a7-97fb-a0fccb9c5df4"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:28:10 crc kubenswrapper[4792]: I0309 09:28:10.587378 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6b07b74f-754c-43a7-97fb-a0fccb9c5df4-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "6b07b74f-754c-43a7-97fb-a0fccb9c5df4" (UID: "6b07b74f-754c-43a7-97fb-a0fccb9c5df4"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:28:10 crc kubenswrapper[4792]: I0309 09:28:10.600059 4792 scope.go:117] "RemoveContainer" containerID="a1defc8ff6c6464e89e8fd4d8f496ae6d26672089dc614aae2dbc712eec7e8b0" Mar 09 09:28:10 crc kubenswrapper[4792]: I0309 09:28:10.627378 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b07b74f-754c-43a7-97fb-a0fccb9c5df4-scripts" (OuterVolumeSpecName: "scripts") pod "6b07b74f-754c-43a7-97fb-a0fccb9c5df4" (UID: "6b07b74f-754c-43a7-97fb-a0fccb9c5df4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:28:10 crc kubenswrapper[4792]: I0309 09:28:10.627711 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b07b74f-754c-43a7-97fb-a0fccb9c5df4-kube-api-access-f9bzn" (OuterVolumeSpecName: "kube-api-access-f9bzn") pod "6b07b74f-754c-43a7-97fb-a0fccb9c5df4" (UID: "6b07b74f-754c-43a7-97fb-a0fccb9c5df4"). InnerVolumeSpecName "kube-api-access-f9bzn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:28:10 crc kubenswrapper[4792]: I0309 09:28:10.675969 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b07b74f-754c-43a7-97fb-a0fccb9c5df4-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "6b07b74f-754c-43a7-97fb-a0fccb9c5df4" (UID: "6b07b74f-754c-43a7-97fb-a0fccb9c5df4"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:28:10 crc kubenswrapper[4792]: I0309 09:28:10.685387 4792 scope.go:117] "RemoveContainer" containerID="26de45986757fcb8064c19b2c93d3222296aad85238631a200c492e107a4ad75" Mar 09 09:28:10 crc kubenswrapper[4792]: I0309 09:28:10.690994 4792 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6b07b74f-754c-43a7-97fb-a0fccb9c5df4-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 09 09:28:10 crc kubenswrapper[4792]: I0309 09:28:10.691022 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f9bzn\" (UniqueName: \"kubernetes.io/projected/6b07b74f-754c-43a7-97fb-a0fccb9c5df4-kube-api-access-f9bzn\") on node \"crc\" DevicePath \"\"" Mar 09 09:28:10 crc kubenswrapper[4792]: I0309 09:28:10.691032 4792 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6b07b74f-754c-43a7-97fb-a0fccb9c5df4-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 09:28:10 crc kubenswrapper[4792]: I0309 09:28:10.691041 4792 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6b07b74f-754c-43a7-97fb-a0fccb9c5df4-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 09 09:28:10 crc kubenswrapper[4792]: I0309 09:28:10.691050 4792 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6b07b74f-754c-43a7-97fb-a0fccb9c5df4-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 09 09:28:10 crc kubenswrapper[4792]: I0309 09:28:10.793494 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b07b74f-754c-43a7-97fb-a0fccb9c5df4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6b07b74f-754c-43a7-97fb-a0fccb9c5df4" (UID: "6b07b74f-754c-43a7-97fb-a0fccb9c5df4"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:28:10 crc kubenswrapper[4792]: I0309 09:28:10.796769 4792 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b07b74f-754c-43a7-97fb-a0fccb9c5df4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 09:28:10 crc kubenswrapper[4792]: I0309 09:28:10.867340 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b07b74f-754c-43a7-97fb-a0fccb9c5df4-config-data" (OuterVolumeSpecName: "config-data") pod "6b07b74f-754c-43a7-97fb-a0fccb9c5df4" (UID: "6b07b74f-754c-43a7-97fb-a0fccb9c5df4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:28:10 crc kubenswrapper[4792]: I0309 09:28:10.901292 4792 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b07b74f-754c-43a7-97fb-a0fccb9c5df4-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 09:28:11 crc kubenswrapper[4792]: I0309 09:28:11.229622 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550802-wths8"] Mar 09 09:28:11 crc kubenswrapper[4792]: I0309 09:28:11.239522 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550802-wths8"] Mar 09 09:28:11 crc kubenswrapper[4792]: I0309 09:28:11.335557 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-fba7-account-create-update-lf5dc"] Mar 09 09:28:11 crc kubenswrapper[4792]: W0309 09:28:11.340035 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1927dc12_81aa_463b_b356_3784c72f3245.slice/crio-8bba1dc1d8cacb61b7ab27a2317c991e9d5c0b6d568d7552f99bc47434486507 WatchSource:0}: Error finding container 8bba1dc1d8cacb61b7ab27a2317c991e9d5c0b6d568d7552f99bc47434486507: Status 404 returned error can't find the 
container with id 8bba1dc1d8cacb61b7ab27a2317c991e9d5c0b6d568d7552f99bc47434486507 Mar 09 09:28:11 crc kubenswrapper[4792]: I0309 09:28:11.369059 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-8785d"] Mar 09 09:28:11 crc kubenswrapper[4792]: I0309 09:28:11.382150 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-zrzlq"] Mar 09 09:28:11 crc kubenswrapper[4792]: I0309 09:28:11.506238 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-5698-account-create-update-twzbh"] Mar 09 09:28:11 crc kubenswrapper[4792]: I0309 09:28:11.535867 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 09 09:28:11 crc kubenswrapper[4792]: I0309 09:28:11.577984 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-j85dq"] Mar 09 09:28:11 crc kubenswrapper[4792]: I0309 09:28:11.578239 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-zrzlq" event={"ID":"3ce1d921-17fb-427e-bc2b-9df3487b0e5b","Type":"ContainerStarted","Data":"d9bb53a31e9b40c8aad79970ebf914e07fe5dba00637ddc4b844b9a3fa2eb6b5"} Mar 09 09:28:11 crc kubenswrapper[4792]: I0309 09:28:11.587859 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-4584-account-create-update-8wfnh"] Mar 09 09:28:11 crc kubenswrapper[4792]: I0309 09:28:11.603296 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"09fc64e5-4201-410d-a764-789e1dc85ac0","Type":"ContainerStarted","Data":"7b5562efa2555d79245d2302d311a69bdcd82d7640e152d21a799fd6d06fcb30"} Mar 09 09:28:11 crc kubenswrapper[4792]: I0309 09:28:11.609805 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 09 09:28:11 crc kubenswrapper[4792]: I0309 09:28:11.610807 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-cell1-4584-account-create-update-8wfnh" event={"ID":"d28e5b5b-e371-4a9d-8725-017aa98ac944","Type":"ContainerStarted","Data":"3b4509095c9866a7f9275d0f77d8faeb571c55bff7fdbc5d0a827ab3c2e12e57"} Mar 09 09:28:11 crc kubenswrapper[4792]: I0309 09:28:11.611560 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-fba7-account-create-update-lf5dc" event={"ID":"1927dc12-81aa-463b-b356-3784c72f3245","Type":"ContainerStarted","Data":"8bba1dc1d8cacb61b7ab27a2317c991e9d5c0b6d568d7552f99bc47434486507"} Mar 09 09:28:11 crc kubenswrapper[4792]: I0309 09:28:11.612755 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-8785d" event={"ID":"e5d6bee9-1576-4852-a0ca-f4da5d2930f9","Type":"ContainerStarted","Data":"f87e64096bf2e475b4dbf21f1f01a9ec367557955e240a9c87488a846eca851a"} Mar 09 09:28:11 crc kubenswrapper[4792]: I0309 09:28:11.614272 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-5698-account-create-update-twzbh" event={"ID":"68422fad-6d4a-4d8a-ac66-b68dd7486525","Type":"ContainerStarted","Data":"a07a64a5ed7c1ed71ff678f3dc5edc08a7f292976bba98e0b793e4f08e8e4866"} Mar 09 09:28:11 crc kubenswrapper[4792]: I0309 09:28:11.624109 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 09 09:28:11 crc kubenswrapper[4792]: I0309 09:28:11.637216 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-j85dq" event={"ID":"e6bda2b4-31d6-4059-a494-330f9fa2c1f9","Type":"ContainerStarted","Data":"7b9dd166480737cbc07ee9cb62075b957c8dee28c75072b207f638a7380c9156"} Mar 09 09:28:11 crc kubenswrapper[4792]: I0309 09:28:11.701803 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=3.506484907 podStartE2EDuration="18.701783731s" podCreationTimestamp="2026-03-09 09:27:53 +0000 UTC" firstStartedPulling="2026-03-09 09:27:54.843660339 +0000 UTC 
m=+1239.873861091" lastFinishedPulling="2026-03-09 09:28:10.038959163 +0000 UTC m=+1255.069159915" observedRunningTime="2026-03-09 09:28:11.647817686 +0000 UTC m=+1256.678018438" watchObservedRunningTime="2026-03-09 09:28:11.701783731 +0000 UTC m=+1256.731984483" Mar 09 09:28:11 crc kubenswrapper[4792]: I0309 09:28:11.723095 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b07b74f-754c-43a7-97fb-a0fccb9c5df4" path="/var/lib/kubelet/pods/6b07b74f-754c-43a7-97fb-a0fccb9c5df4/volumes" Mar 09 09:28:11 crc kubenswrapper[4792]: I0309 09:28:11.730686 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b21ea9f4-b8cf-42c7-8df7-ba22b8e1df2b" path="/var/lib/kubelet/pods/b21ea9f4-b8cf-42c7-8df7-ba22b8e1df2b/volumes" Mar 09 09:28:11 crc kubenswrapper[4792]: I0309 09:28:11.732018 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 09 09:28:11 crc kubenswrapper[4792]: E0309 09:28:11.732367 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b07b74f-754c-43a7-97fb-a0fccb9c5df4" containerName="sg-core" Mar 09 09:28:11 crc kubenswrapper[4792]: I0309 09:28:11.732385 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b07b74f-754c-43a7-97fb-a0fccb9c5df4" containerName="sg-core" Mar 09 09:28:11 crc kubenswrapper[4792]: E0309 09:28:11.732409 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b07b74f-754c-43a7-97fb-a0fccb9c5df4" containerName="proxy-httpd" Mar 09 09:28:11 crc kubenswrapper[4792]: I0309 09:28:11.732415 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b07b74f-754c-43a7-97fb-a0fccb9c5df4" containerName="proxy-httpd" Mar 09 09:28:11 crc kubenswrapper[4792]: E0309 09:28:11.732429 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a78e893e-c5f1-4524-9d89-39624fbaec63" containerName="oc" Mar 09 09:28:11 crc kubenswrapper[4792]: I0309 09:28:11.732435 4792 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="a78e893e-c5f1-4524-9d89-39624fbaec63" containerName="oc" Mar 09 09:28:11 crc kubenswrapper[4792]: E0309 09:28:11.732445 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b07b74f-754c-43a7-97fb-a0fccb9c5df4" containerName="ceilometer-notification-agent" Mar 09 09:28:11 crc kubenswrapper[4792]: I0309 09:28:11.732451 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b07b74f-754c-43a7-97fb-a0fccb9c5df4" containerName="ceilometer-notification-agent" Mar 09 09:28:11 crc kubenswrapper[4792]: E0309 09:28:11.732467 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b07b74f-754c-43a7-97fb-a0fccb9c5df4" containerName="ceilometer-central-agent" Mar 09 09:28:11 crc kubenswrapper[4792]: I0309 09:28:11.732473 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b07b74f-754c-43a7-97fb-a0fccb9c5df4" containerName="ceilometer-central-agent" Mar 09 09:28:11 crc kubenswrapper[4792]: I0309 09:28:11.732619 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b07b74f-754c-43a7-97fb-a0fccb9c5df4" containerName="proxy-httpd" Mar 09 09:28:11 crc kubenswrapper[4792]: I0309 09:28:11.732637 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="a78e893e-c5f1-4524-9d89-39624fbaec63" containerName="oc" Mar 09 09:28:11 crc kubenswrapper[4792]: I0309 09:28:11.732650 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b07b74f-754c-43a7-97fb-a0fccb9c5df4" containerName="ceilometer-notification-agent" Mar 09 09:28:11 crc kubenswrapper[4792]: I0309 09:28:11.732661 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b07b74f-754c-43a7-97fb-a0fccb9c5df4" containerName="sg-core" Mar 09 09:28:11 crc kubenswrapper[4792]: I0309 09:28:11.732669 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b07b74f-754c-43a7-97fb-a0fccb9c5df4" containerName="ceilometer-central-agent" Mar 09 09:28:11 crc kubenswrapper[4792]: I0309 09:28:11.735275 4792 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 09 09:28:11 crc kubenswrapper[4792]: I0309 09:28:11.735398 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 09 09:28:11 crc kubenswrapper[4792]: I0309 09:28:11.738589 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 09 09:28:11 crc kubenswrapper[4792]: I0309 09:28:11.739448 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 09 09:28:11 crc kubenswrapper[4792]: I0309 09:28:11.857741 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b40b963a-896c-4714-b8cd-8e68d0c3251a-scripts\") pod \"ceilometer-0\" (UID: \"b40b963a-896c-4714-b8cd-8e68d0c3251a\") " pod="openstack/ceilometer-0" Mar 09 09:28:11 crc kubenswrapper[4792]: I0309 09:28:11.858054 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b40b963a-896c-4714-b8cd-8e68d0c3251a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b40b963a-896c-4714-b8cd-8e68d0c3251a\") " pod="openstack/ceilometer-0" Mar 09 09:28:11 crc kubenswrapper[4792]: I0309 09:28:11.858120 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b40b963a-896c-4714-b8cd-8e68d0c3251a-log-httpd\") pod \"ceilometer-0\" (UID: \"b40b963a-896c-4714-b8cd-8e68d0c3251a\") " pod="openstack/ceilometer-0" Mar 09 09:28:11 crc kubenswrapper[4792]: I0309 09:28:11.858148 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b40b963a-896c-4714-b8cd-8e68d0c3251a-config-data\") pod \"ceilometer-0\" (UID: 
\"b40b963a-896c-4714-b8cd-8e68d0c3251a\") " pod="openstack/ceilometer-0"
Mar 09 09:28:11 crc kubenswrapper[4792]: I0309 09:28:11.858199 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2tgk5\" (UniqueName: \"kubernetes.io/projected/b40b963a-896c-4714-b8cd-8e68d0c3251a-kube-api-access-2tgk5\") pod \"ceilometer-0\" (UID: \"b40b963a-896c-4714-b8cd-8e68d0c3251a\") " pod="openstack/ceilometer-0"
Mar 09 09:28:11 crc kubenswrapper[4792]: I0309 09:28:11.858299 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b40b963a-896c-4714-b8cd-8e68d0c3251a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b40b963a-896c-4714-b8cd-8e68d0c3251a\") " pod="openstack/ceilometer-0"
Mar 09 09:28:11 crc kubenswrapper[4792]: I0309 09:28:11.858367 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b40b963a-896c-4714-b8cd-8e68d0c3251a-run-httpd\") pod \"ceilometer-0\" (UID: \"b40b963a-896c-4714-b8cd-8e68d0c3251a\") " pod="openstack/ceilometer-0"
Mar 09 09:28:11 crc kubenswrapper[4792]: I0309 09:28:11.960753 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b40b963a-896c-4714-b8cd-8e68d0c3251a-run-httpd\") pod \"ceilometer-0\" (UID: \"b40b963a-896c-4714-b8cd-8e68d0c3251a\") " pod="openstack/ceilometer-0"
Mar 09 09:28:11 crc kubenswrapper[4792]: I0309 09:28:11.960840 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b40b963a-896c-4714-b8cd-8e68d0c3251a-scripts\") pod \"ceilometer-0\" (UID: \"b40b963a-896c-4714-b8cd-8e68d0c3251a\") " pod="openstack/ceilometer-0"
Mar 09 09:28:11 crc kubenswrapper[4792]: I0309 09:28:11.960966 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b40b963a-896c-4714-b8cd-8e68d0c3251a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b40b963a-896c-4714-b8cd-8e68d0c3251a\") " pod="openstack/ceilometer-0"
Mar 09 09:28:11 crc kubenswrapper[4792]: I0309 09:28:11.960986 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b40b963a-896c-4714-b8cd-8e68d0c3251a-log-httpd\") pod \"ceilometer-0\" (UID: \"b40b963a-896c-4714-b8cd-8e68d0c3251a\") " pod="openstack/ceilometer-0"
Mar 09 09:28:11 crc kubenswrapper[4792]: I0309 09:28:11.961004 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b40b963a-896c-4714-b8cd-8e68d0c3251a-config-data\") pod \"ceilometer-0\" (UID: \"b40b963a-896c-4714-b8cd-8e68d0c3251a\") " pod="openstack/ceilometer-0"
Mar 09 09:28:11 crc kubenswrapper[4792]: I0309 09:28:11.961033 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2tgk5\" (UniqueName: \"kubernetes.io/projected/b40b963a-896c-4714-b8cd-8e68d0c3251a-kube-api-access-2tgk5\") pod \"ceilometer-0\" (UID: \"b40b963a-896c-4714-b8cd-8e68d0c3251a\") " pod="openstack/ceilometer-0"
Mar 09 09:28:11 crc kubenswrapper[4792]: I0309 09:28:11.961057 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b40b963a-896c-4714-b8cd-8e68d0c3251a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b40b963a-896c-4714-b8cd-8e68d0c3251a\") " pod="openstack/ceilometer-0"
Mar 09 09:28:11 crc kubenswrapper[4792]: I0309 09:28:11.962755 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b40b963a-896c-4714-b8cd-8e68d0c3251a-run-httpd\") pod \"ceilometer-0\" (UID: \"b40b963a-896c-4714-b8cd-8e68d0c3251a\") " pod="openstack/ceilometer-0"
Mar 09 09:28:11 crc kubenswrapper[4792]: I0309 09:28:11.966205 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b40b963a-896c-4714-b8cd-8e68d0c3251a-log-httpd\") pod \"ceilometer-0\" (UID: \"b40b963a-896c-4714-b8cd-8e68d0c3251a\") " pod="openstack/ceilometer-0"
Mar 09 09:28:11 crc kubenswrapper[4792]: I0309 09:28:11.969979 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b40b963a-896c-4714-b8cd-8e68d0c3251a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b40b963a-896c-4714-b8cd-8e68d0c3251a\") " pod="openstack/ceilometer-0"
Mar 09 09:28:11 crc kubenswrapper[4792]: I0309 09:28:11.987814 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b40b963a-896c-4714-b8cd-8e68d0c3251a-scripts\") pod \"ceilometer-0\" (UID: \"b40b963a-896c-4714-b8cd-8e68d0c3251a\") " pod="openstack/ceilometer-0"
Mar 09 09:28:11 crc kubenswrapper[4792]: I0309 09:28:11.990454 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b40b963a-896c-4714-b8cd-8e68d0c3251a-config-data\") pod \"ceilometer-0\" (UID: \"b40b963a-896c-4714-b8cd-8e68d0c3251a\") " pod="openstack/ceilometer-0"
Mar 09 09:28:12 crc kubenswrapper[4792]: I0309 09:28:12.008148 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2tgk5\" (UniqueName: \"kubernetes.io/projected/b40b963a-896c-4714-b8cd-8e68d0c3251a-kube-api-access-2tgk5\") pod \"ceilometer-0\" (UID: \"b40b963a-896c-4714-b8cd-8e68d0c3251a\") " pod="openstack/ceilometer-0"
Mar 09 09:28:12 crc kubenswrapper[4792]: I0309 09:28:12.009133 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b40b963a-896c-4714-b8cd-8e68d0c3251a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b40b963a-896c-4714-b8cd-8e68d0c3251a\") " pod="openstack/ceilometer-0"
Mar 09 09:28:12 crc kubenswrapper[4792]: I0309 09:28:12.065242 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 09 09:28:12 crc kubenswrapper[4792]: I0309 09:28:12.649925 4792 generic.go:334] "Generic (PLEG): container finished" podID="e6bda2b4-31d6-4059-a494-330f9fa2c1f9" containerID="1d9d918c338f3ee7fa4a3ec993e09b177e706742352483714152175b9bce5112" exitCode=0
Mar 09 09:28:12 crc kubenswrapper[4792]: I0309 09:28:12.650016 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-j85dq" event={"ID":"e6bda2b4-31d6-4059-a494-330f9fa2c1f9","Type":"ContainerDied","Data":"1d9d918c338f3ee7fa4a3ec993e09b177e706742352483714152175b9bce5112"}
Mar 09 09:28:12 crc kubenswrapper[4792]: I0309 09:28:12.652112 4792 generic.go:334] "Generic (PLEG): container finished" podID="3ce1d921-17fb-427e-bc2b-9df3487b0e5b" containerID="98444155077eba96cc75a0a872affcea3af54654aaacdc1f3df45ee6a747a600" exitCode=0
Mar 09 09:28:12 crc kubenswrapper[4792]: I0309 09:28:12.652184 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-zrzlq" event={"ID":"3ce1d921-17fb-427e-bc2b-9df3487b0e5b","Type":"ContainerDied","Data":"98444155077eba96cc75a0a872affcea3af54654aaacdc1f3df45ee6a747a600"}
Mar 09 09:28:12 crc kubenswrapper[4792]: I0309 09:28:12.653599 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-4584-account-create-update-8wfnh" event={"ID":"d28e5b5b-e371-4a9d-8725-017aa98ac944","Type":"ContainerStarted","Data":"b76b345f35a9ad624368a0287e6fdbad3e9ae18371832947178fe555f8c84e3e"}
Mar 09 09:28:12 crc kubenswrapper[4792]: I0309 09:28:12.655733 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-fba7-account-create-update-lf5dc" event={"ID":"1927dc12-81aa-463b-b356-3784c72f3245","Type":"ContainerStarted","Data":"3ceba6ca0d35b2da076cc69ac19e717443ca1b7d82eb6e577eb1dac2b2517c61"}
Mar 09 09:28:12 crc kubenswrapper[4792]: I0309 09:28:12.657023 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-8785d" event={"ID":"e5d6bee9-1576-4852-a0ca-f4da5d2930f9","Type":"ContainerStarted","Data":"f1d51af9d96a53944f9002089c2cff0830481349e3a0fd1845e4e9b7d1cfa5a2"}
Mar 09 09:28:12 crc kubenswrapper[4792]: I0309 09:28:12.660290 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-5698-account-create-update-twzbh" event={"ID":"68422fad-6d4a-4d8a-ac66-b68dd7486525","Type":"ContainerStarted","Data":"b2f8ea7a4938181d7530ee3ceecb654d54cfb49874f2ba7fcdf1718cf22dc91b"}
Mar 09 09:28:12 crc kubenswrapper[4792]: I0309 09:28:12.662766 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Mar 09 09:28:12 crc kubenswrapper[4792]: I0309 09:28:12.715498 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-4584-account-create-update-8wfnh" podStartSLOduration=8.715455565 podStartE2EDuration="8.715455565s" podCreationTimestamp="2026-03-09 09:28:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:28:12.700985836 +0000 UTC m=+1257.731186588" watchObservedRunningTime="2026-03-09 09:28:12.715455565 +0000 UTC m=+1257.745656327"
Mar 09 09:28:12 crc kubenswrapper[4792]: I0309 09:28:12.749110 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Mar 09 09:28:12 crc kubenswrapper[4792]: I0309 09:28:12.751390 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-fba7-account-create-update-lf5dc" podStartSLOduration=9.751372037 podStartE2EDuration="9.751372037s" podCreationTimestamp="2026-03-09 09:28:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:28:12.742283023 +0000 UTC m=+1257.772483785" watchObservedRunningTime="2026-03-09 09:28:12.751372037 +0000 UTC m=+1257.781572789"
Mar 09 09:28:12 crc kubenswrapper[4792]: I0309 09:28:12.770937 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-db-create-8785d" podStartSLOduration=9.770907153 podStartE2EDuration="9.770907153s" podCreationTimestamp="2026-03-09 09:28:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:28:12.770150651 +0000 UTC m=+1257.800351403" watchObservedRunningTime="2026-03-09 09:28:12.770907153 +0000 UTC m=+1257.801107925"
Mar 09 09:28:12 crc kubenswrapper[4792]: W0309 09:28:12.781890 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb40b963a_896c_4714_b8cd_8e68d0c3251a.slice/crio-b27216624ff0573e525d619de4626bec7b46d53e1c945969ab64fdd8fe2427cf WatchSource:0}: Error finding container b27216624ff0573e525d619de4626bec7b46d53e1c945969ab64fdd8fe2427cf: Status 404 returned error can't find the container with id b27216624ff0573e525d619de4626bec7b46d53e1c945969ab64fdd8fe2427cf
Mar 09 09:28:12 crc kubenswrapper[4792]: I0309 09:28:12.796856 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-5698-account-create-update-twzbh" podStartSLOduration=9.796819824 podStartE2EDuration="9.796819824s" podCreationTimestamp="2026-03-09 09:28:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:28:12.784027194 +0000 UTC m=+1257.814227946" watchObservedRunningTime="2026-03-09 09:28:12.796819824 +0000 UTC m=+1257.827020576"
Mar 09 09:28:13 crc kubenswrapper[4792]: I0309 09:28:13.672100 4792 generic.go:334] "Generic (PLEG): container finished" podID="1927dc12-81aa-463b-b356-3784c72f3245" containerID="3ceba6ca0d35b2da076cc69ac19e717443ca1b7d82eb6e577eb1dac2b2517c61" exitCode=0
Mar 09 09:28:13 crc kubenswrapper[4792]: I0309 09:28:13.672164 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-fba7-account-create-update-lf5dc" event={"ID":"1927dc12-81aa-463b-b356-3784c72f3245","Type":"ContainerDied","Data":"3ceba6ca0d35b2da076cc69ac19e717443ca1b7d82eb6e577eb1dac2b2517c61"}
Mar 09 09:28:13 crc kubenswrapper[4792]: I0309 09:28:13.675282 4792 generic.go:334] "Generic (PLEG): container finished" podID="e5d6bee9-1576-4852-a0ca-f4da5d2930f9" containerID="f1d51af9d96a53944f9002089c2cff0830481349e3a0fd1845e4e9b7d1cfa5a2" exitCode=0
Mar 09 09:28:13 crc kubenswrapper[4792]: I0309 09:28:13.675376 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-8785d" event={"ID":"e5d6bee9-1576-4852-a0ca-f4da5d2930f9","Type":"ContainerDied","Data":"f1d51af9d96a53944f9002089c2cff0830481349e3a0fd1845e4e9b7d1cfa5a2"}
Mar 09 09:28:13 crc kubenswrapper[4792]: I0309 09:28:13.676928 4792 generic.go:334] "Generic (PLEG): container finished" podID="68422fad-6d4a-4d8a-ac66-b68dd7486525" containerID="b2f8ea7a4938181d7530ee3ceecb654d54cfb49874f2ba7fcdf1718cf22dc91b" exitCode=0
Mar 09 09:28:13 crc kubenswrapper[4792]: I0309 09:28:13.676975 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-5698-account-create-update-twzbh" event={"ID":"68422fad-6d4a-4d8a-ac66-b68dd7486525","Type":"ContainerDied","Data":"b2f8ea7a4938181d7530ee3ceecb654d54cfb49874f2ba7fcdf1718cf22dc91b"}
Mar 09 09:28:13 crc kubenswrapper[4792]: I0309 09:28:13.678822 4792 generic.go:334] "Generic (PLEG): container finished" podID="d28e5b5b-e371-4a9d-8725-017aa98ac944" containerID="b76b345f35a9ad624368a0287e6fdbad3e9ae18371832947178fe555f8c84e3e" exitCode=0
Mar 09 09:28:13 crc kubenswrapper[4792]: I0309 09:28:13.678887 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-4584-account-create-update-8wfnh" event={"ID":"d28e5b5b-e371-4a9d-8725-017aa98ac944","Type":"ContainerDied","Data":"b76b345f35a9ad624368a0287e6fdbad3e9ae18371832947178fe555f8c84e3e"}
Mar 09 09:28:13 crc kubenswrapper[4792]: I0309 09:28:13.680596 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b40b963a-896c-4714-b8cd-8e68d0c3251a","Type":"ContainerStarted","Data":"9e73e719d782724576fe52d5352c086cbe4a80ed5e9107b33fc89f4832d53ef9"}
Mar 09 09:28:13 crc kubenswrapper[4792]: I0309 09:28:13.680643 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b40b963a-896c-4714-b8cd-8e68d0c3251a","Type":"ContainerStarted","Data":"b27216624ff0573e525d619de4626bec7b46d53e1c945969ab64fdd8fe2427cf"}
Mar 09 09:28:14 crc kubenswrapper[4792]: I0309 09:28:14.257244 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-j85dq"
Mar 09 09:28:14 crc kubenswrapper[4792]: I0309 09:28:14.270975 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-zrzlq"
Mar 09 09:28:14 crc kubenswrapper[4792]: I0309 09:28:14.413489 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7qgvl\" (UniqueName: \"kubernetes.io/projected/3ce1d921-17fb-427e-bc2b-9df3487b0e5b-kube-api-access-7qgvl\") pod \"3ce1d921-17fb-427e-bc2b-9df3487b0e5b\" (UID: \"3ce1d921-17fb-427e-bc2b-9df3487b0e5b\") "
Mar 09 09:28:14 crc kubenswrapper[4792]: I0309 09:28:14.413588 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6fhzr\" (UniqueName: \"kubernetes.io/projected/e6bda2b4-31d6-4059-a494-330f9fa2c1f9-kube-api-access-6fhzr\") pod \"e6bda2b4-31d6-4059-a494-330f9fa2c1f9\" (UID: \"e6bda2b4-31d6-4059-a494-330f9fa2c1f9\") "
Mar 09 09:28:14 crc kubenswrapper[4792]: I0309 09:28:14.413755 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3ce1d921-17fb-427e-bc2b-9df3487b0e5b-operator-scripts\") pod \"3ce1d921-17fb-427e-bc2b-9df3487b0e5b\" (UID: \"3ce1d921-17fb-427e-bc2b-9df3487b0e5b\") "
Mar 09 09:28:14 crc kubenswrapper[4792]: I0309 09:28:14.413828 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e6bda2b4-31d6-4059-a494-330f9fa2c1f9-operator-scripts\") pod \"e6bda2b4-31d6-4059-a494-330f9fa2c1f9\" (UID: \"e6bda2b4-31d6-4059-a494-330f9fa2c1f9\") "
Mar 09 09:28:14 crc kubenswrapper[4792]: I0309 09:28:14.414894 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e6bda2b4-31d6-4059-a494-330f9fa2c1f9-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e6bda2b4-31d6-4059-a494-330f9fa2c1f9" (UID: "e6bda2b4-31d6-4059-a494-330f9fa2c1f9"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:28:14 crc kubenswrapper[4792]: I0309 09:28:14.415828 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3ce1d921-17fb-427e-bc2b-9df3487b0e5b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3ce1d921-17fb-427e-bc2b-9df3487b0e5b" (UID: "3ce1d921-17fb-427e-bc2b-9df3487b0e5b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:28:14 crc kubenswrapper[4792]: I0309 09:28:14.425637 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6bda2b4-31d6-4059-a494-330f9fa2c1f9-kube-api-access-6fhzr" (OuterVolumeSpecName: "kube-api-access-6fhzr") pod "e6bda2b4-31d6-4059-a494-330f9fa2c1f9" (UID: "e6bda2b4-31d6-4059-a494-330f9fa2c1f9"). InnerVolumeSpecName "kube-api-access-6fhzr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:28:14 crc kubenswrapper[4792]: I0309 09:28:14.428516 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ce1d921-17fb-427e-bc2b-9df3487b0e5b-kube-api-access-7qgvl" (OuterVolumeSpecName: "kube-api-access-7qgvl") pod "3ce1d921-17fb-427e-bc2b-9df3487b0e5b" (UID: "3ce1d921-17fb-427e-bc2b-9df3487b0e5b"). InnerVolumeSpecName "kube-api-access-7qgvl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:28:14 crc kubenswrapper[4792]: I0309 09:28:14.517760 4792 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3ce1d921-17fb-427e-bc2b-9df3487b0e5b-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 09 09:28:14 crc kubenswrapper[4792]: I0309 09:28:14.518006 4792 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e6bda2b4-31d6-4059-a494-330f9fa2c1f9-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 09 09:28:14 crc kubenswrapper[4792]: I0309 09:28:14.518016 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7qgvl\" (UniqueName: \"kubernetes.io/projected/3ce1d921-17fb-427e-bc2b-9df3487b0e5b-kube-api-access-7qgvl\") on node \"crc\" DevicePath \"\""
Mar 09 09:28:14 crc kubenswrapper[4792]: I0309 09:28:14.518027 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6fhzr\" (UniqueName: \"kubernetes.io/projected/e6bda2b4-31d6-4059-a494-330f9fa2c1f9-kube-api-access-6fhzr\") on node \"crc\" DevicePath \"\""
Mar 09 09:28:14 crc kubenswrapper[4792]: I0309 09:28:14.701350 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-j85dq" event={"ID":"e6bda2b4-31d6-4059-a494-330f9fa2c1f9","Type":"ContainerDied","Data":"7b9dd166480737cbc07ee9cb62075b957c8dee28c75072b207f638a7380c9156"}
Mar 09 09:28:14 crc kubenswrapper[4792]: I0309 09:28:14.701395 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7b9dd166480737cbc07ee9cb62075b957c8dee28c75072b207f638a7380c9156"
Mar 09 09:28:14 crc kubenswrapper[4792]: I0309 09:28:14.701452 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-j85dq"
Mar 09 09:28:14 crc kubenswrapper[4792]: I0309 09:28:14.707921 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-zrzlq"
Mar 09 09:28:14 crc kubenswrapper[4792]: I0309 09:28:14.708865 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-zrzlq" event={"ID":"3ce1d921-17fb-427e-bc2b-9df3487b0e5b","Type":"ContainerDied","Data":"d9bb53a31e9b40c8aad79970ebf914e07fe5dba00637ddc4b844b9a3fa2eb6b5"}
Mar 09 09:28:14 crc kubenswrapper[4792]: I0309 09:28:14.708928 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d9bb53a31e9b40c8aad79970ebf914e07fe5dba00637ddc4b844b9a3fa2eb6b5"
Mar 09 09:28:14 crc kubenswrapper[4792]: I0309 09:28:14.724934 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b40b963a-896c-4714-b8cd-8e68d0c3251a","Type":"ContainerStarted","Data":"a76fb0d4592a6572df4bd7f11207f090264194f59a88c1088757119d9bdd85cd"}
Mar 09 09:28:15 crc kubenswrapper[4792]: I0309 09:28:15.399826 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-5698-account-create-update-twzbh"
Mar 09 09:28:15 crc kubenswrapper[4792]: I0309 09:28:15.411152 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-4584-account-create-update-8wfnh"
Mar 09 09:28:15 crc kubenswrapper[4792]: I0309 09:28:15.490065 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-fba7-account-create-update-lf5dc"
Mar 09 09:28:15 crc kubenswrapper[4792]: I0309 09:28:15.500840 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-8785d"
Mar 09 09:28:15 crc kubenswrapper[4792]: I0309 09:28:15.565349 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t8d84\" (UniqueName: \"kubernetes.io/projected/d28e5b5b-e371-4a9d-8725-017aa98ac944-kube-api-access-t8d84\") pod \"d28e5b5b-e371-4a9d-8725-017aa98ac944\" (UID: \"d28e5b5b-e371-4a9d-8725-017aa98ac944\") "
Mar 09 09:28:15 crc kubenswrapper[4792]: I0309 09:28:15.565453 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/68422fad-6d4a-4d8a-ac66-b68dd7486525-operator-scripts\") pod \"68422fad-6d4a-4d8a-ac66-b68dd7486525\" (UID: \"68422fad-6d4a-4d8a-ac66-b68dd7486525\") "
Mar 09 09:28:15 crc kubenswrapper[4792]: I0309 09:28:15.565496 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d28e5b5b-e371-4a9d-8725-017aa98ac944-operator-scripts\") pod \"d28e5b5b-e371-4a9d-8725-017aa98ac944\" (UID: \"d28e5b5b-e371-4a9d-8725-017aa98ac944\") "
Mar 09 09:28:15 crc kubenswrapper[4792]: I0309 09:28:15.565544 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6fh67\" (UniqueName: \"kubernetes.io/projected/1927dc12-81aa-463b-b356-3784c72f3245-kube-api-access-6fh67\") pod \"1927dc12-81aa-463b-b356-3784c72f3245\" (UID: \"1927dc12-81aa-463b-b356-3784c72f3245\") "
Mar 09 09:28:15 crc kubenswrapper[4792]: I0309 09:28:15.565570 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1927dc12-81aa-463b-b356-3784c72f3245-operator-scripts\") pod \"1927dc12-81aa-463b-b356-3784c72f3245\" (UID: \"1927dc12-81aa-463b-b356-3784c72f3245\") "
Mar 09 09:28:15 crc kubenswrapper[4792]: I0309 09:28:15.565650 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qksh5\" (UniqueName: \"kubernetes.io/projected/68422fad-6d4a-4d8a-ac66-b68dd7486525-kube-api-access-qksh5\") pod \"68422fad-6d4a-4d8a-ac66-b68dd7486525\" (UID: \"68422fad-6d4a-4d8a-ac66-b68dd7486525\") "
Mar 09 09:28:15 crc kubenswrapper[4792]: I0309 09:28:15.567331 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d28e5b5b-e371-4a9d-8725-017aa98ac944-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d28e5b5b-e371-4a9d-8725-017aa98ac944" (UID: "d28e5b5b-e371-4a9d-8725-017aa98ac944"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:28:15 crc kubenswrapper[4792]: I0309 09:28:15.570503 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/68422fad-6d4a-4d8a-ac66-b68dd7486525-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "68422fad-6d4a-4d8a-ac66-b68dd7486525" (UID: "68422fad-6d4a-4d8a-ac66-b68dd7486525"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:28:15 crc kubenswrapper[4792]: I0309 09:28:15.572359 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1927dc12-81aa-463b-b356-3784c72f3245-kube-api-access-6fh67" (OuterVolumeSpecName: "kube-api-access-6fh67") pod "1927dc12-81aa-463b-b356-3784c72f3245" (UID: "1927dc12-81aa-463b-b356-3784c72f3245"). InnerVolumeSpecName "kube-api-access-6fh67". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:28:15 crc kubenswrapper[4792]: I0309 09:28:15.573151 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1927dc12-81aa-463b-b356-3784c72f3245-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1927dc12-81aa-463b-b356-3784c72f3245" (UID: "1927dc12-81aa-463b-b356-3784c72f3245"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:28:15 crc kubenswrapper[4792]: I0309 09:28:15.573327 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68422fad-6d4a-4d8a-ac66-b68dd7486525-kube-api-access-qksh5" (OuterVolumeSpecName: "kube-api-access-qksh5") pod "68422fad-6d4a-4d8a-ac66-b68dd7486525" (UID: "68422fad-6d4a-4d8a-ac66-b68dd7486525"). InnerVolumeSpecName "kube-api-access-qksh5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:28:15 crc kubenswrapper[4792]: I0309 09:28:15.573419 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d28e5b5b-e371-4a9d-8725-017aa98ac944-kube-api-access-t8d84" (OuterVolumeSpecName: "kube-api-access-t8d84") pod "d28e5b5b-e371-4a9d-8725-017aa98ac944" (UID: "d28e5b5b-e371-4a9d-8725-017aa98ac944"). InnerVolumeSpecName "kube-api-access-t8d84". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:28:15 crc kubenswrapper[4792]: I0309 09:28:15.667474 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e5d6bee9-1576-4852-a0ca-f4da5d2930f9-operator-scripts\") pod \"e5d6bee9-1576-4852-a0ca-f4da5d2930f9\" (UID: \"e5d6bee9-1576-4852-a0ca-f4da5d2930f9\") "
Mar 09 09:28:15 crc kubenswrapper[4792]: I0309 09:28:15.667794 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cggft\" (UniqueName: \"kubernetes.io/projected/e5d6bee9-1576-4852-a0ca-f4da5d2930f9-kube-api-access-cggft\") pod \"e5d6bee9-1576-4852-a0ca-f4da5d2930f9\" (UID: \"e5d6bee9-1576-4852-a0ca-f4da5d2930f9\") "
Mar 09 09:28:15 crc kubenswrapper[4792]: I0309 09:28:15.668573 4792 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/68422fad-6d4a-4d8a-ac66-b68dd7486525-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 09 09:28:15 crc kubenswrapper[4792]: I0309 09:28:15.668596 4792 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d28e5b5b-e371-4a9d-8725-017aa98ac944-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 09 09:28:15 crc kubenswrapper[4792]: I0309 09:28:15.668609 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6fh67\" (UniqueName: \"kubernetes.io/projected/1927dc12-81aa-463b-b356-3784c72f3245-kube-api-access-6fh67\") on node \"crc\" DevicePath \"\""
Mar 09 09:28:15 crc kubenswrapper[4792]: I0309 09:28:15.668624 4792 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1927dc12-81aa-463b-b356-3784c72f3245-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 09 09:28:15 crc kubenswrapper[4792]: I0309 09:28:15.668635 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qksh5\" (UniqueName: \"kubernetes.io/projected/68422fad-6d4a-4d8a-ac66-b68dd7486525-kube-api-access-qksh5\") on node \"crc\" DevicePath \"\""
Mar 09 09:28:15 crc kubenswrapper[4792]: I0309 09:28:15.668644 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t8d84\" (UniqueName: \"kubernetes.io/projected/d28e5b5b-e371-4a9d-8725-017aa98ac944-kube-api-access-t8d84\") on node \"crc\" DevicePath \"\""
Mar 09 09:28:15 crc kubenswrapper[4792]: I0309 09:28:15.676315 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5d6bee9-1576-4852-a0ca-f4da5d2930f9-kube-api-access-cggft" (OuterVolumeSpecName: "kube-api-access-cggft") pod "e5d6bee9-1576-4852-a0ca-f4da5d2930f9" (UID: "e5d6bee9-1576-4852-a0ca-f4da5d2930f9"). InnerVolumeSpecName "kube-api-access-cggft". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:28:15 crc kubenswrapper[4792]: I0309 09:28:15.676666 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e5d6bee9-1576-4852-a0ca-f4da5d2930f9-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e5d6bee9-1576-4852-a0ca-f4da5d2930f9" (UID: "e5d6bee9-1576-4852-a0ca-f4da5d2930f9"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:28:15 crc kubenswrapper[4792]: I0309 09:28:15.763264 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-fba7-account-create-update-lf5dc"
Mar 09 09:28:15 crc kubenswrapper[4792]: I0309 09:28:15.764138 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-fba7-account-create-update-lf5dc" event={"ID":"1927dc12-81aa-463b-b356-3784c72f3245","Type":"ContainerDied","Data":"8bba1dc1d8cacb61b7ab27a2317c991e9d5c0b6d568d7552f99bc47434486507"}
Mar 09 09:28:15 crc kubenswrapper[4792]: I0309 09:28:15.765547 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8bba1dc1d8cacb61b7ab27a2317c991e9d5c0b6d568d7552f99bc47434486507"
Mar 09 09:28:15 crc kubenswrapper[4792]: I0309 09:28:15.766384 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-8785d" event={"ID":"e5d6bee9-1576-4852-a0ca-f4da5d2930f9","Type":"ContainerDied","Data":"f87e64096bf2e475b4dbf21f1f01a9ec367557955e240a9c87488a846eca851a"}
Mar 09 09:28:15 crc kubenswrapper[4792]: I0309 09:28:15.766413 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f87e64096bf2e475b4dbf21f1f01a9ec367557955e240a9c87488a846eca851a"
Mar 09 09:28:15 crc kubenswrapper[4792]: I0309 09:28:15.766418 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-8785d"
Mar 09 09:28:15 crc kubenswrapper[4792]: I0309 09:28:15.768103 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-5698-account-create-update-twzbh"
Mar 09 09:28:15 crc kubenswrapper[4792]: I0309 09:28:15.768515 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-5698-account-create-update-twzbh" event={"ID":"68422fad-6d4a-4d8a-ac66-b68dd7486525","Type":"ContainerDied","Data":"a07a64a5ed7c1ed71ff678f3dc5edc08a7f292976bba98e0b793e4f08e8e4866"}
Mar 09 09:28:15 crc kubenswrapper[4792]: I0309 09:28:15.768543 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a07a64a5ed7c1ed71ff678f3dc5edc08a7f292976bba98e0b793e4f08e8e4866"
Mar 09 09:28:15 crc kubenswrapper[4792]: I0309 09:28:15.769347 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-4584-account-create-update-8wfnh" event={"ID":"d28e5b5b-e371-4a9d-8725-017aa98ac944","Type":"ContainerDied","Data":"3b4509095c9866a7f9275d0f77d8faeb571c55bff7fdbc5d0a827ab3c2e12e57"}
Mar 09 09:28:15 crc kubenswrapper[4792]: I0309 09:28:15.769368 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3b4509095c9866a7f9275d0f77d8faeb571c55bff7fdbc5d0a827ab3c2e12e57"
Mar 09 09:28:15 crc kubenswrapper[4792]: I0309 09:28:15.769432 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-4584-account-create-update-8wfnh"
Mar 09 09:28:15 crc kubenswrapper[4792]: I0309 09:28:15.769796 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cggft\" (UniqueName: \"kubernetes.io/projected/e5d6bee9-1576-4852-a0ca-f4da5d2930f9-kube-api-access-cggft\") on node \"crc\" DevicePath \"\""
Mar 09 09:28:15 crc kubenswrapper[4792]: I0309 09:28:15.771996 4792 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e5d6bee9-1576-4852-a0ca-f4da5d2930f9-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 09 09:28:15 crc kubenswrapper[4792]: I0309 09:28:15.772517 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b40b963a-896c-4714-b8cd-8e68d0c3251a","Type":"ContainerStarted","Data":"07459cdb5fa9e2c971211a3b21f57bba7a0bdb4f4b9e4d0b6f660205210129a5"}
Mar 09 09:28:17 crc kubenswrapper[4792]: I0309 09:28:17.792689 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b40b963a-896c-4714-b8cd-8e68d0c3251a","Type":"ContainerStarted","Data":"a21da89c2b5228b6ccbf7a83f453cc4b092c1f9236b7a10d9b87ca36c3756fa1"}
Mar 09 09:28:17 crc kubenswrapper[4792]: I0309 09:28:17.792843 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b40b963a-896c-4714-b8cd-8e68d0c3251a" containerName="ceilometer-central-agent" containerID="cri-o://9e73e719d782724576fe52d5352c086cbe4a80ed5e9107b33fc89f4832d53ef9" gracePeriod=30
Mar 09 09:28:17 crc kubenswrapper[4792]: I0309 09:28:17.792866 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b40b963a-896c-4714-b8cd-8e68d0c3251a" containerName="proxy-httpd" containerID="cri-o://a21da89c2b5228b6ccbf7a83f453cc4b092c1f9236b7a10d9b87ca36c3756fa1" gracePeriod=30
Mar 09 09:28:17 crc kubenswrapper[4792]: I0309 09:28:17.792872 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b40b963a-896c-4714-b8cd-8e68d0c3251a" containerName="sg-core" containerID="cri-o://07459cdb5fa9e2c971211a3b21f57bba7a0bdb4f4b9e4d0b6f660205210129a5" gracePeriod=30
Mar 09 09:28:17 crc kubenswrapper[4792]: I0309 09:28:17.792934 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b40b963a-896c-4714-b8cd-8e68d0c3251a" containerName="ceilometer-notification-agent" containerID="cri-o://a76fb0d4592a6572df4bd7f11207f090264194f59a88c1088757119d9bdd85cd" gracePeriod=30
Mar 09 09:28:17 crc kubenswrapper[4792]: I0309 09:28:17.793262 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Mar 09 09:28:17 crc kubenswrapper[4792]: I0309 09:28:17.821495 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.42569162 podStartE2EDuration="6.821462986s" podCreationTimestamp="2026-03-09 09:28:11 +0000 UTC" firstStartedPulling="2026-03-09 09:28:12.784687592 +0000 UTC m=+1257.814888344" lastFinishedPulling="2026-03-09 09:28:17.180458958 +0000 UTC m=+1262.210659710" observedRunningTime="2026-03-09 09:28:17.81471353 +0000 UTC m=+1262.844914282" watchObservedRunningTime="2026-03-09 09:28:17.821462986 +0000 UTC m=+1262.851663748"
Mar 09 09:28:18 crc kubenswrapper[4792]: I0309 09:28:18.804432 4792 generic.go:334] "Generic (PLEG): container finished" podID="b40b963a-896c-4714-b8cd-8e68d0c3251a" containerID="a21da89c2b5228b6ccbf7a83f453cc4b092c1f9236b7a10d9b87ca36c3756fa1" exitCode=0
Mar 09 09:28:18 crc kubenswrapper[4792]: I0309 09:28:18.804483 4792 generic.go:334] "Generic (PLEG): container finished" podID="b40b963a-896c-4714-b8cd-8e68d0c3251a" containerID="07459cdb5fa9e2c971211a3b21f57bba7a0bdb4f4b9e4d0b6f660205210129a5" exitCode=2
Mar 09 09:28:18 crc kubenswrapper[4792]: I0309 09:28:18.804492 4792 generic.go:334] "Generic (PLEG): container finished" podID="b40b963a-896c-4714-b8cd-8e68d0c3251a" containerID="a76fb0d4592a6572df4bd7f11207f090264194f59a88c1088757119d9bdd85cd" exitCode=0
Mar 09 09:28:18 crc kubenswrapper[4792]: I0309 09:28:18.804513 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b40b963a-896c-4714-b8cd-8e68d0c3251a","Type":"ContainerDied","Data":"a21da89c2b5228b6ccbf7a83f453cc4b092c1f9236b7a10d9b87ca36c3756fa1"}
Mar 09 09:28:18 crc kubenswrapper[4792]: I0309 09:28:18.804559 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b40b963a-896c-4714-b8cd-8e68d0c3251a","Type":"ContainerDied","Data":"07459cdb5fa9e2c971211a3b21f57bba7a0bdb4f4b9e4d0b6f660205210129a5"}
Mar 09 09:28:18 crc kubenswrapper[4792]: I0309 09:28:18.804570 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b40b963a-896c-4714-b8cd-8e68d0c3251a","Type":"ContainerDied","Data":"a76fb0d4592a6572df4bd7f11207f090264194f59a88c1088757119d9bdd85cd"}
Mar 09 09:28:19 crc kubenswrapper[4792]: I0309 09:28:19.328157 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-57vwq"]
Mar 09 09:28:19 crc kubenswrapper[4792]: E0309 09:28:19.329059 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ce1d921-17fb-427e-bc2b-9df3487b0e5b" containerName="mariadb-database-create"
Mar 09 09:28:19 crc kubenswrapper[4792]: I0309 09:28:19.329185 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ce1d921-17fb-427e-bc2b-9df3487b0e5b" containerName="mariadb-database-create"
Mar 09 09:28:19 crc kubenswrapper[4792]: E0309 09:28:19.329247 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68422fad-6d4a-4d8a-ac66-b68dd7486525" containerName="mariadb-account-create-update"
Mar 09 09:28:19 crc kubenswrapper[4792]: I0309 09:28:19.329296 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="68422fad-6d4a-4d8a-ac66-b68dd7486525" containerName="mariadb-account-create-update"
Mar 09 09:28:19 crc kubenswrapper[4792]: E0309 09:28:19.329355 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6bda2b4-31d6-4059-a494-330f9fa2c1f9" containerName="mariadb-database-create"
Mar 09 09:28:19 crc kubenswrapper[4792]: I0309 09:28:19.329404 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6bda2b4-31d6-4059-a494-330f9fa2c1f9" containerName="mariadb-database-create"
Mar 09 09:28:19 crc kubenswrapper[4792]: E0309 09:28:19.329472 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1927dc12-81aa-463b-b356-3784c72f3245" containerName="mariadb-account-create-update"
Mar 09 09:28:19 crc kubenswrapper[4792]: I0309 09:28:19.329524 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="1927dc12-81aa-463b-b356-3784c72f3245" containerName="mariadb-account-create-update"
Mar 09 09:28:19 crc kubenswrapper[4792]: E0309 09:28:19.329586 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5d6bee9-1576-4852-a0ca-f4da5d2930f9" containerName="mariadb-database-create"
Mar 09 09:28:19 crc kubenswrapper[4792]: I0309 09:28:19.329640 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5d6bee9-1576-4852-a0ca-f4da5d2930f9" containerName="mariadb-database-create"
Mar 09 09:28:19 crc kubenswrapper[4792]: E0309 09:28:19.329704 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d28e5b5b-e371-4a9d-8725-017aa98ac944" containerName="mariadb-account-create-update"
Mar 09 09:28:19 crc kubenswrapper[4792]: I0309 09:28:19.329753 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="d28e5b5b-e371-4a9d-8725-017aa98ac944" containerName="mariadb-account-create-update"
Mar 09 09:28:19 crc kubenswrapper[4792]: I0309 09:28:19.329946 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="68422fad-6d4a-4d8a-ac66-b68dd7486525" containerName="mariadb-account-create-update"
Mar 09
09:28:19 crc kubenswrapper[4792]: I0309 09:28:19.330004 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5d6bee9-1576-4852-a0ca-f4da5d2930f9" containerName="mariadb-database-create" Mar 09 09:28:19 crc kubenswrapper[4792]: I0309 09:28:19.330111 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ce1d921-17fb-427e-bc2b-9df3487b0e5b" containerName="mariadb-database-create" Mar 09 09:28:19 crc kubenswrapper[4792]: I0309 09:28:19.330182 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6bda2b4-31d6-4059-a494-330f9fa2c1f9" containerName="mariadb-database-create" Mar 09 09:28:19 crc kubenswrapper[4792]: I0309 09:28:19.330237 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="1927dc12-81aa-463b-b356-3784c72f3245" containerName="mariadb-account-create-update" Mar 09 09:28:19 crc kubenswrapper[4792]: I0309 09:28:19.330291 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="d28e5b5b-e371-4a9d-8725-017aa98ac944" containerName="mariadb-account-create-update" Mar 09 09:28:19 crc kubenswrapper[4792]: I0309 09:28:19.330868 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-57vwq" Mar 09 09:28:19 crc kubenswrapper[4792]: I0309 09:28:19.342059 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-6w5mj" Mar 09 09:28:19 crc kubenswrapper[4792]: I0309 09:28:19.342481 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Mar 09 09:28:19 crc kubenswrapper[4792]: I0309 09:28:19.342672 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Mar 09 09:28:19 crc kubenswrapper[4792]: I0309 09:28:19.347199 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-57vwq"] Mar 09 09:28:19 crc kubenswrapper[4792]: I0309 09:28:19.438312 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65c96452-6003-4585-b71c-dbb398600617-config-data\") pod \"nova-cell0-conductor-db-sync-57vwq\" (UID: \"65c96452-6003-4585-b71c-dbb398600617\") " pod="openstack/nova-cell0-conductor-db-sync-57vwq" Mar 09 09:28:19 crc kubenswrapper[4792]: I0309 09:28:19.438462 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/65c96452-6003-4585-b71c-dbb398600617-scripts\") pod \"nova-cell0-conductor-db-sync-57vwq\" (UID: \"65c96452-6003-4585-b71c-dbb398600617\") " pod="openstack/nova-cell0-conductor-db-sync-57vwq" Mar 09 09:28:19 crc kubenswrapper[4792]: I0309 09:28:19.438549 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65c96452-6003-4585-b71c-dbb398600617-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-57vwq\" (UID: \"65c96452-6003-4585-b71c-dbb398600617\") " 
pod="openstack/nova-cell0-conductor-db-sync-57vwq" Mar 09 09:28:19 crc kubenswrapper[4792]: I0309 09:28:19.438587 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xc9vg\" (UniqueName: \"kubernetes.io/projected/65c96452-6003-4585-b71c-dbb398600617-kube-api-access-xc9vg\") pod \"nova-cell0-conductor-db-sync-57vwq\" (UID: \"65c96452-6003-4585-b71c-dbb398600617\") " pod="openstack/nova-cell0-conductor-db-sync-57vwq" Mar 09 09:28:19 crc kubenswrapper[4792]: I0309 09:28:19.539683 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65c96452-6003-4585-b71c-dbb398600617-config-data\") pod \"nova-cell0-conductor-db-sync-57vwq\" (UID: \"65c96452-6003-4585-b71c-dbb398600617\") " pod="openstack/nova-cell0-conductor-db-sync-57vwq" Mar 09 09:28:19 crc kubenswrapper[4792]: I0309 09:28:19.539817 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/65c96452-6003-4585-b71c-dbb398600617-scripts\") pod \"nova-cell0-conductor-db-sync-57vwq\" (UID: \"65c96452-6003-4585-b71c-dbb398600617\") " pod="openstack/nova-cell0-conductor-db-sync-57vwq" Mar 09 09:28:19 crc kubenswrapper[4792]: I0309 09:28:19.539914 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65c96452-6003-4585-b71c-dbb398600617-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-57vwq\" (UID: \"65c96452-6003-4585-b71c-dbb398600617\") " pod="openstack/nova-cell0-conductor-db-sync-57vwq" Mar 09 09:28:19 crc kubenswrapper[4792]: I0309 09:28:19.539953 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xc9vg\" (UniqueName: \"kubernetes.io/projected/65c96452-6003-4585-b71c-dbb398600617-kube-api-access-xc9vg\") pod \"nova-cell0-conductor-db-sync-57vwq\" (UID: 
\"65c96452-6003-4585-b71c-dbb398600617\") " pod="openstack/nova-cell0-conductor-db-sync-57vwq" Mar 09 09:28:19 crc kubenswrapper[4792]: I0309 09:28:19.548272 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/65c96452-6003-4585-b71c-dbb398600617-scripts\") pod \"nova-cell0-conductor-db-sync-57vwq\" (UID: \"65c96452-6003-4585-b71c-dbb398600617\") " pod="openstack/nova-cell0-conductor-db-sync-57vwq" Mar 09 09:28:19 crc kubenswrapper[4792]: I0309 09:28:19.548358 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65c96452-6003-4585-b71c-dbb398600617-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-57vwq\" (UID: \"65c96452-6003-4585-b71c-dbb398600617\") " pod="openstack/nova-cell0-conductor-db-sync-57vwq" Mar 09 09:28:19 crc kubenswrapper[4792]: I0309 09:28:19.549125 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65c96452-6003-4585-b71c-dbb398600617-config-data\") pod \"nova-cell0-conductor-db-sync-57vwq\" (UID: \"65c96452-6003-4585-b71c-dbb398600617\") " pod="openstack/nova-cell0-conductor-db-sync-57vwq" Mar 09 09:28:19 crc kubenswrapper[4792]: I0309 09:28:19.553275 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 09 09:28:19 crc kubenswrapper[4792]: I0309 09:28:19.572263 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xc9vg\" (UniqueName: \"kubernetes.io/projected/65c96452-6003-4585-b71c-dbb398600617-kube-api-access-xc9vg\") pod \"nova-cell0-conductor-db-sync-57vwq\" (UID: \"65c96452-6003-4585-b71c-dbb398600617\") " pod="openstack/nova-cell0-conductor-db-sync-57vwq" Mar 09 09:28:19 crc kubenswrapper[4792]: I0309 09:28:19.649571 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b40b963a-896c-4714-b8cd-8e68d0c3251a-config-data\") pod \"b40b963a-896c-4714-b8cd-8e68d0c3251a\" (UID: \"b40b963a-896c-4714-b8cd-8e68d0c3251a\") " Mar 09 09:28:19 crc kubenswrapper[4792]: I0309 09:28:19.649628 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b40b963a-896c-4714-b8cd-8e68d0c3251a-run-httpd\") pod \"b40b963a-896c-4714-b8cd-8e68d0c3251a\" (UID: \"b40b963a-896c-4714-b8cd-8e68d0c3251a\") " Mar 09 09:28:19 crc kubenswrapper[4792]: I0309 09:28:19.649657 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2tgk5\" (UniqueName: \"kubernetes.io/projected/b40b963a-896c-4714-b8cd-8e68d0c3251a-kube-api-access-2tgk5\") pod \"b40b963a-896c-4714-b8cd-8e68d0c3251a\" (UID: \"b40b963a-896c-4714-b8cd-8e68d0c3251a\") " Mar 09 09:28:19 crc kubenswrapper[4792]: I0309 09:28:19.649723 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b40b963a-896c-4714-b8cd-8e68d0c3251a-scripts\") pod \"b40b963a-896c-4714-b8cd-8e68d0c3251a\" (UID: \"b40b963a-896c-4714-b8cd-8e68d0c3251a\") " Mar 09 09:28:19 crc kubenswrapper[4792]: I0309 09:28:19.649755 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b40b963a-896c-4714-b8cd-8e68d0c3251a-log-httpd\") pod \"b40b963a-896c-4714-b8cd-8e68d0c3251a\" (UID: \"b40b963a-896c-4714-b8cd-8e68d0c3251a\") " Mar 09 09:28:19 crc kubenswrapper[4792]: I0309 09:28:19.649832 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b40b963a-896c-4714-b8cd-8e68d0c3251a-sg-core-conf-yaml\") pod \"b40b963a-896c-4714-b8cd-8e68d0c3251a\" (UID: \"b40b963a-896c-4714-b8cd-8e68d0c3251a\") " Mar 09 09:28:19 crc kubenswrapper[4792]: I0309 09:28:19.649927 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b40b963a-896c-4714-b8cd-8e68d0c3251a-combined-ca-bundle\") pod \"b40b963a-896c-4714-b8cd-8e68d0c3251a\" (UID: \"b40b963a-896c-4714-b8cd-8e68d0c3251a\") " Mar 09 09:28:19 crc kubenswrapper[4792]: I0309 09:28:19.653748 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b40b963a-896c-4714-b8cd-8e68d0c3251a-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "b40b963a-896c-4714-b8cd-8e68d0c3251a" (UID: "b40b963a-896c-4714-b8cd-8e68d0c3251a"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:28:19 crc kubenswrapper[4792]: I0309 09:28:19.653950 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b40b963a-896c-4714-b8cd-8e68d0c3251a-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "b40b963a-896c-4714-b8cd-8e68d0c3251a" (UID: "b40b963a-896c-4714-b8cd-8e68d0c3251a"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:28:19 crc kubenswrapper[4792]: I0309 09:28:19.659914 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b40b963a-896c-4714-b8cd-8e68d0c3251a-kube-api-access-2tgk5" (OuterVolumeSpecName: "kube-api-access-2tgk5") pod "b40b963a-896c-4714-b8cd-8e68d0c3251a" (UID: "b40b963a-896c-4714-b8cd-8e68d0c3251a"). InnerVolumeSpecName "kube-api-access-2tgk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:28:19 crc kubenswrapper[4792]: I0309 09:28:19.667901 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b40b963a-896c-4714-b8cd-8e68d0c3251a-scripts" (OuterVolumeSpecName: "scripts") pod "b40b963a-896c-4714-b8cd-8e68d0c3251a" (UID: "b40b963a-896c-4714-b8cd-8e68d0c3251a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:28:19 crc kubenswrapper[4792]: I0309 09:28:19.671974 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-57vwq" Mar 09 09:28:19 crc kubenswrapper[4792]: I0309 09:28:19.733241 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b40b963a-896c-4714-b8cd-8e68d0c3251a-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "b40b963a-896c-4714-b8cd-8e68d0c3251a" (UID: "b40b963a-896c-4714-b8cd-8e68d0c3251a"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:28:19 crc kubenswrapper[4792]: I0309 09:28:19.753060 4792 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b40b963a-896c-4714-b8cd-8e68d0c3251a-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 09 09:28:19 crc kubenswrapper[4792]: I0309 09:28:19.753111 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2tgk5\" (UniqueName: \"kubernetes.io/projected/b40b963a-896c-4714-b8cd-8e68d0c3251a-kube-api-access-2tgk5\") on node \"crc\" DevicePath \"\"" Mar 09 09:28:19 crc kubenswrapper[4792]: I0309 09:28:19.753128 4792 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b40b963a-896c-4714-b8cd-8e68d0c3251a-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 09:28:19 crc kubenswrapper[4792]: I0309 09:28:19.753136 4792 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b40b963a-896c-4714-b8cd-8e68d0c3251a-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 09 09:28:19 crc kubenswrapper[4792]: I0309 09:28:19.753143 4792 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b40b963a-896c-4714-b8cd-8e68d0c3251a-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 09 09:28:19 crc kubenswrapper[4792]: I0309 09:28:19.826094 4792 generic.go:334] "Generic (PLEG): container finished" podID="b40b963a-896c-4714-b8cd-8e68d0c3251a" containerID="9e73e719d782724576fe52d5352c086cbe4a80ed5e9107b33fc89f4832d53ef9" exitCode=0 Mar 09 09:28:19 crc kubenswrapper[4792]: I0309 09:28:19.826158 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b40b963a-896c-4714-b8cd-8e68d0c3251a","Type":"ContainerDied","Data":"9e73e719d782724576fe52d5352c086cbe4a80ed5e9107b33fc89f4832d53ef9"} Mar 09 09:28:19 crc kubenswrapper[4792]: I0309 09:28:19.826187 4792 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b40b963a-896c-4714-b8cd-8e68d0c3251a","Type":"ContainerDied","Data":"b27216624ff0573e525d619de4626bec7b46d53e1c945969ab64fdd8fe2427cf"} Mar 09 09:28:19 crc kubenswrapper[4792]: I0309 09:28:19.826221 4792 scope.go:117] "RemoveContainer" containerID="a21da89c2b5228b6ccbf7a83f453cc4b092c1f9236b7a10d9b87ca36c3756fa1" Mar 09 09:28:19 crc kubenswrapper[4792]: I0309 09:28:19.826410 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 09 09:28:19 crc kubenswrapper[4792]: I0309 09:28:19.827202 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b40b963a-896c-4714-b8cd-8e68d0c3251a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b40b963a-896c-4714-b8cd-8e68d0c3251a" (UID: "b40b963a-896c-4714-b8cd-8e68d0c3251a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:28:19 crc kubenswrapper[4792]: I0309 09:28:19.850470 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b40b963a-896c-4714-b8cd-8e68d0c3251a-config-data" (OuterVolumeSpecName: "config-data") pod "b40b963a-896c-4714-b8cd-8e68d0c3251a" (UID: "b40b963a-896c-4714-b8cd-8e68d0c3251a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:28:19 crc kubenswrapper[4792]: I0309 09:28:19.855647 4792 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b40b963a-896c-4714-b8cd-8e68d0c3251a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 09:28:19 crc kubenswrapper[4792]: I0309 09:28:19.855688 4792 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b40b963a-896c-4714-b8cd-8e68d0c3251a-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 09:28:19 crc kubenswrapper[4792]: I0309 09:28:19.863861 4792 scope.go:117] "RemoveContainer" containerID="07459cdb5fa9e2c971211a3b21f57bba7a0bdb4f4b9e4d0b6f660205210129a5" Mar 09 09:28:19 crc kubenswrapper[4792]: I0309 09:28:19.922278 4792 scope.go:117] "RemoveContainer" containerID="a76fb0d4592a6572df4bd7f11207f090264194f59a88c1088757119d9bdd85cd" Mar 09 09:28:19 crc kubenswrapper[4792]: I0309 09:28:19.964794 4792 scope.go:117] "RemoveContainer" containerID="9e73e719d782724576fe52d5352c086cbe4a80ed5e9107b33fc89f4832d53ef9" Mar 09 09:28:20 crc kubenswrapper[4792]: I0309 09:28:20.031326 4792 scope.go:117] "RemoveContainer" containerID="a21da89c2b5228b6ccbf7a83f453cc4b092c1f9236b7a10d9b87ca36c3756fa1" Mar 09 09:28:20 crc kubenswrapper[4792]: E0309 09:28:20.032537 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a21da89c2b5228b6ccbf7a83f453cc4b092c1f9236b7a10d9b87ca36c3756fa1\": container with ID starting with a21da89c2b5228b6ccbf7a83f453cc4b092c1f9236b7a10d9b87ca36c3756fa1 not found: ID does not exist" containerID="a21da89c2b5228b6ccbf7a83f453cc4b092c1f9236b7a10d9b87ca36c3756fa1" Mar 09 09:28:20 crc kubenswrapper[4792]: I0309 09:28:20.032576 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a21da89c2b5228b6ccbf7a83f453cc4b092c1f9236b7a10d9b87ca36c3756fa1"} 
err="failed to get container status \"a21da89c2b5228b6ccbf7a83f453cc4b092c1f9236b7a10d9b87ca36c3756fa1\": rpc error: code = NotFound desc = could not find container \"a21da89c2b5228b6ccbf7a83f453cc4b092c1f9236b7a10d9b87ca36c3756fa1\": container with ID starting with a21da89c2b5228b6ccbf7a83f453cc4b092c1f9236b7a10d9b87ca36c3756fa1 not found: ID does not exist" Mar 09 09:28:20 crc kubenswrapper[4792]: I0309 09:28:20.032603 4792 scope.go:117] "RemoveContainer" containerID="07459cdb5fa9e2c971211a3b21f57bba7a0bdb4f4b9e4d0b6f660205210129a5" Mar 09 09:28:20 crc kubenswrapper[4792]: E0309 09:28:20.033424 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"07459cdb5fa9e2c971211a3b21f57bba7a0bdb4f4b9e4d0b6f660205210129a5\": container with ID starting with 07459cdb5fa9e2c971211a3b21f57bba7a0bdb4f4b9e4d0b6f660205210129a5 not found: ID does not exist" containerID="07459cdb5fa9e2c971211a3b21f57bba7a0bdb4f4b9e4d0b6f660205210129a5" Mar 09 09:28:20 crc kubenswrapper[4792]: I0309 09:28:20.033456 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"07459cdb5fa9e2c971211a3b21f57bba7a0bdb4f4b9e4d0b6f660205210129a5"} err="failed to get container status \"07459cdb5fa9e2c971211a3b21f57bba7a0bdb4f4b9e4d0b6f660205210129a5\": rpc error: code = NotFound desc = could not find container \"07459cdb5fa9e2c971211a3b21f57bba7a0bdb4f4b9e4d0b6f660205210129a5\": container with ID starting with 07459cdb5fa9e2c971211a3b21f57bba7a0bdb4f4b9e4d0b6f660205210129a5 not found: ID does not exist" Mar 09 09:28:20 crc kubenswrapper[4792]: I0309 09:28:20.033476 4792 scope.go:117] "RemoveContainer" containerID="a76fb0d4592a6572df4bd7f11207f090264194f59a88c1088757119d9bdd85cd" Mar 09 09:28:20 crc kubenswrapper[4792]: E0309 09:28:20.034263 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"a76fb0d4592a6572df4bd7f11207f090264194f59a88c1088757119d9bdd85cd\": container with ID starting with a76fb0d4592a6572df4bd7f11207f090264194f59a88c1088757119d9bdd85cd not found: ID does not exist" containerID="a76fb0d4592a6572df4bd7f11207f090264194f59a88c1088757119d9bdd85cd" Mar 09 09:28:20 crc kubenswrapper[4792]: I0309 09:28:20.034295 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a76fb0d4592a6572df4bd7f11207f090264194f59a88c1088757119d9bdd85cd"} err="failed to get container status \"a76fb0d4592a6572df4bd7f11207f090264194f59a88c1088757119d9bdd85cd\": rpc error: code = NotFound desc = could not find container \"a76fb0d4592a6572df4bd7f11207f090264194f59a88c1088757119d9bdd85cd\": container with ID starting with a76fb0d4592a6572df4bd7f11207f090264194f59a88c1088757119d9bdd85cd not found: ID does not exist" Mar 09 09:28:20 crc kubenswrapper[4792]: I0309 09:28:20.034313 4792 scope.go:117] "RemoveContainer" containerID="9e73e719d782724576fe52d5352c086cbe4a80ed5e9107b33fc89f4832d53ef9" Mar 09 09:28:20 crc kubenswrapper[4792]: E0309 09:28:20.035697 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9e73e719d782724576fe52d5352c086cbe4a80ed5e9107b33fc89f4832d53ef9\": container with ID starting with 9e73e719d782724576fe52d5352c086cbe4a80ed5e9107b33fc89f4832d53ef9 not found: ID does not exist" containerID="9e73e719d782724576fe52d5352c086cbe4a80ed5e9107b33fc89f4832d53ef9" Mar 09 09:28:20 crc kubenswrapper[4792]: I0309 09:28:20.035726 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e73e719d782724576fe52d5352c086cbe4a80ed5e9107b33fc89f4832d53ef9"} err="failed to get container status \"9e73e719d782724576fe52d5352c086cbe4a80ed5e9107b33fc89f4832d53ef9\": rpc error: code = NotFound desc = could not find container \"9e73e719d782724576fe52d5352c086cbe4a80ed5e9107b33fc89f4832d53ef9\": container with ID 
starting with 9e73e719d782724576fe52d5352c086cbe4a80ed5e9107b33fc89f4832d53ef9 not found: ID does not exist" Mar 09 09:28:20 crc kubenswrapper[4792]: W0309 09:28:20.091154 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod65c96452_6003_4585_b71c_dbb398600617.slice/crio-589ed64a1d5e4f2bcf291e17ee6c49b018a5751d6338667218b83689c812481d WatchSource:0}: Error finding container 589ed64a1d5e4f2bcf291e17ee6c49b018a5751d6338667218b83689c812481d: Status 404 returned error can't find the container with id 589ed64a1d5e4f2bcf291e17ee6c49b018a5751d6338667218b83689c812481d Mar 09 09:28:20 crc kubenswrapper[4792]: I0309 09:28:20.091324 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-57vwq"] Mar 09 09:28:20 crc kubenswrapper[4792]: I0309 09:28:20.178112 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 09 09:28:20 crc kubenswrapper[4792]: I0309 09:28:20.184402 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 09 09:28:20 crc kubenswrapper[4792]: I0309 09:28:20.214093 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 09 09:28:20 crc kubenswrapper[4792]: E0309 09:28:20.214510 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b40b963a-896c-4714-b8cd-8e68d0c3251a" containerName="ceilometer-central-agent" Mar 09 09:28:20 crc kubenswrapper[4792]: I0309 09:28:20.214533 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="b40b963a-896c-4714-b8cd-8e68d0c3251a" containerName="ceilometer-central-agent" Mar 09 09:28:20 crc kubenswrapper[4792]: E0309 09:28:20.214557 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b40b963a-896c-4714-b8cd-8e68d0c3251a" containerName="ceilometer-notification-agent" Mar 09 09:28:20 crc kubenswrapper[4792]: I0309 09:28:20.214564 4792 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="b40b963a-896c-4714-b8cd-8e68d0c3251a" containerName="ceilometer-notification-agent" Mar 09 09:28:20 crc kubenswrapper[4792]: E0309 09:28:20.214587 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b40b963a-896c-4714-b8cd-8e68d0c3251a" containerName="sg-core" Mar 09 09:28:20 crc kubenswrapper[4792]: I0309 09:28:20.214595 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="b40b963a-896c-4714-b8cd-8e68d0c3251a" containerName="sg-core" Mar 09 09:28:20 crc kubenswrapper[4792]: E0309 09:28:20.214606 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b40b963a-896c-4714-b8cd-8e68d0c3251a" containerName="proxy-httpd" Mar 09 09:28:20 crc kubenswrapper[4792]: I0309 09:28:20.214613 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="b40b963a-896c-4714-b8cd-8e68d0c3251a" containerName="proxy-httpd" Mar 09 09:28:20 crc kubenswrapper[4792]: I0309 09:28:20.220455 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="b40b963a-896c-4714-b8cd-8e68d0c3251a" containerName="sg-core" Mar 09 09:28:20 crc kubenswrapper[4792]: I0309 09:28:20.220511 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="b40b963a-896c-4714-b8cd-8e68d0c3251a" containerName="ceilometer-notification-agent" Mar 09 09:28:20 crc kubenswrapper[4792]: I0309 09:28:20.220532 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="b40b963a-896c-4714-b8cd-8e68d0c3251a" containerName="ceilometer-central-agent" Mar 09 09:28:20 crc kubenswrapper[4792]: I0309 09:28:20.220548 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="b40b963a-896c-4714-b8cd-8e68d0c3251a" containerName="proxy-httpd" Mar 09 09:28:20 crc kubenswrapper[4792]: I0309 09:28:20.222398 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 09 09:28:20 crc kubenswrapper[4792]: I0309 09:28:20.225383 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 09 09:28:20 crc kubenswrapper[4792]: I0309 09:28:20.225574 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 09 09:28:20 crc kubenswrapper[4792]: I0309 09:28:20.242236 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 09 09:28:20 crc kubenswrapper[4792]: I0309 09:28:20.263354 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/628e933e-3a59-466e-af4c-b6cf5b81e9b5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"628e933e-3a59-466e-af4c-b6cf5b81e9b5\") " pod="openstack/ceilometer-0" Mar 09 09:28:20 crc kubenswrapper[4792]: I0309 09:28:20.263398 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/628e933e-3a59-466e-af4c-b6cf5b81e9b5-scripts\") pod \"ceilometer-0\" (UID: \"628e933e-3a59-466e-af4c-b6cf5b81e9b5\") " pod="openstack/ceilometer-0" Mar 09 09:28:20 crc kubenswrapper[4792]: I0309 09:28:20.263501 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/628e933e-3a59-466e-af4c-b6cf5b81e9b5-config-data\") pod \"ceilometer-0\" (UID: \"628e933e-3a59-466e-af4c-b6cf5b81e9b5\") " pod="openstack/ceilometer-0" Mar 09 09:28:20 crc kubenswrapper[4792]: I0309 09:28:20.263564 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nlcfh\" (UniqueName: \"kubernetes.io/projected/628e933e-3a59-466e-af4c-b6cf5b81e9b5-kube-api-access-nlcfh\") pod \"ceilometer-0\" (UID: \"628e933e-3a59-466e-af4c-b6cf5b81e9b5\") " 
pod="openstack/ceilometer-0" Mar 09 09:28:20 crc kubenswrapper[4792]: I0309 09:28:20.263596 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/628e933e-3a59-466e-af4c-b6cf5b81e9b5-run-httpd\") pod \"ceilometer-0\" (UID: \"628e933e-3a59-466e-af4c-b6cf5b81e9b5\") " pod="openstack/ceilometer-0" Mar 09 09:28:20 crc kubenswrapper[4792]: I0309 09:28:20.263631 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/628e933e-3a59-466e-af4c-b6cf5b81e9b5-log-httpd\") pod \"ceilometer-0\" (UID: \"628e933e-3a59-466e-af4c-b6cf5b81e9b5\") " pod="openstack/ceilometer-0" Mar 09 09:28:20 crc kubenswrapper[4792]: I0309 09:28:20.263687 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/628e933e-3a59-466e-af4c-b6cf5b81e9b5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"628e933e-3a59-466e-af4c-b6cf5b81e9b5\") " pod="openstack/ceilometer-0" Mar 09 09:28:20 crc kubenswrapper[4792]: I0309 09:28:20.365758 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/628e933e-3a59-466e-af4c-b6cf5b81e9b5-scripts\") pod \"ceilometer-0\" (UID: \"628e933e-3a59-466e-af4c-b6cf5b81e9b5\") " pod="openstack/ceilometer-0" Mar 09 09:28:20 crc kubenswrapper[4792]: I0309 09:28:20.365828 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/628e933e-3a59-466e-af4c-b6cf5b81e9b5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"628e933e-3a59-466e-af4c-b6cf5b81e9b5\") " pod="openstack/ceilometer-0" Mar 09 09:28:20 crc kubenswrapper[4792]: I0309 09:28:20.366573 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/628e933e-3a59-466e-af4c-b6cf5b81e9b5-config-data\") pod \"ceilometer-0\" (UID: \"628e933e-3a59-466e-af4c-b6cf5b81e9b5\") " pod="openstack/ceilometer-0" Mar 09 09:28:20 crc kubenswrapper[4792]: I0309 09:28:20.366620 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nlcfh\" (UniqueName: \"kubernetes.io/projected/628e933e-3a59-466e-af4c-b6cf5b81e9b5-kube-api-access-nlcfh\") pod \"ceilometer-0\" (UID: \"628e933e-3a59-466e-af4c-b6cf5b81e9b5\") " pod="openstack/ceilometer-0" Mar 09 09:28:20 crc kubenswrapper[4792]: I0309 09:28:20.366646 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/628e933e-3a59-466e-af4c-b6cf5b81e9b5-run-httpd\") pod \"ceilometer-0\" (UID: \"628e933e-3a59-466e-af4c-b6cf5b81e9b5\") " pod="openstack/ceilometer-0" Mar 09 09:28:20 crc kubenswrapper[4792]: I0309 09:28:20.366668 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/628e933e-3a59-466e-af4c-b6cf5b81e9b5-log-httpd\") pod \"ceilometer-0\" (UID: \"628e933e-3a59-466e-af4c-b6cf5b81e9b5\") " pod="openstack/ceilometer-0" Mar 09 09:28:20 crc kubenswrapper[4792]: I0309 09:28:20.366690 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/628e933e-3a59-466e-af4c-b6cf5b81e9b5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"628e933e-3a59-466e-af4c-b6cf5b81e9b5\") " pod="openstack/ceilometer-0" Mar 09 09:28:20 crc kubenswrapper[4792]: I0309 09:28:20.370451 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/628e933e-3a59-466e-af4c-b6cf5b81e9b5-run-httpd\") pod \"ceilometer-0\" (UID: \"628e933e-3a59-466e-af4c-b6cf5b81e9b5\") " pod="openstack/ceilometer-0" Mar 09 09:28:20 crc 
kubenswrapper[4792]: I0309 09:28:20.370878 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/628e933e-3a59-466e-af4c-b6cf5b81e9b5-log-httpd\") pod \"ceilometer-0\" (UID: \"628e933e-3a59-466e-af4c-b6cf5b81e9b5\") " pod="openstack/ceilometer-0" Mar 09 09:28:20 crc kubenswrapper[4792]: I0309 09:28:20.372153 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/628e933e-3a59-466e-af4c-b6cf5b81e9b5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"628e933e-3a59-466e-af4c-b6cf5b81e9b5\") " pod="openstack/ceilometer-0" Mar 09 09:28:20 crc kubenswrapper[4792]: I0309 09:28:20.372886 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/628e933e-3a59-466e-af4c-b6cf5b81e9b5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"628e933e-3a59-466e-af4c-b6cf5b81e9b5\") " pod="openstack/ceilometer-0" Mar 09 09:28:20 crc kubenswrapper[4792]: I0309 09:28:20.378102 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/628e933e-3a59-466e-af4c-b6cf5b81e9b5-config-data\") pod \"ceilometer-0\" (UID: \"628e933e-3a59-466e-af4c-b6cf5b81e9b5\") " pod="openstack/ceilometer-0" Mar 09 09:28:20 crc kubenswrapper[4792]: I0309 09:28:20.378962 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/628e933e-3a59-466e-af4c-b6cf5b81e9b5-scripts\") pod \"ceilometer-0\" (UID: \"628e933e-3a59-466e-af4c-b6cf5b81e9b5\") " pod="openstack/ceilometer-0" Mar 09 09:28:20 crc kubenswrapper[4792]: I0309 09:28:20.395159 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nlcfh\" (UniqueName: \"kubernetes.io/projected/628e933e-3a59-466e-af4c-b6cf5b81e9b5-kube-api-access-nlcfh\") pod \"ceilometer-0\" (UID: 
\"628e933e-3a59-466e-af4c-b6cf5b81e9b5\") " pod="openstack/ceilometer-0" Mar 09 09:28:20 crc kubenswrapper[4792]: I0309 09:28:20.545807 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 09 09:28:20 crc kubenswrapper[4792]: I0309 09:28:20.843556 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-57vwq" event={"ID":"65c96452-6003-4585-b71c-dbb398600617","Type":"ContainerStarted","Data":"589ed64a1d5e4f2bcf291e17ee6c49b018a5751d6338667218b83689c812481d"} Mar 09 09:28:20 crc kubenswrapper[4792]: I0309 09:28:20.877150 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 09 09:28:21 crc kubenswrapper[4792]: I0309 09:28:21.676130 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b40b963a-896c-4714-b8cd-8e68d0c3251a" path="/var/lib/kubelet/pods/b40b963a-896c-4714-b8cd-8e68d0c3251a/volumes" Mar 09 09:28:21 crc kubenswrapper[4792]: I0309 09:28:21.876389 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"628e933e-3a59-466e-af4c-b6cf5b81e9b5","Type":"ContainerStarted","Data":"1d1061c8ccacfe2469a2e4a75422a08d7db5f0a94fdc8a612e91e546b7057251"} Mar 09 09:28:22 crc kubenswrapper[4792]: I0309 09:28:22.171216 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 09 09:28:22 crc kubenswrapper[4792]: I0309 09:28:22.888857 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"628e933e-3a59-466e-af4c-b6cf5b81e9b5","Type":"ContainerStarted","Data":"f92d5098ce7fc1d42ef17cf7608630aa36566f80bce63990e21114daeb3be56d"} Mar 09 09:28:23 crc kubenswrapper[4792]: I0309 09:28:23.905505 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"628e933e-3a59-466e-af4c-b6cf5b81e9b5","Type":"ContainerStarted","Data":"b7b43b573909ee32a50c2d5d57de55f370d4467b900ee0fca12091997da8deb4"} Mar 09 09:28:24 crc kubenswrapper[4792]: I0309 09:28:24.916486 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"628e933e-3a59-466e-af4c-b6cf5b81e9b5","Type":"ContainerStarted","Data":"3b2d1718defda55c85b547cb0718807a9188b1588689d46a56ad344c31df2e52"} Mar 09 09:28:25 crc kubenswrapper[4792]: E0309 09:28:25.574247 4792 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd28e5b5b_e371_4a9d_8725_017aa98ac944.slice/crio-3b4509095c9866a7f9275d0f77d8faeb571c55bff7fdbc5d0a827ab3c2e12e57\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1927dc12_81aa_463b_b356_3784c72f3245.slice/crio-8bba1dc1d8cacb61b7ab27a2317c991e9d5c0b6d568d7552f99bc47434486507\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd28e5b5b_e371_4a9d_8725_017aa98ac944.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1927dc12_81aa_463b_b356_3784c72f3245.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod68422fad_6d4a_4d8a_ac66_b68dd7486525.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode5d6bee9_1576_4852_a0ca_f4da5d2930f9.slice\": RecentStats: unable to find data in memory cache]" Mar 09 09:28:28 crc kubenswrapper[4792]: I0309 09:28:28.394079 4792 scope.go:117] "RemoveContainer" containerID="398445458e5f6ed76028563a72dfed72bde3be9109ed7bca5f9787a23837f624" Mar 09 09:28:33 crc kubenswrapper[4792]: I0309 09:28:33.028890 
4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"628e933e-3a59-466e-af4c-b6cf5b81e9b5","Type":"ContainerStarted","Data":"0113e1ce2afe0c271d85aec9ccf6505ea4edf8294e9dc12aa96fe9623bf55607"} Mar 09 09:28:33 crc kubenswrapper[4792]: I0309 09:28:33.033437 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 09 09:28:33 crc kubenswrapper[4792]: I0309 09:28:33.029502 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="628e933e-3a59-466e-af4c-b6cf5b81e9b5" containerName="proxy-httpd" containerID="cri-o://0113e1ce2afe0c271d85aec9ccf6505ea4edf8294e9dc12aa96fe9623bf55607" gracePeriod=30 Mar 09 09:28:33 crc kubenswrapper[4792]: I0309 09:28:33.029522 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="628e933e-3a59-466e-af4c-b6cf5b81e9b5" containerName="sg-core" containerID="cri-o://3b2d1718defda55c85b547cb0718807a9188b1588689d46a56ad344c31df2e52" gracePeriod=30 Mar 09 09:28:33 crc kubenswrapper[4792]: I0309 09:28:33.029535 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="628e933e-3a59-466e-af4c-b6cf5b81e9b5" containerName="ceilometer-notification-agent" containerID="cri-o://b7b43b573909ee32a50c2d5d57de55f370d4467b900ee0fca12091997da8deb4" gracePeriod=30 Mar 09 09:28:33 crc kubenswrapper[4792]: I0309 09:28:33.029005 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="628e933e-3a59-466e-af4c-b6cf5b81e9b5" containerName="ceilometer-central-agent" containerID="cri-o://f92d5098ce7fc1d42ef17cf7608630aa36566f80bce63990e21114daeb3be56d" gracePeriod=30 Mar 09 09:28:33 crc kubenswrapper[4792]: I0309 09:28:33.039637 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-57vwq" 
event={"ID":"65c96452-6003-4585-b71c-dbb398600617","Type":"ContainerStarted","Data":"cb03b28b5d07fd12acb7d4043a1662fbb06abb286134f85927a0d0cf4166e655"} Mar 09 09:28:33 crc kubenswrapper[4792]: I0309 09:28:33.100853 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-57vwq" podStartSLOduration=1.657268666 podStartE2EDuration="14.100828196s" podCreationTimestamp="2026-03-09 09:28:19 +0000 UTC" firstStartedPulling="2026-03-09 09:28:20.095789935 +0000 UTC m=+1265.125990687" lastFinishedPulling="2026-03-09 09:28:32.539349465 +0000 UTC m=+1277.569550217" observedRunningTime="2026-03-09 09:28:33.077614374 +0000 UTC m=+1278.107815126" watchObservedRunningTime="2026-03-09 09:28:33.100828196 +0000 UTC m=+1278.131028948" Mar 09 09:28:33 crc kubenswrapper[4792]: I0309 09:28:33.103226 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.483780603 podStartE2EDuration="13.103214056s" podCreationTimestamp="2026-03-09 09:28:20 +0000 UTC" firstStartedPulling="2026-03-09 09:28:20.909402118 +0000 UTC m=+1265.939602870" lastFinishedPulling="2026-03-09 09:28:32.528835571 +0000 UTC m=+1277.559036323" observedRunningTime="2026-03-09 09:28:33.057004906 +0000 UTC m=+1278.087205658" watchObservedRunningTime="2026-03-09 09:28:33.103214056 +0000 UTC m=+1278.133414798" Mar 09 09:28:33 crc kubenswrapper[4792]: I0309 09:28:33.919031 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 09 09:28:33 crc kubenswrapper[4792]: I0309 09:28:33.963507 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nlcfh\" (UniqueName: \"kubernetes.io/projected/628e933e-3a59-466e-af4c-b6cf5b81e9b5-kube-api-access-nlcfh\") pod \"628e933e-3a59-466e-af4c-b6cf5b81e9b5\" (UID: \"628e933e-3a59-466e-af4c-b6cf5b81e9b5\") " Mar 09 09:28:33 crc kubenswrapper[4792]: I0309 09:28:33.963616 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/628e933e-3a59-466e-af4c-b6cf5b81e9b5-config-data\") pod \"628e933e-3a59-466e-af4c-b6cf5b81e9b5\" (UID: \"628e933e-3a59-466e-af4c-b6cf5b81e9b5\") " Mar 09 09:28:33 crc kubenswrapper[4792]: I0309 09:28:33.963651 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/628e933e-3a59-466e-af4c-b6cf5b81e9b5-combined-ca-bundle\") pod \"628e933e-3a59-466e-af4c-b6cf5b81e9b5\" (UID: \"628e933e-3a59-466e-af4c-b6cf5b81e9b5\") " Mar 09 09:28:33 crc kubenswrapper[4792]: I0309 09:28:33.963679 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/628e933e-3a59-466e-af4c-b6cf5b81e9b5-sg-core-conf-yaml\") pod \"628e933e-3a59-466e-af4c-b6cf5b81e9b5\" (UID: \"628e933e-3a59-466e-af4c-b6cf5b81e9b5\") " Mar 09 09:28:33 crc kubenswrapper[4792]: I0309 09:28:33.963736 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/628e933e-3a59-466e-af4c-b6cf5b81e9b5-log-httpd\") pod \"628e933e-3a59-466e-af4c-b6cf5b81e9b5\" (UID: \"628e933e-3a59-466e-af4c-b6cf5b81e9b5\") " Mar 09 09:28:33 crc kubenswrapper[4792]: I0309 09:28:33.963789 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/628e933e-3a59-466e-af4c-b6cf5b81e9b5-run-httpd\") pod \"628e933e-3a59-466e-af4c-b6cf5b81e9b5\" (UID: \"628e933e-3a59-466e-af4c-b6cf5b81e9b5\") " Mar 09 09:28:33 crc kubenswrapper[4792]: I0309 09:28:33.963891 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/628e933e-3a59-466e-af4c-b6cf5b81e9b5-scripts\") pod \"628e933e-3a59-466e-af4c-b6cf5b81e9b5\" (UID: \"628e933e-3a59-466e-af4c-b6cf5b81e9b5\") " Mar 09 09:28:33 crc kubenswrapper[4792]: I0309 09:28:33.968818 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/628e933e-3a59-466e-af4c-b6cf5b81e9b5-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "628e933e-3a59-466e-af4c-b6cf5b81e9b5" (UID: "628e933e-3a59-466e-af4c-b6cf5b81e9b5"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:28:33 crc kubenswrapper[4792]: I0309 09:28:33.969030 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/628e933e-3a59-466e-af4c-b6cf5b81e9b5-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "628e933e-3a59-466e-af4c-b6cf5b81e9b5" (UID: "628e933e-3a59-466e-af4c-b6cf5b81e9b5"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:28:33 crc kubenswrapper[4792]: I0309 09:28:33.972378 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/628e933e-3a59-466e-af4c-b6cf5b81e9b5-kube-api-access-nlcfh" (OuterVolumeSpecName: "kube-api-access-nlcfh") pod "628e933e-3a59-466e-af4c-b6cf5b81e9b5" (UID: "628e933e-3a59-466e-af4c-b6cf5b81e9b5"). InnerVolumeSpecName "kube-api-access-nlcfh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:28:33 crc kubenswrapper[4792]: I0309 09:28:33.975192 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/628e933e-3a59-466e-af4c-b6cf5b81e9b5-scripts" (OuterVolumeSpecName: "scripts") pod "628e933e-3a59-466e-af4c-b6cf5b81e9b5" (UID: "628e933e-3a59-466e-af4c-b6cf5b81e9b5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:28:34 crc kubenswrapper[4792]: I0309 09:28:34.008402 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/628e933e-3a59-466e-af4c-b6cf5b81e9b5-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "628e933e-3a59-466e-af4c-b6cf5b81e9b5" (UID: "628e933e-3a59-466e-af4c-b6cf5b81e9b5"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:28:34 crc kubenswrapper[4792]: I0309 09:28:34.053306 4792 generic.go:334] "Generic (PLEG): container finished" podID="628e933e-3a59-466e-af4c-b6cf5b81e9b5" containerID="0113e1ce2afe0c271d85aec9ccf6505ea4edf8294e9dc12aa96fe9623bf55607" exitCode=0 Mar 09 09:28:34 crc kubenswrapper[4792]: I0309 09:28:34.053382 4792 generic.go:334] "Generic (PLEG): container finished" podID="628e933e-3a59-466e-af4c-b6cf5b81e9b5" containerID="3b2d1718defda55c85b547cb0718807a9188b1588689d46a56ad344c31df2e52" exitCode=2 Mar 09 09:28:34 crc kubenswrapper[4792]: I0309 09:28:34.053392 4792 generic.go:334] "Generic (PLEG): container finished" podID="628e933e-3a59-466e-af4c-b6cf5b81e9b5" containerID="b7b43b573909ee32a50c2d5d57de55f370d4467b900ee0fca12091997da8deb4" exitCode=0 Mar 09 09:28:34 crc kubenswrapper[4792]: I0309 09:28:34.053401 4792 generic.go:334] "Generic (PLEG): container finished" podID="628e933e-3a59-466e-af4c-b6cf5b81e9b5" containerID="f92d5098ce7fc1d42ef17cf7608630aa36566f80bce63990e21114daeb3be56d" exitCode=0 Mar 09 09:28:34 crc kubenswrapper[4792]: I0309 
09:28:34.053417 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 09 09:28:34 crc kubenswrapper[4792]: I0309 09:28:34.053364 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"628e933e-3a59-466e-af4c-b6cf5b81e9b5","Type":"ContainerDied","Data":"0113e1ce2afe0c271d85aec9ccf6505ea4edf8294e9dc12aa96fe9623bf55607"} Mar 09 09:28:34 crc kubenswrapper[4792]: I0309 09:28:34.053482 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"628e933e-3a59-466e-af4c-b6cf5b81e9b5","Type":"ContainerDied","Data":"3b2d1718defda55c85b547cb0718807a9188b1588689d46a56ad344c31df2e52"} Mar 09 09:28:34 crc kubenswrapper[4792]: I0309 09:28:34.053493 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"628e933e-3a59-466e-af4c-b6cf5b81e9b5","Type":"ContainerDied","Data":"b7b43b573909ee32a50c2d5d57de55f370d4467b900ee0fca12091997da8deb4"} Mar 09 09:28:34 crc kubenswrapper[4792]: I0309 09:28:34.053503 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"628e933e-3a59-466e-af4c-b6cf5b81e9b5","Type":"ContainerDied","Data":"f92d5098ce7fc1d42ef17cf7608630aa36566f80bce63990e21114daeb3be56d"} Mar 09 09:28:34 crc kubenswrapper[4792]: I0309 09:28:34.053511 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"628e933e-3a59-466e-af4c-b6cf5b81e9b5","Type":"ContainerDied","Data":"1d1061c8ccacfe2469a2e4a75422a08d7db5f0a94fdc8a612e91e546b7057251"} Mar 09 09:28:34 crc kubenswrapper[4792]: I0309 09:28:34.053527 4792 scope.go:117] "RemoveContainer" containerID="0113e1ce2afe0c271d85aec9ccf6505ea4edf8294e9dc12aa96fe9623bf55607" Mar 09 09:28:34 crc kubenswrapper[4792]: I0309 09:28:34.053542 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/628e933e-3a59-466e-af4c-b6cf5b81e9b5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "628e933e-3a59-466e-af4c-b6cf5b81e9b5" (UID: "628e933e-3a59-466e-af4c-b6cf5b81e9b5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:28:34 crc kubenswrapper[4792]: I0309 09:28:34.066464 4792 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/628e933e-3a59-466e-af4c-b6cf5b81e9b5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 09:28:34 crc kubenswrapper[4792]: I0309 09:28:34.066495 4792 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/628e933e-3a59-466e-af4c-b6cf5b81e9b5-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 09 09:28:34 crc kubenswrapper[4792]: I0309 09:28:34.066506 4792 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/628e933e-3a59-466e-af4c-b6cf5b81e9b5-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 09 09:28:34 crc kubenswrapper[4792]: I0309 09:28:34.066517 4792 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/628e933e-3a59-466e-af4c-b6cf5b81e9b5-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 09 09:28:34 crc kubenswrapper[4792]: I0309 09:28:34.066528 4792 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/628e933e-3a59-466e-af4c-b6cf5b81e9b5-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 09:28:34 crc kubenswrapper[4792]: I0309 09:28:34.066537 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nlcfh\" (UniqueName: \"kubernetes.io/projected/628e933e-3a59-466e-af4c-b6cf5b81e9b5-kube-api-access-nlcfh\") on node \"crc\" DevicePath \"\"" Mar 09 09:28:34 crc kubenswrapper[4792]: I0309 09:28:34.071858 4792 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/628e933e-3a59-466e-af4c-b6cf5b81e9b5-config-data" (OuterVolumeSpecName: "config-data") pod "628e933e-3a59-466e-af4c-b6cf5b81e9b5" (UID: "628e933e-3a59-466e-af4c-b6cf5b81e9b5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:28:34 crc kubenswrapper[4792]: I0309 09:28:34.079656 4792 scope.go:117] "RemoveContainer" containerID="3b2d1718defda55c85b547cb0718807a9188b1588689d46a56ad344c31df2e52" Mar 09 09:28:34 crc kubenswrapper[4792]: I0309 09:28:34.110417 4792 scope.go:117] "RemoveContainer" containerID="b7b43b573909ee32a50c2d5d57de55f370d4467b900ee0fca12091997da8deb4" Mar 09 09:28:34 crc kubenswrapper[4792]: I0309 09:28:34.136723 4792 scope.go:117] "RemoveContainer" containerID="f92d5098ce7fc1d42ef17cf7608630aa36566f80bce63990e21114daeb3be56d" Mar 09 09:28:34 crc kubenswrapper[4792]: I0309 09:28:34.168514 4792 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/628e933e-3a59-466e-af4c-b6cf5b81e9b5-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 09:28:34 crc kubenswrapper[4792]: I0309 09:28:34.171847 4792 scope.go:117] "RemoveContainer" containerID="0113e1ce2afe0c271d85aec9ccf6505ea4edf8294e9dc12aa96fe9623bf55607" Mar 09 09:28:34 crc kubenswrapper[4792]: E0309 09:28:34.172769 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0113e1ce2afe0c271d85aec9ccf6505ea4edf8294e9dc12aa96fe9623bf55607\": container with ID starting with 0113e1ce2afe0c271d85aec9ccf6505ea4edf8294e9dc12aa96fe9623bf55607 not found: ID does not exist" containerID="0113e1ce2afe0c271d85aec9ccf6505ea4edf8294e9dc12aa96fe9623bf55607" Mar 09 09:28:34 crc kubenswrapper[4792]: I0309 09:28:34.172848 4792 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"0113e1ce2afe0c271d85aec9ccf6505ea4edf8294e9dc12aa96fe9623bf55607"} err="failed to get container status \"0113e1ce2afe0c271d85aec9ccf6505ea4edf8294e9dc12aa96fe9623bf55607\": rpc error: code = NotFound desc = could not find container \"0113e1ce2afe0c271d85aec9ccf6505ea4edf8294e9dc12aa96fe9623bf55607\": container with ID starting with 0113e1ce2afe0c271d85aec9ccf6505ea4edf8294e9dc12aa96fe9623bf55607 not found: ID does not exist" Mar 09 09:28:34 crc kubenswrapper[4792]: I0309 09:28:34.172885 4792 scope.go:117] "RemoveContainer" containerID="3b2d1718defda55c85b547cb0718807a9188b1588689d46a56ad344c31df2e52" Mar 09 09:28:34 crc kubenswrapper[4792]: E0309 09:28:34.173431 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3b2d1718defda55c85b547cb0718807a9188b1588689d46a56ad344c31df2e52\": container with ID starting with 3b2d1718defda55c85b547cb0718807a9188b1588689d46a56ad344c31df2e52 not found: ID does not exist" containerID="3b2d1718defda55c85b547cb0718807a9188b1588689d46a56ad344c31df2e52" Mar 09 09:28:34 crc kubenswrapper[4792]: I0309 09:28:34.173465 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b2d1718defda55c85b547cb0718807a9188b1588689d46a56ad344c31df2e52"} err="failed to get container status \"3b2d1718defda55c85b547cb0718807a9188b1588689d46a56ad344c31df2e52\": rpc error: code = NotFound desc = could not find container \"3b2d1718defda55c85b547cb0718807a9188b1588689d46a56ad344c31df2e52\": container with ID starting with 3b2d1718defda55c85b547cb0718807a9188b1588689d46a56ad344c31df2e52 not found: ID does not exist" Mar 09 09:28:34 crc kubenswrapper[4792]: I0309 09:28:34.173510 4792 scope.go:117] "RemoveContainer" containerID="b7b43b573909ee32a50c2d5d57de55f370d4467b900ee0fca12091997da8deb4" Mar 09 09:28:34 crc kubenswrapper[4792]: E0309 09:28:34.173869 4792 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"b7b43b573909ee32a50c2d5d57de55f370d4467b900ee0fca12091997da8deb4\": container with ID starting with b7b43b573909ee32a50c2d5d57de55f370d4467b900ee0fca12091997da8deb4 not found: ID does not exist" containerID="b7b43b573909ee32a50c2d5d57de55f370d4467b900ee0fca12091997da8deb4" Mar 09 09:28:34 crc kubenswrapper[4792]: I0309 09:28:34.173914 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b7b43b573909ee32a50c2d5d57de55f370d4467b900ee0fca12091997da8deb4"} err="failed to get container status \"b7b43b573909ee32a50c2d5d57de55f370d4467b900ee0fca12091997da8deb4\": rpc error: code = NotFound desc = could not find container \"b7b43b573909ee32a50c2d5d57de55f370d4467b900ee0fca12091997da8deb4\": container with ID starting with b7b43b573909ee32a50c2d5d57de55f370d4467b900ee0fca12091997da8deb4 not found: ID does not exist" Mar 09 09:28:34 crc kubenswrapper[4792]: I0309 09:28:34.173933 4792 scope.go:117] "RemoveContainer" containerID="f92d5098ce7fc1d42ef17cf7608630aa36566f80bce63990e21114daeb3be56d" Mar 09 09:28:34 crc kubenswrapper[4792]: E0309 09:28:34.174840 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f92d5098ce7fc1d42ef17cf7608630aa36566f80bce63990e21114daeb3be56d\": container with ID starting with f92d5098ce7fc1d42ef17cf7608630aa36566f80bce63990e21114daeb3be56d not found: ID does not exist" containerID="f92d5098ce7fc1d42ef17cf7608630aa36566f80bce63990e21114daeb3be56d" Mar 09 09:28:34 crc kubenswrapper[4792]: I0309 09:28:34.174865 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f92d5098ce7fc1d42ef17cf7608630aa36566f80bce63990e21114daeb3be56d"} err="failed to get container status \"f92d5098ce7fc1d42ef17cf7608630aa36566f80bce63990e21114daeb3be56d\": rpc error: code = NotFound desc = could not find container 
\"f92d5098ce7fc1d42ef17cf7608630aa36566f80bce63990e21114daeb3be56d\": container with ID starting with f92d5098ce7fc1d42ef17cf7608630aa36566f80bce63990e21114daeb3be56d not found: ID does not exist" Mar 09 09:28:34 crc kubenswrapper[4792]: I0309 09:28:34.174879 4792 scope.go:117] "RemoveContainer" containerID="0113e1ce2afe0c271d85aec9ccf6505ea4edf8294e9dc12aa96fe9623bf55607" Mar 09 09:28:34 crc kubenswrapper[4792]: I0309 09:28:34.175527 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0113e1ce2afe0c271d85aec9ccf6505ea4edf8294e9dc12aa96fe9623bf55607"} err="failed to get container status \"0113e1ce2afe0c271d85aec9ccf6505ea4edf8294e9dc12aa96fe9623bf55607\": rpc error: code = NotFound desc = could not find container \"0113e1ce2afe0c271d85aec9ccf6505ea4edf8294e9dc12aa96fe9623bf55607\": container with ID starting with 0113e1ce2afe0c271d85aec9ccf6505ea4edf8294e9dc12aa96fe9623bf55607 not found: ID does not exist" Mar 09 09:28:34 crc kubenswrapper[4792]: I0309 09:28:34.175552 4792 scope.go:117] "RemoveContainer" containerID="3b2d1718defda55c85b547cb0718807a9188b1588689d46a56ad344c31df2e52" Mar 09 09:28:34 crc kubenswrapper[4792]: I0309 09:28:34.175871 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b2d1718defda55c85b547cb0718807a9188b1588689d46a56ad344c31df2e52"} err="failed to get container status \"3b2d1718defda55c85b547cb0718807a9188b1588689d46a56ad344c31df2e52\": rpc error: code = NotFound desc = could not find container \"3b2d1718defda55c85b547cb0718807a9188b1588689d46a56ad344c31df2e52\": container with ID starting with 3b2d1718defda55c85b547cb0718807a9188b1588689d46a56ad344c31df2e52 not found: ID does not exist" Mar 09 09:28:34 crc kubenswrapper[4792]: I0309 09:28:34.175900 4792 scope.go:117] "RemoveContainer" containerID="b7b43b573909ee32a50c2d5d57de55f370d4467b900ee0fca12091997da8deb4" Mar 09 09:28:34 crc kubenswrapper[4792]: I0309 09:28:34.176153 4792 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b7b43b573909ee32a50c2d5d57de55f370d4467b900ee0fca12091997da8deb4"} err="failed to get container status \"b7b43b573909ee32a50c2d5d57de55f370d4467b900ee0fca12091997da8deb4\": rpc error: code = NotFound desc = could not find container \"b7b43b573909ee32a50c2d5d57de55f370d4467b900ee0fca12091997da8deb4\": container with ID starting with b7b43b573909ee32a50c2d5d57de55f370d4467b900ee0fca12091997da8deb4 not found: ID does not exist" Mar 09 09:28:34 crc kubenswrapper[4792]: I0309 09:28:34.176175 4792 scope.go:117] "RemoveContainer" containerID="f92d5098ce7fc1d42ef17cf7608630aa36566f80bce63990e21114daeb3be56d" Mar 09 09:28:34 crc kubenswrapper[4792]: I0309 09:28:34.177384 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f92d5098ce7fc1d42ef17cf7608630aa36566f80bce63990e21114daeb3be56d"} err="failed to get container status \"f92d5098ce7fc1d42ef17cf7608630aa36566f80bce63990e21114daeb3be56d\": rpc error: code = NotFound desc = could not find container \"f92d5098ce7fc1d42ef17cf7608630aa36566f80bce63990e21114daeb3be56d\": container with ID starting with f92d5098ce7fc1d42ef17cf7608630aa36566f80bce63990e21114daeb3be56d not found: ID does not exist" Mar 09 09:28:34 crc kubenswrapper[4792]: I0309 09:28:34.178748 4792 scope.go:117] "RemoveContainer" containerID="0113e1ce2afe0c271d85aec9ccf6505ea4edf8294e9dc12aa96fe9623bf55607" Mar 09 09:28:34 crc kubenswrapper[4792]: I0309 09:28:34.179357 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0113e1ce2afe0c271d85aec9ccf6505ea4edf8294e9dc12aa96fe9623bf55607"} err="failed to get container status \"0113e1ce2afe0c271d85aec9ccf6505ea4edf8294e9dc12aa96fe9623bf55607\": rpc error: code = NotFound desc = could not find container \"0113e1ce2afe0c271d85aec9ccf6505ea4edf8294e9dc12aa96fe9623bf55607\": container with ID starting with 
0113e1ce2afe0c271d85aec9ccf6505ea4edf8294e9dc12aa96fe9623bf55607 not found: ID does not exist" Mar 09 09:28:34 crc kubenswrapper[4792]: I0309 09:28:34.179389 4792 scope.go:117] "RemoveContainer" containerID="3b2d1718defda55c85b547cb0718807a9188b1588689d46a56ad344c31df2e52" Mar 09 09:28:34 crc kubenswrapper[4792]: I0309 09:28:34.180001 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b2d1718defda55c85b547cb0718807a9188b1588689d46a56ad344c31df2e52"} err="failed to get container status \"3b2d1718defda55c85b547cb0718807a9188b1588689d46a56ad344c31df2e52\": rpc error: code = NotFound desc = could not find container \"3b2d1718defda55c85b547cb0718807a9188b1588689d46a56ad344c31df2e52\": container with ID starting with 3b2d1718defda55c85b547cb0718807a9188b1588689d46a56ad344c31df2e52 not found: ID does not exist" Mar 09 09:28:34 crc kubenswrapper[4792]: I0309 09:28:34.180036 4792 scope.go:117] "RemoveContainer" containerID="b7b43b573909ee32a50c2d5d57de55f370d4467b900ee0fca12091997da8deb4" Mar 09 09:28:34 crc kubenswrapper[4792]: I0309 09:28:34.180381 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b7b43b573909ee32a50c2d5d57de55f370d4467b900ee0fca12091997da8deb4"} err="failed to get container status \"b7b43b573909ee32a50c2d5d57de55f370d4467b900ee0fca12091997da8deb4\": rpc error: code = NotFound desc = could not find container \"b7b43b573909ee32a50c2d5d57de55f370d4467b900ee0fca12091997da8deb4\": container with ID starting with b7b43b573909ee32a50c2d5d57de55f370d4467b900ee0fca12091997da8deb4 not found: ID does not exist" Mar 09 09:28:34 crc kubenswrapper[4792]: I0309 09:28:34.180402 4792 scope.go:117] "RemoveContainer" containerID="f92d5098ce7fc1d42ef17cf7608630aa36566f80bce63990e21114daeb3be56d" Mar 09 09:28:34 crc kubenswrapper[4792]: I0309 09:28:34.180633 4792 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"f92d5098ce7fc1d42ef17cf7608630aa36566f80bce63990e21114daeb3be56d"} err="failed to get container status \"f92d5098ce7fc1d42ef17cf7608630aa36566f80bce63990e21114daeb3be56d\": rpc error: code = NotFound desc = could not find container \"f92d5098ce7fc1d42ef17cf7608630aa36566f80bce63990e21114daeb3be56d\": container with ID starting with f92d5098ce7fc1d42ef17cf7608630aa36566f80bce63990e21114daeb3be56d not found: ID does not exist" Mar 09 09:28:34 crc kubenswrapper[4792]: I0309 09:28:34.180658 4792 scope.go:117] "RemoveContainer" containerID="0113e1ce2afe0c271d85aec9ccf6505ea4edf8294e9dc12aa96fe9623bf55607" Mar 09 09:28:34 crc kubenswrapper[4792]: I0309 09:28:34.180984 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0113e1ce2afe0c271d85aec9ccf6505ea4edf8294e9dc12aa96fe9623bf55607"} err="failed to get container status \"0113e1ce2afe0c271d85aec9ccf6505ea4edf8294e9dc12aa96fe9623bf55607\": rpc error: code = NotFound desc = could not find container \"0113e1ce2afe0c271d85aec9ccf6505ea4edf8294e9dc12aa96fe9623bf55607\": container with ID starting with 0113e1ce2afe0c271d85aec9ccf6505ea4edf8294e9dc12aa96fe9623bf55607 not found: ID does not exist" Mar 09 09:28:34 crc kubenswrapper[4792]: I0309 09:28:34.181011 4792 scope.go:117] "RemoveContainer" containerID="3b2d1718defda55c85b547cb0718807a9188b1588689d46a56ad344c31df2e52" Mar 09 09:28:34 crc kubenswrapper[4792]: I0309 09:28:34.181326 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b2d1718defda55c85b547cb0718807a9188b1588689d46a56ad344c31df2e52"} err="failed to get container status \"3b2d1718defda55c85b547cb0718807a9188b1588689d46a56ad344c31df2e52\": rpc error: code = NotFound desc = could not find container \"3b2d1718defda55c85b547cb0718807a9188b1588689d46a56ad344c31df2e52\": container with ID starting with 3b2d1718defda55c85b547cb0718807a9188b1588689d46a56ad344c31df2e52 not found: ID does not 
exist" Mar 09 09:28:34 crc kubenswrapper[4792]: I0309 09:28:34.181357 4792 scope.go:117] "RemoveContainer" containerID="b7b43b573909ee32a50c2d5d57de55f370d4467b900ee0fca12091997da8deb4" Mar 09 09:28:34 crc kubenswrapper[4792]: I0309 09:28:34.181702 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b7b43b573909ee32a50c2d5d57de55f370d4467b900ee0fca12091997da8deb4"} err="failed to get container status \"b7b43b573909ee32a50c2d5d57de55f370d4467b900ee0fca12091997da8deb4\": rpc error: code = NotFound desc = could not find container \"b7b43b573909ee32a50c2d5d57de55f370d4467b900ee0fca12091997da8deb4\": container with ID starting with b7b43b573909ee32a50c2d5d57de55f370d4467b900ee0fca12091997da8deb4 not found: ID does not exist" Mar 09 09:28:34 crc kubenswrapper[4792]: I0309 09:28:34.181728 4792 scope.go:117] "RemoveContainer" containerID="f92d5098ce7fc1d42ef17cf7608630aa36566f80bce63990e21114daeb3be56d" Mar 09 09:28:34 crc kubenswrapper[4792]: I0309 09:28:34.182180 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f92d5098ce7fc1d42ef17cf7608630aa36566f80bce63990e21114daeb3be56d"} err="failed to get container status \"f92d5098ce7fc1d42ef17cf7608630aa36566f80bce63990e21114daeb3be56d\": rpc error: code = NotFound desc = could not find container \"f92d5098ce7fc1d42ef17cf7608630aa36566f80bce63990e21114daeb3be56d\": container with ID starting with f92d5098ce7fc1d42ef17cf7608630aa36566f80bce63990e21114daeb3be56d not found: ID does not exist" Mar 09 09:28:34 crc kubenswrapper[4792]: I0309 09:28:34.393029 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 09 09:28:34 crc kubenswrapper[4792]: I0309 09:28:34.401018 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 09 09:28:34 crc kubenswrapper[4792]: I0309 09:28:34.429693 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 09 
09:28:34 crc kubenswrapper[4792]: E0309 09:28:34.430182 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="628e933e-3a59-466e-af4c-b6cf5b81e9b5" containerName="ceilometer-notification-agent" Mar 09 09:28:34 crc kubenswrapper[4792]: I0309 09:28:34.430206 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="628e933e-3a59-466e-af4c-b6cf5b81e9b5" containerName="ceilometer-notification-agent" Mar 09 09:28:34 crc kubenswrapper[4792]: E0309 09:28:34.430226 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="628e933e-3a59-466e-af4c-b6cf5b81e9b5" containerName="proxy-httpd" Mar 09 09:28:34 crc kubenswrapper[4792]: I0309 09:28:34.430235 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="628e933e-3a59-466e-af4c-b6cf5b81e9b5" containerName="proxy-httpd" Mar 09 09:28:34 crc kubenswrapper[4792]: E0309 09:28:34.430260 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="628e933e-3a59-466e-af4c-b6cf5b81e9b5" containerName="ceilometer-central-agent" Mar 09 09:28:34 crc kubenswrapper[4792]: I0309 09:28:34.430268 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="628e933e-3a59-466e-af4c-b6cf5b81e9b5" containerName="ceilometer-central-agent" Mar 09 09:28:34 crc kubenswrapper[4792]: E0309 09:28:34.430287 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="628e933e-3a59-466e-af4c-b6cf5b81e9b5" containerName="sg-core" Mar 09 09:28:34 crc kubenswrapper[4792]: I0309 09:28:34.430294 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="628e933e-3a59-466e-af4c-b6cf5b81e9b5" containerName="sg-core" Mar 09 09:28:34 crc kubenswrapper[4792]: I0309 09:28:34.430441 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="628e933e-3a59-466e-af4c-b6cf5b81e9b5" containerName="ceilometer-notification-agent" Mar 09 09:28:34 crc kubenswrapper[4792]: I0309 09:28:34.430456 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="628e933e-3a59-466e-af4c-b6cf5b81e9b5" 
containerName="sg-core" Mar 09 09:28:34 crc kubenswrapper[4792]: I0309 09:28:34.430465 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="628e933e-3a59-466e-af4c-b6cf5b81e9b5" containerName="ceilometer-central-agent" Mar 09 09:28:34 crc kubenswrapper[4792]: I0309 09:28:34.430477 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="628e933e-3a59-466e-af4c-b6cf5b81e9b5" containerName="proxy-httpd" Mar 09 09:28:34 crc kubenswrapper[4792]: I0309 09:28:34.432013 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 09 09:28:34 crc kubenswrapper[4792]: I0309 09:28:34.436370 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 09 09:28:34 crc kubenswrapper[4792]: I0309 09:28:34.438558 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 09 09:28:34 crc kubenswrapper[4792]: I0309 09:28:34.455341 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 09 09:28:34 crc kubenswrapper[4792]: I0309 09:28:34.475142 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0425ca30-3bee-4b08-a54c-4d60cbd32edf-scripts\") pod \"ceilometer-0\" (UID: \"0425ca30-3bee-4b08-a54c-4d60cbd32edf\") " pod="openstack/ceilometer-0" Mar 09 09:28:34 crc kubenswrapper[4792]: I0309 09:28:34.475204 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0425ca30-3bee-4b08-a54c-4d60cbd32edf-run-httpd\") pod \"ceilometer-0\" (UID: \"0425ca30-3bee-4b08-a54c-4d60cbd32edf\") " pod="openstack/ceilometer-0" Mar 09 09:28:34 crc kubenswrapper[4792]: I0309 09:28:34.475229 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/0425ca30-3bee-4b08-a54c-4d60cbd32edf-log-httpd\") pod \"ceilometer-0\" (UID: \"0425ca30-3bee-4b08-a54c-4d60cbd32edf\") " pod="openstack/ceilometer-0" Mar 09 09:28:34 crc kubenswrapper[4792]: I0309 09:28:34.475256 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mtw5d\" (UniqueName: \"kubernetes.io/projected/0425ca30-3bee-4b08-a54c-4d60cbd32edf-kube-api-access-mtw5d\") pod \"ceilometer-0\" (UID: \"0425ca30-3bee-4b08-a54c-4d60cbd32edf\") " pod="openstack/ceilometer-0" Mar 09 09:28:34 crc kubenswrapper[4792]: I0309 09:28:34.475289 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0425ca30-3bee-4b08-a54c-4d60cbd32edf-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0425ca30-3bee-4b08-a54c-4d60cbd32edf\") " pod="openstack/ceilometer-0" Mar 09 09:28:34 crc kubenswrapper[4792]: I0309 09:28:34.475304 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0425ca30-3bee-4b08-a54c-4d60cbd32edf-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0425ca30-3bee-4b08-a54c-4d60cbd32edf\") " pod="openstack/ceilometer-0" Mar 09 09:28:34 crc kubenswrapper[4792]: I0309 09:28:34.475366 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0425ca30-3bee-4b08-a54c-4d60cbd32edf-config-data\") pod \"ceilometer-0\" (UID: \"0425ca30-3bee-4b08-a54c-4d60cbd32edf\") " pod="openstack/ceilometer-0" Mar 09 09:28:34 crc kubenswrapper[4792]: I0309 09:28:34.576306 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0425ca30-3bee-4b08-a54c-4d60cbd32edf-scripts\") pod \"ceilometer-0\" (UID: 
\"0425ca30-3bee-4b08-a54c-4d60cbd32edf\") " pod="openstack/ceilometer-0" Mar 09 09:28:34 crc kubenswrapper[4792]: I0309 09:28:34.576370 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0425ca30-3bee-4b08-a54c-4d60cbd32edf-run-httpd\") pod \"ceilometer-0\" (UID: \"0425ca30-3bee-4b08-a54c-4d60cbd32edf\") " pod="openstack/ceilometer-0" Mar 09 09:28:34 crc kubenswrapper[4792]: I0309 09:28:34.576393 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0425ca30-3bee-4b08-a54c-4d60cbd32edf-log-httpd\") pod \"ceilometer-0\" (UID: \"0425ca30-3bee-4b08-a54c-4d60cbd32edf\") " pod="openstack/ceilometer-0" Mar 09 09:28:34 crc kubenswrapper[4792]: I0309 09:28:34.576425 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mtw5d\" (UniqueName: \"kubernetes.io/projected/0425ca30-3bee-4b08-a54c-4d60cbd32edf-kube-api-access-mtw5d\") pod \"ceilometer-0\" (UID: \"0425ca30-3bee-4b08-a54c-4d60cbd32edf\") " pod="openstack/ceilometer-0" Mar 09 09:28:34 crc kubenswrapper[4792]: I0309 09:28:34.576463 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0425ca30-3bee-4b08-a54c-4d60cbd32edf-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0425ca30-3bee-4b08-a54c-4d60cbd32edf\") " pod="openstack/ceilometer-0" Mar 09 09:28:34 crc kubenswrapper[4792]: I0309 09:28:34.576481 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0425ca30-3bee-4b08-a54c-4d60cbd32edf-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0425ca30-3bee-4b08-a54c-4d60cbd32edf\") " pod="openstack/ceilometer-0" Mar 09 09:28:34 crc kubenswrapper[4792]: I0309 09:28:34.576550 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0425ca30-3bee-4b08-a54c-4d60cbd32edf-config-data\") pod \"ceilometer-0\" (UID: \"0425ca30-3bee-4b08-a54c-4d60cbd32edf\") " pod="openstack/ceilometer-0" Mar 09 09:28:34 crc kubenswrapper[4792]: I0309 09:28:34.577025 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0425ca30-3bee-4b08-a54c-4d60cbd32edf-log-httpd\") pod \"ceilometer-0\" (UID: \"0425ca30-3bee-4b08-a54c-4d60cbd32edf\") " pod="openstack/ceilometer-0" Mar 09 09:28:34 crc kubenswrapper[4792]: I0309 09:28:34.577179 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0425ca30-3bee-4b08-a54c-4d60cbd32edf-run-httpd\") pod \"ceilometer-0\" (UID: \"0425ca30-3bee-4b08-a54c-4d60cbd32edf\") " pod="openstack/ceilometer-0" Mar 09 09:28:34 crc kubenswrapper[4792]: I0309 09:28:34.581228 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0425ca30-3bee-4b08-a54c-4d60cbd32edf-scripts\") pod \"ceilometer-0\" (UID: \"0425ca30-3bee-4b08-a54c-4d60cbd32edf\") " pod="openstack/ceilometer-0" Mar 09 09:28:34 crc kubenswrapper[4792]: I0309 09:28:34.581232 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0425ca30-3bee-4b08-a54c-4d60cbd32edf-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0425ca30-3bee-4b08-a54c-4d60cbd32edf\") " pod="openstack/ceilometer-0" Mar 09 09:28:34 crc kubenswrapper[4792]: I0309 09:28:34.581663 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0425ca30-3bee-4b08-a54c-4d60cbd32edf-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0425ca30-3bee-4b08-a54c-4d60cbd32edf\") " pod="openstack/ceilometer-0" Mar 09 09:28:34 crc kubenswrapper[4792]: I0309 09:28:34.592696 4792 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0425ca30-3bee-4b08-a54c-4d60cbd32edf-config-data\") pod \"ceilometer-0\" (UID: \"0425ca30-3bee-4b08-a54c-4d60cbd32edf\") " pod="openstack/ceilometer-0" Mar 09 09:28:34 crc kubenswrapper[4792]: I0309 09:28:34.599737 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mtw5d\" (UniqueName: \"kubernetes.io/projected/0425ca30-3bee-4b08-a54c-4d60cbd32edf-kube-api-access-mtw5d\") pod \"ceilometer-0\" (UID: \"0425ca30-3bee-4b08-a54c-4d60cbd32edf\") " pod="openstack/ceilometer-0" Mar 09 09:28:34 crc kubenswrapper[4792]: I0309 09:28:34.791455 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 09 09:28:35 crc kubenswrapper[4792]: I0309 09:28:35.296763 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 09 09:28:35 crc kubenswrapper[4792]: I0309 09:28:35.675188 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="628e933e-3a59-466e-af4c-b6cf5b81e9b5" path="/var/lib/kubelet/pods/628e933e-3a59-466e-af4c-b6cf5b81e9b5/volumes" Mar 09 09:28:35 crc kubenswrapper[4792]: E0309 09:28:35.802165 4792 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd28e5b5b_e371_4a9d_8725_017aa98ac944.slice/crio-3b4509095c9866a7f9275d0f77d8faeb571c55bff7fdbc5d0a827ab3c2e12e57\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd28e5b5b_e371_4a9d_8725_017aa98ac944.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode5d6bee9_1576_4852_a0ca_f4da5d2930f9.slice\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1927dc12_81aa_463b_b356_3784c72f3245.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1927dc12_81aa_463b_b356_3784c72f3245.slice/crio-8bba1dc1d8cacb61b7ab27a2317c991e9d5c0b6d568d7552f99bc47434486507\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod68422fad_6d4a_4d8a_ac66_b68dd7486525.slice\": RecentStats: unable to find data in memory cache]" Mar 09 09:28:36 crc kubenswrapper[4792]: I0309 09:28:36.082414 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0425ca30-3bee-4b08-a54c-4d60cbd32edf","Type":"ContainerStarted","Data":"95a2194701df02e88fcff1ab3cc4393279c53267f9ace718d654f02a53fef423"} Mar 09 09:28:36 crc kubenswrapper[4792]: I0309 09:28:36.082753 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0425ca30-3bee-4b08-a54c-4d60cbd32edf","Type":"ContainerStarted","Data":"ece602dccf65c8002751bfbc4069098a23d98603e14afce85d731cae7f300e02"} Mar 09 09:28:37 crc kubenswrapper[4792]: I0309 09:28:37.092451 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0425ca30-3bee-4b08-a54c-4d60cbd32edf","Type":"ContainerStarted","Data":"2a3848e438c6dbffee43051fbcf0d3205234a605c83601ad68937e50e94e4faf"} Mar 09 09:28:38 crc kubenswrapper[4792]: I0309 09:28:38.103881 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0425ca30-3bee-4b08-a54c-4d60cbd32edf","Type":"ContainerStarted","Data":"0b40ce2cd6f544d951e34e90c37e6b91963891ae65009ae096e9fa2c0e39d383"} Mar 09 09:28:40 crc kubenswrapper[4792]: I0309 09:28:40.121159 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"0425ca30-3bee-4b08-a54c-4d60cbd32edf","Type":"ContainerStarted","Data":"5ebfbb65fac45832891c2dcd0f138cdd9b15aad2e6c7e9eceac9bd8d022f0f38"} Mar 09 09:28:40 crc kubenswrapper[4792]: I0309 09:28:40.121456 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 09 09:28:40 crc kubenswrapper[4792]: I0309 09:28:40.143644 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.141059765 podStartE2EDuration="6.143627s" podCreationTimestamp="2026-03-09 09:28:34 +0000 UTC" firstStartedPulling="2026-03-09 09:28:35.302270623 +0000 UTC m=+1280.332471375" lastFinishedPulling="2026-03-09 09:28:39.304837858 +0000 UTC m=+1284.335038610" observedRunningTime="2026-03-09 09:28:40.140946863 +0000 UTC m=+1285.171147615" watchObservedRunningTime="2026-03-09 09:28:40.143627 +0000 UTC m=+1285.173827762" Mar 09 09:28:46 crc kubenswrapper[4792]: E0309 09:28:46.028317 4792 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1927dc12_81aa_463b_b356_3784c72f3245.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode5d6bee9_1576_4852_a0ca_f4da5d2930f9.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd28e5b5b_e371_4a9d_8725_017aa98ac944.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd28e5b5b_e371_4a9d_8725_017aa98ac944.slice/crio-3b4509095c9866a7f9275d0f77d8faeb571c55bff7fdbc5d0a827ab3c2e12e57\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod68422fad_6d4a_4d8a_ac66_b68dd7486525.slice\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1927dc12_81aa_463b_b356_3784c72f3245.slice/crio-8bba1dc1d8cacb61b7ab27a2317c991e9d5c0b6d568d7552f99bc47434486507\": RecentStats: unable to find data in memory cache]" Mar 09 09:28:50 crc kubenswrapper[4792]: I0309 09:28:50.212367 4792 generic.go:334] "Generic (PLEG): container finished" podID="65c96452-6003-4585-b71c-dbb398600617" containerID="cb03b28b5d07fd12acb7d4043a1662fbb06abb286134f85927a0d0cf4166e655" exitCode=0 Mar 09 09:28:50 crc kubenswrapper[4792]: I0309 09:28:50.212442 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-57vwq" event={"ID":"65c96452-6003-4585-b71c-dbb398600617","Type":"ContainerDied","Data":"cb03b28b5d07fd12acb7d4043a1662fbb06abb286134f85927a0d0cf4166e655"} Mar 09 09:28:51 crc kubenswrapper[4792]: I0309 09:28:51.561355 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-57vwq" Mar 09 09:28:51 crc kubenswrapper[4792]: I0309 09:28:51.683937 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xc9vg\" (UniqueName: \"kubernetes.io/projected/65c96452-6003-4585-b71c-dbb398600617-kube-api-access-xc9vg\") pod \"65c96452-6003-4585-b71c-dbb398600617\" (UID: \"65c96452-6003-4585-b71c-dbb398600617\") " Mar 09 09:28:51 crc kubenswrapper[4792]: I0309 09:28:51.684024 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65c96452-6003-4585-b71c-dbb398600617-combined-ca-bundle\") pod \"65c96452-6003-4585-b71c-dbb398600617\" (UID: \"65c96452-6003-4585-b71c-dbb398600617\") " Mar 09 09:28:51 crc kubenswrapper[4792]: I0309 09:28:51.684052 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65c96452-6003-4585-b71c-dbb398600617-config-data\") pod 
\"65c96452-6003-4585-b71c-dbb398600617\" (UID: \"65c96452-6003-4585-b71c-dbb398600617\") " Mar 09 09:28:51 crc kubenswrapper[4792]: I0309 09:28:51.684319 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/65c96452-6003-4585-b71c-dbb398600617-scripts\") pod \"65c96452-6003-4585-b71c-dbb398600617\" (UID: \"65c96452-6003-4585-b71c-dbb398600617\") " Mar 09 09:28:51 crc kubenswrapper[4792]: I0309 09:28:51.689612 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65c96452-6003-4585-b71c-dbb398600617-kube-api-access-xc9vg" (OuterVolumeSpecName: "kube-api-access-xc9vg") pod "65c96452-6003-4585-b71c-dbb398600617" (UID: "65c96452-6003-4585-b71c-dbb398600617"). InnerVolumeSpecName "kube-api-access-xc9vg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:28:51 crc kubenswrapper[4792]: I0309 09:28:51.689739 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65c96452-6003-4585-b71c-dbb398600617-scripts" (OuterVolumeSpecName: "scripts") pod "65c96452-6003-4585-b71c-dbb398600617" (UID: "65c96452-6003-4585-b71c-dbb398600617"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:28:51 crc kubenswrapper[4792]: I0309 09:28:51.710284 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65c96452-6003-4585-b71c-dbb398600617-config-data" (OuterVolumeSpecName: "config-data") pod "65c96452-6003-4585-b71c-dbb398600617" (UID: "65c96452-6003-4585-b71c-dbb398600617"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:28:51 crc kubenswrapper[4792]: I0309 09:28:51.710440 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65c96452-6003-4585-b71c-dbb398600617-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "65c96452-6003-4585-b71c-dbb398600617" (UID: "65c96452-6003-4585-b71c-dbb398600617"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:28:51 crc kubenswrapper[4792]: I0309 09:28:51.786291 4792 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/65c96452-6003-4585-b71c-dbb398600617-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 09:28:51 crc kubenswrapper[4792]: I0309 09:28:51.786332 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xc9vg\" (UniqueName: \"kubernetes.io/projected/65c96452-6003-4585-b71c-dbb398600617-kube-api-access-xc9vg\") on node \"crc\" DevicePath \"\"" Mar 09 09:28:51 crc kubenswrapper[4792]: I0309 09:28:51.786342 4792 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65c96452-6003-4585-b71c-dbb398600617-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 09:28:51 crc kubenswrapper[4792]: I0309 09:28:51.786352 4792 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65c96452-6003-4585-b71c-dbb398600617-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 09:28:52 crc kubenswrapper[4792]: I0309 09:28:52.233166 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-57vwq" event={"ID":"65c96452-6003-4585-b71c-dbb398600617","Type":"ContainerDied","Data":"589ed64a1d5e4f2bcf291e17ee6c49b018a5751d6338667218b83689c812481d"} Mar 09 09:28:52 crc kubenswrapper[4792]: I0309 09:28:52.233491 4792 pod_container_deletor.go:80] "Container not 
found in pod's containers" containerID="589ed64a1d5e4f2bcf291e17ee6c49b018a5751d6338667218b83689c812481d" Mar 09 09:28:52 crc kubenswrapper[4792]: I0309 09:28:52.233220 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-57vwq" Mar 09 09:28:52 crc kubenswrapper[4792]: I0309 09:28:52.339270 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 09 09:28:52 crc kubenswrapper[4792]: E0309 09:28:52.339657 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65c96452-6003-4585-b71c-dbb398600617" containerName="nova-cell0-conductor-db-sync" Mar 09 09:28:52 crc kubenswrapper[4792]: I0309 09:28:52.339678 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="65c96452-6003-4585-b71c-dbb398600617" containerName="nova-cell0-conductor-db-sync" Mar 09 09:28:52 crc kubenswrapper[4792]: I0309 09:28:52.339916 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="65c96452-6003-4585-b71c-dbb398600617" containerName="nova-cell0-conductor-db-sync" Mar 09 09:28:52 crc kubenswrapper[4792]: I0309 09:28:52.340566 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 09 09:28:52 crc kubenswrapper[4792]: I0309 09:28:52.343965 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-6w5mj" Mar 09 09:28:52 crc kubenswrapper[4792]: I0309 09:28:52.344100 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Mar 09 09:28:52 crc kubenswrapper[4792]: I0309 09:28:52.356846 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 09 09:28:52 crc kubenswrapper[4792]: I0309 09:28:52.398058 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33f67afd-db61-4209-b505-8ec8edcabfc1-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"33f67afd-db61-4209-b505-8ec8edcabfc1\") " pod="openstack/nova-cell0-conductor-0" Mar 09 09:28:52 crc kubenswrapper[4792]: I0309 09:28:52.398176 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t9rtj\" (UniqueName: \"kubernetes.io/projected/33f67afd-db61-4209-b505-8ec8edcabfc1-kube-api-access-t9rtj\") pod \"nova-cell0-conductor-0\" (UID: \"33f67afd-db61-4209-b505-8ec8edcabfc1\") " pod="openstack/nova-cell0-conductor-0" Mar 09 09:28:52 crc kubenswrapper[4792]: I0309 09:28:52.398457 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33f67afd-db61-4209-b505-8ec8edcabfc1-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"33f67afd-db61-4209-b505-8ec8edcabfc1\") " pod="openstack/nova-cell0-conductor-0" Mar 09 09:28:52 crc kubenswrapper[4792]: I0309 09:28:52.501174 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/33f67afd-db61-4209-b505-8ec8edcabfc1-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"33f67afd-db61-4209-b505-8ec8edcabfc1\") " pod="openstack/nova-cell0-conductor-0"
Mar 09 09:28:52 crc kubenswrapper[4792]: I0309 09:28:52.501244 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t9rtj\" (UniqueName: \"kubernetes.io/projected/33f67afd-db61-4209-b505-8ec8edcabfc1-kube-api-access-t9rtj\") pod \"nova-cell0-conductor-0\" (UID: \"33f67afd-db61-4209-b505-8ec8edcabfc1\") " pod="openstack/nova-cell0-conductor-0"
Mar 09 09:28:52 crc kubenswrapper[4792]: I0309 09:28:52.501274 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33f67afd-db61-4209-b505-8ec8edcabfc1-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"33f67afd-db61-4209-b505-8ec8edcabfc1\") " pod="openstack/nova-cell0-conductor-0"
Mar 09 09:28:52 crc kubenswrapper[4792]: I0309 09:28:52.506872 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33f67afd-db61-4209-b505-8ec8edcabfc1-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"33f67afd-db61-4209-b505-8ec8edcabfc1\") " pod="openstack/nova-cell0-conductor-0"
Mar 09 09:28:52 crc kubenswrapper[4792]: I0309 09:28:52.507518 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33f67afd-db61-4209-b505-8ec8edcabfc1-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"33f67afd-db61-4209-b505-8ec8edcabfc1\") " pod="openstack/nova-cell0-conductor-0"
Mar 09 09:28:52 crc kubenswrapper[4792]: I0309 09:28:52.518542 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t9rtj\" (UniqueName: \"kubernetes.io/projected/33f67afd-db61-4209-b505-8ec8edcabfc1-kube-api-access-t9rtj\") pod \"nova-cell0-conductor-0\" (UID: \"33f67afd-db61-4209-b505-8ec8edcabfc1\") " pod="openstack/nova-cell0-conductor-0"
Mar 09 09:28:52 crc kubenswrapper[4792]: I0309 09:28:52.659224 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0"
Mar 09 09:28:52 crc kubenswrapper[4792]: I0309 09:28:52.949054 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"]
Mar 09 09:28:52 crc kubenswrapper[4792]: W0309 09:28:52.955501 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod33f67afd_db61_4209_b505_8ec8edcabfc1.slice/crio-0749dc0ba0e7b2eabd87e77935e28b5f6d1ada606b49cec26e1f6705faf58213 WatchSource:0}: Error finding container 0749dc0ba0e7b2eabd87e77935e28b5f6d1ada606b49cec26e1f6705faf58213: Status 404 returned error can't find the container with id 0749dc0ba0e7b2eabd87e77935e28b5f6d1ada606b49cec26e1f6705faf58213
Mar 09 09:28:53 crc kubenswrapper[4792]: I0309 09:28:53.243277 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"33f67afd-db61-4209-b505-8ec8edcabfc1","Type":"ContainerStarted","Data":"526b4b9933265554431f9c646e6717cc61f3e390e76cab98fdcc07ec1b7fefde"}
Mar 09 09:28:53 crc kubenswrapper[4792]: I0309 09:28:53.243643 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0"
Mar 09 09:28:53 crc kubenswrapper[4792]: I0309 09:28:53.243660 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"33f67afd-db61-4209-b505-8ec8edcabfc1","Type":"ContainerStarted","Data":"0749dc0ba0e7b2eabd87e77935e28b5f6d1ada606b49cec26e1f6705faf58213"}
Mar 09 09:28:53 crc kubenswrapper[4792]: I0309 09:28:53.265776 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=1.265758968 podStartE2EDuration="1.265758968s" podCreationTimestamp="2026-03-09 09:28:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:28:53.260904028 +0000 UTC m=+1298.291104820" watchObservedRunningTime="2026-03-09 09:28:53.265758968 +0000 UTC m=+1298.295959720"
Mar 09 09:28:56 crc kubenswrapper[4792]: E0309 09:28:56.258982 4792 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd28e5b5b_e371_4a9d_8725_017aa98ac944.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode5d6bee9_1576_4852_a0ca_f4da5d2930f9.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod68422fad_6d4a_4d8a_ac66_b68dd7486525.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1927dc12_81aa_463b_b356_3784c72f3245.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1927dc12_81aa_463b_b356_3784c72f3245.slice/crio-8bba1dc1d8cacb61b7ab27a2317c991e9d5c0b6d568d7552f99bc47434486507\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd28e5b5b_e371_4a9d_8725_017aa98ac944.slice/crio-3b4509095c9866a7f9275d0f77d8faeb571c55bff7fdbc5d0a827ab3c2e12e57\": RecentStats: unable to find data in memory cache]"
Mar 09 09:29:02 crc kubenswrapper[4792]: I0309 09:29:02.686479 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0"
Mar 09 09:29:03 crc kubenswrapper[4792]: I0309 09:29:03.188236 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-xv7wk"]
Mar 09 09:29:03 crc kubenswrapper[4792]: I0309 09:29:03.189917 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-xv7wk"
Mar 09 09:29:03 crc kubenswrapper[4792]: I0309 09:29:03.197977 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts"
Mar 09 09:29:03 crc kubenswrapper[4792]: I0309 09:29:03.198023 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data"
Mar 09 09:29:03 crc kubenswrapper[4792]: I0309 09:29:03.213724 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-xv7wk"]
Mar 09 09:29:03 crc kubenswrapper[4792]: I0309 09:29:03.311633 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b5277744-0423-4554-8512-da2a35eaaafd-scripts\") pod \"nova-cell0-cell-mapping-xv7wk\" (UID: \"b5277744-0423-4554-8512-da2a35eaaafd\") " pod="openstack/nova-cell0-cell-mapping-xv7wk"
Mar 09 09:29:03 crc kubenswrapper[4792]: I0309 09:29:03.311790 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gl5f8\" (UniqueName: \"kubernetes.io/projected/b5277744-0423-4554-8512-da2a35eaaafd-kube-api-access-gl5f8\") pod \"nova-cell0-cell-mapping-xv7wk\" (UID: \"b5277744-0423-4554-8512-da2a35eaaafd\") " pod="openstack/nova-cell0-cell-mapping-xv7wk"
Mar 09 09:29:03 crc kubenswrapper[4792]: I0309 09:29:03.311840 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5277744-0423-4554-8512-da2a35eaaafd-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-xv7wk\" (UID: \"b5277744-0423-4554-8512-da2a35eaaafd\") " pod="openstack/nova-cell0-cell-mapping-xv7wk"
Mar 09 09:29:03 crc kubenswrapper[4792]: I0309 09:29:03.311864 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5277744-0423-4554-8512-da2a35eaaafd-config-data\") pod \"nova-cell0-cell-mapping-xv7wk\" (UID: \"b5277744-0423-4554-8512-da2a35eaaafd\") " pod="openstack/nova-cell0-cell-mapping-xv7wk"
Mar 09 09:29:03 crc kubenswrapper[4792]: I0309 09:29:03.377838 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Mar 09 09:29:03 crc kubenswrapper[4792]: I0309 09:29:03.381782 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 09 09:29:03 crc kubenswrapper[4792]: I0309 09:29:03.385672 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Mar 09 09:29:03 crc kubenswrapper[4792]: I0309 09:29:03.393020 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Mar 09 09:29:03 crc kubenswrapper[4792]: I0309 09:29:03.412973 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gl5f8\" (UniqueName: \"kubernetes.io/projected/b5277744-0423-4554-8512-da2a35eaaafd-kube-api-access-gl5f8\") pod \"nova-cell0-cell-mapping-xv7wk\" (UID: \"b5277744-0423-4554-8512-da2a35eaaafd\") " pod="openstack/nova-cell0-cell-mapping-xv7wk"
Mar 09 09:29:03 crc kubenswrapper[4792]: I0309 09:29:03.413108 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5277744-0423-4554-8512-da2a35eaaafd-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-xv7wk\" (UID: \"b5277744-0423-4554-8512-da2a35eaaafd\") " pod="openstack/nova-cell0-cell-mapping-xv7wk"
Mar 09 09:29:03 crc kubenswrapper[4792]: I0309 09:29:03.413131 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5277744-0423-4554-8512-da2a35eaaafd-config-data\") pod \"nova-cell0-cell-mapping-xv7wk\" (UID: \"b5277744-0423-4554-8512-da2a35eaaafd\") " pod="openstack/nova-cell0-cell-mapping-xv7wk"
Mar 09 09:29:03 crc kubenswrapper[4792]: I0309 09:29:03.414091 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b5277744-0423-4554-8512-da2a35eaaafd-scripts\") pod \"nova-cell0-cell-mapping-xv7wk\" (UID: \"b5277744-0423-4554-8512-da2a35eaaafd\") " pod="openstack/nova-cell0-cell-mapping-xv7wk"
Mar 09 09:29:03 crc kubenswrapper[4792]: I0309 09:29:03.419022 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5277744-0423-4554-8512-da2a35eaaafd-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-xv7wk\" (UID: \"b5277744-0423-4554-8512-da2a35eaaafd\") " pod="openstack/nova-cell0-cell-mapping-xv7wk"
Mar 09 09:29:03 crc kubenswrapper[4792]: I0309 09:29:03.419539 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5277744-0423-4554-8512-da2a35eaaafd-config-data\") pod \"nova-cell0-cell-mapping-xv7wk\" (UID: \"b5277744-0423-4554-8512-da2a35eaaafd\") " pod="openstack/nova-cell0-cell-mapping-xv7wk"
Mar 09 09:29:03 crc kubenswrapper[4792]: I0309 09:29:03.421181 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b5277744-0423-4554-8512-da2a35eaaafd-scripts\") pod \"nova-cell0-cell-mapping-xv7wk\" (UID: \"b5277744-0423-4554-8512-da2a35eaaafd\") " pod="openstack/nova-cell0-cell-mapping-xv7wk"
Mar 09 09:29:03 crc kubenswrapper[4792]: I0309 09:29:03.452830 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gl5f8\" (UniqueName: \"kubernetes.io/projected/b5277744-0423-4554-8512-da2a35eaaafd-kube-api-access-gl5f8\") pod \"nova-cell0-cell-mapping-xv7wk\" (UID: \"b5277744-0423-4554-8512-da2a35eaaafd\") " pod="openstack/nova-cell0-cell-mapping-xv7wk"
Mar 09 09:29:03 crc kubenswrapper[4792]: I0309 09:29:03.512042 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-xv7wk"
Mar 09 09:29:03 crc kubenswrapper[4792]: I0309 09:29:03.516268 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/38935b95-760a-4b84-bb4a-0a89f6b2d1ff-logs\") pod \"nova-api-0\" (UID: \"38935b95-760a-4b84-bb4a-0a89f6b2d1ff\") " pod="openstack/nova-api-0"
Mar 09 09:29:03 crc kubenswrapper[4792]: I0309 09:29:03.516374 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k68hk\" (UniqueName: \"kubernetes.io/projected/38935b95-760a-4b84-bb4a-0a89f6b2d1ff-kube-api-access-k68hk\") pod \"nova-api-0\" (UID: \"38935b95-760a-4b84-bb4a-0a89f6b2d1ff\") " pod="openstack/nova-api-0"
Mar 09 09:29:03 crc kubenswrapper[4792]: I0309 09:29:03.516521 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38935b95-760a-4b84-bb4a-0a89f6b2d1ff-config-data\") pod \"nova-api-0\" (UID: \"38935b95-760a-4b84-bb4a-0a89f6b2d1ff\") " pod="openstack/nova-api-0"
Mar 09 09:29:03 crc kubenswrapper[4792]: I0309 09:29:03.516611 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38935b95-760a-4b84-bb4a-0a89f6b2d1ff-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"38935b95-760a-4b84-bb4a-0a89f6b2d1ff\") " pod="openstack/nova-api-0"
Mar 09 09:29:03 crc kubenswrapper[4792]: I0309 09:29:03.523152 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"]
Mar 09 09:29:03 crc kubenswrapper[4792]: I0309 09:29:03.524471 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Mar 09 09:29:03 crc kubenswrapper[4792]: I0309 09:29:03.527383 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data"
Mar 09 09:29:03 crc kubenswrapper[4792]: I0309 09:29:03.573752 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 09 09:29:03 crc kubenswrapper[4792]: I0309 09:29:03.620358 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/38935b95-760a-4b84-bb4a-0a89f6b2d1ff-logs\") pod \"nova-api-0\" (UID: \"38935b95-760a-4b84-bb4a-0a89f6b2d1ff\") " pod="openstack/nova-api-0"
Mar 09 09:29:03 crc kubenswrapper[4792]: I0309 09:29:03.620404 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/392f6124-a359-44d2-b9ba-2176b7a3debd-config-data\") pod \"nova-scheduler-0\" (UID: \"392f6124-a359-44d2-b9ba-2176b7a3debd\") " pod="openstack/nova-scheduler-0"
Mar 09 09:29:03 crc kubenswrapper[4792]: I0309 09:29:03.620469 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k68hk\" (UniqueName: \"kubernetes.io/projected/38935b95-760a-4b84-bb4a-0a89f6b2d1ff-kube-api-access-k68hk\") pod \"nova-api-0\" (UID: \"38935b95-760a-4b84-bb4a-0a89f6b2d1ff\") " pod="openstack/nova-api-0"
Mar 09 09:29:03 crc kubenswrapper[4792]: I0309 09:29:03.620546 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/392f6124-a359-44d2-b9ba-2176b7a3debd-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"392f6124-a359-44d2-b9ba-2176b7a3debd\") " pod="openstack/nova-scheduler-0"
Mar 09 09:29:03 crc kubenswrapper[4792]: I0309 09:29:03.620798 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/38935b95-760a-4b84-bb4a-0a89f6b2d1ff-logs\") pod \"nova-api-0\" (UID: \"38935b95-760a-4b84-bb4a-0a89f6b2d1ff\") " pod="openstack/nova-api-0"
Mar 09 09:29:03 crc kubenswrapper[4792]: I0309 09:29:03.620856 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38935b95-760a-4b84-bb4a-0a89f6b2d1ff-config-data\") pod \"nova-api-0\" (UID: \"38935b95-760a-4b84-bb4a-0a89f6b2d1ff\") " pod="openstack/nova-api-0"
Mar 09 09:29:03 crc kubenswrapper[4792]: I0309 09:29:03.620913 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38935b95-760a-4b84-bb4a-0a89f6b2d1ff-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"38935b95-760a-4b84-bb4a-0a89f6b2d1ff\") " pod="openstack/nova-api-0"
Mar 09 09:29:03 crc kubenswrapper[4792]: I0309 09:29:03.620964 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xk442\" (UniqueName: \"kubernetes.io/projected/392f6124-a359-44d2-b9ba-2176b7a3debd-kube-api-access-xk442\") pod \"nova-scheduler-0\" (UID: \"392f6124-a359-44d2-b9ba-2176b7a3debd\") " pod="openstack/nova-scheduler-0"
Mar 09 09:29:03 crc kubenswrapper[4792]: I0309 09:29:03.653592 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38935b95-760a-4b84-bb4a-0a89f6b2d1ff-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"38935b95-760a-4b84-bb4a-0a89f6b2d1ff\") " pod="openstack/nova-api-0"
Mar 09 09:29:03 crc kubenswrapper[4792]: I0309 09:29:03.659027 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38935b95-760a-4b84-bb4a-0a89f6b2d1ff-config-data\") pod \"nova-api-0\" (UID: \"38935b95-760a-4b84-bb4a-0a89f6b2d1ff\") " pod="openstack/nova-api-0"
Mar 09 09:29:03 crc kubenswrapper[4792]: I0309 09:29:03.712851 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k68hk\" (UniqueName: \"kubernetes.io/projected/38935b95-760a-4b84-bb4a-0a89f6b2d1ff-kube-api-access-k68hk\") pod \"nova-api-0\" (UID: \"38935b95-760a-4b84-bb4a-0a89f6b2d1ff\") " pod="openstack/nova-api-0"
Mar 09 09:29:03 crc kubenswrapper[4792]: I0309 09:29:03.722501 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/392f6124-a359-44d2-b9ba-2176b7a3debd-config-data\") pod \"nova-scheduler-0\" (UID: \"392f6124-a359-44d2-b9ba-2176b7a3debd\") " pod="openstack/nova-scheduler-0"
Mar 09 09:29:03 crc kubenswrapper[4792]: I0309 09:29:03.722619 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/392f6124-a359-44d2-b9ba-2176b7a3debd-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"392f6124-a359-44d2-b9ba-2176b7a3debd\") " pod="openstack/nova-scheduler-0"
Mar 09 09:29:03 crc kubenswrapper[4792]: I0309 09:29:03.722707 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xk442\" (UniqueName: \"kubernetes.io/projected/392f6124-a359-44d2-b9ba-2176b7a3debd-kube-api-access-xk442\") pod \"nova-scheduler-0\" (UID: \"392f6124-a359-44d2-b9ba-2176b7a3debd\") " pod="openstack/nova-scheduler-0"
Mar 09 09:29:03 crc kubenswrapper[4792]: I0309 09:29:03.733821 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Mar 09 09:29:03 crc kubenswrapper[4792]: I0309 09:29:03.737139 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/392f6124-a359-44d2-b9ba-2176b7a3debd-config-data\") pod \"nova-scheduler-0\" (UID: \"392f6124-a359-44d2-b9ba-2176b7a3debd\") " pod="openstack/nova-scheduler-0"
Mar 09 09:29:03 crc kubenswrapper[4792]: I0309 09:29:03.761839 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 09 09:29:03 crc kubenswrapper[4792]: I0309 09:29:03.762332 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/392f6124-a359-44d2-b9ba-2176b7a3debd-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"392f6124-a359-44d2-b9ba-2176b7a3debd\") " pod="openstack/nova-scheduler-0"
Mar 09 09:29:03 crc kubenswrapper[4792]: I0309 09:29:03.769602 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Mar 09 09:29:03 crc kubenswrapper[4792]: I0309 09:29:03.796498 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Mar 09 09:29:03 crc kubenswrapper[4792]: I0309 09:29:03.824719 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ltl5g\" (UniqueName: \"kubernetes.io/projected/941a2d27-f778-4bd5-9d47-00a485b24e44-kube-api-access-ltl5g\") pod \"nova-metadata-0\" (UID: \"941a2d27-f778-4bd5-9d47-00a485b24e44\") " pod="openstack/nova-metadata-0"
Mar 09 09:29:03 crc kubenswrapper[4792]: I0309 09:29:03.824764 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/941a2d27-f778-4bd5-9d47-00a485b24e44-config-data\") pod \"nova-metadata-0\" (UID: \"941a2d27-f778-4bd5-9d47-00a485b24e44\") " pod="openstack/nova-metadata-0"
Mar 09 09:29:03 crc kubenswrapper[4792]: I0309 09:29:03.824820 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/941a2d27-f778-4bd5-9d47-00a485b24e44-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"941a2d27-f778-4bd5-9d47-00a485b24e44\") " pod="openstack/nova-metadata-0"
Mar 09 09:29:03 crc kubenswrapper[4792]: I0309 09:29:03.824862 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/941a2d27-f778-4bd5-9d47-00a485b24e44-logs\") pod \"nova-metadata-0\" (UID: \"941a2d27-f778-4bd5-9d47-00a485b24e44\") " pod="openstack/nova-metadata-0"
Mar 09 09:29:03 crc kubenswrapper[4792]: I0309 09:29:03.904409 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xk442\" (UniqueName: \"kubernetes.io/projected/392f6124-a359-44d2-b9ba-2176b7a3debd-kube-api-access-xk442\") pod \"nova-scheduler-0\" (UID: \"392f6124-a359-44d2-b9ba-2176b7a3debd\") " pod="openstack/nova-scheduler-0"
Mar 09 09:29:03 crc kubenswrapper[4792]: I0309 09:29:03.953147 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ltl5g\" (UniqueName: \"kubernetes.io/projected/941a2d27-f778-4bd5-9d47-00a485b24e44-kube-api-access-ltl5g\") pod \"nova-metadata-0\" (UID: \"941a2d27-f778-4bd5-9d47-00a485b24e44\") " pod="openstack/nova-metadata-0"
Mar 09 09:29:03 crc kubenswrapper[4792]: I0309 09:29:03.953222 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/941a2d27-f778-4bd5-9d47-00a485b24e44-config-data\") pod \"nova-metadata-0\" (UID: \"941a2d27-f778-4bd5-9d47-00a485b24e44\") " pod="openstack/nova-metadata-0"
Mar 09 09:29:03 crc kubenswrapper[4792]: I0309 09:29:03.953325 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/941a2d27-f778-4bd5-9d47-00a485b24e44-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"941a2d27-f778-4bd5-9d47-00a485b24e44\") " pod="openstack/nova-metadata-0"
Mar 09 09:29:03 crc kubenswrapper[4792]: I0309 09:29:03.953410 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/941a2d27-f778-4bd5-9d47-00a485b24e44-logs\") pod \"nova-metadata-0\" (UID: \"941a2d27-f778-4bd5-9d47-00a485b24e44\") " pod="openstack/nova-metadata-0"
Mar 09 09:29:03 crc kubenswrapper[4792]: I0309 09:29:03.955813 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/941a2d27-f778-4bd5-9d47-00a485b24e44-logs\") pod \"nova-metadata-0\" (UID: \"941a2d27-f778-4bd5-9d47-00a485b24e44\") " pod="openstack/nova-metadata-0"
Mar 09 09:29:03 crc kubenswrapper[4792]: I0309 09:29:03.977961 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/941a2d27-f778-4bd5-9d47-00a485b24e44-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"941a2d27-f778-4bd5-9d47-00a485b24e44\") " pod="openstack/nova-metadata-0"
Mar 09 09:29:03 crc kubenswrapper[4792]: I0309 09:29:03.978680 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/941a2d27-f778-4bd5-9d47-00a485b24e44-config-data\") pod \"nova-metadata-0\" (UID: \"941a2d27-f778-4bd5-9d47-00a485b24e44\") " pod="openstack/nova-metadata-0"
Mar 09 09:29:03 crc kubenswrapper[4792]: I0309 09:29:03.998425 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 09 09:29:04 crc kubenswrapper[4792]: I0309 09:29:04.000913 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ltl5g\" (UniqueName: \"kubernetes.io/projected/941a2d27-f778-4bd5-9d47-00a485b24e44-kube-api-access-ltl5g\") pod \"nova-metadata-0\" (UID: \"941a2d27-f778-4bd5-9d47-00a485b24e44\") " pod="openstack/nova-metadata-0"
Mar 09 09:29:04 crc kubenswrapper[4792]: I0309 09:29:04.020547 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Mar 09 09:29:04 crc kubenswrapper[4792]: I0309 09:29:04.042158 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7ff5b4cd7c-skh6c"]
Mar 09 09:29:04 crc kubenswrapper[4792]: I0309 09:29:04.045190 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7ff5b4cd7c-skh6c"
Mar 09 09:29:04 crc kubenswrapper[4792]: I0309 09:29:04.056750 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7ff5b4cd7c-skh6c"]
Mar 09 09:29:04 crc kubenswrapper[4792]: I0309 09:29:04.083236 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Mar 09 09:29:04 crc kubenswrapper[4792]: I0309 09:29:04.084381 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Mar 09 09:29:04 crc kubenswrapper[4792]: I0309 09:29:04.091966 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data"
Mar 09 09:29:04 crc kubenswrapper[4792]: I0309 09:29:04.094349 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 09 09:29:04 crc kubenswrapper[4792]: I0309 09:29:04.133671 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Mar 09 09:29:04 crc kubenswrapper[4792]: I0309 09:29:04.159417 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16d5eb0a-cb03-4a91-999b-8bbd16a0cf85-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"16d5eb0a-cb03-4a91-999b-8bbd16a0cf85\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 09 09:29:04 crc kubenswrapper[4792]: I0309 09:29:04.160010 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ede6e819-d6bc-48e7-9fcd-4534b432fab5-config\") pod \"dnsmasq-dns-7ff5b4cd7c-skh6c\" (UID: \"ede6e819-d6bc-48e7-9fcd-4534b432fab5\") " pod="openstack/dnsmasq-dns-7ff5b4cd7c-skh6c"
Mar 09 09:29:04 crc kubenswrapper[4792]: I0309 09:29:04.160168 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ede6e819-d6bc-48e7-9fcd-4534b432fab5-ovsdbserver-nb\") pod \"dnsmasq-dns-7ff5b4cd7c-skh6c\" (UID: \"ede6e819-d6bc-48e7-9fcd-4534b432fab5\") " pod="openstack/dnsmasq-dns-7ff5b4cd7c-skh6c"
Mar 09 09:29:04 crc kubenswrapper[4792]: I0309 09:29:04.160283 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16d5eb0a-cb03-4a91-999b-8bbd16a0cf85-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"16d5eb0a-cb03-4a91-999b-8bbd16a0cf85\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 09 09:29:04 crc kubenswrapper[4792]: I0309 09:29:04.160434 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ede6e819-d6bc-48e7-9fcd-4534b432fab5-dns-svc\") pod \"dnsmasq-dns-7ff5b4cd7c-skh6c\" (UID: \"ede6e819-d6bc-48e7-9fcd-4534b432fab5\") " pod="openstack/dnsmasq-dns-7ff5b4cd7c-skh6c"
Mar 09 09:29:04 crc kubenswrapper[4792]: I0309 09:29:04.160956 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-78nzp\" (UniqueName: \"kubernetes.io/projected/16d5eb0a-cb03-4a91-999b-8bbd16a0cf85-kube-api-access-78nzp\") pod \"nova-cell1-novncproxy-0\" (UID: \"16d5eb0a-cb03-4a91-999b-8bbd16a0cf85\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 09 09:29:04 crc kubenswrapper[4792]: I0309 09:29:04.161050 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ede6e819-d6bc-48e7-9fcd-4534b432fab5-ovsdbserver-sb\") pod \"dnsmasq-dns-7ff5b4cd7c-skh6c\" (UID: \"ede6e819-d6bc-48e7-9fcd-4534b432fab5\") " pod="openstack/dnsmasq-dns-7ff5b4cd7c-skh6c"
Mar 09 09:29:04 crc kubenswrapper[4792]: I0309 09:29:04.161412 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4kw88\" (UniqueName: \"kubernetes.io/projected/ede6e819-d6bc-48e7-9fcd-4534b432fab5-kube-api-access-4kw88\") pod \"dnsmasq-dns-7ff5b4cd7c-skh6c\" (UID: \"ede6e819-d6bc-48e7-9fcd-4534b432fab5\") " pod="openstack/dnsmasq-dns-7ff5b4cd7c-skh6c"
Mar 09 09:29:04 crc kubenswrapper[4792]: I0309 09:29:04.263171 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ede6e819-d6bc-48e7-9fcd-4534b432fab5-dns-svc\") pod \"dnsmasq-dns-7ff5b4cd7c-skh6c\" (UID: \"ede6e819-d6bc-48e7-9fcd-4534b432fab5\") " pod="openstack/dnsmasq-dns-7ff5b4cd7c-skh6c"
Mar 09 09:29:04 crc kubenswrapper[4792]: I0309 09:29:04.263260 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-78nzp\" (UniqueName: \"kubernetes.io/projected/16d5eb0a-cb03-4a91-999b-8bbd16a0cf85-kube-api-access-78nzp\") pod \"nova-cell1-novncproxy-0\" (UID: \"16d5eb0a-cb03-4a91-999b-8bbd16a0cf85\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 09 09:29:04 crc kubenswrapper[4792]: I0309 09:29:04.263292 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ede6e819-d6bc-48e7-9fcd-4534b432fab5-ovsdbserver-sb\") pod \"dnsmasq-dns-7ff5b4cd7c-skh6c\" (UID: \"ede6e819-d6bc-48e7-9fcd-4534b432fab5\") " pod="openstack/dnsmasq-dns-7ff5b4cd7c-skh6c"
Mar 09 09:29:04 crc kubenswrapper[4792]: I0309 09:29:04.263365 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4kw88\" (UniqueName: \"kubernetes.io/projected/ede6e819-d6bc-48e7-9fcd-4534b432fab5-kube-api-access-4kw88\") pod \"dnsmasq-dns-7ff5b4cd7c-skh6c\" (UID: \"ede6e819-d6bc-48e7-9fcd-4534b432fab5\") " pod="openstack/dnsmasq-dns-7ff5b4cd7c-skh6c"
Mar 09 09:29:04 crc kubenswrapper[4792]: I0309 09:29:04.263395 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16d5eb0a-cb03-4a91-999b-8bbd16a0cf85-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"16d5eb0a-cb03-4a91-999b-8bbd16a0cf85\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 09 09:29:04 crc kubenswrapper[4792]: I0309 09:29:04.263427 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ede6e819-d6bc-48e7-9fcd-4534b432fab5-config\") pod \"dnsmasq-dns-7ff5b4cd7c-skh6c\" (UID: \"ede6e819-d6bc-48e7-9fcd-4534b432fab5\") " pod="openstack/dnsmasq-dns-7ff5b4cd7c-skh6c"
Mar 09 09:29:04 crc kubenswrapper[4792]: I0309 09:29:04.263451 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ede6e819-d6bc-48e7-9fcd-4534b432fab5-ovsdbserver-nb\") pod \"dnsmasq-dns-7ff5b4cd7c-skh6c\" (UID: \"ede6e819-d6bc-48e7-9fcd-4534b432fab5\") " pod="openstack/dnsmasq-dns-7ff5b4cd7c-skh6c"
Mar 09 09:29:04 crc kubenswrapper[4792]: I0309 09:29:04.263485 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16d5eb0a-cb03-4a91-999b-8bbd16a0cf85-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"16d5eb0a-cb03-4a91-999b-8bbd16a0cf85\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 09 09:29:04 crc kubenswrapper[4792]: I0309 09:29:04.265141 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ede6e819-d6bc-48e7-9fcd-4534b432fab5-ovsdbserver-sb\") pod \"dnsmasq-dns-7ff5b4cd7c-skh6c\" (UID: \"ede6e819-d6bc-48e7-9fcd-4534b432fab5\") " pod="openstack/dnsmasq-dns-7ff5b4cd7c-skh6c"
Mar 09 09:29:04 crc kubenswrapper[4792]: I0309 09:29:04.265296 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ede6e819-d6bc-48e7-9fcd-4534b432fab5-config\") pod \"dnsmasq-dns-7ff5b4cd7c-skh6c\" (UID: \"ede6e819-d6bc-48e7-9fcd-4534b432fab5\") " pod="openstack/dnsmasq-dns-7ff5b4cd7c-skh6c"
Mar 09 09:29:04 crc kubenswrapper[4792]: I0309 09:29:04.265363 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ede6e819-d6bc-48e7-9fcd-4534b432fab5-ovsdbserver-nb\") pod \"dnsmasq-dns-7ff5b4cd7c-skh6c\" (UID: \"ede6e819-d6bc-48e7-9fcd-4534b432fab5\") " pod="openstack/dnsmasq-dns-7ff5b4cd7c-skh6c"
Mar 09 09:29:04 crc kubenswrapper[4792]: I0309 09:29:04.265418 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ede6e819-d6bc-48e7-9fcd-4534b432fab5-dns-svc\") pod \"dnsmasq-dns-7ff5b4cd7c-skh6c\" (UID: \"ede6e819-d6bc-48e7-9fcd-4534b432fab5\") " pod="openstack/dnsmasq-dns-7ff5b4cd7c-skh6c"
Mar 09 09:29:04 crc kubenswrapper[4792]: I0309 09:29:04.276192 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16d5eb0a-cb03-4a91-999b-8bbd16a0cf85-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"16d5eb0a-cb03-4a91-999b-8bbd16a0cf85\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 09 09:29:04 crc kubenswrapper[4792]: I0309 09:29:04.287678 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16d5eb0a-cb03-4a91-999b-8bbd16a0cf85-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"16d5eb0a-cb03-4a91-999b-8bbd16a0cf85\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 09 09:29:04 crc kubenswrapper[4792]: I0309 09:29:04.296883 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4kw88\" (UniqueName: \"kubernetes.io/projected/ede6e819-d6bc-48e7-9fcd-4534b432fab5-kube-api-access-4kw88\") pod \"dnsmasq-dns-7ff5b4cd7c-skh6c\" (UID: \"ede6e819-d6bc-48e7-9fcd-4534b432fab5\") " pod="openstack/dnsmasq-dns-7ff5b4cd7c-skh6c"
Mar 09 09:29:04 crc kubenswrapper[4792]: I0309 09:29:04.304381 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-78nzp\" (UniqueName: \"kubernetes.io/projected/16d5eb0a-cb03-4a91-999b-8bbd16a0cf85-kube-api-access-78nzp\") pod \"nova-cell1-novncproxy-0\" (UID: \"16d5eb0a-cb03-4a91-999b-8bbd16a0cf85\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 09 09:29:04 crc kubenswrapper[4792]: I0309 09:29:04.410926 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7ff5b4cd7c-skh6c"
Mar 09 09:29:04 crc kubenswrapper[4792]: I0309 09:29:04.448571 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Mar 09 09:29:04 crc kubenswrapper[4792]: I0309 09:29:04.464213 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-xv7wk"]
Mar 09 09:29:04 crc kubenswrapper[4792]: I0309 09:29:04.831046 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0"
Mar 09 09:29:04 crc kubenswrapper[4792]: I0309 09:29:04.974028 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Mar 09 09:29:05 crc kubenswrapper[4792]: I0309 09:29:04.998429 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 09 09:29:05 crc kubenswrapper[4792]: W0309 09:29:05.031939 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod392f6124_a359_44d2_b9ba_2176b7a3debd.slice/crio-c4abb8eec696dfcda7dbfc4fc194e4138c2ad3f52a415436feccf71d7b9c63d9 WatchSource:0}: Error finding container c4abb8eec696dfcda7dbfc4fc194e4138c2ad3f52a415436feccf71d7b9c63d9: Status 404 returned error can't find the container with id c4abb8eec696dfcda7dbfc4fc194e4138c2ad3f52a415436feccf71d7b9c63d9
Mar 09 09:29:05 crc kubenswrapper[4792]: I0309 09:29:05.202745 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Mar 09 09:29:05 crc kubenswrapper[4792]: I0309 09:29:05.360350 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"38935b95-760a-4b84-bb4a-0a89f6b2d1ff","Type":"ContainerStarted","Data":"51472b9980e1a67b4cda57b44043142efedac98a47d90e928395b0510edb6e42"}
Mar 09 09:29:05 crc kubenswrapper[4792]: I0309 09:29:05.367307 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-xv7wk" event={"ID":"b5277744-0423-4554-8512-da2a35eaaafd","Type":"ContainerStarted","Data":"55b4abdb4e5c3b239676c5bb57c8cabb00688f5febc4fdf58e595a1c329a41e7"}
Mar 09 09:29:05 crc kubenswrapper[4792]: I0309 09:29:05.367371 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-xv7wk" event={"ID":"b5277744-0423-4554-8512-da2a35eaaafd","Type":"ContainerStarted","Data":"185413276ebcf29c54868ec92e961c0602aeb9b9eda0fc337554dacbc0412bb7"} Mar 09 09:29:05 crc kubenswrapper[4792]: I0309 09:29:05.372907 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"392f6124-a359-44d2-b9ba-2176b7a3debd","Type":"ContainerStarted","Data":"c4abb8eec696dfcda7dbfc4fc194e4138c2ad3f52a415436feccf71d7b9c63d9"} Mar 09 09:29:05 crc kubenswrapper[4792]: I0309 09:29:05.375947 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"941a2d27-f778-4bd5-9d47-00a485b24e44","Type":"ContainerStarted","Data":"edb699e6f134ce75639f16e977d4e7796626f75cdb546c0b8df44ec947f613c1"} Mar 09 09:29:05 crc kubenswrapper[4792]: I0309 09:29:05.399143 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7ff5b4cd7c-skh6c"] Mar 09 09:29:05 crc kubenswrapper[4792]: I0309 09:29:05.410199 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-xv7wk" podStartSLOduration=2.410174284 podStartE2EDuration="2.410174284s" podCreationTimestamp="2026-03-09 09:29:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:29:05.391011818 +0000 UTC m=+1310.421212580" watchObservedRunningTime="2026-03-09 09:29:05.410174284 +0000 UTC m=+1310.440375056" Mar 09 09:29:05 crc kubenswrapper[4792]: I0309 09:29:05.548644 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-622mt"] Mar 09 09:29:05 crc kubenswrapper[4792]: I0309 09:29:05.550804 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-622mt" Mar 09 09:29:05 crc kubenswrapper[4792]: I0309 09:29:05.558800 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-622mt"] Mar 09 09:29:05 crc kubenswrapper[4792]: I0309 09:29:05.632477 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pjwh8\" (UniqueName: \"kubernetes.io/projected/f5f70ffc-88a1-470d-923c-78a1cc6aba7b-kube-api-access-pjwh8\") pod \"nova-cell1-conductor-db-sync-622mt\" (UID: \"f5f70ffc-88a1-470d-923c-78a1cc6aba7b\") " pod="openstack/nova-cell1-conductor-db-sync-622mt" Mar 09 09:29:05 crc kubenswrapper[4792]: I0309 09:29:05.632524 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f5f70ffc-88a1-470d-923c-78a1cc6aba7b-scripts\") pod \"nova-cell1-conductor-db-sync-622mt\" (UID: \"f5f70ffc-88a1-470d-923c-78a1cc6aba7b\") " pod="openstack/nova-cell1-conductor-db-sync-622mt" Mar 09 09:29:05 crc kubenswrapper[4792]: I0309 09:29:05.632571 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f5f70ffc-88a1-470d-923c-78a1cc6aba7b-config-data\") pod \"nova-cell1-conductor-db-sync-622mt\" (UID: \"f5f70ffc-88a1-470d-923c-78a1cc6aba7b\") " pod="openstack/nova-cell1-conductor-db-sync-622mt" Mar 09 09:29:05 crc kubenswrapper[4792]: I0309 09:29:05.632719 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5f70ffc-88a1-470d-923c-78a1cc6aba7b-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-622mt\" (UID: \"f5f70ffc-88a1-470d-923c-78a1cc6aba7b\") " pod="openstack/nova-cell1-conductor-db-sync-622mt" Mar 09 09:29:05 crc kubenswrapper[4792]: I0309 09:29:05.734208 4792 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5f70ffc-88a1-470d-923c-78a1cc6aba7b-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-622mt\" (UID: \"f5f70ffc-88a1-470d-923c-78a1cc6aba7b\") " pod="openstack/nova-cell1-conductor-db-sync-622mt" Mar 09 09:29:05 crc kubenswrapper[4792]: I0309 09:29:05.734309 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pjwh8\" (UniqueName: \"kubernetes.io/projected/f5f70ffc-88a1-470d-923c-78a1cc6aba7b-kube-api-access-pjwh8\") pod \"nova-cell1-conductor-db-sync-622mt\" (UID: \"f5f70ffc-88a1-470d-923c-78a1cc6aba7b\") " pod="openstack/nova-cell1-conductor-db-sync-622mt" Mar 09 09:29:05 crc kubenswrapper[4792]: I0309 09:29:05.734334 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f5f70ffc-88a1-470d-923c-78a1cc6aba7b-scripts\") pod \"nova-cell1-conductor-db-sync-622mt\" (UID: \"f5f70ffc-88a1-470d-923c-78a1cc6aba7b\") " pod="openstack/nova-cell1-conductor-db-sync-622mt" Mar 09 09:29:05 crc kubenswrapper[4792]: I0309 09:29:05.734368 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f5f70ffc-88a1-470d-923c-78a1cc6aba7b-config-data\") pod \"nova-cell1-conductor-db-sync-622mt\" (UID: \"f5f70ffc-88a1-470d-923c-78a1cc6aba7b\") " pod="openstack/nova-cell1-conductor-db-sync-622mt" Mar 09 09:29:05 crc kubenswrapper[4792]: I0309 09:29:05.738802 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5f70ffc-88a1-470d-923c-78a1cc6aba7b-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-622mt\" (UID: \"f5f70ffc-88a1-470d-923c-78a1cc6aba7b\") " pod="openstack/nova-cell1-conductor-db-sync-622mt" Mar 09 09:29:05 crc kubenswrapper[4792]: I0309 09:29:05.843662 4792 
reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Mar 09 09:29:05 crc kubenswrapper[4792]: I0309 09:29:05.843879 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Mar 09 09:29:05 crc kubenswrapper[4792]: I0309 09:29:05.867063 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f5f70ffc-88a1-470d-923c-78a1cc6aba7b-config-data\") pod \"nova-cell1-conductor-db-sync-622mt\" (UID: \"f5f70ffc-88a1-470d-923c-78a1cc6aba7b\") " pod="openstack/nova-cell1-conductor-db-sync-622mt" Mar 09 09:29:05 crc kubenswrapper[4792]: I0309 09:29:05.877275 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f5f70ffc-88a1-470d-923c-78a1cc6aba7b-scripts\") pod \"nova-cell1-conductor-db-sync-622mt\" (UID: \"f5f70ffc-88a1-470d-923c-78a1cc6aba7b\") " pod="openstack/nova-cell1-conductor-db-sync-622mt" Mar 09 09:29:05 crc kubenswrapper[4792]: I0309 09:29:05.894013 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 09 09:29:05 crc kubenswrapper[4792]: I0309 09:29:05.904669 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pjwh8\" (UniqueName: \"kubernetes.io/projected/f5f70ffc-88a1-470d-923c-78a1cc6aba7b-kube-api-access-pjwh8\") pod \"nova-cell1-conductor-db-sync-622mt\" (UID: \"f5f70ffc-88a1-470d-923c-78a1cc6aba7b\") " pod="openstack/nova-cell1-conductor-db-sync-622mt" Mar 09 09:29:06 crc kubenswrapper[4792]: I0309 09:29:06.001027 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-622mt" Mar 09 09:29:06 crc kubenswrapper[4792]: I0309 09:29:06.733519 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"16d5eb0a-cb03-4a91-999b-8bbd16a0cf85","Type":"ContainerStarted","Data":"c1009e44913c1bd3fd937ea3b2d9e72ad5de37996074f012fb3d5f6663a137af"} Mar 09 09:29:06 crc kubenswrapper[4792]: I0309 09:29:06.742290 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7ff5b4cd7c-skh6c" event={"ID":"ede6e819-d6bc-48e7-9fcd-4534b432fab5","Type":"ContainerStarted","Data":"c0aef1412990186e6d818b251f7451479cb51d77cb487ba68bd5b9619dbdfe0c"} Mar 09 09:29:07 crc kubenswrapper[4792]: E0309 09:29:07.134055 4792 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod68422fad_6d4a_4d8a_ac66_b68dd7486525.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1927dc12_81aa_463b_b356_3784c72f3245.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1927dc12_81aa_463b_b356_3784c72f3245.slice/crio-8bba1dc1d8cacb61b7ab27a2317c991e9d5c0b6d568d7552f99bc47434486507\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode5d6bee9_1576_4852_a0ca_f4da5d2930f9.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd28e5b5b_e371_4a9d_8725_017aa98ac944.slice/crio-3b4509095c9866a7f9275d0f77d8faeb571c55bff7fdbc5d0a827ab3c2e12e57\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd28e5b5b_e371_4a9d_8725_017aa98ac944.slice\": RecentStats: unable to find data in memory 
cache]" Mar 09 09:29:07 crc kubenswrapper[4792]: I0309 09:29:07.289919 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-622mt"] Mar 09 09:29:07 crc kubenswrapper[4792]: W0309 09:29:07.333694 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf5f70ffc_88a1_470d_923c_78a1cc6aba7b.slice/crio-9b8be80a68ea4fa8956052f0b35881f4166825555742ed88179d9a55458c3284 WatchSource:0}: Error finding container 9b8be80a68ea4fa8956052f0b35881f4166825555742ed88179d9a55458c3284: Status 404 returned error can't find the container with id 9b8be80a68ea4fa8956052f0b35881f4166825555742ed88179d9a55458c3284 Mar 09 09:29:07 crc kubenswrapper[4792]: I0309 09:29:07.758004 4792 generic.go:334] "Generic (PLEG): container finished" podID="ede6e819-d6bc-48e7-9fcd-4534b432fab5" containerID="a21d578029b3d8d71f88d0a552ead8993fb9148411576079d13b34d0ac8cef46" exitCode=0 Mar 09 09:29:07 crc kubenswrapper[4792]: I0309 09:29:07.758103 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7ff5b4cd7c-skh6c" event={"ID":"ede6e819-d6bc-48e7-9fcd-4534b432fab5","Type":"ContainerDied","Data":"a21d578029b3d8d71f88d0a552ead8993fb9148411576079d13b34d0ac8cef46"} Mar 09 09:29:07 crc kubenswrapper[4792]: I0309 09:29:07.773683 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-622mt" event={"ID":"f5f70ffc-88a1-470d-923c-78a1cc6aba7b","Type":"ContainerStarted","Data":"4a974df632b929a8818d4b5a23dffeee8227f916cb8145de0821d944b5d816a5"} Mar 09 09:29:07 crc kubenswrapper[4792]: I0309 09:29:07.773753 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-622mt" event={"ID":"f5f70ffc-88a1-470d-923c-78a1cc6aba7b","Type":"ContainerStarted","Data":"9b8be80a68ea4fa8956052f0b35881f4166825555742ed88179d9a55458c3284"} Mar 09 09:29:07 crc kubenswrapper[4792]: I0309 09:29:07.840804 4792 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-622mt" podStartSLOduration=2.8399417099999997 podStartE2EDuration="2.83994171s" podCreationTimestamp="2026-03-09 09:29:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:29:07.82854893 +0000 UTC m=+1312.858749682" watchObservedRunningTime="2026-03-09 09:29:07.83994171 +0000 UTC m=+1312.870142462" Mar 09 09:29:08 crc kubenswrapper[4792]: I0309 09:29:08.799748 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7ff5b4cd7c-skh6c" event={"ID":"ede6e819-d6bc-48e7-9fcd-4534b432fab5","Type":"ContainerStarted","Data":"649196a299eda67a28ce5035dc76bf1d4c60efcaec0eb8885c04a6cf9624ace6"} Mar 09 09:29:08 crc kubenswrapper[4792]: I0309 09:29:08.800130 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7ff5b4cd7c-skh6c" Mar 09 09:29:08 crc kubenswrapper[4792]: I0309 09:29:08.844014 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7ff5b4cd7c-skh6c" podStartSLOduration=5.843992475 podStartE2EDuration="5.843992475s" podCreationTimestamp="2026-03-09 09:29:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:29:08.832957006 +0000 UTC m=+1313.863157758" watchObservedRunningTime="2026-03-09 09:29:08.843992475 +0000 UTC m=+1313.874193227" Mar 09 09:29:09 crc kubenswrapper[4792]: I0309 09:29:09.111161 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 09 09:29:09 crc kubenswrapper[4792]: I0309 09:29:09.129685 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 09 09:29:11 crc kubenswrapper[4792]: I0309 09:29:11.453943 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/kube-state-metrics-0"] Mar 09 09:29:11 crc kubenswrapper[4792]: I0309 09:29:11.454620 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="b03cd03b-4cb1-4f61-8c17-e9f9b2986e8d" containerName="kube-state-metrics" containerID="cri-o://18970ff52b5e8d8ebbafed712093856d88a669424e4f1698ec7d17801b83624b" gracePeriod=30 Mar 09 09:29:11 crc kubenswrapper[4792]: I0309 09:29:11.841111 4792 generic.go:334] "Generic (PLEG): container finished" podID="b03cd03b-4cb1-4f61-8c17-e9f9b2986e8d" containerID="18970ff52b5e8d8ebbafed712093856d88a669424e4f1698ec7d17801b83624b" exitCode=2 Mar 09 09:29:11 crc kubenswrapper[4792]: I0309 09:29:11.841419 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"b03cd03b-4cb1-4f61-8c17-e9f9b2986e8d","Type":"ContainerDied","Data":"18970ff52b5e8d8ebbafed712093856d88a669424e4f1698ec7d17801b83624b"} Mar 09 09:29:12 crc kubenswrapper[4792]: I0309 09:29:12.278581 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 09 09:29:12 crc kubenswrapper[4792]: I0309 09:29:12.457769 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g98j9\" (UniqueName: \"kubernetes.io/projected/b03cd03b-4cb1-4f61-8c17-e9f9b2986e8d-kube-api-access-g98j9\") pod \"b03cd03b-4cb1-4f61-8c17-e9f9b2986e8d\" (UID: \"b03cd03b-4cb1-4f61-8c17-e9f9b2986e8d\") " Mar 09 09:29:12 crc kubenswrapper[4792]: I0309 09:29:12.475512 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b03cd03b-4cb1-4f61-8c17-e9f9b2986e8d-kube-api-access-g98j9" (OuterVolumeSpecName: "kube-api-access-g98j9") pod "b03cd03b-4cb1-4f61-8c17-e9f9b2986e8d" (UID: "b03cd03b-4cb1-4f61-8c17-e9f9b2986e8d"). InnerVolumeSpecName "kube-api-access-g98j9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:29:12 crc kubenswrapper[4792]: I0309 09:29:12.560132 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g98j9\" (UniqueName: \"kubernetes.io/projected/b03cd03b-4cb1-4f61-8c17-e9f9b2986e8d-kube-api-access-g98j9\") on node \"crc\" DevicePath \"\"" Mar 09 09:29:12 crc kubenswrapper[4792]: I0309 09:29:12.852995 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"38935b95-760a-4b84-bb4a-0a89f6b2d1ff","Type":"ContainerStarted","Data":"4611c277a21ba9569d4e27268131ce972a02816bb406dc8b74438c2b8f145cb3"} Mar 09 09:29:12 crc kubenswrapper[4792]: I0309 09:29:12.853042 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"38935b95-760a-4b84-bb4a-0a89f6b2d1ff","Type":"ContainerStarted","Data":"f52ce08b2b3cf4db8261893d951429b67067d659d42866fcc46cdae0021c2e60"} Mar 09 09:29:12 crc kubenswrapper[4792]: I0309 09:29:12.856331 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 09 09:29:12 crc kubenswrapper[4792]: I0309 09:29:12.856321 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"b03cd03b-4cb1-4f61-8c17-e9f9b2986e8d","Type":"ContainerDied","Data":"943f255292d634f95fab9cfa91dfa16cf2dee232ee8164544178b2da4eec4c84"} Mar 09 09:29:12 crc kubenswrapper[4792]: I0309 09:29:12.856935 4792 scope.go:117] "RemoveContainer" containerID="18970ff52b5e8d8ebbafed712093856d88a669424e4f1698ec7d17801b83624b" Mar 09 09:29:12 crc kubenswrapper[4792]: I0309 09:29:12.858311 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"392f6124-a359-44d2-b9ba-2176b7a3debd","Type":"ContainerStarted","Data":"913d0de3b6f9b44689feb2573bf2725dab6c0f3e7a2266f132e60d88a150a4f2"} Mar 09 09:29:12 crc kubenswrapper[4792]: I0309 09:29:12.862965 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"941a2d27-f778-4bd5-9d47-00a485b24e44","Type":"ContainerStarted","Data":"f5d7f734e56fe8acacefd3d006d7c0507336aabf243cfe706500fdbc7950200e"} Mar 09 09:29:12 crc kubenswrapper[4792]: I0309 09:29:12.863042 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="941a2d27-f778-4bd5-9d47-00a485b24e44" containerName="nova-metadata-log" containerID="cri-o://ce977c4f40c8af00d93860170dcb0a171b69506e355ae57941711080b0b4d0b5" gracePeriod=30 Mar 09 09:29:12 crc kubenswrapper[4792]: I0309 09:29:12.863159 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"941a2d27-f778-4bd5-9d47-00a485b24e44","Type":"ContainerStarted","Data":"ce977c4f40c8af00d93860170dcb0a171b69506e355ae57941711080b0b4d0b5"} Mar 09 09:29:12 crc kubenswrapper[4792]: I0309 09:29:12.863103 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" 
podUID="941a2d27-f778-4bd5-9d47-00a485b24e44" containerName="nova-metadata-metadata" containerID="cri-o://f5d7f734e56fe8acacefd3d006d7c0507336aabf243cfe706500fdbc7950200e" gracePeriod=30 Mar 09 09:29:12 crc kubenswrapper[4792]: I0309 09:29:12.873502 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"16d5eb0a-cb03-4a91-999b-8bbd16a0cf85","Type":"ContainerStarted","Data":"4b4106e70eddba98c76e1a5ea72d5327d16d50543e5162660561fe897e84ded6"} Mar 09 09:29:12 crc kubenswrapper[4792]: I0309 09:29:12.873660 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="16d5eb0a-cb03-4a91-999b-8bbd16a0cf85" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://4b4106e70eddba98c76e1a5ea72d5327d16d50543e5162660561fe897e84ded6" gracePeriod=30 Mar 09 09:29:12 crc kubenswrapper[4792]: I0309 09:29:12.884609 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.059048099 podStartE2EDuration="9.884589283s" podCreationTimestamp="2026-03-09 09:29:03 +0000 UTC" firstStartedPulling="2026-03-09 09:29:05.038421703 +0000 UTC m=+1310.068622455" lastFinishedPulling="2026-03-09 09:29:11.863962887 +0000 UTC m=+1316.894163639" observedRunningTime="2026-03-09 09:29:12.880215356 +0000 UTC m=+1317.910416138" watchObservedRunningTime="2026-03-09 09:29:12.884589283 +0000 UTC m=+1317.914790035" Mar 09 09:29:12 crc kubenswrapper[4792]: I0309 09:29:12.905635 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=3.959457418 podStartE2EDuration="9.905612232s" podCreationTimestamp="2026-03-09 09:29:03 +0000 UTC" firstStartedPulling="2026-03-09 09:29:05.914887709 +0000 UTC m=+1310.945088451" lastFinishedPulling="2026-03-09 09:29:11.861042513 +0000 UTC m=+1316.891243265" observedRunningTime="2026-03-09 09:29:12.904440218 +0000 UTC 
m=+1317.934640990" watchObservedRunningTime="2026-03-09 09:29:12.905612232 +0000 UTC m=+1317.935812984" Mar 09 09:29:12 crc kubenswrapper[4792]: I0309 09:29:12.953879 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.315692852 podStartE2EDuration="9.953855792s" podCreationTimestamp="2026-03-09 09:29:03 +0000 UTC" firstStartedPulling="2026-03-09 09:29:05.229927477 +0000 UTC m=+1310.260128229" lastFinishedPulling="2026-03-09 09:29:11.868090417 +0000 UTC m=+1316.898291169" observedRunningTime="2026-03-09 09:29:12.936621071 +0000 UTC m=+1317.966821823" watchObservedRunningTime="2026-03-09 09:29:12.953855792 +0000 UTC m=+1317.984056544" Mar 09 09:29:12 crc kubenswrapper[4792]: I0309 09:29:12.973018 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=3.169178881 podStartE2EDuration="9.972994606s" podCreationTimestamp="2026-03-09 09:29:03 +0000 UTC" firstStartedPulling="2026-03-09 09:29:05.059420142 +0000 UTC m=+1310.089620894" lastFinishedPulling="2026-03-09 09:29:11.863235867 +0000 UTC m=+1316.893436619" observedRunningTime="2026-03-09 09:29:12.968790884 +0000 UTC m=+1317.998991636" watchObservedRunningTime="2026-03-09 09:29:12.972994606 +0000 UTC m=+1318.003195358" Mar 09 09:29:13 crc kubenswrapper[4792]: I0309 09:29:13.051560 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 09 09:29:13 crc kubenswrapper[4792]: I0309 09:29:13.090469 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 09 09:29:13 crc kubenswrapper[4792]: I0309 09:29:13.135006 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Mar 09 09:29:13 crc kubenswrapper[4792]: E0309 09:29:13.135495 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b03cd03b-4cb1-4f61-8c17-e9f9b2986e8d" 
containerName="kube-state-metrics" Mar 09 09:29:13 crc kubenswrapper[4792]: I0309 09:29:13.135528 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="b03cd03b-4cb1-4f61-8c17-e9f9b2986e8d" containerName="kube-state-metrics" Mar 09 09:29:13 crc kubenswrapper[4792]: I0309 09:29:13.137127 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="b03cd03b-4cb1-4f61-8c17-e9f9b2986e8d" containerName="kube-state-metrics" Mar 09 09:29:13 crc kubenswrapper[4792]: I0309 09:29:13.138682 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 09 09:29:13 crc kubenswrapper[4792]: I0309 09:29:13.141360 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Mar 09 09:29:13 crc kubenswrapper[4792]: I0309 09:29:13.142376 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Mar 09 09:29:13 crc kubenswrapper[4792]: I0309 09:29:13.156058 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 09 09:29:13 crc kubenswrapper[4792]: I0309 09:29:13.290375 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/da658112-b6e4-4493-a46a-0add09e299f6-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"da658112-b6e4-4493-a46a-0add09e299f6\") " pod="openstack/kube-state-metrics-0" Mar 09 09:29:13 crc kubenswrapper[4792]: I0309 09:29:13.290502 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-24w2q\" (UniqueName: \"kubernetes.io/projected/da658112-b6e4-4493-a46a-0add09e299f6-kube-api-access-24w2q\") pod \"kube-state-metrics-0\" (UID: \"da658112-b6e4-4493-a46a-0add09e299f6\") " pod="openstack/kube-state-metrics-0" Mar 09 09:29:13 crc kubenswrapper[4792]: I0309 09:29:13.290704 
4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da658112-b6e4-4493-a46a-0add09e299f6-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"da658112-b6e4-4493-a46a-0add09e299f6\") " pod="openstack/kube-state-metrics-0" Mar 09 09:29:13 crc kubenswrapper[4792]: I0309 09:29:13.290929 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/da658112-b6e4-4493-a46a-0add09e299f6-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"da658112-b6e4-4493-a46a-0add09e299f6\") " pod="openstack/kube-state-metrics-0" Mar 09 09:29:13 crc kubenswrapper[4792]: I0309 09:29:13.393259 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-24w2q\" (UniqueName: \"kubernetes.io/projected/da658112-b6e4-4493-a46a-0add09e299f6-kube-api-access-24w2q\") pod \"kube-state-metrics-0\" (UID: \"da658112-b6e4-4493-a46a-0add09e299f6\") " pod="openstack/kube-state-metrics-0" Mar 09 09:29:13 crc kubenswrapper[4792]: I0309 09:29:13.393409 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da658112-b6e4-4493-a46a-0add09e299f6-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"da658112-b6e4-4493-a46a-0add09e299f6\") " pod="openstack/kube-state-metrics-0" Mar 09 09:29:13 crc kubenswrapper[4792]: I0309 09:29:13.393480 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/da658112-b6e4-4493-a46a-0add09e299f6-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"da658112-b6e4-4493-a46a-0add09e299f6\") " pod="openstack/kube-state-metrics-0" Mar 09 09:29:13 crc kubenswrapper[4792]: I0309 09:29:13.393528 4792 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/da658112-b6e4-4493-a46a-0add09e299f6-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"da658112-b6e4-4493-a46a-0add09e299f6\") " pod="openstack/kube-state-metrics-0" Mar 09 09:29:13 crc kubenswrapper[4792]: I0309 09:29:13.400207 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da658112-b6e4-4493-a46a-0add09e299f6-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"da658112-b6e4-4493-a46a-0add09e299f6\") " pod="openstack/kube-state-metrics-0" Mar 09 09:29:13 crc kubenswrapper[4792]: I0309 09:29:13.401015 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/da658112-b6e4-4493-a46a-0add09e299f6-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"da658112-b6e4-4493-a46a-0add09e299f6\") " pod="openstack/kube-state-metrics-0" Mar 09 09:29:13 crc kubenswrapper[4792]: I0309 09:29:13.401026 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/da658112-b6e4-4493-a46a-0add09e299f6-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"da658112-b6e4-4493-a46a-0add09e299f6\") " pod="openstack/kube-state-metrics-0" Mar 09 09:29:13 crc kubenswrapper[4792]: I0309 09:29:13.417332 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-24w2q\" (UniqueName: \"kubernetes.io/projected/da658112-b6e4-4493-a46a-0add09e299f6-kube-api-access-24w2q\") pod \"kube-state-metrics-0\" (UID: \"da658112-b6e4-4493-a46a-0add09e299f6\") " pod="openstack/kube-state-metrics-0" Mar 09 09:29:13 crc kubenswrapper[4792]: I0309 09:29:13.464414 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 09 09:29:13 crc kubenswrapper[4792]: I0309 09:29:13.601714 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 09 09:29:13 crc kubenswrapper[4792]: I0309 09:29:13.602049 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0425ca30-3bee-4b08-a54c-4d60cbd32edf" containerName="ceilometer-central-agent" containerID="cri-o://95a2194701df02e88fcff1ab3cc4393279c53267f9ace718d654f02a53fef423" gracePeriod=30 Mar 09 09:29:13 crc kubenswrapper[4792]: I0309 09:29:13.602565 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0425ca30-3bee-4b08-a54c-4d60cbd32edf" containerName="proxy-httpd" containerID="cri-o://5ebfbb65fac45832891c2dcd0f138cdd9b15aad2e6c7e9eceac9bd8d022f0f38" gracePeriod=30 Mar 09 09:29:13 crc kubenswrapper[4792]: I0309 09:29:13.602631 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0425ca30-3bee-4b08-a54c-4d60cbd32edf" containerName="sg-core" containerID="cri-o://0b40ce2cd6f544d951e34e90c37e6b91963891ae65009ae096e9fa2c0e39d383" gracePeriod=30 Mar 09 09:29:13 crc kubenswrapper[4792]: I0309 09:29:13.602677 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0425ca30-3bee-4b08-a54c-4d60cbd32edf" containerName="ceilometer-notification-agent" containerID="cri-o://2a3848e438c6dbffee43051fbcf0d3205234a605c83601ad68937e50e94e4faf" gracePeriod=30 Mar 09 09:29:13 crc kubenswrapper[4792]: I0309 09:29:13.680138 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b03cd03b-4cb1-4f61-8c17-e9f9b2986e8d" path="/var/lib/kubelet/pods/b03cd03b-4cb1-4f61-8c17-e9f9b2986e8d/volumes" Mar 09 09:29:13 crc kubenswrapper[4792]: I0309 09:29:13.923791 4792 generic.go:334] "Generic (PLEG): container finished" 
podID="941a2d27-f778-4bd5-9d47-00a485b24e44" containerID="ce977c4f40c8af00d93860170dcb0a171b69506e355ae57941711080b0b4d0b5" exitCode=143 Mar 09 09:29:13 crc kubenswrapper[4792]: I0309 09:29:13.926243 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"941a2d27-f778-4bd5-9d47-00a485b24e44","Type":"ContainerDied","Data":"ce977c4f40c8af00d93860170dcb0a171b69506e355ae57941711080b0b4d0b5"} Mar 09 09:29:14 crc kubenswrapper[4792]: I0309 09:29:13.999972 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 09 09:29:14 crc kubenswrapper[4792]: I0309 09:29:14.000018 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 09 09:29:14 crc kubenswrapper[4792]: I0309 09:29:14.023730 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 09 09:29:14 crc kubenswrapper[4792]: I0309 09:29:14.023830 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 09 09:29:14 crc kubenswrapper[4792]: I0309 09:29:14.101509 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 09 09:29:14 crc kubenswrapper[4792]: I0309 09:29:14.101561 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 09 09:29:14 crc kubenswrapper[4792]: I0309 09:29:14.103347 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 09 09:29:14 crc kubenswrapper[4792]: I0309 09:29:14.280913 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 09 09:29:14 crc kubenswrapper[4792]: I0309 09:29:14.414634 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7ff5b4cd7c-skh6c" Mar 09 09:29:14 crc kubenswrapper[4792]: I0309 
09:29:14.455186 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Mar 09 09:29:14 crc kubenswrapper[4792]: I0309 09:29:14.503711 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7675674687-7sdtd"] Mar 09 09:29:14 crc kubenswrapper[4792]: I0309 09:29:14.513510 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7675674687-7sdtd" podUID="143ee92a-26eb-464c-9fb5-6adbe64b31e5" containerName="dnsmasq-dns" containerID="cri-o://406149531b4ee4071941618d21a0e78eaaf69465357a64e61e63f6fdb4ca46e9" gracePeriod=10 Mar 09 09:29:14 crc kubenswrapper[4792]: I0309 09:29:14.936185 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"da658112-b6e4-4493-a46a-0add09e299f6","Type":"ContainerStarted","Data":"71743d4d54306b442357abe31d4925c8695e1edb3198af28e917742351e5ba97"} Mar 09 09:29:14 crc kubenswrapper[4792]: I0309 09:29:14.944696 4792 generic.go:334] "Generic (PLEG): container finished" podID="143ee92a-26eb-464c-9fb5-6adbe64b31e5" containerID="406149531b4ee4071941618d21a0e78eaaf69465357a64e61e63f6fdb4ca46e9" exitCode=0 Mar 09 09:29:14 crc kubenswrapper[4792]: I0309 09:29:14.944778 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7675674687-7sdtd" event={"ID":"143ee92a-26eb-464c-9fb5-6adbe64b31e5","Type":"ContainerDied","Data":"406149531b4ee4071941618d21a0e78eaaf69465357a64e61e63f6fdb4ca46e9"} Mar 09 09:29:14 crc kubenswrapper[4792]: I0309 09:29:14.947748 4792 generic.go:334] "Generic (PLEG): container finished" podID="0425ca30-3bee-4b08-a54c-4d60cbd32edf" containerID="0b40ce2cd6f544d951e34e90c37e6b91963891ae65009ae096e9fa2c0e39d383" exitCode=2 Mar 09 09:29:14 crc kubenswrapper[4792]: I0309 09:29:14.947776 4792 generic.go:334] "Generic (PLEG): container finished" podID="0425ca30-3bee-4b08-a54c-4d60cbd32edf" 
containerID="95a2194701df02e88fcff1ab3cc4393279c53267f9ace718d654f02a53fef423" exitCode=0 Mar 09 09:29:14 crc kubenswrapper[4792]: I0309 09:29:14.954671 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0425ca30-3bee-4b08-a54c-4d60cbd32edf","Type":"ContainerDied","Data":"0b40ce2cd6f544d951e34e90c37e6b91963891ae65009ae096e9fa2c0e39d383"} Mar 09 09:29:14 crc kubenswrapper[4792]: I0309 09:29:14.954723 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0425ca30-3bee-4b08-a54c-4d60cbd32edf","Type":"ContainerDied","Data":"95a2194701df02e88fcff1ab3cc4393279c53267f9ace718d654f02a53fef423"} Mar 09 09:29:15 crc kubenswrapper[4792]: I0309 09:29:15.051667 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 09 09:29:15 crc kubenswrapper[4792]: I0309 09:29:15.086367 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="38935b95-760a-4b84-bb4a-0a89f6b2d1ff" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.175:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 09:29:15 crc kubenswrapper[4792]: I0309 09:29:15.086687 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="38935b95-760a-4b84-bb4a-0a89f6b2d1ff" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.175:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 09:29:15 crc kubenswrapper[4792]: I0309 09:29:15.266509 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7675674687-7sdtd" Mar 09 09:29:15 crc kubenswrapper[4792]: I0309 09:29:15.446852 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zrphw\" (UniqueName: \"kubernetes.io/projected/143ee92a-26eb-464c-9fb5-6adbe64b31e5-kube-api-access-zrphw\") pod \"143ee92a-26eb-464c-9fb5-6adbe64b31e5\" (UID: \"143ee92a-26eb-464c-9fb5-6adbe64b31e5\") " Mar 09 09:29:15 crc kubenswrapper[4792]: I0309 09:29:15.446927 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/143ee92a-26eb-464c-9fb5-6adbe64b31e5-ovsdbserver-sb\") pod \"143ee92a-26eb-464c-9fb5-6adbe64b31e5\" (UID: \"143ee92a-26eb-464c-9fb5-6adbe64b31e5\") " Mar 09 09:29:15 crc kubenswrapper[4792]: I0309 09:29:15.447088 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/143ee92a-26eb-464c-9fb5-6adbe64b31e5-ovsdbserver-nb\") pod \"143ee92a-26eb-464c-9fb5-6adbe64b31e5\" (UID: \"143ee92a-26eb-464c-9fb5-6adbe64b31e5\") " Mar 09 09:29:15 crc kubenswrapper[4792]: I0309 09:29:15.447129 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/143ee92a-26eb-464c-9fb5-6adbe64b31e5-dns-svc\") pod \"143ee92a-26eb-464c-9fb5-6adbe64b31e5\" (UID: \"143ee92a-26eb-464c-9fb5-6adbe64b31e5\") " Mar 09 09:29:15 crc kubenswrapper[4792]: I0309 09:29:15.447190 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/143ee92a-26eb-464c-9fb5-6adbe64b31e5-config\") pod \"143ee92a-26eb-464c-9fb5-6adbe64b31e5\" (UID: \"143ee92a-26eb-464c-9fb5-6adbe64b31e5\") " Mar 09 09:29:15 crc kubenswrapper[4792]: I0309 09:29:15.458061 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/143ee92a-26eb-464c-9fb5-6adbe64b31e5-kube-api-access-zrphw" (OuterVolumeSpecName: "kube-api-access-zrphw") pod "143ee92a-26eb-464c-9fb5-6adbe64b31e5" (UID: "143ee92a-26eb-464c-9fb5-6adbe64b31e5"). InnerVolumeSpecName "kube-api-access-zrphw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:29:15 crc kubenswrapper[4792]: I0309 09:29:15.553567 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zrphw\" (UniqueName: \"kubernetes.io/projected/143ee92a-26eb-464c-9fb5-6adbe64b31e5-kube-api-access-zrphw\") on node \"crc\" DevicePath \"\"" Mar 09 09:29:15 crc kubenswrapper[4792]: I0309 09:29:15.581671 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/143ee92a-26eb-464c-9fb5-6adbe64b31e5-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "143ee92a-26eb-464c-9fb5-6adbe64b31e5" (UID: "143ee92a-26eb-464c-9fb5-6adbe64b31e5"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:29:15 crc kubenswrapper[4792]: I0309 09:29:15.611412 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/143ee92a-26eb-464c-9fb5-6adbe64b31e5-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "143ee92a-26eb-464c-9fb5-6adbe64b31e5" (UID: "143ee92a-26eb-464c-9fb5-6adbe64b31e5"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:29:15 crc kubenswrapper[4792]: I0309 09:29:15.614167 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/143ee92a-26eb-464c-9fb5-6adbe64b31e5-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "143ee92a-26eb-464c-9fb5-6adbe64b31e5" (UID: "143ee92a-26eb-464c-9fb5-6adbe64b31e5"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:29:15 crc kubenswrapper[4792]: I0309 09:29:15.653625 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/143ee92a-26eb-464c-9fb5-6adbe64b31e5-config" (OuterVolumeSpecName: "config") pod "143ee92a-26eb-464c-9fb5-6adbe64b31e5" (UID: "143ee92a-26eb-464c-9fb5-6adbe64b31e5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:29:15 crc kubenswrapper[4792]: I0309 09:29:15.691874 4792 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/143ee92a-26eb-464c-9fb5-6adbe64b31e5-config\") on node \"crc\" DevicePath \"\"" Mar 09 09:29:15 crc kubenswrapper[4792]: I0309 09:29:15.691916 4792 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/143ee92a-26eb-464c-9fb5-6adbe64b31e5-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 09 09:29:15 crc kubenswrapper[4792]: I0309 09:29:15.691927 4792 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/143ee92a-26eb-464c-9fb5-6adbe64b31e5-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 09 09:29:15 crc kubenswrapper[4792]: I0309 09:29:15.691936 4792 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/143ee92a-26eb-464c-9fb5-6adbe64b31e5-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 09 09:29:15 crc kubenswrapper[4792]: I0309 09:29:15.962359 4792 generic.go:334] "Generic (PLEG): container finished" podID="0425ca30-3bee-4b08-a54c-4d60cbd32edf" containerID="5ebfbb65fac45832891c2dcd0f138cdd9b15aad2e6c7e9eceac9bd8d022f0f38" exitCode=0 Mar 09 09:29:15 crc kubenswrapper[4792]: I0309 09:29:15.962741 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"0425ca30-3bee-4b08-a54c-4d60cbd32edf","Type":"ContainerDied","Data":"5ebfbb65fac45832891c2dcd0f138cdd9b15aad2e6c7e9eceac9bd8d022f0f38"} Mar 09 09:29:15 crc kubenswrapper[4792]: I0309 09:29:15.964933 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7675674687-7sdtd" event={"ID":"143ee92a-26eb-464c-9fb5-6adbe64b31e5","Type":"ContainerDied","Data":"ca765f0ed135e4ed5f385004a74f853f3b9ce90e6af2993f95ebab0d16c1af36"} Mar 09 09:29:15 crc kubenswrapper[4792]: I0309 09:29:15.964993 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7675674687-7sdtd" Mar 09 09:29:15 crc kubenswrapper[4792]: I0309 09:29:15.965155 4792 scope.go:117] "RemoveContainer" containerID="406149531b4ee4071941618d21a0e78eaaf69465357a64e61e63f6fdb4ca46e9" Mar 09 09:29:16 crc kubenswrapper[4792]: I0309 09:29:16.007756 4792 scope.go:117] "RemoveContainer" containerID="19f9875412ad1d0ad9ceca08c40385abe736b591a5fbd07cc2f258fe77b56361" Mar 09 09:29:16 crc kubenswrapper[4792]: I0309 09:29:16.009645 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7675674687-7sdtd"] Mar 09 09:29:16 crc kubenswrapper[4792]: I0309 09:29:16.024617 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7675674687-7sdtd"] Mar 09 09:29:16 crc kubenswrapper[4792]: I0309 09:29:16.981837 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"da658112-b6e4-4493-a46a-0add09e299f6","Type":"ContainerStarted","Data":"2a97792327d6fc040cbb799e722f6d4085924fe68854d43672ffb289e770087a"} Mar 09 09:29:16 crc kubenswrapper[4792]: I0309 09:29:16.982427 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Mar 09 09:29:17 crc kubenswrapper[4792]: I0309 09:29:17.067537 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" 
podStartSLOduration=2.8941338119999998 podStartE2EDuration="4.067514837s" podCreationTimestamp="2026-03-09 09:29:13 +0000 UTC" firstStartedPulling="2026-03-09 09:29:14.28151544 +0000 UTC m=+1319.311716192" lastFinishedPulling="2026-03-09 09:29:15.454896465 +0000 UTC m=+1320.485097217" observedRunningTime="2026-03-09 09:29:17.051089601 +0000 UTC m=+1322.081290353" watchObservedRunningTime="2026-03-09 09:29:17.067514837 +0000 UTC m=+1322.097715589" Mar 09 09:29:17 crc kubenswrapper[4792]: I0309 09:29:17.692891 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="143ee92a-26eb-464c-9fb5-6adbe64b31e5" path="/var/lib/kubelet/pods/143ee92a-26eb-464c-9fb5-6adbe64b31e5/volumes" Mar 09 09:29:20 crc kubenswrapper[4792]: I0309 09:29:20.012296 4792 generic.go:334] "Generic (PLEG): container finished" podID="b5277744-0423-4554-8512-da2a35eaaafd" containerID="55b4abdb4e5c3b239676c5bb57c8cabb00688f5febc4fdf58e595a1c329a41e7" exitCode=0 Mar 09 09:29:20 crc kubenswrapper[4792]: I0309 09:29:20.012385 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-xv7wk" event={"ID":"b5277744-0423-4554-8512-da2a35eaaafd","Type":"ContainerDied","Data":"55b4abdb4e5c3b239676c5bb57c8cabb00688f5febc4fdf58e595a1c329a41e7"} Mar 09 09:29:20 crc kubenswrapper[4792]: I0309 09:29:20.923211 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 09 09:29:21 crc kubenswrapper[4792]: I0309 09:29:21.040588 4792 generic.go:334] "Generic (PLEG): container finished" podID="0425ca30-3bee-4b08-a54c-4d60cbd32edf" containerID="2a3848e438c6dbffee43051fbcf0d3205234a605c83601ad68937e50e94e4faf" exitCode=0 Mar 09 09:29:21 crc kubenswrapper[4792]: I0309 09:29:21.040667 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 09 09:29:21 crc kubenswrapper[4792]: I0309 09:29:21.040678 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0425ca30-3bee-4b08-a54c-4d60cbd32edf","Type":"ContainerDied","Data":"2a3848e438c6dbffee43051fbcf0d3205234a605c83601ad68937e50e94e4faf"} Mar 09 09:29:21 crc kubenswrapper[4792]: I0309 09:29:21.040715 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0425ca30-3bee-4b08-a54c-4d60cbd32edf","Type":"ContainerDied","Data":"ece602dccf65c8002751bfbc4069098a23d98603e14afce85d731cae7f300e02"} Mar 09 09:29:21 crc kubenswrapper[4792]: I0309 09:29:21.040756 4792 scope.go:117] "RemoveContainer" containerID="5ebfbb65fac45832891c2dcd0f138cdd9b15aad2e6c7e9eceac9bd8d022f0f38" Mar 09 09:29:21 crc kubenswrapper[4792]: I0309 09:29:21.096111 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0425ca30-3bee-4b08-a54c-4d60cbd32edf-config-data\") pod \"0425ca30-3bee-4b08-a54c-4d60cbd32edf\" (UID: \"0425ca30-3bee-4b08-a54c-4d60cbd32edf\") " Mar 09 09:29:21 crc kubenswrapper[4792]: I0309 09:29:21.096237 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0425ca30-3bee-4b08-a54c-4d60cbd32edf-log-httpd\") pod \"0425ca30-3bee-4b08-a54c-4d60cbd32edf\" (UID: \"0425ca30-3bee-4b08-a54c-4d60cbd32edf\") " Mar 09 09:29:21 crc kubenswrapper[4792]: I0309 09:29:21.096255 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0425ca30-3bee-4b08-a54c-4d60cbd32edf-run-httpd\") pod \"0425ca30-3bee-4b08-a54c-4d60cbd32edf\" (UID: \"0425ca30-3bee-4b08-a54c-4d60cbd32edf\") " Mar 09 09:29:21 crc kubenswrapper[4792]: I0309 09:29:21.096383 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"kube-api-access-mtw5d\" (UniqueName: \"kubernetes.io/projected/0425ca30-3bee-4b08-a54c-4d60cbd32edf-kube-api-access-mtw5d\") pod \"0425ca30-3bee-4b08-a54c-4d60cbd32edf\" (UID: \"0425ca30-3bee-4b08-a54c-4d60cbd32edf\") " Mar 09 09:29:21 crc kubenswrapper[4792]: I0309 09:29:21.096436 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0425ca30-3bee-4b08-a54c-4d60cbd32edf-scripts\") pod \"0425ca30-3bee-4b08-a54c-4d60cbd32edf\" (UID: \"0425ca30-3bee-4b08-a54c-4d60cbd32edf\") " Mar 09 09:29:21 crc kubenswrapper[4792]: I0309 09:29:21.096489 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0425ca30-3bee-4b08-a54c-4d60cbd32edf-sg-core-conf-yaml\") pod \"0425ca30-3bee-4b08-a54c-4d60cbd32edf\" (UID: \"0425ca30-3bee-4b08-a54c-4d60cbd32edf\") " Mar 09 09:29:21 crc kubenswrapper[4792]: I0309 09:29:21.096509 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0425ca30-3bee-4b08-a54c-4d60cbd32edf-combined-ca-bundle\") pod \"0425ca30-3bee-4b08-a54c-4d60cbd32edf\" (UID: \"0425ca30-3bee-4b08-a54c-4d60cbd32edf\") " Mar 09 09:29:21 crc kubenswrapper[4792]: I0309 09:29:21.097132 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0425ca30-3bee-4b08-a54c-4d60cbd32edf-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "0425ca30-3bee-4b08-a54c-4d60cbd32edf" (UID: "0425ca30-3bee-4b08-a54c-4d60cbd32edf"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:29:21 crc kubenswrapper[4792]: I0309 09:29:21.097458 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0425ca30-3bee-4b08-a54c-4d60cbd32edf-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "0425ca30-3bee-4b08-a54c-4d60cbd32edf" (UID: "0425ca30-3bee-4b08-a54c-4d60cbd32edf"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:29:21 crc kubenswrapper[4792]: I0309 09:29:21.114057 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0425ca30-3bee-4b08-a54c-4d60cbd32edf-kube-api-access-mtw5d" (OuterVolumeSpecName: "kube-api-access-mtw5d") pod "0425ca30-3bee-4b08-a54c-4d60cbd32edf" (UID: "0425ca30-3bee-4b08-a54c-4d60cbd32edf"). InnerVolumeSpecName "kube-api-access-mtw5d". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:29:21 crc kubenswrapper[4792]: I0309 09:29:21.117593 4792 scope.go:117] "RemoveContainer" containerID="0b40ce2cd6f544d951e34e90c37e6b91963891ae65009ae096e9fa2c0e39d383" Mar 09 09:29:21 crc kubenswrapper[4792]: I0309 09:29:21.133762 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0425ca30-3bee-4b08-a54c-4d60cbd32edf-scripts" (OuterVolumeSpecName: "scripts") pod "0425ca30-3bee-4b08-a54c-4d60cbd32edf" (UID: "0425ca30-3bee-4b08-a54c-4d60cbd32edf"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:29:21 crc kubenswrapper[4792]: I0309 09:29:21.164378 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0425ca30-3bee-4b08-a54c-4d60cbd32edf-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "0425ca30-3bee-4b08-a54c-4d60cbd32edf" (UID: "0425ca30-3bee-4b08-a54c-4d60cbd32edf"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:29:21 crc kubenswrapper[4792]: I0309 09:29:21.201978 4792 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0425ca30-3bee-4b08-a54c-4d60cbd32edf-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 09 09:29:21 crc kubenswrapper[4792]: I0309 09:29:21.202011 4792 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0425ca30-3bee-4b08-a54c-4d60cbd32edf-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 09 09:29:21 crc kubenswrapper[4792]: I0309 09:29:21.202020 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mtw5d\" (UniqueName: \"kubernetes.io/projected/0425ca30-3bee-4b08-a54c-4d60cbd32edf-kube-api-access-mtw5d\") on node \"crc\" DevicePath \"\"" Mar 09 09:29:21 crc kubenswrapper[4792]: I0309 09:29:21.202030 4792 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0425ca30-3bee-4b08-a54c-4d60cbd32edf-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 09:29:21 crc kubenswrapper[4792]: I0309 09:29:21.202039 4792 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0425ca30-3bee-4b08-a54c-4d60cbd32edf-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 09 09:29:21 crc kubenswrapper[4792]: I0309 09:29:21.244424 4792 scope.go:117] "RemoveContainer" containerID="2a3848e438c6dbffee43051fbcf0d3205234a605c83601ad68937e50e94e4faf" Mar 09 09:29:21 crc kubenswrapper[4792]: I0309 09:29:21.347206 4792 scope.go:117] "RemoveContainer" containerID="95a2194701df02e88fcff1ab3cc4393279c53267f9ace718d654f02a53fef423" Mar 09 09:29:21 crc kubenswrapper[4792]: I0309 09:29:21.350334 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0425ca30-3bee-4b08-a54c-4d60cbd32edf-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod 
"0425ca30-3bee-4b08-a54c-4d60cbd32edf" (UID: "0425ca30-3bee-4b08-a54c-4d60cbd32edf"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:29:21 crc kubenswrapper[4792]: I0309 09:29:21.403144 4792 scope.go:117] "RemoveContainer" containerID="5ebfbb65fac45832891c2dcd0f138cdd9b15aad2e6c7e9eceac9bd8d022f0f38" Mar 09 09:29:21 crc kubenswrapper[4792]: I0309 09:29:21.409543 4792 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0425ca30-3bee-4b08-a54c-4d60cbd32edf-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 09:29:21 crc kubenswrapper[4792]: E0309 09:29:21.409630 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5ebfbb65fac45832891c2dcd0f138cdd9b15aad2e6c7e9eceac9bd8d022f0f38\": container with ID starting with 5ebfbb65fac45832891c2dcd0f138cdd9b15aad2e6c7e9eceac9bd8d022f0f38 not found: ID does not exist" containerID="5ebfbb65fac45832891c2dcd0f138cdd9b15aad2e6c7e9eceac9bd8d022f0f38" Mar 09 09:29:21 crc kubenswrapper[4792]: I0309 09:29:21.409679 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ebfbb65fac45832891c2dcd0f138cdd9b15aad2e6c7e9eceac9bd8d022f0f38"} err="failed to get container status \"5ebfbb65fac45832891c2dcd0f138cdd9b15aad2e6c7e9eceac9bd8d022f0f38\": rpc error: code = NotFound desc = could not find container \"5ebfbb65fac45832891c2dcd0f138cdd9b15aad2e6c7e9eceac9bd8d022f0f38\": container with ID starting with 5ebfbb65fac45832891c2dcd0f138cdd9b15aad2e6c7e9eceac9bd8d022f0f38 not found: ID does not exist" Mar 09 09:29:21 crc kubenswrapper[4792]: I0309 09:29:21.409710 4792 scope.go:117] "RemoveContainer" containerID="0b40ce2cd6f544d951e34e90c37e6b91963891ae65009ae096e9fa2c0e39d383" Mar 09 09:29:21 crc kubenswrapper[4792]: E0309 09:29:21.415431 4792 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"0b40ce2cd6f544d951e34e90c37e6b91963891ae65009ae096e9fa2c0e39d383\": container with ID starting with 0b40ce2cd6f544d951e34e90c37e6b91963891ae65009ae096e9fa2c0e39d383 not found: ID does not exist" containerID="0b40ce2cd6f544d951e34e90c37e6b91963891ae65009ae096e9fa2c0e39d383" Mar 09 09:29:21 crc kubenswrapper[4792]: I0309 09:29:21.415484 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b40ce2cd6f544d951e34e90c37e6b91963891ae65009ae096e9fa2c0e39d383"} err="failed to get container status \"0b40ce2cd6f544d951e34e90c37e6b91963891ae65009ae096e9fa2c0e39d383\": rpc error: code = NotFound desc = could not find container \"0b40ce2cd6f544d951e34e90c37e6b91963891ae65009ae096e9fa2c0e39d383\": container with ID starting with 0b40ce2cd6f544d951e34e90c37e6b91963891ae65009ae096e9fa2c0e39d383 not found: ID does not exist" Mar 09 09:29:21 crc kubenswrapper[4792]: I0309 09:29:21.415515 4792 scope.go:117] "RemoveContainer" containerID="2a3848e438c6dbffee43051fbcf0d3205234a605c83601ad68937e50e94e4faf" Mar 09 09:29:21 crc kubenswrapper[4792]: E0309 09:29:21.419275 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2a3848e438c6dbffee43051fbcf0d3205234a605c83601ad68937e50e94e4faf\": container with ID starting with 2a3848e438c6dbffee43051fbcf0d3205234a605c83601ad68937e50e94e4faf not found: ID does not exist" containerID="2a3848e438c6dbffee43051fbcf0d3205234a605c83601ad68937e50e94e4faf" Mar 09 09:29:21 crc kubenswrapper[4792]: I0309 09:29:21.419317 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a3848e438c6dbffee43051fbcf0d3205234a605c83601ad68937e50e94e4faf"} err="failed to get container status \"2a3848e438c6dbffee43051fbcf0d3205234a605c83601ad68937e50e94e4faf\": rpc error: code = NotFound desc = could not find container 
\"2a3848e438c6dbffee43051fbcf0d3205234a605c83601ad68937e50e94e4faf\": container with ID starting with 2a3848e438c6dbffee43051fbcf0d3205234a605c83601ad68937e50e94e4faf not found: ID does not exist" Mar 09 09:29:21 crc kubenswrapper[4792]: I0309 09:29:21.419346 4792 scope.go:117] "RemoveContainer" containerID="95a2194701df02e88fcff1ab3cc4393279c53267f9ace718d654f02a53fef423" Mar 09 09:29:21 crc kubenswrapper[4792]: E0309 09:29:21.421725 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"95a2194701df02e88fcff1ab3cc4393279c53267f9ace718d654f02a53fef423\": container with ID starting with 95a2194701df02e88fcff1ab3cc4393279c53267f9ace718d654f02a53fef423 not found: ID does not exist" containerID="95a2194701df02e88fcff1ab3cc4393279c53267f9ace718d654f02a53fef423" Mar 09 09:29:21 crc kubenswrapper[4792]: I0309 09:29:21.421765 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"95a2194701df02e88fcff1ab3cc4393279c53267f9ace718d654f02a53fef423"} err="failed to get container status \"95a2194701df02e88fcff1ab3cc4393279c53267f9ace718d654f02a53fef423\": rpc error: code = NotFound desc = could not find container \"95a2194701df02e88fcff1ab3cc4393279c53267f9ace718d654f02a53fef423\": container with ID starting with 95a2194701df02e88fcff1ab3cc4393279c53267f9ace718d654f02a53fef423 not found: ID does not exist" Mar 09 09:29:21 crc kubenswrapper[4792]: I0309 09:29:21.431455 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0425ca30-3bee-4b08-a54c-4d60cbd32edf-config-data" (OuterVolumeSpecName: "config-data") pod "0425ca30-3bee-4b08-a54c-4d60cbd32edf" (UID: "0425ca30-3bee-4b08-a54c-4d60cbd32edf"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:29:21 crc kubenswrapper[4792]: I0309 09:29:21.513273 4792 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0425ca30-3bee-4b08-a54c-4d60cbd32edf-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 09:29:21 crc kubenswrapper[4792]: I0309 09:29:21.600710 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-xv7wk" Mar 09 09:29:21 crc kubenswrapper[4792]: I0309 09:29:21.688174 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 09 09:29:21 crc kubenswrapper[4792]: I0309 09:29:21.689815 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 09 09:29:21 crc kubenswrapper[4792]: I0309 09:29:21.717129 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5277744-0423-4554-8512-da2a35eaaafd-combined-ca-bundle\") pod \"b5277744-0423-4554-8512-da2a35eaaafd\" (UID: \"b5277744-0423-4554-8512-da2a35eaaafd\") " Mar 09 09:29:21 crc kubenswrapper[4792]: I0309 09:29:21.717187 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 09 09:29:21 crc kubenswrapper[4792]: I0309 09:29:21.717213 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b5277744-0423-4554-8512-da2a35eaaafd-scripts\") pod \"b5277744-0423-4554-8512-da2a35eaaafd\" (UID: \"b5277744-0423-4554-8512-da2a35eaaafd\") " Mar 09 09:29:21 crc kubenswrapper[4792]: I0309 09:29:21.717262 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5277744-0423-4554-8512-da2a35eaaafd-config-data\") pod \"b5277744-0423-4554-8512-da2a35eaaafd\" (UID: \"b5277744-0423-4554-8512-da2a35eaaafd\") " Mar 09 
09:29:21 crc kubenswrapper[4792]: I0309 09:29:21.717325 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gl5f8\" (UniqueName: \"kubernetes.io/projected/b5277744-0423-4554-8512-da2a35eaaafd-kube-api-access-gl5f8\") pod \"b5277744-0423-4554-8512-da2a35eaaafd\" (UID: \"b5277744-0423-4554-8512-da2a35eaaafd\") " Mar 09 09:29:21 crc kubenswrapper[4792]: E0309 09:29:21.717630 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="143ee92a-26eb-464c-9fb5-6adbe64b31e5" containerName="dnsmasq-dns" Mar 09 09:29:21 crc kubenswrapper[4792]: I0309 09:29:21.717650 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="143ee92a-26eb-464c-9fb5-6adbe64b31e5" containerName="dnsmasq-dns" Mar 09 09:29:21 crc kubenswrapper[4792]: E0309 09:29:21.717673 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0425ca30-3bee-4b08-a54c-4d60cbd32edf" containerName="ceilometer-central-agent" Mar 09 09:29:21 crc kubenswrapper[4792]: I0309 09:29:21.717681 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="0425ca30-3bee-4b08-a54c-4d60cbd32edf" containerName="ceilometer-central-agent" Mar 09 09:29:21 crc kubenswrapper[4792]: E0309 09:29:21.717689 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5277744-0423-4554-8512-da2a35eaaafd" containerName="nova-manage" Mar 09 09:29:21 crc kubenswrapper[4792]: I0309 09:29:21.717696 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5277744-0423-4554-8512-da2a35eaaafd" containerName="nova-manage" Mar 09 09:29:21 crc kubenswrapper[4792]: E0309 09:29:21.717717 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0425ca30-3bee-4b08-a54c-4d60cbd32edf" containerName="sg-core" Mar 09 09:29:21 crc kubenswrapper[4792]: I0309 09:29:21.717723 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="0425ca30-3bee-4b08-a54c-4d60cbd32edf" containerName="sg-core" Mar 09 09:29:21 crc kubenswrapper[4792]: E0309 09:29:21.717735 
4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0425ca30-3bee-4b08-a54c-4d60cbd32edf" containerName="ceilometer-notification-agent" Mar 09 09:29:21 crc kubenswrapper[4792]: I0309 09:29:21.717741 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="0425ca30-3bee-4b08-a54c-4d60cbd32edf" containerName="ceilometer-notification-agent" Mar 09 09:29:21 crc kubenswrapper[4792]: E0309 09:29:21.717756 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0425ca30-3bee-4b08-a54c-4d60cbd32edf" containerName="proxy-httpd" Mar 09 09:29:21 crc kubenswrapper[4792]: I0309 09:29:21.717761 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="0425ca30-3bee-4b08-a54c-4d60cbd32edf" containerName="proxy-httpd" Mar 09 09:29:21 crc kubenswrapper[4792]: E0309 09:29:21.717770 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="143ee92a-26eb-464c-9fb5-6adbe64b31e5" containerName="init" Mar 09 09:29:21 crc kubenswrapper[4792]: I0309 09:29:21.717777 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="143ee92a-26eb-464c-9fb5-6adbe64b31e5" containerName="init" Mar 09 09:29:21 crc kubenswrapper[4792]: I0309 09:29:21.717977 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="0425ca30-3bee-4b08-a54c-4d60cbd32edf" containerName="ceilometer-notification-agent" Mar 09 09:29:21 crc kubenswrapper[4792]: I0309 09:29:21.718003 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="0425ca30-3bee-4b08-a54c-4d60cbd32edf" containerName="sg-core" Mar 09 09:29:21 crc kubenswrapper[4792]: I0309 09:29:21.718026 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="0425ca30-3bee-4b08-a54c-4d60cbd32edf" containerName="ceilometer-central-agent" Mar 09 09:29:21 crc kubenswrapper[4792]: I0309 09:29:21.718038 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="0425ca30-3bee-4b08-a54c-4d60cbd32edf" containerName="proxy-httpd" Mar 09 09:29:21 crc kubenswrapper[4792]: I0309 
09:29:21.718058 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="b5277744-0423-4554-8512-da2a35eaaafd" containerName="nova-manage" Mar 09 09:29:21 crc kubenswrapper[4792]: I0309 09:29:21.718070 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="143ee92a-26eb-464c-9fb5-6adbe64b31e5" containerName="dnsmasq-dns" Mar 09 09:29:21 crc kubenswrapper[4792]: I0309 09:29:21.726903 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 09 09:29:21 crc kubenswrapper[4792]: I0309 09:29:21.728079 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5277744-0423-4554-8512-da2a35eaaafd-kube-api-access-gl5f8" (OuterVolumeSpecName: "kube-api-access-gl5f8") pod "b5277744-0423-4554-8512-da2a35eaaafd" (UID: "b5277744-0423-4554-8512-da2a35eaaafd"). InnerVolumeSpecName "kube-api-access-gl5f8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:29:21 crc kubenswrapper[4792]: I0309 09:29:21.729712 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5277744-0423-4554-8512-da2a35eaaafd-scripts" (OuterVolumeSpecName: "scripts") pod "b5277744-0423-4554-8512-da2a35eaaafd" (UID: "b5277744-0423-4554-8512-da2a35eaaafd"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:29:21 crc kubenswrapper[4792]: I0309 09:29:21.731132 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 09 09:29:21 crc kubenswrapper[4792]: I0309 09:29:21.738523 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 09 09:29:21 crc kubenswrapper[4792]: I0309 09:29:21.738630 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 09 09:29:21 crc kubenswrapper[4792]: I0309 09:29:21.738797 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 09 09:29:21 crc kubenswrapper[4792]: I0309 09:29:21.800294 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5277744-0423-4554-8512-da2a35eaaafd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b5277744-0423-4554-8512-da2a35eaaafd" (UID: "b5277744-0423-4554-8512-da2a35eaaafd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:29:21 crc kubenswrapper[4792]: I0309 09:29:21.804570 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5277744-0423-4554-8512-da2a35eaaafd-config-data" (OuterVolumeSpecName: "config-data") pod "b5277744-0423-4554-8512-da2a35eaaafd" (UID: "b5277744-0423-4554-8512-da2a35eaaafd"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:29:21 crc kubenswrapper[4792]: I0309 09:29:21.823817 4792 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5277744-0423-4554-8512-da2a35eaaafd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 09:29:21 crc kubenswrapper[4792]: I0309 09:29:21.823844 4792 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b5277744-0423-4554-8512-da2a35eaaafd-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 09:29:21 crc kubenswrapper[4792]: I0309 09:29:21.823853 4792 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5277744-0423-4554-8512-da2a35eaaafd-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 09:29:21 crc kubenswrapper[4792]: I0309 09:29:21.823863 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gl5f8\" (UniqueName: \"kubernetes.io/projected/b5277744-0423-4554-8512-da2a35eaaafd-kube-api-access-gl5f8\") on node \"crc\" DevicePath \"\"" Mar 09 09:29:21 crc kubenswrapper[4792]: I0309 09:29:21.925902 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rnbzt\" (UniqueName: \"kubernetes.io/projected/d557a1d7-8c39-4058-82d7-f290f1ad60b0-kube-api-access-rnbzt\") pod \"ceilometer-0\" (UID: \"d557a1d7-8c39-4058-82d7-f290f1ad60b0\") " pod="openstack/ceilometer-0" Mar 09 09:29:21 crc kubenswrapper[4792]: I0309 09:29:21.925952 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d557a1d7-8c39-4058-82d7-f290f1ad60b0-config-data\") pod \"ceilometer-0\" (UID: \"d557a1d7-8c39-4058-82d7-f290f1ad60b0\") " pod="openstack/ceilometer-0" Mar 09 09:29:21 crc kubenswrapper[4792]: I0309 09:29:21.926022 4792 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d557a1d7-8c39-4058-82d7-f290f1ad60b0-run-httpd\") pod \"ceilometer-0\" (UID: \"d557a1d7-8c39-4058-82d7-f290f1ad60b0\") " pod="openstack/ceilometer-0" Mar 09 09:29:21 crc kubenswrapper[4792]: I0309 09:29:21.926053 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d557a1d7-8c39-4058-82d7-f290f1ad60b0-scripts\") pod \"ceilometer-0\" (UID: \"d557a1d7-8c39-4058-82d7-f290f1ad60b0\") " pod="openstack/ceilometer-0" Mar 09 09:29:21 crc kubenswrapper[4792]: I0309 09:29:21.926084 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d557a1d7-8c39-4058-82d7-f290f1ad60b0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d557a1d7-8c39-4058-82d7-f290f1ad60b0\") " pod="openstack/ceilometer-0" Mar 09 09:29:21 crc kubenswrapper[4792]: I0309 09:29:21.926145 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/d557a1d7-8c39-4058-82d7-f290f1ad60b0-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"d557a1d7-8c39-4058-82d7-f290f1ad60b0\") " pod="openstack/ceilometer-0" Mar 09 09:29:21 crc kubenswrapper[4792]: I0309 09:29:21.926180 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d557a1d7-8c39-4058-82d7-f290f1ad60b0-log-httpd\") pod \"ceilometer-0\" (UID: \"d557a1d7-8c39-4058-82d7-f290f1ad60b0\") " pod="openstack/ceilometer-0" Mar 09 09:29:21 crc kubenswrapper[4792]: I0309 09:29:21.926219 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/d557a1d7-8c39-4058-82d7-f290f1ad60b0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d557a1d7-8c39-4058-82d7-f290f1ad60b0\") " pod="openstack/ceilometer-0" Mar 09 09:29:22 crc kubenswrapper[4792]: I0309 09:29:22.027413 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rnbzt\" (UniqueName: \"kubernetes.io/projected/d557a1d7-8c39-4058-82d7-f290f1ad60b0-kube-api-access-rnbzt\") pod \"ceilometer-0\" (UID: \"d557a1d7-8c39-4058-82d7-f290f1ad60b0\") " pod="openstack/ceilometer-0" Mar 09 09:29:22 crc kubenswrapper[4792]: I0309 09:29:22.027470 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d557a1d7-8c39-4058-82d7-f290f1ad60b0-config-data\") pod \"ceilometer-0\" (UID: \"d557a1d7-8c39-4058-82d7-f290f1ad60b0\") " pod="openstack/ceilometer-0" Mar 09 09:29:22 crc kubenswrapper[4792]: I0309 09:29:22.027535 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d557a1d7-8c39-4058-82d7-f290f1ad60b0-run-httpd\") pod \"ceilometer-0\" (UID: \"d557a1d7-8c39-4058-82d7-f290f1ad60b0\") " pod="openstack/ceilometer-0" Mar 09 09:29:22 crc kubenswrapper[4792]: I0309 09:29:22.027571 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d557a1d7-8c39-4058-82d7-f290f1ad60b0-scripts\") pod \"ceilometer-0\" (UID: \"d557a1d7-8c39-4058-82d7-f290f1ad60b0\") " pod="openstack/ceilometer-0" Mar 09 09:29:22 crc kubenswrapper[4792]: I0309 09:29:22.027599 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d557a1d7-8c39-4058-82d7-f290f1ad60b0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d557a1d7-8c39-4058-82d7-f290f1ad60b0\") " pod="openstack/ceilometer-0" Mar 09 09:29:22 crc kubenswrapper[4792]: I0309 
09:29:22.027630 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/d557a1d7-8c39-4058-82d7-f290f1ad60b0-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"d557a1d7-8c39-4058-82d7-f290f1ad60b0\") " pod="openstack/ceilometer-0" Mar 09 09:29:22 crc kubenswrapper[4792]: I0309 09:29:22.027659 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d557a1d7-8c39-4058-82d7-f290f1ad60b0-log-httpd\") pod \"ceilometer-0\" (UID: \"d557a1d7-8c39-4058-82d7-f290f1ad60b0\") " pod="openstack/ceilometer-0" Mar 09 09:29:22 crc kubenswrapper[4792]: I0309 09:29:22.027707 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d557a1d7-8c39-4058-82d7-f290f1ad60b0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d557a1d7-8c39-4058-82d7-f290f1ad60b0\") " pod="openstack/ceilometer-0" Mar 09 09:29:22 crc kubenswrapper[4792]: I0309 09:29:22.031595 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d557a1d7-8c39-4058-82d7-f290f1ad60b0-run-httpd\") pod \"ceilometer-0\" (UID: \"d557a1d7-8c39-4058-82d7-f290f1ad60b0\") " pod="openstack/ceilometer-0" Mar 09 09:29:22 crc kubenswrapper[4792]: I0309 09:29:22.031712 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d557a1d7-8c39-4058-82d7-f290f1ad60b0-log-httpd\") pod \"ceilometer-0\" (UID: \"d557a1d7-8c39-4058-82d7-f290f1ad60b0\") " pod="openstack/ceilometer-0" Mar 09 09:29:22 crc kubenswrapper[4792]: I0309 09:29:22.035908 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d557a1d7-8c39-4058-82d7-f290f1ad60b0-scripts\") pod \"ceilometer-0\" (UID: 
\"d557a1d7-8c39-4058-82d7-f290f1ad60b0\") " pod="openstack/ceilometer-0" Mar 09 09:29:22 crc kubenswrapper[4792]: I0309 09:29:22.036160 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d557a1d7-8c39-4058-82d7-f290f1ad60b0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d557a1d7-8c39-4058-82d7-f290f1ad60b0\") " pod="openstack/ceilometer-0" Mar 09 09:29:22 crc kubenswrapper[4792]: I0309 09:29:22.036846 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d557a1d7-8c39-4058-82d7-f290f1ad60b0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d557a1d7-8c39-4058-82d7-f290f1ad60b0\") " pod="openstack/ceilometer-0" Mar 09 09:29:22 crc kubenswrapper[4792]: I0309 09:29:22.037962 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d557a1d7-8c39-4058-82d7-f290f1ad60b0-config-data\") pod \"ceilometer-0\" (UID: \"d557a1d7-8c39-4058-82d7-f290f1ad60b0\") " pod="openstack/ceilometer-0" Mar 09 09:29:22 crc kubenswrapper[4792]: I0309 09:29:22.055659 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/d557a1d7-8c39-4058-82d7-f290f1ad60b0-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"d557a1d7-8c39-4058-82d7-f290f1ad60b0\") " pod="openstack/ceilometer-0" Mar 09 09:29:22 crc kubenswrapper[4792]: I0309 09:29:22.060199 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rnbzt\" (UniqueName: \"kubernetes.io/projected/d557a1d7-8c39-4058-82d7-f290f1ad60b0-kube-api-access-rnbzt\") pod \"ceilometer-0\" (UID: \"d557a1d7-8c39-4058-82d7-f290f1ad60b0\") " pod="openstack/ceilometer-0" Mar 09 09:29:22 crc kubenswrapper[4792]: I0309 09:29:22.102296 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-cell0-cell-mapping-xv7wk" event={"ID":"b5277744-0423-4554-8512-da2a35eaaafd","Type":"ContainerDied","Data":"185413276ebcf29c54868ec92e961c0602aeb9b9eda0fc337554dacbc0412bb7"} Mar 09 09:29:22 crc kubenswrapper[4792]: I0309 09:29:22.102384 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="185413276ebcf29c54868ec92e961c0602aeb9b9eda0fc337554dacbc0412bb7" Mar 09 09:29:22 crc kubenswrapper[4792]: I0309 09:29:22.102535 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-xv7wk" Mar 09 09:29:22 crc kubenswrapper[4792]: I0309 09:29:22.173694 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 09 09:29:22 crc kubenswrapper[4792]: I0309 09:29:22.361019 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 09 09:29:22 crc kubenswrapper[4792]: I0309 09:29:22.361636 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="38935b95-760a-4b84-bb4a-0a89f6b2d1ff" containerName="nova-api-log" containerID="cri-o://f52ce08b2b3cf4db8261893d951429b67067d659d42866fcc46cdae0021c2e60" gracePeriod=30 Mar 09 09:29:22 crc kubenswrapper[4792]: I0309 09:29:22.362208 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="38935b95-760a-4b84-bb4a-0a89f6b2d1ff" containerName="nova-api-api" containerID="cri-o://4611c277a21ba9569d4e27268131ce972a02816bb406dc8b74438c2b8f145cb3" gracePeriod=30 Mar 09 09:29:22 crc kubenswrapper[4792]: I0309 09:29:22.371085 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 09 09:29:22 crc kubenswrapper[4792]: I0309 09:29:22.371606 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="392f6124-a359-44d2-b9ba-2176b7a3debd" 
containerName="nova-scheduler-scheduler" containerID="cri-o://913d0de3b6f9b44689feb2573bf2725dab6c0f3e7a2266f132e60d88a150a4f2" gracePeriod=30 Mar 09 09:29:22 crc kubenswrapper[4792]: I0309 09:29:22.726474 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 09 09:29:23 crc kubenswrapper[4792]: I0309 09:29:23.112689 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d557a1d7-8c39-4058-82d7-f290f1ad60b0","Type":"ContainerStarted","Data":"91b56dbd7c1c3d2ffcc937846779e454a85ea88f690b5018d99bc40a0df192f4"} Mar 09 09:29:23 crc kubenswrapper[4792]: I0309 09:29:23.115753 4792 generic.go:334] "Generic (PLEG): container finished" podID="38935b95-760a-4b84-bb4a-0a89f6b2d1ff" containerID="f52ce08b2b3cf4db8261893d951429b67067d659d42866fcc46cdae0021c2e60" exitCode=143 Mar 09 09:29:23 crc kubenswrapper[4792]: I0309 09:29:23.115795 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"38935b95-760a-4b84-bb4a-0a89f6b2d1ff","Type":"ContainerDied","Data":"f52ce08b2b3cf4db8261893d951429b67067d659d42866fcc46cdae0021c2e60"} Mar 09 09:29:23 crc kubenswrapper[4792]: I0309 09:29:23.535380 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Mar 09 09:29:23 crc kubenswrapper[4792]: I0309 09:29:23.676461 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0425ca30-3bee-4b08-a54c-4d60cbd32edf" path="/var/lib/kubelet/pods/0425ca30-3bee-4b08-a54c-4d60cbd32edf/volumes" Mar 09 09:29:24 crc kubenswrapper[4792]: E0309 09:29:24.024199 4792 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="913d0de3b6f9b44689feb2573bf2725dab6c0f3e7a2266f132e60d88a150a4f2" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 09 09:29:24 crc kubenswrapper[4792]: 
E0309 09:29:24.026322 4792 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="913d0de3b6f9b44689feb2573bf2725dab6c0f3e7a2266f132e60d88a150a4f2" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 09 09:29:24 crc kubenswrapper[4792]: E0309 09:29:24.028758 4792 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="913d0de3b6f9b44689feb2573bf2725dab6c0f3e7a2266f132e60d88a150a4f2" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 09 09:29:24 crc kubenswrapper[4792]: E0309 09:29:24.028850 4792 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="392f6124-a359-44d2-b9ba-2176b7a3debd" containerName="nova-scheduler-scheduler" Mar 09 09:29:24 crc kubenswrapper[4792]: I0309 09:29:24.126793 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d557a1d7-8c39-4058-82d7-f290f1ad60b0","Type":"ContainerStarted","Data":"45122f2fb7a038895fcdb736b2370681efcc74a0dbbe428f1652b521a02b9cf8"} Mar 09 09:29:25 crc kubenswrapper[4792]: I0309 09:29:25.137460 4792 generic.go:334] "Generic (PLEG): container finished" podID="392f6124-a359-44d2-b9ba-2176b7a3debd" containerID="913d0de3b6f9b44689feb2573bf2725dab6c0f3e7a2266f132e60d88a150a4f2" exitCode=0 Mar 09 09:29:25 crc kubenswrapper[4792]: I0309 09:29:25.137517 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"392f6124-a359-44d2-b9ba-2176b7a3debd","Type":"ContainerDied","Data":"913d0de3b6f9b44689feb2573bf2725dab6c0f3e7a2266f132e60d88a150a4f2"} Mar 09 09:29:25 crc 
kubenswrapper[4792]: I0309 09:29:25.138117 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"392f6124-a359-44d2-b9ba-2176b7a3debd","Type":"ContainerDied","Data":"c4abb8eec696dfcda7dbfc4fc194e4138c2ad3f52a415436feccf71d7b9c63d9"} Mar 09 09:29:25 crc kubenswrapper[4792]: I0309 09:29:25.138143 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c4abb8eec696dfcda7dbfc4fc194e4138c2ad3f52a415436feccf71d7b9c63d9" Mar 09 09:29:25 crc kubenswrapper[4792]: I0309 09:29:25.140959 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d557a1d7-8c39-4058-82d7-f290f1ad60b0","Type":"ContainerStarted","Data":"e65146499b15666d6b2f32336aef02d3365db5921060e8e322121085ec54bb62"} Mar 09 09:29:25 crc kubenswrapper[4792]: I0309 09:29:25.185701 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 09 09:29:25 crc kubenswrapper[4792]: I0309 09:29:25.295968 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/392f6124-a359-44d2-b9ba-2176b7a3debd-combined-ca-bundle\") pod \"392f6124-a359-44d2-b9ba-2176b7a3debd\" (UID: \"392f6124-a359-44d2-b9ba-2176b7a3debd\") " Mar 09 09:29:25 crc kubenswrapper[4792]: I0309 09:29:25.296244 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xk442\" (UniqueName: \"kubernetes.io/projected/392f6124-a359-44d2-b9ba-2176b7a3debd-kube-api-access-xk442\") pod \"392f6124-a359-44d2-b9ba-2176b7a3debd\" (UID: \"392f6124-a359-44d2-b9ba-2176b7a3debd\") " Mar 09 09:29:25 crc kubenswrapper[4792]: I0309 09:29:25.296324 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/392f6124-a359-44d2-b9ba-2176b7a3debd-config-data\") pod \"392f6124-a359-44d2-b9ba-2176b7a3debd\" 
(UID: \"392f6124-a359-44d2-b9ba-2176b7a3debd\") " Mar 09 09:29:25 crc kubenswrapper[4792]: I0309 09:29:25.312756 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/392f6124-a359-44d2-b9ba-2176b7a3debd-kube-api-access-xk442" (OuterVolumeSpecName: "kube-api-access-xk442") pod "392f6124-a359-44d2-b9ba-2176b7a3debd" (UID: "392f6124-a359-44d2-b9ba-2176b7a3debd"). InnerVolumeSpecName "kube-api-access-xk442". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:29:25 crc kubenswrapper[4792]: I0309 09:29:25.360309 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/392f6124-a359-44d2-b9ba-2176b7a3debd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "392f6124-a359-44d2-b9ba-2176b7a3debd" (UID: "392f6124-a359-44d2-b9ba-2176b7a3debd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:29:25 crc kubenswrapper[4792]: I0309 09:29:25.362916 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/392f6124-a359-44d2-b9ba-2176b7a3debd-config-data" (OuterVolumeSpecName: "config-data") pod "392f6124-a359-44d2-b9ba-2176b7a3debd" (UID: "392f6124-a359-44d2-b9ba-2176b7a3debd"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:29:25 crc kubenswrapper[4792]: I0309 09:29:25.398959 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xk442\" (UniqueName: \"kubernetes.io/projected/392f6124-a359-44d2-b9ba-2176b7a3debd-kube-api-access-xk442\") on node \"crc\" DevicePath \"\"" Mar 09 09:29:25 crc kubenswrapper[4792]: I0309 09:29:25.399019 4792 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/392f6124-a359-44d2-b9ba-2176b7a3debd-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 09:29:25 crc kubenswrapper[4792]: I0309 09:29:25.399035 4792 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/392f6124-a359-44d2-b9ba-2176b7a3debd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 09:29:26 crc kubenswrapper[4792]: I0309 09:29:26.175520 4792 generic.go:334] "Generic (PLEG): container finished" podID="38935b95-760a-4b84-bb4a-0a89f6b2d1ff" containerID="4611c277a21ba9569d4e27268131ce972a02816bb406dc8b74438c2b8f145cb3" exitCode=0 Mar 09 09:29:26 crc kubenswrapper[4792]: I0309 09:29:26.175610 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"38935b95-760a-4b84-bb4a-0a89f6b2d1ff","Type":"ContainerDied","Data":"4611c277a21ba9569d4e27268131ce972a02816bb406dc8b74438c2b8f145cb3"} Mar 09 09:29:26 crc kubenswrapper[4792]: I0309 09:29:26.182880 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 09 09:29:26 crc kubenswrapper[4792]: I0309 09:29:26.183665 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d557a1d7-8c39-4058-82d7-f290f1ad60b0","Type":"ContainerStarted","Data":"54015b43ca8b4b1e8e9e69282aea4a91ea1b52223fce1b5fe77f15a07d3d07cd"} Mar 09 09:29:26 crc kubenswrapper[4792]: I0309 09:29:26.224902 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 09 09:29:26 crc kubenswrapper[4792]: I0309 09:29:26.244391 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Mar 09 09:29:26 crc kubenswrapper[4792]: I0309 09:29:26.256246 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 09 09:29:26 crc kubenswrapper[4792]: E0309 09:29:26.256687 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="392f6124-a359-44d2-b9ba-2176b7a3debd" containerName="nova-scheduler-scheduler" Mar 09 09:29:26 crc kubenswrapper[4792]: I0309 09:29:26.256711 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="392f6124-a359-44d2-b9ba-2176b7a3debd" containerName="nova-scheduler-scheduler" Mar 09 09:29:26 crc kubenswrapper[4792]: I0309 09:29:26.256914 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="392f6124-a359-44d2-b9ba-2176b7a3debd" containerName="nova-scheduler-scheduler" Mar 09 09:29:26 crc kubenswrapper[4792]: I0309 09:29:26.257593 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 09 09:29:26 crc kubenswrapper[4792]: I0309 09:29:26.266146 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 09 09:29:26 crc kubenswrapper[4792]: I0309 09:29:26.288149 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 09 09:29:26 crc kubenswrapper[4792]: I0309 09:29:26.374825 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 09 09:29:26 crc kubenswrapper[4792]: I0309 09:29:26.420111 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zvqvf\" (UniqueName: \"kubernetes.io/projected/80a8330d-39e6-4b5c-a696-4a1f8e1df1ab-kube-api-access-zvqvf\") pod \"nova-scheduler-0\" (UID: \"80a8330d-39e6-4b5c-a696-4a1f8e1df1ab\") " pod="openstack/nova-scheduler-0" Mar 09 09:29:26 crc kubenswrapper[4792]: I0309 09:29:26.420229 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80a8330d-39e6-4b5c-a696-4a1f8e1df1ab-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"80a8330d-39e6-4b5c-a696-4a1f8e1df1ab\") " pod="openstack/nova-scheduler-0" Mar 09 09:29:26 crc kubenswrapper[4792]: I0309 09:29:26.420313 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80a8330d-39e6-4b5c-a696-4a1f8e1df1ab-config-data\") pod \"nova-scheduler-0\" (UID: \"80a8330d-39e6-4b5c-a696-4a1f8e1df1ab\") " pod="openstack/nova-scheduler-0" Mar 09 09:29:26 crc kubenswrapper[4792]: I0309 09:29:26.521349 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/38935b95-760a-4b84-bb4a-0a89f6b2d1ff-logs\") pod \"38935b95-760a-4b84-bb4a-0a89f6b2d1ff\" 
(UID: \"38935b95-760a-4b84-bb4a-0a89f6b2d1ff\") " Mar 09 09:29:26 crc kubenswrapper[4792]: I0309 09:29:26.521444 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k68hk\" (UniqueName: \"kubernetes.io/projected/38935b95-760a-4b84-bb4a-0a89f6b2d1ff-kube-api-access-k68hk\") pod \"38935b95-760a-4b84-bb4a-0a89f6b2d1ff\" (UID: \"38935b95-760a-4b84-bb4a-0a89f6b2d1ff\") " Mar 09 09:29:26 crc kubenswrapper[4792]: I0309 09:29:26.521468 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38935b95-760a-4b84-bb4a-0a89f6b2d1ff-combined-ca-bundle\") pod \"38935b95-760a-4b84-bb4a-0a89f6b2d1ff\" (UID: \"38935b95-760a-4b84-bb4a-0a89f6b2d1ff\") " Mar 09 09:29:26 crc kubenswrapper[4792]: I0309 09:29:26.521601 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38935b95-760a-4b84-bb4a-0a89f6b2d1ff-config-data\") pod \"38935b95-760a-4b84-bb4a-0a89f6b2d1ff\" (UID: \"38935b95-760a-4b84-bb4a-0a89f6b2d1ff\") " Mar 09 09:29:26 crc kubenswrapper[4792]: I0309 09:29:26.521804 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zvqvf\" (UniqueName: \"kubernetes.io/projected/80a8330d-39e6-4b5c-a696-4a1f8e1df1ab-kube-api-access-zvqvf\") pod \"nova-scheduler-0\" (UID: \"80a8330d-39e6-4b5c-a696-4a1f8e1df1ab\") " pod="openstack/nova-scheduler-0" Mar 09 09:29:26 crc kubenswrapper[4792]: I0309 09:29:26.521907 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80a8330d-39e6-4b5c-a696-4a1f8e1df1ab-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"80a8330d-39e6-4b5c-a696-4a1f8e1df1ab\") " pod="openstack/nova-scheduler-0" Mar 09 09:29:26 crc kubenswrapper[4792]: I0309 09:29:26.521953 4792 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80a8330d-39e6-4b5c-a696-4a1f8e1df1ab-config-data\") pod \"nova-scheduler-0\" (UID: \"80a8330d-39e6-4b5c-a696-4a1f8e1df1ab\") " pod="openstack/nova-scheduler-0" Mar 09 09:29:26 crc kubenswrapper[4792]: I0309 09:29:26.521954 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/38935b95-760a-4b84-bb4a-0a89f6b2d1ff-logs" (OuterVolumeSpecName: "logs") pod "38935b95-760a-4b84-bb4a-0a89f6b2d1ff" (UID: "38935b95-760a-4b84-bb4a-0a89f6b2d1ff"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:29:26 crc kubenswrapper[4792]: I0309 09:29:26.528928 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38935b95-760a-4b84-bb4a-0a89f6b2d1ff-kube-api-access-k68hk" (OuterVolumeSpecName: "kube-api-access-k68hk") pod "38935b95-760a-4b84-bb4a-0a89f6b2d1ff" (UID: "38935b95-760a-4b84-bb4a-0a89f6b2d1ff"). InnerVolumeSpecName "kube-api-access-k68hk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:29:26 crc kubenswrapper[4792]: I0309 09:29:26.530623 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80a8330d-39e6-4b5c-a696-4a1f8e1df1ab-config-data\") pod \"nova-scheduler-0\" (UID: \"80a8330d-39e6-4b5c-a696-4a1f8e1df1ab\") " pod="openstack/nova-scheduler-0" Mar 09 09:29:26 crc kubenswrapper[4792]: I0309 09:29:26.532244 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80a8330d-39e6-4b5c-a696-4a1f8e1df1ab-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"80a8330d-39e6-4b5c-a696-4a1f8e1df1ab\") " pod="openstack/nova-scheduler-0" Mar 09 09:29:26 crc kubenswrapper[4792]: I0309 09:29:26.546048 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zvqvf\" (UniqueName: \"kubernetes.io/projected/80a8330d-39e6-4b5c-a696-4a1f8e1df1ab-kube-api-access-zvqvf\") pod \"nova-scheduler-0\" (UID: \"80a8330d-39e6-4b5c-a696-4a1f8e1df1ab\") " pod="openstack/nova-scheduler-0" Mar 09 09:29:26 crc kubenswrapper[4792]: I0309 09:29:26.579152 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38935b95-760a-4b84-bb4a-0a89f6b2d1ff-config-data" (OuterVolumeSpecName: "config-data") pod "38935b95-760a-4b84-bb4a-0a89f6b2d1ff" (UID: "38935b95-760a-4b84-bb4a-0a89f6b2d1ff"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:29:26 crc kubenswrapper[4792]: I0309 09:29:26.584793 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38935b95-760a-4b84-bb4a-0a89f6b2d1ff-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "38935b95-760a-4b84-bb4a-0a89f6b2d1ff" (UID: "38935b95-760a-4b84-bb4a-0a89f6b2d1ff"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:29:26 crc kubenswrapper[4792]: I0309 09:29:26.593249 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 09 09:29:26 crc kubenswrapper[4792]: I0309 09:29:26.627320 4792 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/38935b95-760a-4b84-bb4a-0a89f6b2d1ff-logs\") on node \"crc\" DevicePath \"\"" Mar 09 09:29:26 crc kubenswrapper[4792]: I0309 09:29:26.627670 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k68hk\" (UniqueName: \"kubernetes.io/projected/38935b95-760a-4b84-bb4a-0a89f6b2d1ff-kube-api-access-k68hk\") on node \"crc\" DevicePath \"\"" Mar 09 09:29:26 crc kubenswrapper[4792]: I0309 09:29:26.627687 4792 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38935b95-760a-4b84-bb4a-0a89f6b2d1ff-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 09:29:26 crc kubenswrapper[4792]: I0309 09:29:26.627699 4792 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38935b95-760a-4b84-bb4a-0a89f6b2d1ff-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 09:29:27 crc kubenswrapper[4792]: I0309 09:29:27.193475 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"38935b95-760a-4b84-bb4a-0a89f6b2d1ff","Type":"ContainerDied","Data":"51472b9980e1a67b4cda57b44043142efedac98a47d90e928395b0510edb6e42"} Mar 09 09:29:27 crc kubenswrapper[4792]: I0309 09:29:27.193518 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 09 09:29:27 crc kubenswrapper[4792]: I0309 09:29:27.193853 4792 scope.go:117] "RemoveContainer" containerID="4611c277a21ba9569d4e27268131ce972a02816bb406dc8b74438c2b8f145cb3" Mar 09 09:29:27 crc kubenswrapper[4792]: I0309 09:29:27.219382 4792 scope.go:117] "RemoveContainer" containerID="f52ce08b2b3cf4db8261893d951429b67067d659d42866fcc46cdae0021c2e60" Mar 09 09:29:27 crc kubenswrapper[4792]: I0309 09:29:27.233841 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 09 09:29:27 crc kubenswrapper[4792]: I0309 09:29:27.247877 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 09 09:29:27 crc kubenswrapper[4792]: I0309 09:29:27.258384 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 09 09:29:27 crc kubenswrapper[4792]: I0309 09:29:27.269094 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 09 09:29:27 crc kubenswrapper[4792]: E0309 09:29:27.269541 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38935b95-760a-4b84-bb4a-0a89f6b2d1ff" containerName="nova-api-api" Mar 09 09:29:27 crc kubenswrapper[4792]: I0309 09:29:27.269566 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="38935b95-760a-4b84-bb4a-0a89f6b2d1ff" containerName="nova-api-api" Mar 09 09:29:27 crc kubenswrapper[4792]: E0309 09:29:27.269580 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38935b95-760a-4b84-bb4a-0a89f6b2d1ff" containerName="nova-api-log" Mar 09 09:29:27 crc kubenswrapper[4792]: I0309 09:29:27.269589 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="38935b95-760a-4b84-bb4a-0a89f6b2d1ff" containerName="nova-api-log" Mar 09 09:29:27 crc kubenswrapper[4792]: I0309 09:29:27.269760 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="38935b95-760a-4b84-bb4a-0a89f6b2d1ff" containerName="nova-api-api" Mar 09 09:29:27 crc 
kubenswrapper[4792]: I0309 09:29:27.269783 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="38935b95-760a-4b84-bb4a-0a89f6b2d1ff" containerName="nova-api-log" Mar 09 09:29:27 crc kubenswrapper[4792]: W0309 09:29:27.269811 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod80a8330d_39e6_4b5c_a696_4a1f8e1df1ab.slice/crio-4b627b47d64369c79111c1273048c7a7a04db8cf18db5ed827472650517b8d97 WatchSource:0}: Error finding container 4b627b47d64369c79111c1273048c7a7a04db8cf18db5ed827472650517b8d97: Status 404 returned error can't find the container with id 4b627b47d64369c79111c1273048c7a7a04db8cf18db5ed827472650517b8d97 Mar 09 09:29:27 crc kubenswrapper[4792]: I0309 09:29:27.271839 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 09 09:29:27 crc kubenswrapper[4792]: I0309 09:29:27.274460 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 09 09:29:27 crc kubenswrapper[4792]: I0309 09:29:27.285126 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 09 09:29:27 crc kubenswrapper[4792]: I0309 09:29:27.339860 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a43b6c63-0434-4cfa-a525-05e08f7ee87c-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a43b6c63-0434-4cfa-a525-05e08f7ee87c\") " pod="openstack/nova-api-0" Mar 09 09:29:27 crc kubenswrapper[4792]: I0309 09:29:27.344869 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cggqr\" (UniqueName: \"kubernetes.io/projected/a43b6c63-0434-4cfa-a525-05e08f7ee87c-kube-api-access-cggqr\") pod \"nova-api-0\" (UID: \"a43b6c63-0434-4cfa-a525-05e08f7ee87c\") " pod="openstack/nova-api-0" Mar 09 09:29:27 crc 
kubenswrapper[4792]: I0309 09:29:27.344997 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a43b6c63-0434-4cfa-a525-05e08f7ee87c-logs\") pod \"nova-api-0\" (UID: \"a43b6c63-0434-4cfa-a525-05e08f7ee87c\") " pod="openstack/nova-api-0" Mar 09 09:29:27 crc kubenswrapper[4792]: I0309 09:29:27.345105 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a43b6c63-0434-4cfa-a525-05e08f7ee87c-config-data\") pod \"nova-api-0\" (UID: \"a43b6c63-0434-4cfa-a525-05e08f7ee87c\") " pod="openstack/nova-api-0" Mar 09 09:29:27 crc kubenswrapper[4792]: I0309 09:29:27.448450 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a43b6c63-0434-4cfa-a525-05e08f7ee87c-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a43b6c63-0434-4cfa-a525-05e08f7ee87c\") " pod="openstack/nova-api-0" Mar 09 09:29:27 crc kubenswrapper[4792]: I0309 09:29:27.448686 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cggqr\" (UniqueName: \"kubernetes.io/projected/a43b6c63-0434-4cfa-a525-05e08f7ee87c-kube-api-access-cggqr\") pod \"nova-api-0\" (UID: \"a43b6c63-0434-4cfa-a525-05e08f7ee87c\") " pod="openstack/nova-api-0" Mar 09 09:29:27 crc kubenswrapper[4792]: I0309 09:29:27.448760 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a43b6c63-0434-4cfa-a525-05e08f7ee87c-logs\") pod \"nova-api-0\" (UID: \"a43b6c63-0434-4cfa-a525-05e08f7ee87c\") " pod="openstack/nova-api-0" Mar 09 09:29:27 crc kubenswrapper[4792]: I0309 09:29:27.448823 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/a43b6c63-0434-4cfa-a525-05e08f7ee87c-config-data\") pod \"nova-api-0\" (UID: \"a43b6c63-0434-4cfa-a525-05e08f7ee87c\") " pod="openstack/nova-api-0" Mar 09 09:29:27 crc kubenswrapper[4792]: I0309 09:29:27.455925 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a43b6c63-0434-4cfa-a525-05e08f7ee87c-logs\") pod \"nova-api-0\" (UID: \"a43b6c63-0434-4cfa-a525-05e08f7ee87c\") " pod="openstack/nova-api-0" Mar 09 09:29:27 crc kubenswrapper[4792]: I0309 09:29:27.457292 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a43b6c63-0434-4cfa-a525-05e08f7ee87c-config-data\") pod \"nova-api-0\" (UID: \"a43b6c63-0434-4cfa-a525-05e08f7ee87c\") " pod="openstack/nova-api-0" Mar 09 09:29:27 crc kubenswrapper[4792]: I0309 09:29:27.463582 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a43b6c63-0434-4cfa-a525-05e08f7ee87c-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a43b6c63-0434-4cfa-a525-05e08f7ee87c\") " pod="openstack/nova-api-0" Mar 09 09:29:27 crc kubenswrapper[4792]: I0309 09:29:27.470047 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cggqr\" (UniqueName: \"kubernetes.io/projected/a43b6c63-0434-4cfa-a525-05e08f7ee87c-kube-api-access-cggqr\") pod \"nova-api-0\" (UID: \"a43b6c63-0434-4cfa-a525-05e08f7ee87c\") " pod="openstack/nova-api-0" Mar 09 09:29:27 crc kubenswrapper[4792]: I0309 09:29:27.603635 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 09 09:29:27 crc kubenswrapper[4792]: I0309 09:29:27.688974 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="38935b95-760a-4b84-bb4a-0a89f6b2d1ff" path="/var/lib/kubelet/pods/38935b95-760a-4b84-bb4a-0a89f6b2d1ff/volumes" Mar 09 09:29:27 crc kubenswrapper[4792]: I0309 09:29:27.689690 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="392f6124-a359-44d2-b9ba-2176b7a3debd" path="/var/lib/kubelet/pods/392f6124-a359-44d2-b9ba-2176b7a3debd/volumes" Mar 09 09:29:28 crc kubenswrapper[4792]: I0309 09:29:28.198903 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 09 09:29:28 crc kubenswrapper[4792]: I0309 09:29:28.208534 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"80a8330d-39e6-4b5c-a696-4a1f8e1df1ab","Type":"ContainerStarted","Data":"fab14509b4d8b0e7ea117d2f55a8e109aa2f1491b049626a48b4e562374876d5"} Mar 09 09:29:28 crc kubenswrapper[4792]: I0309 09:29:28.208579 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"80a8330d-39e6-4b5c-a696-4a1f8e1df1ab","Type":"ContainerStarted","Data":"4b627b47d64369c79111c1273048c7a7a04db8cf18db5ed827472650517b8d97"} Mar 09 09:29:28 crc kubenswrapper[4792]: I0309 09:29:28.234889 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.234868082 podStartE2EDuration="2.234868082s" podCreationTimestamp="2026-03-09 09:29:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:29:28.234590133 +0000 UTC m=+1333.264790885" watchObservedRunningTime="2026-03-09 09:29:28.234868082 +0000 UTC m=+1333.265068834" Mar 09 09:29:29 crc kubenswrapper[4792]: I0309 09:29:29.220211 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"d557a1d7-8c39-4058-82d7-f290f1ad60b0","Type":"ContainerStarted","Data":"b758214fded1a9f06ca9570c172b39706aa00cc5b6469d55e551b356ba5b3926"} Mar 09 09:29:29 crc kubenswrapper[4792]: I0309 09:29:29.220917 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 09 09:29:29 crc kubenswrapper[4792]: I0309 09:29:29.224348 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a43b6c63-0434-4cfa-a525-05e08f7ee87c","Type":"ContainerStarted","Data":"9a60da10547fda9027675c0c915ecf9b01f47863f07395f0f5c9bcf30ea9e132"} Mar 09 09:29:29 crc kubenswrapper[4792]: I0309 09:29:29.224401 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a43b6c63-0434-4cfa-a525-05e08f7ee87c","Type":"ContainerStarted","Data":"b3dab4b160779dd0942c91e6644a7a0f6cc15b4c084e04bbca0df71c1bda9c72"} Mar 09 09:29:29 crc kubenswrapper[4792]: I0309 09:29:29.224414 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a43b6c63-0434-4cfa-a525-05e08f7ee87c","Type":"ContainerStarted","Data":"5e93a7d55d839bdb35f6947f356a259501fadd7275f801a5d8bd16addcf8c4c1"} Mar 09 09:29:29 crc kubenswrapper[4792]: I0309 09:29:29.242187 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.623885426 podStartE2EDuration="8.242167641s" podCreationTimestamp="2026-03-09 09:29:21 +0000 UTC" firstStartedPulling="2026-03-09 09:29:22.742405945 +0000 UTC m=+1327.772606697" lastFinishedPulling="2026-03-09 09:29:28.36068815 +0000 UTC m=+1333.390888912" observedRunningTime="2026-03-09 09:29:29.240727559 +0000 UTC m=+1334.270928311" watchObservedRunningTime="2026-03-09 09:29:29.242167641 +0000 UTC m=+1334.272368393" Mar 09 09:29:29 crc kubenswrapper[4792]: I0309 09:29:29.273368 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/nova-api-0" podStartSLOduration=2.273344484 podStartE2EDuration="2.273344484s" podCreationTimestamp="2026-03-09 09:29:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:29:29.262296114 +0000 UTC m=+1334.292496866" watchObservedRunningTime="2026-03-09 09:29:29.273344484 +0000 UTC m=+1334.303545236" Mar 09 09:29:31 crc kubenswrapper[4792]: I0309 09:29:31.594297 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 09 09:29:35 crc kubenswrapper[4792]: I0309 09:29:35.281890 4792 generic.go:334] "Generic (PLEG): container finished" podID="f5f70ffc-88a1-470d-923c-78a1cc6aba7b" containerID="4a974df632b929a8818d4b5a23dffeee8227f916cb8145de0821d944b5d816a5" exitCode=0 Mar 09 09:29:35 crc kubenswrapper[4792]: I0309 09:29:35.282012 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-622mt" event={"ID":"f5f70ffc-88a1-470d-923c-78a1cc6aba7b","Type":"ContainerDied","Data":"4a974df632b929a8818d4b5a23dffeee8227f916cb8145de0821d944b5d816a5"} Mar 09 09:29:36 crc kubenswrapper[4792]: I0309 09:29:36.594158 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 09 09:29:36 crc kubenswrapper[4792]: I0309 09:29:36.624300 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 09 09:29:36 crc kubenswrapper[4792]: I0309 09:29:36.682288 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-622mt" Mar 09 09:29:36 crc kubenswrapper[4792]: I0309 09:29:36.748313 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5f70ffc-88a1-470d-923c-78a1cc6aba7b-combined-ca-bundle\") pod \"f5f70ffc-88a1-470d-923c-78a1cc6aba7b\" (UID: \"f5f70ffc-88a1-470d-923c-78a1cc6aba7b\") " Mar 09 09:29:36 crc kubenswrapper[4792]: I0309 09:29:36.748384 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f5f70ffc-88a1-470d-923c-78a1cc6aba7b-config-data\") pod \"f5f70ffc-88a1-470d-923c-78a1cc6aba7b\" (UID: \"f5f70ffc-88a1-470d-923c-78a1cc6aba7b\") " Mar 09 09:29:36 crc kubenswrapper[4792]: I0309 09:29:36.748476 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f5f70ffc-88a1-470d-923c-78a1cc6aba7b-scripts\") pod \"f5f70ffc-88a1-470d-923c-78a1cc6aba7b\" (UID: \"f5f70ffc-88a1-470d-923c-78a1cc6aba7b\") " Mar 09 09:29:36 crc kubenswrapper[4792]: I0309 09:29:36.748549 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjwh8\" (UniqueName: \"kubernetes.io/projected/f5f70ffc-88a1-470d-923c-78a1cc6aba7b-kube-api-access-pjwh8\") pod \"f5f70ffc-88a1-470d-923c-78a1cc6aba7b\" (UID: \"f5f70ffc-88a1-470d-923c-78a1cc6aba7b\") " Mar 09 09:29:36 crc kubenswrapper[4792]: I0309 09:29:36.767279 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f5f70ffc-88a1-470d-923c-78a1cc6aba7b-kube-api-access-pjwh8" (OuterVolumeSpecName: "kube-api-access-pjwh8") pod "f5f70ffc-88a1-470d-923c-78a1cc6aba7b" (UID: "f5f70ffc-88a1-470d-923c-78a1cc6aba7b"). InnerVolumeSpecName "kube-api-access-pjwh8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:29:36 crc kubenswrapper[4792]: I0309 09:29:36.771988 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f5f70ffc-88a1-470d-923c-78a1cc6aba7b-scripts" (OuterVolumeSpecName: "scripts") pod "f5f70ffc-88a1-470d-923c-78a1cc6aba7b" (UID: "f5f70ffc-88a1-470d-923c-78a1cc6aba7b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:29:36 crc kubenswrapper[4792]: I0309 09:29:36.789930 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f5f70ffc-88a1-470d-923c-78a1cc6aba7b-config-data" (OuterVolumeSpecName: "config-data") pod "f5f70ffc-88a1-470d-923c-78a1cc6aba7b" (UID: "f5f70ffc-88a1-470d-923c-78a1cc6aba7b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:29:36 crc kubenswrapper[4792]: I0309 09:29:36.816979 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f5f70ffc-88a1-470d-923c-78a1cc6aba7b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f5f70ffc-88a1-470d-923c-78a1cc6aba7b" (UID: "f5f70ffc-88a1-470d-923c-78a1cc6aba7b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:29:36 crc kubenswrapper[4792]: I0309 09:29:36.851114 4792 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5f70ffc-88a1-470d-923c-78a1cc6aba7b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 09:29:36 crc kubenswrapper[4792]: I0309 09:29:36.851158 4792 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f5f70ffc-88a1-470d-923c-78a1cc6aba7b-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 09:29:36 crc kubenswrapper[4792]: I0309 09:29:36.851169 4792 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f5f70ffc-88a1-470d-923c-78a1cc6aba7b-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 09:29:36 crc kubenswrapper[4792]: I0309 09:29:36.851180 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjwh8\" (UniqueName: \"kubernetes.io/projected/f5f70ffc-88a1-470d-923c-78a1cc6aba7b-kube-api-access-pjwh8\") on node \"crc\" DevicePath \"\"" Mar 09 09:29:37 crc kubenswrapper[4792]: I0309 09:29:37.304836 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-622mt" Mar 09 09:29:37 crc kubenswrapper[4792]: I0309 09:29:37.304868 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-622mt" event={"ID":"f5f70ffc-88a1-470d-923c-78a1cc6aba7b","Type":"ContainerDied","Data":"9b8be80a68ea4fa8956052f0b35881f4166825555742ed88179d9a55458c3284"} Mar 09 09:29:37 crc kubenswrapper[4792]: I0309 09:29:37.304926 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9b8be80a68ea4fa8956052f0b35881f4166825555742ed88179d9a55458c3284" Mar 09 09:29:37 crc kubenswrapper[4792]: I0309 09:29:37.355459 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 09 09:29:37 crc kubenswrapper[4792]: I0309 09:29:37.403344 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 09 09:29:37 crc kubenswrapper[4792]: E0309 09:29:37.404039 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5f70ffc-88a1-470d-923c-78a1cc6aba7b" containerName="nova-cell1-conductor-db-sync" Mar 09 09:29:37 crc kubenswrapper[4792]: I0309 09:29:37.404164 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5f70ffc-88a1-470d-923c-78a1cc6aba7b" containerName="nova-cell1-conductor-db-sync" Mar 09 09:29:37 crc kubenswrapper[4792]: I0309 09:29:37.404493 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="f5f70ffc-88a1-470d-923c-78a1cc6aba7b" containerName="nova-cell1-conductor-db-sync" Mar 09 09:29:37 crc kubenswrapper[4792]: I0309 09:29:37.405176 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 09 09:29:37 crc kubenswrapper[4792]: I0309 09:29:37.407859 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Mar 09 09:29:37 crc kubenswrapper[4792]: I0309 09:29:37.421753 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 09 09:29:37 crc kubenswrapper[4792]: I0309 09:29:37.461144 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65141e32-9490-4e93-9338-c9878770172e-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"65141e32-9490-4e93-9338-c9878770172e\") " pod="openstack/nova-cell1-conductor-0" Mar 09 09:29:37 crc kubenswrapper[4792]: I0309 09:29:37.461228 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bj9rj\" (UniqueName: \"kubernetes.io/projected/65141e32-9490-4e93-9338-c9878770172e-kube-api-access-bj9rj\") pod \"nova-cell1-conductor-0\" (UID: \"65141e32-9490-4e93-9338-c9878770172e\") " pod="openstack/nova-cell1-conductor-0" Mar 09 09:29:37 crc kubenswrapper[4792]: I0309 09:29:37.461282 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65141e32-9490-4e93-9338-c9878770172e-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"65141e32-9490-4e93-9338-c9878770172e\") " pod="openstack/nova-cell1-conductor-0" Mar 09 09:29:37 crc kubenswrapper[4792]: I0309 09:29:37.562708 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65141e32-9490-4e93-9338-c9878770172e-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"65141e32-9490-4e93-9338-c9878770172e\") " pod="openstack/nova-cell1-conductor-0" Mar 09 09:29:37 crc 
kubenswrapper[4792]: I0309 09:29:37.562784 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bj9rj\" (UniqueName: \"kubernetes.io/projected/65141e32-9490-4e93-9338-c9878770172e-kube-api-access-bj9rj\") pod \"nova-cell1-conductor-0\" (UID: \"65141e32-9490-4e93-9338-c9878770172e\") " pod="openstack/nova-cell1-conductor-0" Mar 09 09:29:37 crc kubenswrapper[4792]: I0309 09:29:37.562829 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65141e32-9490-4e93-9338-c9878770172e-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"65141e32-9490-4e93-9338-c9878770172e\") " pod="openstack/nova-cell1-conductor-0" Mar 09 09:29:37 crc kubenswrapper[4792]: I0309 09:29:37.577118 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65141e32-9490-4e93-9338-c9878770172e-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"65141e32-9490-4e93-9338-c9878770172e\") " pod="openstack/nova-cell1-conductor-0" Mar 09 09:29:37 crc kubenswrapper[4792]: I0309 09:29:37.579809 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65141e32-9490-4e93-9338-c9878770172e-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"65141e32-9490-4e93-9338-c9878770172e\") " pod="openstack/nova-cell1-conductor-0" Mar 09 09:29:37 crc kubenswrapper[4792]: I0309 09:29:37.592052 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bj9rj\" (UniqueName: \"kubernetes.io/projected/65141e32-9490-4e93-9338-c9878770172e-kube-api-access-bj9rj\") pod \"nova-cell1-conductor-0\" (UID: \"65141e32-9490-4e93-9338-c9878770172e\") " pod="openstack/nova-cell1-conductor-0" Mar 09 09:29:37 crc kubenswrapper[4792]: I0309 09:29:37.604437 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openstack/nova-api-0" Mar 09 09:29:37 crc kubenswrapper[4792]: I0309 09:29:37.604497 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 09 09:29:37 crc kubenswrapper[4792]: I0309 09:29:37.728664 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 09 09:29:38 crc kubenswrapper[4792]: I0309 09:29:38.261832 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 09 09:29:38 crc kubenswrapper[4792]: I0309 09:29:38.321557 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"65141e32-9490-4e93-9338-c9878770172e","Type":"ContainerStarted","Data":"afa6e230cdaac50b4b7f8e1f99706e3226e9ce432a5d240e072d7d0b918b62d8"} Mar 09 09:29:38 crc kubenswrapper[4792]: I0309 09:29:38.646269 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="a43b6c63-0434-4cfa-a525-05e08f7ee87c" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.184:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 09:29:38 crc kubenswrapper[4792]: I0309 09:29:38.688365 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="a43b6c63-0434-4cfa-a525-05e08f7ee87c" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.184:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 09:29:39 crc kubenswrapper[4792]: I0309 09:29:39.331538 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"65141e32-9490-4e93-9338-c9878770172e","Type":"ContainerStarted","Data":"253d1b017185cc74e99a85a92155520670fbe2ae7387b47a2be9cdf51dd54c23"} Mar 09 09:29:39 crc kubenswrapper[4792]: I0309 09:29:39.331733 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openstack/nova-cell1-conductor-0" Mar 09 09:29:39 crc kubenswrapper[4792]: I0309 09:29:39.349152 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.349131256 podStartE2EDuration="2.349131256s" podCreationTimestamp="2026-03-09 09:29:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:29:39.346528051 +0000 UTC m=+1344.376728803" watchObservedRunningTime="2026-03-09 09:29:39.349131256 +0000 UTC m=+1344.379332008" Mar 09 09:29:43 crc kubenswrapper[4792]: I0309 09:29:43.218748 4792 patch_prober.go:28] interesting pod/machine-config-daemon-97tth container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 09:29:43 crc kubenswrapper[4792]: I0309 09:29:43.219253 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 09:29:43 crc kubenswrapper[4792]: I0309 09:29:43.307853 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 09 09:29:43 crc kubenswrapper[4792]: I0309 09:29:43.372873 4792 generic.go:334] "Generic (PLEG): container finished" podID="16d5eb0a-cb03-4a91-999b-8bbd16a0cf85" containerID="4b4106e70eddba98c76e1a5ea72d5327d16d50543e5162660561fe897e84ded6" exitCode=137 Mar 09 09:29:43 crc kubenswrapper[4792]: I0309 09:29:43.372952 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"16d5eb0a-cb03-4a91-999b-8bbd16a0cf85","Type":"ContainerDied","Data":"4b4106e70eddba98c76e1a5ea72d5327d16d50543e5162660561fe897e84ded6"} Mar 09 09:29:43 crc kubenswrapper[4792]: I0309 09:29:43.381997 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 09 09:29:43 crc kubenswrapper[4792]: I0309 09:29:43.382013 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"941a2d27-f778-4bd5-9d47-00a485b24e44","Type":"ContainerDied","Data":"f5d7f734e56fe8acacefd3d006d7c0507336aabf243cfe706500fdbc7950200e"} Mar 09 09:29:43 crc kubenswrapper[4792]: I0309 09:29:43.382412 4792 scope.go:117] "RemoveContainer" containerID="f5d7f734e56fe8acacefd3d006d7c0507336aabf243cfe706500fdbc7950200e" Mar 09 09:29:43 crc kubenswrapper[4792]: I0309 09:29:43.382575 4792 generic.go:334] "Generic (PLEG): container finished" podID="941a2d27-f778-4bd5-9d47-00a485b24e44" containerID="f5d7f734e56fe8acacefd3d006d7c0507336aabf243cfe706500fdbc7950200e" exitCode=137 Mar 09 09:29:43 crc kubenswrapper[4792]: I0309 09:29:43.382785 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"941a2d27-f778-4bd5-9d47-00a485b24e44","Type":"ContainerDied","Data":"edb699e6f134ce75639f16e977d4e7796626f75cdb546c0b8df44ec947f613c1"} Mar 09 09:29:43 crc kubenswrapper[4792]: I0309 09:29:43.389655 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/941a2d27-f778-4bd5-9d47-00a485b24e44-combined-ca-bundle\") pod \"941a2d27-f778-4bd5-9d47-00a485b24e44\" (UID: \"941a2d27-f778-4bd5-9d47-00a485b24e44\") " Mar 09 09:29:43 crc kubenswrapper[4792]: I0309 09:29:43.389989 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/941a2d27-f778-4bd5-9d47-00a485b24e44-logs\") pod \"941a2d27-f778-4bd5-9d47-00a485b24e44\" (UID: \"941a2d27-f778-4bd5-9d47-00a485b24e44\") " Mar 09 09:29:43 crc kubenswrapper[4792]: I0309 09:29:43.390111 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/941a2d27-f778-4bd5-9d47-00a485b24e44-config-data\") pod \"941a2d27-f778-4bd5-9d47-00a485b24e44\" (UID: \"941a2d27-f778-4bd5-9d47-00a485b24e44\") " Mar 09 09:29:43 crc kubenswrapper[4792]: I0309 09:29:43.390391 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ltl5g\" (UniqueName: \"kubernetes.io/projected/941a2d27-f778-4bd5-9d47-00a485b24e44-kube-api-access-ltl5g\") pod \"941a2d27-f778-4bd5-9d47-00a485b24e44\" (UID: \"941a2d27-f778-4bd5-9d47-00a485b24e44\") " Mar 09 09:29:43 crc kubenswrapper[4792]: I0309 09:29:43.390536 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/941a2d27-f778-4bd5-9d47-00a485b24e44-logs" (OuterVolumeSpecName: "logs") pod "941a2d27-f778-4bd5-9d47-00a485b24e44" (UID: "941a2d27-f778-4bd5-9d47-00a485b24e44"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:29:43 crc kubenswrapper[4792]: I0309 09:29:43.391096 4792 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/941a2d27-f778-4bd5-9d47-00a485b24e44-logs\") on node \"crc\" DevicePath \"\"" Mar 09 09:29:43 crc kubenswrapper[4792]: I0309 09:29:43.399444 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/941a2d27-f778-4bd5-9d47-00a485b24e44-kube-api-access-ltl5g" (OuterVolumeSpecName: "kube-api-access-ltl5g") pod "941a2d27-f778-4bd5-9d47-00a485b24e44" (UID: "941a2d27-f778-4bd5-9d47-00a485b24e44"). InnerVolumeSpecName "kube-api-access-ltl5g". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:29:43 crc kubenswrapper[4792]: I0309 09:29:43.428942 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/941a2d27-f778-4bd5-9d47-00a485b24e44-config-data" (OuterVolumeSpecName: "config-data") pod "941a2d27-f778-4bd5-9d47-00a485b24e44" (UID: "941a2d27-f778-4bd5-9d47-00a485b24e44"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:29:43 crc kubenswrapper[4792]: I0309 09:29:43.433834 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/941a2d27-f778-4bd5-9d47-00a485b24e44-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "941a2d27-f778-4bd5-9d47-00a485b24e44" (UID: "941a2d27-f778-4bd5-9d47-00a485b24e44"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:29:43 crc kubenswrapper[4792]: I0309 09:29:43.489018 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 09 09:29:43 crc kubenswrapper[4792]: I0309 09:29:43.494115 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ltl5g\" (UniqueName: \"kubernetes.io/projected/941a2d27-f778-4bd5-9d47-00a485b24e44-kube-api-access-ltl5g\") on node \"crc\" DevicePath \"\"" Mar 09 09:29:43 crc kubenswrapper[4792]: I0309 09:29:43.494150 4792 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/941a2d27-f778-4bd5-9d47-00a485b24e44-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 09:29:43 crc kubenswrapper[4792]: I0309 09:29:43.494163 4792 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/941a2d27-f778-4bd5-9d47-00a485b24e44-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 09:29:43 crc kubenswrapper[4792]: I0309 09:29:43.494318 4792 scope.go:117] "RemoveContainer" containerID="ce977c4f40c8af00d93860170dcb0a171b69506e355ae57941711080b0b4d0b5" Mar 09 09:29:43 crc kubenswrapper[4792]: I0309 09:29:43.523002 4792 scope.go:117] "RemoveContainer" containerID="f5d7f734e56fe8acacefd3d006d7c0507336aabf243cfe706500fdbc7950200e" Mar 09 09:29:43 crc kubenswrapper[4792]: E0309 09:29:43.525269 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f5d7f734e56fe8acacefd3d006d7c0507336aabf243cfe706500fdbc7950200e\": container with ID starting with f5d7f734e56fe8acacefd3d006d7c0507336aabf243cfe706500fdbc7950200e not found: ID does not exist" containerID="f5d7f734e56fe8acacefd3d006d7c0507336aabf243cfe706500fdbc7950200e" Mar 09 09:29:43 crc kubenswrapper[4792]: I0309 09:29:43.525316 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f5d7f734e56fe8acacefd3d006d7c0507336aabf243cfe706500fdbc7950200e"} err="failed to get container status 
\"f5d7f734e56fe8acacefd3d006d7c0507336aabf243cfe706500fdbc7950200e\": rpc error: code = NotFound desc = could not find container \"f5d7f734e56fe8acacefd3d006d7c0507336aabf243cfe706500fdbc7950200e\": container with ID starting with f5d7f734e56fe8acacefd3d006d7c0507336aabf243cfe706500fdbc7950200e not found: ID does not exist" Mar 09 09:29:43 crc kubenswrapper[4792]: I0309 09:29:43.525344 4792 scope.go:117] "RemoveContainer" containerID="ce977c4f40c8af00d93860170dcb0a171b69506e355ae57941711080b0b4d0b5" Mar 09 09:29:43 crc kubenswrapper[4792]: E0309 09:29:43.525700 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ce977c4f40c8af00d93860170dcb0a171b69506e355ae57941711080b0b4d0b5\": container with ID starting with ce977c4f40c8af00d93860170dcb0a171b69506e355ae57941711080b0b4d0b5 not found: ID does not exist" containerID="ce977c4f40c8af00d93860170dcb0a171b69506e355ae57941711080b0b4d0b5" Mar 09 09:29:43 crc kubenswrapper[4792]: I0309 09:29:43.525731 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce977c4f40c8af00d93860170dcb0a171b69506e355ae57941711080b0b4d0b5"} err="failed to get container status \"ce977c4f40c8af00d93860170dcb0a171b69506e355ae57941711080b0b4d0b5\": rpc error: code = NotFound desc = could not find container \"ce977c4f40c8af00d93860170dcb0a171b69506e355ae57941711080b0b4d0b5\": container with ID starting with ce977c4f40c8af00d93860170dcb0a171b69506e355ae57941711080b0b4d0b5 not found: ID does not exist" Mar 09 09:29:43 crc kubenswrapper[4792]: I0309 09:29:43.596494 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16d5eb0a-cb03-4a91-999b-8bbd16a0cf85-combined-ca-bundle\") pod \"16d5eb0a-cb03-4a91-999b-8bbd16a0cf85\" (UID: \"16d5eb0a-cb03-4a91-999b-8bbd16a0cf85\") " Mar 09 09:29:43 crc kubenswrapper[4792]: I0309 09:29:43.597259 4792 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-78nzp\" (UniqueName: \"kubernetes.io/projected/16d5eb0a-cb03-4a91-999b-8bbd16a0cf85-kube-api-access-78nzp\") pod \"16d5eb0a-cb03-4a91-999b-8bbd16a0cf85\" (UID: \"16d5eb0a-cb03-4a91-999b-8bbd16a0cf85\") " Mar 09 09:29:43 crc kubenswrapper[4792]: I0309 09:29:43.597534 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16d5eb0a-cb03-4a91-999b-8bbd16a0cf85-config-data\") pod \"16d5eb0a-cb03-4a91-999b-8bbd16a0cf85\" (UID: \"16d5eb0a-cb03-4a91-999b-8bbd16a0cf85\") " Mar 09 09:29:43 crc kubenswrapper[4792]: I0309 09:29:43.603422 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16d5eb0a-cb03-4a91-999b-8bbd16a0cf85-kube-api-access-78nzp" (OuterVolumeSpecName: "kube-api-access-78nzp") pod "16d5eb0a-cb03-4a91-999b-8bbd16a0cf85" (UID: "16d5eb0a-cb03-4a91-999b-8bbd16a0cf85"). InnerVolumeSpecName "kube-api-access-78nzp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:29:43 crc kubenswrapper[4792]: I0309 09:29:43.631173 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16d5eb0a-cb03-4a91-999b-8bbd16a0cf85-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "16d5eb0a-cb03-4a91-999b-8bbd16a0cf85" (UID: "16d5eb0a-cb03-4a91-999b-8bbd16a0cf85"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:29:43 crc kubenswrapper[4792]: I0309 09:29:43.632778 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16d5eb0a-cb03-4a91-999b-8bbd16a0cf85-config-data" (OuterVolumeSpecName: "config-data") pod "16d5eb0a-cb03-4a91-999b-8bbd16a0cf85" (UID: "16d5eb0a-cb03-4a91-999b-8bbd16a0cf85"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:29:43 crc kubenswrapper[4792]: I0309 09:29:43.701388 4792 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16d5eb0a-cb03-4a91-999b-8bbd16a0cf85-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 09:29:43 crc kubenswrapper[4792]: I0309 09:29:43.701528 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-78nzp\" (UniqueName: \"kubernetes.io/projected/16d5eb0a-cb03-4a91-999b-8bbd16a0cf85-kube-api-access-78nzp\") on node \"crc\" DevicePath \"\"" Mar 09 09:29:43 crc kubenswrapper[4792]: I0309 09:29:43.701545 4792 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16d5eb0a-cb03-4a91-999b-8bbd16a0cf85-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 09:29:43 crc kubenswrapper[4792]: I0309 09:29:43.732448 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 09 09:29:43 crc kubenswrapper[4792]: I0309 09:29:43.748437 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 09 09:29:43 crc kubenswrapper[4792]: I0309 09:29:43.761650 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 09 09:29:43 crc kubenswrapper[4792]: E0309 09:29:43.762035 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16d5eb0a-cb03-4a91-999b-8bbd16a0cf85" containerName="nova-cell1-novncproxy-novncproxy" Mar 09 09:29:43 crc kubenswrapper[4792]: I0309 09:29:43.762048 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="16d5eb0a-cb03-4a91-999b-8bbd16a0cf85" containerName="nova-cell1-novncproxy-novncproxy" Mar 09 09:29:43 crc kubenswrapper[4792]: E0309 09:29:43.762097 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="941a2d27-f778-4bd5-9d47-00a485b24e44" containerName="nova-metadata-log" Mar 09 09:29:43 crc kubenswrapper[4792]: 
I0309 09:29:43.762104 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="941a2d27-f778-4bd5-9d47-00a485b24e44" containerName="nova-metadata-log" Mar 09 09:29:43 crc kubenswrapper[4792]: E0309 09:29:43.762114 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="941a2d27-f778-4bd5-9d47-00a485b24e44" containerName="nova-metadata-metadata" Mar 09 09:29:43 crc kubenswrapper[4792]: I0309 09:29:43.762120 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="941a2d27-f778-4bd5-9d47-00a485b24e44" containerName="nova-metadata-metadata" Mar 09 09:29:43 crc kubenswrapper[4792]: I0309 09:29:43.762283 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="16d5eb0a-cb03-4a91-999b-8bbd16a0cf85" containerName="nova-cell1-novncproxy-novncproxy" Mar 09 09:29:43 crc kubenswrapper[4792]: I0309 09:29:43.762301 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="941a2d27-f778-4bd5-9d47-00a485b24e44" containerName="nova-metadata-metadata" Mar 09 09:29:43 crc kubenswrapper[4792]: I0309 09:29:43.762316 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="941a2d27-f778-4bd5-9d47-00a485b24e44" containerName="nova-metadata-log" Mar 09 09:29:43 crc kubenswrapper[4792]: I0309 09:29:43.763177 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 09 09:29:43 crc kubenswrapper[4792]: I0309 09:29:43.765708 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 09 09:29:43 crc kubenswrapper[4792]: I0309 09:29:43.766498 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 09 09:29:43 crc kubenswrapper[4792]: I0309 09:29:43.783468 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 09 09:29:43 crc kubenswrapper[4792]: I0309 09:29:43.802991 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zldjt\" (UniqueName: \"kubernetes.io/projected/a7b96e62-a545-4c9b-946b-b7c6a12ceca6-kube-api-access-zldjt\") pod \"nova-metadata-0\" (UID: \"a7b96e62-a545-4c9b-946b-b7c6a12ceca6\") " pod="openstack/nova-metadata-0" Mar 09 09:29:43 crc kubenswrapper[4792]: I0309 09:29:43.803172 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7b96e62-a545-4c9b-946b-b7c6a12ceca6-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"a7b96e62-a545-4c9b-946b-b7c6a12ceca6\") " pod="openstack/nova-metadata-0" Mar 09 09:29:43 crc kubenswrapper[4792]: I0309 09:29:43.803246 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7b96e62-a545-4c9b-946b-b7c6a12ceca6-config-data\") pod \"nova-metadata-0\" (UID: \"a7b96e62-a545-4c9b-946b-b7c6a12ceca6\") " pod="openstack/nova-metadata-0" Mar 09 09:29:43 crc kubenswrapper[4792]: I0309 09:29:43.803293 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a7b96e62-a545-4c9b-946b-b7c6a12ceca6-logs\") pod \"nova-metadata-0\" (UID: 
\"a7b96e62-a545-4c9b-946b-b7c6a12ceca6\") " pod="openstack/nova-metadata-0" Mar 09 09:29:43 crc kubenswrapper[4792]: I0309 09:29:43.803351 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/a7b96e62-a545-4c9b-946b-b7c6a12ceca6-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"a7b96e62-a545-4c9b-946b-b7c6a12ceca6\") " pod="openstack/nova-metadata-0" Mar 09 09:29:43 crc kubenswrapper[4792]: I0309 09:29:43.905713 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/a7b96e62-a545-4c9b-946b-b7c6a12ceca6-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"a7b96e62-a545-4c9b-946b-b7c6a12ceca6\") " pod="openstack/nova-metadata-0" Mar 09 09:29:43 crc kubenswrapper[4792]: I0309 09:29:43.905850 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zldjt\" (UniqueName: \"kubernetes.io/projected/a7b96e62-a545-4c9b-946b-b7c6a12ceca6-kube-api-access-zldjt\") pod \"nova-metadata-0\" (UID: \"a7b96e62-a545-4c9b-946b-b7c6a12ceca6\") " pod="openstack/nova-metadata-0" Mar 09 09:29:43 crc kubenswrapper[4792]: I0309 09:29:43.905911 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7b96e62-a545-4c9b-946b-b7c6a12ceca6-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"a7b96e62-a545-4c9b-946b-b7c6a12ceca6\") " pod="openstack/nova-metadata-0" Mar 09 09:29:43 crc kubenswrapper[4792]: I0309 09:29:43.906006 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7b96e62-a545-4c9b-946b-b7c6a12ceca6-config-data\") pod \"nova-metadata-0\" (UID: \"a7b96e62-a545-4c9b-946b-b7c6a12ceca6\") " pod="openstack/nova-metadata-0" Mar 09 09:29:43 crc kubenswrapper[4792]: I0309 
09:29:43.906044 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a7b96e62-a545-4c9b-946b-b7c6a12ceca6-logs\") pod \"nova-metadata-0\" (UID: \"a7b96e62-a545-4c9b-946b-b7c6a12ceca6\") " pod="openstack/nova-metadata-0" Mar 09 09:29:43 crc kubenswrapper[4792]: I0309 09:29:43.906470 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a7b96e62-a545-4c9b-946b-b7c6a12ceca6-logs\") pod \"nova-metadata-0\" (UID: \"a7b96e62-a545-4c9b-946b-b7c6a12ceca6\") " pod="openstack/nova-metadata-0" Mar 09 09:29:43 crc kubenswrapper[4792]: I0309 09:29:43.910625 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7b96e62-a545-4c9b-946b-b7c6a12ceca6-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"a7b96e62-a545-4c9b-946b-b7c6a12ceca6\") " pod="openstack/nova-metadata-0" Mar 09 09:29:43 crc kubenswrapper[4792]: I0309 09:29:43.910680 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7b96e62-a545-4c9b-946b-b7c6a12ceca6-config-data\") pod \"nova-metadata-0\" (UID: \"a7b96e62-a545-4c9b-946b-b7c6a12ceca6\") " pod="openstack/nova-metadata-0" Mar 09 09:29:43 crc kubenswrapper[4792]: I0309 09:29:43.910763 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/a7b96e62-a545-4c9b-946b-b7c6a12ceca6-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"a7b96e62-a545-4c9b-946b-b7c6a12ceca6\") " pod="openstack/nova-metadata-0" Mar 09 09:29:43 crc kubenswrapper[4792]: I0309 09:29:43.923097 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zldjt\" (UniqueName: \"kubernetes.io/projected/a7b96e62-a545-4c9b-946b-b7c6a12ceca6-kube-api-access-zldjt\") pod \"nova-metadata-0\" (UID: 
\"a7b96e62-a545-4c9b-946b-b7c6a12ceca6\") " pod="openstack/nova-metadata-0" Mar 09 09:29:44 crc kubenswrapper[4792]: I0309 09:29:44.090958 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 09 09:29:44 crc kubenswrapper[4792]: I0309 09:29:44.396469 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"16d5eb0a-cb03-4a91-999b-8bbd16a0cf85","Type":"ContainerDied","Data":"c1009e44913c1bd3fd937ea3b2d9e72ad5de37996074f012fb3d5f6663a137af"} Mar 09 09:29:44 crc kubenswrapper[4792]: I0309 09:29:44.396618 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 09 09:29:44 crc kubenswrapper[4792]: I0309 09:29:44.396834 4792 scope.go:117] "RemoveContainer" containerID="4b4106e70eddba98c76e1a5ea72d5327d16d50543e5162660561fe897e84ded6" Mar 09 09:29:44 crc kubenswrapper[4792]: I0309 09:29:44.433751 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 09 09:29:44 crc kubenswrapper[4792]: I0309 09:29:44.456556 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 09 09:29:44 crc kubenswrapper[4792]: I0309 09:29:44.473974 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 09 09:29:44 crc kubenswrapper[4792]: I0309 09:29:44.475250 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 09 09:29:44 crc kubenswrapper[4792]: I0309 09:29:44.483770 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Mar 09 09:29:44 crc kubenswrapper[4792]: I0309 09:29:44.484020 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Mar 09 09:29:44 crc kubenswrapper[4792]: I0309 09:29:44.484190 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Mar 09 09:29:44 crc kubenswrapper[4792]: I0309 09:29:44.488030 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 09 09:29:44 crc kubenswrapper[4792]: I0309 09:29:44.522037 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rr54c\" (UniqueName: \"kubernetes.io/projected/8adff3b6-586a-445f-adee-c3f412b874c0-kube-api-access-rr54c\") pod \"nova-cell1-novncproxy-0\" (UID: \"8adff3b6-586a-445f-adee-c3f412b874c0\") " pod="openstack/nova-cell1-novncproxy-0" Mar 09 09:29:44 crc kubenswrapper[4792]: I0309 09:29:44.522131 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/8adff3b6-586a-445f-adee-c3f412b874c0-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"8adff3b6-586a-445f-adee-c3f412b874c0\") " pod="openstack/nova-cell1-novncproxy-0" Mar 09 09:29:44 crc kubenswrapper[4792]: I0309 09:29:44.522274 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8adff3b6-586a-445f-adee-c3f412b874c0-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"8adff3b6-586a-445f-adee-c3f412b874c0\") " pod="openstack/nova-cell1-novncproxy-0" Mar 
09 09:29:44 crc kubenswrapper[4792]: I0309 09:29:44.522314 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/8adff3b6-586a-445f-adee-c3f412b874c0-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"8adff3b6-586a-445f-adee-c3f412b874c0\") " pod="openstack/nova-cell1-novncproxy-0" Mar 09 09:29:44 crc kubenswrapper[4792]: I0309 09:29:44.522338 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8adff3b6-586a-445f-adee-c3f412b874c0-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"8adff3b6-586a-445f-adee-c3f412b874c0\") " pod="openstack/nova-cell1-novncproxy-0" Mar 09 09:29:44 crc kubenswrapper[4792]: I0309 09:29:44.584175 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 09 09:29:44 crc kubenswrapper[4792]: W0309 09:29:44.600336 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda7b96e62_a545_4c9b_946b_b7c6a12ceca6.slice/crio-2eb63bf6e97175d5a0553d080ba4afda12fa1d8c826c4fbca9c438cc446fdb22 WatchSource:0}: Error finding container 2eb63bf6e97175d5a0553d080ba4afda12fa1d8c826c4fbca9c438cc446fdb22: Status 404 returned error can't find the container with id 2eb63bf6e97175d5a0553d080ba4afda12fa1d8c826c4fbca9c438cc446fdb22 Mar 09 09:29:44 crc kubenswrapper[4792]: I0309 09:29:44.625253 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8adff3b6-586a-445f-adee-c3f412b874c0-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"8adff3b6-586a-445f-adee-c3f412b874c0\") " pod="openstack/nova-cell1-novncproxy-0" Mar 09 09:29:44 crc kubenswrapper[4792]: I0309 09:29:44.625324 4792 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/8adff3b6-586a-445f-adee-c3f412b874c0-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"8adff3b6-586a-445f-adee-c3f412b874c0\") " pod="openstack/nova-cell1-novncproxy-0" Mar 09 09:29:44 crc kubenswrapper[4792]: I0309 09:29:44.625354 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8adff3b6-586a-445f-adee-c3f412b874c0-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"8adff3b6-586a-445f-adee-c3f412b874c0\") " pod="openstack/nova-cell1-novncproxy-0" Mar 09 09:29:44 crc kubenswrapper[4792]: I0309 09:29:44.625402 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rr54c\" (UniqueName: \"kubernetes.io/projected/8adff3b6-586a-445f-adee-c3f412b874c0-kube-api-access-rr54c\") pod \"nova-cell1-novncproxy-0\" (UID: \"8adff3b6-586a-445f-adee-c3f412b874c0\") " pod="openstack/nova-cell1-novncproxy-0" Mar 09 09:29:44 crc kubenswrapper[4792]: I0309 09:29:44.625439 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/8adff3b6-586a-445f-adee-c3f412b874c0-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"8adff3b6-586a-445f-adee-c3f412b874c0\") " pod="openstack/nova-cell1-novncproxy-0" Mar 09 09:29:44 crc kubenswrapper[4792]: I0309 09:29:44.632808 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/8adff3b6-586a-445f-adee-c3f412b874c0-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"8adff3b6-586a-445f-adee-c3f412b874c0\") " pod="openstack/nova-cell1-novncproxy-0" Mar 09 09:29:44 crc kubenswrapper[4792]: I0309 09:29:44.632961 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/8adff3b6-586a-445f-adee-c3f412b874c0-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"8adff3b6-586a-445f-adee-c3f412b874c0\") " pod="openstack/nova-cell1-novncproxy-0" Mar 09 09:29:44 crc kubenswrapper[4792]: I0309 09:29:44.635581 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8adff3b6-586a-445f-adee-c3f412b874c0-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"8adff3b6-586a-445f-adee-c3f412b874c0\") " pod="openstack/nova-cell1-novncproxy-0" Mar 09 09:29:44 crc kubenswrapper[4792]: I0309 09:29:44.638492 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8adff3b6-586a-445f-adee-c3f412b874c0-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"8adff3b6-586a-445f-adee-c3f412b874c0\") " pod="openstack/nova-cell1-novncproxy-0" Mar 09 09:29:44 crc kubenswrapper[4792]: I0309 09:29:44.646746 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rr54c\" (UniqueName: \"kubernetes.io/projected/8adff3b6-586a-445f-adee-c3f412b874c0-kube-api-access-rr54c\") pod \"nova-cell1-novncproxy-0\" (UID: \"8adff3b6-586a-445f-adee-c3f412b874c0\") " pod="openstack/nova-cell1-novncproxy-0" Mar 09 09:29:44 crc kubenswrapper[4792]: I0309 09:29:44.797603 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 09 09:29:45 crc kubenswrapper[4792]: I0309 09:29:45.269296 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 09 09:29:45 crc kubenswrapper[4792]: W0309 09:29:45.290312 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8adff3b6_586a_445f_adee_c3f412b874c0.slice/crio-127879fa7c1aae39fa2b642776e1a381a38b1b21eb17df05e2b38052a762f2ae WatchSource:0}: Error finding container 127879fa7c1aae39fa2b642776e1a381a38b1b21eb17df05e2b38052a762f2ae: Status 404 returned error can't find the container with id 127879fa7c1aae39fa2b642776e1a381a38b1b21eb17df05e2b38052a762f2ae Mar 09 09:29:45 crc kubenswrapper[4792]: I0309 09:29:45.414376 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a7b96e62-a545-4c9b-946b-b7c6a12ceca6","Type":"ContainerStarted","Data":"f24f496e8119c37fe02291e1ea7aea6b584c15ab1e6abbc078d529a59d6d3675"} Mar 09 09:29:45 crc kubenswrapper[4792]: I0309 09:29:45.414682 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a7b96e62-a545-4c9b-946b-b7c6a12ceca6","Type":"ContainerStarted","Data":"b43fee17507f9dc4486139a74d40156c3f8683fad426c7bbca441336900caebd"} Mar 09 09:29:45 crc kubenswrapper[4792]: I0309 09:29:45.414700 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a7b96e62-a545-4c9b-946b-b7c6a12ceca6","Type":"ContainerStarted","Data":"2eb63bf6e97175d5a0553d080ba4afda12fa1d8c826c4fbca9c438cc446fdb22"} Mar 09 09:29:45 crc kubenswrapper[4792]: I0309 09:29:45.420900 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"8adff3b6-586a-445f-adee-c3f412b874c0","Type":"ContainerStarted","Data":"127879fa7c1aae39fa2b642776e1a381a38b1b21eb17df05e2b38052a762f2ae"} Mar 09 09:29:45 
crc kubenswrapper[4792]: I0309 09:29:45.449973 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.449954064 podStartE2EDuration="2.449954064s" podCreationTimestamp="2026-03-09 09:29:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:29:45.447908514 +0000 UTC m=+1350.478109286" watchObservedRunningTime="2026-03-09 09:29:45.449954064 +0000 UTC m=+1350.480154816" Mar 09 09:29:45 crc kubenswrapper[4792]: I0309 09:29:45.694133 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="16d5eb0a-cb03-4a91-999b-8bbd16a0cf85" path="/var/lib/kubelet/pods/16d5eb0a-cb03-4a91-999b-8bbd16a0cf85/volumes" Mar 09 09:29:45 crc kubenswrapper[4792]: I0309 09:29:45.694840 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="941a2d27-f778-4bd5-9d47-00a485b24e44" path="/var/lib/kubelet/pods/941a2d27-f778-4bd5-9d47-00a485b24e44/volumes" Mar 09 09:29:46 crc kubenswrapper[4792]: I0309 09:29:46.435995 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"8adff3b6-586a-445f-adee-c3f412b874c0","Type":"ContainerStarted","Data":"f15c9e0b27dffdfaece9e8f2c3489a4611615b73afa966f4a194c877e56dcb10"} Mar 09 09:29:46 crc kubenswrapper[4792]: I0309 09:29:46.458298 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.458275572 podStartE2EDuration="2.458275572s" podCreationTimestamp="2026-03-09 09:29:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:29:46.456524962 +0000 UTC m=+1351.486725724" watchObservedRunningTime="2026-03-09 09:29:46.458275572 +0000 UTC m=+1351.488476324" Mar 09 09:29:47 crc kubenswrapper[4792]: I0309 09:29:47.608238 4792 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 09 09:29:47 crc kubenswrapper[4792]: I0309 09:29:47.608345 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 09 09:29:47 crc kubenswrapper[4792]: I0309 09:29:47.608782 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 09 09:29:47 crc kubenswrapper[4792]: I0309 09:29:47.608841 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 09 09:29:47 crc kubenswrapper[4792]: I0309 09:29:47.612554 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 09 09:29:47 crc kubenswrapper[4792]: I0309 09:29:47.613980 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 09 09:29:47 crc kubenswrapper[4792]: I0309 09:29:47.807766 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Mar 09 09:29:47 crc kubenswrapper[4792]: I0309 09:29:47.859083 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6c74598c69-6shrv"] Mar 09 09:29:47 crc kubenswrapper[4792]: I0309 09:29:47.860913 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6c74598c69-6shrv" Mar 09 09:29:47 crc kubenswrapper[4792]: I0309 09:29:47.904757 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6c74598c69-6shrv"] Mar 09 09:29:47 crc kubenswrapper[4792]: I0309 09:29:47.996943 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/33ae6938-dd8e-4224-9522-a9b8638c8c54-config\") pod \"dnsmasq-dns-6c74598c69-6shrv\" (UID: \"33ae6938-dd8e-4224-9522-a9b8638c8c54\") " pod="openstack/dnsmasq-dns-6c74598c69-6shrv" Mar 09 09:29:47 crc kubenswrapper[4792]: I0309 09:29:47.997111 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/33ae6938-dd8e-4224-9522-a9b8638c8c54-dns-svc\") pod \"dnsmasq-dns-6c74598c69-6shrv\" (UID: \"33ae6938-dd8e-4224-9522-a9b8638c8c54\") " pod="openstack/dnsmasq-dns-6c74598c69-6shrv" Mar 09 09:29:47 crc kubenswrapper[4792]: I0309 09:29:47.997224 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/33ae6938-dd8e-4224-9522-a9b8638c8c54-ovsdbserver-nb\") pod \"dnsmasq-dns-6c74598c69-6shrv\" (UID: \"33ae6938-dd8e-4224-9522-a9b8638c8c54\") " pod="openstack/dnsmasq-dns-6c74598c69-6shrv" Mar 09 09:29:47 crc kubenswrapper[4792]: I0309 09:29:47.997326 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/33ae6938-dd8e-4224-9522-a9b8638c8c54-ovsdbserver-sb\") pod \"dnsmasq-dns-6c74598c69-6shrv\" (UID: \"33ae6938-dd8e-4224-9522-a9b8638c8c54\") " pod="openstack/dnsmasq-dns-6c74598c69-6shrv" Mar 09 09:29:47 crc kubenswrapper[4792]: I0309 09:29:47.997383 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-lcjq5\" (UniqueName: \"kubernetes.io/projected/33ae6938-dd8e-4224-9522-a9b8638c8c54-kube-api-access-lcjq5\") pod \"dnsmasq-dns-6c74598c69-6shrv\" (UID: \"33ae6938-dd8e-4224-9522-a9b8638c8c54\") " pod="openstack/dnsmasq-dns-6c74598c69-6shrv" Mar 09 09:29:48 crc kubenswrapper[4792]: I0309 09:29:48.098662 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/33ae6938-dd8e-4224-9522-a9b8638c8c54-ovsdbserver-sb\") pod \"dnsmasq-dns-6c74598c69-6shrv\" (UID: \"33ae6938-dd8e-4224-9522-a9b8638c8c54\") " pod="openstack/dnsmasq-dns-6c74598c69-6shrv" Mar 09 09:29:48 crc kubenswrapper[4792]: I0309 09:29:48.098735 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lcjq5\" (UniqueName: \"kubernetes.io/projected/33ae6938-dd8e-4224-9522-a9b8638c8c54-kube-api-access-lcjq5\") pod \"dnsmasq-dns-6c74598c69-6shrv\" (UID: \"33ae6938-dd8e-4224-9522-a9b8638c8c54\") " pod="openstack/dnsmasq-dns-6c74598c69-6shrv" Mar 09 09:29:48 crc kubenswrapper[4792]: I0309 09:29:48.098770 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/33ae6938-dd8e-4224-9522-a9b8638c8c54-config\") pod \"dnsmasq-dns-6c74598c69-6shrv\" (UID: \"33ae6938-dd8e-4224-9522-a9b8638c8c54\") " pod="openstack/dnsmasq-dns-6c74598c69-6shrv" Mar 09 09:29:48 crc kubenswrapper[4792]: I0309 09:29:48.098839 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/33ae6938-dd8e-4224-9522-a9b8638c8c54-dns-svc\") pod \"dnsmasq-dns-6c74598c69-6shrv\" (UID: \"33ae6938-dd8e-4224-9522-a9b8638c8c54\") " pod="openstack/dnsmasq-dns-6c74598c69-6shrv" Mar 09 09:29:48 crc kubenswrapper[4792]: I0309 09:29:48.098919 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/33ae6938-dd8e-4224-9522-a9b8638c8c54-ovsdbserver-nb\") pod \"dnsmasq-dns-6c74598c69-6shrv\" (UID: \"33ae6938-dd8e-4224-9522-a9b8638c8c54\") " pod="openstack/dnsmasq-dns-6c74598c69-6shrv" Mar 09 09:29:48 crc kubenswrapper[4792]: I0309 09:29:48.099804 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/33ae6938-dd8e-4224-9522-a9b8638c8c54-ovsdbserver-nb\") pod \"dnsmasq-dns-6c74598c69-6shrv\" (UID: \"33ae6938-dd8e-4224-9522-a9b8638c8c54\") " pod="openstack/dnsmasq-dns-6c74598c69-6shrv" Mar 09 09:29:48 crc kubenswrapper[4792]: I0309 09:29:48.100371 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/33ae6938-dd8e-4224-9522-a9b8638c8c54-ovsdbserver-sb\") pod \"dnsmasq-dns-6c74598c69-6shrv\" (UID: \"33ae6938-dd8e-4224-9522-a9b8638c8c54\") " pod="openstack/dnsmasq-dns-6c74598c69-6shrv" Mar 09 09:29:48 crc kubenswrapper[4792]: I0309 09:29:48.102224 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/33ae6938-dd8e-4224-9522-a9b8638c8c54-dns-svc\") pod \"dnsmasq-dns-6c74598c69-6shrv\" (UID: \"33ae6938-dd8e-4224-9522-a9b8638c8c54\") " pod="openstack/dnsmasq-dns-6c74598c69-6shrv" Mar 09 09:29:48 crc kubenswrapper[4792]: I0309 09:29:48.102766 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/33ae6938-dd8e-4224-9522-a9b8638c8c54-config\") pod \"dnsmasq-dns-6c74598c69-6shrv\" (UID: \"33ae6938-dd8e-4224-9522-a9b8638c8c54\") " pod="openstack/dnsmasq-dns-6c74598c69-6shrv" Mar 09 09:29:48 crc kubenswrapper[4792]: I0309 09:29:48.151797 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lcjq5\" (UniqueName: \"kubernetes.io/projected/33ae6938-dd8e-4224-9522-a9b8638c8c54-kube-api-access-lcjq5\") pod 
\"dnsmasq-dns-6c74598c69-6shrv\" (UID: \"33ae6938-dd8e-4224-9522-a9b8638c8c54\") " pod="openstack/dnsmasq-dns-6c74598c69-6shrv" Mar 09 09:29:48 crc kubenswrapper[4792]: I0309 09:29:48.199832 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6c74598c69-6shrv" Mar 09 09:29:48 crc kubenswrapper[4792]: I0309 09:29:48.763660 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6c74598c69-6shrv"] Mar 09 09:29:49 crc kubenswrapper[4792]: I0309 09:29:49.092039 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 09 09:29:49 crc kubenswrapper[4792]: I0309 09:29:49.092448 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 09 09:29:49 crc kubenswrapper[4792]: I0309 09:29:49.477096 4792 generic.go:334] "Generic (PLEG): container finished" podID="33ae6938-dd8e-4224-9522-a9b8638c8c54" containerID="ea45b9054f0415dd359b4b57cdaf77611c1469f7bbd9082b8447686a15720167" exitCode=0 Mar 09 09:29:49 crc kubenswrapper[4792]: I0309 09:29:49.478195 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c74598c69-6shrv" event={"ID":"33ae6938-dd8e-4224-9522-a9b8638c8c54","Type":"ContainerDied","Data":"ea45b9054f0415dd359b4b57cdaf77611c1469f7bbd9082b8447686a15720167"} Mar 09 09:29:49 crc kubenswrapper[4792]: I0309 09:29:49.478230 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c74598c69-6shrv" event={"ID":"33ae6938-dd8e-4224-9522-a9b8638c8c54","Type":"ContainerStarted","Data":"32dc4c2ea711b8a732a84a026bd3993ba22ae1779d4ac43055a69810990c1b31"} Mar 09 09:29:49 crc kubenswrapper[4792]: I0309 09:29:49.799420 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Mar 09 09:29:49 crc kubenswrapper[4792]: I0309 09:29:49.945871 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/ceilometer-0"] Mar 09 09:29:49 crc kubenswrapper[4792]: I0309 09:29:49.946642 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d557a1d7-8c39-4058-82d7-f290f1ad60b0" containerName="ceilometer-central-agent" containerID="cri-o://45122f2fb7a038895fcdb736b2370681efcc74a0dbbe428f1652b521a02b9cf8" gracePeriod=30 Mar 09 09:29:49 crc kubenswrapper[4792]: I0309 09:29:49.946685 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d557a1d7-8c39-4058-82d7-f290f1ad60b0" containerName="sg-core" containerID="cri-o://54015b43ca8b4b1e8e9e69282aea4a91ea1b52223fce1b5fe77f15a07d3d07cd" gracePeriod=30 Mar 09 09:29:49 crc kubenswrapper[4792]: I0309 09:29:49.946790 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d557a1d7-8c39-4058-82d7-f290f1ad60b0" containerName="proxy-httpd" containerID="cri-o://b758214fded1a9f06ca9570c172b39706aa00cc5b6469d55e551b356ba5b3926" gracePeriod=30 Mar 09 09:29:49 crc kubenswrapper[4792]: I0309 09:29:49.946896 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d557a1d7-8c39-4058-82d7-f290f1ad60b0" containerName="ceilometer-notification-agent" containerID="cri-o://e65146499b15666d6b2f32336aef02d3365db5921060e8e322121085ec54bb62" gracePeriod=30 Mar 09 09:29:49 crc kubenswrapper[4792]: I0309 09:29:49.955724 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="d557a1d7-8c39-4058-82d7-f290f1ad60b0" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.0.182:3000/\": EOF" Mar 09 09:29:50 crc kubenswrapper[4792]: I0309 09:29:50.490632 4792 generic.go:334] "Generic (PLEG): container finished" podID="d557a1d7-8c39-4058-82d7-f290f1ad60b0" containerID="b758214fded1a9f06ca9570c172b39706aa00cc5b6469d55e551b356ba5b3926" exitCode=0 Mar 09 09:29:50 
crc kubenswrapper[4792]: I0309 09:29:50.490883 4792 generic.go:334] "Generic (PLEG): container finished" podID="d557a1d7-8c39-4058-82d7-f290f1ad60b0" containerID="54015b43ca8b4b1e8e9e69282aea4a91ea1b52223fce1b5fe77f15a07d3d07cd" exitCode=2 Mar 09 09:29:50 crc kubenswrapper[4792]: I0309 09:29:50.490892 4792 generic.go:334] "Generic (PLEG): container finished" podID="d557a1d7-8c39-4058-82d7-f290f1ad60b0" containerID="45122f2fb7a038895fcdb736b2370681efcc74a0dbbe428f1652b521a02b9cf8" exitCode=0 Mar 09 09:29:50 crc kubenswrapper[4792]: I0309 09:29:50.490724 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d557a1d7-8c39-4058-82d7-f290f1ad60b0","Type":"ContainerDied","Data":"b758214fded1a9f06ca9570c172b39706aa00cc5b6469d55e551b356ba5b3926"} Mar 09 09:29:50 crc kubenswrapper[4792]: I0309 09:29:50.490971 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d557a1d7-8c39-4058-82d7-f290f1ad60b0","Type":"ContainerDied","Data":"54015b43ca8b4b1e8e9e69282aea4a91ea1b52223fce1b5fe77f15a07d3d07cd"} Mar 09 09:29:50 crc kubenswrapper[4792]: I0309 09:29:50.490987 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d557a1d7-8c39-4058-82d7-f290f1ad60b0","Type":"ContainerDied","Data":"45122f2fb7a038895fcdb736b2370681efcc74a0dbbe428f1652b521a02b9cf8"} Mar 09 09:29:50 crc kubenswrapper[4792]: I0309 09:29:50.493496 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c74598c69-6shrv" event={"ID":"33ae6938-dd8e-4224-9522-a9b8638c8c54","Type":"ContainerStarted","Data":"e42d0097053816c2b84291655850053d2d6ec8c2668daf65eac099034ab2d71f"} Mar 09 09:29:50 crc kubenswrapper[4792]: I0309 09:29:50.493766 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6c74598c69-6shrv" Mar 09 09:29:50 crc kubenswrapper[4792]: I0309 09:29:50.678195 4792 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openstack/dnsmasq-dns-6c74598c69-6shrv" podStartSLOduration=3.6781710690000002 podStartE2EDuration="3.678171069s" podCreationTimestamp="2026-03-09 09:29:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:29:50.514343729 +0000 UTC m=+1355.544544491" watchObservedRunningTime="2026-03-09 09:29:50.678171069 +0000 UTC m=+1355.708371821" Mar 09 09:29:50 crc kubenswrapper[4792]: I0309 09:29:50.685331 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 09 09:29:50 crc kubenswrapper[4792]: I0309 09:29:50.685604 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="a43b6c63-0434-4cfa-a525-05e08f7ee87c" containerName="nova-api-log" containerID="cri-o://b3dab4b160779dd0942c91e6644a7a0f6cc15b4c084e04bbca0df71c1bda9c72" gracePeriod=30 Mar 09 09:29:50 crc kubenswrapper[4792]: I0309 09:29:50.686101 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="a43b6c63-0434-4cfa-a525-05e08f7ee87c" containerName="nova-api-api" containerID="cri-o://9a60da10547fda9027675c0c915ecf9b01f47863f07395f0f5c9bcf30ea9e132" gracePeriod=30 Mar 09 09:29:50 crc kubenswrapper[4792]: I0309 09:29:50.944512 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 09 09:29:51 crc kubenswrapper[4792]: I0309 09:29:51.060033 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/d557a1d7-8c39-4058-82d7-f290f1ad60b0-ceilometer-tls-certs\") pod \"d557a1d7-8c39-4058-82d7-f290f1ad60b0\" (UID: \"d557a1d7-8c39-4058-82d7-f290f1ad60b0\") " Mar 09 09:29:51 crc kubenswrapper[4792]: I0309 09:29:51.060107 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d557a1d7-8c39-4058-82d7-f290f1ad60b0-scripts\") pod \"d557a1d7-8c39-4058-82d7-f290f1ad60b0\" (UID: \"d557a1d7-8c39-4058-82d7-f290f1ad60b0\") " Mar 09 09:29:51 crc kubenswrapper[4792]: I0309 09:29:51.060152 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d557a1d7-8c39-4058-82d7-f290f1ad60b0-combined-ca-bundle\") pod \"d557a1d7-8c39-4058-82d7-f290f1ad60b0\" (UID: \"d557a1d7-8c39-4058-82d7-f290f1ad60b0\") " Mar 09 09:29:51 crc kubenswrapper[4792]: I0309 09:29:51.060187 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnbzt\" (UniqueName: \"kubernetes.io/projected/d557a1d7-8c39-4058-82d7-f290f1ad60b0-kube-api-access-rnbzt\") pod \"d557a1d7-8c39-4058-82d7-f290f1ad60b0\" (UID: \"d557a1d7-8c39-4058-82d7-f290f1ad60b0\") " Mar 09 09:29:51 crc kubenswrapper[4792]: I0309 09:29:51.060205 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d557a1d7-8c39-4058-82d7-f290f1ad60b0-config-data\") pod \"d557a1d7-8c39-4058-82d7-f290f1ad60b0\" (UID: \"d557a1d7-8c39-4058-82d7-f290f1ad60b0\") " Mar 09 09:29:51 crc kubenswrapper[4792]: I0309 09:29:51.060267 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/d557a1d7-8c39-4058-82d7-f290f1ad60b0-log-httpd\") pod \"d557a1d7-8c39-4058-82d7-f290f1ad60b0\" (UID: \"d557a1d7-8c39-4058-82d7-f290f1ad60b0\") " Mar 09 09:29:51 crc kubenswrapper[4792]: I0309 09:29:51.060430 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d557a1d7-8c39-4058-82d7-f290f1ad60b0-run-httpd\") pod \"d557a1d7-8c39-4058-82d7-f290f1ad60b0\" (UID: \"d557a1d7-8c39-4058-82d7-f290f1ad60b0\") " Mar 09 09:29:51 crc kubenswrapper[4792]: I0309 09:29:51.060579 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d557a1d7-8c39-4058-82d7-f290f1ad60b0-sg-core-conf-yaml\") pod \"d557a1d7-8c39-4058-82d7-f290f1ad60b0\" (UID: \"d557a1d7-8c39-4058-82d7-f290f1ad60b0\") " Mar 09 09:29:51 crc kubenswrapper[4792]: I0309 09:29:51.060750 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d557a1d7-8c39-4058-82d7-f290f1ad60b0-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "d557a1d7-8c39-4058-82d7-f290f1ad60b0" (UID: "d557a1d7-8c39-4058-82d7-f290f1ad60b0"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:29:51 crc kubenswrapper[4792]: I0309 09:29:51.060878 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d557a1d7-8c39-4058-82d7-f290f1ad60b0-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "d557a1d7-8c39-4058-82d7-f290f1ad60b0" (UID: "d557a1d7-8c39-4058-82d7-f290f1ad60b0"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:29:51 crc kubenswrapper[4792]: I0309 09:29:51.061423 4792 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d557a1d7-8c39-4058-82d7-f290f1ad60b0-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 09 09:29:51 crc kubenswrapper[4792]: I0309 09:29:51.061439 4792 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d557a1d7-8c39-4058-82d7-f290f1ad60b0-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 09 09:29:51 crc kubenswrapper[4792]: I0309 09:29:51.072600 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d557a1d7-8c39-4058-82d7-f290f1ad60b0-scripts" (OuterVolumeSpecName: "scripts") pod "d557a1d7-8c39-4058-82d7-f290f1ad60b0" (UID: "d557a1d7-8c39-4058-82d7-f290f1ad60b0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:29:51 crc kubenswrapper[4792]: I0309 09:29:51.077268 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d557a1d7-8c39-4058-82d7-f290f1ad60b0-kube-api-access-rnbzt" (OuterVolumeSpecName: "kube-api-access-rnbzt") pod "d557a1d7-8c39-4058-82d7-f290f1ad60b0" (UID: "d557a1d7-8c39-4058-82d7-f290f1ad60b0"). InnerVolumeSpecName "kube-api-access-rnbzt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:29:51 crc kubenswrapper[4792]: I0309 09:29:51.106521 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d557a1d7-8c39-4058-82d7-f290f1ad60b0-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "d557a1d7-8c39-4058-82d7-f290f1ad60b0" (UID: "d557a1d7-8c39-4058-82d7-f290f1ad60b0"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:29:51 crc kubenswrapper[4792]: I0309 09:29:51.124450 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d557a1d7-8c39-4058-82d7-f290f1ad60b0-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "d557a1d7-8c39-4058-82d7-f290f1ad60b0" (UID: "d557a1d7-8c39-4058-82d7-f290f1ad60b0"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:29:51 crc kubenswrapper[4792]: I0309 09:29:51.154883 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d557a1d7-8c39-4058-82d7-f290f1ad60b0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d557a1d7-8c39-4058-82d7-f290f1ad60b0" (UID: "d557a1d7-8c39-4058-82d7-f290f1ad60b0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:29:51 crc kubenswrapper[4792]: I0309 09:29:51.163249 4792 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d557a1d7-8c39-4058-82d7-f290f1ad60b0-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 09 09:29:51 crc kubenswrapper[4792]: I0309 09:29:51.163289 4792 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/d557a1d7-8c39-4058-82d7-f290f1ad60b0-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 09 09:29:51 crc kubenswrapper[4792]: I0309 09:29:51.163300 4792 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d557a1d7-8c39-4058-82d7-f290f1ad60b0-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 09:29:51 crc kubenswrapper[4792]: I0309 09:29:51.163309 4792 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d557a1d7-8c39-4058-82d7-f290f1ad60b0-combined-ca-bundle\") on 
node \"crc\" DevicePath \"\"" Mar 09 09:29:51 crc kubenswrapper[4792]: I0309 09:29:51.163318 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnbzt\" (UniqueName: \"kubernetes.io/projected/d557a1d7-8c39-4058-82d7-f290f1ad60b0-kube-api-access-rnbzt\") on node \"crc\" DevicePath \"\"" Mar 09 09:29:51 crc kubenswrapper[4792]: I0309 09:29:51.182551 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d557a1d7-8c39-4058-82d7-f290f1ad60b0-config-data" (OuterVolumeSpecName: "config-data") pod "d557a1d7-8c39-4058-82d7-f290f1ad60b0" (UID: "d557a1d7-8c39-4058-82d7-f290f1ad60b0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:29:51 crc kubenswrapper[4792]: I0309 09:29:51.264445 4792 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d557a1d7-8c39-4058-82d7-f290f1ad60b0-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 09:29:51 crc kubenswrapper[4792]: I0309 09:29:51.505777 4792 generic.go:334] "Generic (PLEG): container finished" podID="a43b6c63-0434-4cfa-a525-05e08f7ee87c" containerID="b3dab4b160779dd0942c91e6644a7a0f6cc15b4c084e04bbca0df71c1bda9c72" exitCode=143 Mar 09 09:29:51 crc kubenswrapper[4792]: I0309 09:29:51.505846 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a43b6c63-0434-4cfa-a525-05e08f7ee87c","Type":"ContainerDied","Data":"b3dab4b160779dd0942c91e6644a7a0f6cc15b4c084e04bbca0df71c1bda9c72"} Mar 09 09:29:51 crc kubenswrapper[4792]: I0309 09:29:51.509387 4792 generic.go:334] "Generic (PLEG): container finished" podID="d557a1d7-8c39-4058-82d7-f290f1ad60b0" containerID="e65146499b15666d6b2f32336aef02d3365db5921060e8e322121085ec54bb62" exitCode=0 Mar 09 09:29:51 crc kubenswrapper[4792]: I0309 09:29:51.510359 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 09 09:29:51 crc kubenswrapper[4792]: I0309 09:29:51.512193 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d557a1d7-8c39-4058-82d7-f290f1ad60b0","Type":"ContainerDied","Data":"e65146499b15666d6b2f32336aef02d3365db5921060e8e322121085ec54bb62"} Mar 09 09:29:51 crc kubenswrapper[4792]: I0309 09:29:51.512248 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d557a1d7-8c39-4058-82d7-f290f1ad60b0","Type":"ContainerDied","Data":"91b56dbd7c1c3d2ffcc937846779e454a85ea88f690b5018d99bc40a0df192f4"} Mar 09 09:29:51 crc kubenswrapper[4792]: I0309 09:29:51.512271 4792 scope.go:117] "RemoveContainer" containerID="b758214fded1a9f06ca9570c172b39706aa00cc5b6469d55e551b356ba5b3926" Mar 09 09:29:51 crc kubenswrapper[4792]: I0309 09:29:51.537780 4792 scope.go:117] "RemoveContainer" containerID="54015b43ca8b4b1e8e9e69282aea4a91ea1b52223fce1b5fe77f15a07d3d07cd" Mar 09 09:29:51 crc kubenswrapper[4792]: I0309 09:29:51.556822 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 09 09:29:51 crc kubenswrapper[4792]: I0309 09:29:51.559905 4792 scope.go:117] "RemoveContainer" containerID="e65146499b15666d6b2f32336aef02d3365db5921060e8e322121085ec54bb62" Mar 09 09:29:51 crc kubenswrapper[4792]: I0309 09:29:51.583378 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 09 09:29:51 crc kubenswrapper[4792]: I0309 09:29:51.586978 4792 scope.go:117] "RemoveContainer" containerID="45122f2fb7a038895fcdb736b2370681efcc74a0dbbe428f1652b521a02b9cf8" Mar 09 09:29:51 crc kubenswrapper[4792]: I0309 09:29:51.590890 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 09 09:29:51 crc kubenswrapper[4792]: E0309 09:29:51.591267 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d557a1d7-8c39-4058-82d7-f290f1ad60b0" 
containerName="proxy-httpd" Mar 09 09:29:51 crc kubenswrapper[4792]: I0309 09:29:51.591287 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="d557a1d7-8c39-4058-82d7-f290f1ad60b0" containerName="proxy-httpd" Mar 09 09:29:51 crc kubenswrapper[4792]: E0309 09:29:51.591304 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d557a1d7-8c39-4058-82d7-f290f1ad60b0" containerName="ceilometer-notification-agent" Mar 09 09:29:51 crc kubenswrapper[4792]: I0309 09:29:51.591311 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="d557a1d7-8c39-4058-82d7-f290f1ad60b0" containerName="ceilometer-notification-agent" Mar 09 09:29:51 crc kubenswrapper[4792]: E0309 09:29:51.591331 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d557a1d7-8c39-4058-82d7-f290f1ad60b0" containerName="sg-core" Mar 09 09:29:51 crc kubenswrapper[4792]: I0309 09:29:51.591337 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="d557a1d7-8c39-4058-82d7-f290f1ad60b0" containerName="sg-core" Mar 09 09:29:51 crc kubenswrapper[4792]: E0309 09:29:51.591347 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d557a1d7-8c39-4058-82d7-f290f1ad60b0" containerName="ceilometer-central-agent" Mar 09 09:29:51 crc kubenswrapper[4792]: I0309 09:29:51.591353 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="d557a1d7-8c39-4058-82d7-f290f1ad60b0" containerName="ceilometer-central-agent" Mar 09 09:29:51 crc kubenswrapper[4792]: I0309 09:29:51.591509 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="d557a1d7-8c39-4058-82d7-f290f1ad60b0" containerName="ceilometer-notification-agent" Mar 09 09:29:51 crc kubenswrapper[4792]: I0309 09:29:51.591524 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="d557a1d7-8c39-4058-82d7-f290f1ad60b0" containerName="proxy-httpd" Mar 09 09:29:51 crc kubenswrapper[4792]: I0309 09:29:51.591531 4792 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="d557a1d7-8c39-4058-82d7-f290f1ad60b0" containerName="ceilometer-central-agent" Mar 09 09:29:51 crc kubenswrapper[4792]: I0309 09:29:51.591542 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="d557a1d7-8c39-4058-82d7-f290f1ad60b0" containerName="sg-core" Mar 09 09:29:51 crc kubenswrapper[4792]: I0309 09:29:51.592949 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 09 09:29:51 crc kubenswrapper[4792]: I0309 09:29:51.598539 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 09 09:29:51 crc kubenswrapper[4792]: I0309 09:29:51.602038 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 09 09:29:51 crc kubenswrapper[4792]: I0309 09:29:51.602312 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 09 09:29:51 crc kubenswrapper[4792]: I0309 09:29:51.610576 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 09 09:29:51 crc kubenswrapper[4792]: I0309 09:29:51.630744 4792 scope.go:117] "RemoveContainer" containerID="b758214fded1a9f06ca9570c172b39706aa00cc5b6469d55e551b356ba5b3926" Mar 09 09:29:51 crc kubenswrapper[4792]: E0309 09:29:51.641172 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b758214fded1a9f06ca9570c172b39706aa00cc5b6469d55e551b356ba5b3926\": container with ID starting with b758214fded1a9f06ca9570c172b39706aa00cc5b6469d55e551b356ba5b3926 not found: ID does not exist" containerID="b758214fded1a9f06ca9570c172b39706aa00cc5b6469d55e551b356ba5b3926" Mar 09 09:29:51 crc kubenswrapper[4792]: I0309 09:29:51.641216 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b758214fded1a9f06ca9570c172b39706aa00cc5b6469d55e551b356ba5b3926"} err="failed to get 
container status \"b758214fded1a9f06ca9570c172b39706aa00cc5b6469d55e551b356ba5b3926\": rpc error: code = NotFound desc = could not find container \"b758214fded1a9f06ca9570c172b39706aa00cc5b6469d55e551b356ba5b3926\": container with ID starting with b758214fded1a9f06ca9570c172b39706aa00cc5b6469d55e551b356ba5b3926 not found: ID does not exist" Mar 09 09:29:51 crc kubenswrapper[4792]: I0309 09:29:51.641242 4792 scope.go:117] "RemoveContainer" containerID="54015b43ca8b4b1e8e9e69282aea4a91ea1b52223fce1b5fe77f15a07d3d07cd" Mar 09 09:29:51 crc kubenswrapper[4792]: E0309 09:29:51.641607 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"54015b43ca8b4b1e8e9e69282aea4a91ea1b52223fce1b5fe77f15a07d3d07cd\": container with ID starting with 54015b43ca8b4b1e8e9e69282aea4a91ea1b52223fce1b5fe77f15a07d3d07cd not found: ID does not exist" containerID="54015b43ca8b4b1e8e9e69282aea4a91ea1b52223fce1b5fe77f15a07d3d07cd" Mar 09 09:29:51 crc kubenswrapper[4792]: I0309 09:29:51.641647 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"54015b43ca8b4b1e8e9e69282aea4a91ea1b52223fce1b5fe77f15a07d3d07cd"} err="failed to get container status \"54015b43ca8b4b1e8e9e69282aea4a91ea1b52223fce1b5fe77f15a07d3d07cd\": rpc error: code = NotFound desc = could not find container \"54015b43ca8b4b1e8e9e69282aea4a91ea1b52223fce1b5fe77f15a07d3d07cd\": container with ID starting with 54015b43ca8b4b1e8e9e69282aea4a91ea1b52223fce1b5fe77f15a07d3d07cd not found: ID does not exist" Mar 09 09:29:51 crc kubenswrapper[4792]: I0309 09:29:51.641678 4792 scope.go:117] "RemoveContainer" containerID="e65146499b15666d6b2f32336aef02d3365db5921060e8e322121085ec54bb62" Mar 09 09:29:51 crc kubenswrapper[4792]: E0309 09:29:51.642298 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"e65146499b15666d6b2f32336aef02d3365db5921060e8e322121085ec54bb62\": container with ID starting with e65146499b15666d6b2f32336aef02d3365db5921060e8e322121085ec54bb62 not found: ID does not exist" containerID="e65146499b15666d6b2f32336aef02d3365db5921060e8e322121085ec54bb62" Mar 09 09:29:51 crc kubenswrapper[4792]: I0309 09:29:51.642333 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e65146499b15666d6b2f32336aef02d3365db5921060e8e322121085ec54bb62"} err="failed to get container status \"e65146499b15666d6b2f32336aef02d3365db5921060e8e322121085ec54bb62\": rpc error: code = NotFound desc = could not find container \"e65146499b15666d6b2f32336aef02d3365db5921060e8e322121085ec54bb62\": container with ID starting with e65146499b15666d6b2f32336aef02d3365db5921060e8e322121085ec54bb62 not found: ID does not exist" Mar 09 09:29:51 crc kubenswrapper[4792]: I0309 09:29:51.642351 4792 scope.go:117] "RemoveContainer" containerID="45122f2fb7a038895fcdb736b2370681efcc74a0dbbe428f1652b521a02b9cf8" Mar 09 09:29:51 crc kubenswrapper[4792]: E0309 09:29:51.642894 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"45122f2fb7a038895fcdb736b2370681efcc74a0dbbe428f1652b521a02b9cf8\": container with ID starting with 45122f2fb7a038895fcdb736b2370681efcc74a0dbbe428f1652b521a02b9cf8 not found: ID does not exist" containerID="45122f2fb7a038895fcdb736b2370681efcc74a0dbbe428f1652b521a02b9cf8" Mar 09 09:29:51 crc kubenswrapper[4792]: I0309 09:29:51.642916 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45122f2fb7a038895fcdb736b2370681efcc74a0dbbe428f1652b521a02b9cf8"} err="failed to get container status \"45122f2fb7a038895fcdb736b2370681efcc74a0dbbe428f1652b521a02b9cf8\": rpc error: code = NotFound desc = could not find container \"45122f2fb7a038895fcdb736b2370681efcc74a0dbbe428f1652b521a02b9cf8\": container with ID 
starting with 45122f2fb7a038895fcdb736b2370681efcc74a0dbbe428f1652b521a02b9cf8 not found: ID does not exist" Mar 09 09:29:51 crc kubenswrapper[4792]: I0309 09:29:51.679330 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e842d045-4a00-445d-a41d-7f0b3677b6c5-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"e842d045-4a00-445d-a41d-7f0b3677b6c5\") " pod="openstack/ceilometer-0" Mar 09 09:29:51 crc kubenswrapper[4792]: I0309 09:29:51.679425 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2wvr8\" (UniqueName: \"kubernetes.io/projected/e842d045-4a00-445d-a41d-7f0b3677b6c5-kube-api-access-2wvr8\") pod \"ceilometer-0\" (UID: \"e842d045-4a00-445d-a41d-7f0b3677b6c5\") " pod="openstack/ceilometer-0" Mar 09 09:29:51 crc kubenswrapper[4792]: I0309 09:29:51.679469 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e842d045-4a00-445d-a41d-7f0b3677b6c5-log-httpd\") pod \"ceilometer-0\" (UID: \"e842d045-4a00-445d-a41d-7f0b3677b6c5\") " pod="openstack/ceilometer-0" Mar 09 09:29:51 crc kubenswrapper[4792]: I0309 09:29:51.679586 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e842d045-4a00-445d-a41d-7f0b3677b6c5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e842d045-4a00-445d-a41d-7f0b3677b6c5\") " pod="openstack/ceilometer-0" Mar 09 09:29:51 crc kubenswrapper[4792]: I0309 09:29:51.679623 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e842d045-4a00-445d-a41d-7f0b3677b6c5-scripts\") pod \"ceilometer-0\" (UID: \"e842d045-4a00-445d-a41d-7f0b3677b6c5\") " pod="openstack/ceilometer-0" Mar 09 
09:29:51 crc kubenswrapper[4792]: I0309 09:29:51.679733 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e842d045-4a00-445d-a41d-7f0b3677b6c5-run-httpd\") pod \"ceilometer-0\" (UID: \"e842d045-4a00-445d-a41d-7f0b3677b6c5\") " pod="openstack/ceilometer-0" Mar 09 09:29:51 crc kubenswrapper[4792]: I0309 09:29:51.679785 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e842d045-4a00-445d-a41d-7f0b3677b6c5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e842d045-4a00-445d-a41d-7f0b3677b6c5\") " pod="openstack/ceilometer-0" Mar 09 09:29:51 crc kubenswrapper[4792]: I0309 09:29:51.679856 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e842d045-4a00-445d-a41d-7f0b3677b6c5-config-data\") pod \"ceilometer-0\" (UID: \"e842d045-4a00-445d-a41d-7f0b3677b6c5\") " pod="openstack/ceilometer-0" Mar 09 09:29:51 crc kubenswrapper[4792]: I0309 09:29:51.690501 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d557a1d7-8c39-4058-82d7-f290f1ad60b0" path="/var/lib/kubelet/pods/d557a1d7-8c39-4058-82d7-f290f1ad60b0/volumes" Mar 09 09:29:51 crc kubenswrapper[4792]: I0309 09:29:51.781908 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e842d045-4a00-445d-a41d-7f0b3677b6c5-run-httpd\") pod \"ceilometer-0\" (UID: \"e842d045-4a00-445d-a41d-7f0b3677b6c5\") " pod="openstack/ceilometer-0" Mar 09 09:29:51 crc kubenswrapper[4792]: I0309 09:29:51.781964 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e842d045-4a00-445d-a41d-7f0b3677b6c5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: 
\"e842d045-4a00-445d-a41d-7f0b3677b6c5\") " pod="openstack/ceilometer-0" Mar 09 09:29:51 crc kubenswrapper[4792]: I0309 09:29:51.782015 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e842d045-4a00-445d-a41d-7f0b3677b6c5-config-data\") pod \"ceilometer-0\" (UID: \"e842d045-4a00-445d-a41d-7f0b3677b6c5\") " pod="openstack/ceilometer-0" Mar 09 09:29:51 crc kubenswrapper[4792]: I0309 09:29:51.782089 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e842d045-4a00-445d-a41d-7f0b3677b6c5-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"e842d045-4a00-445d-a41d-7f0b3677b6c5\") " pod="openstack/ceilometer-0" Mar 09 09:29:51 crc kubenswrapper[4792]: I0309 09:29:51.782120 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2wvr8\" (UniqueName: \"kubernetes.io/projected/e842d045-4a00-445d-a41d-7f0b3677b6c5-kube-api-access-2wvr8\") pod \"ceilometer-0\" (UID: \"e842d045-4a00-445d-a41d-7f0b3677b6c5\") " pod="openstack/ceilometer-0" Mar 09 09:29:51 crc kubenswrapper[4792]: I0309 09:29:51.782151 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e842d045-4a00-445d-a41d-7f0b3677b6c5-log-httpd\") pod \"ceilometer-0\" (UID: \"e842d045-4a00-445d-a41d-7f0b3677b6c5\") " pod="openstack/ceilometer-0" Mar 09 09:29:51 crc kubenswrapper[4792]: I0309 09:29:51.782212 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e842d045-4a00-445d-a41d-7f0b3677b6c5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e842d045-4a00-445d-a41d-7f0b3677b6c5\") " pod="openstack/ceilometer-0" Mar 09 09:29:51 crc kubenswrapper[4792]: I0309 09:29:51.782231 4792 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e842d045-4a00-445d-a41d-7f0b3677b6c5-scripts\") pod \"ceilometer-0\" (UID: \"e842d045-4a00-445d-a41d-7f0b3677b6c5\") " pod="openstack/ceilometer-0" Mar 09 09:29:51 crc kubenswrapper[4792]: I0309 09:29:51.783295 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e842d045-4a00-445d-a41d-7f0b3677b6c5-run-httpd\") pod \"ceilometer-0\" (UID: \"e842d045-4a00-445d-a41d-7f0b3677b6c5\") " pod="openstack/ceilometer-0" Mar 09 09:29:51 crc kubenswrapper[4792]: I0309 09:29:51.783680 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e842d045-4a00-445d-a41d-7f0b3677b6c5-log-httpd\") pod \"ceilometer-0\" (UID: \"e842d045-4a00-445d-a41d-7f0b3677b6c5\") " pod="openstack/ceilometer-0" Mar 09 09:29:51 crc kubenswrapper[4792]: I0309 09:29:51.788843 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e842d045-4a00-445d-a41d-7f0b3677b6c5-scripts\") pod \"ceilometer-0\" (UID: \"e842d045-4a00-445d-a41d-7f0b3677b6c5\") " pod="openstack/ceilometer-0" Mar 09 09:29:51 crc kubenswrapper[4792]: I0309 09:29:51.789251 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e842d045-4a00-445d-a41d-7f0b3677b6c5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e842d045-4a00-445d-a41d-7f0b3677b6c5\") " pod="openstack/ceilometer-0" Mar 09 09:29:51 crc kubenswrapper[4792]: I0309 09:29:51.789758 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e842d045-4a00-445d-a41d-7f0b3677b6c5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e842d045-4a00-445d-a41d-7f0b3677b6c5\") " pod="openstack/ceilometer-0" Mar 09 09:29:51 crc kubenswrapper[4792]: I0309 09:29:51.798225 4792 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e842d045-4a00-445d-a41d-7f0b3677b6c5-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"e842d045-4a00-445d-a41d-7f0b3677b6c5\") " pod="openstack/ceilometer-0" Mar 09 09:29:51 crc kubenswrapper[4792]: I0309 09:29:51.803221 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2wvr8\" (UniqueName: \"kubernetes.io/projected/e842d045-4a00-445d-a41d-7f0b3677b6c5-kube-api-access-2wvr8\") pod \"ceilometer-0\" (UID: \"e842d045-4a00-445d-a41d-7f0b3677b6c5\") " pod="openstack/ceilometer-0" Mar 09 09:29:51 crc kubenswrapper[4792]: I0309 09:29:51.803589 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e842d045-4a00-445d-a41d-7f0b3677b6c5-config-data\") pod \"ceilometer-0\" (UID: \"e842d045-4a00-445d-a41d-7f0b3677b6c5\") " pod="openstack/ceilometer-0" Mar 09 09:29:51 crc kubenswrapper[4792]: I0309 09:29:51.936857 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 09 09:29:52 crc kubenswrapper[4792]: I0309 09:29:52.425697 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 09 09:29:52 crc kubenswrapper[4792]: W0309 09:29:52.434743 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode842d045_4a00_445d_a41d_7f0b3677b6c5.slice/crio-4311c458f67108d32b0d3523ed6bc0b6203e3c71f44468aaae5adeb5efc47a79 WatchSource:0}: Error finding container 4311c458f67108d32b0d3523ed6bc0b6203e3c71f44468aaae5adeb5efc47a79: Status 404 returned error can't find the container with id 4311c458f67108d32b0d3523ed6bc0b6203e3c71f44468aaae5adeb5efc47a79 Mar 09 09:29:52 crc kubenswrapper[4792]: I0309 09:29:52.525445 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e842d045-4a00-445d-a41d-7f0b3677b6c5","Type":"ContainerStarted","Data":"4311c458f67108d32b0d3523ed6bc0b6203e3c71f44468aaae5adeb5efc47a79"} Mar 09 09:29:52 crc kubenswrapper[4792]: I0309 09:29:52.722814 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 09 09:29:53 crc kubenswrapper[4792]: I0309 09:29:53.533810 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e842d045-4a00-445d-a41d-7f0b3677b6c5","Type":"ContainerStarted","Data":"00fce43351ae1ae9c264222af34886f136ff1ce1456c169e0309cf91d9c1f480"} Mar 09 09:29:54 crc kubenswrapper[4792]: I0309 09:29:54.091584 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 09 09:29:54 crc kubenswrapper[4792]: I0309 09:29:54.091867 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 09 09:29:54 crc kubenswrapper[4792]: I0309 09:29:54.256018 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 09 09:29:54 crc kubenswrapper[4792]: I0309 09:29:54.328776 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cggqr\" (UniqueName: \"kubernetes.io/projected/a43b6c63-0434-4cfa-a525-05e08f7ee87c-kube-api-access-cggqr\") pod \"a43b6c63-0434-4cfa-a525-05e08f7ee87c\" (UID: \"a43b6c63-0434-4cfa-a525-05e08f7ee87c\") " Mar 09 09:29:54 crc kubenswrapper[4792]: I0309 09:29:54.328864 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a43b6c63-0434-4cfa-a525-05e08f7ee87c-config-data\") pod \"a43b6c63-0434-4cfa-a525-05e08f7ee87c\" (UID: \"a43b6c63-0434-4cfa-a525-05e08f7ee87c\") " Mar 09 09:29:54 crc kubenswrapper[4792]: I0309 09:29:54.328951 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a43b6c63-0434-4cfa-a525-05e08f7ee87c-combined-ca-bundle\") pod \"a43b6c63-0434-4cfa-a525-05e08f7ee87c\" (UID: \"a43b6c63-0434-4cfa-a525-05e08f7ee87c\") " Mar 09 09:29:54 crc kubenswrapper[4792]: I0309 09:29:54.329014 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a43b6c63-0434-4cfa-a525-05e08f7ee87c-logs\") pod \"a43b6c63-0434-4cfa-a525-05e08f7ee87c\" (UID: \"a43b6c63-0434-4cfa-a525-05e08f7ee87c\") " Mar 09 09:29:54 crc kubenswrapper[4792]: I0309 09:29:54.329922 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a43b6c63-0434-4cfa-a525-05e08f7ee87c-logs" (OuterVolumeSpecName: "logs") pod "a43b6c63-0434-4cfa-a525-05e08f7ee87c" (UID: "a43b6c63-0434-4cfa-a525-05e08f7ee87c"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:29:54 crc kubenswrapper[4792]: I0309 09:29:54.354739 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a43b6c63-0434-4cfa-a525-05e08f7ee87c-kube-api-access-cggqr" (OuterVolumeSpecName: "kube-api-access-cggqr") pod "a43b6c63-0434-4cfa-a525-05e08f7ee87c" (UID: "a43b6c63-0434-4cfa-a525-05e08f7ee87c"). InnerVolumeSpecName "kube-api-access-cggqr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:29:54 crc kubenswrapper[4792]: I0309 09:29:54.387318 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a43b6c63-0434-4cfa-a525-05e08f7ee87c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a43b6c63-0434-4cfa-a525-05e08f7ee87c" (UID: "a43b6c63-0434-4cfa-a525-05e08f7ee87c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:29:54 crc kubenswrapper[4792]: I0309 09:29:54.388943 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a43b6c63-0434-4cfa-a525-05e08f7ee87c-config-data" (OuterVolumeSpecName: "config-data") pod "a43b6c63-0434-4cfa-a525-05e08f7ee87c" (UID: "a43b6c63-0434-4cfa-a525-05e08f7ee87c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:29:54 crc kubenswrapper[4792]: I0309 09:29:54.431347 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cggqr\" (UniqueName: \"kubernetes.io/projected/a43b6c63-0434-4cfa-a525-05e08f7ee87c-kube-api-access-cggqr\") on node \"crc\" DevicePath \"\"" Mar 09 09:29:54 crc kubenswrapper[4792]: I0309 09:29:54.431391 4792 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a43b6c63-0434-4cfa-a525-05e08f7ee87c-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 09:29:54 crc kubenswrapper[4792]: I0309 09:29:54.431407 4792 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a43b6c63-0434-4cfa-a525-05e08f7ee87c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 09:29:54 crc kubenswrapper[4792]: I0309 09:29:54.431418 4792 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a43b6c63-0434-4cfa-a525-05e08f7ee87c-logs\") on node \"crc\" DevicePath \"\"" Mar 09 09:29:54 crc kubenswrapper[4792]: I0309 09:29:54.553605 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e842d045-4a00-445d-a41d-7f0b3677b6c5","Type":"ContainerStarted","Data":"8bb967386b76c873886b3e36968d9a80bf24314761c470a72d43454be3de521c"} Mar 09 09:29:54 crc kubenswrapper[4792]: I0309 09:29:54.569906 4792 generic.go:334] "Generic (PLEG): container finished" podID="a43b6c63-0434-4cfa-a525-05e08f7ee87c" containerID="9a60da10547fda9027675c0c915ecf9b01f47863f07395f0f5c9bcf30ea9e132" exitCode=0 Mar 09 09:29:54 crc kubenswrapper[4792]: I0309 09:29:54.569943 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a43b6c63-0434-4cfa-a525-05e08f7ee87c","Type":"ContainerDied","Data":"9a60da10547fda9027675c0c915ecf9b01f47863f07395f0f5c9bcf30ea9e132"} Mar 09 09:29:54 crc 
kubenswrapper[4792]: I0309 09:29:54.569968 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a43b6c63-0434-4cfa-a525-05e08f7ee87c","Type":"ContainerDied","Data":"5e93a7d55d839bdb35f6947f356a259501fadd7275f801a5d8bd16addcf8c4c1"} Mar 09 09:29:54 crc kubenswrapper[4792]: I0309 09:29:54.569983 4792 scope.go:117] "RemoveContainer" containerID="9a60da10547fda9027675c0c915ecf9b01f47863f07395f0f5c9bcf30ea9e132" Mar 09 09:29:54 crc kubenswrapper[4792]: I0309 09:29:54.570148 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 09 09:29:54 crc kubenswrapper[4792]: I0309 09:29:54.622297 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 09 09:29:54 crc kubenswrapper[4792]: I0309 09:29:54.643096 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 09 09:29:54 crc kubenswrapper[4792]: I0309 09:29:54.653574 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 09 09:29:54 crc kubenswrapper[4792]: E0309 09:29:54.653925 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a43b6c63-0434-4cfa-a525-05e08f7ee87c" containerName="nova-api-log" Mar 09 09:29:54 crc kubenswrapper[4792]: I0309 09:29:54.653942 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="a43b6c63-0434-4cfa-a525-05e08f7ee87c" containerName="nova-api-log" Mar 09 09:29:54 crc kubenswrapper[4792]: E0309 09:29:54.653973 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a43b6c63-0434-4cfa-a525-05e08f7ee87c" containerName="nova-api-api" Mar 09 09:29:54 crc kubenswrapper[4792]: I0309 09:29:54.653980 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="a43b6c63-0434-4cfa-a525-05e08f7ee87c" containerName="nova-api-api" Mar 09 09:29:54 crc kubenswrapper[4792]: I0309 09:29:54.654161 4792 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="a43b6c63-0434-4cfa-a525-05e08f7ee87c" containerName="nova-api-api" Mar 09 09:29:54 crc kubenswrapper[4792]: I0309 09:29:54.654190 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="a43b6c63-0434-4cfa-a525-05e08f7ee87c" containerName="nova-api-log" Mar 09 09:29:54 crc kubenswrapper[4792]: I0309 09:29:54.655270 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 09 09:29:54 crc kubenswrapper[4792]: I0309 09:29:54.659867 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 09 09:29:54 crc kubenswrapper[4792]: I0309 09:29:54.660231 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Mar 09 09:29:54 crc kubenswrapper[4792]: I0309 09:29:54.662557 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Mar 09 09:29:54 crc kubenswrapper[4792]: I0309 09:29:54.663788 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 09 09:29:54 crc kubenswrapper[4792]: I0309 09:29:54.708247 4792 scope.go:117] "RemoveContainer" containerID="b3dab4b160779dd0942c91e6644a7a0f6cc15b4c084e04bbca0df71c1bda9c72" Mar 09 09:29:54 crc kubenswrapper[4792]: I0309 09:29:54.736933 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de065134-a827-48ff-8694-948775349bd8-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"de065134-a827-48ff-8694-948775349bd8\") " pod="openstack/nova-api-0" Mar 09 09:29:54 crc kubenswrapper[4792]: I0309 09:29:54.736999 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/de065134-a827-48ff-8694-948775349bd8-public-tls-certs\") pod \"nova-api-0\" (UID: \"de065134-a827-48ff-8694-948775349bd8\") " 
pod="openstack/nova-api-0" Mar 09 09:29:54 crc kubenswrapper[4792]: I0309 09:29:54.737081 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de065134-a827-48ff-8694-948775349bd8-config-data\") pod \"nova-api-0\" (UID: \"de065134-a827-48ff-8694-948775349bd8\") " pod="openstack/nova-api-0" Mar 09 09:29:54 crc kubenswrapper[4792]: I0309 09:29:54.741620 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pp846\" (UniqueName: \"kubernetes.io/projected/de065134-a827-48ff-8694-948775349bd8-kube-api-access-pp846\") pod \"nova-api-0\" (UID: \"de065134-a827-48ff-8694-948775349bd8\") " pod="openstack/nova-api-0" Mar 09 09:29:54 crc kubenswrapper[4792]: I0309 09:29:54.741681 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/de065134-a827-48ff-8694-948775349bd8-internal-tls-certs\") pod \"nova-api-0\" (UID: \"de065134-a827-48ff-8694-948775349bd8\") " pod="openstack/nova-api-0" Mar 09 09:29:54 crc kubenswrapper[4792]: I0309 09:29:54.741805 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/de065134-a827-48ff-8694-948775349bd8-logs\") pod \"nova-api-0\" (UID: \"de065134-a827-48ff-8694-948775349bd8\") " pod="openstack/nova-api-0" Mar 09 09:29:54 crc kubenswrapper[4792]: I0309 09:29:54.770468 4792 scope.go:117] "RemoveContainer" containerID="9a60da10547fda9027675c0c915ecf9b01f47863f07395f0f5c9bcf30ea9e132" Mar 09 09:29:54 crc kubenswrapper[4792]: E0309 09:29:54.774326 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9a60da10547fda9027675c0c915ecf9b01f47863f07395f0f5c9bcf30ea9e132\": container with ID starting with 
9a60da10547fda9027675c0c915ecf9b01f47863f07395f0f5c9bcf30ea9e132 not found: ID does not exist" containerID="9a60da10547fda9027675c0c915ecf9b01f47863f07395f0f5c9bcf30ea9e132" Mar 09 09:29:54 crc kubenswrapper[4792]: I0309 09:29:54.774386 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a60da10547fda9027675c0c915ecf9b01f47863f07395f0f5c9bcf30ea9e132"} err="failed to get container status \"9a60da10547fda9027675c0c915ecf9b01f47863f07395f0f5c9bcf30ea9e132\": rpc error: code = NotFound desc = could not find container \"9a60da10547fda9027675c0c915ecf9b01f47863f07395f0f5c9bcf30ea9e132\": container with ID starting with 9a60da10547fda9027675c0c915ecf9b01f47863f07395f0f5c9bcf30ea9e132 not found: ID does not exist" Mar 09 09:29:54 crc kubenswrapper[4792]: I0309 09:29:54.774411 4792 scope.go:117] "RemoveContainer" containerID="b3dab4b160779dd0942c91e6644a7a0f6cc15b4c084e04bbca0df71c1bda9c72" Mar 09 09:29:54 crc kubenswrapper[4792]: E0309 09:29:54.777214 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b3dab4b160779dd0942c91e6644a7a0f6cc15b4c084e04bbca0df71c1bda9c72\": container with ID starting with b3dab4b160779dd0942c91e6644a7a0f6cc15b4c084e04bbca0df71c1bda9c72 not found: ID does not exist" containerID="b3dab4b160779dd0942c91e6644a7a0f6cc15b4c084e04bbca0df71c1bda9c72" Mar 09 09:29:54 crc kubenswrapper[4792]: I0309 09:29:54.777260 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b3dab4b160779dd0942c91e6644a7a0f6cc15b4c084e04bbca0df71c1bda9c72"} err="failed to get container status \"b3dab4b160779dd0942c91e6644a7a0f6cc15b4c084e04bbca0df71c1bda9c72\": rpc error: code = NotFound desc = could not find container \"b3dab4b160779dd0942c91e6644a7a0f6cc15b4c084e04bbca0df71c1bda9c72\": container with ID starting with b3dab4b160779dd0942c91e6644a7a0f6cc15b4c084e04bbca0df71c1bda9c72 not found: ID does not 
exist" Mar 09 09:29:54 crc kubenswrapper[4792]: I0309 09:29:54.799715 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Mar 09 09:29:54 crc kubenswrapper[4792]: I0309 09:29:54.828160 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Mar 09 09:29:54 crc kubenswrapper[4792]: I0309 09:29:54.843000 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pp846\" (UniqueName: \"kubernetes.io/projected/de065134-a827-48ff-8694-948775349bd8-kube-api-access-pp846\") pod \"nova-api-0\" (UID: \"de065134-a827-48ff-8694-948775349bd8\") " pod="openstack/nova-api-0" Mar 09 09:29:54 crc kubenswrapper[4792]: I0309 09:29:54.843387 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/de065134-a827-48ff-8694-948775349bd8-internal-tls-certs\") pod \"nova-api-0\" (UID: \"de065134-a827-48ff-8694-948775349bd8\") " pod="openstack/nova-api-0" Mar 09 09:29:54 crc kubenswrapper[4792]: I0309 09:29:54.843501 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/de065134-a827-48ff-8694-948775349bd8-logs\") pod \"nova-api-0\" (UID: \"de065134-a827-48ff-8694-948775349bd8\") " pod="openstack/nova-api-0" Mar 09 09:29:54 crc kubenswrapper[4792]: I0309 09:29:54.843666 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de065134-a827-48ff-8694-948775349bd8-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"de065134-a827-48ff-8694-948775349bd8\") " pod="openstack/nova-api-0" Mar 09 09:29:54 crc kubenswrapper[4792]: I0309 09:29:54.844623 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/de065134-a827-48ff-8694-948775349bd8-public-tls-certs\") pod \"nova-api-0\" (UID: \"de065134-a827-48ff-8694-948775349bd8\") " pod="openstack/nova-api-0" Mar 09 09:29:54 crc kubenswrapper[4792]: I0309 09:29:54.843991 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/de065134-a827-48ff-8694-948775349bd8-logs\") pod \"nova-api-0\" (UID: \"de065134-a827-48ff-8694-948775349bd8\") " pod="openstack/nova-api-0" Mar 09 09:29:54 crc kubenswrapper[4792]: I0309 09:29:54.845505 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de065134-a827-48ff-8694-948775349bd8-config-data\") pod \"nova-api-0\" (UID: \"de065134-a827-48ff-8694-948775349bd8\") " pod="openstack/nova-api-0" Mar 09 09:29:54 crc kubenswrapper[4792]: I0309 09:29:54.852847 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/de065134-a827-48ff-8694-948775349bd8-internal-tls-certs\") pod \"nova-api-0\" (UID: \"de065134-a827-48ff-8694-948775349bd8\") " pod="openstack/nova-api-0" Mar 09 09:29:54 crc kubenswrapper[4792]: I0309 09:29:54.860703 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/de065134-a827-48ff-8694-948775349bd8-public-tls-certs\") pod \"nova-api-0\" (UID: \"de065134-a827-48ff-8694-948775349bd8\") " pod="openstack/nova-api-0" Mar 09 09:29:54 crc kubenswrapper[4792]: I0309 09:29:54.861662 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de065134-a827-48ff-8694-948775349bd8-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"de065134-a827-48ff-8694-948775349bd8\") " pod="openstack/nova-api-0" Mar 09 09:29:54 crc kubenswrapper[4792]: I0309 09:29:54.864121 4792 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de065134-a827-48ff-8694-948775349bd8-config-data\") pod \"nova-api-0\" (UID: \"de065134-a827-48ff-8694-948775349bd8\") " pod="openstack/nova-api-0" Mar 09 09:29:54 crc kubenswrapper[4792]: I0309 09:29:54.882705 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pp846\" (UniqueName: \"kubernetes.io/projected/de065134-a827-48ff-8694-948775349bd8-kube-api-access-pp846\") pod \"nova-api-0\" (UID: \"de065134-a827-48ff-8694-948775349bd8\") " pod="openstack/nova-api-0" Mar 09 09:29:54 crc kubenswrapper[4792]: I0309 09:29:54.991095 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 09 09:29:55 crc kubenswrapper[4792]: I0309 09:29:55.108368 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="a7b96e62-a545-4c9b-946b-b7c6a12ceca6" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.186:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 09:29:55 crc kubenswrapper[4792]: I0309 09:29:55.108610 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="a7b96e62-a545-4c9b-946b-b7c6a12ceca6" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.186:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 09 09:29:55 crc kubenswrapper[4792]: I0309 09:29:55.588276 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e842d045-4a00-445d-a41d-7f0b3677b6c5","Type":"ContainerStarted","Data":"10021775b2e3e8194b8c36d5fe8a1f67fb053be88f7375784301ddd1d8fdba6d"} Mar 09 09:29:55 crc kubenswrapper[4792]: I0309 09:29:55.591676 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 09 09:29:55 crc 
kubenswrapper[4792]: I0309 09:29:55.676691 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Mar 09 09:29:55 crc kubenswrapper[4792]: I0309 09:29:55.691796 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a43b6c63-0434-4cfa-a525-05e08f7ee87c" path="/var/lib/kubelet/pods/a43b6c63-0434-4cfa-a525-05e08f7ee87c/volumes" Mar 09 09:29:55 crc kubenswrapper[4792]: I0309 09:29:55.978278 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-dn7pc"] Mar 09 09:29:55 crc kubenswrapper[4792]: I0309 09:29:55.979318 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-dn7pc" Mar 09 09:29:55 crc kubenswrapper[4792]: I0309 09:29:55.983763 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Mar 09 09:29:55 crc kubenswrapper[4792]: I0309 09:29:55.983986 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Mar 09 09:29:56 crc kubenswrapper[4792]: I0309 09:29:56.019417 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-dn7pc"] Mar 09 09:29:56 crc kubenswrapper[4792]: I0309 09:29:56.081043 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1399ead-26d7-4512-bbf0-f004f5b95c70-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-dn7pc\" (UID: \"f1399ead-26d7-4512-bbf0-f004f5b95c70\") " pod="openstack/nova-cell1-cell-mapping-dn7pc" Mar 09 09:29:56 crc kubenswrapper[4792]: I0309 09:29:56.081113 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f1399ead-26d7-4512-bbf0-f004f5b95c70-scripts\") pod \"nova-cell1-cell-mapping-dn7pc\" (UID: 
\"f1399ead-26d7-4512-bbf0-f004f5b95c70\") " pod="openstack/nova-cell1-cell-mapping-dn7pc" Mar 09 09:29:56 crc kubenswrapper[4792]: I0309 09:29:56.081156 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-trcq5\" (UniqueName: \"kubernetes.io/projected/f1399ead-26d7-4512-bbf0-f004f5b95c70-kube-api-access-trcq5\") pod \"nova-cell1-cell-mapping-dn7pc\" (UID: \"f1399ead-26d7-4512-bbf0-f004f5b95c70\") " pod="openstack/nova-cell1-cell-mapping-dn7pc" Mar 09 09:29:56 crc kubenswrapper[4792]: I0309 09:29:56.081287 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f1399ead-26d7-4512-bbf0-f004f5b95c70-config-data\") pod \"nova-cell1-cell-mapping-dn7pc\" (UID: \"f1399ead-26d7-4512-bbf0-f004f5b95c70\") " pod="openstack/nova-cell1-cell-mapping-dn7pc" Mar 09 09:29:56 crc kubenswrapper[4792]: I0309 09:29:56.182451 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f1399ead-26d7-4512-bbf0-f004f5b95c70-config-data\") pod \"nova-cell1-cell-mapping-dn7pc\" (UID: \"f1399ead-26d7-4512-bbf0-f004f5b95c70\") " pod="openstack/nova-cell1-cell-mapping-dn7pc" Mar 09 09:29:56 crc kubenswrapper[4792]: I0309 09:29:56.182580 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1399ead-26d7-4512-bbf0-f004f5b95c70-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-dn7pc\" (UID: \"f1399ead-26d7-4512-bbf0-f004f5b95c70\") " pod="openstack/nova-cell1-cell-mapping-dn7pc" Mar 09 09:29:56 crc kubenswrapper[4792]: I0309 09:29:56.182607 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f1399ead-26d7-4512-bbf0-f004f5b95c70-scripts\") pod \"nova-cell1-cell-mapping-dn7pc\" (UID: 
\"f1399ead-26d7-4512-bbf0-f004f5b95c70\") " pod="openstack/nova-cell1-cell-mapping-dn7pc" Mar 09 09:29:56 crc kubenswrapper[4792]: I0309 09:29:56.182643 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-trcq5\" (UniqueName: \"kubernetes.io/projected/f1399ead-26d7-4512-bbf0-f004f5b95c70-kube-api-access-trcq5\") pod \"nova-cell1-cell-mapping-dn7pc\" (UID: \"f1399ead-26d7-4512-bbf0-f004f5b95c70\") " pod="openstack/nova-cell1-cell-mapping-dn7pc" Mar 09 09:29:56 crc kubenswrapper[4792]: I0309 09:29:56.186776 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f1399ead-26d7-4512-bbf0-f004f5b95c70-scripts\") pod \"nova-cell1-cell-mapping-dn7pc\" (UID: \"f1399ead-26d7-4512-bbf0-f004f5b95c70\") " pod="openstack/nova-cell1-cell-mapping-dn7pc" Mar 09 09:29:56 crc kubenswrapper[4792]: I0309 09:29:56.187304 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1399ead-26d7-4512-bbf0-f004f5b95c70-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-dn7pc\" (UID: \"f1399ead-26d7-4512-bbf0-f004f5b95c70\") " pod="openstack/nova-cell1-cell-mapping-dn7pc" Mar 09 09:29:56 crc kubenswrapper[4792]: I0309 09:29:56.191671 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f1399ead-26d7-4512-bbf0-f004f5b95c70-config-data\") pod \"nova-cell1-cell-mapping-dn7pc\" (UID: \"f1399ead-26d7-4512-bbf0-f004f5b95c70\") " pod="openstack/nova-cell1-cell-mapping-dn7pc" Mar 09 09:29:56 crc kubenswrapper[4792]: I0309 09:29:56.226585 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-trcq5\" (UniqueName: \"kubernetes.io/projected/f1399ead-26d7-4512-bbf0-f004f5b95c70-kube-api-access-trcq5\") pod \"nova-cell1-cell-mapping-dn7pc\" (UID: \"f1399ead-26d7-4512-bbf0-f004f5b95c70\") " 
pod="openstack/nova-cell1-cell-mapping-dn7pc" Mar 09 09:29:56 crc kubenswrapper[4792]: I0309 09:29:56.346450 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-dn7pc" Mar 09 09:29:56 crc kubenswrapper[4792]: I0309 09:29:56.699682 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"de065134-a827-48ff-8694-948775349bd8","Type":"ContainerStarted","Data":"a65a1654defca46995034eef4dd636fc694cd512c099f28c7ce1b1e420c1ad83"} Mar 09 09:29:56 crc kubenswrapper[4792]: I0309 09:29:56.699976 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"de065134-a827-48ff-8694-948775349bd8","Type":"ContainerStarted","Data":"b373b9754cab784bdd41012fbdc1681701f43474ce2de22b9d6101ce2a9be714"} Mar 09 09:29:56 crc kubenswrapper[4792]: I0309 09:29:56.699992 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"de065134-a827-48ff-8694-948775349bd8","Type":"ContainerStarted","Data":"a19b6bb6f282df652224b846b7a7ea496ab71e9624d2aeffb78e6b4b1fe97da7"} Mar 09 09:29:56 crc kubenswrapper[4792]: I0309 09:29:56.748624 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.748601773 podStartE2EDuration="2.748601773s" podCreationTimestamp="2026-03-09 09:29:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:29:56.736807413 +0000 UTC m=+1361.767008165" watchObservedRunningTime="2026-03-09 09:29:56.748601773 +0000 UTC m=+1361.778802525" Mar 09 09:29:57 crc kubenswrapper[4792]: I0309 09:29:57.004893 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-dn7pc"] Mar 09 09:29:57 crc kubenswrapper[4792]: W0309 09:29:57.024308 4792 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf1399ead_26d7_4512_bbf0_f004f5b95c70.slice/crio-676fa51b01426d47e714dbc31c0d7bfb4d2383e68a42ed73e215263ead1eff49 WatchSource:0}: Error finding container 676fa51b01426d47e714dbc31c0d7bfb4d2383e68a42ed73e215263ead1eff49: Status 404 returned error can't find the container with id 676fa51b01426d47e714dbc31c0d7bfb4d2383e68a42ed73e215263ead1eff49 Mar 09 09:29:57 crc kubenswrapper[4792]: I0309 09:29:57.772613 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-dn7pc" event={"ID":"f1399ead-26d7-4512-bbf0-f004f5b95c70","Type":"ContainerStarted","Data":"5b97c962d6c8dfe919ecfc90faca3416e3dd5be66b006567cd5445fa68fc9ed2"} Mar 09 09:29:57 crc kubenswrapper[4792]: I0309 09:29:57.778322 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-dn7pc" event={"ID":"f1399ead-26d7-4512-bbf0-f004f5b95c70","Type":"ContainerStarted","Data":"676fa51b01426d47e714dbc31c0d7bfb4d2383e68a42ed73e215263ead1eff49"} Mar 09 09:29:57 crc kubenswrapper[4792]: I0309 09:29:57.778357 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e842d045-4a00-445d-a41d-7f0b3677b6c5","Type":"ContainerStarted","Data":"160871ec463dd4d80a7778db65efa1677bb77c88796655b2e1ab5f78918bc0ef"} Mar 09 09:29:57 crc kubenswrapper[4792]: I0309 09:29:57.778421 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e842d045-4a00-445d-a41d-7f0b3677b6c5" containerName="sg-core" containerID="cri-o://10021775b2e3e8194b8c36d5fe8a1f67fb053be88f7375784301ddd1d8fdba6d" gracePeriod=30 Mar 09 09:29:57 crc kubenswrapper[4792]: I0309 09:29:57.778399 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e842d045-4a00-445d-a41d-7f0b3677b6c5" containerName="ceilometer-central-agent" 
containerID="cri-o://00fce43351ae1ae9c264222af34886f136ff1ce1456c169e0309cf91d9c1f480" gracePeriod=30 Mar 09 09:29:57 crc kubenswrapper[4792]: I0309 09:29:57.778504 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e842d045-4a00-445d-a41d-7f0b3677b6c5" containerName="proxy-httpd" containerID="cri-o://160871ec463dd4d80a7778db65efa1677bb77c88796655b2e1ab5f78918bc0ef" gracePeriod=30 Mar 09 09:29:57 crc kubenswrapper[4792]: I0309 09:29:57.778554 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e842d045-4a00-445d-a41d-7f0b3677b6c5" containerName="ceilometer-notification-agent" containerID="cri-o://8bb967386b76c873886b3e36968d9a80bf24314761c470a72d43454be3de521c" gracePeriod=30 Mar 09 09:29:57 crc kubenswrapper[4792]: I0309 09:29:57.807295 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-dn7pc" podStartSLOduration=2.807269888 podStartE2EDuration="2.807269888s" podCreationTimestamp="2026-03-09 09:29:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:29:57.805538176 +0000 UTC m=+1362.835738938" watchObservedRunningTime="2026-03-09 09:29:57.807269888 +0000 UTC m=+1362.837470640" Mar 09 09:29:57 crc kubenswrapper[4792]: I0309 09:29:57.841775 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.809272619 podStartE2EDuration="6.841742402s" podCreationTimestamp="2026-03-09 09:29:51 +0000 UTC" firstStartedPulling="2026-03-09 09:29:52.453656324 +0000 UTC m=+1357.483857076" lastFinishedPulling="2026-03-09 09:29:56.486126117 +0000 UTC m=+1361.516326859" observedRunningTime="2026-03-09 09:29:57.831503168 +0000 UTC m=+1362.861703920" watchObservedRunningTime="2026-03-09 09:29:57.841742402 +0000 UTC m=+1362.871943154" Mar 09 
09:29:58 crc kubenswrapper[4792]: I0309 09:29:58.201372 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6c74598c69-6shrv" Mar 09 09:29:58 crc kubenswrapper[4792]: I0309 09:29:58.275439 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7ff5b4cd7c-skh6c"] Mar 09 09:29:58 crc kubenswrapper[4792]: I0309 09:29:58.275646 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7ff5b4cd7c-skh6c" podUID="ede6e819-d6bc-48e7-9fcd-4534b432fab5" containerName="dnsmasq-dns" containerID="cri-o://649196a299eda67a28ce5035dc76bf1d4c60efcaec0eb8885c04a6cf9624ace6" gracePeriod=10 Mar 09 09:29:58 crc kubenswrapper[4792]: I0309 09:29:58.791264 4792 generic.go:334] "Generic (PLEG): container finished" podID="ede6e819-d6bc-48e7-9fcd-4534b432fab5" containerID="649196a299eda67a28ce5035dc76bf1d4c60efcaec0eb8885c04a6cf9624ace6" exitCode=0 Mar 09 09:29:58 crc kubenswrapper[4792]: I0309 09:29:58.791638 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7ff5b4cd7c-skh6c" event={"ID":"ede6e819-d6bc-48e7-9fcd-4534b432fab5","Type":"ContainerDied","Data":"649196a299eda67a28ce5035dc76bf1d4c60efcaec0eb8885c04a6cf9624ace6"} Mar 09 09:29:58 crc kubenswrapper[4792]: I0309 09:29:58.810693 4792 generic.go:334] "Generic (PLEG): container finished" podID="e842d045-4a00-445d-a41d-7f0b3677b6c5" containerID="160871ec463dd4d80a7778db65efa1677bb77c88796655b2e1ab5f78918bc0ef" exitCode=0 Mar 09 09:29:58 crc kubenswrapper[4792]: I0309 09:29:58.810736 4792 generic.go:334] "Generic (PLEG): container finished" podID="e842d045-4a00-445d-a41d-7f0b3677b6c5" containerID="10021775b2e3e8194b8c36d5fe8a1f67fb053be88f7375784301ddd1d8fdba6d" exitCode=2 Mar 09 09:29:58 crc kubenswrapper[4792]: I0309 09:29:58.810745 4792 generic.go:334] "Generic (PLEG): container finished" podID="e842d045-4a00-445d-a41d-7f0b3677b6c5" 
containerID="8bb967386b76c873886b3e36968d9a80bf24314761c470a72d43454be3de521c" exitCode=0 Mar 09 09:29:58 crc kubenswrapper[4792]: I0309 09:29:58.812639 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e842d045-4a00-445d-a41d-7f0b3677b6c5","Type":"ContainerDied","Data":"160871ec463dd4d80a7778db65efa1677bb77c88796655b2e1ab5f78918bc0ef"} Mar 09 09:29:58 crc kubenswrapper[4792]: I0309 09:29:58.812699 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e842d045-4a00-445d-a41d-7f0b3677b6c5","Type":"ContainerDied","Data":"10021775b2e3e8194b8c36d5fe8a1f67fb053be88f7375784301ddd1d8fdba6d"} Mar 09 09:29:58 crc kubenswrapper[4792]: I0309 09:29:58.812714 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e842d045-4a00-445d-a41d-7f0b3677b6c5","Type":"ContainerDied","Data":"8bb967386b76c873886b3e36968d9a80bf24314761c470a72d43454be3de521c"} Mar 09 09:29:58 crc kubenswrapper[4792]: I0309 09:29:58.877966 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7ff5b4cd7c-skh6c" Mar 09 09:29:59 crc kubenswrapper[4792]: I0309 09:29:59.048970 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4kw88\" (UniqueName: \"kubernetes.io/projected/ede6e819-d6bc-48e7-9fcd-4534b432fab5-kube-api-access-4kw88\") pod \"ede6e819-d6bc-48e7-9fcd-4534b432fab5\" (UID: \"ede6e819-d6bc-48e7-9fcd-4534b432fab5\") " Mar 09 09:29:59 crc kubenswrapper[4792]: I0309 09:29:59.049180 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ede6e819-d6bc-48e7-9fcd-4534b432fab5-ovsdbserver-nb\") pod \"ede6e819-d6bc-48e7-9fcd-4534b432fab5\" (UID: \"ede6e819-d6bc-48e7-9fcd-4534b432fab5\") " Mar 09 09:29:59 crc kubenswrapper[4792]: I0309 09:29:59.049460 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ede6e819-d6bc-48e7-9fcd-4534b432fab5-config\") pod \"ede6e819-d6bc-48e7-9fcd-4534b432fab5\" (UID: \"ede6e819-d6bc-48e7-9fcd-4534b432fab5\") " Mar 09 09:29:59 crc kubenswrapper[4792]: I0309 09:29:59.049497 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ede6e819-d6bc-48e7-9fcd-4534b432fab5-dns-svc\") pod \"ede6e819-d6bc-48e7-9fcd-4534b432fab5\" (UID: \"ede6e819-d6bc-48e7-9fcd-4534b432fab5\") " Mar 09 09:29:59 crc kubenswrapper[4792]: I0309 09:29:59.049562 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ede6e819-d6bc-48e7-9fcd-4534b432fab5-ovsdbserver-sb\") pod \"ede6e819-d6bc-48e7-9fcd-4534b432fab5\" (UID: \"ede6e819-d6bc-48e7-9fcd-4534b432fab5\") " Mar 09 09:29:59 crc kubenswrapper[4792]: I0309 09:29:59.071111 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/ede6e819-d6bc-48e7-9fcd-4534b432fab5-kube-api-access-4kw88" (OuterVolumeSpecName: "kube-api-access-4kw88") pod "ede6e819-d6bc-48e7-9fcd-4534b432fab5" (UID: "ede6e819-d6bc-48e7-9fcd-4534b432fab5"). InnerVolumeSpecName "kube-api-access-4kw88". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:29:59 crc kubenswrapper[4792]: I0309 09:29:59.128632 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ede6e819-d6bc-48e7-9fcd-4534b432fab5-config" (OuterVolumeSpecName: "config") pod "ede6e819-d6bc-48e7-9fcd-4534b432fab5" (UID: "ede6e819-d6bc-48e7-9fcd-4534b432fab5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:29:59 crc kubenswrapper[4792]: I0309 09:29:59.128754 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ede6e819-d6bc-48e7-9fcd-4534b432fab5-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ede6e819-d6bc-48e7-9fcd-4534b432fab5" (UID: "ede6e819-d6bc-48e7-9fcd-4534b432fab5"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:29:59 crc kubenswrapper[4792]: I0309 09:29:59.144191 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ede6e819-d6bc-48e7-9fcd-4534b432fab5-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ede6e819-d6bc-48e7-9fcd-4534b432fab5" (UID: "ede6e819-d6bc-48e7-9fcd-4534b432fab5"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:29:59 crc kubenswrapper[4792]: I0309 09:29:59.150455 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ede6e819-d6bc-48e7-9fcd-4534b432fab5-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "ede6e819-d6bc-48e7-9fcd-4534b432fab5" (UID: "ede6e819-d6bc-48e7-9fcd-4534b432fab5"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:29:59 crc kubenswrapper[4792]: I0309 09:29:59.151826 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4kw88\" (UniqueName: \"kubernetes.io/projected/ede6e819-d6bc-48e7-9fcd-4534b432fab5-kube-api-access-4kw88\") on node \"crc\" DevicePath \"\"" Mar 09 09:29:59 crc kubenswrapper[4792]: I0309 09:29:59.151857 4792 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ede6e819-d6bc-48e7-9fcd-4534b432fab5-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 09 09:29:59 crc kubenswrapper[4792]: I0309 09:29:59.151871 4792 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ede6e819-d6bc-48e7-9fcd-4534b432fab5-config\") on node \"crc\" DevicePath \"\"" Mar 09 09:29:59 crc kubenswrapper[4792]: I0309 09:29:59.151883 4792 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ede6e819-d6bc-48e7-9fcd-4534b432fab5-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 09 09:29:59 crc kubenswrapper[4792]: I0309 09:29:59.151893 4792 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ede6e819-d6bc-48e7-9fcd-4534b432fab5-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 09 09:29:59 crc kubenswrapper[4792]: I0309 09:29:59.820135 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7ff5b4cd7c-skh6c" event={"ID":"ede6e819-d6bc-48e7-9fcd-4534b432fab5","Type":"ContainerDied","Data":"c0aef1412990186e6d818b251f7451479cb51d77cb487ba68bd5b9619dbdfe0c"} Mar 09 09:29:59 crc kubenswrapper[4792]: I0309 09:29:59.820186 4792 scope.go:117] "RemoveContainer" containerID="649196a299eda67a28ce5035dc76bf1d4c60efcaec0eb8885c04a6cf9624ace6" Mar 09 09:29:59 crc kubenswrapper[4792]: I0309 09:29:59.820342 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7ff5b4cd7c-skh6c" Mar 09 09:29:59 crc kubenswrapper[4792]: I0309 09:29:59.851554 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7ff5b4cd7c-skh6c"] Mar 09 09:29:59 crc kubenswrapper[4792]: I0309 09:29:59.866040 4792 scope.go:117] "RemoveContainer" containerID="a21d578029b3d8d71f88d0a552ead8993fb9148411576079d13b34d0ac8cef46" Mar 09 09:29:59 crc kubenswrapper[4792]: I0309 09:29:59.884304 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7ff5b4cd7c-skh6c"] Mar 09 09:30:00 crc kubenswrapper[4792]: I0309 09:30:00.134337 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550810-8zk6b"] Mar 09 09:30:00 crc kubenswrapper[4792]: E0309 09:30:00.134711 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ede6e819-d6bc-48e7-9fcd-4534b432fab5" containerName="init" Mar 09 09:30:00 crc kubenswrapper[4792]: I0309 09:30:00.134724 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="ede6e819-d6bc-48e7-9fcd-4534b432fab5" containerName="init" Mar 09 09:30:00 crc kubenswrapper[4792]: E0309 09:30:00.134738 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ede6e819-d6bc-48e7-9fcd-4534b432fab5" containerName="dnsmasq-dns" Mar 09 09:30:00 crc kubenswrapper[4792]: I0309 09:30:00.134744 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="ede6e819-d6bc-48e7-9fcd-4534b432fab5" containerName="dnsmasq-dns" Mar 09 09:30:00 crc kubenswrapper[4792]: I0309 09:30:00.134920 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="ede6e819-d6bc-48e7-9fcd-4534b432fab5" containerName="dnsmasq-dns" Mar 09 09:30:00 crc kubenswrapper[4792]: I0309 09:30:00.135631 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550810-8zk6b" Mar 09 09:30:00 crc kubenswrapper[4792]: I0309 09:30:00.145166 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 09:30:00 crc kubenswrapper[4792]: I0309 09:30:00.145429 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 09:30:00 crc kubenswrapper[4792]: I0309 09:30:00.145610 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fwclj" Mar 09 09:30:00 crc kubenswrapper[4792]: I0309 09:30:00.157740 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550810-8zk6b"] Mar 09 09:30:00 crc kubenswrapper[4792]: I0309 09:30:00.254027 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29550810-2mq75"] Mar 09 09:30:00 crc kubenswrapper[4792]: I0309 09:30:00.255478 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29550810-2mq75" Mar 09 09:30:00 crc kubenswrapper[4792]: I0309 09:30:00.261301 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 09 09:30:00 crc kubenswrapper[4792]: I0309 09:30:00.265152 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 09 09:30:00 crc kubenswrapper[4792]: I0309 09:30:00.269508 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29550810-2mq75"] Mar 09 09:30:00 crc kubenswrapper[4792]: I0309 09:30:00.274432 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ks65l\" (UniqueName: \"kubernetes.io/projected/ac96eb08-404b-4da1-aecd-ae86494f9165-kube-api-access-ks65l\") pod \"auto-csr-approver-29550810-8zk6b\" (UID: \"ac96eb08-404b-4da1-aecd-ae86494f9165\") " pod="openshift-infra/auto-csr-approver-29550810-8zk6b" Mar 09 09:30:00 crc kubenswrapper[4792]: I0309 09:30:00.376724 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ks65l\" (UniqueName: \"kubernetes.io/projected/ac96eb08-404b-4da1-aecd-ae86494f9165-kube-api-access-ks65l\") pod \"auto-csr-approver-29550810-8zk6b\" (UID: \"ac96eb08-404b-4da1-aecd-ae86494f9165\") " pod="openshift-infra/auto-csr-approver-29550810-8zk6b" Mar 09 09:30:00 crc kubenswrapper[4792]: I0309 09:30:00.376780 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8465r\" (UniqueName: \"kubernetes.io/projected/9388ba35-9a13-455c-b8d3-1e19d6ed7c94-kube-api-access-8465r\") pod \"collect-profiles-29550810-2mq75\" (UID: \"9388ba35-9a13-455c-b8d3-1e19d6ed7c94\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29550810-2mq75" Mar 09 09:30:00 crc kubenswrapper[4792]: I0309 09:30:00.376810 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9388ba35-9a13-455c-b8d3-1e19d6ed7c94-config-volume\") pod \"collect-profiles-29550810-2mq75\" (UID: \"9388ba35-9a13-455c-b8d3-1e19d6ed7c94\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550810-2mq75" Mar 09 09:30:00 crc kubenswrapper[4792]: I0309 09:30:00.376950 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9388ba35-9a13-455c-b8d3-1e19d6ed7c94-secret-volume\") pod \"collect-profiles-29550810-2mq75\" (UID: \"9388ba35-9a13-455c-b8d3-1e19d6ed7c94\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550810-2mq75" Mar 09 09:30:00 crc kubenswrapper[4792]: I0309 09:30:00.395210 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ks65l\" (UniqueName: \"kubernetes.io/projected/ac96eb08-404b-4da1-aecd-ae86494f9165-kube-api-access-ks65l\") pod \"auto-csr-approver-29550810-8zk6b\" (UID: \"ac96eb08-404b-4da1-aecd-ae86494f9165\") " pod="openshift-infra/auto-csr-approver-29550810-8zk6b" Mar 09 09:30:00 crc kubenswrapper[4792]: I0309 09:30:00.454231 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550810-8zk6b" Mar 09 09:30:00 crc kubenswrapper[4792]: I0309 09:30:00.479652 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8465r\" (UniqueName: \"kubernetes.io/projected/9388ba35-9a13-455c-b8d3-1e19d6ed7c94-kube-api-access-8465r\") pod \"collect-profiles-29550810-2mq75\" (UID: \"9388ba35-9a13-455c-b8d3-1e19d6ed7c94\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550810-2mq75" Mar 09 09:30:00 crc kubenswrapper[4792]: I0309 09:30:00.479730 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9388ba35-9a13-455c-b8d3-1e19d6ed7c94-config-volume\") pod \"collect-profiles-29550810-2mq75\" (UID: \"9388ba35-9a13-455c-b8d3-1e19d6ed7c94\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550810-2mq75" Mar 09 09:30:00 crc kubenswrapper[4792]: I0309 09:30:00.481217 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9388ba35-9a13-455c-b8d3-1e19d6ed7c94-config-volume\") pod \"collect-profiles-29550810-2mq75\" (UID: \"9388ba35-9a13-455c-b8d3-1e19d6ed7c94\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550810-2mq75" Mar 09 09:30:00 crc kubenswrapper[4792]: I0309 09:30:00.481482 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9388ba35-9a13-455c-b8d3-1e19d6ed7c94-secret-volume\") pod \"collect-profiles-29550810-2mq75\" (UID: \"9388ba35-9a13-455c-b8d3-1e19d6ed7c94\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550810-2mq75" Mar 09 09:30:00 crc kubenswrapper[4792]: I0309 09:30:00.489605 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/9388ba35-9a13-455c-b8d3-1e19d6ed7c94-secret-volume\") pod \"collect-profiles-29550810-2mq75\" (UID: \"9388ba35-9a13-455c-b8d3-1e19d6ed7c94\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550810-2mq75" Mar 09 09:30:00 crc kubenswrapper[4792]: I0309 09:30:00.500937 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8465r\" (UniqueName: \"kubernetes.io/projected/9388ba35-9a13-455c-b8d3-1e19d6ed7c94-kube-api-access-8465r\") pod \"collect-profiles-29550810-2mq75\" (UID: \"9388ba35-9a13-455c-b8d3-1e19d6ed7c94\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550810-2mq75" Mar 09 09:30:00 crc kubenswrapper[4792]: I0309 09:30:00.575548 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29550810-2mq75" Mar 09 09:30:00 crc kubenswrapper[4792]: I0309 09:30:00.834172 4792 generic.go:334] "Generic (PLEG): container finished" podID="e842d045-4a00-445d-a41d-7f0b3677b6c5" containerID="00fce43351ae1ae9c264222af34886f136ff1ce1456c169e0309cf91d9c1f480" exitCode=0 Mar 09 09:30:00 crc kubenswrapper[4792]: I0309 09:30:00.834222 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e842d045-4a00-445d-a41d-7f0b3677b6c5","Type":"ContainerDied","Data":"00fce43351ae1ae9c264222af34886f136ff1ce1456c169e0309cf91d9c1f480"} Mar 09 09:30:00 crc kubenswrapper[4792]: I0309 09:30:00.969448 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550810-8zk6b"] Mar 09 09:30:00 crc kubenswrapper[4792]: W0309 09:30:00.979886 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podac96eb08_404b_4da1_aecd_ae86494f9165.slice/crio-c1a70ac375e419623efb3376e8f4956bfe18221488d04a0b0fbb4f10a2d1ddc6 WatchSource:0}: Error finding container 
c1a70ac375e419623efb3376e8f4956bfe18221488d04a0b0fbb4f10a2d1ddc6: Status 404 returned error can't find the container with id c1a70ac375e419623efb3376e8f4956bfe18221488d04a0b0fbb4f10a2d1ddc6 Mar 09 09:30:01 crc kubenswrapper[4792]: I0309 09:30:01.071086 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 09 09:30:01 crc kubenswrapper[4792]: I0309 09:30:01.198381 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2wvr8\" (UniqueName: \"kubernetes.io/projected/e842d045-4a00-445d-a41d-7f0b3677b6c5-kube-api-access-2wvr8\") pod \"e842d045-4a00-445d-a41d-7f0b3677b6c5\" (UID: \"e842d045-4a00-445d-a41d-7f0b3677b6c5\") " Mar 09 09:30:01 crc kubenswrapper[4792]: I0309 09:30:01.198498 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e842d045-4a00-445d-a41d-7f0b3677b6c5-ceilometer-tls-certs\") pod \"e842d045-4a00-445d-a41d-7f0b3677b6c5\" (UID: \"e842d045-4a00-445d-a41d-7f0b3677b6c5\") " Mar 09 09:30:01 crc kubenswrapper[4792]: I0309 09:30:01.198598 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e842d045-4a00-445d-a41d-7f0b3677b6c5-config-data\") pod \"e842d045-4a00-445d-a41d-7f0b3677b6c5\" (UID: \"e842d045-4a00-445d-a41d-7f0b3677b6c5\") " Mar 09 09:30:01 crc kubenswrapper[4792]: I0309 09:30:01.198671 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e842d045-4a00-445d-a41d-7f0b3677b6c5-sg-core-conf-yaml\") pod \"e842d045-4a00-445d-a41d-7f0b3677b6c5\" (UID: \"e842d045-4a00-445d-a41d-7f0b3677b6c5\") " Mar 09 09:30:01 crc kubenswrapper[4792]: I0309 09:30:01.198707 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/e842d045-4a00-445d-a41d-7f0b3677b6c5-run-httpd\") pod \"e842d045-4a00-445d-a41d-7f0b3677b6c5\" (UID: \"e842d045-4a00-445d-a41d-7f0b3677b6c5\") " Mar 09 09:30:01 crc kubenswrapper[4792]: I0309 09:30:01.198735 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e842d045-4a00-445d-a41d-7f0b3677b6c5-log-httpd\") pod \"e842d045-4a00-445d-a41d-7f0b3677b6c5\" (UID: \"e842d045-4a00-445d-a41d-7f0b3677b6c5\") " Mar 09 09:30:01 crc kubenswrapper[4792]: I0309 09:30:01.198757 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e842d045-4a00-445d-a41d-7f0b3677b6c5-combined-ca-bundle\") pod \"e842d045-4a00-445d-a41d-7f0b3677b6c5\" (UID: \"e842d045-4a00-445d-a41d-7f0b3677b6c5\") " Mar 09 09:30:01 crc kubenswrapper[4792]: I0309 09:30:01.198840 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e842d045-4a00-445d-a41d-7f0b3677b6c5-scripts\") pod \"e842d045-4a00-445d-a41d-7f0b3677b6c5\" (UID: \"e842d045-4a00-445d-a41d-7f0b3677b6c5\") " Mar 09 09:30:01 crc kubenswrapper[4792]: I0309 09:30:01.200405 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e842d045-4a00-445d-a41d-7f0b3677b6c5-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "e842d045-4a00-445d-a41d-7f0b3677b6c5" (UID: "e842d045-4a00-445d-a41d-7f0b3677b6c5"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:30:01 crc kubenswrapper[4792]: I0309 09:30:01.200620 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e842d045-4a00-445d-a41d-7f0b3677b6c5-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "e842d045-4a00-445d-a41d-7f0b3677b6c5" (UID: "e842d045-4a00-445d-a41d-7f0b3677b6c5"). 
InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:30:01 crc kubenswrapper[4792]: I0309 09:30:01.215631 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e842d045-4a00-445d-a41d-7f0b3677b6c5-scripts" (OuterVolumeSpecName: "scripts") pod "e842d045-4a00-445d-a41d-7f0b3677b6c5" (UID: "e842d045-4a00-445d-a41d-7f0b3677b6c5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:30:01 crc kubenswrapper[4792]: I0309 09:30:01.216308 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e842d045-4a00-445d-a41d-7f0b3677b6c5-kube-api-access-2wvr8" (OuterVolumeSpecName: "kube-api-access-2wvr8") pod "e842d045-4a00-445d-a41d-7f0b3677b6c5" (UID: "e842d045-4a00-445d-a41d-7f0b3677b6c5"). InnerVolumeSpecName "kube-api-access-2wvr8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:30:01 crc kubenswrapper[4792]: I0309 09:30:01.227990 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e842d045-4a00-445d-a41d-7f0b3677b6c5-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "e842d045-4a00-445d-a41d-7f0b3677b6c5" (UID: "e842d045-4a00-445d-a41d-7f0b3677b6c5"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:30:01 crc kubenswrapper[4792]: W0309 09:30:01.242275 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9388ba35_9a13_455c_b8d3_1e19d6ed7c94.slice/crio-594825d11acbb8244abe2aa6cfbb28feb52c8b9d949b382561428b388e7cc823 WatchSource:0}: Error finding container 594825d11acbb8244abe2aa6cfbb28feb52c8b9d949b382561428b388e7cc823: Status 404 returned error can't find the container with id 594825d11acbb8244abe2aa6cfbb28feb52c8b9d949b382561428b388e7cc823 Mar 09 09:30:01 crc kubenswrapper[4792]: I0309 09:30:01.244830 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29550810-2mq75"] Mar 09 09:30:01 crc kubenswrapper[4792]: I0309 09:30:01.270254 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e842d045-4a00-445d-a41d-7f0b3677b6c5-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "e842d045-4a00-445d-a41d-7f0b3677b6c5" (UID: "e842d045-4a00-445d-a41d-7f0b3677b6c5"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:30:01 crc kubenswrapper[4792]: I0309 09:30:01.300799 4792 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e842d045-4a00-445d-a41d-7f0b3677b6c5-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 09 09:30:01 crc kubenswrapper[4792]: I0309 09:30:01.300863 4792 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e842d045-4a00-445d-a41d-7f0b3677b6c5-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 09 09:30:01 crc kubenswrapper[4792]: I0309 09:30:01.300872 4792 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e842d045-4a00-445d-a41d-7f0b3677b6c5-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 09 09:30:01 crc kubenswrapper[4792]: I0309 09:30:01.300880 4792 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e842d045-4a00-445d-a41d-7f0b3677b6c5-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 09:30:01 crc kubenswrapper[4792]: I0309 09:30:01.300889 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2wvr8\" (UniqueName: \"kubernetes.io/projected/e842d045-4a00-445d-a41d-7f0b3677b6c5-kube-api-access-2wvr8\") on node \"crc\" DevicePath \"\"" Mar 09 09:30:01 crc kubenswrapper[4792]: I0309 09:30:01.300902 4792 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e842d045-4a00-445d-a41d-7f0b3677b6c5-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 09 09:30:01 crc kubenswrapper[4792]: I0309 09:30:01.313494 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e842d045-4a00-445d-a41d-7f0b3677b6c5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e842d045-4a00-445d-a41d-7f0b3677b6c5" (UID: 
"e842d045-4a00-445d-a41d-7f0b3677b6c5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:30:01 crc kubenswrapper[4792]: I0309 09:30:01.323516 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e842d045-4a00-445d-a41d-7f0b3677b6c5-config-data" (OuterVolumeSpecName: "config-data") pod "e842d045-4a00-445d-a41d-7f0b3677b6c5" (UID: "e842d045-4a00-445d-a41d-7f0b3677b6c5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:30:01 crc kubenswrapper[4792]: I0309 09:30:01.402638 4792 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e842d045-4a00-445d-a41d-7f0b3677b6c5-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 09:30:01 crc kubenswrapper[4792]: I0309 09:30:01.402857 4792 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e842d045-4a00-445d-a41d-7f0b3677b6c5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 09:30:01 crc kubenswrapper[4792]: I0309 09:30:01.678585 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ede6e819-d6bc-48e7-9fcd-4534b432fab5" path="/var/lib/kubelet/pods/ede6e819-d6bc-48e7-9fcd-4534b432fab5/volumes" Mar 09 09:30:01 crc kubenswrapper[4792]: I0309 09:30:01.846334 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 09 09:30:01 crc kubenswrapper[4792]: I0309 09:30:01.846681 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e842d045-4a00-445d-a41d-7f0b3677b6c5","Type":"ContainerDied","Data":"4311c458f67108d32b0d3523ed6bc0b6203e3c71f44468aaae5adeb5efc47a79"} Mar 09 09:30:01 crc kubenswrapper[4792]: I0309 09:30:01.846739 4792 scope.go:117] "RemoveContainer" containerID="160871ec463dd4d80a7778db65efa1677bb77c88796655b2e1ab5f78918bc0ef" Mar 09 09:30:01 crc kubenswrapper[4792]: I0309 09:30:01.848140 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29550810-2mq75" event={"ID":"9388ba35-9a13-455c-b8d3-1e19d6ed7c94","Type":"ContainerStarted","Data":"1e3679387ae30ae01d8933c638bb854d979e89f76fd38bd91dec3c98af56717a"} Mar 09 09:30:01 crc kubenswrapper[4792]: I0309 09:30:01.848185 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29550810-2mq75" event={"ID":"9388ba35-9a13-455c-b8d3-1e19d6ed7c94","Type":"ContainerStarted","Data":"594825d11acbb8244abe2aa6cfbb28feb52c8b9d949b382561428b388e7cc823"} Mar 09 09:30:01 crc kubenswrapper[4792]: I0309 09:30:01.850879 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550810-8zk6b" event={"ID":"ac96eb08-404b-4da1-aecd-ae86494f9165","Type":"ContainerStarted","Data":"c1a70ac375e419623efb3376e8f4956bfe18221488d04a0b0fbb4f10a2d1ddc6"} Mar 09 09:30:01 crc kubenswrapper[4792]: I0309 09:30:01.891304 4792 scope.go:117] "RemoveContainer" containerID="10021775b2e3e8194b8c36d5fe8a1f67fb053be88f7375784301ddd1d8fdba6d" Mar 09 09:30:01 crc kubenswrapper[4792]: I0309 09:30:01.908704 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 09 09:30:01 crc kubenswrapper[4792]: I0309 09:30:01.912539 4792 scope.go:117] "RemoveContainer" 
containerID="8bb967386b76c873886b3e36968d9a80bf24314761c470a72d43454be3de521c" Mar 09 09:30:01 crc kubenswrapper[4792]: I0309 09:30:01.921268 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 09 09:30:01 crc kubenswrapper[4792]: I0309 09:30:01.942198 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 09 09:30:01 crc kubenswrapper[4792]: I0309 09:30:01.942678 4792 scope.go:117] "RemoveContainer" containerID="00fce43351ae1ae9c264222af34886f136ff1ce1456c169e0309cf91d9c1f480" Mar 09 09:30:01 crc kubenswrapper[4792]: E0309 09:30:01.942752 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e842d045-4a00-445d-a41d-7f0b3677b6c5" containerName="sg-core" Mar 09 09:30:01 crc kubenswrapper[4792]: I0309 09:30:01.942773 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="e842d045-4a00-445d-a41d-7f0b3677b6c5" containerName="sg-core" Mar 09 09:30:01 crc kubenswrapper[4792]: E0309 09:30:01.942790 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e842d045-4a00-445d-a41d-7f0b3677b6c5" containerName="ceilometer-notification-agent" Mar 09 09:30:01 crc kubenswrapper[4792]: I0309 09:30:01.942798 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="e842d045-4a00-445d-a41d-7f0b3677b6c5" containerName="ceilometer-notification-agent" Mar 09 09:30:01 crc kubenswrapper[4792]: E0309 09:30:01.942809 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e842d045-4a00-445d-a41d-7f0b3677b6c5" containerName="proxy-httpd" Mar 09 09:30:01 crc kubenswrapper[4792]: I0309 09:30:01.942817 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="e842d045-4a00-445d-a41d-7f0b3677b6c5" containerName="proxy-httpd" Mar 09 09:30:01 crc kubenswrapper[4792]: E0309 09:30:01.942860 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e842d045-4a00-445d-a41d-7f0b3677b6c5" containerName="ceilometer-central-agent" Mar 09 09:30:01 crc 
kubenswrapper[4792]: I0309 09:30:01.942883 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="e842d045-4a00-445d-a41d-7f0b3677b6c5" containerName="ceilometer-central-agent" Mar 09 09:30:01 crc kubenswrapper[4792]: I0309 09:30:01.943078 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="e842d045-4a00-445d-a41d-7f0b3677b6c5" containerName="sg-core" Mar 09 09:30:01 crc kubenswrapper[4792]: I0309 09:30:01.943097 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="e842d045-4a00-445d-a41d-7f0b3677b6c5" containerName="proxy-httpd" Mar 09 09:30:01 crc kubenswrapper[4792]: I0309 09:30:01.943110 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="e842d045-4a00-445d-a41d-7f0b3677b6c5" containerName="ceilometer-central-agent" Mar 09 09:30:01 crc kubenswrapper[4792]: I0309 09:30:01.943123 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="e842d045-4a00-445d-a41d-7f0b3677b6c5" containerName="ceilometer-notification-agent" Mar 09 09:30:01 crc kubenswrapper[4792]: I0309 09:30:01.944947 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 09 09:30:01 crc kubenswrapper[4792]: I0309 09:30:01.950638 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 09 09:30:01 crc kubenswrapper[4792]: I0309 09:30:01.951016 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 09 09:30:01 crc kubenswrapper[4792]: I0309 09:30:01.951474 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 09 09:30:01 crc kubenswrapper[4792]: I0309 09:30:01.951661 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 09 09:30:02 crc kubenswrapper[4792]: I0309 09:30:02.016396 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gqcqb\" (UniqueName: \"kubernetes.io/projected/50893660-98fd-4a26-a00a-2fcc6a2c3e51-kube-api-access-gqcqb\") pod \"ceilometer-0\" (UID: \"50893660-98fd-4a26-a00a-2fcc6a2c3e51\") " pod="openstack/ceilometer-0" Mar 09 09:30:02 crc kubenswrapper[4792]: I0309 09:30:02.016576 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50893660-98fd-4a26-a00a-2fcc6a2c3e51-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"50893660-98fd-4a26-a00a-2fcc6a2c3e51\") " pod="openstack/ceilometer-0" Mar 09 09:30:02 crc kubenswrapper[4792]: I0309 09:30:02.016706 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/50893660-98fd-4a26-a00a-2fcc6a2c3e51-scripts\") pod \"ceilometer-0\" (UID: \"50893660-98fd-4a26-a00a-2fcc6a2c3e51\") " pod="openstack/ceilometer-0" Mar 09 09:30:02 crc kubenswrapper[4792]: I0309 09:30:02.016727 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/50893660-98fd-4a26-a00a-2fcc6a2c3e51-run-httpd\") pod \"ceilometer-0\" (UID: \"50893660-98fd-4a26-a00a-2fcc6a2c3e51\") " pod="openstack/ceilometer-0" Mar 09 09:30:02 crc kubenswrapper[4792]: I0309 09:30:02.016771 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/50893660-98fd-4a26-a00a-2fcc6a2c3e51-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"50893660-98fd-4a26-a00a-2fcc6a2c3e51\") " pod="openstack/ceilometer-0" Mar 09 09:30:02 crc kubenswrapper[4792]: I0309 09:30:02.016849 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/50893660-98fd-4a26-a00a-2fcc6a2c3e51-log-httpd\") pod \"ceilometer-0\" (UID: \"50893660-98fd-4a26-a00a-2fcc6a2c3e51\") " pod="openstack/ceilometer-0" Mar 09 09:30:02 crc kubenswrapper[4792]: I0309 09:30:02.016975 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/50893660-98fd-4a26-a00a-2fcc6a2c3e51-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"50893660-98fd-4a26-a00a-2fcc6a2c3e51\") " pod="openstack/ceilometer-0" Mar 09 09:30:02 crc kubenswrapper[4792]: I0309 09:30:02.017050 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/50893660-98fd-4a26-a00a-2fcc6a2c3e51-config-data\") pod \"ceilometer-0\" (UID: \"50893660-98fd-4a26-a00a-2fcc6a2c3e51\") " pod="openstack/ceilometer-0" Mar 09 09:30:02 crc kubenswrapper[4792]: I0309 09:30:02.119302 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50893660-98fd-4a26-a00a-2fcc6a2c3e51-combined-ca-bundle\") pod \"ceilometer-0\" (UID: 
\"50893660-98fd-4a26-a00a-2fcc6a2c3e51\") " pod="openstack/ceilometer-0" Mar 09 09:30:02 crc kubenswrapper[4792]: I0309 09:30:02.119581 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/50893660-98fd-4a26-a00a-2fcc6a2c3e51-scripts\") pod \"ceilometer-0\" (UID: \"50893660-98fd-4a26-a00a-2fcc6a2c3e51\") " pod="openstack/ceilometer-0" Mar 09 09:30:02 crc kubenswrapper[4792]: I0309 09:30:02.119745 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/50893660-98fd-4a26-a00a-2fcc6a2c3e51-run-httpd\") pod \"ceilometer-0\" (UID: \"50893660-98fd-4a26-a00a-2fcc6a2c3e51\") " pod="openstack/ceilometer-0" Mar 09 09:30:02 crc kubenswrapper[4792]: I0309 09:30:02.119890 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/50893660-98fd-4a26-a00a-2fcc6a2c3e51-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"50893660-98fd-4a26-a00a-2fcc6a2c3e51\") " pod="openstack/ceilometer-0" Mar 09 09:30:02 crc kubenswrapper[4792]: I0309 09:30:02.120024 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/50893660-98fd-4a26-a00a-2fcc6a2c3e51-log-httpd\") pod \"ceilometer-0\" (UID: \"50893660-98fd-4a26-a00a-2fcc6a2c3e51\") " pod="openstack/ceilometer-0" Mar 09 09:30:02 crc kubenswrapper[4792]: I0309 09:30:02.120228 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/50893660-98fd-4a26-a00a-2fcc6a2c3e51-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"50893660-98fd-4a26-a00a-2fcc6a2c3e51\") " pod="openstack/ceilometer-0" Mar 09 09:30:02 crc kubenswrapper[4792]: I0309 09:30:02.120340 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/50893660-98fd-4a26-a00a-2fcc6a2c3e51-config-data\") pod \"ceilometer-0\" (UID: \"50893660-98fd-4a26-a00a-2fcc6a2c3e51\") " pod="openstack/ceilometer-0" Mar 09 09:30:02 crc kubenswrapper[4792]: I0309 09:30:02.120475 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gqcqb\" (UniqueName: \"kubernetes.io/projected/50893660-98fd-4a26-a00a-2fcc6a2c3e51-kube-api-access-gqcqb\") pod \"ceilometer-0\" (UID: \"50893660-98fd-4a26-a00a-2fcc6a2c3e51\") " pod="openstack/ceilometer-0" Mar 09 09:30:02 crc kubenswrapper[4792]: I0309 09:30:02.121702 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/50893660-98fd-4a26-a00a-2fcc6a2c3e51-log-httpd\") pod \"ceilometer-0\" (UID: \"50893660-98fd-4a26-a00a-2fcc6a2c3e51\") " pod="openstack/ceilometer-0" Mar 09 09:30:02 crc kubenswrapper[4792]: I0309 09:30:02.122092 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/50893660-98fd-4a26-a00a-2fcc6a2c3e51-run-httpd\") pod \"ceilometer-0\" (UID: \"50893660-98fd-4a26-a00a-2fcc6a2c3e51\") " pod="openstack/ceilometer-0" Mar 09 09:30:02 crc kubenswrapper[4792]: I0309 09:30:02.127876 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/50893660-98fd-4a26-a00a-2fcc6a2c3e51-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"50893660-98fd-4a26-a00a-2fcc6a2c3e51\") " pod="openstack/ceilometer-0" Mar 09 09:30:02 crc kubenswrapper[4792]: I0309 09:30:02.128130 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50893660-98fd-4a26-a00a-2fcc6a2c3e51-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"50893660-98fd-4a26-a00a-2fcc6a2c3e51\") " pod="openstack/ceilometer-0" Mar 09 09:30:02 crc kubenswrapper[4792]: I0309 
09:30:02.128384 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/50893660-98fd-4a26-a00a-2fcc6a2c3e51-config-data\") pod \"ceilometer-0\" (UID: \"50893660-98fd-4a26-a00a-2fcc6a2c3e51\") " pod="openstack/ceilometer-0" Mar 09 09:30:02 crc kubenswrapper[4792]: I0309 09:30:02.138035 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/50893660-98fd-4a26-a00a-2fcc6a2c3e51-scripts\") pod \"ceilometer-0\" (UID: \"50893660-98fd-4a26-a00a-2fcc6a2c3e51\") " pod="openstack/ceilometer-0" Mar 09 09:30:02 crc kubenswrapper[4792]: I0309 09:30:02.138658 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gqcqb\" (UniqueName: \"kubernetes.io/projected/50893660-98fd-4a26-a00a-2fcc6a2c3e51-kube-api-access-gqcqb\") pod \"ceilometer-0\" (UID: \"50893660-98fd-4a26-a00a-2fcc6a2c3e51\") " pod="openstack/ceilometer-0" Mar 09 09:30:02 crc kubenswrapper[4792]: I0309 09:30:02.141818 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/50893660-98fd-4a26-a00a-2fcc6a2c3e51-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"50893660-98fd-4a26-a00a-2fcc6a2c3e51\") " pod="openstack/ceilometer-0" Mar 09 09:30:02 crc kubenswrapper[4792]: I0309 09:30:02.264301 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 09 09:30:02 crc kubenswrapper[4792]: I0309 09:30:02.852779 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 09 09:30:02 crc kubenswrapper[4792]: I0309 09:30:02.865994 4792 generic.go:334] "Generic (PLEG): container finished" podID="9388ba35-9a13-455c-b8d3-1e19d6ed7c94" containerID="1e3679387ae30ae01d8933c638bb854d979e89f76fd38bd91dec3c98af56717a" exitCode=0 Mar 09 09:30:02 crc kubenswrapper[4792]: I0309 09:30:02.866482 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29550810-2mq75" event={"ID":"9388ba35-9a13-455c-b8d3-1e19d6ed7c94","Type":"ContainerDied","Data":"1e3679387ae30ae01d8933c638bb854d979e89f76fd38bd91dec3c98af56717a"} Mar 09 09:30:03 crc kubenswrapper[4792]: I0309 09:30:03.216910 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29550810-2mq75" Mar 09 09:30:03 crc kubenswrapper[4792]: I0309 09:30:03.356321 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8465r\" (UniqueName: \"kubernetes.io/projected/9388ba35-9a13-455c-b8d3-1e19d6ed7c94-kube-api-access-8465r\") pod \"9388ba35-9a13-455c-b8d3-1e19d6ed7c94\" (UID: \"9388ba35-9a13-455c-b8d3-1e19d6ed7c94\") " Mar 09 09:30:03 crc kubenswrapper[4792]: I0309 09:30:03.356808 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9388ba35-9a13-455c-b8d3-1e19d6ed7c94-config-volume\") pod \"9388ba35-9a13-455c-b8d3-1e19d6ed7c94\" (UID: \"9388ba35-9a13-455c-b8d3-1e19d6ed7c94\") " Mar 09 09:30:03 crc kubenswrapper[4792]: I0309 09:30:03.356859 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9388ba35-9a13-455c-b8d3-1e19d6ed7c94-secret-volume\") pod 
\"9388ba35-9a13-455c-b8d3-1e19d6ed7c94\" (UID: \"9388ba35-9a13-455c-b8d3-1e19d6ed7c94\") " Mar 09 09:30:03 crc kubenswrapper[4792]: I0309 09:30:03.360357 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9388ba35-9a13-455c-b8d3-1e19d6ed7c94-config-volume" (OuterVolumeSpecName: "config-volume") pod "9388ba35-9a13-455c-b8d3-1e19d6ed7c94" (UID: "9388ba35-9a13-455c-b8d3-1e19d6ed7c94"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:30:03 crc kubenswrapper[4792]: I0309 09:30:03.367035 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9388ba35-9a13-455c-b8d3-1e19d6ed7c94-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "9388ba35-9a13-455c-b8d3-1e19d6ed7c94" (UID: "9388ba35-9a13-455c-b8d3-1e19d6ed7c94"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:30:03 crc kubenswrapper[4792]: I0309 09:30:03.380548 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9388ba35-9a13-455c-b8d3-1e19d6ed7c94-kube-api-access-8465r" (OuterVolumeSpecName: "kube-api-access-8465r") pod "9388ba35-9a13-455c-b8d3-1e19d6ed7c94" (UID: "9388ba35-9a13-455c-b8d3-1e19d6ed7c94"). InnerVolumeSpecName "kube-api-access-8465r". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:30:03 crc kubenswrapper[4792]: I0309 09:30:03.459584 4792 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9388ba35-9a13-455c-b8d3-1e19d6ed7c94-config-volume\") on node \"crc\" DevicePath \"\"" Mar 09 09:30:03 crc kubenswrapper[4792]: I0309 09:30:03.459642 4792 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9388ba35-9a13-455c-b8d3-1e19d6ed7c94-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 09 09:30:03 crc kubenswrapper[4792]: I0309 09:30:03.459657 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8465r\" (UniqueName: \"kubernetes.io/projected/9388ba35-9a13-455c-b8d3-1e19d6ed7c94-kube-api-access-8465r\") on node \"crc\" DevicePath \"\"" Mar 09 09:30:03 crc kubenswrapper[4792]: I0309 09:30:03.672684 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e842d045-4a00-445d-a41d-7f0b3677b6c5" path="/var/lib/kubelet/pods/e842d045-4a00-445d-a41d-7f0b3677b6c5/volumes" Mar 09 09:30:03 crc kubenswrapper[4792]: I0309 09:30:03.875198 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550810-8zk6b" event={"ID":"ac96eb08-404b-4da1-aecd-ae86494f9165","Type":"ContainerStarted","Data":"e71af4a647d960250e360fd6ab2fe5dda9c3ec25e26b9e22634df30d9890218a"} Mar 09 09:30:03 crc kubenswrapper[4792]: I0309 09:30:03.877884 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29550810-2mq75" Mar 09 09:30:03 crc kubenswrapper[4792]: I0309 09:30:03.877885 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29550810-2mq75" event={"ID":"9388ba35-9a13-455c-b8d3-1e19d6ed7c94","Type":"ContainerDied","Data":"594825d11acbb8244abe2aa6cfbb28feb52c8b9d949b382561428b388e7cc823"} Mar 09 09:30:03 crc kubenswrapper[4792]: I0309 09:30:03.877975 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="594825d11acbb8244abe2aa6cfbb28feb52c8b9d949b382561428b388e7cc823" Mar 09 09:30:03 crc kubenswrapper[4792]: I0309 09:30:03.880027 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"50893660-98fd-4a26-a00a-2fcc6a2c3e51","Type":"ContainerStarted","Data":"6e74c5c0aec7e82f060e96a145c34bc8be6b4ab436a35ce663ceb75d398e1ebf"} Mar 09 09:30:03 crc kubenswrapper[4792]: I0309 09:30:03.880082 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"50893660-98fd-4a26-a00a-2fcc6a2c3e51","Type":"ContainerStarted","Data":"7edefcb855eb5be8f6395680293370f33f797f2898f6b1e906abec4791f90590"} Mar 09 09:30:03 crc kubenswrapper[4792]: I0309 09:30:03.908380 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29550810-8zk6b" podStartSLOduration=1.603786254 podStartE2EDuration="3.908363411s" podCreationTimestamp="2026-03-09 09:30:00 +0000 UTC" firstStartedPulling="2026-03-09 09:30:00.982817787 +0000 UTC m=+1366.013018539" lastFinishedPulling="2026-03-09 09:30:03.287394944 +0000 UTC m=+1368.317595696" observedRunningTime="2026-03-09 09:30:03.90092662 +0000 UTC m=+1368.931127372" watchObservedRunningTime="2026-03-09 09:30:03.908363411 +0000 UTC m=+1368.938564163" Mar 09 09:30:04 crc kubenswrapper[4792]: I0309 09:30:04.121769 4792 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 09 09:30:04 crc kubenswrapper[4792]: I0309 09:30:04.122393 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 09 09:30:04 crc kubenswrapper[4792]: I0309 09:30:04.135473 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 09 09:30:04 crc kubenswrapper[4792]: I0309 09:30:04.893110 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"50893660-98fd-4a26-a00a-2fcc6a2c3e51","Type":"ContainerStarted","Data":"29cce57973597b1af8bf3d7c6a08ebd6edea7c673c7333774ec96f629fec0da9"} Mar 09 09:30:04 crc kubenswrapper[4792]: I0309 09:30:04.894647 4792 generic.go:334] "Generic (PLEG): container finished" podID="ac96eb08-404b-4da1-aecd-ae86494f9165" containerID="e71af4a647d960250e360fd6ab2fe5dda9c3ec25e26b9e22634df30d9890218a" exitCode=0 Mar 09 09:30:04 crc kubenswrapper[4792]: I0309 09:30:04.894707 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550810-8zk6b" event={"ID":"ac96eb08-404b-4da1-aecd-ae86494f9165","Type":"ContainerDied","Data":"e71af4a647d960250e360fd6ab2fe5dda9c3ec25e26b9e22634df30d9890218a"} Mar 09 09:30:04 crc kubenswrapper[4792]: I0309 09:30:04.899601 4792 generic.go:334] "Generic (PLEG): container finished" podID="f1399ead-26d7-4512-bbf0-f004f5b95c70" containerID="5b97c962d6c8dfe919ecfc90faca3416e3dd5be66b006567cd5445fa68fc9ed2" exitCode=0 Mar 09 09:30:04 crc kubenswrapper[4792]: I0309 09:30:04.900768 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-dn7pc" event={"ID":"f1399ead-26d7-4512-bbf0-f004f5b95c70","Type":"ContainerDied","Data":"5b97c962d6c8dfe919ecfc90faca3416e3dd5be66b006567cd5445fa68fc9ed2"} Mar 09 09:30:04 crc kubenswrapper[4792]: I0309 09:30:04.910863 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/nova-metadata-0"
Mar 09 09:30:04 crc kubenswrapper[4792]: I0309 09:30:04.995505 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Mar 09 09:30:05 crc kubenswrapper[4792]: I0309 09:30:05.006556 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Mar 09 09:30:05 crc kubenswrapper[4792]: I0309 09:30:05.939759 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"50893660-98fd-4a26-a00a-2fcc6a2c3e51","Type":"ContainerStarted","Data":"3fcd2b98b5ede8c002bfce6e124d3c62e4a6453f75f4bc72768961e341435762"}
Mar 09 09:30:06 crc kubenswrapper[4792]: I0309 09:30:06.498133 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-dn7pc"
Mar 09 09:30:06 crc kubenswrapper[4792]: I0309 09:30:06.506307 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550810-8zk6b"
Mar 09 09:30:06 crc kubenswrapper[4792]: I0309 09:30:06.516869 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="de065134-a827-48ff-8694-948775349bd8" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.190:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 09 09:30:06 crc kubenswrapper[4792]: I0309 09:30:06.517659 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="de065134-a827-48ff-8694-948775349bd8" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.190:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 09 09:30:06 crc kubenswrapper[4792]: I0309 09:30:06.648717 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f1399ead-26d7-4512-bbf0-f004f5b95c70-scripts\") pod \"f1399ead-26d7-4512-bbf0-f004f5b95c70\" (UID: \"f1399ead-26d7-4512-bbf0-f004f5b95c70\") "
Mar 09 09:30:06 crc kubenswrapper[4792]: I0309 09:30:06.648771 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ks65l\" (UniqueName: \"kubernetes.io/projected/ac96eb08-404b-4da1-aecd-ae86494f9165-kube-api-access-ks65l\") pod \"ac96eb08-404b-4da1-aecd-ae86494f9165\" (UID: \"ac96eb08-404b-4da1-aecd-ae86494f9165\") "
Mar 09 09:30:06 crc kubenswrapper[4792]: I0309 09:30:06.648800 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1399ead-26d7-4512-bbf0-f004f5b95c70-combined-ca-bundle\") pod \"f1399ead-26d7-4512-bbf0-f004f5b95c70\" (UID: \"f1399ead-26d7-4512-bbf0-f004f5b95c70\") "
Mar 09 09:30:06 crc kubenswrapper[4792]: I0309 09:30:06.648921 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-trcq5\" (UniqueName: \"kubernetes.io/projected/f1399ead-26d7-4512-bbf0-f004f5b95c70-kube-api-access-trcq5\") pod \"f1399ead-26d7-4512-bbf0-f004f5b95c70\" (UID: \"f1399ead-26d7-4512-bbf0-f004f5b95c70\") "
Mar 09 09:30:06 crc kubenswrapper[4792]: I0309 09:30:06.648940 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f1399ead-26d7-4512-bbf0-f004f5b95c70-config-data\") pod \"f1399ead-26d7-4512-bbf0-f004f5b95c70\" (UID: \"f1399ead-26d7-4512-bbf0-f004f5b95c70\") "
Mar 09 09:30:06 crc kubenswrapper[4792]: I0309 09:30:06.661325 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac96eb08-404b-4da1-aecd-ae86494f9165-kube-api-access-ks65l" (OuterVolumeSpecName: "kube-api-access-ks65l") pod "ac96eb08-404b-4da1-aecd-ae86494f9165" (UID: "ac96eb08-404b-4da1-aecd-ae86494f9165"). InnerVolumeSpecName "kube-api-access-ks65l". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:30:06 crc kubenswrapper[4792]: I0309 09:30:06.663456 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1399ead-26d7-4512-bbf0-f004f5b95c70-scripts" (OuterVolumeSpecName: "scripts") pod "f1399ead-26d7-4512-bbf0-f004f5b95c70" (UID: "f1399ead-26d7-4512-bbf0-f004f5b95c70"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:30:06 crc kubenswrapper[4792]: I0309 09:30:06.670176 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f1399ead-26d7-4512-bbf0-f004f5b95c70-kube-api-access-trcq5" (OuterVolumeSpecName: "kube-api-access-trcq5") pod "f1399ead-26d7-4512-bbf0-f004f5b95c70" (UID: "f1399ead-26d7-4512-bbf0-f004f5b95c70"). InnerVolumeSpecName "kube-api-access-trcq5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:30:06 crc kubenswrapper[4792]: I0309 09:30:06.707654 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1399ead-26d7-4512-bbf0-f004f5b95c70-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f1399ead-26d7-4512-bbf0-f004f5b95c70" (UID: "f1399ead-26d7-4512-bbf0-f004f5b95c70"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:30:06 crc kubenswrapper[4792]: I0309 09:30:06.722275 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1399ead-26d7-4512-bbf0-f004f5b95c70-config-data" (OuterVolumeSpecName: "config-data") pod "f1399ead-26d7-4512-bbf0-f004f5b95c70" (UID: "f1399ead-26d7-4512-bbf0-f004f5b95c70"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:30:06 crc kubenswrapper[4792]: I0309 09:30:06.751372 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-trcq5\" (UniqueName: \"kubernetes.io/projected/f1399ead-26d7-4512-bbf0-f004f5b95c70-kube-api-access-trcq5\") on node \"crc\" DevicePath \"\""
Mar 09 09:30:06 crc kubenswrapper[4792]: I0309 09:30:06.751407 4792 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f1399ead-26d7-4512-bbf0-f004f5b95c70-config-data\") on node \"crc\" DevicePath \"\""
Mar 09 09:30:06 crc kubenswrapper[4792]: I0309 09:30:06.751418 4792 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f1399ead-26d7-4512-bbf0-f004f5b95c70-scripts\") on node \"crc\" DevicePath \"\""
Mar 09 09:30:06 crc kubenswrapper[4792]: I0309 09:30:06.751429 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ks65l\" (UniqueName: \"kubernetes.io/projected/ac96eb08-404b-4da1-aecd-ae86494f9165-kube-api-access-ks65l\") on node \"crc\" DevicePath \"\""
Mar 09 09:30:06 crc kubenswrapper[4792]: I0309 09:30:06.751440 4792 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1399ead-26d7-4512-bbf0-f004f5b95c70-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 09 09:30:06 crc kubenswrapper[4792]: I0309 09:30:06.960841 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"50893660-98fd-4a26-a00a-2fcc6a2c3e51","Type":"ContainerStarted","Data":"8763e2c49f7c42e53fd1e133698ee26e3b5d1639228093ea9e69c39d4586bea9"}
Mar 09 09:30:06 crc kubenswrapper[4792]: I0309 09:30:06.961251 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Mar 09 09:30:06 crc kubenswrapper[4792]: I0309 09:30:06.963682 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550810-8zk6b" event={"ID":"ac96eb08-404b-4da1-aecd-ae86494f9165","Type":"ContainerDied","Data":"c1a70ac375e419623efb3376e8f4956bfe18221488d04a0b0fbb4f10a2d1ddc6"}
Mar 09 09:30:06 crc kubenswrapper[4792]: I0309 09:30:06.963720 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c1a70ac375e419623efb3376e8f4956bfe18221488d04a0b0fbb4f10a2d1ddc6"
Mar 09 09:30:06 crc kubenswrapper[4792]: I0309 09:30:06.963774 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550810-8zk6b"
Mar 09 09:30:06 crc kubenswrapper[4792]: I0309 09:30:06.980318 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-dn7pc"
Mar 09 09:30:06 crc kubenswrapper[4792]: I0309 09:30:06.982201 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-dn7pc" event={"ID":"f1399ead-26d7-4512-bbf0-f004f5b95c70","Type":"ContainerDied","Data":"676fa51b01426d47e714dbc31c0d7bfb4d2383e68a42ed73e215263ead1eff49"}
Mar 09 09:30:06 crc kubenswrapper[4792]: I0309 09:30:06.982238 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="676fa51b01426d47e714dbc31c0d7bfb4d2383e68a42ed73e215263ead1eff49"
Mar 09 09:30:07 crc kubenswrapper[4792]: I0309 09:30:07.040293 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550804-98tkj"]
Mar 09 09:30:07 crc kubenswrapper[4792]: I0309 09:30:07.050458 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550804-98tkj"]
Mar 09 09:30:07 crc kubenswrapper[4792]: I0309 09:30:07.055424 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.179960869 podStartE2EDuration="6.055401272s" podCreationTimestamp="2026-03-09 09:30:01 +0000 UTC" firstStartedPulling="2026-03-09 09:30:02.858272632 +0000 UTC m=+1367.888473384" lastFinishedPulling="2026-03-09 09:30:06.733713035 +0000 UTC m=+1371.763913787" observedRunningTime="2026-03-09 09:30:07.032300085 +0000 UTC m=+1372.062500837" watchObservedRunningTime="2026-03-09 09:30:07.055401272 +0000 UTC m=+1372.085602024"
Mar 09 09:30:07 crc kubenswrapper[4792]: I0309 09:30:07.169155 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Mar 09 09:30:07 crc kubenswrapper[4792]: I0309 09:30:07.169572 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="de065134-a827-48ff-8694-948775349bd8" containerName="nova-api-api" containerID="cri-o://a65a1654defca46995034eef4dd636fc694cd512c099f28c7ce1b1e420c1ad83" gracePeriod=30
Mar 09 09:30:07 crc kubenswrapper[4792]: I0309 09:30:07.169442 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="de065134-a827-48ff-8694-948775349bd8" containerName="nova-api-log" containerID="cri-o://b373b9754cab784bdd41012fbdc1681701f43474ce2de22b9d6101ce2a9be714" gracePeriod=30
Mar 09 09:30:07 crc kubenswrapper[4792]: I0309 09:30:07.182590 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 09 09:30:07 crc kubenswrapper[4792]: I0309 09:30:07.183485 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="80a8330d-39e6-4b5c-a696-4a1f8e1df1ab" containerName="nova-scheduler-scheduler" containerID="cri-o://fab14509b4d8b0e7ea117d2f55a8e109aa2f1491b049626a48b4e562374876d5" gracePeriod=30
Mar 09 09:30:07 crc kubenswrapper[4792]: I0309 09:30:07.296110 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Mar 09 09:30:07 crc kubenswrapper[4792]: I0309 09:30:07.672954 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f1eec157-3f1a-4ca8-afb9-7dc14b7bc433" path="/var/lib/kubelet/pods/f1eec157-3f1a-4ca8-afb9-7dc14b7bc433/volumes"
Mar 09 09:30:07 crc kubenswrapper[4792]: I0309 09:30:07.989783 4792 generic.go:334] "Generic (PLEG): container finished" podID="de065134-a827-48ff-8694-948775349bd8" containerID="b373b9754cab784bdd41012fbdc1681701f43474ce2de22b9d6101ce2a9be714" exitCode=143
Mar 09 09:30:07 crc kubenswrapper[4792]: I0309 09:30:07.990531 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"de065134-a827-48ff-8694-948775349bd8","Type":"ContainerDied","Data":"b373b9754cab784bdd41012fbdc1681701f43474ce2de22b9d6101ce2a9be714"}
Mar 09 09:30:07 crc kubenswrapper[4792]: I0309 09:30:07.991401 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="a7b96e62-a545-4c9b-946b-b7c6a12ceca6" containerName="nova-metadata-metadata" containerID="cri-o://f24f496e8119c37fe02291e1ea7aea6b584c15ab1e6abbc078d529a59d6d3675" gracePeriod=30
Mar 09 09:30:07 crc kubenswrapper[4792]: I0309 09:30:07.991405 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="a7b96e62-a545-4c9b-946b-b7c6a12ceca6" containerName="nova-metadata-log" containerID="cri-o://b43fee17507f9dc4486139a74d40156c3f8683fad426c7bbca441336900caebd" gracePeriod=30
Mar 09 09:30:09 crc kubenswrapper[4792]: I0309 09:30:09.011932 4792 generic.go:334] "Generic (PLEG): container finished" podID="a7b96e62-a545-4c9b-946b-b7c6a12ceca6" containerID="b43fee17507f9dc4486139a74d40156c3f8683fad426c7bbca441336900caebd" exitCode=143
Mar 09 09:30:09 crc kubenswrapper[4792]: I0309 09:30:09.012053 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a7b96e62-a545-4c9b-946b-b7c6a12ceca6","Type":"ContainerDied","Data":"b43fee17507f9dc4486139a74d40156c3f8683fad426c7bbca441336900caebd"}
Mar 09 09:30:11 crc kubenswrapper[4792]: I0309 09:30:11.135755 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="a7b96e62-a545-4c9b-946b-b7c6a12ceca6" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.186:8775/\": read tcp 10.217.0.2:34786->10.217.0.186:8775: read: connection reset by peer"
Mar 09 09:30:11 crc kubenswrapper[4792]: I0309 09:30:11.135774 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="a7b96e62-a545-4c9b-946b-b7c6a12ceca6" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.186:8775/\": read tcp 10.217.0.2:34798->10.217.0.186:8775: read: connection reset by peer"
Mar 09 09:30:11 crc kubenswrapper[4792]: E0309 09:30:11.596822 4792 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="fab14509b4d8b0e7ea117d2f55a8e109aa2f1491b049626a48b4e562374876d5" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Mar 09 09:30:11 crc kubenswrapper[4792]: E0309 09:30:11.598484 4792 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="fab14509b4d8b0e7ea117d2f55a8e109aa2f1491b049626a48b4e562374876d5" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Mar 09 09:30:11 crc kubenswrapper[4792]: E0309 09:30:11.599637 4792 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="fab14509b4d8b0e7ea117d2f55a8e109aa2f1491b049626a48b4e562374876d5" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Mar 09 09:30:11 crc kubenswrapper[4792]: E0309 09:30:11.599669 4792 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="80a8330d-39e6-4b5c-a696-4a1f8e1df1ab" containerName="nova-scheduler-scheduler"
Mar 09 09:30:11 crc kubenswrapper[4792]: I0309 09:30:11.733491 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 09 09:30:11 crc kubenswrapper[4792]: I0309 09:30:11.868218 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/a7b96e62-a545-4c9b-946b-b7c6a12ceca6-nova-metadata-tls-certs\") pod \"a7b96e62-a545-4c9b-946b-b7c6a12ceca6\" (UID: \"a7b96e62-a545-4c9b-946b-b7c6a12ceca6\") "
Mar 09 09:30:11 crc kubenswrapper[4792]: I0309 09:30:11.868270 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7b96e62-a545-4c9b-946b-b7c6a12ceca6-combined-ca-bundle\") pod \"a7b96e62-a545-4c9b-946b-b7c6a12ceca6\" (UID: \"a7b96e62-a545-4c9b-946b-b7c6a12ceca6\") "
Mar 09 09:30:11 crc kubenswrapper[4792]: I0309 09:30:11.868345 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a7b96e62-a545-4c9b-946b-b7c6a12ceca6-logs\") pod \"a7b96e62-a545-4c9b-946b-b7c6a12ceca6\" (UID: \"a7b96e62-a545-4c9b-946b-b7c6a12ceca6\") "
Mar 09 09:30:11 crc kubenswrapper[4792]: I0309 09:30:11.868402 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zldjt\" (UniqueName: \"kubernetes.io/projected/a7b96e62-a545-4c9b-946b-b7c6a12ceca6-kube-api-access-zldjt\") pod \"a7b96e62-a545-4c9b-946b-b7c6a12ceca6\" (UID: \"a7b96e62-a545-4c9b-946b-b7c6a12ceca6\") "
Mar 09 09:30:11 crc kubenswrapper[4792]: I0309 09:30:11.868461 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7b96e62-a545-4c9b-946b-b7c6a12ceca6-config-data\") pod \"a7b96e62-a545-4c9b-946b-b7c6a12ceca6\" (UID: \"a7b96e62-a545-4c9b-946b-b7c6a12ceca6\") "
Mar 09 09:30:11 crc kubenswrapper[4792]: I0309 09:30:11.869975 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a7b96e62-a545-4c9b-946b-b7c6a12ceca6-logs" (OuterVolumeSpecName: "logs") pod "a7b96e62-a545-4c9b-946b-b7c6a12ceca6" (UID: "a7b96e62-a545-4c9b-946b-b7c6a12ceca6"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 09:30:11 crc kubenswrapper[4792]: I0309 09:30:11.888377 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a7b96e62-a545-4c9b-946b-b7c6a12ceca6-kube-api-access-zldjt" (OuterVolumeSpecName: "kube-api-access-zldjt") pod "a7b96e62-a545-4c9b-946b-b7c6a12ceca6" (UID: "a7b96e62-a545-4c9b-946b-b7c6a12ceca6"). InnerVolumeSpecName "kube-api-access-zldjt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:30:11 crc kubenswrapper[4792]: I0309 09:30:11.938309 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7b96e62-a545-4c9b-946b-b7c6a12ceca6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a7b96e62-a545-4c9b-946b-b7c6a12ceca6" (UID: "a7b96e62-a545-4c9b-946b-b7c6a12ceca6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:30:11 crc kubenswrapper[4792]: I0309 09:30:11.947103 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7b96e62-a545-4c9b-946b-b7c6a12ceca6-config-data" (OuterVolumeSpecName: "config-data") pod "a7b96e62-a545-4c9b-946b-b7c6a12ceca6" (UID: "a7b96e62-a545-4c9b-946b-b7c6a12ceca6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:30:11 crc kubenswrapper[4792]: I0309 09:30:11.970210 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zldjt\" (UniqueName: \"kubernetes.io/projected/a7b96e62-a545-4c9b-946b-b7c6a12ceca6-kube-api-access-zldjt\") on node \"crc\" DevicePath \"\""
Mar 09 09:30:11 crc kubenswrapper[4792]: I0309 09:30:11.970238 4792 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7b96e62-a545-4c9b-946b-b7c6a12ceca6-config-data\") on node \"crc\" DevicePath \"\""
Mar 09 09:30:11 crc kubenswrapper[4792]: I0309 09:30:11.970250 4792 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7b96e62-a545-4c9b-946b-b7c6a12ceca6-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 09 09:30:11 crc kubenswrapper[4792]: I0309 09:30:11.970261 4792 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a7b96e62-a545-4c9b-946b-b7c6a12ceca6-logs\") on node \"crc\" DevicePath \"\""
Mar 09 09:30:11 crc kubenswrapper[4792]: I0309 09:30:11.985932 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7b96e62-a545-4c9b-946b-b7c6a12ceca6-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "a7b96e62-a545-4c9b-946b-b7c6a12ceca6" (UID: "a7b96e62-a545-4c9b-946b-b7c6a12ceca6"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:30:12 crc kubenswrapper[4792]: I0309 09:30:12.044988 4792 generic.go:334] "Generic (PLEG): container finished" podID="80a8330d-39e6-4b5c-a696-4a1f8e1df1ab" containerID="fab14509b4d8b0e7ea117d2f55a8e109aa2f1491b049626a48b4e562374876d5" exitCode=0
Mar 09 09:30:12 crc kubenswrapper[4792]: I0309 09:30:12.045098 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"80a8330d-39e6-4b5c-a696-4a1f8e1df1ab","Type":"ContainerDied","Data":"fab14509b4d8b0e7ea117d2f55a8e109aa2f1491b049626a48b4e562374876d5"}
Mar 09 09:30:12 crc kubenswrapper[4792]: I0309 09:30:12.047190 4792 generic.go:334] "Generic (PLEG): container finished" podID="a7b96e62-a545-4c9b-946b-b7c6a12ceca6" containerID="f24f496e8119c37fe02291e1ea7aea6b584c15ab1e6abbc078d529a59d6d3675" exitCode=0
Mar 09 09:30:12 crc kubenswrapper[4792]: I0309 09:30:12.047232 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a7b96e62-a545-4c9b-946b-b7c6a12ceca6","Type":"ContainerDied","Data":"f24f496e8119c37fe02291e1ea7aea6b584c15ab1e6abbc078d529a59d6d3675"}
Mar 09 09:30:12 crc kubenswrapper[4792]: I0309 09:30:12.047266 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 09 09:30:12 crc kubenswrapper[4792]: I0309 09:30:12.047302 4792 scope.go:117] "RemoveContainer" containerID="f24f496e8119c37fe02291e1ea7aea6b584c15ab1e6abbc078d529a59d6d3675"
Mar 09 09:30:12 crc kubenswrapper[4792]: I0309 09:30:12.047271 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a7b96e62-a545-4c9b-946b-b7c6a12ceca6","Type":"ContainerDied","Data":"2eb63bf6e97175d5a0553d080ba4afda12fa1d8c826c4fbca9c438cc446fdb22"}
Mar 09 09:30:12 crc kubenswrapper[4792]: I0309 09:30:12.072361 4792 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/a7b96e62-a545-4c9b-946b-b7c6a12ceca6-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 09 09:30:12 crc kubenswrapper[4792]: I0309 09:30:12.099123 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Mar 09 09:30:12 crc kubenswrapper[4792]: I0309 09:30:12.109582 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"]
Mar 09 09:30:12 crc kubenswrapper[4792]: I0309 09:30:12.112881 4792 scope.go:117] "RemoveContainer" containerID="b43fee17507f9dc4486139a74d40156c3f8683fad426c7bbca441336900caebd"
Mar 09 09:30:12 crc kubenswrapper[4792]: I0309 09:30:12.135885 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Mar 09 09:30:12 crc kubenswrapper[4792]: E0309 09:30:12.144911 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7b96e62-a545-4c9b-946b-b7c6a12ceca6" containerName="nova-metadata-metadata"
Mar 09 09:30:12 crc kubenswrapper[4792]: I0309 09:30:12.144949 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7b96e62-a545-4c9b-946b-b7c6a12ceca6" containerName="nova-metadata-metadata"
Mar 09 09:30:12 crc kubenswrapper[4792]: E0309 09:30:12.144972 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac96eb08-404b-4da1-aecd-ae86494f9165" containerName="oc"
Mar 09 09:30:12 crc kubenswrapper[4792]: I0309 09:30:12.144979 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac96eb08-404b-4da1-aecd-ae86494f9165" containerName="oc"
Mar 09 09:30:12 crc kubenswrapper[4792]: E0309 09:30:12.144994 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1399ead-26d7-4512-bbf0-f004f5b95c70" containerName="nova-manage"
Mar 09 09:30:12 crc kubenswrapper[4792]: I0309 09:30:12.144999 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1399ead-26d7-4512-bbf0-f004f5b95c70" containerName="nova-manage"
Mar 09 09:30:12 crc kubenswrapper[4792]: E0309 09:30:12.145015 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9388ba35-9a13-455c-b8d3-1e19d6ed7c94" containerName="collect-profiles"
Mar 09 09:30:12 crc kubenswrapper[4792]: I0309 09:30:12.145021 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="9388ba35-9a13-455c-b8d3-1e19d6ed7c94" containerName="collect-profiles"
Mar 09 09:30:12 crc kubenswrapper[4792]: E0309 09:30:12.145046 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7b96e62-a545-4c9b-946b-b7c6a12ceca6" containerName="nova-metadata-log"
Mar 09 09:30:12 crc kubenswrapper[4792]: I0309 09:30:12.145052 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7b96e62-a545-4c9b-946b-b7c6a12ceca6" containerName="nova-metadata-log"
Mar 09 09:30:12 crc kubenswrapper[4792]: I0309 09:30:12.145235 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="9388ba35-9a13-455c-b8d3-1e19d6ed7c94" containerName="collect-profiles"
Mar 09 09:30:12 crc kubenswrapper[4792]: I0309 09:30:12.145255 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7b96e62-a545-4c9b-946b-b7c6a12ceca6" containerName="nova-metadata-metadata"
Mar 09 09:30:12 crc kubenswrapper[4792]: I0309 09:30:12.145265 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7b96e62-a545-4c9b-946b-b7c6a12ceca6" containerName="nova-metadata-log"
Mar 09 09:30:12 crc kubenswrapper[4792]: I0309 09:30:12.145276 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac96eb08-404b-4da1-aecd-ae86494f9165" containerName="oc"
Mar 09 09:30:12 crc kubenswrapper[4792]: I0309 09:30:12.145286 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1399ead-26d7-4512-bbf0-f004f5b95c70" containerName="nova-manage"
Mar 09 09:30:12 crc kubenswrapper[4792]: I0309 09:30:12.146324 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 09 09:30:12 crc kubenswrapper[4792]: I0309 09:30:12.148942 4792 scope.go:117] "RemoveContainer" containerID="f24f496e8119c37fe02291e1ea7aea6b584c15ab1e6abbc078d529a59d6d3675"
Mar 09 09:30:12 crc kubenswrapper[4792]: I0309 09:30:12.149339 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Mar 09 09:30:12 crc kubenswrapper[4792]: I0309 09:30:12.149615 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc"
Mar 09 09:30:12 crc kubenswrapper[4792]: E0309 09:30:12.149767 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f24f496e8119c37fe02291e1ea7aea6b584c15ab1e6abbc078d529a59d6d3675\": container with ID starting with f24f496e8119c37fe02291e1ea7aea6b584c15ab1e6abbc078d529a59d6d3675 not found: ID does not exist" containerID="f24f496e8119c37fe02291e1ea7aea6b584c15ab1e6abbc078d529a59d6d3675"
Mar 09 09:30:12 crc kubenswrapper[4792]: I0309 09:30:12.149797 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f24f496e8119c37fe02291e1ea7aea6b584c15ab1e6abbc078d529a59d6d3675"} err="failed to get container status \"f24f496e8119c37fe02291e1ea7aea6b584c15ab1e6abbc078d529a59d6d3675\": rpc error: code = NotFound desc = could not find container \"f24f496e8119c37fe02291e1ea7aea6b584c15ab1e6abbc078d529a59d6d3675\": container with ID starting with f24f496e8119c37fe02291e1ea7aea6b584c15ab1e6abbc078d529a59d6d3675 not found: ID does not exist"
Mar 09 09:30:12 crc kubenswrapper[4792]: I0309 09:30:12.149818 4792 scope.go:117] "RemoveContainer" containerID="b43fee17507f9dc4486139a74d40156c3f8683fad426c7bbca441336900caebd"
Mar 09 09:30:12 crc kubenswrapper[4792]: E0309 09:30:12.151669 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b43fee17507f9dc4486139a74d40156c3f8683fad426c7bbca441336900caebd\": container with ID starting with b43fee17507f9dc4486139a74d40156c3f8683fad426c7bbca441336900caebd not found: ID does not exist" containerID="b43fee17507f9dc4486139a74d40156c3f8683fad426c7bbca441336900caebd"
Mar 09 09:30:12 crc kubenswrapper[4792]: I0309 09:30:12.151704 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b43fee17507f9dc4486139a74d40156c3f8683fad426c7bbca441336900caebd"} err="failed to get container status \"b43fee17507f9dc4486139a74d40156c3f8683fad426c7bbca441336900caebd\": rpc error: code = NotFound desc = could not find container \"b43fee17507f9dc4486139a74d40156c3f8683fad426c7bbca441336900caebd\": container with ID starting with b43fee17507f9dc4486139a74d40156c3f8683fad426c7bbca441336900caebd not found: ID does not exist"
Mar 09 09:30:12 crc kubenswrapper[4792]: I0309 09:30:12.158252 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Mar 09 09:30:12 crc kubenswrapper[4792]: I0309 09:30:12.275728 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba905c80-a1c9-4e8b-9d19-965d91ffb934-config-data\") pod \"nova-metadata-0\" (UID: \"ba905c80-a1c9-4e8b-9d19-965d91ffb934\") " pod="openstack/nova-metadata-0"
Mar 09 09:30:12 crc kubenswrapper[4792]: I0309 09:30:12.275905 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ba905c80-a1c9-4e8b-9d19-965d91ffb934-logs\") pod \"nova-metadata-0\" (UID: \"ba905c80-a1c9-4e8b-9d19-965d91ffb934\") " pod="openstack/nova-metadata-0"
Mar 09 09:30:12 crc kubenswrapper[4792]: I0309 09:30:12.275994 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba905c80-a1c9-4e8b-9d19-965d91ffb934-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"ba905c80-a1c9-4e8b-9d19-965d91ffb934\") " pod="openstack/nova-metadata-0"
Mar 09 09:30:12 crc kubenswrapper[4792]: I0309 09:30:12.277376 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/ba905c80-a1c9-4e8b-9d19-965d91ffb934-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"ba905c80-a1c9-4e8b-9d19-965d91ffb934\") " pod="openstack/nova-metadata-0"
Mar 09 09:30:12 crc kubenswrapper[4792]: I0309 09:30:12.277447 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sq5kv\" (UniqueName: \"kubernetes.io/projected/ba905c80-a1c9-4e8b-9d19-965d91ffb934-kube-api-access-sq5kv\") pod \"nova-metadata-0\" (UID: \"ba905c80-a1c9-4e8b-9d19-965d91ffb934\") " pod="openstack/nova-metadata-0"
Mar 09 09:30:12 crc kubenswrapper[4792]: I0309 09:30:12.379290 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba905c80-a1c9-4e8b-9d19-965d91ffb934-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"ba905c80-a1c9-4e8b-9d19-965d91ffb934\") " pod="openstack/nova-metadata-0"
Mar 09 09:30:12 crc kubenswrapper[4792]: I0309 09:30:12.379340 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/ba905c80-a1c9-4e8b-9d19-965d91ffb934-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"ba905c80-a1c9-4e8b-9d19-965d91ffb934\") " pod="openstack/nova-metadata-0"
Mar 09 09:30:12 crc kubenswrapper[4792]: I0309 09:30:12.379374 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sq5kv\" (UniqueName: \"kubernetes.io/projected/ba905c80-a1c9-4e8b-9d19-965d91ffb934-kube-api-access-sq5kv\") pod \"nova-metadata-0\" (UID: \"ba905c80-a1c9-4e8b-9d19-965d91ffb934\") " pod="openstack/nova-metadata-0"
Mar 09 09:30:12 crc kubenswrapper[4792]: I0309 09:30:12.379443 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba905c80-a1c9-4e8b-9d19-965d91ffb934-config-data\") pod \"nova-metadata-0\" (UID: \"ba905c80-a1c9-4e8b-9d19-965d91ffb934\") " pod="openstack/nova-metadata-0"
Mar 09 09:30:12 crc kubenswrapper[4792]: I0309 09:30:12.379498 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ba905c80-a1c9-4e8b-9d19-965d91ffb934-logs\") pod \"nova-metadata-0\" (UID: \"ba905c80-a1c9-4e8b-9d19-965d91ffb934\") " pod="openstack/nova-metadata-0"
Mar 09 09:30:12 crc kubenswrapper[4792]: I0309 09:30:12.379974 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ba905c80-a1c9-4e8b-9d19-965d91ffb934-logs\") pod \"nova-metadata-0\" (UID: \"ba905c80-a1c9-4e8b-9d19-965d91ffb934\") " pod="openstack/nova-metadata-0"
Mar 09 09:30:12 crc kubenswrapper[4792]: I0309 09:30:12.387101 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba905c80-a1c9-4e8b-9d19-965d91ffb934-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"ba905c80-a1c9-4e8b-9d19-965d91ffb934\") " pod="openstack/nova-metadata-0"
Mar 09 09:30:12 crc kubenswrapper[4792]: I0309 09:30:12.388593 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/ba905c80-a1c9-4e8b-9d19-965d91ffb934-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"ba905c80-a1c9-4e8b-9d19-965d91ffb934\") " pod="openstack/nova-metadata-0"
Mar 09 09:30:12 crc kubenswrapper[4792]: I0309 09:30:12.389463 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba905c80-a1c9-4e8b-9d19-965d91ffb934-config-data\") pod \"nova-metadata-0\" (UID: \"ba905c80-a1c9-4e8b-9d19-965d91ffb934\") " pod="openstack/nova-metadata-0"
Mar 09 09:30:12 crc kubenswrapper[4792]: I0309 09:30:12.399735 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sq5kv\" (UniqueName: \"kubernetes.io/projected/ba905c80-a1c9-4e8b-9d19-965d91ffb934-kube-api-access-sq5kv\") pod \"nova-metadata-0\" (UID: \"ba905c80-a1c9-4e8b-9d19-965d91ffb934\") " pod="openstack/nova-metadata-0"
Mar 09 09:30:12 crc kubenswrapper[4792]: I0309 09:30:12.464493 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Mar 09 09:30:12 crc kubenswrapper[4792]: I0309 09:30:12.470131 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 09 09:30:12 crc kubenswrapper[4792]: I0309 09:30:12.587745 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zvqvf\" (UniqueName: \"kubernetes.io/projected/80a8330d-39e6-4b5c-a696-4a1f8e1df1ab-kube-api-access-zvqvf\") pod \"80a8330d-39e6-4b5c-a696-4a1f8e1df1ab\" (UID: \"80a8330d-39e6-4b5c-a696-4a1f8e1df1ab\") "
Mar 09 09:30:12 crc kubenswrapper[4792]: I0309 09:30:12.587902 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80a8330d-39e6-4b5c-a696-4a1f8e1df1ab-config-data\") pod \"80a8330d-39e6-4b5c-a696-4a1f8e1df1ab\" (UID: \"80a8330d-39e6-4b5c-a696-4a1f8e1df1ab\") "
Mar 09 09:30:12 crc kubenswrapper[4792]: I0309 09:30:12.588013 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80a8330d-39e6-4b5c-a696-4a1f8e1df1ab-combined-ca-bundle\") pod \"80a8330d-39e6-4b5c-a696-4a1f8e1df1ab\" (UID: \"80a8330d-39e6-4b5c-a696-4a1f8e1df1ab\") "
Mar 09 09:30:12 crc kubenswrapper[4792]: I0309 09:30:12.596972 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/80a8330d-39e6-4b5c-a696-4a1f8e1df1ab-kube-api-access-zvqvf" (OuterVolumeSpecName: "kube-api-access-zvqvf") pod "80a8330d-39e6-4b5c-a696-4a1f8e1df1ab" (UID: "80a8330d-39e6-4b5c-a696-4a1f8e1df1ab"). InnerVolumeSpecName "kube-api-access-zvqvf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:30:12 crc kubenswrapper[4792]: I0309 09:30:12.633350 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80a8330d-39e6-4b5c-a696-4a1f8e1df1ab-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "80a8330d-39e6-4b5c-a696-4a1f8e1df1ab" (UID: "80a8330d-39e6-4b5c-a696-4a1f8e1df1ab"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:30:12 crc kubenswrapper[4792]: I0309 09:30:12.640275 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80a8330d-39e6-4b5c-a696-4a1f8e1df1ab-config-data" (OuterVolumeSpecName: "config-data") pod "80a8330d-39e6-4b5c-a696-4a1f8e1df1ab" (UID: "80a8330d-39e6-4b5c-a696-4a1f8e1df1ab"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:30:12 crc kubenswrapper[4792]: I0309 09:30:12.697169 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zvqvf\" (UniqueName: \"kubernetes.io/projected/80a8330d-39e6-4b5c-a696-4a1f8e1df1ab-kube-api-access-zvqvf\") on node \"crc\" DevicePath \"\""
Mar 09 09:30:12 crc kubenswrapper[4792]: I0309 09:30:12.697196 4792 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80a8330d-39e6-4b5c-a696-4a1f8e1df1ab-config-data\") on node \"crc\" DevicePath \"\""
Mar 09 09:30:12 crc kubenswrapper[4792]: I0309 09:30:12.697206 4792 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80a8330d-39e6-4b5c-a696-4a1f8e1df1ab-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 09 09:30:13 crc kubenswrapper[4792]: I0309 09:30:13.027010 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Mar 09 09:30:13 crc kubenswrapper[4792]: I0309 09:30:13.063025 4792 generic.go:334] "Generic (PLEG): container finished" podID="de065134-a827-48ff-8694-948775349bd8" containerID="a65a1654defca46995034eef4dd636fc694cd512c099f28c7ce1b1e420c1ad83" exitCode=0
Mar 09 09:30:13 crc kubenswrapper[4792]: I0309 09:30:13.063512 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"de065134-a827-48ff-8694-948775349bd8","Type":"ContainerDied","Data":"a65a1654defca46995034eef4dd636fc694cd512c099f28c7ce1b1e420c1ad83"}
Mar
09 09:30:13 crc kubenswrapper[4792]: I0309 09:30:13.070389 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 09 09:30:13 crc kubenswrapper[4792]: I0309 09:30:13.070391 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"80a8330d-39e6-4b5c-a696-4a1f8e1df1ab","Type":"ContainerDied","Data":"4b627b47d64369c79111c1273048c7a7a04db8cf18db5ed827472650517b8d97"} Mar 09 09:30:13 crc kubenswrapper[4792]: I0309 09:30:13.070481 4792 scope.go:117] "RemoveContainer" containerID="fab14509b4d8b0e7ea117d2f55a8e109aa2f1491b049626a48b4e562374876d5" Mar 09 09:30:13 crc kubenswrapper[4792]: I0309 09:30:13.074696 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ba905c80-a1c9-4e8b-9d19-965d91ffb934","Type":"ContainerStarted","Data":"da2a9b0dd3eb4773ed64581f9e6e64bea63f5ba0588f24a795a514402cd8d228"} Mar 09 09:30:13 crc kubenswrapper[4792]: I0309 09:30:13.081411 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 09 09:30:13 crc kubenswrapper[4792]: I0309 09:30:13.209270 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de065134-a827-48ff-8694-948775349bd8-config-data\") pod \"de065134-a827-48ff-8694-948775349bd8\" (UID: \"de065134-a827-48ff-8694-948775349bd8\") " Mar 09 09:30:13 crc kubenswrapper[4792]: I0309 09:30:13.209348 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pp846\" (UniqueName: \"kubernetes.io/projected/de065134-a827-48ff-8694-948775349bd8-kube-api-access-pp846\") pod \"de065134-a827-48ff-8694-948775349bd8\" (UID: \"de065134-a827-48ff-8694-948775349bd8\") " Mar 09 09:30:13 crc kubenswrapper[4792]: I0309 09:30:13.211299 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 09 09:30:13 crc kubenswrapper[4792]: I0309 09:30:13.211396 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/de065134-a827-48ff-8694-948775349bd8-internal-tls-certs\") pod \"de065134-a827-48ff-8694-948775349bd8\" (UID: \"de065134-a827-48ff-8694-948775349bd8\") " Mar 09 09:30:13 crc kubenswrapper[4792]: I0309 09:30:13.211524 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/de065134-a827-48ff-8694-948775349bd8-public-tls-certs\") pod \"de065134-a827-48ff-8694-948775349bd8\" (UID: \"de065134-a827-48ff-8694-948775349bd8\") " Mar 09 09:30:13 crc kubenswrapper[4792]: I0309 09:30:13.211567 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/de065134-a827-48ff-8694-948775349bd8-logs\") pod \"de065134-a827-48ff-8694-948775349bd8\" (UID: \"de065134-a827-48ff-8694-948775349bd8\") " Mar 09 09:30:13 crc 
kubenswrapper[4792]: I0309 09:30:13.211609 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de065134-a827-48ff-8694-948775349bd8-combined-ca-bundle\") pod \"de065134-a827-48ff-8694-948775349bd8\" (UID: \"de065134-a827-48ff-8694-948775349bd8\") " Mar 09 09:30:13 crc kubenswrapper[4792]: I0309 09:30:13.219809 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/de065134-a827-48ff-8694-948775349bd8-logs" (OuterVolumeSpecName: "logs") pod "de065134-a827-48ff-8694-948775349bd8" (UID: "de065134-a827-48ff-8694-948775349bd8"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:30:13 crc kubenswrapper[4792]: I0309 09:30:13.220484 4792 patch_prober.go:28] interesting pod/machine-config-daemon-97tth container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 09:30:13 crc kubenswrapper[4792]: I0309 09:30:13.222079 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 09:30:13 crc kubenswrapper[4792]: I0309 09:30:13.227791 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de065134-a827-48ff-8694-948775349bd8-kube-api-access-pp846" (OuterVolumeSpecName: "kube-api-access-pp846") pod "de065134-a827-48ff-8694-948775349bd8" (UID: "de065134-a827-48ff-8694-948775349bd8"). InnerVolumeSpecName "kube-api-access-pp846". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:30:13 crc kubenswrapper[4792]: I0309 09:30:13.242103 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Mar 09 09:30:13 crc kubenswrapper[4792]: I0309 09:30:13.254741 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 09 09:30:13 crc kubenswrapper[4792]: E0309 09:30:13.256627 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de065134-a827-48ff-8694-948775349bd8" containerName="nova-api-api" Mar 09 09:30:13 crc kubenswrapper[4792]: I0309 09:30:13.256725 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="de065134-a827-48ff-8694-948775349bd8" containerName="nova-api-api" Mar 09 09:30:13 crc kubenswrapper[4792]: E0309 09:30:13.256753 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80a8330d-39e6-4b5c-a696-4a1f8e1df1ab" containerName="nova-scheduler-scheduler" Mar 09 09:30:13 crc kubenswrapper[4792]: I0309 09:30:13.256760 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="80a8330d-39e6-4b5c-a696-4a1f8e1df1ab" containerName="nova-scheduler-scheduler" Mar 09 09:30:13 crc kubenswrapper[4792]: E0309 09:30:13.256775 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de065134-a827-48ff-8694-948775349bd8" containerName="nova-api-log" Mar 09 09:30:13 crc kubenswrapper[4792]: I0309 09:30:13.256800 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="de065134-a827-48ff-8694-948775349bd8" containerName="nova-api-log" Mar 09 09:30:13 crc kubenswrapper[4792]: I0309 09:30:13.257001 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="de065134-a827-48ff-8694-948775349bd8" containerName="nova-api-log" Mar 09 09:30:13 crc kubenswrapper[4792]: I0309 09:30:13.257050 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="80a8330d-39e6-4b5c-a696-4a1f8e1df1ab" containerName="nova-scheduler-scheduler" Mar 09 09:30:13 crc 
kubenswrapper[4792]: I0309 09:30:13.257087 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="de065134-a827-48ff-8694-948775349bd8" containerName="nova-api-api" Mar 09 09:30:13 crc kubenswrapper[4792]: I0309 09:30:13.258302 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 09 09:30:13 crc kubenswrapper[4792]: I0309 09:30:13.261740 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 09 09:30:13 crc kubenswrapper[4792]: I0309 09:30:13.267739 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 09 09:30:13 crc kubenswrapper[4792]: I0309 09:30:13.316053 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de087a24-d54a-442c-8cbe-2cbe653c4343-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"de087a24-d54a-442c-8cbe-2cbe653c4343\") " pod="openstack/nova-scheduler-0" Mar 09 09:30:13 crc kubenswrapper[4792]: I0309 09:30:13.316249 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5cprz\" (UniqueName: \"kubernetes.io/projected/de087a24-d54a-442c-8cbe-2cbe653c4343-kube-api-access-5cprz\") pod \"nova-scheduler-0\" (UID: \"de087a24-d54a-442c-8cbe-2cbe653c4343\") " pod="openstack/nova-scheduler-0" Mar 09 09:30:13 crc kubenswrapper[4792]: I0309 09:30:13.316291 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de087a24-d54a-442c-8cbe-2cbe653c4343-config-data\") pod \"nova-scheduler-0\" (UID: \"de087a24-d54a-442c-8cbe-2cbe653c4343\") " pod="openstack/nova-scheduler-0" Mar 09 09:30:13 crc kubenswrapper[4792]: I0309 09:30:13.316598 4792 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/de065134-a827-48ff-8694-948775349bd8-logs\") on node \"crc\" DevicePath \"\"" Mar 09 09:30:13 crc kubenswrapper[4792]: I0309 09:30:13.316654 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pp846\" (UniqueName: \"kubernetes.io/projected/de065134-a827-48ff-8694-948775349bd8-kube-api-access-pp846\") on node \"crc\" DevicePath \"\"" Mar 09 09:30:13 crc kubenswrapper[4792]: I0309 09:30:13.324781 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de065134-a827-48ff-8694-948775349bd8-config-data" (OuterVolumeSpecName: "config-data") pod "de065134-a827-48ff-8694-948775349bd8" (UID: "de065134-a827-48ff-8694-948775349bd8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:30:13 crc kubenswrapper[4792]: I0309 09:30:13.346916 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de065134-a827-48ff-8694-948775349bd8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "de065134-a827-48ff-8694-948775349bd8" (UID: "de065134-a827-48ff-8694-948775349bd8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:30:13 crc kubenswrapper[4792]: I0309 09:30:13.354015 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de065134-a827-48ff-8694-948775349bd8-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "de065134-a827-48ff-8694-948775349bd8" (UID: "de065134-a827-48ff-8694-948775349bd8"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:30:13 crc kubenswrapper[4792]: I0309 09:30:13.376545 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de065134-a827-48ff-8694-948775349bd8-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "de065134-a827-48ff-8694-948775349bd8" (UID: "de065134-a827-48ff-8694-948775349bd8"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:30:13 crc kubenswrapper[4792]: I0309 09:30:13.418268 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de087a24-d54a-442c-8cbe-2cbe653c4343-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"de087a24-d54a-442c-8cbe-2cbe653c4343\") " pod="openstack/nova-scheduler-0" Mar 09 09:30:13 crc kubenswrapper[4792]: I0309 09:30:13.418351 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5cprz\" (UniqueName: \"kubernetes.io/projected/de087a24-d54a-442c-8cbe-2cbe653c4343-kube-api-access-5cprz\") pod \"nova-scheduler-0\" (UID: \"de087a24-d54a-442c-8cbe-2cbe653c4343\") " pod="openstack/nova-scheduler-0" Mar 09 09:30:13 crc kubenswrapper[4792]: I0309 09:30:13.418378 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de087a24-d54a-442c-8cbe-2cbe653c4343-config-data\") pod \"nova-scheduler-0\" (UID: \"de087a24-d54a-442c-8cbe-2cbe653c4343\") " pod="openstack/nova-scheduler-0" Mar 09 09:30:13 crc kubenswrapper[4792]: I0309 09:30:13.418501 4792 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/de065134-a827-48ff-8694-948775349bd8-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 09 09:30:13 crc kubenswrapper[4792]: I0309 09:30:13.418515 4792 reconciler_common.go:293] "Volume detached for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de065134-a827-48ff-8694-948775349bd8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 09:30:13 crc kubenswrapper[4792]: I0309 09:30:13.418524 4792 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de065134-a827-48ff-8694-948775349bd8-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 09:30:13 crc kubenswrapper[4792]: I0309 09:30:13.418532 4792 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/de065134-a827-48ff-8694-948775349bd8-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 09 09:30:13 crc kubenswrapper[4792]: I0309 09:30:13.425653 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de087a24-d54a-442c-8cbe-2cbe653c4343-config-data\") pod \"nova-scheduler-0\" (UID: \"de087a24-d54a-442c-8cbe-2cbe653c4343\") " pod="openstack/nova-scheduler-0" Mar 09 09:30:13 crc kubenswrapper[4792]: I0309 09:30:13.426219 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de087a24-d54a-442c-8cbe-2cbe653c4343-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"de087a24-d54a-442c-8cbe-2cbe653c4343\") " pod="openstack/nova-scheduler-0" Mar 09 09:30:13 crc kubenswrapper[4792]: I0309 09:30:13.439041 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5cprz\" (UniqueName: \"kubernetes.io/projected/de087a24-d54a-442c-8cbe-2cbe653c4343-kube-api-access-5cprz\") pod \"nova-scheduler-0\" (UID: \"de087a24-d54a-442c-8cbe-2cbe653c4343\") " pod="openstack/nova-scheduler-0" Mar 09 09:30:13 crc kubenswrapper[4792]: I0309 09:30:13.528309 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 09 09:30:13 crc kubenswrapper[4792]: I0309 09:30:13.676929 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="80a8330d-39e6-4b5c-a696-4a1f8e1df1ab" path="/var/lib/kubelet/pods/80a8330d-39e6-4b5c-a696-4a1f8e1df1ab/volumes" Mar 09 09:30:13 crc kubenswrapper[4792]: I0309 09:30:13.677977 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a7b96e62-a545-4c9b-946b-b7c6a12ceca6" path="/var/lib/kubelet/pods/a7b96e62-a545-4c9b-946b-b7c6a12ceca6/volumes" Mar 09 09:30:13 crc kubenswrapper[4792]: I0309 09:30:13.973669 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 09 09:30:14 crc kubenswrapper[4792]: I0309 09:30:14.085326 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"de087a24-d54a-442c-8cbe-2cbe653c4343","Type":"ContainerStarted","Data":"a3f3c1d94ddecf4bef7eb3e43f6f289b69bbe6ded2cfec7ed7293690a5d2605e"} Mar 09 09:30:14 crc kubenswrapper[4792]: I0309 09:30:14.087687 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"de065134-a827-48ff-8694-948775349bd8","Type":"ContainerDied","Data":"a19b6bb6f282df652224b846b7a7ea496ab71e9624d2aeffb78e6b4b1fe97da7"} Mar 09 09:30:14 crc kubenswrapper[4792]: I0309 09:30:14.087717 4792 scope.go:117] "RemoveContainer" containerID="a65a1654defca46995034eef4dd636fc694cd512c099f28c7ce1b1e420c1ad83" Mar 09 09:30:14 crc kubenswrapper[4792]: I0309 09:30:14.087789 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 09 09:30:14 crc kubenswrapper[4792]: I0309 09:30:14.097999 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ba905c80-a1c9-4e8b-9d19-965d91ffb934","Type":"ContainerStarted","Data":"4c4d0b71fcd3e3736e3993962b16b7811df3f7838ba1ea429f48dea78bd7bc30"} Mar 09 09:30:14 crc kubenswrapper[4792]: I0309 09:30:14.098034 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ba905c80-a1c9-4e8b-9d19-965d91ffb934","Type":"ContainerStarted","Data":"76677c5764d1698d4076e21f887b86210c70a2a4a43cc021aad746f6cbfb64af"} Mar 09 09:30:14 crc kubenswrapper[4792]: I0309 09:30:14.134254 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 09 09:30:14 crc kubenswrapper[4792]: I0309 09:30:14.134281 4792 scope.go:117] "RemoveContainer" containerID="b373b9754cab784bdd41012fbdc1681701f43474ce2de22b9d6101ce2a9be714" Mar 09 09:30:14 crc kubenswrapper[4792]: I0309 09:30:14.142987 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 09 09:30:14 crc kubenswrapper[4792]: I0309 09:30:14.152697 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 09 09:30:14 crc kubenswrapper[4792]: I0309 09:30:14.154319 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 09 09:30:14 crc kubenswrapper[4792]: I0309 09:30:14.162655 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Mar 09 09:30:14 crc kubenswrapper[4792]: I0309 09:30:14.162841 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Mar 09 09:30:14 crc kubenswrapper[4792]: I0309 09:30:14.163294 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 09 09:30:14 crc kubenswrapper[4792]: I0309 09:30:14.191580 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.191555387 podStartE2EDuration="2.191555387s" podCreationTimestamp="2026-03-09 09:30:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:30:14.171777969 +0000 UTC m=+1379.201978721" watchObservedRunningTime="2026-03-09 09:30:14.191555387 +0000 UTC m=+1379.221756139" Mar 09 09:30:14 crc kubenswrapper[4792]: I0309 09:30:14.218113 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 09 09:30:14 crc kubenswrapper[4792]: I0309 09:30:14.234152 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/882a66ed-9e4e-4501-90ab-a600db85728a-logs\") pod \"nova-api-0\" (UID: \"882a66ed-9e4e-4501-90ab-a600db85728a\") " pod="openstack/nova-api-0" Mar 09 09:30:14 crc kubenswrapper[4792]: I0309 09:30:14.234247 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/882a66ed-9e4e-4501-90ab-a600db85728a-public-tls-certs\") pod \"nova-api-0\" (UID: \"882a66ed-9e4e-4501-90ab-a600db85728a\") " pod="openstack/nova-api-0" Mar 09 09:30:14 crc 
kubenswrapper[4792]: I0309 09:30:14.234313 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/882a66ed-9e4e-4501-90ab-a600db85728a-internal-tls-certs\") pod \"nova-api-0\" (UID: \"882a66ed-9e4e-4501-90ab-a600db85728a\") " pod="openstack/nova-api-0" Mar 09 09:30:14 crc kubenswrapper[4792]: I0309 09:30:14.234364 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7vdsw\" (UniqueName: \"kubernetes.io/projected/882a66ed-9e4e-4501-90ab-a600db85728a-kube-api-access-7vdsw\") pod \"nova-api-0\" (UID: \"882a66ed-9e4e-4501-90ab-a600db85728a\") " pod="openstack/nova-api-0" Mar 09 09:30:14 crc kubenswrapper[4792]: I0309 09:30:14.234401 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/882a66ed-9e4e-4501-90ab-a600db85728a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"882a66ed-9e4e-4501-90ab-a600db85728a\") " pod="openstack/nova-api-0" Mar 09 09:30:14 crc kubenswrapper[4792]: I0309 09:30:14.234427 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/882a66ed-9e4e-4501-90ab-a600db85728a-config-data\") pod \"nova-api-0\" (UID: \"882a66ed-9e4e-4501-90ab-a600db85728a\") " pod="openstack/nova-api-0" Mar 09 09:30:14 crc kubenswrapper[4792]: I0309 09:30:14.335783 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/882a66ed-9e4e-4501-90ab-a600db85728a-internal-tls-certs\") pod \"nova-api-0\" (UID: \"882a66ed-9e4e-4501-90ab-a600db85728a\") " pod="openstack/nova-api-0" Mar 09 09:30:14 crc kubenswrapper[4792]: I0309 09:30:14.336325 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-7vdsw\" (UniqueName: \"kubernetes.io/projected/882a66ed-9e4e-4501-90ab-a600db85728a-kube-api-access-7vdsw\") pod \"nova-api-0\" (UID: \"882a66ed-9e4e-4501-90ab-a600db85728a\") " pod="openstack/nova-api-0" Mar 09 09:30:14 crc kubenswrapper[4792]: I0309 09:30:14.336428 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/882a66ed-9e4e-4501-90ab-a600db85728a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"882a66ed-9e4e-4501-90ab-a600db85728a\") " pod="openstack/nova-api-0" Mar 09 09:30:14 crc kubenswrapper[4792]: I0309 09:30:14.336523 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/882a66ed-9e4e-4501-90ab-a600db85728a-config-data\") pod \"nova-api-0\" (UID: \"882a66ed-9e4e-4501-90ab-a600db85728a\") " pod="openstack/nova-api-0" Mar 09 09:30:14 crc kubenswrapper[4792]: I0309 09:30:14.336612 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/882a66ed-9e4e-4501-90ab-a600db85728a-logs\") pod \"nova-api-0\" (UID: \"882a66ed-9e4e-4501-90ab-a600db85728a\") " pod="openstack/nova-api-0" Mar 09 09:30:14 crc kubenswrapper[4792]: I0309 09:30:14.336705 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/882a66ed-9e4e-4501-90ab-a600db85728a-public-tls-certs\") pod \"nova-api-0\" (UID: \"882a66ed-9e4e-4501-90ab-a600db85728a\") " pod="openstack/nova-api-0" Mar 09 09:30:14 crc kubenswrapper[4792]: I0309 09:30:14.337001 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/882a66ed-9e4e-4501-90ab-a600db85728a-logs\") pod \"nova-api-0\" (UID: \"882a66ed-9e4e-4501-90ab-a600db85728a\") " pod="openstack/nova-api-0" Mar 09 09:30:14 crc kubenswrapper[4792]: I0309 09:30:14.349888 
4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/882a66ed-9e4e-4501-90ab-a600db85728a-public-tls-certs\") pod \"nova-api-0\" (UID: \"882a66ed-9e4e-4501-90ab-a600db85728a\") " pod="openstack/nova-api-0" Mar 09 09:30:14 crc kubenswrapper[4792]: I0309 09:30:14.349903 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/882a66ed-9e4e-4501-90ab-a600db85728a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"882a66ed-9e4e-4501-90ab-a600db85728a\") " pod="openstack/nova-api-0" Mar 09 09:30:14 crc kubenswrapper[4792]: I0309 09:30:14.349965 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/882a66ed-9e4e-4501-90ab-a600db85728a-config-data\") pod \"nova-api-0\" (UID: \"882a66ed-9e4e-4501-90ab-a600db85728a\") " pod="openstack/nova-api-0" Mar 09 09:30:14 crc kubenswrapper[4792]: I0309 09:30:14.349968 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/882a66ed-9e4e-4501-90ab-a600db85728a-internal-tls-certs\") pod \"nova-api-0\" (UID: \"882a66ed-9e4e-4501-90ab-a600db85728a\") " pod="openstack/nova-api-0" Mar 09 09:30:14 crc kubenswrapper[4792]: I0309 09:30:14.353546 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7vdsw\" (UniqueName: \"kubernetes.io/projected/882a66ed-9e4e-4501-90ab-a600db85728a-kube-api-access-7vdsw\") pod \"nova-api-0\" (UID: \"882a66ed-9e4e-4501-90ab-a600db85728a\") " pod="openstack/nova-api-0" Mar 09 09:30:14 crc kubenswrapper[4792]: I0309 09:30:14.492558 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 09 09:30:14 crc kubenswrapper[4792]: I0309 09:30:14.973097 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 09 09:30:14 crc kubenswrapper[4792]: W0309 09:30:14.976154 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod882a66ed_9e4e_4501_90ab_a600db85728a.slice/crio-06a14ff135987465ee3b1037acbfe11effec5bc99eea5dfabcbbe04537105af4 WatchSource:0}: Error finding container 06a14ff135987465ee3b1037acbfe11effec5bc99eea5dfabcbbe04537105af4: Status 404 returned error can't find the container with id 06a14ff135987465ee3b1037acbfe11effec5bc99eea5dfabcbbe04537105af4 Mar 09 09:30:15 crc kubenswrapper[4792]: I0309 09:30:15.111187 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"de087a24-d54a-442c-8cbe-2cbe653c4343","Type":"ContainerStarted","Data":"6fe9d54e266b1a53b404cf48c9b8be05d0a04a60b21a2411153ab67ecc4ce434"} Mar 09 09:30:15 crc kubenswrapper[4792]: I0309 09:30:15.116859 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"882a66ed-9e4e-4501-90ab-a600db85728a","Type":"ContainerStarted","Data":"06a14ff135987465ee3b1037acbfe11effec5bc99eea5dfabcbbe04537105af4"} Mar 09 09:30:15 crc kubenswrapper[4792]: I0309 09:30:15.135545 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.135530231 podStartE2EDuration="2.135530231s" podCreationTimestamp="2026-03-09 09:30:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:30:15.128042758 +0000 UTC m=+1380.158243510" watchObservedRunningTime="2026-03-09 09:30:15.135530231 +0000 UTC m=+1380.165730983" Mar 09 09:30:15 crc kubenswrapper[4792]: I0309 09:30:15.679125 4792 kubelet_volumes.go:163] "Cleaned 
up orphaned pod volumes dir" podUID="de065134-a827-48ff-8694-948775349bd8" path="/var/lib/kubelet/pods/de065134-a827-48ff-8694-948775349bd8/volumes" Mar 09 09:30:16 crc kubenswrapper[4792]: I0309 09:30:16.128686 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"882a66ed-9e4e-4501-90ab-a600db85728a","Type":"ContainerStarted","Data":"04ef2bfac47339cde48be274b76931c8c3c4ee19d67eb583232005efb88408ac"} Mar 09 09:30:16 crc kubenswrapper[4792]: I0309 09:30:16.128735 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"882a66ed-9e4e-4501-90ab-a600db85728a","Type":"ContainerStarted","Data":"6e351874c55e932a2d32d895eb060c25fedc7020323ff7d31f685aa5f06f556a"} Mar 09 09:30:16 crc kubenswrapper[4792]: I0309 09:30:16.153389 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.153367011 podStartE2EDuration="2.153367011s" podCreationTimestamp="2026-03-09 09:30:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:30:16.147022042 +0000 UTC m=+1381.177222804" watchObservedRunningTime="2026-03-09 09:30:16.153367011 +0000 UTC m=+1381.183567773" Mar 09 09:30:17 crc kubenswrapper[4792]: I0309 09:30:17.471426 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 09 09:30:17 crc kubenswrapper[4792]: I0309 09:30:17.471758 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 09 09:30:18 crc kubenswrapper[4792]: I0309 09:30:18.528830 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 09 09:30:22 crc kubenswrapper[4792]: I0309 09:30:22.471707 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 09 09:30:22 crc kubenswrapper[4792]: 
I0309 09:30:22.472315 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 09 09:30:23 crc kubenswrapper[4792]: I0309 09:30:23.481642 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="ba905c80-a1c9-4e8b-9d19-965d91ffb934" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.195:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 09 09:30:23 crc kubenswrapper[4792]: I0309 09:30:23.491277 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="ba905c80-a1c9-4e8b-9d19-965d91ffb934" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.195:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 09:30:23 crc kubenswrapper[4792]: I0309 09:30:23.529723 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 09 09:30:23 crc kubenswrapper[4792]: I0309 09:30:23.556866 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 09 09:30:24 crc kubenswrapper[4792]: I0309 09:30:24.240474 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 09 09:30:24 crc kubenswrapper[4792]: I0309 09:30:24.492657 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 09 09:30:24 crc kubenswrapper[4792]: I0309 09:30:24.492698 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 09 09:30:25 crc kubenswrapper[4792]: I0309 09:30:25.509377 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="882a66ed-9e4e-4501-90ab-a600db85728a" containerName="nova-api-log" probeResult="failure" output="Get 
\"https://10.217.0.197:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 09 09:30:25 crc kubenswrapper[4792]: I0309 09:30:25.509381 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="882a66ed-9e4e-4501-90ab-a600db85728a" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.197:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 09 09:30:32 crc kubenswrapper[4792]: I0309 09:30:32.284331 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Mar 09 09:30:32 crc kubenswrapper[4792]: I0309 09:30:32.483794 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 09 09:30:32 crc kubenswrapper[4792]: I0309 09:30:32.485036 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 09 09:30:32 crc kubenswrapper[4792]: I0309 09:30:32.497766 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 09 09:30:32 crc kubenswrapper[4792]: I0309 09:30:32.852664 4792 scope.go:117] "RemoveContainer" containerID="97a93524fa266ceaa10e471f79bcccc036fd79b65c803c32c104f10b86e35c1d" Mar 09 09:30:33 crc kubenswrapper[4792]: I0309 09:30:33.302871 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 09 09:30:34 crc kubenswrapper[4792]: I0309 09:30:34.499962 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 09 09:30:34 crc kubenswrapper[4792]: I0309 09:30:34.500581 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 09 09:30:34 crc kubenswrapper[4792]: I0309 09:30:34.501248 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 09 09:30:34 crc 
kubenswrapper[4792]: I0309 09:30:34.506154 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 09 09:30:35 crc kubenswrapper[4792]: I0309 09:30:35.316418 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 09 09:30:35 crc kubenswrapper[4792]: I0309 09:30:35.328203 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 09 09:30:42 crc kubenswrapper[4792]: I0309 09:30:42.446676 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 09 09:30:43 crc kubenswrapper[4792]: I0309 09:30:43.214211 4792 patch_prober.go:28] interesting pod/machine-config-daemon-97tth container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 09:30:43 crc kubenswrapper[4792]: I0309 09:30:43.214519 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 09:30:43 crc kubenswrapper[4792]: I0309 09:30:43.214631 4792 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-97tth" Mar 09 09:30:43 crc kubenswrapper[4792]: I0309 09:30:43.215750 4792 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"80c12a8064763d9c808b56945a8c97d0c627b1e8c20ccecc1138b2635c8e12bd"} pod="openshift-machine-config-operator/machine-config-daemon-97tth" containerMessage="Container machine-config-daemon failed liveness probe, will be 
restarted" Mar 09 09:30:43 crc kubenswrapper[4792]: I0309 09:30:43.215821 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" containerName="machine-config-daemon" containerID="cri-o://80c12a8064763d9c808b56945a8c97d0c627b1e8c20ccecc1138b2635c8e12bd" gracePeriod=600 Mar 09 09:30:43 crc kubenswrapper[4792]: I0309 09:30:43.250814 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 09 09:30:43 crc kubenswrapper[4792]: I0309 09:30:43.383385 4792 generic.go:334] "Generic (PLEG): container finished" podID="bd11045a-d746-4b42-872c-8b8d1dd2d515" containerID="80c12a8064763d9c808b56945a8c97d0c627b1e8c20ccecc1138b2635c8e12bd" exitCode=0 Mar 09 09:30:43 crc kubenswrapper[4792]: I0309 09:30:43.383677 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-97tth" event={"ID":"bd11045a-d746-4b42-872c-8b8d1dd2d515","Type":"ContainerDied","Data":"80c12a8064763d9c808b56945a8c97d0c627b1e8c20ccecc1138b2635c8e12bd"} Mar 09 09:30:43 crc kubenswrapper[4792]: I0309 09:30:43.383712 4792 scope.go:117] "RemoveContainer" containerID="338559ddc83aaf62922dd4a2c3548afe39aad0e71765a9e21715a3c207fc6015" Mar 09 09:30:44 crc kubenswrapper[4792]: I0309 09:30:44.396141 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-97tth" event={"ID":"bd11045a-d746-4b42-872c-8b8d1dd2d515","Type":"ContainerStarted","Data":"f7dc0a0360eecc4b3d3577ec00ef1a00d7c8c9a5f8b5910ab6ece3541b62af92"} Mar 09 09:30:48 crc kubenswrapper[4792]: I0309 09:30:48.395923 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="42b40fb0-d2c9-4bc2-a13f-4c099b244ced" containerName="rabbitmq" 
containerID="cri-o://6147ea0cd9e780b17277a374eaf8973eb9f7669cf3ea52c694431c1760b967fb" gracePeriod=604795 Mar 09 09:30:48 crc kubenswrapper[4792]: I0309 09:30:48.941663 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="0ee86e97-a22c-4089-9ce4-363cb0571173" containerName="rabbitmq" containerID="cri-o://de75c85de9aabbef9dad206b8b8770b3869d0bbae34d43ac2ac1e1373e6245fe" gracePeriod=604795 Mar 09 09:30:53 crc kubenswrapper[4792]: I0309 09:30:53.828842 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="0ee86e97-a22c-4089-9ce4-363cb0571173" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.100:5671: connect: connection refused" Mar 09 09:30:54 crc kubenswrapper[4792]: I0309 09:30:54.275927 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="42b40fb0-d2c9-4bc2-a13f-4c099b244ced" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.101:5671: connect: connection refused" Mar 09 09:30:55 crc kubenswrapper[4792]: I0309 09:30:55.060263 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 09 09:30:55 crc kubenswrapper[4792]: I0309 09:30:55.184206 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/42b40fb0-d2c9-4bc2-a13f-4c099b244ced-rabbitmq-tls\") pod \"42b40fb0-d2c9-4bc2-a13f-4c099b244ced\" (UID: \"42b40fb0-d2c9-4bc2-a13f-4c099b244ced\") " Mar 09 09:30:55 crc kubenswrapper[4792]: I0309 09:30:55.184280 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/42b40fb0-d2c9-4bc2-a13f-4c099b244ced-erlang-cookie-secret\") pod \"42b40fb0-d2c9-4bc2-a13f-4c099b244ced\" (UID: \"42b40fb0-d2c9-4bc2-a13f-4c099b244ced\") " Mar 09 09:30:55 crc kubenswrapper[4792]: I0309 09:30:55.184350 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fm6kj\" (UniqueName: \"kubernetes.io/projected/42b40fb0-d2c9-4bc2-a13f-4c099b244ced-kube-api-access-fm6kj\") pod \"42b40fb0-d2c9-4bc2-a13f-4c099b244ced\" (UID: \"42b40fb0-d2c9-4bc2-a13f-4c099b244ced\") " Mar 09 09:30:55 crc kubenswrapper[4792]: I0309 09:30:55.184390 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/42b40fb0-d2c9-4bc2-a13f-4c099b244ced-pod-info\") pod \"42b40fb0-d2c9-4bc2-a13f-4c099b244ced\" (UID: \"42b40fb0-d2c9-4bc2-a13f-4c099b244ced\") " Mar 09 09:30:55 crc kubenswrapper[4792]: I0309 09:30:55.184493 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/42b40fb0-d2c9-4bc2-a13f-4c099b244ced-config-data\") pod \"42b40fb0-d2c9-4bc2-a13f-4c099b244ced\" (UID: \"42b40fb0-d2c9-4bc2-a13f-4c099b244ced\") " Mar 09 09:30:55 crc kubenswrapper[4792]: I0309 09:30:55.184517 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" 
(UniqueName: \"kubernetes.io/configmap/42b40fb0-d2c9-4bc2-a13f-4c099b244ced-server-conf\") pod \"42b40fb0-d2c9-4bc2-a13f-4c099b244ced\" (UID: \"42b40fb0-d2c9-4bc2-a13f-4c099b244ced\") " Mar 09 09:30:55 crc kubenswrapper[4792]: I0309 09:30:55.184555 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/42b40fb0-d2c9-4bc2-a13f-4c099b244ced-rabbitmq-plugins\") pod \"42b40fb0-d2c9-4bc2-a13f-4c099b244ced\" (UID: \"42b40fb0-d2c9-4bc2-a13f-4c099b244ced\") " Mar 09 09:30:55 crc kubenswrapper[4792]: I0309 09:30:55.184605 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/42b40fb0-d2c9-4bc2-a13f-4c099b244ced-rabbitmq-confd\") pod \"42b40fb0-d2c9-4bc2-a13f-4c099b244ced\" (UID: \"42b40fb0-d2c9-4bc2-a13f-4c099b244ced\") " Mar 09 09:30:55 crc kubenswrapper[4792]: I0309 09:30:55.184658 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"42b40fb0-d2c9-4bc2-a13f-4c099b244ced\" (UID: \"42b40fb0-d2c9-4bc2-a13f-4c099b244ced\") " Mar 09 09:30:55 crc kubenswrapper[4792]: I0309 09:30:55.184685 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/42b40fb0-d2c9-4bc2-a13f-4c099b244ced-rabbitmq-erlang-cookie\") pod \"42b40fb0-d2c9-4bc2-a13f-4c099b244ced\" (UID: \"42b40fb0-d2c9-4bc2-a13f-4c099b244ced\") " Mar 09 09:30:55 crc kubenswrapper[4792]: I0309 09:30:55.184724 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/42b40fb0-d2c9-4bc2-a13f-4c099b244ced-plugins-conf\") pod \"42b40fb0-d2c9-4bc2-a13f-4c099b244ced\" (UID: \"42b40fb0-d2c9-4bc2-a13f-4c099b244ced\") " Mar 09 09:30:55 crc kubenswrapper[4792]: I0309 
09:30:55.190875 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/42b40fb0-d2c9-4bc2-a13f-4c099b244ced-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "42b40fb0-d2c9-4bc2-a13f-4c099b244ced" (UID: "42b40fb0-d2c9-4bc2-a13f-4c099b244ced"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:30:55 crc kubenswrapper[4792]: I0309 09:30:55.191958 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/42b40fb0-d2c9-4bc2-a13f-4c099b244ced-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "42b40fb0-d2c9-4bc2-a13f-4c099b244ced" (UID: "42b40fb0-d2c9-4bc2-a13f-4c099b244ced"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:30:55 crc kubenswrapper[4792]: I0309 09:30:55.197767 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/42b40fb0-d2c9-4bc2-a13f-4c099b244ced-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "42b40fb0-d2c9-4bc2-a13f-4c099b244ced" (UID: "42b40fb0-d2c9-4bc2-a13f-4c099b244ced"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:30:55 crc kubenswrapper[4792]: I0309 09:30:55.205118 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42b40fb0-d2c9-4bc2-a13f-4c099b244ced-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "42b40fb0-d2c9-4bc2-a13f-4c099b244ced" (UID: "42b40fb0-d2c9-4bc2-a13f-4c099b244ced"). InnerVolumeSpecName "rabbitmq-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:30:55 crc kubenswrapper[4792]: I0309 09:30:55.227331 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/42b40fb0-d2c9-4bc2-a13f-4c099b244ced-pod-info" (OuterVolumeSpecName: "pod-info") pod "42b40fb0-d2c9-4bc2-a13f-4c099b244ced" (UID: "42b40fb0-d2c9-4bc2-a13f-4c099b244ced"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 09 09:30:55 crc kubenswrapper[4792]: I0309 09:30:55.228530 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42b40fb0-d2c9-4bc2-a13f-4c099b244ced-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "42b40fb0-d2c9-4bc2-a13f-4c099b244ced" (UID: "42b40fb0-d2c9-4bc2-a13f-4c099b244ced"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:30:55 crc kubenswrapper[4792]: I0309 09:30:55.239380 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "persistence") pod "42b40fb0-d2c9-4bc2-a13f-4c099b244ced" (UID: "42b40fb0-d2c9-4bc2-a13f-4c099b244ced"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 09 09:30:55 crc kubenswrapper[4792]: I0309 09:30:55.239535 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42b40fb0-d2c9-4bc2-a13f-4c099b244ced-kube-api-access-fm6kj" (OuterVolumeSpecName: "kube-api-access-fm6kj") pod "42b40fb0-d2c9-4bc2-a13f-4c099b244ced" (UID: "42b40fb0-d2c9-4bc2-a13f-4c099b244ced"). InnerVolumeSpecName "kube-api-access-fm6kj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:30:55 crc kubenswrapper[4792]: I0309 09:30:55.276724 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/42b40fb0-d2c9-4bc2-a13f-4c099b244ced-config-data" (OuterVolumeSpecName: "config-data") pod "42b40fb0-d2c9-4bc2-a13f-4c099b244ced" (UID: "42b40fb0-d2c9-4bc2-a13f-4c099b244ced"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:30:55 crc kubenswrapper[4792]: I0309 09:30:55.288441 4792 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/42b40fb0-d2c9-4bc2-a13f-4c099b244ced-plugins-conf\") on node \"crc\" DevicePath \"\"" Mar 09 09:30:55 crc kubenswrapper[4792]: I0309 09:30:55.288473 4792 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/42b40fb0-d2c9-4bc2-a13f-4c099b244ced-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Mar 09 09:30:55 crc kubenswrapper[4792]: I0309 09:30:55.288484 4792 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/42b40fb0-d2c9-4bc2-a13f-4c099b244ced-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Mar 09 09:30:55 crc kubenswrapper[4792]: I0309 09:30:55.288495 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fm6kj\" (UniqueName: \"kubernetes.io/projected/42b40fb0-d2c9-4bc2-a13f-4c099b244ced-kube-api-access-fm6kj\") on node \"crc\" DevicePath \"\"" Mar 09 09:30:55 crc kubenswrapper[4792]: I0309 09:30:55.288505 4792 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/42b40fb0-d2c9-4bc2-a13f-4c099b244ced-pod-info\") on node \"crc\" DevicePath \"\"" Mar 09 09:30:55 crc kubenswrapper[4792]: I0309 09:30:55.288514 4792 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/42b40fb0-d2c9-4bc2-a13f-4c099b244ced-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 09:30:55 crc kubenswrapper[4792]: I0309 09:30:55.288522 4792 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/42b40fb0-d2c9-4bc2-a13f-4c099b244ced-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Mar 09 09:30:55 crc kubenswrapper[4792]: I0309 09:30:55.288552 4792 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Mar 09 09:30:55 crc kubenswrapper[4792]: I0309 09:30:55.288561 4792 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/42b40fb0-d2c9-4bc2-a13f-4c099b244ced-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Mar 09 09:30:55 crc kubenswrapper[4792]: I0309 09:30:55.304772 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/42b40fb0-d2c9-4bc2-a13f-4c099b244ced-server-conf" (OuterVolumeSpecName: "server-conf") pod "42b40fb0-d2c9-4bc2-a13f-4c099b244ced" (UID: "42b40fb0-d2c9-4bc2-a13f-4c099b244ced"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:30:55 crc kubenswrapper[4792]: I0309 09:30:55.329971 4792 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Mar 09 09:30:55 crc kubenswrapper[4792]: I0309 09:30:55.391201 4792 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/42b40fb0-d2c9-4bc2-a13f-4c099b244ced-server-conf\") on node \"crc\" DevicePath \"\"" Mar 09 09:30:55 crc kubenswrapper[4792]: I0309 09:30:55.391239 4792 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Mar 09 09:30:55 crc kubenswrapper[4792]: I0309 09:30:55.392918 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42b40fb0-d2c9-4bc2-a13f-4c099b244ced-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "42b40fb0-d2c9-4bc2-a13f-4c099b244ced" (UID: "42b40fb0-d2c9-4bc2-a13f-4c099b244ced"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:30:55 crc kubenswrapper[4792]: I0309 09:30:55.488817 4792 generic.go:334] "Generic (PLEG): container finished" podID="0ee86e97-a22c-4089-9ce4-363cb0571173" containerID="de75c85de9aabbef9dad206b8b8770b3869d0bbae34d43ac2ac1e1373e6245fe" exitCode=0 Mar 09 09:30:55 crc kubenswrapper[4792]: I0309 09:30:55.488933 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"0ee86e97-a22c-4089-9ce4-363cb0571173","Type":"ContainerDied","Data":"de75c85de9aabbef9dad206b8b8770b3869d0bbae34d43ac2ac1e1373e6245fe"} Mar 09 09:30:55 crc kubenswrapper[4792]: I0309 09:30:55.492274 4792 generic.go:334] "Generic (PLEG): container finished" podID="42b40fb0-d2c9-4bc2-a13f-4c099b244ced" containerID="6147ea0cd9e780b17277a374eaf8973eb9f7669cf3ea52c694431c1760b967fb" exitCode=0 Mar 09 09:30:55 crc kubenswrapper[4792]: I0309 09:30:55.492317 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"42b40fb0-d2c9-4bc2-a13f-4c099b244ced","Type":"ContainerDied","Data":"6147ea0cd9e780b17277a374eaf8973eb9f7669cf3ea52c694431c1760b967fb"} Mar 09 09:30:55 crc kubenswrapper[4792]: I0309 09:30:55.492343 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"42b40fb0-d2c9-4bc2-a13f-4c099b244ced","Type":"ContainerDied","Data":"591d12d177417a851e27ab947e142a016d5d0839679e66795a8e5a5cc9c7798a"} Mar 09 09:30:55 crc kubenswrapper[4792]: I0309 09:30:55.492361 4792 scope.go:117] "RemoveContainer" containerID="6147ea0cd9e780b17277a374eaf8973eb9f7669cf3ea52c694431c1760b967fb" Mar 09 09:30:55 crc kubenswrapper[4792]: I0309 09:30:55.492501 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 09 09:30:55 crc kubenswrapper[4792]: I0309 09:30:55.494297 4792 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/42b40fb0-d2c9-4bc2-a13f-4c099b244ced-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Mar 09 09:30:55 crc kubenswrapper[4792]: I0309 09:30:55.558820 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 09 09:30:55 crc kubenswrapper[4792]: I0309 09:30:55.571250 4792 scope.go:117] "RemoveContainer" containerID="12ef7a7568725de4169d980eaebeaae0632c46d8f4718c7352b6c167ad607668" Mar 09 09:30:55 crc kubenswrapper[4792]: I0309 09:30:55.577459 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 09 09:30:55 crc kubenswrapper[4792]: I0309 09:30:55.607478 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Mar 09 09:30:55 crc kubenswrapper[4792]: E0309 09:30:55.607883 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42b40fb0-d2c9-4bc2-a13f-4c099b244ced" containerName="setup-container" Mar 09 09:30:55 crc kubenswrapper[4792]: I0309 09:30:55.607897 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="42b40fb0-d2c9-4bc2-a13f-4c099b244ced" containerName="setup-container" Mar 09 09:30:55 crc kubenswrapper[4792]: E0309 09:30:55.607906 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42b40fb0-d2c9-4bc2-a13f-4c099b244ced" containerName="rabbitmq" Mar 09 09:30:55 crc kubenswrapper[4792]: I0309 09:30:55.607915 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="42b40fb0-d2c9-4bc2-a13f-4c099b244ced" containerName="rabbitmq" Mar 09 09:30:55 crc kubenswrapper[4792]: I0309 09:30:55.608163 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="42b40fb0-d2c9-4bc2-a13f-4c099b244ced" containerName="rabbitmq" Mar 09 09:30:55 crc kubenswrapper[4792]: I0309 
09:30:55.613776 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 09 09:30:55 crc kubenswrapper[4792]: I0309 09:30:55.618408 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-96c2v" Mar 09 09:30:55 crc kubenswrapper[4792]: I0309 09:30:55.618583 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Mar 09 09:30:55 crc kubenswrapper[4792]: I0309 09:30:55.618698 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Mar 09 09:30:55 crc kubenswrapper[4792]: I0309 09:30:55.618856 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Mar 09 09:30:55 crc kubenswrapper[4792]: I0309 09:30:55.619115 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Mar 09 09:30:55 crc kubenswrapper[4792]: I0309 09:30:55.619264 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Mar 09 09:30:55 crc kubenswrapper[4792]: I0309 09:30:55.622504 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Mar 09 09:30:55 crc kubenswrapper[4792]: I0309 09:30:55.658097 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 09 09:30:55 crc kubenswrapper[4792]: I0309 09:30:55.667689 4792 scope.go:117] "RemoveContainer" containerID="6147ea0cd9e780b17277a374eaf8973eb9f7669cf3ea52c694431c1760b967fb" Mar 09 09:30:55 crc kubenswrapper[4792]: E0309 09:30:55.670216 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6147ea0cd9e780b17277a374eaf8973eb9f7669cf3ea52c694431c1760b967fb\": container with ID starting with 6147ea0cd9e780b17277a374eaf8973eb9f7669cf3ea52c694431c1760b967fb not 
found: ID does not exist" containerID="6147ea0cd9e780b17277a374eaf8973eb9f7669cf3ea52c694431c1760b967fb" Mar 09 09:30:55 crc kubenswrapper[4792]: I0309 09:30:55.670268 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6147ea0cd9e780b17277a374eaf8973eb9f7669cf3ea52c694431c1760b967fb"} err="failed to get container status \"6147ea0cd9e780b17277a374eaf8973eb9f7669cf3ea52c694431c1760b967fb\": rpc error: code = NotFound desc = could not find container \"6147ea0cd9e780b17277a374eaf8973eb9f7669cf3ea52c694431c1760b967fb\": container with ID starting with 6147ea0cd9e780b17277a374eaf8973eb9f7669cf3ea52c694431c1760b967fb not found: ID does not exist" Mar 09 09:30:55 crc kubenswrapper[4792]: I0309 09:30:55.670408 4792 scope.go:117] "RemoveContainer" containerID="12ef7a7568725de4169d980eaebeaae0632c46d8f4718c7352b6c167ad607668" Mar 09 09:30:55 crc kubenswrapper[4792]: E0309 09:30:55.680386 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"12ef7a7568725de4169d980eaebeaae0632c46d8f4718c7352b6c167ad607668\": container with ID starting with 12ef7a7568725de4169d980eaebeaae0632c46d8f4718c7352b6c167ad607668 not found: ID does not exist" containerID="12ef7a7568725de4169d980eaebeaae0632c46d8f4718c7352b6c167ad607668" Mar 09 09:30:55 crc kubenswrapper[4792]: I0309 09:30:55.680434 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12ef7a7568725de4169d980eaebeaae0632c46d8f4718c7352b6c167ad607668"} err="failed to get container status \"12ef7a7568725de4169d980eaebeaae0632c46d8f4718c7352b6c167ad607668\": rpc error: code = NotFound desc = could not find container \"12ef7a7568725de4169d980eaebeaae0632c46d8f4718c7352b6c167ad607668\": container with ID starting with 12ef7a7568725de4169d980eaebeaae0632c46d8f4718c7352b6c167ad607668 not found: ID does not exist" Mar 09 09:30:55 crc kubenswrapper[4792]: I0309 09:30:55.698378 
4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6a994be4-9a88-4ee6-8e24-a6d62898f593-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"6a994be4-9a88-4ee6-8e24-a6d62898f593\") " pod="openstack/rabbitmq-server-0" Mar 09 09:30:55 crc kubenswrapper[4792]: I0309 09:30:55.698427 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6a994be4-9a88-4ee6-8e24-a6d62898f593-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"6a994be4-9a88-4ee6-8e24-a6d62898f593\") " pod="openstack/rabbitmq-server-0" Mar 09 09:30:55 crc kubenswrapper[4792]: I0309 09:30:55.698514 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-th78n\" (UniqueName: \"kubernetes.io/projected/6a994be4-9a88-4ee6-8e24-a6d62898f593-kube-api-access-th78n\") pod \"rabbitmq-server-0\" (UID: \"6a994be4-9a88-4ee6-8e24-a6d62898f593\") " pod="openstack/rabbitmq-server-0" Mar 09 09:30:55 crc kubenswrapper[4792]: I0309 09:30:55.698549 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/6a994be4-9a88-4ee6-8e24-a6d62898f593-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"6a994be4-9a88-4ee6-8e24-a6d62898f593\") " pod="openstack/rabbitmq-server-0" Mar 09 09:30:55 crc kubenswrapper[4792]: I0309 09:30:55.698566 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6a994be4-9a88-4ee6-8e24-a6d62898f593-pod-info\") pod \"rabbitmq-server-0\" (UID: \"6a994be4-9a88-4ee6-8e24-a6d62898f593\") " pod="openstack/rabbitmq-server-0" Mar 09 09:30:55 crc kubenswrapper[4792]: I0309 09:30:55.698587 4792 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6a994be4-9a88-4ee6-8e24-a6d62898f593-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"6a994be4-9a88-4ee6-8e24-a6d62898f593\") " pod="openstack/rabbitmq-server-0" Mar 09 09:30:55 crc kubenswrapper[4792]: I0309 09:30:55.698604 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-server-0\" (UID: \"6a994be4-9a88-4ee6-8e24-a6d62898f593\") " pod="openstack/rabbitmq-server-0" Mar 09 09:30:55 crc kubenswrapper[4792]: I0309 09:30:55.698629 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6a994be4-9a88-4ee6-8e24-a6d62898f593-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"6a994be4-9a88-4ee6-8e24-a6d62898f593\") " pod="openstack/rabbitmq-server-0" Mar 09 09:30:55 crc kubenswrapper[4792]: I0309 09:30:55.698660 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6a994be4-9a88-4ee6-8e24-a6d62898f593-server-conf\") pod \"rabbitmq-server-0\" (UID: \"6a994be4-9a88-4ee6-8e24-a6d62898f593\") " pod="openstack/rabbitmq-server-0" Mar 09 09:30:55 crc kubenswrapper[4792]: I0309 09:30:55.698687 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6a994be4-9a88-4ee6-8e24-a6d62898f593-config-data\") pod \"rabbitmq-server-0\" (UID: \"6a994be4-9a88-4ee6-8e24-a6d62898f593\") " pod="openstack/rabbitmq-server-0" Mar 09 09:30:55 crc kubenswrapper[4792]: I0309 09:30:55.699625 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6a994be4-9a88-4ee6-8e24-a6d62898f593-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"6a994be4-9a88-4ee6-8e24-a6d62898f593\") " pod="openstack/rabbitmq-server-0" Mar 09 09:30:55 crc kubenswrapper[4792]: I0309 09:30:55.711495 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="42b40fb0-d2c9-4bc2-a13f-4c099b244ced" path="/var/lib/kubelet/pods/42b40fb0-d2c9-4bc2-a13f-4c099b244ced/volumes" Mar 09 09:30:55 crc kubenswrapper[4792]: I0309 09:30:55.799330 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 09 09:30:55 crc kubenswrapper[4792]: I0309 09:30:55.801230 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-th78n\" (UniqueName: \"kubernetes.io/projected/6a994be4-9a88-4ee6-8e24-a6d62898f593-kube-api-access-th78n\") pod \"rabbitmq-server-0\" (UID: \"6a994be4-9a88-4ee6-8e24-a6d62898f593\") " pod="openstack/rabbitmq-server-0" Mar 09 09:30:55 crc kubenswrapper[4792]: I0309 09:30:55.801448 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/6a994be4-9a88-4ee6-8e24-a6d62898f593-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"6a994be4-9a88-4ee6-8e24-a6d62898f593\") " pod="openstack/rabbitmq-server-0" Mar 09 09:30:55 crc kubenswrapper[4792]: I0309 09:30:55.802425 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6a994be4-9a88-4ee6-8e24-a6d62898f593-pod-info\") pod \"rabbitmq-server-0\" (UID: \"6a994be4-9a88-4ee6-8e24-a6d62898f593\") " pod="openstack/rabbitmq-server-0" Mar 09 09:30:55 crc kubenswrapper[4792]: I0309 09:30:55.802552 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: 
\"kubernetes.io/secret/6a994be4-9a88-4ee6-8e24-a6d62898f593-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"6a994be4-9a88-4ee6-8e24-a6d62898f593\") " pod="openstack/rabbitmq-server-0" Mar 09 09:30:55 crc kubenswrapper[4792]: I0309 09:30:55.802669 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-server-0\" (UID: \"6a994be4-9a88-4ee6-8e24-a6d62898f593\") " pod="openstack/rabbitmq-server-0" Mar 09 09:30:55 crc kubenswrapper[4792]: I0309 09:30:55.802805 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6a994be4-9a88-4ee6-8e24-a6d62898f593-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"6a994be4-9a88-4ee6-8e24-a6d62898f593\") " pod="openstack/rabbitmq-server-0" Mar 09 09:30:55 crc kubenswrapper[4792]: I0309 09:30:55.802927 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6a994be4-9a88-4ee6-8e24-a6d62898f593-server-conf\") pod \"rabbitmq-server-0\" (UID: \"6a994be4-9a88-4ee6-8e24-a6d62898f593\") " pod="openstack/rabbitmq-server-0" Mar 09 09:30:55 crc kubenswrapper[4792]: I0309 09:30:55.803044 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6a994be4-9a88-4ee6-8e24-a6d62898f593-config-data\") pod \"rabbitmq-server-0\" (UID: \"6a994be4-9a88-4ee6-8e24-a6d62898f593\") " pod="openstack/rabbitmq-server-0" Mar 09 09:30:55 crc kubenswrapper[4792]: I0309 09:30:55.803243 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6a994be4-9a88-4ee6-8e24-a6d62898f593-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"6a994be4-9a88-4ee6-8e24-a6d62898f593\") " 
pod="openstack/rabbitmq-server-0" Mar 09 09:30:55 crc kubenswrapper[4792]: I0309 09:30:55.804368 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6a994be4-9a88-4ee6-8e24-a6d62898f593-config-data\") pod \"rabbitmq-server-0\" (UID: \"6a994be4-9a88-4ee6-8e24-a6d62898f593\") " pod="openstack/rabbitmq-server-0" Mar 09 09:30:55 crc kubenswrapper[4792]: I0309 09:30:55.804426 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6a994be4-9a88-4ee6-8e24-a6d62898f593-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"6a994be4-9a88-4ee6-8e24-a6d62898f593\") " pod="openstack/rabbitmq-server-0" Mar 09 09:30:55 crc kubenswrapper[4792]: I0309 09:30:55.805335 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6a994be4-9a88-4ee6-8e24-a6d62898f593-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"6a994be4-9a88-4ee6-8e24-a6d62898f593\") " pod="openstack/rabbitmq-server-0" Mar 09 09:30:55 crc kubenswrapper[4792]: I0309 09:30:55.805342 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6a994be4-9a88-4ee6-8e24-a6d62898f593-server-conf\") pod \"rabbitmq-server-0\" (UID: \"6a994be4-9a88-4ee6-8e24-a6d62898f593\") " pod="openstack/rabbitmq-server-0" Mar 09 09:30:55 crc kubenswrapper[4792]: I0309 09:30:55.805509 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6a994be4-9a88-4ee6-8e24-a6d62898f593-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"6a994be4-9a88-4ee6-8e24-a6d62898f593\") " pod="openstack/rabbitmq-server-0" Mar 09 09:30:55 crc kubenswrapper[4792]: I0309 09:30:55.805614 4792 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume 
\"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-server-0\" (UID: \"6a994be4-9a88-4ee6-8e24-a6d62898f593\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/rabbitmq-server-0" Mar 09 09:30:55 crc kubenswrapper[4792]: I0309 09:30:55.805722 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6a994be4-9a88-4ee6-8e24-a6d62898f593-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"6a994be4-9a88-4ee6-8e24-a6d62898f593\") " pod="openstack/rabbitmq-server-0" Mar 09 09:30:55 crc kubenswrapper[4792]: I0309 09:30:55.806684 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6a994be4-9a88-4ee6-8e24-a6d62898f593-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"6a994be4-9a88-4ee6-8e24-a6d62898f593\") " pod="openstack/rabbitmq-server-0" Mar 09 09:30:55 crc kubenswrapper[4792]: I0309 09:30:55.808540 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/6a994be4-9a88-4ee6-8e24-a6d62898f593-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"6a994be4-9a88-4ee6-8e24-a6d62898f593\") " pod="openstack/rabbitmq-server-0" Mar 09 09:30:55 crc kubenswrapper[4792]: I0309 09:30:55.819424 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6a994be4-9a88-4ee6-8e24-a6d62898f593-pod-info\") pod \"rabbitmq-server-0\" (UID: \"6a994be4-9a88-4ee6-8e24-a6d62898f593\") " pod="openstack/rabbitmq-server-0" Mar 09 09:30:55 crc kubenswrapper[4792]: I0309 09:30:55.820012 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6a994be4-9a88-4ee6-8e24-a6d62898f593-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"6a994be4-9a88-4ee6-8e24-a6d62898f593\") " 
pod="openstack/rabbitmq-server-0" Mar 09 09:30:55 crc kubenswrapper[4792]: I0309 09:30:55.823714 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6a994be4-9a88-4ee6-8e24-a6d62898f593-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"6a994be4-9a88-4ee6-8e24-a6d62898f593\") " pod="openstack/rabbitmq-server-0" Mar 09 09:30:55 crc kubenswrapper[4792]: I0309 09:30:55.839727 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-th78n\" (UniqueName: \"kubernetes.io/projected/6a994be4-9a88-4ee6-8e24-a6d62898f593-kube-api-access-th78n\") pod \"rabbitmq-server-0\" (UID: \"6a994be4-9a88-4ee6-8e24-a6d62898f593\") " pod="openstack/rabbitmq-server-0" Mar 09 09:30:55 crc kubenswrapper[4792]: I0309 09:30:55.891112 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-server-0\" (UID: \"6a994be4-9a88-4ee6-8e24-a6d62898f593\") " pod="openstack/rabbitmq-server-0" Mar 09 09:30:55 crc kubenswrapper[4792]: I0309 09:30:55.908570 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"0ee86e97-a22c-4089-9ce4-363cb0571173\" (UID: \"0ee86e97-a22c-4089-9ce4-363cb0571173\") " Mar 09 09:30:55 crc kubenswrapper[4792]: I0309 09:30:55.908665 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/0ee86e97-a22c-4089-9ce4-363cb0571173-server-conf\") pod \"0ee86e97-a22c-4089-9ce4-363cb0571173\" (UID: \"0ee86e97-a22c-4089-9ce4-363cb0571173\") " Mar 09 09:30:55 crc kubenswrapper[4792]: I0309 09:30:55.909437 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: 
\"kubernetes.io/projected/0ee86e97-a22c-4089-9ce4-363cb0571173-rabbitmq-confd\") pod \"0ee86e97-a22c-4089-9ce4-363cb0571173\" (UID: \"0ee86e97-a22c-4089-9ce4-363cb0571173\") " Mar 09 09:30:55 crc kubenswrapper[4792]: I0309 09:30:55.909506 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8nzbf\" (UniqueName: \"kubernetes.io/projected/0ee86e97-a22c-4089-9ce4-363cb0571173-kube-api-access-8nzbf\") pod \"0ee86e97-a22c-4089-9ce4-363cb0571173\" (UID: \"0ee86e97-a22c-4089-9ce4-363cb0571173\") " Mar 09 09:30:55 crc kubenswrapper[4792]: I0309 09:30:55.909619 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/0ee86e97-a22c-4089-9ce4-363cb0571173-erlang-cookie-secret\") pod \"0ee86e97-a22c-4089-9ce4-363cb0571173\" (UID: \"0ee86e97-a22c-4089-9ce4-363cb0571173\") " Mar 09 09:30:55 crc kubenswrapper[4792]: I0309 09:30:55.909644 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/0ee86e97-a22c-4089-9ce4-363cb0571173-rabbitmq-erlang-cookie\") pod \"0ee86e97-a22c-4089-9ce4-363cb0571173\" (UID: \"0ee86e97-a22c-4089-9ce4-363cb0571173\") " Mar 09 09:30:55 crc kubenswrapper[4792]: I0309 09:30:55.909792 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/0ee86e97-a22c-4089-9ce4-363cb0571173-pod-info\") pod \"0ee86e97-a22c-4089-9ce4-363cb0571173\" (UID: \"0ee86e97-a22c-4089-9ce4-363cb0571173\") " Mar 09 09:30:55 crc kubenswrapper[4792]: I0309 09:30:55.909860 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/0ee86e97-a22c-4089-9ce4-363cb0571173-rabbitmq-plugins\") pod \"0ee86e97-a22c-4089-9ce4-363cb0571173\" (UID: \"0ee86e97-a22c-4089-9ce4-363cb0571173\") " Mar 09 09:30:55 crc 
kubenswrapper[4792]: I0309 09:30:55.909935 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/0ee86e97-a22c-4089-9ce4-363cb0571173-plugins-conf\") pod \"0ee86e97-a22c-4089-9ce4-363cb0571173\" (UID: \"0ee86e97-a22c-4089-9ce4-363cb0571173\") " Mar 09 09:30:55 crc kubenswrapper[4792]: I0309 09:30:55.909964 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0ee86e97-a22c-4089-9ce4-363cb0571173-config-data\") pod \"0ee86e97-a22c-4089-9ce4-363cb0571173\" (UID: \"0ee86e97-a22c-4089-9ce4-363cb0571173\") " Mar 09 09:30:55 crc kubenswrapper[4792]: I0309 09:30:55.910019 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/0ee86e97-a22c-4089-9ce4-363cb0571173-rabbitmq-tls\") pod \"0ee86e97-a22c-4089-9ce4-363cb0571173\" (UID: \"0ee86e97-a22c-4089-9ce4-363cb0571173\") " Mar 09 09:30:55 crc kubenswrapper[4792]: I0309 09:30:55.910358 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0ee86e97-a22c-4089-9ce4-363cb0571173-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "0ee86e97-a22c-4089-9ce4-363cb0571173" (UID: "0ee86e97-a22c-4089-9ce4-363cb0571173"). InnerVolumeSpecName "rabbitmq-erlang-cookie". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:30:55 crc kubenswrapper[4792]: I0309 09:30:55.910805 4792 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/0ee86e97-a22c-4089-9ce4-363cb0571173-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Mar 09 09:30:55 crc kubenswrapper[4792]: I0309 09:30:55.911436 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0ee86e97-a22c-4089-9ce4-363cb0571173-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "0ee86e97-a22c-4089-9ce4-363cb0571173" (UID: "0ee86e97-a22c-4089-9ce4-363cb0571173"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:30:55 crc kubenswrapper[4792]: I0309 09:30:55.911580 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ee86e97-a22c-4089-9ce4-363cb0571173-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "0ee86e97-a22c-4089-9ce4-363cb0571173" (UID: "0ee86e97-a22c-4089-9ce4-363cb0571173"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:30:55 crc kubenswrapper[4792]: I0309 09:30:55.918416 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "persistence") pod "0ee86e97-a22c-4089-9ce4-363cb0571173" (UID: "0ee86e97-a22c-4089-9ce4-363cb0571173"). InnerVolumeSpecName "local-storage08-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 09 09:30:55 crc kubenswrapper[4792]: I0309 09:30:55.918499 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ee86e97-a22c-4089-9ce4-363cb0571173-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "0ee86e97-a22c-4089-9ce4-363cb0571173" (UID: "0ee86e97-a22c-4089-9ce4-363cb0571173"). 
InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:30:55 crc kubenswrapper[4792]: I0309 09:30:55.934762 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ee86e97-a22c-4089-9ce4-363cb0571173-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "0ee86e97-a22c-4089-9ce4-363cb0571173" (UID: "0ee86e97-a22c-4089-9ce4-363cb0571173"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:30:55 crc kubenswrapper[4792]: I0309 09:30:55.940334 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/0ee86e97-a22c-4089-9ce4-363cb0571173-pod-info" (OuterVolumeSpecName: "pod-info") pod "0ee86e97-a22c-4089-9ce4-363cb0571173" (UID: "0ee86e97-a22c-4089-9ce4-363cb0571173"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 09 09:30:55 crc kubenswrapper[4792]: I0309 09:30:55.940891 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ee86e97-a22c-4089-9ce4-363cb0571173-config-data" (OuterVolumeSpecName: "config-data") pod "0ee86e97-a22c-4089-9ce4-363cb0571173" (UID: "0ee86e97-a22c-4089-9ce4-363cb0571173"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:30:55 crc kubenswrapper[4792]: I0309 09:30:55.943317 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ee86e97-a22c-4089-9ce4-363cb0571173-kube-api-access-8nzbf" (OuterVolumeSpecName: "kube-api-access-8nzbf") pod "0ee86e97-a22c-4089-9ce4-363cb0571173" (UID: "0ee86e97-a22c-4089-9ce4-363cb0571173"). InnerVolumeSpecName "kube-api-access-8nzbf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:30:56 crc kubenswrapper[4792]: I0309 09:30:56.012675 4792 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/0ee86e97-a22c-4089-9ce4-363cb0571173-plugins-conf\") on node \"crc\" DevicePath \"\"" Mar 09 09:30:56 crc kubenswrapper[4792]: I0309 09:30:56.012715 4792 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0ee86e97-a22c-4089-9ce4-363cb0571173-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 09:30:56 crc kubenswrapper[4792]: I0309 09:30:56.012728 4792 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/0ee86e97-a22c-4089-9ce4-363cb0571173-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Mar 09 09:30:56 crc kubenswrapper[4792]: I0309 09:30:56.012759 4792 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" " Mar 09 09:30:56 crc kubenswrapper[4792]: I0309 09:30:56.012773 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8nzbf\" (UniqueName: \"kubernetes.io/projected/0ee86e97-a22c-4089-9ce4-363cb0571173-kube-api-access-8nzbf\") on node \"crc\" DevicePath \"\"" Mar 09 09:30:56 crc kubenswrapper[4792]: I0309 09:30:56.012787 4792 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/0ee86e97-a22c-4089-9ce4-363cb0571173-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Mar 09 09:30:56 crc kubenswrapper[4792]: I0309 09:30:56.012798 4792 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/0ee86e97-a22c-4089-9ce4-363cb0571173-pod-info\") on node \"crc\" DevicePath \"\"" Mar 09 09:30:56 crc kubenswrapper[4792]: I0309 09:30:56.012808 4792 
reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/0ee86e97-a22c-4089-9ce4-363cb0571173-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Mar 09 09:30:56 crc kubenswrapper[4792]: I0309 09:30:56.018043 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ee86e97-a22c-4089-9ce4-363cb0571173-server-conf" (OuterVolumeSpecName: "server-conf") pod "0ee86e97-a22c-4089-9ce4-363cb0571173" (UID: "0ee86e97-a22c-4089-9ce4-363cb0571173"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:30:56 crc kubenswrapper[4792]: I0309 09:30:56.032918 4792 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Mar 09 09:30:56 crc kubenswrapper[4792]: I0309 09:30:56.052909 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ee86e97-a22c-4089-9ce4-363cb0571173-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "0ee86e97-a22c-4089-9ce4-363cb0571173" (UID: "0ee86e97-a22c-4089-9ce4-363cb0571173"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:30:56 crc kubenswrapper[4792]: I0309 09:30:56.096612 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 09 09:30:56 crc kubenswrapper[4792]: I0309 09:30:56.115104 4792 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\"" Mar 09 09:30:56 crc kubenswrapper[4792]: I0309 09:30:56.115177 4792 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/0ee86e97-a22c-4089-9ce4-363cb0571173-server-conf\") on node \"crc\" DevicePath \"\"" Mar 09 09:30:56 crc kubenswrapper[4792]: I0309 09:30:56.115199 4792 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/0ee86e97-a22c-4089-9ce4-363cb0571173-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Mar 09 09:30:56 crc kubenswrapper[4792]: I0309 09:30:56.503546 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 09 09:30:56 crc kubenswrapper[4792]: I0309 09:30:56.503542 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"0ee86e97-a22c-4089-9ce4-363cb0571173","Type":"ContainerDied","Data":"447d895ca227edf5d86a3419d838c6872239e8f426e3c2b57ff42f8e6ff1964f"} Mar 09 09:30:56 crc kubenswrapper[4792]: I0309 09:30:56.503953 4792 scope.go:117] "RemoveContainer" containerID="de75c85de9aabbef9dad206b8b8770b3869d0bbae34d43ac2ac1e1373e6245fe" Mar 09 09:30:56 crc kubenswrapper[4792]: I0309 09:30:56.540235 4792 scope.go:117] "RemoveContainer" containerID="dbd17cbb8b429cdcb0b12d986092a0771430752ae9708cfa2b6450eb12120d9f" Mar 09 09:30:56 crc kubenswrapper[4792]: I0309 09:30:56.620143 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 09 09:30:56 crc kubenswrapper[4792]: I0309 09:30:56.685512 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/rabbitmq-cell1-server-0"] Mar 09 09:30:56 crc kubenswrapper[4792]: I0309 09:30:56.714816 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 09 09:30:56 crc kubenswrapper[4792]: E0309 09:30:56.715738 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ee86e97-a22c-4089-9ce4-363cb0571173" containerName="setup-container" Mar 09 09:30:56 crc kubenswrapper[4792]: I0309 09:30:56.715755 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ee86e97-a22c-4089-9ce4-363cb0571173" containerName="setup-container" Mar 09 09:30:56 crc kubenswrapper[4792]: E0309 09:30:56.715779 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ee86e97-a22c-4089-9ce4-363cb0571173" containerName="rabbitmq" Mar 09 09:30:56 crc kubenswrapper[4792]: I0309 09:30:56.715787 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ee86e97-a22c-4089-9ce4-363cb0571173" containerName="rabbitmq" Mar 09 09:30:56 crc kubenswrapper[4792]: I0309 09:30:56.716010 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ee86e97-a22c-4089-9ce4-363cb0571173" containerName="rabbitmq" Mar 09 09:30:56 crc kubenswrapper[4792]: I0309 09:30:56.717231 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 09 09:30:56 crc kubenswrapper[4792]: I0309 09:30:56.724108 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Mar 09 09:30:56 crc kubenswrapper[4792]: I0309 09:30:56.724143 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Mar 09 09:30:56 crc kubenswrapper[4792]: I0309 09:30:56.724440 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Mar 09 09:30:56 crc kubenswrapper[4792]: I0309 09:30:56.724454 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-94bgh" Mar 09 09:30:56 crc kubenswrapper[4792]: I0309 09:30:56.724588 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Mar 09 09:30:56 crc kubenswrapper[4792]: I0309 09:30:56.725259 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Mar 09 09:30:56 crc kubenswrapper[4792]: I0309 09:30:56.725384 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Mar 09 09:30:56 crc kubenswrapper[4792]: I0309 09:30:56.747160 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 09 09:30:56 crc kubenswrapper[4792]: I0309 09:30:56.825236 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 09 09:30:56 crc kubenswrapper[4792]: I0309 09:30:56.843204 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a20da79f-1b2b-4d52-bf44-4c6a9bf0f210-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"a20da79f-1b2b-4d52-bf44-4c6a9bf0f210\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 
09:30:56 crc kubenswrapper[4792]: I0309 09:30:56.843255 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a20da79f-1b2b-4d52-bf44-4c6a9bf0f210-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"a20da79f-1b2b-4d52-bf44-4c6a9bf0f210\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 09:30:56 crc kubenswrapper[4792]: I0309 09:30:56.843286 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a20da79f-1b2b-4d52-bf44-4c6a9bf0f210-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"a20da79f-1b2b-4d52-bf44-4c6a9bf0f210\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 09:30:56 crc kubenswrapper[4792]: I0309 09:30:56.843324 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a20da79f-1b2b-4d52-bf44-4c6a9bf0f210-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"a20da79f-1b2b-4d52-bf44-4c6a9bf0f210\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 09:30:56 crc kubenswrapper[4792]: I0309 09:30:56.843368 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a20da79f-1b2b-4d52-bf44-4c6a9bf0f210-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"a20da79f-1b2b-4d52-bf44-4c6a9bf0f210\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 09:30:56 crc kubenswrapper[4792]: I0309 09:30:56.843442 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2h7h8\" (UniqueName: \"kubernetes.io/projected/a20da79f-1b2b-4d52-bf44-4c6a9bf0f210-kube-api-access-2h7h8\") pod \"rabbitmq-cell1-server-0\" (UID: \"a20da79f-1b2b-4d52-bf44-4c6a9bf0f210\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 
09:30:56 crc kubenswrapper[4792]: I0309 09:30:56.843511 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"a20da79f-1b2b-4d52-bf44-4c6a9bf0f210\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 09:30:56 crc kubenswrapper[4792]: I0309 09:30:56.843553 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/a20da79f-1b2b-4d52-bf44-4c6a9bf0f210-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"a20da79f-1b2b-4d52-bf44-4c6a9bf0f210\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 09:30:56 crc kubenswrapper[4792]: I0309 09:30:56.843616 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a20da79f-1b2b-4d52-bf44-4c6a9bf0f210-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"a20da79f-1b2b-4d52-bf44-4c6a9bf0f210\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 09:30:56 crc kubenswrapper[4792]: I0309 09:30:56.843641 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a20da79f-1b2b-4d52-bf44-4c6a9bf0f210-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"a20da79f-1b2b-4d52-bf44-4c6a9bf0f210\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 09:30:56 crc kubenswrapper[4792]: I0309 09:30:56.843658 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a20da79f-1b2b-4d52-bf44-4c6a9bf0f210-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"a20da79f-1b2b-4d52-bf44-4c6a9bf0f210\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 09:30:56 crc 
kubenswrapper[4792]: I0309 09:30:56.948448 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a20da79f-1b2b-4d52-bf44-4c6a9bf0f210-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"a20da79f-1b2b-4d52-bf44-4c6a9bf0f210\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 09:30:56 crc kubenswrapper[4792]: I0309 09:30:56.948892 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a20da79f-1b2b-4d52-bf44-4c6a9bf0f210-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"a20da79f-1b2b-4d52-bf44-4c6a9bf0f210\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 09:30:56 crc kubenswrapper[4792]: I0309 09:30:56.960018 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a20da79f-1b2b-4d52-bf44-4c6a9bf0f210-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"a20da79f-1b2b-4d52-bf44-4c6a9bf0f210\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 09:30:56 crc kubenswrapper[4792]: I0309 09:30:56.960358 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a20da79f-1b2b-4d52-bf44-4c6a9bf0f210-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"a20da79f-1b2b-4d52-bf44-4c6a9bf0f210\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 09:30:56 crc kubenswrapper[4792]: I0309 09:30:56.960543 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a20da79f-1b2b-4d52-bf44-4c6a9bf0f210-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"a20da79f-1b2b-4d52-bf44-4c6a9bf0f210\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 09:30:56 crc kubenswrapper[4792]: I0309 09:30:56.960782 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-2h7h8\" (UniqueName: \"kubernetes.io/projected/a20da79f-1b2b-4d52-bf44-4c6a9bf0f210-kube-api-access-2h7h8\") pod \"rabbitmq-cell1-server-0\" (UID: \"a20da79f-1b2b-4d52-bf44-4c6a9bf0f210\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 09:30:56 crc kubenswrapper[4792]: I0309 09:30:56.960960 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"a20da79f-1b2b-4d52-bf44-4c6a9bf0f210\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 09:30:56 crc kubenswrapper[4792]: I0309 09:30:56.961153 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/a20da79f-1b2b-4d52-bf44-4c6a9bf0f210-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"a20da79f-1b2b-4d52-bf44-4c6a9bf0f210\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 09:30:56 crc kubenswrapper[4792]: I0309 09:30:56.961271 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a20da79f-1b2b-4d52-bf44-4c6a9bf0f210-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"a20da79f-1b2b-4d52-bf44-4c6a9bf0f210\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 09:30:56 crc kubenswrapper[4792]: I0309 09:30:56.949973 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a20da79f-1b2b-4d52-bf44-4c6a9bf0f210-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"a20da79f-1b2b-4d52-bf44-4c6a9bf0f210\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 09:30:56 crc kubenswrapper[4792]: I0309 09:30:56.948851 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-59c44489bc-7cg56"] Mar 09 09:30:56 crc kubenswrapper[4792]: I0309 09:30:56.962319 4792 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a20da79f-1b2b-4d52-bf44-4c6a9bf0f210-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"a20da79f-1b2b-4d52-bf44-4c6a9bf0f210\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 09:30:56 crc kubenswrapper[4792]: I0309 09:30:56.962746 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a20da79f-1b2b-4d52-bf44-4c6a9bf0f210-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"a20da79f-1b2b-4d52-bf44-4c6a9bf0f210\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 09:30:56 crc kubenswrapper[4792]: I0309 09:30:56.963364 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a20da79f-1b2b-4d52-bf44-4c6a9bf0f210-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"a20da79f-1b2b-4d52-bf44-4c6a9bf0f210\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 09:30:56 crc kubenswrapper[4792]: I0309 09:30:56.963496 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a20da79f-1b2b-4d52-bf44-4c6a9bf0f210-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"a20da79f-1b2b-4d52-bf44-4c6a9bf0f210\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 09:30:56 crc kubenswrapper[4792]: I0309 09:30:56.964018 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a20da79f-1b2b-4d52-bf44-4c6a9bf0f210-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"a20da79f-1b2b-4d52-bf44-4c6a9bf0f210\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 09:30:56 crc kubenswrapper[4792]: I0309 09:30:56.964411 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: 
\"kubernetes.io/configmap/a20da79f-1b2b-4d52-bf44-4c6a9bf0f210-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"a20da79f-1b2b-4d52-bf44-4c6a9bf0f210\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 09:30:56 crc kubenswrapper[4792]: I0309 09:30:56.964672 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a20da79f-1b2b-4d52-bf44-4c6a9bf0f210-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"a20da79f-1b2b-4d52-bf44-4c6a9bf0f210\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 09:30:56 crc kubenswrapper[4792]: I0309 09:30:56.964925 4792 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"a20da79f-1b2b-4d52-bf44-4c6a9bf0f210\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/rabbitmq-cell1-server-0" Mar 09 09:30:56 crc kubenswrapper[4792]: I0309 09:30:56.970759 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a20da79f-1b2b-4d52-bf44-4c6a9bf0f210-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"a20da79f-1b2b-4d52-bf44-4c6a9bf0f210\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 09:30:56 crc kubenswrapper[4792]: I0309 09:30:56.971687 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-59c44489bc-7cg56" Mar 09 09:30:56 crc kubenswrapper[4792]: I0309 09:30:56.976888 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Mar 09 09:30:56 crc kubenswrapper[4792]: I0309 09:30:56.977570 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-59c44489bc-7cg56"] Mar 09 09:30:56 crc kubenswrapper[4792]: I0309 09:30:56.984904 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/a20da79f-1b2b-4d52-bf44-4c6a9bf0f210-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"a20da79f-1b2b-4d52-bf44-4c6a9bf0f210\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 09:30:56 crc kubenswrapper[4792]: I0309 09:30:56.990789 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a20da79f-1b2b-4d52-bf44-4c6a9bf0f210-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"a20da79f-1b2b-4d52-bf44-4c6a9bf0f210\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 09:30:56 crc kubenswrapper[4792]: I0309 09:30:56.991784 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2h7h8\" (UniqueName: \"kubernetes.io/projected/a20da79f-1b2b-4d52-bf44-4c6a9bf0f210-kube-api-access-2h7h8\") pod \"rabbitmq-cell1-server-0\" (UID: \"a20da79f-1b2b-4d52-bf44-4c6a9bf0f210\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 09:30:57 crc kubenswrapper[4792]: I0309 09:30:57.058591 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"a20da79f-1b2b-4d52-bf44-4c6a9bf0f210\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 09:30:57 crc kubenswrapper[4792]: I0309 09:30:57.065355 4792 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/ea361156-3381-4511-8a09-ec3daaee9ef0-openstack-edpm-ipam\") pod \"dnsmasq-dns-59c44489bc-7cg56\" (UID: \"ea361156-3381-4511-8a09-ec3daaee9ef0\") " pod="openstack/dnsmasq-dns-59c44489bc-7cg56" Mar 09 09:30:57 crc kubenswrapper[4792]: I0309 09:30:57.065460 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ea361156-3381-4511-8a09-ec3daaee9ef0-config\") pod \"dnsmasq-dns-59c44489bc-7cg56\" (UID: \"ea361156-3381-4511-8a09-ec3daaee9ef0\") " pod="openstack/dnsmasq-dns-59c44489bc-7cg56" Mar 09 09:30:57 crc kubenswrapper[4792]: I0309 09:30:57.065498 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ea361156-3381-4511-8a09-ec3daaee9ef0-ovsdbserver-nb\") pod \"dnsmasq-dns-59c44489bc-7cg56\" (UID: \"ea361156-3381-4511-8a09-ec3daaee9ef0\") " pod="openstack/dnsmasq-dns-59c44489bc-7cg56" Mar 09 09:30:57 crc kubenswrapper[4792]: I0309 09:30:57.065661 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s2wzg\" (UniqueName: \"kubernetes.io/projected/ea361156-3381-4511-8a09-ec3daaee9ef0-kube-api-access-s2wzg\") pod \"dnsmasq-dns-59c44489bc-7cg56\" (UID: \"ea361156-3381-4511-8a09-ec3daaee9ef0\") " pod="openstack/dnsmasq-dns-59c44489bc-7cg56" Mar 09 09:30:57 crc kubenswrapper[4792]: I0309 09:30:57.065726 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ea361156-3381-4511-8a09-ec3daaee9ef0-dns-svc\") pod \"dnsmasq-dns-59c44489bc-7cg56\" (UID: \"ea361156-3381-4511-8a09-ec3daaee9ef0\") " pod="openstack/dnsmasq-dns-59c44489bc-7cg56" Mar 09 09:30:57 crc kubenswrapper[4792]: I0309 
09:30:57.065822 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ea361156-3381-4511-8a09-ec3daaee9ef0-ovsdbserver-sb\") pod \"dnsmasq-dns-59c44489bc-7cg56\" (UID: \"ea361156-3381-4511-8a09-ec3daaee9ef0\") " pod="openstack/dnsmasq-dns-59c44489bc-7cg56" Mar 09 09:30:57 crc kubenswrapper[4792]: I0309 09:30:57.168319 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ea361156-3381-4511-8a09-ec3daaee9ef0-ovsdbserver-sb\") pod \"dnsmasq-dns-59c44489bc-7cg56\" (UID: \"ea361156-3381-4511-8a09-ec3daaee9ef0\") " pod="openstack/dnsmasq-dns-59c44489bc-7cg56" Mar 09 09:30:57 crc kubenswrapper[4792]: I0309 09:30:57.168402 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/ea361156-3381-4511-8a09-ec3daaee9ef0-openstack-edpm-ipam\") pod \"dnsmasq-dns-59c44489bc-7cg56\" (UID: \"ea361156-3381-4511-8a09-ec3daaee9ef0\") " pod="openstack/dnsmasq-dns-59c44489bc-7cg56" Mar 09 09:30:57 crc kubenswrapper[4792]: I0309 09:30:57.168449 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ea361156-3381-4511-8a09-ec3daaee9ef0-config\") pod \"dnsmasq-dns-59c44489bc-7cg56\" (UID: \"ea361156-3381-4511-8a09-ec3daaee9ef0\") " pod="openstack/dnsmasq-dns-59c44489bc-7cg56" Mar 09 09:30:57 crc kubenswrapper[4792]: I0309 09:30:57.168475 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ea361156-3381-4511-8a09-ec3daaee9ef0-ovsdbserver-nb\") pod \"dnsmasq-dns-59c44489bc-7cg56\" (UID: \"ea361156-3381-4511-8a09-ec3daaee9ef0\") " pod="openstack/dnsmasq-dns-59c44489bc-7cg56" Mar 09 09:30:57 crc kubenswrapper[4792]: I0309 09:30:57.168600 4792 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2wzg\" (UniqueName: \"kubernetes.io/projected/ea361156-3381-4511-8a09-ec3daaee9ef0-kube-api-access-s2wzg\") pod \"dnsmasq-dns-59c44489bc-7cg56\" (UID: \"ea361156-3381-4511-8a09-ec3daaee9ef0\") " pod="openstack/dnsmasq-dns-59c44489bc-7cg56" Mar 09 09:30:57 crc kubenswrapper[4792]: I0309 09:30:57.168788 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ea361156-3381-4511-8a09-ec3daaee9ef0-dns-svc\") pod \"dnsmasq-dns-59c44489bc-7cg56\" (UID: \"ea361156-3381-4511-8a09-ec3daaee9ef0\") " pod="openstack/dnsmasq-dns-59c44489bc-7cg56" Mar 09 09:30:57 crc kubenswrapper[4792]: I0309 09:30:57.169442 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ea361156-3381-4511-8a09-ec3daaee9ef0-ovsdbserver-sb\") pod \"dnsmasq-dns-59c44489bc-7cg56\" (UID: \"ea361156-3381-4511-8a09-ec3daaee9ef0\") " pod="openstack/dnsmasq-dns-59c44489bc-7cg56" Mar 09 09:30:57 crc kubenswrapper[4792]: I0309 09:30:57.169551 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/ea361156-3381-4511-8a09-ec3daaee9ef0-openstack-edpm-ipam\") pod \"dnsmasq-dns-59c44489bc-7cg56\" (UID: \"ea361156-3381-4511-8a09-ec3daaee9ef0\") " pod="openstack/dnsmasq-dns-59c44489bc-7cg56" Mar 09 09:30:57 crc kubenswrapper[4792]: I0309 09:30:57.169753 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ea361156-3381-4511-8a09-ec3daaee9ef0-ovsdbserver-nb\") pod \"dnsmasq-dns-59c44489bc-7cg56\" (UID: \"ea361156-3381-4511-8a09-ec3daaee9ef0\") " pod="openstack/dnsmasq-dns-59c44489bc-7cg56" Mar 09 09:30:57 crc kubenswrapper[4792]: I0309 09:30:57.170652 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ea361156-3381-4511-8a09-ec3daaee9ef0-dns-svc\") pod \"dnsmasq-dns-59c44489bc-7cg56\" (UID: \"ea361156-3381-4511-8a09-ec3daaee9ef0\") " pod="openstack/dnsmasq-dns-59c44489bc-7cg56" Mar 09 09:30:57 crc kubenswrapper[4792]: I0309 09:30:57.171793 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ea361156-3381-4511-8a09-ec3daaee9ef0-config\") pod \"dnsmasq-dns-59c44489bc-7cg56\" (UID: \"ea361156-3381-4511-8a09-ec3daaee9ef0\") " pod="openstack/dnsmasq-dns-59c44489bc-7cg56" Mar 09 09:30:57 crc kubenswrapper[4792]: I0309 09:30:57.192853 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2wzg\" (UniqueName: \"kubernetes.io/projected/ea361156-3381-4511-8a09-ec3daaee9ef0-kube-api-access-s2wzg\") pod \"dnsmasq-dns-59c44489bc-7cg56\" (UID: \"ea361156-3381-4511-8a09-ec3daaee9ef0\") " pod="openstack/dnsmasq-dns-59c44489bc-7cg56" Mar 09 09:30:57 crc kubenswrapper[4792]: I0309 09:30:57.224358 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 09 09:30:57 crc kubenswrapper[4792]: I0309 09:30:57.322682 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-59c44489bc-7cg56" Mar 09 09:30:57 crc kubenswrapper[4792]: I0309 09:30:57.527450 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"6a994be4-9a88-4ee6-8e24-a6d62898f593","Type":"ContainerStarted","Data":"94e36be156d17c304aa2718e6e77fdee50cb71350f4f77877d73c63a4dd4b976"} Mar 09 09:30:57 crc kubenswrapper[4792]: I0309 09:30:57.748147 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ee86e97-a22c-4089-9ce4-363cb0571173" path="/var/lib/kubelet/pods/0ee86e97-a22c-4089-9ce4-363cb0571173/volumes" Mar 09 09:30:57 crc kubenswrapper[4792]: I0309 09:30:57.767940 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-59c44489bc-7cg56"] Mar 09 09:30:57 crc kubenswrapper[4792]: I0309 09:30:57.852340 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 09 09:30:58 crc kubenswrapper[4792]: I0309 09:30:58.568363 4792 generic.go:334] "Generic (PLEG): container finished" podID="ea361156-3381-4511-8a09-ec3daaee9ef0" containerID="aaec01a90e20a677a79d3063eb607ff4e000cc9e9eccf7cde01d76d6b321a3b1" exitCode=0 Mar 09 09:30:58 crc kubenswrapper[4792]: I0309 09:30:58.568786 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59c44489bc-7cg56" event={"ID":"ea361156-3381-4511-8a09-ec3daaee9ef0","Type":"ContainerDied","Data":"aaec01a90e20a677a79d3063eb607ff4e000cc9e9eccf7cde01d76d6b321a3b1"} Mar 09 09:30:58 crc kubenswrapper[4792]: I0309 09:30:58.568819 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59c44489bc-7cg56" event={"ID":"ea361156-3381-4511-8a09-ec3daaee9ef0","Type":"ContainerStarted","Data":"e5beb73756165c0713408a28064146d444246eabdfcc6a4431f42f5823470b53"} Mar 09 09:30:58 crc kubenswrapper[4792]: I0309 09:30:58.586769 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" 
event={"ID":"6a994be4-9a88-4ee6-8e24-a6d62898f593","Type":"ContainerStarted","Data":"671fa139dd2ef4dac72fada3f77ac5db1589432c9b11c10e26e683187e9e6409"} Mar 09 09:30:58 crc kubenswrapper[4792]: I0309 09:30:58.594382 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"a20da79f-1b2b-4d52-bf44-4c6a9bf0f210","Type":"ContainerStarted","Data":"e99c25ce9bbe688637bb8343e1cd24e0949df61b390563a7f974f09b2f27d565"} Mar 09 09:30:59 crc kubenswrapper[4792]: I0309 09:30:59.605103 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"a20da79f-1b2b-4d52-bf44-4c6a9bf0f210","Type":"ContainerStarted","Data":"0b0fb83ecd6aec03bfa921c2b3f3e25aa4acf27ecf6d6eff6752d36b011cfbf8"} Mar 09 09:30:59 crc kubenswrapper[4792]: I0309 09:30:59.608592 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59c44489bc-7cg56" event={"ID":"ea361156-3381-4511-8a09-ec3daaee9ef0","Type":"ContainerStarted","Data":"268847f67fe50a21d526607f1c1c0cc48a878d0b3f25ba340b544abb8fb0b209"} Mar 09 09:30:59 crc kubenswrapper[4792]: I0309 09:30:59.670102 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-59c44489bc-7cg56" podStartSLOduration=3.670069081 podStartE2EDuration="3.670069081s" podCreationTimestamp="2026-03-09 09:30:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:30:59.658819597 +0000 UTC m=+1424.689020359" watchObservedRunningTime="2026-03-09 09:30:59.670069081 +0000 UTC m=+1424.700269833" Mar 09 09:31:00 crc kubenswrapper[4792]: I0309 09:31:00.616591 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-59c44489bc-7cg56" Mar 09 09:31:07 crc kubenswrapper[4792]: I0309 09:31:07.325352 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/dnsmasq-dns-59c44489bc-7cg56" Mar 09 09:31:07 crc kubenswrapper[4792]: I0309 09:31:07.424379 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6c74598c69-6shrv"] Mar 09 09:31:07 crc kubenswrapper[4792]: I0309 09:31:07.424651 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6c74598c69-6shrv" podUID="33ae6938-dd8e-4224-9522-a9b8638c8c54" containerName="dnsmasq-dns" containerID="cri-o://e42d0097053816c2b84291655850053d2d6ec8c2668daf65eac099034ab2d71f" gracePeriod=10 Mar 09 09:31:07 crc kubenswrapper[4792]: I0309 09:31:07.750396 4792 generic.go:334] "Generic (PLEG): container finished" podID="33ae6938-dd8e-4224-9522-a9b8638c8c54" containerID="e42d0097053816c2b84291655850053d2d6ec8c2668daf65eac099034ab2d71f" exitCode=0 Mar 09 09:31:07 crc kubenswrapper[4792]: I0309 09:31:07.750723 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c74598c69-6shrv" event={"ID":"33ae6938-dd8e-4224-9522-a9b8638c8c54","Type":"ContainerDied","Data":"e42d0097053816c2b84291655850053d2d6ec8c2668daf65eac099034ab2d71f"} Mar 09 09:31:07 crc kubenswrapper[4792]: I0309 09:31:07.758663 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-cb7494899-5jcjx"] Mar 09 09:31:07 crc kubenswrapper[4792]: I0309 09:31:07.776620 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-cb7494899-5jcjx"] Mar 09 09:31:07 crc kubenswrapper[4792]: I0309 09:31:07.776987 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-cb7494899-5jcjx" Mar 09 09:31:07 crc kubenswrapper[4792]: I0309 09:31:07.919974 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/de676155-c813-4600-8042-5434607f00ca-ovsdbserver-sb\") pod \"dnsmasq-dns-cb7494899-5jcjx\" (UID: \"de676155-c813-4600-8042-5434607f00ca\") " pod="openstack/dnsmasq-dns-cb7494899-5jcjx" Mar 09 09:31:07 crc kubenswrapper[4792]: I0309 09:31:07.920023 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/de676155-c813-4600-8042-5434607f00ca-ovsdbserver-nb\") pod \"dnsmasq-dns-cb7494899-5jcjx\" (UID: \"de676155-c813-4600-8042-5434607f00ca\") " pod="openstack/dnsmasq-dns-cb7494899-5jcjx" Mar 09 09:31:07 crc kubenswrapper[4792]: I0309 09:31:07.920067 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/de676155-c813-4600-8042-5434607f00ca-openstack-edpm-ipam\") pod \"dnsmasq-dns-cb7494899-5jcjx\" (UID: \"de676155-c813-4600-8042-5434607f00ca\") " pod="openstack/dnsmasq-dns-cb7494899-5jcjx" Mar 09 09:31:07 crc kubenswrapper[4792]: I0309 09:31:07.920105 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p26fs\" (UniqueName: \"kubernetes.io/projected/de676155-c813-4600-8042-5434607f00ca-kube-api-access-p26fs\") pod \"dnsmasq-dns-cb7494899-5jcjx\" (UID: \"de676155-c813-4600-8042-5434607f00ca\") " pod="openstack/dnsmasq-dns-cb7494899-5jcjx" Mar 09 09:31:07 crc kubenswrapper[4792]: I0309 09:31:07.920121 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/de676155-c813-4600-8042-5434607f00ca-dns-svc\") pod 
\"dnsmasq-dns-cb7494899-5jcjx\" (UID: \"de676155-c813-4600-8042-5434607f00ca\") " pod="openstack/dnsmasq-dns-cb7494899-5jcjx" Mar 09 09:31:07 crc kubenswrapper[4792]: I0309 09:31:07.920381 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/de676155-c813-4600-8042-5434607f00ca-config\") pod \"dnsmasq-dns-cb7494899-5jcjx\" (UID: \"de676155-c813-4600-8042-5434607f00ca\") " pod="openstack/dnsmasq-dns-cb7494899-5jcjx" Mar 09 09:31:08 crc kubenswrapper[4792]: I0309 09:31:08.022422 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/de676155-c813-4600-8042-5434607f00ca-ovsdbserver-sb\") pod \"dnsmasq-dns-cb7494899-5jcjx\" (UID: \"de676155-c813-4600-8042-5434607f00ca\") " pod="openstack/dnsmasq-dns-cb7494899-5jcjx" Mar 09 09:31:08 crc kubenswrapper[4792]: I0309 09:31:08.022814 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/de676155-c813-4600-8042-5434607f00ca-ovsdbserver-nb\") pod \"dnsmasq-dns-cb7494899-5jcjx\" (UID: \"de676155-c813-4600-8042-5434607f00ca\") " pod="openstack/dnsmasq-dns-cb7494899-5jcjx" Mar 09 09:31:08 crc kubenswrapper[4792]: I0309 09:31:08.022899 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/de676155-c813-4600-8042-5434607f00ca-openstack-edpm-ipam\") pod \"dnsmasq-dns-cb7494899-5jcjx\" (UID: \"de676155-c813-4600-8042-5434607f00ca\") " pod="openstack/dnsmasq-dns-cb7494899-5jcjx" Mar 09 09:31:08 crc kubenswrapper[4792]: I0309 09:31:08.022954 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p26fs\" (UniqueName: \"kubernetes.io/projected/de676155-c813-4600-8042-5434607f00ca-kube-api-access-p26fs\") pod \"dnsmasq-dns-cb7494899-5jcjx\" (UID: 
\"de676155-c813-4600-8042-5434607f00ca\") " pod="openstack/dnsmasq-dns-cb7494899-5jcjx" Mar 09 09:31:08 crc kubenswrapper[4792]: I0309 09:31:08.022984 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/de676155-c813-4600-8042-5434607f00ca-dns-svc\") pod \"dnsmasq-dns-cb7494899-5jcjx\" (UID: \"de676155-c813-4600-8042-5434607f00ca\") " pod="openstack/dnsmasq-dns-cb7494899-5jcjx" Mar 09 09:31:08 crc kubenswrapper[4792]: I0309 09:31:08.023129 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/de676155-c813-4600-8042-5434607f00ca-config\") pod \"dnsmasq-dns-cb7494899-5jcjx\" (UID: \"de676155-c813-4600-8042-5434607f00ca\") " pod="openstack/dnsmasq-dns-cb7494899-5jcjx" Mar 09 09:31:08 crc kubenswrapper[4792]: I0309 09:31:08.023464 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/de676155-c813-4600-8042-5434607f00ca-ovsdbserver-sb\") pod \"dnsmasq-dns-cb7494899-5jcjx\" (UID: \"de676155-c813-4600-8042-5434607f00ca\") " pod="openstack/dnsmasq-dns-cb7494899-5jcjx" Mar 09 09:31:08 crc kubenswrapper[4792]: I0309 09:31:08.023898 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/de676155-c813-4600-8042-5434607f00ca-openstack-edpm-ipam\") pod \"dnsmasq-dns-cb7494899-5jcjx\" (UID: \"de676155-c813-4600-8042-5434607f00ca\") " pod="openstack/dnsmasq-dns-cb7494899-5jcjx" Mar 09 09:31:08 crc kubenswrapper[4792]: I0309 09:31:08.024091 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/de676155-c813-4600-8042-5434607f00ca-dns-svc\") pod \"dnsmasq-dns-cb7494899-5jcjx\" (UID: \"de676155-c813-4600-8042-5434607f00ca\") " pod="openstack/dnsmasq-dns-cb7494899-5jcjx" Mar 09 09:31:08 crc 
kubenswrapper[4792]: I0309 09:31:08.024753 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/de676155-c813-4600-8042-5434607f00ca-ovsdbserver-nb\") pod \"dnsmasq-dns-cb7494899-5jcjx\" (UID: \"de676155-c813-4600-8042-5434607f00ca\") " pod="openstack/dnsmasq-dns-cb7494899-5jcjx" Mar 09 09:31:08 crc kubenswrapper[4792]: I0309 09:31:08.028798 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/de676155-c813-4600-8042-5434607f00ca-config\") pod \"dnsmasq-dns-cb7494899-5jcjx\" (UID: \"de676155-c813-4600-8042-5434607f00ca\") " pod="openstack/dnsmasq-dns-cb7494899-5jcjx" Mar 09 09:31:08 crc kubenswrapper[4792]: I0309 09:31:08.066362 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p26fs\" (UniqueName: \"kubernetes.io/projected/de676155-c813-4600-8042-5434607f00ca-kube-api-access-p26fs\") pod \"dnsmasq-dns-cb7494899-5jcjx\" (UID: \"de676155-c813-4600-8042-5434607f00ca\") " pod="openstack/dnsmasq-dns-cb7494899-5jcjx" Mar 09 09:31:08 crc kubenswrapper[4792]: I0309 09:31:08.116302 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-cb7494899-5jcjx" Mar 09 09:31:08 crc kubenswrapper[4792]: I0309 09:31:08.157822 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6c74598c69-6shrv" Mar 09 09:31:08 crc kubenswrapper[4792]: I0309 09:31:08.332049 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/33ae6938-dd8e-4224-9522-a9b8638c8c54-dns-svc\") pod \"33ae6938-dd8e-4224-9522-a9b8638c8c54\" (UID: \"33ae6938-dd8e-4224-9522-a9b8638c8c54\") " Mar 09 09:31:08 crc kubenswrapper[4792]: I0309 09:31:08.332183 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lcjq5\" (UniqueName: \"kubernetes.io/projected/33ae6938-dd8e-4224-9522-a9b8638c8c54-kube-api-access-lcjq5\") pod \"33ae6938-dd8e-4224-9522-a9b8638c8c54\" (UID: \"33ae6938-dd8e-4224-9522-a9b8638c8c54\") " Mar 09 09:31:08 crc kubenswrapper[4792]: I0309 09:31:08.332243 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/33ae6938-dd8e-4224-9522-a9b8638c8c54-ovsdbserver-sb\") pod \"33ae6938-dd8e-4224-9522-a9b8638c8c54\" (UID: \"33ae6938-dd8e-4224-9522-a9b8638c8c54\") " Mar 09 09:31:08 crc kubenswrapper[4792]: I0309 09:31:08.332297 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/33ae6938-dd8e-4224-9522-a9b8638c8c54-ovsdbserver-nb\") pod \"33ae6938-dd8e-4224-9522-a9b8638c8c54\" (UID: \"33ae6938-dd8e-4224-9522-a9b8638c8c54\") " Mar 09 09:31:08 crc kubenswrapper[4792]: I0309 09:31:08.332400 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/33ae6938-dd8e-4224-9522-a9b8638c8c54-config\") pod \"33ae6938-dd8e-4224-9522-a9b8638c8c54\" (UID: \"33ae6938-dd8e-4224-9522-a9b8638c8c54\") " Mar 09 09:31:08 crc kubenswrapper[4792]: I0309 09:31:08.342404 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/33ae6938-dd8e-4224-9522-a9b8638c8c54-kube-api-access-lcjq5" (OuterVolumeSpecName: "kube-api-access-lcjq5") pod "33ae6938-dd8e-4224-9522-a9b8638c8c54" (UID: "33ae6938-dd8e-4224-9522-a9b8638c8c54"). InnerVolumeSpecName "kube-api-access-lcjq5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:31:08 crc kubenswrapper[4792]: I0309 09:31:08.435270 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lcjq5\" (UniqueName: \"kubernetes.io/projected/33ae6938-dd8e-4224-9522-a9b8638c8c54-kube-api-access-lcjq5\") on node \"crc\" DevicePath \"\"" Mar 09 09:31:08 crc kubenswrapper[4792]: I0309 09:31:08.459913 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/33ae6938-dd8e-4224-9522-a9b8638c8c54-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "33ae6938-dd8e-4224-9522-a9b8638c8c54" (UID: "33ae6938-dd8e-4224-9522-a9b8638c8c54"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:31:08 crc kubenswrapper[4792]: I0309 09:31:08.477866 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/33ae6938-dd8e-4224-9522-a9b8638c8c54-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "33ae6938-dd8e-4224-9522-a9b8638c8c54" (UID: "33ae6938-dd8e-4224-9522-a9b8638c8c54"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:31:08 crc kubenswrapper[4792]: I0309 09:31:08.521355 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/33ae6938-dd8e-4224-9522-a9b8638c8c54-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "33ae6938-dd8e-4224-9522-a9b8638c8c54" (UID: "33ae6938-dd8e-4224-9522-a9b8638c8c54"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:31:08 crc kubenswrapper[4792]: I0309 09:31:08.531580 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/33ae6938-dd8e-4224-9522-a9b8638c8c54-config" (OuterVolumeSpecName: "config") pod "33ae6938-dd8e-4224-9522-a9b8638c8c54" (UID: "33ae6938-dd8e-4224-9522-a9b8638c8c54"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:31:08 crc kubenswrapper[4792]: I0309 09:31:08.541248 4792 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/33ae6938-dd8e-4224-9522-a9b8638c8c54-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 09 09:31:08 crc kubenswrapper[4792]: I0309 09:31:08.541284 4792 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/33ae6938-dd8e-4224-9522-a9b8638c8c54-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 09 09:31:08 crc kubenswrapper[4792]: I0309 09:31:08.541299 4792 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/33ae6938-dd8e-4224-9522-a9b8638c8c54-config\") on node \"crc\" DevicePath \"\"" Mar 09 09:31:08 crc kubenswrapper[4792]: I0309 09:31:08.541309 4792 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/33ae6938-dd8e-4224-9522-a9b8638c8c54-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 09 09:31:08 crc kubenswrapper[4792]: I0309 09:31:08.721017 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-cb7494899-5jcjx"] Mar 09 09:31:08 crc kubenswrapper[4792]: W0309 09:31:08.735220 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podde676155_c813_4600_8042_5434607f00ca.slice/crio-f4d91a22bd6bedddb254e3c826ac6b571669012cbda350d2269756d3746eca46 
WatchSource:0}: Error finding container f4d91a22bd6bedddb254e3c826ac6b571669012cbda350d2269756d3746eca46: Status 404 returned error can't find the container with id f4d91a22bd6bedddb254e3c826ac6b571669012cbda350d2269756d3746eca46 Mar 09 09:31:08 crc kubenswrapper[4792]: I0309 09:31:08.765008 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c74598c69-6shrv" event={"ID":"33ae6938-dd8e-4224-9522-a9b8638c8c54","Type":"ContainerDied","Data":"32dc4c2ea711b8a732a84a026bd3993ba22ae1779d4ac43055a69810990c1b31"} Mar 09 09:31:08 crc kubenswrapper[4792]: I0309 09:31:08.765064 4792 scope.go:117] "RemoveContainer" containerID="e42d0097053816c2b84291655850053d2d6ec8c2668daf65eac099034ab2d71f" Mar 09 09:31:08 crc kubenswrapper[4792]: I0309 09:31:08.765247 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6c74598c69-6shrv" Mar 09 09:31:08 crc kubenswrapper[4792]: I0309 09:31:08.770912 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cb7494899-5jcjx" event={"ID":"de676155-c813-4600-8042-5434607f00ca","Type":"ContainerStarted","Data":"f4d91a22bd6bedddb254e3c826ac6b571669012cbda350d2269756d3746eca46"} Mar 09 09:31:08 crc kubenswrapper[4792]: I0309 09:31:08.914960 4792 scope.go:117] "RemoveContainer" containerID="ea45b9054f0415dd359b4b57cdaf77611c1469f7bbd9082b8447686a15720167" Mar 09 09:31:08 crc kubenswrapper[4792]: I0309 09:31:08.930518 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6c74598c69-6shrv"] Mar 09 09:31:08 crc kubenswrapper[4792]: I0309 09:31:08.948333 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6c74598c69-6shrv"] Mar 09 09:31:09 crc kubenswrapper[4792]: I0309 09:31:09.674157 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="33ae6938-dd8e-4224-9522-a9b8638c8c54" path="/var/lib/kubelet/pods/33ae6938-dd8e-4224-9522-a9b8638c8c54/volumes" Mar 09 09:31:09 
crc kubenswrapper[4792]: I0309 09:31:09.781987 4792 generic.go:334] "Generic (PLEG): container finished" podID="de676155-c813-4600-8042-5434607f00ca" containerID="2f4574d13e7179836e61569ee9d1a7181cb77d1f328d748bdfcf7533c9bd7e7b" exitCode=0 Mar 09 09:31:09 crc kubenswrapper[4792]: I0309 09:31:09.782085 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cb7494899-5jcjx" event={"ID":"de676155-c813-4600-8042-5434607f00ca","Type":"ContainerDied","Data":"2f4574d13e7179836e61569ee9d1a7181cb77d1f328d748bdfcf7533c9bd7e7b"} Mar 09 09:31:10 crc kubenswrapper[4792]: I0309 09:31:10.826179 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cb7494899-5jcjx" event={"ID":"de676155-c813-4600-8042-5434607f00ca","Type":"ContainerStarted","Data":"8049d4eea6b314205e7cf5baaa8bdb7bb7b5c033661c855a3fdfed6a78772f9b"} Mar 09 09:31:10 crc kubenswrapper[4792]: I0309 09:31:10.828279 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-cb7494899-5jcjx" Mar 09 09:31:10 crc kubenswrapper[4792]: I0309 09:31:10.856281 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-cb7494899-5jcjx" podStartSLOduration=3.856250792 podStartE2EDuration="3.856250792s" podCreationTimestamp="2026-03-09 09:31:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:31:10.849350037 +0000 UTC m=+1435.879550789" watchObservedRunningTime="2026-03-09 09:31:10.856250792 +0000 UTC m=+1435.886451564" Mar 09 09:31:18 crc kubenswrapper[4792]: I0309 09:31:18.119311 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-cb7494899-5jcjx" Mar 09 09:31:18 crc kubenswrapper[4792]: I0309 09:31:18.182267 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-59c44489bc-7cg56"] Mar 09 09:31:18 crc kubenswrapper[4792]: 
I0309 09:31:18.182699 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-59c44489bc-7cg56" podUID="ea361156-3381-4511-8a09-ec3daaee9ef0" containerName="dnsmasq-dns" containerID="cri-o://268847f67fe50a21d526607f1c1c0cc48a878d0b3f25ba340b544abb8fb0b209" gracePeriod=10 Mar 09 09:31:18 crc kubenswrapper[4792]: I0309 09:31:18.731645 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-59c44489bc-7cg56" Mar 09 09:31:18 crc kubenswrapper[4792]: I0309 09:31:18.854975 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ea361156-3381-4511-8a09-ec3daaee9ef0-config\") pod \"ea361156-3381-4511-8a09-ec3daaee9ef0\" (UID: \"ea361156-3381-4511-8a09-ec3daaee9ef0\") " Mar 09 09:31:18 crc kubenswrapper[4792]: I0309 09:31:18.855053 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s2wzg\" (UniqueName: \"kubernetes.io/projected/ea361156-3381-4511-8a09-ec3daaee9ef0-kube-api-access-s2wzg\") pod \"ea361156-3381-4511-8a09-ec3daaee9ef0\" (UID: \"ea361156-3381-4511-8a09-ec3daaee9ef0\") " Mar 09 09:31:18 crc kubenswrapper[4792]: I0309 09:31:18.855142 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ea361156-3381-4511-8a09-ec3daaee9ef0-dns-svc\") pod \"ea361156-3381-4511-8a09-ec3daaee9ef0\" (UID: \"ea361156-3381-4511-8a09-ec3daaee9ef0\") " Mar 09 09:31:18 crc kubenswrapper[4792]: I0309 09:31:18.855249 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ea361156-3381-4511-8a09-ec3daaee9ef0-ovsdbserver-sb\") pod \"ea361156-3381-4511-8a09-ec3daaee9ef0\" (UID: \"ea361156-3381-4511-8a09-ec3daaee9ef0\") " Mar 09 09:31:18 crc kubenswrapper[4792]: I0309 09:31:18.855291 4792 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ea361156-3381-4511-8a09-ec3daaee9ef0-ovsdbserver-nb\") pod \"ea361156-3381-4511-8a09-ec3daaee9ef0\" (UID: \"ea361156-3381-4511-8a09-ec3daaee9ef0\") " Mar 09 09:31:18 crc kubenswrapper[4792]: I0309 09:31:18.855310 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/ea361156-3381-4511-8a09-ec3daaee9ef0-openstack-edpm-ipam\") pod \"ea361156-3381-4511-8a09-ec3daaee9ef0\" (UID: \"ea361156-3381-4511-8a09-ec3daaee9ef0\") " Mar 09 09:31:18 crc kubenswrapper[4792]: I0309 09:31:18.861024 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea361156-3381-4511-8a09-ec3daaee9ef0-kube-api-access-s2wzg" (OuterVolumeSpecName: "kube-api-access-s2wzg") pod "ea361156-3381-4511-8a09-ec3daaee9ef0" (UID: "ea361156-3381-4511-8a09-ec3daaee9ef0"). InnerVolumeSpecName "kube-api-access-s2wzg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:31:18 crc kubenswrapper[4792]: I0309 09:31:18.905049 4792 generic.go:334] "Generic (PLEG): container finished" podID="ea361156-3381-4511-8a09-ec3daaee9ef0" containerID="268847f67fe50a21d526607f1c1c0cc48a878d0b3f25ba340b544abb8fb0b209" exitCode=0 Mar 09 09:31:18 crc kubenswrapper[4792]: I0309 09:31:18.905253 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-59c44489bc-7cg56" Mar 09 09:31:18 crc kubenswrapper[4792]: I0309 09:31:18.905313 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59c44489bc-7cg56" event={"ID":"ea361156-3381-4511-8a09-ec3daaee9ef0","Type":"ContainerDied","Data":"268847f67fe50a21d526607f1c1c0cc48a878d0b3f25ba340b544abb8fb0b209"} Mar 09 09:31:18 crc kubenswrapper[4792]: I0309 09:31:18.907680 4792 scope.go:117] "RemoveContainer" containerID="268847f67fe50a21d526607f1c1c0cc48a878d0b3f25ba340b544abb8fb0b209" Mar 09 09:31:18 crc kubenswrapper[4792]: I0309 09:31:18.907537 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59c44489bc-7cg56" event={"ID":"ea361156-3381-4511-8a09-ec3daaee9ef0","Type":"ContainerDied","Data":"e5beb73756165c0713408a28064146d444246eabdfcc6a4431f42f5823470b53"} Mar 09 09:31:18 crc kubenswrapper[4792]: I0309 09:31:18.935731 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ea361156-3381-4511-8a09-ec3daaee9ef0-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ea361156-3381-4511-8a09-ec3daaee9ef0" (UID: "ea361156-3381-4511-8a09-ec3daaee9ef0"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:31:18 crc kubenswrapper[4792]: I0309 09:31:18.937847 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ea361156-3381-4511-8a09-ec3daaee9ef0-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ea361156-3381-4511-8a09-ec3daaee9ef0" (UID: "ea361156-3381-4511-8a09-ec3daaee9ef0"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:31:18 crc kubenswrapper[4792]: I0309 09:31:18.948905 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ea361156-3381-4511-8a09-ec3daaee9ef0-config" (OuterVolumeSpecName: "config") pod "ea361156-3381-4511-8a09-ec3daaee9ef0" (UID: "ea361156-3381-4511-8a09-ec3daaee9ef0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:31:18 crc kubenswrapper[4792]: I0309 09:31:18.954290 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ea361156-3381-4511-8a09-ec3daaee9ef0-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "ea361156-3381-4511-8a09-ec3daaee9ef0" (UID: "ea361156-3381-4511-8a09-ec3daaee9ef0"). InnerVolumeSpecName "openstack-edpm-ipam". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:31:18 crc kubenswrapper[4792]: I0309 09:31:18.957930 4792 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ea361156-3381-4511-8a09-ec3daaee9ef0-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 09 09:31:18 crc kubenswrapper[4792]: I0309 09:31:18.957988 4792 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ea361156-3381-4511-8a09-ec3daaee9ef0-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 09 09:31:18 crc kubenswrapper[4792]: I0309 09:31:18.958002 4792 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/ea361156-3381-4511-8a09-ec3daaee9ef0-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 09 09:31:18 crc kubenswrapper[4792]: I0309 09:31:18.958012 4792 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ea361156-3381-4511-8a09-ec3daaee9ef0-config\") on node \"crc\" DevicePath \"\"" Mar 09 09:31:18 
crc kubenswrapper[4792]: I0309 09:31:18.958021 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s2wzg\" (UniqueName: \"kubernetes.io/projected/ea361156-3381-4511-8a09-ec3daaee9ef0-kube-api-access-s2wzg\") on node \"crc\" DevicePath \"\"" Mar 09 09:31:18 crc kubenswrapper[4792]: I0309 09:31:18.963575 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ea361156-3381-4511-8a09-ec3daaee9ef0-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "ea361156-3381-4511-8a09-ec3daaee9ef0" (UID: "ea361156-3381-4511-8a09-ec3daaee9ef0"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:31:18 crc kubenswrapper[4792]: I0309 09:31:18.975040 4792 scope.go:117] "RemoveContainer" containerID="aaec01a90e20a677a79d3063eb607ff4e000cc9e9eccf7cde01d76d6b321a3b1" Mar 09 09:31:18 crc kubenswrapper[4792]: I0309 09:31:18.997459 4792 scope.go:117] "RemoveContainer" containerID="268847f67fe50a21d526607f1c1c0cc48a878d0b3f25ba340b544abb8fb0b209" Mar 09 09:31:18 crc kubenswrapper[4792]: E0309 09:31:18.998146 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"268847f67fe50a21d526607f1c1c0cc48a878d0b3f25ba340b544abb8fb0b209\": container with ID starting with 268847f67fe50a21d526607f1c1c0cc48a878d0b3f25ba340b544abb8fb0b209 not found: ID does not exist" containerID="268847f67fe50a21d526607f1c1c0cc48a878d0b3f25ba340b544abb8fb0b209" Mar 09 09:31:18 crc kubenswrapper[4792]: I0309 09:31:18.998280 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"268847f67fe50a21d526607f1c1c0cc48a878d0b3f25ba340b544abb8fb0b209"} err="failed to get container status \"268847f67fe50a21d526607f1c1c0cc48a878d0b3f25ba340b544abb8fb0b209\": rpc error: code = NotFound desc = could not find container \"268847f67fe50a21d526607f1c1c0cc48a878d0b3f25ba340b544abb8fb0b209\": 
container with ID starting with 268847f67fe50a21d526607f1c1c0cc48a878d0b3f25ba340b544abb8fb0b209 not found: ID does not exist" Mar 09 09:31:18 crc kubenswrapper[4792]: I0309 09:31:18.998396 4792 scope.go:117] "RemoveContainer" containerID="aaec01a90e20a677a79d3063eb607ff4e000cc9e9eccf7cde01d76d6b321a3b1" Mar 09 09:31:18 crc kubenswrapper[4792]: E0309 09:31:18.998790 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aaec01a90e20a677a79d3063eb607ff4e000cc9e9eccf7cde01d76d6b321a3b1\": container with ID starting with aaec01a90e20a677a79d3063eb607ff4e000cc9e9eccf7cde01d76d6b321a3b1 not found: ID does not exist" containerID="aaec01a90e20a677a79d3063eb607ff4e000cc9e9eccf7cde01d76d6b321a3b1" Mar 09 09:31:18 crc kubenswrapper[4792]: I0309 09:31:18.998813 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aaec01a90e20a677a79d3063eb607ff4e000cc9e9eccf7cde01d76d6b321a3b1"} err="failed to get container status \"aaec01a90e20a677a79d3063eb607ff4e000cc9e9eccf7cde01d76d6b321a3b1\": rpc error: code = NotFound desc = could not find container \"aaec01a90e20a677a79d3063eb607ff4e000cc9e9eccf7cde01d76d6b321a3b1\": container with ID starting with aaec01a90e20a677a79d3063eb607ff4e000cc9e9eccf7cde01d76d6b321a3b1 not found: ID does not exist" Mar 09 09:31:19 crc kubenswrapper[4792]: I0309 09:31:19.060423 4792 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ea361156-3381-4511-8a09-ec3daaee9ef0-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 09 09:31:19 crc kubenswrapper[4792]: I0309 09:31:19.245362 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-59c44489bc-7cg56"] Mar 09 09:31:19 crc kubenswrapper[4792]: I0309 09:31:19.255640 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-59c44489bc-7cg56"] Mar 09 09:31:19 crc 
kubenswrapper[4792]: I0309 09:31:19.679147 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ea361156-3381-4511-8a09-ec3daaee9ef0" path="/var/lib/kubelet/pods/ea361156-3381-4511-8a09-ec3daaee9ef0/volumes" Mar 09 09:31:23 crc kubenswrapper[4792]: I0309 09:31:23.917723 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-gbs7v"] Mar 09 09:31:23 crc kubenswrapper[4792]: E0309 09:31:23.918680 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea361156-3381-4511-8a09-ec3daaee9ef0" containerName="init" Mar 09 09:31:23 crc kubenswrapper[4792]: I0309 09:31:23.918697 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea361156-3381-4511-8a09-ec3daaee9ef0" containerName="init" Mar 09 09:31:23 crc kubenswrapper[4792]: E0309 09:31:23.918733 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33ae6938-dd8e-4224-9522-a9b8638c8c54" containerName="init" Mar 09 09:31:23 crc kubenswrapper[4792]: I0309 09:31:23.918743 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="33ae6938-dd8e-4224-9522-a9b8638c8c54" containerName="init" Mar 09 09:31:23 crc kubenswrapper[4792]: E0309 09:31:23.918755 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea361156-3381-4511-8a09-ec3daaee9ef0" containerName="dnsmasq-dns" Mar 09 09:31:23 crc kubenswrapper[4792]: I0309 09:31:23.918762 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea361156-3381-4511-8a09-ec3daaee9ef0" containerName="dnsmasq-dns" Mar 09 09:31:23 crc kubenswrapper[4792]: E0309 09:31:23.918777 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33ae6938-dd8e-4224-9522-a9b8638c8c54" containerName="dnsmasq-dns" Mar 09 09:31:23 crc kubenswrapper[4792]: I0309 09:31:23.918784 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="33ae6938-dd8e-4224-9522-a9b8638c8c54" containerName="dnsmasq-dns" Mar 09 09:31:23 crc kubenswrapper[4792]: I0309 
09:31:23.918994 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea361156-3381-4511-8a09-ec3daaee9ef0" containerName="dnsmasq-dns" Mar 09 09:31:23 crc kubenswrapper[4792]: I0309 09:31:23.919020 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="33ae6938-dd8e-4224-9522-a9b8638c8c54" containerName="dnsmasq-dns" Mar 09 09:31:23 crc kubenswrapper[4792]: I0309 09:31:23.919758 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-gbs7v" Mar 09 09:31:23 crc kubenswrapper[4792]: I0309 09:31:23.923726 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 09 09:31:23 crc kubenswrapper[4792]: I0309 09:31:23.924026 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 09 09:31:23 crc kubenswrapper[4792]: I0309 09:31:23.926434 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 09 09:31:23 crc kubenswrapper[4792]: I0309 09:31:23.926727 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-4g5l6" Mar 09 09:31:23 crc kubenswrapper[4792]: I0309 09:31:23.948094 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-gbs7v"] Mar 09 09:31:24 crc kubenswrapper[4792]: I0309 09:31:24.095720 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1c47d4d4-f315-4f19-8d62-dbc17c64a39a-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-gbs7v\" (UID: \"1c47d4d4-f315-4f19-8d62-dbc17c64a39a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-gbs7v" Mar 09 09:31:24 crc kubenswrapper[4792]: I0309 
09:31:24.095805 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c47d4d4-f315-4f19-8d62-dbc17c64a39a-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-gbs7v\" (UID: \"1c47d4d4-f315-4f19-8d62-dbc17c64a39a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-gbs7v" Mar 09 09:31:24 crc kubenswrapper[4792]: I0309 09:31:24.096027 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1c47d4d4-f315-4f19-8d62-dbc17c64a39a-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-gbs7v\" (UID: \"1c47d4d4-f315-4f19-8d62-dbc17c64a39a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-gbs7v" Mar 09 09:31:24 crc kubenswrapper[4792]: I0309 09:31:24.096130 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-974tc\" (UniqueName: \"kubernetes.io/projected/1c47d4d4-f315-4f19-8d62-dbc17c64a39a-kube-api-access-974tc\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-gbs7v\" (UID: \"1c47d4d4-f315-4f19-8d62-dbc17c64a39a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-gbs7v" Mar 09 09:31:24 crc kubenswrapper[4792]: I0309 09:31:24.197521 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c47d4d4-f315-4f19-8d62-dbc17c64a39a-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-gbs7v\" (UID: \"1c47d4d4-f315-4f19-8d62-dbc17c64a39a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-gbs7v" Mar 09 09:31:24 crc kubenswrapper[4792]: I0309 09:31:24.197580 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/1c47d4d4-f315-4f19-8d62-dbc17c64a39a-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-gbs7v\" (UID: \"1c47d4d4-f315-4f19-8d62-dbc17c64a39a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-gbs7v" Mar 09 09:31:24 crc kubenswrapper[4792]: I0309 09:31:24.197650 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-974tc\" (UniqueName: \"kubernetes.io/projected/1c47d4d4-f315-4f19-8d62-dbc17c64a39a-kube-api-access-974tc\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-gbs7v\" (UID: \"1c47d4d4-f315-4f19-8d62-dbc17c64a39a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-gbs7v" Mar 09 09:31:24 crc kubenswrapper[4792]: I0309 09:31:24.197739 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1c47d4d4-f315-4f19-8d62-dbc17c64a39a-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-gbs7v\" (UID: \"1c47d4d4-f315-4f19-8d62-dbc17c64a39a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-gbs7v" Mar 09 09:31:24 crc kubenswrapper[4792]: I0309 09:31:24.205156 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1c47d4d4-f315-4f19-8d62-dbc17c64a39a-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-gbs7v\" (UID: \"1c47d4d4-f315-4f19-8d62-dbc17c64a39a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-gbs7v" Mar 09 09:31:24 crc kubenswrapper[4792]: I0309 09:31:24.205865 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c47d4d4-f315-4f19-8d62-dbc17c64a39a-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-gbs7v\" (UID: 
\"1c47d4d4-f315-4f19-8d62-dbc17c64a39a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-gbs7v" Mar 09 09:31:24 crc kubenswrapper[4792]: I0309 09:31:24.211411 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1c47d4d4-f315-4f19-8d62-dbc17c64a39a-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-gbs7v\" (UID: \"1c47d4d4-f315-4f19-8d62-dbc17c64a39a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-gbs7v" Mar 09 09:31:24 crc kubenswrapper[4792]: I0309 09:31:24.220877 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-974tc\" (UniqueName: \"kubernetes.io/projected/1c47d4d4-f315-4f19-8d62-dbc17c64a39a-kube-api-access-974tc\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-gbs7v\" (UID: \"1c47d4d4-f315-4f19-8d62-dbc17c64a39a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-gbs7v" Mar 09 09:31:24 crc kubenswrapper[4792]: I0309 09:31:24.261346 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-gbs7v" Mar 09 09:31:25 crc kubenswrapper[4792]: I0309 09:31:25.004356 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-gbs7v"] Mar 09 09:31:25 crc kubenswrapper[4792]: W0309 09:31:25.004691 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1c47d4d4_f315_4f19_8d62_dbc17c64a39a.slice/crio-f588375048b1983e8fb342e86faa5cc85d581a7ac37f8db81a97970da7e1c838 WatchSource:0}: Error finding container f588375048b1983e8fb342e86faa5cc85d581a7ac37f8db81a97970da7e1c838: Status 404 returned error can't find the container with id f588375048b1983e8fb342e86faa5cc85d581a7ac37f8db81a97970da7e1c838 Mar 09 09:31:25 crc kubenswrapper[4792]: I0309 09:31:25.982631 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-gbs7v" event={"ID":"1c47d4d4-f315-4f19-8d62-dbc17c64a39a","Type":"ContainerStarted","Data":"f588375048b1983e8fb342e86faa5cc85d581a7ac37f8db81a97970da7e1c838"} Mar 09 09:31:31 crc kubenswrapper[4792]: I0309 09:31:31.025747 4792 generic.go:334] "Generic (PLEG): container finished" podID="6a994be4-9a88-4ee6-8e24-a6d62898f593" containerID="671fa139dd2ef4dac72fada3f77ac5db1589432c9b11c10e26e683187e9e6409" exitCode=0 Mar 09 09:31:31 crc kubenswrapper[4792]: I0309 09:31:31.025963 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"6a994be4-9a88-4ee6-8e24-a6d62898f593","Type":"ContainerDied","Data":"671fa139dd2ef4dac72fada3f77ac5db1589432c9b11c10e26e683187e9e6409"} Mar 09 09:31:32 crc kubenswrapper[4792]: I0309 09:31:32.070260 4792 generic.go:334] "Generic (PLEG): container finished" podID="a20da79f-1b2b-4d52-bf44-4c6a9bf0f210" containerID="0b0fb83ecd6aec03bfa921c2b3f3e25aa4acf27ecf6d6eff6752d36b011cfbf8" exitCode=0 Mar 09 09:31:32 crc 
kubenswrapper[4792]: I0309 09:31:32.070315 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"a20da79f-1b2b-4d52-bf44-4c6a9bf0f210","Type":"ContainerDied","Data":"0b0fb83ecd6aec03bfa921c2b3f3e25aa4acf27ecf6d6eff6752d36b011cfbf8"} Mar 09 09:31:37 crc kubenswrapper[4792]: I0309 09:31:37.117458 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"6a994be4-9a88-4ee6-8e24-a6d62898f593","Type":"ContainerStarted","Data":"3f0612d9b7d837be99697ad30230b33255f7960ffa28a2ba78507e3d414ceb44"} Mar 09 09:31:37 crc kubenswrapper[4792]: I0309 09:31:37.118295 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Mar 09 09:31:37 crc kubenswrapper[4792]: I0309 09:31:37.119598 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"a20da79f-1b2b-4d52-bf44-4c6a9bf0f210","Type":"ContainerStarted","Data":"03d14a33f09ae6c9c189413b0f1c9ccced90604241c6c1418efb666c940cb8f8"} Mar 09 09:31:37 crc kubenswrapper[4792]: I0309 09:31:37.119809 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Mar 09 09:31:37 crc kubenswrapper[4792]: I0309 09:31:37.121048 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-gbs7v" event={"ID":"1c47d4d4-f315-4f19-8d62-dbc17c64a39a","Type":"ContainerStarted","Data":"283b5025e57ea97c4dd7b984e88288fbfd73dbe246ca3559abdea0175d56c3b4"} Mar 09 09:31:37 crc kubenswrapper[4792]: I0309 09:31:37.148692 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=42.148672513 podStartE2EDuration="42.148672513s" podCreationTimestamp="2026-03-09 09:30:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-03-09 09:31:37.142426938 +0000 UTC m=+1462.172627690" watchObservedRunningTime="2026-03-09 09:31:37.148672513 +0000 UTC m=+1462.178873265" Mar 09 09:31:37 crc kubenswrapper[4792]: I0309 09:31:37.193810 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=41.193792655 podStartE2EDuration="41.193792655s" podCreationTimestamp="2026-03-09 09:30:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:31:37.189515547 +0000 UTC m=+1462.219716299" watchObservedRunningTime="2026-03-09 09:31:37.193792655 +0000 UTC m=+1462.223993407" Mar 09 09:31:37 crc kubenswrapper[4792]: I0309 09:31:37.213209 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-gbs7v" podStartSLOduration=2.727984077 podStartE2EDuration="14.213185211s" podCreationTimestamp="2026-03-09 09:31:23 +0000 UTC" firstStartedPulling="2026-03-09 09:31:25.008836211 +0000 UTC m=+1450.039036963" lastFinishedPulling="2026-03-09 09:31:36.494037345 +0000 UTC m=+1461.524238097" observedRunningTime="2026-03-09 09:31:37.21143665 +0000 UTC m=+1462.241637412" watchObservedRunningTime="2026-03-09 09:31:37.213185211 +0000 UTC m=+1462.243385963" Mar 09 09:31:46 crc kubenswrapper[4792]: I0309 09:31:46.099440 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="6a994be4-9a88-4ee6-8e24-a6d62898f593" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.198:5671: connect: connection refused" Mar 09 09:31:47 crc kubenswrapper[4792]: I0309 09:31:47.226733 4792 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="a20da79f-1b2b-4d52-bf44-4c6a9bf0f210" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.199:5671: connect: 
connection refused" Mar 09 09:31:53 crc kubenswrapper[4792]: I0309 09:31:53.251877 4792 generic.go:334] "Generic (PLEG): container finished" podID="1c47d4d4-f315-4f19-8d62-dbc17c64a39a" containerID="283b5025e57ea97c4dd7b984e88288fbfd73dbe246ca3559abdea0175d56c3b4" exitCode=0 Mar 09 09:31:53 crc kubenswrapper[4792]: I0309 09:31:53.253942 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-gbs7v" event={"ID":"1c47d4d4-f315-4f19-8d62-dbc17c64a39a","Type":"ContainerDied","Data":"283b5025e57ea97c4dd7b984e88288fbfd73dbe246ca3559abdea0175d56c3b4"} Mar 09 09:31:54 crc kubenswrapper[4792]: I0309 09:31:54.679134 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-gbs7v" Mar 09 09:31:54 crc kubenswrapper[4792]: I0309 09:31:54.810222 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c47d4d4-f315-4f19-8d62-dbc17c64a39a-repo-setup-combined-ca-bundle\") pod \"1c47d4d4-f315-4f19-8d62-dbc17c64a39a\" (UID: \"1c47d4d4-f315-4f19-8d62-dbc17c64a39a\") " Mar 09 09:31:54 crc kubenswrapper[4792]: I0309 09:31:54.810304 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1c47d4d4-f315-4f19-8d62-dbc17c64a39a-inventory\") pod \"1c47d4d4-f315-4f19-8d62-dbc17c64a39a\" (UID: \"1c47d4d4-f315-4f19-8d62-dbc17c64a39a\") " Mar 09 09:31:54 crc kubenswrapper[4792]: I0309 09:31:54.810426 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-974tc\" (UniqueName: \"kubernetes.io/projected/1c47d4d4-f315-4f19-8d62-dbc17c64a39a-kube-api-access-974tc\") pod \"1c47d4d4-f315-4f19-8d62-dbc17c64a39a\" (UID: \"1c47d4d4-f315-4f19-8d62-dbc17c64a39a\") " Mar 09 09:31:54 crc kubenswrapper[4792]: I0309 09:31:54.810514 4792 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1c47d4d4-f315-4f19-8d62-dbc17c64a39a-ssh-key-openstack-edpm-ipam\") pod \"1c47d4d4-f315-4f19-8d62-dbc17c64a39a\" (UID: \"1c47d4d4-f315-4f19-8d62-dbc17c64a39a\") " Mar 09 09:31:54 crc kubenswrapper[4792]: I0309 09:31:54.816022 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c47d4d4-f315-4f19-8d62-dbc17c64a39a-kube-api-access-974tc" (OuterVolumeSpecName: "kube-api-access-974tc") pod "1c47d4d4-f315-4f19-8d62-dbc17c64a39a" (UID: "1c47d4d4-f315-4f19-8d62-dbc17c64a39a"). InnerVolumeSpecName "kube-api-access-974tc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:31:54 crc kubenswrapper[4792]: I0309 09:31:54.822352 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c47d4d4-f315-4f19-8d62-dbc17c64a39a-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "1c47d4d4-f315-4f19-8d62-dbc17c64a39a" (UID: "1c47d4d4-f315-4f19-8d62-dbc17c64a39a"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:31:54 crc kubenswrapper[4792]: I0309 09:31:54.839995 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c47d4d4-f315-4f19-8d62-dbc17c64a39a-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "1c47d4d4-f315-4f19-8d62-dbc17c64a39a" (UID: "1c47d4d4-f315-4f19-8d62-dbc17c64a39a"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:31:54 crc kubenswrapper[4792]: I0309 09:31:54.840882 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c47d4d4-f315-4f19-8d62-dbc17c64a39a-inventory" (OuterVolumeSpecName: "inventory") pod "1c47d4d4-f315-4f19-8d62-dbc17c64a39a" (UID: "1c47d4d4-f315-4f19-8d62-dbc17c64a39a"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:31:54 crc kubenswrapper[4792]: I0309 09:31:54.912628 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-974tc\" (UniqueName: \"kubernetes.io/projected/1c47d4d4-f315-4f19-8d62-dbc17c64a39a-kube-api-access-974tc\") on node \"crc\" DevicePath \"\"" Mar 09 09:31:54 crc kubenswrapper[4792]: I0309 09:31:54.912674 4792 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1c47d4d4-f315-4f19-8d62-dbc17c64a39a-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 09 09:31:54 crc kubenswrapper[4792]: I0309 09:31:54.912690 4792 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c47d4d4-f315-4f19-8d62-dbc17c64a39a-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 09:31:54 crc kubenswrapper[4792]: I0309 09:31:54.912703 4792 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1c47d4d4-f315-4f19-8d62-dbc17c64a39a-inventory\") on node \"crc\" DevicePath \"\"" Mar 09 09:31:55 crc kubenswrapper[4792]: I0309 09:31:55.271677 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-gbs7v" event={"ID":"1c47d4d4-f315-4f19-8d62-dbc17c64a39a","Type":"ContainerDied","Data":"f588375048b1983e8fb342e86faa5cc85d581a7ac37f8db81a97970da7e1c838"} Mar 09 09:31:55 crc kubenswrapper[4792]: I0309 09:31:55.271729 
4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f588375048b1983e8fb342e86faa5cc85d581a7ac37f8db81a97970da7e1c838" Mar 09 09:31:55 crc kubenswrapper[4792]: I0309 09:31:55.271741 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-gbs7v" Mar 09 09:31:55 crc kubenswrapper[4792]: I0309 09:31:55.366147 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-7jfkr"] Mar 09 09:31:55 crc kubenswrapper[4792]: E0309 09:31:55.367008 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c47d4d4-f315-4f19-8d62-dbc17c64a39a" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Mar 09 09:31:55 crc kubenswrapper[4792]: I0309 09:31:55.367028 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c47d4d4-f315-4f19-8d62-dbc17c64a39a" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Mar 09 09:31:55 crc kubenswrapper[4792]: I0309 09:31:55.367265 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c47d4d4-f315-4f19-8d62-dbc17c64a39a" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Mar 09 09:31:55 crc kubenswrapper[4792]: I0309 09:31:55.368006 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-7jfkr" Mar 09 09:31:55 crc kubenswrapper[4792]: I0309 09:31:55.375508 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 09 09:31:55 crc kubenswrapper[4792]: I0309 09:31:55.381262 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 09 09:31:55 crc kubenswrapper[4792]: I0309 09:31:55.381436 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 09 09:31:55 crc kubenswrapper[4792]: I0309 09:31:55.381558 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-4g5l6" Mar 09 09:31:55 crc kubenswrapper[4792]: I0309 09:31:55.391724 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-7jfkr"] Mar 09 09:31:55 crc kubenswrapper[4792]: I0309 09:31:55.420015 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2acce3b0-ccfb-48f6-af71-ecaa5b820874-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-7jfkr\" (UID: \"2acce3b0-ccfb-48f6-af71-ecaa5b820874\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-7jfkr" Mar 09 09:31:55 crc kubenswrapper[4792]: I0309 09:31:55.420121 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2acce3b0-ccfb-48f6-af71-ecaa5b820874-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-7jfkr\" (UID: \"2acce3b0-ccfb-48f6-af71-ecaa5b820874\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-7jfkr" Mar 09 09:31:55 crc kubenswrapper[4792]: I0309 09:31:55.420142 4792 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2acce3b0-ccfb-48f6-af71-ecaa5b820874-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-7jfkr\" (UID: \"2acce3b0-ccfb-48f6-af71-ecaa5b820874\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-7jfkr" Mar 09 09:31:55 crc kubenswrapper[4792]: I0309 09:31:55.420280 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7962k\" (UniqueName: \"kubernetes.io/projected/2acce3b0-ccfb-48f6-af71-ecaa5b820874-kube-api-access-7962k\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-7jfkr\" (UID: \"2acce3b0-ccfb-48f6-af71-ecaa5b820874\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-7jfkr" Mar 09 09:31:55 crc kubenswrapper[4792]: I0309 09:31:55.522034 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7962k\" (UniqueName: \"kubernetes.io/projected/2acce3b0-ccfb-48f6-af71-ecaa5b820874-kube-api-access-7962k\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-7jfkr\" (UID: \"2acce3b0-ccfb-48f6-af71-ecaa5b820874\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-7jfkr" Mar 09 09:31:55 crc kubenswrapper[4792]: I0309 09:31:55.522126 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2acce3b0-ccfb-48f6-af71-ecaa5b820874-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-7jfkr\" (UID: \"2acce3b0-ccfb-48f6-af71-ecaa5b820874\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-7jfkr" Mar 09 09:31:55 crc kubenswrapper[4792]: I0309 09:31:55.522205 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/2acce3b0-ccfb-48f6-af71-ecaa5b820874-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-7jfkr\" (UID: \"2acce3b0-ccfb-48f6-af71-ecaa5b820874\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-7jfkr" Mar 09 09:31:55 crc kubenswrapper[4792]: I0309 09:31:55.522226 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2acce3b0-ccfb-48f6-af71-ecaa5b820874-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-7jfkr\" (UID: \"2acce3b0-ccfb-48f6-af71-ecaa5b820874\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-7jfkr" Mar 09 09:31:55 crc kubenswrapper[4792]: I0309 09:31:55.528147 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2acce3b0-ccfb-48f6-af71-ecaa5b820874-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-7jfkr\" (UID: \"2acce3b0-ccfb-48f6-af71-ecaa5b820874\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-7jfkr" Mar 09 09:31:55 crc kubenswrapper[4792]: I0309 09:31:55.528846 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2acce3b0-ccfb-48f6-af71-ecaa5b820874-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-7jfkr\" (UID: \"2acce3b0-ccfb-48f6-af71-ecaa5b820874\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-7jfkr" Mar 09 09:31:55 crc kubenswrapper[4792]: I0309 09:31:55.539303 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2acce3b0-ccfb-48f6-af71-ecaa5b820874-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-7jfkr\" (UID: \"2acce3b0-ccfb-48f6-af71-ecaa5b820874\") " 
pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-7jfkr" Mar 09 09:31:55 crc kubenswrapper[4792]: I0309 09:31:55.541634 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7962k\" (UniqueName: \"kubernetes.io/projected/2acce3b0-ccfb-48f6-af71-ecaa5b820874-kube-api-access-7962k\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-7jfkr\" (UID: \"2acce3b0-ccfb-48f6-af71-ecaa5b820874\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-7jfkr" Mar 09 09:31:55 crc kubenswrapper[4792]: I0309 09:31:55.696335 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-7jfkr" Mar 09 09:31:56 crc kubenswrapper[4792]: I0309 09:31:56.099108 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Mar 09 09:31:56 crc kubenswrapper[4792]: I0309 09:31:56.336046 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-7jfkr"] Mar 09 09:31:56 crc kubenswrapper[4792]: W0309 09:31:56.346427 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2acce3b0_ccfb_48f6_af71_ecaa5b820874.slice/crio-9c3a5a2e0857c9a8941dd2d2b3310abc6c30930595487c2f4898367259decb2f WatchSource:0}: Error finding container 9c3a5a2e0857c9a8941dd2d2b3310abc6c30930595487c2f4898367259decb2f: Status 404 returned error can't find the container with id 9c3a5a2e0857c9a8941dd2d2b3310abc6c30930595487c2f4898367259decb2f Mar 09 09:31:56 crc kubenswrapper[4792]: I0309 09:31:56.353079 4792 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 09 09:31:57 crc kubenswrapper[4792]: I0309 09:31:57.227289 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Mar 09 09:31:57 crc kubenswrapper[4792]: 
I0309 09:31:57.291748 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-7jfkr" event={"ID":"2acce3b0-ccfb-48f6-af71-ecaa5b820874","Type":"ContainerStarted","Data":"0d2ba3a4a18a680ca58f3850cfa2b34550efc1260f9d446afb07dca113766b9b"} Mar 09 09:31:57 crc kubenswrapper[4792]: I0309 09:31:57.291797 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-7jfkr" event={"ID":"2acce3b0-ccfb-48f6-af71-ecaa5b820874","Type":"ContainerStarted","Data":"9c3a5a2e0857c9a8941dd2d2b3310abc6c30930595487c2f4898367259decb2f"} Mar 09 09:31:57 crc kubenswrapper[4792]: I0309 09:31:57.333230 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-7jfkr" podStartSLOduration=1.765478407 podStartE2EDuration="2.33321176s" podCreationTimestamp="2026-03-09 09:31:55 +0000 UTC" firstStartedPulling="2026-03-09 09:31:56.352854075 +0000 UTC m=+1481.383054827" lastFinishedPulling="2026-03-09 09:31:56.920587428 +0000 UTC m=+1481.950788180" observedRunningTime="2026-03-09 09:31:57.325964214 +0000 UTC m=+1482.356164966" watchObservedRunningTime="2026-03-09 09:31:57.33321176 +0000 UTC m=+1482.363412512" Mar 09 09:32:00 crc kubenswrapper[4792]: I0309 09:32:00.131966 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550812-9npxv"] Mar 09 09:32:00 crc kubenswrapper[4792]: I0309 09:32:00.133476 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550812-9npxv" Mar 09 09:32:00 crc kubenswrapper[4792]: I0309 09:32:00.136297 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 09:32:00 crc kubenswrapper[4792]: I0309 09:32:00.136852 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fwclj" Mar 09 09:32:00 crc kubenswrapper[4792]: I0309 09:32:00.138390 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 09:32:00 crc kubenswrapper[4792]: I0309 09:32:00.155896 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550812-9npxv"] Mar 09 09:32:00 crc kubenswrapper[4792]: I0309 09:32:00.220021 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-phxlc\" (UniqueName: \"kubernetes.io/projected/38252055-92e3-4cf5-8f1a-78ad77a78f62-kube-api-access-phxlc\") pod \"auto-csr-approver-29550812-9npxv\" (UID: \"38252055-92e3-4cf5-8f1a-78ad77a78f62\") " pod="openshift-infra/auto-csr-approver-29550812-9npxv" Mar 09 09:32:00 crc kubenswrapper[4792]: I0309 09:32:00.322054 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-phxlc\" (UniqueName: \"kubernetes.io/projected/38252055-92e3-4cf5-8f1a-78ad77a78f62-kube-api-access-phxlc\") pod \"auto-csr-approver-29550812-9npxv\" (UID: \"38252055-92e3-4cf5-8f1a-78ad77a78f62\") " pod="openshift-infra/auto-csr-approver-29550812-9npxv" Mar 09 09:32:00 crc kubenswrapper[4792]: I0309 09:32:00.346826 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-phxlc\" (UniqueName: \"kubernetes.io/projected/38252055-92e3-4cf5-8f1a-78ad77a78f62-kube-api-access-phxlc\") pod \"auto-csr-approver-29550812-9npxv\" (UID: \"38252055-92e3-4cf5-8f1a-78ad77a78f62\") " 
pod="openshift-infra/auto-csr-approver-29550812-9npxv" Mar 09 09:32:00 crc kubenswrapper[4792]: I0309 09:32:00.456375 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550812-9npxv" Mar 09 09:32:00 crc kubenswrapper[4792]: I0309 09:32:00.901467 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550812-9npxv"] Mar 09 09:32:01 crc kubenswrapper[4792]: I0309 09:32:01.323841 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550812-9npxv" event={"ID":"38252055-92e3-4cf5-8f1a-78ad77a78f62","Type":"ContainerStarted","Data":"0c0386728b2632a92842ce41ddfa0338cb086a4cb7f8206cb666769e0e342554"} Mar 09 09:32:02 crc kubenswrapper[4792]: I0309 09:32:02.337780 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550812-9npxv" event={"ID":"38252055-92e3-4cf5-8f1a-78ad77a78f62","Type":"ContainerStarted","Data":"4a87b2d0dd2216cb73b6d522b398544e2869d69b53f4b248198f871f2cfdfd8e"} Mar 09 09:32:02 crc kubenswrapper[4792]: I0309 09:32:02.363903 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29550812-9npxv" podStartSLOduration=1.466029148 podStartE2EDuration="2.363886579s" podCreationTimestamp="2026-03-09 09:32:00 +0000 UTC" firstStartedPulling="2026-03-09 09:32:00.912786585 +0000 UTC m=+1485.942987337" lastFinishedPulling="2026-03-09 09:32:01.810644016 +0000 UTC m=+1486.840844768" observedRunningTime="2026-03-09 09:32:02.35381217 +0000 UTC m=+1487.384012932" watchObservedRunningTime="2026-03-09 09:32:02.363886579 +0000 UTC m=+1487.394087331" Mar 09 09:32:05 crc kubenswrapper[4792]: I0309 09:32:05.364035 4792 generic.go:334] "Generic (PLEG): container finished" podID="38252055-92e3-4cf5-8f1a-78ad77a78f62" containerID="4a87b2d0dd2216cb73b6d522b398544e2869d69b53f4b248198f871f2cfdfd8e" exitCode=0 Mar 09 09:32:05 crc 
kubenswrapper[4792]: I0309 09:32:05.364169 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550812-9npxv" event={"ID":"38252055-92e3-4cf5-8f1a-78ad77a78f62","Type":"ContainerDied","Data":"4a87b2d0dd2216cb73b6d522b398544e2869d69b53f4b248198f871f2cfdfd8e"} Mar 09 09:32:07 crc kubenswrapper[4792]: I0309 09:32:07.144003 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550812-9npxv" Mar 09 09:32:07 crc kubenswrapper[4792]: I0309 09:32:07.308361 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-phxlc\" (UniqueName: \"kubernetes.io/projected/38252055-92e3-4cf5-8f1a-78ad77a78f62-kube-api-access-phxlc\") pod \"38252055-92e3-4cf5-8f1a-78ad77a78f62\" (UID: \"38252055-92e3-4cf5-8f1a-78ad77a78f62\") " Mar 09 09:32:07 crc kubenswrapper[4792]: I0309 09:32:07.319891 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38252055-92e3-4cf5-8f1a-78ad77a78f62-kube-api-access-phxlc" (OuterVolumeSpecName: "kube-api-access-phxlc") pod "38252055-92e3-4cf5-8f1a-78ad77a78f62" (UID: "38252055-92e3-4cf5-8f1a-78ad77a78f62"). InnerVolumeSpecName "kube-api-access-phxlc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:32:07 crc kubenswrapper[4792]: I0309 09:32:07.384195 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550812-9npxv" event={"ID":"38252055-92e3-4cf5-8f1a-78ad77a78f62","Type":"ContainerDied","Data":"0c0386728b2632a92842ce41ddfa0338cb086a4cb7f8206cb666769e0e342554"} Mar 09 09:32:07 crc kubenswrapper[4792]: I0309 09:32:07.384233 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0c0386728b2632a92842ce41ddfa0338cb086a4cb7f8206cb666769e0e342554" Mar 09 09:32:07 crc kubenswrapper[4792]: I0309 09:32:07.384291 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550812-9npxv" Mar 09 09:32:07 crc kubenswrapper[4792]: I0309 09:32:07.410416 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-phxlc\" (UniqueName: \"kubernetes.io/projected/38252055-92e3-4cf5-8f1a-78ad77a78f62-kube-api-access-phxlc\") on node \"crc\" DevicePath \"\"" Mar 09 09:32:07 crc kubenswrapper[4792]: I0309 09:32:07.932429 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550806-qqlhx"] Mar 09 09:32:07 crc kubenswrapper[4792]: I0309 09:32:07.940549 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550806-qqlhx"] Mar 09 09:32:09 crc kubenswrapper[4792]: I0309 09:32:09.672035 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9dcbe3d4-bd2e-4a4b-864a-77b49c015354" path="/var/lib/kubelet/pods/9dcbe3d4-bd2e-4a4b-864a-77b49c015354/volumes" Mar 09 09:32:36 crc kubenswrapper[4792]: I0309 09:32:36.392668 4792 scope.go:117] "RemoveContainer" containerID="e96c99b444d6d824d7e4089cdf64b274e71d120f365bbbd16da758c9854e606d" Mar 09 09:32:36 crc kubenswrapper[4792]: I0309 09:32:36.422432 4792 scope.go:117] "RemoveContainer" containerID="1e30ee26135f2ea4c44b7eccafc3705bb1758ba9a51c12a37263b7f72efd60c8" Mar 09 09:32:36 crc kubenswrapper[4792]: I0309 09:32:36.475210 4792 scope.go:117] "RemoveContainer" containerID="62d10d444eeac48b6552bf53278c005981de8c2ea7b744b584cf6191830ab056" Mar 09 09:32:39 crc kubenswrapper[4792]: I0309 09:32:39.193431 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-vqjd2"] Mar 09 09:32:39 crc kubenswrapper[4792]: E0309 09:32:39.194208 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38252055-92e3-4cf5-8f1a-78ad77a78f62" containerName="oc" Mar 09 09:32:39 crc kubenswrapper[4792]: I0309 09:32:39.194225 4792 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="38252055-92e3-4cf5-8f1a-78ad77a78f62" containerName="oc" Mar 09 09:32:39 crc kubenswrapper[4792]: I0309 09:32:39.194411 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="38252055-92e3-4cf5-8f1a-78ad77a78f62" containerName="oc" Mar 09 09:32:39 crc kubenswrapper[4792]: I0309 09:32:39.196089 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vqjd2" Mar 09 09:32:39 crc kubenswrapper[4792]: I0309 09:32:39.211346 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vqjd2"] Mar 09 09:32:39 crc kubenswrapper[4792]: I0309 09:32:39.243321 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fctbk\" (UniqueName: \"kubernetes.io/projected/2ae84b32-8a77-45af-a24b-fdc6753b2d0c-kube-api-access-fctbk\") pod \"certified-operators-vqjd2\" (UID: \"2ae84b32-8a77-45af-a24b-fdc6753b2d0c\") " pod="openshift-marketplace/certified-operators-vqjd2" Mar 09 09:32:39 crc kubenswrapper[4792]: I0309 09:32:39.243444 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ae84b32-8a77-45af-a24b-fdc6753b2d0c-utilities\") pod \"certified-operators-vqjd2\" (UID: \"2ae84b32-8a77-45af-a24b-fdc6753b2d0c\") " pod="openshift-marketplace/certified-operators-vqjd2" Mar 09 09:32:39 crc kubenswrapper[4792]: I0309 09:32:39.243486 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ae84b32-8a77-45af-a24b-fdc6753b2d0c-catalog-content\") pod \"certified-operators-vqjd2\" (UID: \"2ae84b32-8a77-45af-a24b-fdc6753b2d0c\") " pod="openshift-marketplace/certified-operators-vqjd2" Mar 09 09:32:39 crc kubenswrapper[4792]: I0309 09:32:39.345537 4792 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ae84b32-8a77-45af-a24b-fdc6753b2d0c-utilities\") pod \"certified-operators-vqjd2\" (UID: \"2ae84b32-8a77-45af-a24b-fdc6753b2d0c\") " pod="openshift-marketplace/certified-operators-vqjd2" Mar 09 09:32:39 crc kubenswrapper[4792]: I0309 09:32:39.345615 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ae84b32-8a77-45af-a24b-fdc6753b2d0c-catalog-content\") pod \"certified-operators-vqjd2\" (UID: \"2ae84b32-8a77-45af-a24b-fdc6753b2d0c\") " pod="openshift-marketplace/certified-operators-vqjd2" Mar 09 09:32:39 crc kubenswrapper[4792]: I0309 09:32:39.345646 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fctbk\" (UniqueName: \"kubernetes.io/projected/2ae84b32-8a77-45af-a24b-fdc6753b2d0c-kube-api-access-fctbk\") pod \"certified-operators-vqjd2\" (UID: \"2ae84b32-8a77-45af-a24b-fdc6753b2d0c\") " pod="openshift-marketplace/certified-operators-vqjd2" Mar 09 09:32:39 crc kubenswrapper[4792]: I0309 09:32:39.346449 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ae84b32-8a77-45af-a24b-fdc6753b2d0c-utilities\") pod \"certified-operators-vqjd2\" (UID: \"2ae84b32-8a77-45af-a24b-fdc6753b2d0c\") " pod="openshift-marketplace/certified-operators-vqjd2" Mar 09 09:32:39 crc kubenswrapper[4792]: I0309 09:32:39.346519 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ae84b32-8a77-45af-a24b-fdc6753b2d0c-catalog-content\") pod \"certified-operators-vqjd2\" (UID: \"2ae84b32-8a77-45af-a24b-fdc6753b2d0c\") " pod="openshift-marketplace/certified-operators-vqjd2" Mar 09 09:32:39 crc kubenswrapper[4792]: I0309 09:32:39.366631 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fctbk\" 
(UniqueName: \"kubernetes.io/projected/2ae84b32-8a77-45af-a24b-fdc6753b2d0c-kube-api-access-fctbk\") pod \"certified-operators-vqjd2\" (UID: \"2ae84b32-8a77-45af-a24b-fdc6753b2d0c\") " pod="openshift-marketplace/certified-operators-vqjd2" Mar 09 09:32:39 crc kubenswrapper[4792]: I0309 09:32:39.513984 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vqjd2" Mar 09 09:32:40 crc kubenswrapper[4792]: I0309 09:32:40.035055 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vqjd2"] Mar 09 09:32:40 crc kubenswrapper[4792]: I0309 09:32:40.664951 4792 generic.go:334] "Generic (PLEG): container finished" podID="2ae84b32-8a77-45af-a24b-fdc6753b2d0c" containerID="00f5108a0e559419812533ea641ea30a1386080cf275437fc164e4421baee5e3" exitCode=0 Mar 09 09:32:40 crc kubenswrapper[4792]: I0309 09:32:40.665442 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vqjd2" event={"ID":"2ae84b32-8a77-45af-a24b-fdc6753b2d0c","Type":"ContainerDied","Data":"00f5108a0e559419812533ea641ea30a1386080cf275437fc164e4421baee5e3"} Mar 09 09:32:40 crc kubenswrapper[4792]: I0309 09:32:40.665474 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vqjd2" event={"ID":"2ae84b32-8a77-45af-a24b-fdc6753b2d0c","Type":"ContainerStarted","Data":"62d559b7fc9b0ff2fe4e387b9ff1e224a93f6135eb7aabf6498448cc3e952087"} Mar 09 09:32:41 crc kubenswrapper[4792]: I0309 09:32:41.679237 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vqjd2" event={"ID":"2ae84b32-8a77-45af-a24b-fdc6753b2d0c","Type":"ContainerStarted","Data":"f8e637349f4bcf9a6155ec93526ae6f11e4e5d33c369e3b047179623e7a0bb34"} Mar 09 09:32:43 crc kubenswrapper[4792]: I0309 09:32:43.213757 4792 patch_prober.go:28] interesting pod/machine-config-daemon-97tth container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 09:32:43 crc kubenswrapper[4792]: I0309 09:32:43.213810 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 09:32:45 crc kubenswrapper[4792]: I0309 09:32:45.716158 4792 generic.go:334] "Generic (PLEG): container finished" podID="2ae84b32-8a77-45af-a24b-fdc6753b2d0c" containerID="f8e637349f4bcf9a6155ec93526ae6f11e4e5d33c369e3b047179623e7a0bb34" exitCode=0 Mar 09 09:32:45 crc kubenswrapper[4792]: I0309 09:32:45.716211 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vqjd2" event={"ID":"2ae84b32-8a77-45af-a24b-fdc6753b2d0c","Type":"ContainerDied","Data":"f8e637349f4bcf9a6155ec93526ae6f11e4e5d33c369e3b047179623e7a0bb34"} Mar 09 09:32:46 crc kubenswrapper[4792]: I0309 09:32:46.725812 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vqjd2" event={"ID":"2ae84b32-8a77-45af-a24b-fdc6753b2d0c","Type":"ContainerStarted","Data":"f222caf0b8697f8832f3d002f9bd44772d323ece4323a89128f5a263c3564883"} Mar 09 09:32:46 crc kubenswrapper[4792]: I0309 09:32:46.753761 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-vqjd2" podStartSLOduration=2.316273301 podStartE2EDuration="7.753743167s" podCreationTimestamp="2026-03-09 09:32:39 +0000 UTC" firstStartedPulling="2026-03-09 09:32:40.666986631 +0000 UTC m=+1525.697187383" lastFinishedPulling="2026-03-09 09:32:46.104456507 +0000 UTC m=+1531.134657249" observedRunningTime="2026-03-09 
09:32:46.745812181 +0000 UTC m=+1531.776012943" watchObservedRunningTime="2026-03-09 09:32:46.753743167 +0000 UTC m=+1531.783943919" Mar 09 09:32:49 crc kubenswrapper[4792]: I0309 09:32:49.514398 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-vqjd2" Mar 09 09:32:49 crc kubenswrapper[4792]: I0309 09:32:49.514570 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-vqjd2" Mar 09 09:32:49 crc kubenswrapper[4792]: I0309 09:32:49.557635 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-vqjd2" Mar 09 09:32:57 crc kubenswrapper[4792]: I0309 09:32:57.842872 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-zpz8v"] Mar 09 09:32:57 crc kubenswrapper[4792]: I0309 09:32:57.844923 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zpz8v" Mar 09 09:32:57 crc kubenswrapper[4792]: I0309 09:32:57.866531 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-zpz8v"] Mar 09 09:32:57 crc kubenswrapper[4792]: I0309 09:32:57.922183 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/616b854a-0dca-4d1a-986e-2f40faf6b0ce-utilities\") pod \"redhat-marketplace-zpz8v\" (UID: \"616b854a-0dca-4d1a-986e-2f40faf6b0ce\") " pod="openshift-marketplace/redhat-marketplace-zpz8v" Mar 09 09:32:57 crc kubenswrapper[4792]: I0309 09:32:57.922265 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s9t5l\" (UniqueName: \"kubernetes.io/projected/616b854a-0dca-4d1a-986e-2f40faf6b0ce-kube-api-access-s9t5l\") pod \"redhat-marketplace-zpz8v\" (UID: 
\"616b854a-0dca-4d1a-986e-2f40faf6b0ce\") " pod="openshift-marketplace/redhat-marketplace-zpz8v" Mar 09 09:32:57 crc kubenswrapper[4792]: I0309 09:32:57.922369 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/616b854a-0dca-4d1a-986e-2f40faf6b0ce-catalog-content\") pod \"redhat-marketplace-zpz8v\" (UID: \"616b854a-0dca-4d1a-986e-2f40faf6b0ce\") " pod="openshift-marketplace/redhat-marketplace-zpz8v" Mar 09 09:32:58 crc kubenswrapper[4792]: I0309 09:32:58.023916 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/616b854a-0dca-4d1a-986e-2f40faf6b0ce-catalog-content\") pod \"redhat-marketplace-zpz8v\" (UID: \"616b854a-0dca-4d1a-986e-2f40faf6b0ce\") " pod="openshift-marketplace/redhat-marketplace-zpz8v" Mar 09 09:32:58 crc kubenswrapper[4792]: I0309 09:32:58.024031 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/616b854a-0dca-4d1a-986e-2f40faf6b0ce-utilities\") pod \"redhat-marketplace-zpz8v\" (UID: \"616b854a-0dca-4d1a-986e-2f40faf6b0ce\") " pod="openshift-marketplace/redhat-marketplace-zpz8v" Mar 09 09:32:58 crc kubenswrapper[4792]: I0309 09:32:58.024165 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s9t5l\" (UniqueName: \"kubernetes.io/projected/616b854a-0dca-4d1a-986e-2f40faf6b0ce-kube-api-access-s9t5l\") pod \"redhat-marketplace-zpz8v\" (UID: \"616b854a-0dca-4d1a-986e-2f40faf6b0ce\") " pod="openshift-marketplace/redhat-marketplace-zpz8v" Mar 09 09:32:58 crc kubenswrapper[4792]: I0309 09:32:58.024549 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/616b854a-0dca-4d1a-986e-2f40faf6b0ce-utilities\") pod \"redhat-marketplace-zpz8v\" (UID: 
\"616b854a-0dca-4d1a-986e-2f40faf6b0ce\") " pod="openshift-marketplace/redhat-marketplace-zpz8v" Mar 09 09:32:58 crc kubenswrapper[4792]: I0309 09:32:58.024571 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/616b854a-0dca-4d1a-986e-2f40faf6b0ce-catalog-content\") pod \"redhat-marketplace-zpz8v\" (UID: \"616b854a-0dca-4d1a-986e-2f40faf6b0ce\") " pod="openshift-marketplace/redhat-marketplace-zpz8v" Mar 09 09:32:58 crc kubenswrapper[4792]: I0309 09:32:58.057083 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s9t5l\" (UniqueName: \"kubernetes.io/projected/616b854a-0dca-4d1a-986e-2f40faf6b0ce-kube-api-access-s9t5l\") pod \"redhat-marketplace-zpz8v\" (UID: \"616b854a-0dca-4d1a-986e-2f40faf6b0ce\") " pod="openshift-marketplace/redhat-marketplace-zpz8v" Mar 09 09:32:58 crc kubenswrapper[4792]: I0309 09:32:58.167685 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zpz8v" Mar 09 09:32:58 crc kubenswrapper[4792]: I0309 09:32:58.687738 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-zpz8v"] Mar 09 09:32:58 crc kubenswrapper[4792]: W0309 09:32:58.692823 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod616b854a_0dca_4d1a_986e_2f40faf6b0ce.slice/crio-9e86e4048d6f2f816ec6fc7f8568b571cc577b5ee718f4d4ac1767bfcbcf67a2 WatchSource:0}: Error finding container 9e86e4048d6f2f816ec6fc7f8568b571cc577b5ee718f4d4ac1767bfcbcf67a2: Status 404 returned error can't find the container with id 9e86e4048d6f2f816ec6fc7f8568b571cc577b5ee718f4d4ac1767bfcbcf67a2 Mar 09 09:32:58 crc kubenswrapper[4792]: I0309 09:32:58.850574 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zpz8v" 
event={"ID":"616b854a-0dca-4d1a-986e-2f40faf6b0ce","Type":"ContainerStarted","Data":"9e86e4048d6f2f816ec6fc7f8568b571cc577b5ee718f4d4ac1767bfcbcf67a2"} Mar 09 09:32:59 crc kubenswrapper[4792]: I0309 09:32:59.558736 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-vqjd2" Mar 09 09:32:59 crc kubenswrapper[4792]: I0309 09:32:59.859410 4792 generic.go:334] "Generic (PLEG): container finished" podID="616b854a-0dca-4d1a-986e-2f40faf6b0ce" containerID="27877bf111b511b01e3451456b4ea8da6f4dde7dee40b09e1278ac44e8044abc" exitCode=0 Mar 09 09:32:59 crc kubenswrapper[4792]: I0309 09:32:59.859456 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zpz8v" event={"ID":"616b854a-0dca-4d1a-986e-2f40faf6b0ce","Type":"ContainerDied","Data":"27877bf111b511b01e3451456b4ea8da6f4dde7dee40b09e1278ac44e8044abc"} Mar 09 09:33:00 crc kubenswrapper[4792]: I0309 09:33:00.870031 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zpz8v" event={"ID":"616b854a-0dca-4d1a-986e-2f40faf6b0ce","Type":"ContainerStarted","Data":"d589fbd7012553ec80366de6e5fd6f8236b61969417965240a17c1585d2b721d"} Mar 09 09:33:01 crc kubenswrapper[4792]: I0309 09:33:01.817212 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vqjd2"] Mar 09 09:33:01 crc kubenswrapper[4792]: I0309 09:33:01.817826 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-vqjd2" podUID="2ae84b32-8a77-45af-a24b-fdc6753b2d0c" containerName="registry-server" containerID="cri-o://f222caf0b8697f8832f3d002f9bd44772d323ece4323a89128f5a263c3564883" gracePeriod=2 Mar 09 09:33:02 crc kubenswrapper[4792]: I0309 09:33:02.289048 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-vqjd2" Mar 09 09:33:02 crc kubenswrapper[4792]: I0309 09:33:02.413620 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ae84b32-8a77-45af-a24b-fdc6753b2d0c-utilities\") pod \"2ae84b32-8a77-45af-a24b-fdc6753b2d0c\" (UID: \"2ae84b32-8a77-45af-a24b-fdc6753b2d0c\") " Mar 09 09:33:02 crc kubenswrapper[4792]: I0309 09:33:02.413848 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fctbk\" (UniqueName: \"kubernetes.io/projected/2ae84b32-8a77-45af-a24b-fdc6753b2d0c-kube-api-access-fctbk\") pod \"2ae84b32-8a77-45af-a24b-fdc6753b2d0c\" (UID: \"2ae84b32-8a77-45af-a24b-fdc6753b2d0c\") " Mar 09 09:33:02 crc kubenswrapper[4792]: I0309 09:33:02.413982 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ae84b32-8a77-45af-a24b-fdc6753b2d0c-catalog-content\") pod \"2ae84b32-8a77-45af-a24b-fdc6753b2d0c\" (UID: \"2ae84b32-8a77-45af-a24b-fdc6753b2d0c\") " Mar 09 09:33:02 crc kubenswrapper[4792]: I0309 09:33:02.414502 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2ae84b32-8a77-45af-a24b-fdc6753b2d0c-utilities" (OuterVolumeSpecName: "utilities") pod "2ae84b32-8a77-45af-a24b-fdc6753b2d0c" (UID: "2ae84b32-8a77-45af-a24b-fdc6753b2d0c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:33:02 crc kubenswrapper[4792]: I0309 09:33:02.424461 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ae84b32-8a77-45af-a24b-fdc6753b2d0c-kube-api-access-fctbk" (OuterVolumeSpecName: "kube-api-access-fctbk") pod "2ae84b32-8a77-45af-a24b-fdc6753b2d0c" (UID: "2ae84b32-8a77-45af-a24b-fdc6753b2d0c"). InnerVolumeSpecName "kube-api-access-fctbk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:33:02 crc kubenswrapper[4792]: I0309 09:33:02.483577 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2ae84b32-8a77-45af-a24b-fdc6753b2d0c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2ae84b32-8a77-45af-a24b-fdc6753b2d0c" (UID: "2ae84b32-8a77-45af-a24b-fdc6753b2d0c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:33:02 crc kubenswrapper[4792]: I0309 09:33:02.516727 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fctbk\" (UniqueName: \"kubernetes.io/projected/2ae84b32-8a77-45af-a24b-fdc6753b2d0c-kube-api-access-fctbk\") on node \"crc\" DevicePath \"\"" Mar 09 09:33:02 crc kubenswrapper[4792]: I0309 09:33:02.516774 4792 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ae84b32-8a77-45af-a24b-fdc6753b2d0c-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 09:33:02 crc kubenswrapper[4792]: I0309 09:33:02.516784 4792 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ae84b32-8a77-45af-a24b-fdc6753b2d0c-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 09:33:02 crc kubenswrapper[4792]: I0309 09:33:02.893410 4792 generic.go:334] "Generic (PLEG): container finished" podID="616b854a-0dca-4d1a-986e-2f40faf6b0ce" containerID="d589fbd7012553ec80366de6e5fd6f8236b61969417965240a17c1585d2b721d" exitCode=0 Mar 09 09:33:02 crc kubenswrapper[4792]: I0309 09:33:02.893443 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zpz8v" event={"ID":"616b854a-0dca-4d1a-986e-2f40faf6b0ce","Type":"ContainerDied","Data":"d589fbd7012553ec80366de6e5fd6f8236b61969417965240a17c1585d2b721d"} Mar 09 09:33:02 crc kubenswrapper[4792]: I0309 09:33:02.896902 4792 generic.go:334] "Generic (PLEG): container 
finished" podID="2ae84b32-8a77-45af-a24b-fdc6753b2d0c" containerID="f222caf0b8697f8832f3d002f9bd44772d323ece4323a89128f5a263c3564883" exitCode=0 Mar 09 09:33:02 crc kubenswrapper[4792]: I0309 09:33:02.896932 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vqjd2" event={"ID":"2ae84b32-8a77-45af-a24b-fdc6753b2d0c","Type":"ContainerDied","Data":"f222caf0b8697f8832f3d002f9bd44772d323ece4323a89128f5a263c3564883"} Mar 09 09:33:02 crc kubenswrapper[4792]: I0309 09:33:02.896953 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vqjd2" event={"ID":"2ae84b32-8a77-45af-a24b-fdc6753b2d0c","Type":"ContainerDied","Data":"62d559b7fc9b0ff2fe4e387b9ff1e224a93f6135eb7aabf6498448cc3e952087"} Mar 09 09:33:02 crc kubenswrapper[4792]: I0309 09:33:02.896972 4792 scope.go:117] "RemoveContainer" containerID="f222caf0b8697f8832f3d002f9bd44772d323ece4323a89128f5a263c3564883" Mar 09 09:33:02 crc kubenswrapper[4792]: I0309 09:33:02.897094 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-vqjd2" Mar 09 09:33:02 crc kubenswrapper[4792]: I0309 09:33:02.935908 4792 scope.go:117] "RemoveContainer" containerID="f8e637349f4bcf9a6155ec93526ae6f11e4e5d33c369e3b047179623e7a0bb34" Mar 09 09:33:02 crc kubenswrapper[4792]: I0309 09:33:02.964662 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vqjd2"] Mar 09 09:33:02 crc kubenswrapper[4792]: I0309 09:33:02.977691 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-vqjd2"] Mar 09 09:33:02 crc kubenswrapper[4792]: I0309 09:33:02.978304 4792 scope.go:117] "RemoveContainer" containerID="00f5108a0e559419812533ea641ea30a1386080cf275437fc164e4421baee5e3" Mar 09 09:33:03 crc kubenswrapper[4792]: I0309 09:33:03.020462 4792 scope.go:117] "RemoveContainer" containerID="f222caf0b8697f8832f3d002f9bd44772d323ece4323a89128f5a263c3564883" Mar 09 09:33:03 crc kubenswrapper[4792]: E0309 09:33:03.021527 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f222caf0b8697f8832f3d002f9bd44772d323ece4323a89128f5a263c3564883\": container with ID starting with f222caf0b8697f8832f3d002f9bd44772d323ece4323a89128f5a263c3564883 not found: ID does not exist" containerID="f222caf0b8697f8832f3d002f9bd44772d323ece4323a89128f5a263c3564883" Mar 09 09:33:03 crc kubenswrapper[4792]: I0309 09:33:03.021774 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f222caf0b8697f8832f3d002f9bd44772d323ece4323a89128f5a263c3564883"} err="failed to get container status \"f222caf0b8697f8832f3d002f9bd44772d323ece4323a89128f5a263c3564883\": rpc error: code = NotFound desc = could not find container \"f222caf0b8697f8832f3d002f9bd44772d323ece4323a89128f5a263c3564883\": container with ID starting with f222caf0b8697f8832f3d002f9bd44772d323ece4323a89128f5a263c3564883 not 
found: ID does not exist" Mar 09 09:33:03 crc kubenswrapper[4792]: I0309 09:33:03.021874 4792 scope.go:117] "RemoveContainer" containerID="f8e637349f4bcf9a6155ec93526ae6f11e4e5d33c369e3b047179623e7a0bb34" Mar 09 09:33:03 crc kubenswrapper[4792]: E0309 09:33:03.023240 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f8e637349f4bcf9a6155ec93526ae6f11e4e5d33c369e3b047179623e7a0bb34\": container with ID starting with f8e637349f4bcf9a6155ec93526ae6f11e4e5d33c369e3b047179623e7a0bb34 not found: ID does not exist" containerID="f8e637349f4bcf9a6155ec93526ae6f11e4e5d33c369e3b047179623e7a0bb34" Mar 09 09:33:03 crc kubenswrapper[4792]: I0309 09:33:03.023290 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f8e637349f4bcf9a6155ec93526ae6f11e4e5d33c369e3b047179623e7a0bb34"} err="failed to get container status \"f8e637349f4bcf9a6155ec93526ae6f11e4e5d33c369e3b047179623e7a0bb34\": rpc error: code = NotFound desc = could not find container \"f8e637349f4bcf9a6155ec93526ae6f11e4e5d33c369e3b047179623e7a0bb34\": container with ID starting with f8e637349f4bcf9a6155ec93526ae6f11e4e5d33c369e3b047179623e7a0bb34 not found: ID does not exist" Mar 09 09:33:03 crc kubenswrapper[4792]: I0309 09:33:03.023320 4792 scope.go:117] "RemoveContainer" containerID="00f5108a0e559419812533ea641ea30a1386080cf275437fc164e4421baee5e3" Mar 09 09:33:03 crc kubenswrapper[4792]: E0309 09:33:03.023860 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"00f5108a0e559419812533ea641ea30a1386080cf275437fc164e4421baee5e3\": container with ID starting with 00f5108a0e559419812533ea641ea30a1386080cf275437fc164e4421baee5e3 not found: ID does not exist" containerID="00f5108a0e559419812533ea641ea30a1386080cf275437fc164e4421baee5e3" Mar 09 09:33:03 crc kubenswrapper[4792]: I0309 09:33:03.023924 4792 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"00f5108a0e559419812533ea641ea30a1386080cf275437fc164e4421baee5e3"} err="failed to get container status \"00f5108a0e559419812533ea641ea30a1386080cf275437fc164e4421baee5e3\": rpc error: code = NotFound desc = could not find container \"00f5108a0e559419812533ea641ea30a1386080cf275437fc164e4421baee5e3\": container with ID starting with 00f5108a0e559419812533ea641ea30a1386080cf275437fc164e4421baee5e3 not found: ID does not exist" Mar 09 09:33:03 crc kubenswrapper[4792]: E0309 09:33:03.140256 4792 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2ae84b32_8a77_45af_a24b_fdc6753b2d0c.slice/crio-62d559b7fc9b0ff2fe4e387b9ff1e224a93f6135eb7aabf6498448cc3e952087\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2ae84b32_8a77_45af_a24b_fdc6753b2d0c.slice\": RecentStats: unable to find data in memory cache]" Mar 09 09:33:03 crc kubenswrapper[4792]: I0309 09:33:03.676536 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2ae84b32-8a77-45af-a24b-fdc6753b2d0c" path="/var/lib/kubelet/pods/2ae84b32-8a77-45af-a24b-fdc6753b2d0c/volumes" Mar 09 09:33:03 crc kubenswrapper[4792]: I0309 09:33:03.906712 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zpz8v" event={"ID":"616b854a-0dca-4d1a-986e-2f40faf6b0ce","Type":"ContainerStarted","Data":"d5ff53d3cc4d28f571fce16581910157d7a83222fdb950d4fc5e05f00fe1c81b"} Mar 09 09:33:03 crc kubenswrapper[4792]: I0309 09:33:03.925956 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-zpz8v" podStartSLOduration=3.468430034 podStartE2EDuration="6.925941958s" podCreationTimestamp="2026-03-09 09:32:57 +0000 UTC" 
firstStartedPulling="2026-03-09 09:32:59.861651449 +0000 UTC m=+1544.891852201" lastFinishedPulling="2026-03-09 09:33:03.319163373 +0000 UTC m=+1548.349364125" observedRunningTime="2026-03-09 09:33:03.924236547 +0000 UTC m=+1548.954437329" watchObservedRunningTime="2026-03-09 09:33:03.925941958 +0000 UTC m=+1548.956142710" Mar 09 09:33:08 crc kubenswrapper[4792]: I0309 09:33:08.168540 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-zpz8v" Mar 09 09:33:08 crc kubenswrapper[4792]: I0309 09:33:08.169362 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-zpz8v" Mar 09 09:33:08 crc kubenswrapper[4792]: I0309 09:33:08.223272 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-zpz8v" Mar 09 09:33:09 crc kubenswrapper[4792]: I0309 09:33:09.015282 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-zpz8v" Mar 09 09:33:09 crc kubenswrapper[4792]: I0309 09:33:09.063202 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-zpz8v"] Mar 09 09:33:10 crc kubenswrapper[4792]: I0309 09:33:10.986600 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-zpz8v" podUID="616b854a-0dca-4d1a-986e-2f40faf6b0ce" containerName="registry-server" containerID="cri-o://d5ff53d3cc4d28f571fce16581910157d7a83222fdb950d4fc5e05f00fe1c81b" gracePeriod=2 Mar 09 09:33:11 crc kubenswrapper[4792]: I0309 09:33:11.493095 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zpz8v" Mar 09 09:33:11 crc kubenswrapper[4792]: I0309 09:33:11.585418 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/616b854a-0dca-4d1a-986e-2f40faf6b0ce-utilities\") pod \"616b854a-0dca-4d1a-986e-2f40faf6b0ce\" (UID: \"616b854a-0dca-4d1a-986e-2f40faf6b0ce\") " Mar 09 09:33:11 crc kubenswrapper[4792]: I0309 09:33:11.585842 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/616b854a-0dca-4d1a-986e-2f40faf6b0ce-catalog-content\") pod \"616b854a-0dca-4d1a-986e-2f40faf6b0ce\" (UID: \"616b854a-0dca-4d1a-986e-2f40faf6b0ce\") " Mar 09 09:33:11 crc kubenswrapper[4792]: I0309 09:33:11.585866 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s9t5l\" (UniqueName: \"kubernetes.io/projected/616b854a-0dca-4d1a-986e-2f40faf6b0ce-kube-api-access-s9t5l\") pod \"616b854a-0dca-4d1a-986e-2f40faf6b0ce\" (UID: \"616b854a-0dca-4d1a-986e-2f40faf6b0ce\") " Mar 09 09:33:11 crc kubenswrapper[4792]: I0309 09:33:11.586629 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/616b854a-0dca-4d1a-986e-2f40faf6b0ce-utilities" (OuterVolumeSpecName: "utilities") pod "616b854a-0dca-4d1a-986e-2f40faf6b0ce" (UID: "616b854a-0dca-4d1a-986e-2f40faf6b0ce"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:33:11 crc kubenswrapper[4792]: I0309 09:33:11.592715 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/616b854a-0dca-4d1a-986e-2f40faf6b0ce-kube-api-access-s9t5l" (OuterVolumeSpecName: "kube-api-access-s9t5l") pod "616b854a-0dca-4d1a-986e-2f40faf6b0ce" (UID: "616b854a-0dca-4d1a-986e-2f40faf6b0ce"). InnerVolumeSpecName "kube-api-access-s9t5l". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:33:11 crc kubenswrapper[4792]: I0309 09:33:11.619260 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/616b854a-0dca-4d1a-986e-2f40faf6b0ce-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "616b854a-0dca-4d1a-986e-2f40faf6b0ce" (UID: "616b854a-0dca-4d1a-986e-2f40faf6b0ce"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:33:11 crc kubenswrapper[4792]: I0309 09:33:11.688011 4792 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/616b854a-0dca-4d1a-986e-2f40faf6b0ce-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 09:33:11 crc kubenswrapper[4792]: I0309 09:33:11.688048 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s9t5l\" (UniqueName: \"kubernetes.io/projected/616b854a-0dca-4d1a-986e-2f40faf6b0ce-kube-api-access-s9t5l\") on node \"crc\" DevicePath \"\"" Mar 09 09:33:11 crc kubenswrapper[4792]: I0309 09:33:11.688061 4792 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/616b854a-0dca-4d1a-986e-2f40faf6b0ce-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 09:33:11 crc kubenswrapper[4792]: I0309 09:33:11.997581 4792 generic.go:334] "Generic (PLEG): container finished" podID="616b854a-0dca-4d1a-986e-2f40faf6b0ce" containerID="d5ff53d3cc4d28f571fce16581910157d7a83222fdb950d4fc5e05f00fe1c81b" exitCode=0 Mar 09 09:33:11 crc kubenswrapper[4792]: I0309 09:33:11.997629 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zpz8v" event={"ID":"616b854a-0dca-4d1a-986e-2f40faf6b0ce","Type":"ContainerDied","Data":"d5ff53d3cc4d28f571fce16581910157d7a83222fdb950d4fc5e05f00fe1c81b"} Mar 09 09:33:11 crc kubenswrapper[4792]: I0309 09:33:11.997657 4792 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-zpz8v" event={"ID":"616b854a-0dca-4d1a-986e-2f40faf6b0ce","Type":"ContainerDied","Data":"9e86e4048d6f2f816ec6fc7f8568b571cc577b5ee718f4d4ac1767bfcbcf67a2"} Mar 09 09:33:11 crc kubenswrapper[4792]: I0309 09:33:11.997675 4792 scope.go:117] "RemoveContainer" containerID="d5ff53d3cc4d28f571fce16581910157d7a83222fdb950d4fc5e05f00fe1c81b" Mar 09 09:33:11 crc kubenswrapper[4792]: I0309 09:33:11.997819 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zpz8v" Mar 09 09:33:12 crc kubenswrapper[4792]: I0309 09:33:12.028799 4792 scope.go:117] "RemoveContainer" containerID="d589fbd7012553ec80366de6e5fd6f8236b61969417965240a17c1585d2b721d" Mar 09 09:33:12 crc kubenswrapper[4792]: I0309 09:33:12.029856 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-zpz8v"] Mar 09 09:33:12 crc kubenswrapper[4792]: I0309 09:33:12.047642 4792 scope.go:117] "RemoveContainer" containerID="27877bf111b511b01e3451456b4ea8da6f4dde7dee40b09e1278ac44e8044abc" Mar 09 09:33:12 crc kubenswrapper[4792]: I0309 09:33:12.064773 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-zpz8v"] Mar 09 09:33:12 crc kubenswrapper[4792]: I0309 09:33:12.087974 4792 scope.go:117] "RemoveContainer" containerID="d5ff53d3cc4d28f571fce16581910157d7a83222fdb950d4fc5e05f00fe1c81b" Mar 09 09:33:12 crc kubenswrapper[4792]: E0309 09:33:12.088366 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d5ff53d3cc4d28f571fce16581910157d7a83222fdb950d4fc5e05f00fe1c81b\": container with ID starting with d5ff53d3cc4d28f571fce16581910157d7a83222fdb950d4fc5e05f00fe1c81b not found: ID does not exist" containerID="d5ff53d3cc4d28f571fce16581910157d7a83222fdb950d4fc5e05f00fe1c81b" Mar 09 09:33:12 crc kubenswrapper[4792]: I0309 09:33:12.088398 4792 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d5ff53d3cc4d28f571fce16581910157d7a83222fdb950d4fc5e05f00fe1c81b"} err="failed to get container status \"d5ff53d3cc4d28f571fce16581910157d7a83222fdb950d4fc5e05f00fe1c81b\": rpc error: code = NotFound desc = could not find container \"d5ff53d3cc4d28f571fce16581910157d7a83222fdb950d4fc5e05f00fe1c81b\": container with ID starting with d5ff53d3cc4d28f571fce16581910157d7a83222fdb950d4fc5e05f00fe1c81b not found: ID does not exist" Mar 09 09:33:12 crc kubenswrapper[4792]: I0309 09:33:12.088417 4792 scope.go:117] "RemoveContainer" containerID="d589fbd7012553ec80366de6e5fd6f8236b61969417965240a17c1585d2b721d" Mar 09 09:33:12 crc kubenswrapper[4792]: E0309 09:33:12.088871 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d589fbd7012553ec80366de6e5fd6f8236b61969417965240a17c1585d2b721d\": container with ID starting with d589fbd7012553ec80366de6e5fd6f8236b61969417965240a17c1585d2b721d not found: ID does not exist" containerID="d589fbd7012553ec80366de6e5fd6f8236b61969417965240a17c1585d2b721d" Mar 09 09:33:12 crc kubenswrapper[4792]: I0309 09:33:12.088905 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d589fbd7012553ec80366de6e5fd6f8236b61969417965240a17c1585d2b721d"} err="failed to get container status \"d589fbd7012553ec80366de6e5fd6f8236b61969417965240a17c1585d2b721d\": rpc error: code = NotFound desc = could not find container \"d589fbd7012553ec80366de6e5fd6f8236b61969417965240a17c1585d2b721d\": container with ID starting with d589fbd7012553ec80366de6e5fd6f8236b61969417965240a17c1585d2b721d not found: ID does not exist" Mar 09 09:33:12 crc kubenswrapper[4792]: I0309 09:33:12.088923 4792 scope.go:117] "RemoveContainer" containerID="27877bf111b511b01e3451456b4ea8da6f4dde7dee40b09e1278ac44e8044abc" Mar 09 09:33:12 crc kubenswrapper[4792]: E0309 
09:33:12.089157 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"27877bf111b511b01e3451456b4ea8da6f4dde7dee40b09e1278ac44e8044abc\": container with ID starting with 27877bf111b511b01e3451456b4ea8da6f4dde7dee40b09e1278ac44e8044abc not found: ID does not exist" containerID="27877bf111b511b01e3451456b4ea8da6f4dde7dee40b09e1278ac44e8044abc" Mar 09 09:33:12 crc kubenswrapper[4792]: I0309 09:33:12.089189 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27877bf111b511b01e3451456b4ea8da6f4dde7dee40b09e1278ac44e8044abc"} err="failed to get container status \"27877bf111b511b01e3451456b4ea8da6f4dde7dee40b09e1278ac44e8044abc\": rpc error: code = NotFound desc = could not find container \"27877bf111b511b01e3451456b4ea8da6f4dde7dee40b09e1278ac44e8044abc\": container with ID starting with 27877bf111b511b01e3451456b4ea8da6f4dde7dee40b09e1278ac44e8044abc not found: ID does not exist" Mar 09 09:33:13 crc kubenswrapper[4792]: I0309 09:33:13.213765 4792 patch_prober.go:28] interesting pod/machine-config-daemon-97tth container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 09:33:13 crc kubenswrapper[4792]: I0309 09:33:13.214290 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 09:33:13 crc kubenswrapper[4792]: I0309 09:33:13.674817 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="616b854a-0dca-4d1a-986e-2f40faf6b0ce" 
path="/var/lib/kubelet/pods/616b854a-0dca-4d1a-986e-2f40faf6b0ce/volumes" Mar 09 09:33:36 crc kubenswrapper[4792]: I0309 09:33:36.561224 4792 scope.go:117] "RemoveContainer" containerID="c96a19ff2f59602ea56ae7911d4e876641fc0a43b0a5c70f885180482fe6f265" Mar 09 09:33:36 crc kubenswrapper[4792]: I0309 09:33:36.583994 4792 scope.go:117] "RemoveContainer" containerID="14f2ceffbc3dff050abc449151ec442615efd4ae5b2dfe03dd08b701d6fd619d" Mar 09 09:33:36 crc kubenswrapper[4792]: I0309 09:33:36.603690 4792 scope.go:117] "RemoveContainer" containerID="45dc51a7fcbaafcdef79c0fb213dfea359f9ba0d8f0413168282ad54c40a8d28" Mar 09 09:33:36 crc kubenswrapper[4792]: I0309 09:33:36.631613 4792 scope.go:117] "RemoveContainer" containerID="6ed260b81fa4eecf1111f1318337d8dac3f19ef0e3602bfa7d40018fcc911f6e" Mar 09 09:33:36 crc kubenswrapper[4792]: I0309 09:33:36.652226 4792 scope.go:117] "RemoveContainer" containerID="ac6c9779f534b4a3049af4bbb6f14ce0c56b7f97c973afb2a2e643eeefe7c69d" Mar 09 09:33:43 crc kubenswrapper[4792]: I0309 09:33:43.214228 4792 patch_prober.go:28] interesting pod/machine-config-daemon-97tth container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 09:33:43 crc kubenswrapper[4792]: I0309 09:33:43.214744 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 09:33:43 crc kubenswrapper[4792]: I0309 09:33:43.214787 4792 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-97tth" Mar 09 09:33:43 crc kubenswrapper[4792]: I0309 09:33:43.215466 4792 
kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f7dc0a0360eecc4b3d3577ec00ef1a00d7c8c9a5f8b5910ab6ece3541b62af92"} pod="openshift-machine-config-operator/machine-config-daemon-97tth" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 09 09:33:43 crc kubenswrapper[4792]: I0309 09:33:43.215518 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" containerName="machine-config-daemon" containerID="cri-o://f7dc0a0360eecc4b3d3577ec00ef1a00d7c8c9a5f8b5910ab6ece3541b62af92" gracePeriod=600 Mar 09 09:33:43 crc kubenswrapper[4792]: E0309 09:33:43.866061 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-97tth_openshift-machine-config-operator(bd11045a-d746-4b42-872c-8b8d1dd2d515)\"" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" Mar 09 09:33:44 crc kubenswrapper[4792]: I0309 09:33:44.286126 4792 generic.go:334] "Generic (PLEG): container finished" podID="bd11045a-d746-4b42-872c-8b8d1dd2d515" containerID="f7dc0a0360eecc4b3d3577ec00ef1a00d7c8c9a5f8b5910ab6ece3541b62af92" exitCode=0 Mar 09 09:33:44 crc kubenswrapper[4792]: I0309 09:33:44.286183 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-97tth" event={"ID":"bd11045a-d746-4b42-872c-8b8d1dd2d515","Type":"ContainerDied","Data":"f7dc0a0360eecc4b3d3577ec00ef1a00d7c8c9a5f8b5910ab6ece3541b62af92"} Mar 09 09:33:44 crc kubenswrapper[4792]: I0309 09:33:44.286221 4792 scope.go:117] "RemoveContainer" 
containerID="80c12a8064763d9c808b56945a8c97d0c627b1e8c20ccecc1138b2635c8e12bd" Mar 09 09:33:44 crc kubenswrapper[4792]: I0309 09:33:44.286900 4792 scope.go:117] "RemoveContainer" containerID="f7dc0a0360eecc4b3d3577ec00ef1a00d7c8c9a5f8b5910ab6ece3541b62af92" Mar 09 09:33:44 crc kubenswrapper[4792]: E0309 09:33:44.287325 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-97tth_openshift-machine-config-operator(bd11045a-d746-4b42-872c-8b8d1dd2d515)\"" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" Mar 09 09:33:55 crc kubenswrapper[4792]: I0309 09:33:55.662236 4792 scope.go:117] "RemoveContainer" containerID="f7dc0a0360eecc4b3d3577ec00ef1a00d7c8c9a5f8b5910ab6ece3541b62af92" Mar 09 09:33:55 crc kubenswrapper[4792]: E0309 09:33:55.663949 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-97tth_openshift-machine-config-operator(bd11045a-d746-4b42-872c-8b8d1dd2d515)\"" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" Mar 09 09:34:00 crc kubenswrapper[4792]: I0309 09:34:00.146054 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550814-5vplj"] Mar 09 09:34:00 crc kubenswrapper[4792]: E0309 09:34:00.146938 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="616b854a-0dca-4d1a-986e-2f40faf6b0ce" containerName="extract-content" Mar 09 09:34:00 crc kubenswrapper[4792]: I0309 09:34:00.146951 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="616b854a-0dca-4d1a-986e-2f40faf6b0ce" containerName="extract-content" Mar 09 
09:34:00 crc kubenswrapper[4792]: E0309 09:34:00.146960 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ae84b32-8a77-45af-a24b-fdc6753b2d0c" containerName="extract-utilities" Mar 09 09:34:00 crc kubenswrapper[4792]: I0309 09:34:00.146966 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ae84b32-8a77-45af-a24b-fdc6753b2d0c" containerName="extract-utilities" Mar 09 09:34:00 crc kubenswrapper[4792]: E0309 09:34:00.146987 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ae84b32-8a77-45af-a24b-fdc6753b2d0c" containerName="extract-content" Mar 09 09:34:00 crc kubenswrapper[4792]: I0309 09:34:00.146993 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ae84b32-8a77-45af-a24b-fdc6753b2d0c" containerName="extract-content" Mar 09 09:34:00 crc kubenswrapper[4792]: E0309 09:34:00.147017 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="616b854a-0dca-4d1a-986e-2f40faf6b0ce" containerName="registry-server" Mar 09 09:34:00 crc kubenswrapper[4792]: I0309 09:34:00.147023 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="616b854a-0dca-4d1a-986e-2f40faf6b0ce" containerName="registry-server" Mar 09 09:34:00 crc kubenswrapper[4792]: E0309 09:34:00.147030 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ae84b32-8a77-45af-a24b-fdc6753b2d0c" containerName="registry-server" Mar 09 09:34:00 crc kubenswrapper[4792]: I0309 09:34:00.147035 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ae84b32-8a77-45af-a24b-fdc6753b2d0c" containerName="registry-server" Mar 09 09:34:00 crc kubenswrapper[4792]: E0309 09:34:00.147047 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="616b854a-0dca-4d1a-986e-2f40faf6b0ce" containerName="extract-utilities" Mar 09 09:34:00 crc kubenswrapper[4792]: I0309 09:34:00.147053 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="616b854a-0dca-4d1a-986e-2f40faf6b0ce" containerName="extract-utilities" Mar 09 
09:34:00 crc kubenswrapper[4792]: I0309 09:34:00.147222 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="616b854a-0dca-4d1a-986e-2f40faf6b0ce" containerName="registry-server" Mar 09 09:34:00 crc kubenswrapper[4792]: I0309 09:34:00.147264 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ae84b32-8a77-45af-a24b-fdc6753b2d0c" containerName="registry-server" Mar 09 09:34:00 crc kubenswrapper[4792]: I0309 09:34:00.147830 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550814-5vplj" Mar 09 09:34:00 crc kubenswrapper[4792]: I0309 09:34:00.152459 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 09:34:00 crc kubenswrapper[4792]: I0309 09:34:00.153813 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 09:34:00 crc kubenswrapper[4792]: I0309 09:34:00.153899 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fwclj" Mar 09 09:34:00 crc kubenswrapper[4792]: I0309 09:34:00.157576 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550814-5vplj"] Mar 09 09:34:00 crc kubenswrapper[4792]: I0309 09:34:00.246826 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nrpms\" (UniqueName: \"kubernetes.io/projected/afe8b396-f7ab-49c1-af21-5583ce0a3342-kube-api-access-nrpms\") pod \"auto-csr-approver-29550814-5vplj\" (UID: \"afe8b396-f7ab-49c1-af21-5583ce0a3342\") " pod="openshift-infra/auto-csr-approver-29550814-5vplj" Mar 09 09:34:00 crc kubenswrapper[4792]: I0309 09:34:00.348146 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nrpms\" (UniqueName: 
\"kubernetes.io/projected/afe8b396-f7ab-49c1-af21-5583ce0a3342-kube-api-access-nrpms\") pod \"auto-csr-approver-29550814-5vplj\" (UID: \"afe8b396-f7ab-49c1-af21-5583ce0a3342\") " pod="openshift-infra/auto-csr-approver-29550814-5vplj" Mar 09 09:34:00 crc kubenswrapper[4792]: I0309 09:34:00.370013 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nrpms\" (UniqueName: \"kubernetes.io/projected/afe8b396-f7ab-49c1-af21-5583ce0a3342-kube-api-access-nrpms\") pod \"auto-csr-approver-29550814-5vplj\" (UID: \"afe8b396-f7ab-49c1-af21-5583ce0a3342\") " pod="openshift-infra/auto-csr-approver-29550814-5vplj" Mar 09 09:34:00 crc kubenswrapper[4792]: I0309 09:34:00.464595 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550814-5vplj" Mar 09 09:34:00 crc kubenswrapper[4792]: I0309 09:34:00.924154 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550814-5vplj"] Mar 09 09:34:00 crc kubenswrapper[4792]: W0309 09:34:00.934295 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podafe8b396_f7ab_49c1_af21_5583ce0a3342.slice/crio-d48c60ec7671da1c10d5ab9e675e14b56d1b4f5926f2d0b6cb34fbbdc852281a WatchSource:0}: Error finding container d48c60ec7671da1c10d5ab9e675e14b56d1b4f5926f2d0b6cb34fbbdc852281a: Status 404 returned error can't find the container with id d48c60ec7671da1c10d5ab9e675e14b56d1b4f5926f2d0b6cb34fbbdc852281a Mar 09 09:34:01 crc kubenswrapper[4792]: I0309 09:34:01.436102 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550814-5vplj" event={"ID":"afe8b396-f7ab-49c1-af21-5583ce0a3342","Type":"ContainerStarted","Data":"d48c60ec7671da1c10d5ab9e675e14b56d1b4f5926f2d0b6cb34fbbdc852281a"} Mar 09 09:34:02 crc kubenswrapper[4792]: I0309 09:34:02.446380 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-infra/auto-csr-approver-29550814-5vplj" event={"ID":"afe8b396-f7ab-49c1-af21-5583ce0a3342","Type":"ContainerStarted","Data":"833acc7a1c95bf382ddc6bf917ea00d42caa207f55f8a92ccc8390d630b3e9a2"} Mar 09 09:34:02 crc kubenswrapper[4792]: I0309 09:34:02.467705 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29550814-5vplj" podStartSLOduration=1.497571876 podStartE2EDuration="2.467683887s" podCreationTimestamp="2026-03-09 09:34:00 +0000 UTC" firstStartedPulling="2026-03-09 09:34:00.941424707 +0000 UTC m=+1605.971625459" lastFinishedPulling="2026-03-09 09:34:01.911536728 +0000 UTC m=+1606.941737470" observedRunningTime="2026-03-09 09:34:02.466214484 +0000 UTC m=+1607.496415236" watchObservedRunningTime="2026-03-09 09:34:02.467683887 +0000 UTC m=+1607.497884650" Mar 09 09:34:03 crc kubenswrapper[4792]: I0309 09:34:03.455788 4792 generic.go:334] "Generic (PLEG): container finished" podID="afe8b396-f7ab-49c1-af21-5583ce0a3342" containerID="833acc7a1c95bf382ddc6bf917ea00d42caa207f55f8a92ccc8390d630b3e9a2" exitCode=0 Mar 09 09:34:03 crc kubenswrapper[4792]: I0309 09:34:03.455844 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550814-5vplj" event={"ID":"afe8b396-f7ab-49c1-af21-5583ce0a3342","Type":"ContainerDied","Data":"833acc7a1c95bf382ddc6bf917ea00d42caa207f55f8a92ccc8390d630b3e9a2"} Mar 09 09:34:04 crc kubenswrapper[4792]: I0309 09:34:04.827158 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550814-5vplj" Mar 09 09:34:04 crc kubenswrapper[4792]: I0309 09:34:04.930924 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nrpms\" (UniqueName: \"kubernetes.io/projected/afe8b396-f7ab-49c1-af21-5583ce0a3342-kube-api-access-nrpms\") pod \"afe8b396-f7ab-49c1-af21-5583ce0a3342\" (UID: \"afe8b396-f7ab-49c1-af21-5583ce0a3342\") " Mar 09 09:34:04 crc kubenswrapper[4792]: I0309 09:34:04.936515 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/afe8b396-f7ab-49c1-af21-5583ce0a3342-kube-api-access-nrpms" (OuterVolumeSpecName: "kube-api-access-nrpms") pod "afe8b396-f7ab-49c1-af21-5583ce0a3342" (UID: "afe8b396-f7ab-49c1-af21-5583ce0a3342"). InnerVolumeSpecName "kube-api-access-nrpms". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:34:05 crc kubenswrapper[4792]: I0309 09:34:05.033396 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nrpms\" (UniqueName: \"kubernetes.io/projected/afe8b396-f7ab-49c1-af21-5583ce0a3342-kube-api-access-nrpms\") on node \"crc\" DevicePath \"\"" Mar 09 09:34:05 crc kubenswrapper[4792]: I0309 09:34:05.477445 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550814-5vplj" event={"ID":"afe8b396-f7ab-49c1-af21-5583ce0a3342","Type":"ContainerDied","Data":"d48c60ec7671da1c10d5ab9e675e14b56d1b4f5926f2d0b6cb34fbbdc852281a"} Mar 09 09:34:05 crc kubenswrapper[4792]: I0309 09:34:05.477503 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d48c60ec7671da1c10d5ab9e675e14b56d1b4f5926f2d0b6cb34fbbdc852281a" Mar 09 09:34:05 crc kubenswrapper[4792]: I0309 09:34:05.477578 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550814-5vplj" Mar 09 09:34:05 crc kubenswrapper[4792]: I0309 09:34:05.555916 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550808-tq64m"] Mar 09 09:34:05 crc kubenswrapper[4792]: I0309 09:34:05.564182 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550808-tq64m"] Mar 09 09:34:05 crc kubenswrapper[4792]: I0309 09:34:05.673737 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a78e893e-c5f1-4524-9d89-39624fbaec63" path="/var/lib/kubelet/pods/a78e893e-c5f1-4524-9d89-39624fbaec63/volumes" Mar 09 09:34:09 crc kubenswrapper[4792]: I0309 09:34:09.662329 4792 scope.go:117] "RemoveContainer" containerID="f7dc0a0360eecc4b3d3577ec00ef1a00d7c8c9a5f8b5910ab6ece3541b62af92" Mar 09 09:34:09 crc kubenswrapper[4792]: E0309 09:34:09.663026 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-97tth_openshift-machine-config-operator(bd11045a-d746-4b42-872c-8b8d1dd2d515)\"" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" Mar 09 09:34:11 crc kubenswrapper[4792]: I0309 09:34:11.270977 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-qbwwg"] Mar 09 09:34:11 crc kubenswrapper[4792]: E0309 09:34:11.271454 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="afe8b396-f7ab-49c1-af21-5583ce0a3342" containerName="oc" Mar 09 09:34:11 crc kubenswrapper[4792]: I0309 09:34:11.271470 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="afe8b396-f7ab-49c1-af21-5583ce0a3342" containerName="oc" Mar 09 09:34:11 crc kubenswrapper[4792]: I0309 09:34:11.271712 4792 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="afe8b396-f7ab-49c1-af21-5583ce0a3342" containerName="oc" Mar 09 09:34:11 crc kubenswrapper[4792]: I0309 09:34:11.273259 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qbwwg" Mar 09 09:34:11 crc kubenswrapper[4792]: I0309 09:34:11.285696 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qbwwg"] Mar 09 09:34:11 crc kubenswrapper[4792]: I0309 09:34:11.350751 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b34ef7c3-9371-4b4d-b917-e0ade6699991-catalog-content\") pod \"community-operators-qbwwg\" (UID: \"b34ef7c3-9371-4b4d-b917-e0ade6699991\") " pod="openshift-marketplace/community-operators-qbwwg" Mar 09 09:34:11 crc kubenswrapper[4792]: I0309 09:34:11.350807 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b34ef7c3-9371-4b4d-b917-e0ade6699991-utilities\") pod \"community-operators-qbwwg\" (UID: \"b34ef7c3-9371-4b4d-b917-e0ade6699991\") " pod="openshift-marketplace/community-operators-qbwwg" Mar 09 09:34:11 crc kubenswrapper[4792]: I0309 09:34:11.350953 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jbrzx\" (UniqueName: \"kubernetes.io/projected/b34ef7c3-9371-4b4d-b917-e0ade6699991-kube-api-access-jbrzx\") pod \"community-operators-qbwwg\" (UID: \"b34ef7c3-9371-4b4d-b917-e0ade6699991\") " pod="openshift-marketplace/community-operators-qbwwg" Mar 09 09:34:11 crc kubenswrapper[4792]: I0309 09:34:11.452296 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b34ef7c3-9371-4b4d-b917-e0ade6699991-catalog-content\") pod \"community-operators-qbwwg\" (UID: 
\"b34ef7c3-9371-4b4d-b917-e0ade6699991\") " pod="openshift-marketplace/community-operators-qbwwg" Mar 09 09:34:11 crc kubenswrapper[4792]: I0309 09:34:11.452869 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b34ef7c3-9371-4b4d-b917-e0ade6699991-utilities\") pod \"community-operators-qbwwg\" (UID: \"b34ef7c3-9371-4b4d-b917-e0ade6699991\") " pod="openshift-marketplace/community-operators-qbwwg" Mar 09 09:34:11 crc kubenswrapper[4792]: I0309 09:34:11.453091 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jbrzx\" (UniqueName: \"kubernetes.io/projected/b34ef7c3-9371-4b4d-b917-e0ade6699991-kube-api-access-jbrzx\") pod \"community-operators-qbwwg\" (UID: \"b34ef7c3-9371-4b4d-b917-e0ade6699991\") " pod="openshift-marketplace/community-operators-qbwwg" Mar 09 09:34:11 crc kubenswrapper[4792]: I0309 09:34:11.453352 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b34ef7c3-9371-4b4d-b917-e0ade6699991-catalog-content\") pod \"community-operators-qbwwg\" (UID: \"b34ef7c3-9371-4b4d-b917-e0ade6699991\") " pod="openshift-marketplace/community-operators-qbwwg" Mar 09 09:34:11 crc kubenswrapper[4792]: I0309 09:34:11.453552 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b34ef7c3-9371-4b4d-b917-e0ade6699991-utilities\") pod \"community-operators-qbwwg\" (UID: \"b34ef7c3-9371-4b4d-b917-e0ade6699991\") " pod="openshift-marketplace/community-operators-qbwwg" Mar 09 09:34:11 crc kubenswrapper[4792]: I0309 09:34:11.478441 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jbrzx\" (UniqueName: \"kubernetes.io/projected/b34ef7c3-9371-4b4d-b917-e0ade6699991-kube-api-access-jbrzx\") pod \"community-operators-qbwwg\" (UID: 
\"b34ef7c3-9371-4b4d-b917-e0ade6699991\") " pod="openshift-marketplace/community-operators-qbwwg" Mar 09 09:34:11 crc kubenswrapper[4792]: I0309 09:34:11.600148 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qbwwg" Mar 09 09:34:12 crc kubenswrapper[4792]: I0309 09:34:12.239123 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qbwwg"] Mar 09 09:34:12 crc kubenswrapper[4792]: W0309 09:34:12.242668 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb34ef7c3_9371_4b4d_b917_e0ade6699991.slice/crio-f0b3c2fc883811420927b33e6e1fed4a2f76024c0acb52995e054ff1a1cb7189 WatchSource:0}: Error finding container f0b3c2fc883811420927b33e6e1fed4a2f76024c0acb52995e054ff1a1cb7189: Status 404 returned error can't find the container with id f0b3c2fc883811420927b33e6e1fed4a2f76024c0acb52995e054ff1a1cb7189 Mar 09 09:34:12 crc kubenswrapper[4792]: I0309 09:34:12.544140 4792 generic.go:334] "Generic (PLEG): container finished" podID="b34ef7c3-9371-4b4d-b917-e0ade6699991" containerID="be77829491867b5031a50f2721e96dd685d83b1fa711aa7d494f24a1848f0e4c" exitCode=0 Mar 09 09:34:12 crc kubenswrapper[4792]: I0309 09:34:12.544369 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qbwwg" event={"ID":"b34ef7c3-9371-4b4d-b917-e0ade6699991","Type":"ContainerDied","Data":"be77829491867b5031a50f2721e96dd685d83b1fa711aa7d494f24a1848f0e4c"} Mar 09 09:34:12 crc kubenswrapper[4792]: I0309 09:34:12.544492 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qbwwg" event={"ID":"b34ef7c3-9371-4b4d-b917-e0ade6699991","Type":"ContainerStarted","Data":"f0b3c2fc883811420927b33e6e1fed4a2f76024c0acb52995e054ff1a1cb7189"} Mar 09 09:34:19 crc kubenswrapper[4792]: I0309 09:34:19.611208 4792 generic.go:334] "Generic 
(PLEG): container finished" podID="b34ef7c3-9371-4b4d-b917-e0ade6699991" containerID="8e999fc597de5a5745e11b7c879956b243e91499e5210f9d4f13a46fd16a745c" exitCode=0 Mar 09 09:34:19 crc kubenswrapper[4792]: I0309 09:34:19.611295 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qbwwg" event={"ID":"b34ef7c3-9371-4b4d-b917-e0ade6699991","Type":"ContainerDied","Data":"8e999fc597de5a5745e11b7c879956b243e91499e5210f9d4f13a46fd16a745c"} Mar 09 09:34:20 crc kubenswrapper[4792]: I0309 09:34:20.622274 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qbwwg" event={"ID":"b34ef7c3-9371-4b4d-b917-e0ade6699991","Type":"ContainerStarted","Data":"a420b5f6f3275b3be694adc2d6a8ac0e48b9a2cc8571f912d2c7ab4722ffb6eb"} Mar 09 09:34:20 crc kubenswrapper[4792]: I0309 09:34:20.648879 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-qbwwg" podStartSLOduration=1.831475281 podStartE2EDuration="9.64885396s" podCreationTimestamp="2026-03-09 09:34:11 +0000 UTC" firstStartedPulling="2026-03-09 09:34:12.545661601 +0000 UTC m=+1617.575862353" lastFinishedPulling="2026-03-09 09:34:20.36304028 +0000 UTC m=+1625.393241032" observedRunningTime="2026-03-09 09:34:20.640777066 +0000 UTC m=+1625.670977818" watchObservedRunningTime="2026-03-09 09:34:20.64885396 +0000 UTC m=+1625.679054712" Mar 09 09:34:21 crc kubenswrapper[4792]: I0309 09:34:21.601418 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-qbwwg" Mar 09 09:34:21 crc kubenswrapper[4792]: I0309 09:34:21.601741 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-qbwwg" Mar 09 09:34:22 crc kubenswrapper[4792]: I0309 09:34:22.665498 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-qbwwg" 
podUID="b34ef7c3-9371-4b4d-b917-e0ade6699991" containerName="registry-server" probeResult="failure" output=< Mar 09 09:34:22 crc kubenswrapper[4792]: timeout: failed to connect service ":50051" within 1s Mar 09 09:34:22 crc kubenswrapper[4792]: > Mar 09 09:34:24 crc kubenswrapper[4792]: I0309 09:34:24.662144 4792 scope.go:117] "RemoveContainer" containerID="f7dc0a0360eecc4b3d3577ec00ef1a00d7c8c9a5f8b5910ab6ece3541b62af92" Mar 09 09:34:24 crc kubenswrapper[4792]: E0309 09:34:24.662774 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-97tth_openshift-machine-config-operator(bd11045a-d746-4b42-872c-8b8d1dd2d515)\"" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" Mar 09 09:34:31 crc kubenswrapper[4792]: I0309 09:34:31.651157 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-qbwwg" Mar 09 09:34:31 crc kubenswrapper[4792]: I0309 09:34:31.708554 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-qbwwg" Mar 09 09:34:31 crc kubenswrapper[4792]: I0309 09:34:31.799417 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qbwwg"] Mar 09 09:34:31 crc kubenswrapper[4792]: I0309 09:34:31.894173 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-r4pf7"] Mar 09 09:34:31 crc kubenswrapper[4792]: I0309 09:34:31.894406 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-r4pf7" podUID="7344810f-1120-46c5-af73-cd48477e1143" containerName="registry-server" 
containerID="cri-o://525a735387e8304627db5de420d3fdb43e2c065b58f491effb834cad16c16b18" gracePeriod=2 Mar 09 09:34:32 crc kubenswrapper[4792]: I0309 09:34:32.414785 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-r4pf7" Mar 09 09:34:32 crc kubenswrapper[4792]: I0309 09:34:32.548536 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-82cbf\" (UniqueName: \"kubernetes.io/projected/7344810f-1120-46c5-af73-cd48477e1143-kube-api-access-82cbf\") pod \"7344810f-1120-46c5-af73-cd48477e1143\" (UID: \"7344810f-1120-46c5-af73-cd48477e1143\") " Mar 09 09:34:32 crc kubenswrapper[4792]: I0309 09:34:32.548603 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7344810f-1120-46c5-af73-cd48477e1143-utilities\") pod \"7344810f-1120-46c5-af73-cd48477e1143\" (UID: \"7344810f-1120-46c5-af73-cd48477e1143\") " Mar 09 09:34:32 crc kubenswrapper[4792]: I0309 09:34:32.548622 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7344810f-1120-46c5-af73-cd48477e1143-catalog-content\") pod \"7344810f-1120-46c5-af73-cd48477e1143\" (UID: \"7344810f-1120-46c5-af73-cd48477e1143\") " Mar 09 09:34:32 crc kubenswrapper[4792]: I0309 09:34:32.549379 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7344810f-1120-46c5-af73-cd48477e1143-utilities" (OuterVolumeSpecName: "utilities") pod "7344810f-1120-46c5-af73-cd48477e1143" (UID: "7344810f-1120-46c5-af73-cd48477e1143"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:34:32 crc kubenswrapper[4792]: I0309 09:34:32.553852 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7344810f-1120-46c5-af73-cd48477e1143-kube-api-access-82cbf" (OuterVolumeSpecName: "kube-api-access-82cbf") pod "7344810f-1120-46c5-af73-cd48477e1143" (UID: "7344810f-1120-46c5-af73-cd48477e1143"). InnerVolumeSpecName "kube-api-access-82cbf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:34:32 crc kubenswrapper[4792]: I0309 09:34:32.598791 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7344810f-1120-46c5-af73-cd48477e1143-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7344810f-1120-46c5-af73-cd48477e1143" (UID: "7344810f-1120-46c5-af73-cd48477e1143"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:34:32 crc kubenswrapper[4792]: I0309 09:34:32.651041 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-82cbf\" (UniqueName: \"kubernetes.io/projected/7344810f-1120-46c5-af73-cd48477e1143-kube-api-access-82cbf\") on node \"crc\" DevicePath \"\"" Mar 09 09:34:32 crc kubenswrapper[4792]: I0309 09:34:32.651086 4792 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7344810f-1120-46c5-af73-cd48477e1143-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 09:34:32 crc kubenswrapper[4792]: I0309 09:34:32.651097 4792 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7344810f-1120-46c5-af73-cd48477e1143-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 09:34:32 crc kubenswrapper[4792]: I0309 09:34:32.755434 4792 generic.go:334] "Generic (PLEG): container finished" podID="7344810f-1120-46c5-af73-cd48477e1143" 
containerID="525a735387e8304627db5de420d3fdb43e2c065b58f491effb834cad16c16b18" exitCode=0 Mar 09 09:34:32 crc kubenswrapper[4792]: I0309 09:34:32.755529 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-r4pf7" Mar 09 09:34:32 crc kubenswrapper[4792]: I0309 09:34:32.755559 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r4pf7" event={"ID":"7344810f-1120-46c5-af73-cd48477e1143","Type":"ContainerDied","Data":"525a735387e8304627db5de420d3fdb43e2c065b58f491effb834cad16c16b18"} Mar 09 09:34:32 crc kubenswrapper[4792]: I0309 09:34:32.755588 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r4pf7" event={"ID":"7344810f-1120-46c5-af73-cd48477e1143","Type":"ContainerDied","Data":"94bbc9628628a1cd1ca91968ca7634c7f27d43dc6f1c42ce6ac7059a7a132ec9"} Mar 09 09:34:32 crc kubenswrapper[4792]: I0309 09:34:32.755610 4792 scope.go:117] "RemoveContainer" containerID="525a735387e8304627db5de420d3fdb43e2c065b58f491effb834cad16c16b18" Mar 09 09:34:32 crc kubenswrapper[4792]: I0309 09:34:32.780957 4792 scope.go:117] "RemoveContainer" containerID="482a7215f07b106d54aafd366fd817aabed5d5f936a4577645d2d03c20ffc652" Mar 09 09:34:32 crc kubenswrapper[4792]: I0309 09:34:32.805383 4792 scope.go:117] "RemoveContainer" containerID="271e3cb63dfa5cbef6ff4e64917563772ccc1da7d35974bb9c0dd927cb9ae7c8" Mar 09 09:34:32 crc kubenswrapper[4792]: I0309 09:34:32.815252 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-r4pf7"] Mar 09 09:34:32 crc kubenswrapper[4792]: I0309 09:34:32.824182 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-r4pf7"] Mar 09 09:34:32 crc kubenswrapper[4792]: I0309 09:34:32.855330 4792 scope.go:117] "RemoveContainer" containerID="525a735387e8304627db5de420d3fdb43e2c065b58f491effb834cad16c16b18" Mar 09 
09:34:32 crc kubenswrapper[4792]: E0309 09:34:32.855728 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"525a735387e8304627db5de420d3fdb43e2c065b58f491effb834cad16c16b18\": container with ID starting with 525a735387e8304627db5de420d3fdb43e2c065b58f491effb834cad16c16b18 not found: ID does not exist" containerID="525a735387e8304627db5de420d3fdb43e2c065b58f491effb834cad16c16b18" Mar 09 09:34:32 crc kubenswrapper[4792]: I0309 09:34:32.855756 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"525a735387e8304627db5de420d3fdb43e2c065b58f491effb834cad16c16b18"} err="failed to get container status \"525a735387e8304627db5de420d3fdb43e2c065b58f491effb834cad16c16b18\": rpc error: code = NotFound desc = could not find container \"525a735387e8304627db5de420d3fdb43e2c065b58f491effb834cad16c16b18\": container with ID starting with 525a735387e8304627db5de420d3fdb43e2c065b58f491effb834cad16c16b18 not found: ID does not exist" Mar 09 09:34:32 crc kubenswrapper[4792]: I0309 09:34:32.855776 4792 scope.go:117] "RemoveContainer" containerID="482a7215f07b106d54aafd366fd817aabed5d5f936a4577645d2d03c20ffc652" Mar 09 09:34:32 crc kubenswrapper[4792]: E0309 09:34:32.855979 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"482a7215f07b106d54aafd366fd817aabed5d5f936a4577645d2d03c20ffc652\": container with ID starting with 482a7215f07b106d54aafd366fd817aabed5d5f936a4577645d2d03c20ffc652 not found: ID does not exist" containerID="482a7215f07b106d54aafd366fd817aabed5d5f936a4577645d2d03c20ffc652" Mar 09 09:34:32 crc kubenswrapper[4792]: I0309 09:34:32.855997 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"482a7215f07b106d54aafd366fd817aabed5d5f936a4577645d2d03c20ffc652"} err="failed to get container status 
\"482a7215f07b106d54aafd366fd817aabed5d5f936a4577645d2d03c20ffc652\": rpc error: code = NotFound desc = could not find container \"482a7215f07b106d54aafd366fd817aabed5d5f936a4577645d2d03c20ffc652\": container with ID starting with 482a7215f07b106d54aafd366fd817aabed5d5f936a4577645d2d03c20ffc652 not found: ID does not exist" Mar 09 09:34:32 crc kubenswrapper[4792]: I0309 09:34:32.856010 4792 scope.go:117] "RemoveContainer" containerID="271e3cb63dfa5cbef6ff4e64917563772ccc1da7d35974bb9c0dd927cb9ae7c8" Mar 09 09:34:32 crc kubenswrapper[4792]: E0309 09:34:32.856898 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"271e3cb63dfa5cbef6ff4e64917563772ccc1da7d35974bb9c0dd927cb9ae7c8\": container with ID starting with 271e3cb63dfa5cbef6ff4e64917563772ccc1da7d35974bb9c0dd927cb9ae7c8 not found: ID does not exist" containerID="271e3cb63dfa5cbef6ff4e64917563772ccc1da7d35974bb9c0dd927cb9ae7c8" Mar 09 09:34:32 crc kubenswrapper[4792]: I0309 09:34:32.856921 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"271e3cb63dfa5cbef6ff4e64917563772ccc1da7d35974bb9c0dd927cb9ae7c8"} err="failed to get container status \"271e3cb63dfa5cbef6ff4e64917563772ccc1da7d35974bb9c0dd927cb9ae7c8\": rpc error: code = NotFound desc = could not find container \"271e3cb63dfa5cbef6ff4e64917563772ccc1da7d35974bb9c0dd927cb9ae7c8\": container with ID starting with 271e3cb63dfa5cbef6ff4e64917563772ccc1da7d35974bb9c0dd927cb9ae7c8 not found: ID does not exist" Mar 09 09:34:33 crc kubenswrapper[4792]: I0309 09:34:33.672613 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7344810f-1120-46c5-af73-cd48477e1143" path="/var/lib/kubelet/pods/7344810f-1120-46c5-af73-cd48477e1143/volumes" Mar 09 09:34:36 crc kubenswrapper[4792]: I0309 09:34:36.787950 4792 scope.go:117] "RemoveContainer" containerID="392bee4f37cc0a3737f5478d6db8de17174299f826a2b1710fcd79aae2bc9cd1" Mar 09 
09:34:38 crc kubenswrapper[4792]: I0309 09:34:38.662051 4792 scope.go:117] "RemoveContainer" containerID="f7dc0a0360eecc4b3d3577ec00ef1a00d7c8c9a5f8b5910ab6ece3541b62af92" Mar 09 09:34:38 crc kubenswrapper[4792]: E0309 09:34:38.662679 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-97tth_openshift-machine-config-operator(bd11045a-d746-4b42-872c-8b8d1dd2d515)\"" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" Mar 09 09:34:52 crc kubenswrapper[4792]: I0309 09:34:52.664213 4792 scope.go:117] "RemoveContainer" containerID="f7dc0a0360eecc4b3d3577ec00ef1a00d7c8c9a5f8b5910ab6ece3541b62af92" Mar 09 09:34:52 crc kubenswrapper[4792]: E0309 09:34:52.667395 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-97tth_openshift-machine-config-operator(bd11045a-d746-4b42-872c-8b8d1dd2d515)\"" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" Mar 09 09:35:04 crc kubenswrapper[4792]: I0309 09:35:04.662049 4792 scope.go:117] "RemoveContainer" containerID="f7dc0a0360eecc4b3d3577ec00ef1a00d7c8c9a5f8b5910ab6ece3541b62af92" Mar 09 09:35:04 crc kubenswrapper[4792]: E0309 09:35:04.663558 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-97tth_openshift-machine-config-operator(bd11045a-d746-4b42-872c-8b8d1dd2d515)\"" pod="openshift-machine-config-operator/machine-config-daemon-97tth" 
podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" Mar 09 09:35:09 crc kubenswrapper[4792]: I0309 09:35:09.047947 4792 generic.go:334] "Generic (PLEG): container finished" podID="2acce3b0-ccfb-48f6-af71-ecaa5b820874" containerID="0d2ba3a4a18a680ca58f3850cfa2b34550efc1260f9d446afb07dca113766b9b" exitCode=0 Mar 09 09:35:09 crc kubenswrapper[4792]: I0309 09:35:09.048026 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-7jfkr" event={"ID":"2acce3b0-ccfb-48f6-af71-ecaa5b820874","Type":"ContainerDied","Data":"0d2ba3a4a18a680ca58f3850cfa2b34550efc1260f9d446afb07dca113766b9b"} Mar 09 09:35:10 crc kubenswrapper[4792]: I0309 09:35:10.455312 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-7jfkr" Mar 09 09:35:10 crc kubenswrapper[4792]: I0309 09:35:10.551846 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7962k\" (UniqueName: \"kubernetes.io/projected/2acce3b0-ccfb-48f6-af71-ecaa5b820874-kube-api-access-7962k\") pod \"2acce3b0-ccfb-48f6-af71-ecaa5b820874\" (UID: \"2acce3b0-ccfb-48f6-af71-ecaa5b820874\") " Mar 09 09:35:10 crc kubenswrapper[4792]: I0309 09:35:10.551968 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2acce3b0-ccfb-48f6-af71-ecaa5b820874-inventory\") pod \"2acce3b0-ccfb-48f6-af71-ecaa5b820874\" (UID: \"2acce3b0-ccfb-48f6-af71-ecaa5b820874\") " Mar 09 09:35:10 crc kubenswrapper[4792]: I0309 09:35:10.551990 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2acce3b0-ccfb-48f6-af71-ecaa5b820874-bootstrap-combined-ca-bundle\") pod \"2acce3b0-ccfb-48f6-af71-ecaa5b820874\" (UID: \"2acce3b0-ccfb-48f6-af71-ecaa5b820874\") " Mar 09 09:35:10 crc kubenswrapper[4792]: I0309 
09:35:10.552060 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2acce3b0-ccfb-48f6-af71-ecaa5b820874-ssh-key-openstack-edpm-ipam\") pod \"2acce3b0-ccfb-48f6-af71-ecaa5b820874\" (UID: \"2acce3b0-ccfb-48f6-af71-ecaa5b820874\") " Mar 09 09:35:10 crc kubenswrapper[4792]: I0309 09:35:10.557545 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2acce3b0-ccfb-48f6-af71-ecaa5b820874-kube-api-access-7962k" (OuterVolumeSpecName: "kube-api-access-7962k") pod "2acce3b0-ccfb-48f6-af71-ecaa5b820874" (UID: "2acce3b0-ccfb-48f6-af71-ecaa5b820874"). InnerVolumeSpecName "kube-api-access-7962k". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:35:10 crc kubenswrapper[4792]: I0309 09:35:10.558707 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2acce3b0-ccfb-48f6-af71-ecaa5b820874-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "2acce3b0-ccfb-48f6-af71-ecaa5b820874" (UID: "2acce3b0-ccfb-48f6-af71-ecaa5b820874"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:35:10 crc kubenswrapper[4792]: I0309 09:35:10.580665 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2acce3b0-ccfb-48f6-af71-ecaa5b820874-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "2acce3b0-ccfb-48f6-af71-ecaa5b820874" (UID: "2acce3b0-ccfb-48f6-af71-ecaa5b820874"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:35:10 crc kubenswrapper[4792]: I0309 09:35:10.587774 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2acce3b0-ccfb-48f6-af71-ecaa5b820874-inventory" (OuterVolumeSpecName: "inventory") pod "2acce3b0-ccfb-48f6-af71-ecaa5b820874" (UID: "2acce3b0-ccfb-48f6-af71-ecaa5b820874"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:35:10 crc kubenswrapper[4792]: I0309 09:35:10.653958 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7962k\" (UniqueName: \"kubernetes.io/projected/2acce3b0-ccfb-48f6-af71-ecaa5b820874-kube-api-access-7962k\") on node \"crc\" DevicePath \"\"" Mar 09 09:35:10 crc kubenswrapper[4792]: I0309 09:35:10.654324 4792 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2acce3b0-ccfb-48f6-af71-ecaa5b820874-inventory\") on node \"crc\" DevicePath \"\"" Mar 09 09:35:10 crc kubenswrapper[4792]: I0309 09:35:10.654341 4792 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2acce3b0-ccfb-48f6-af71-ecaa5b820874-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 09:35:10 crc kubenswrapper[4792]: I0309 09:35:10.654355 4792 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2acce3b0-ccfb-48f6-af71-ecaa5b820874-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 09 09:35:11 crc kubenswrapper[4792]: I0309 09:35:11.065911 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-7jfkr" event={"ID":"2acce3b0-ccfb-48f6-af71-ecaa5b820874","Type":"ContainerDied","Data":"9c3a5a2e0857c9a8941dd2d2b3310abc6c30930595487c2f4898367259decb2f"} Mar 09 09:35:11 crc kubenswrapper[4792]: I0309 09:35:11.065963 4792 
pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9c3a5a2e0857c9a8941dd2d2b3310abc6c30930595487c2f4898367259decb2f" Mar 09 09:35:11 crc kubenswrapper[4792]: I0309 09:35:11.066091 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-7jfkr" Mar 09 09:35:11 crc kubenswrapper[4792]: I0309 09:35:11.151859 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-z8wb9"] Mar 09 09:35:11 crc kubenswrapper[4792]: E0309 09:35:11.152369 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7344810f-1120-46c5-af73-cd48477e1143" containerName="extract-utilities" Mar 09 09:35:11 crc kubenswrapper[4792]: I0309 09:35:11.152431 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="7344810f-1120-46c5-af73-cd48477e1143" containerName="extract-utilities" Mar 09 09:35:11 crc kubenswrapper[4792]: E0309 09:35:11.152490 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7344810f-1120-46c5-af73-cd48477e1143" containerName="registry-server" Mar 09 09:35:11 crc kubenswrapper[4792]: I0309 09:35:11.152565 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="7344810f-1120-46c5-af73-cd48477e1143" containerName="registry-server" Mar 09 09:35:11 crc kubenswrapper[4792]: E0309 09:35:11.152623 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7344810f-1120-46c5-af73-cd48477e1143" containerName="extract-content" Mar 09 09:35:11 crc kubenswrapper[4792]: I0309 09:35:11.152673 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="7344810f-1120-46c5-af73-cd48477e1143" containerName="extract-content" Mar 09 09:35:11 crc kubenswrapper[4792]: E0309 09:35:11.152738 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2acce3b0-ccfb-48f6-af71-ecaa5b820874" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Mar 09 
09:35:11 crc kubenswrapper[4792]: I0309 09:35:11.152787 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="2acce3b0-ccfb-48f6-af71-ecaa5b820874" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Mar 09 09:35:11 crc kubenswrapper[4792]: I0309 09:35:11.152985 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="7344810f-1120-46c5-af73-cd48477e1143" containerName="registry-server" Mar 09 09:35:11 crc kubenswrapper[4792]: I0309 09:35:11.153050 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="2acce3b0-ccfb-48f6-af71-ecaa5b820874" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Mar 09 09:35:11 crc kubenswrapper[4792]: I0309 09:35:11.153653 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-z8wb9" Mar 09 09:35:11 crc kubenswrapper[4792]: I0309 09:35:11.156281 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 09 09:35:11 crc kubenswrapper[4792]: I0309 09:35:11.156811 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 09 09:35:11 crc kubenswrapper[4792]: I0309 09:35:11.156865 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-4g5l6" Mar 09 09:35:11 crc kubenswrapper[4792]: I0309 09:35:11.156969 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 09 09:35:11 crc kubenswrapper[4792]: I0309 09:35:11.170780 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-z8wb9"] Mar 09 09:35:11 crc kubenswrapper[4792]: I0309 09:35:11.263210 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dj5ft\" (UniqueName: 
\"kubernetes.io/projected/7e3791a8-646a-4beb-a1b5-fa3390e5a9c9-kube-api-access-dj5ft\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-z8wb9\" (UID: \"7e3791a8-646a-4beb-a1b5-fa3390e5a9c9\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-z8wb9" Mar 09 09:35:11 crc kubenswrapper[4792]: I0309 09:35:11.263340 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7e3791a8-646a-4beb-a1b5-fa3390e5a9c9-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-z8wb9\" (UID: \"7e3791a8-646a-4beb-a1b5-fa3390e5a9c9\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-z8wb9" Mar 09 09:35:11 crc kubenswrapper[4792]: I0309 09:35:11.263419 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7e3791a8-646a-4beb-a1b5-fa3390e5a9c9-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-z8wb9\" (UID: \"7e3791a8-646a-4beb-a1b5-fa3390e5a9c9\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-z8wb9" Mar 09 09:35:11 crc kubenswrapper[4792]: I0309 09:35:11.365710 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7e3791a8-646a-4beb-a1b5-fa3390e5a9c9-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-z8wb9\" (UID: \"7e3791a8-646a-4beb-a1b5-fa3390e5a9c9\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-z8wb9" Mar 09 09:35:11 crc kubenswrapper[4792]: I0309 09:35:11.365820 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7e3791a8-646a-4beb-a1b5-fa3390e5a9c9-inventory\") pod 
\"configure-network-edpm-deployment-openstack-edpm-ipam-z8wb9\" (UID: \"7e3791a8-646a-4beb-a1b5-fa3390e5a9c9\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-z8wb9" Mar 09 09:35:11 crc kubenswrapper[4792]: I0309 09:35:11.365869 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dj5ft\" (UniqueName: \"kubernetes.io/projected/7e3791a8-646a-4beb-a1b5-fa3390e5a9c9-kube-api-access-dj5ft\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-z8wb9\" (UID: \"7e3791a8-646a-4beb-a1b5-fa3390e5a9c9\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-z8wb9" Mar 09 09:35:11 crc kubenswrapper[4792]: I0309 09:35:11.370640 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7e3791a8-646a-4beb-a1b5-fa3390e5a9c9-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-z8wb9\" (UID: \"7e3791a8-646a-4beb-a1b5-fa3390e5a9c9\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-z8wb9" Mar 09 09:35:11 crc kubenswrapper[4792]: I0309 09:35:11.374175 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7e3791a8-646a-4beb-a1b5-fa3390e5a9c9-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-z8wb9\" (UID: \"7e3791a8-646a-4beb-a1b5-fa3390e5a9c9\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-z8wb9" Mar 09 09:35:11 crc kubenswrapper[4792]: I0309 09:35:11.383312 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dj5ft\" (UniqueName: \"kubernetes.io/projected/7e3791a8-646a-4beb-a1b5-fa3390e5a9c9-kube-api-access-dj5ft\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-z8wb9\" (UID: \"7e3791a8-646a-4beb-a1b5-fa3390e5a9c9\") " 
pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-z8wb9" Mar 09 09:35:11 crc kubenswrapper[4792]: I0309 09:35:11.470680 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-z8wb9" Mar 09 09:35:12 crc kubenswrapper[4792]: I0309 09:35:12.020592 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-z8wb9"] Mar 09 09:35:12 crc kubenswrapper[4792]: I0309 09:35:12.078787 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-z8wb9" event={"ID":"7e3791a8-646a-4beb-a1b5-fa3390e5a9c9","Type":"ContainerStarted","Data":"1ea4180711328817187d8478a647cfe8f084dc5222d808a6491708da606910ad"} Mar 09 09:35:13 crc kubenswrapper[4792]: I0309 09:35:13.090808 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-z8wb9" event={"ID":"7e3791a8-646a-4beb-a1b5-fa3390e5a9c9","Type":"ContainerStarted","Data":"37dcf57975b6d388083f0f16b20d85f151d1702d8448266cf9ef436edcae57e2"} Mar 09 09:35:13 crc kubenswrapper[4792]: I0309 09:35:13.117530 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-z8wb9" podStartSLOduration=1.710308462 podStartE2EDuration="2.11750418s" podCreationTimestamp="2026-03-09 09:35:11 +0000 UTC" firstStartedPulling="2026-03-09 09:35:12.036766255 +0000 UTC m=+1677.066967007" lastFinishedPulling="2026-03-09 09:35:12.443961973 +0000 UTC m=+1677.474162725" observedRunningTime="2026-03-09 09:35:13.114707809 +0000 UTC m=+1678.144908561" watchObservedRunningTime="2026-03-09 09:35:13.11750418 +0000 UTC m=+1678.147704932" Mar 09 09:35:16 crc kubenswrapper[4792]: I0309 09:35:16.662529 4792 scope.go:117] "RemoveContainer" 
containerID="f7dc0a0360eecc4b3d3577ec00ef1a00d7c8c9a5f8b5910ab6ece3541b62af92" Mar 09 09:35:16 crc kubenswrapper[4792]: E0309 09:35:16.662998 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-97tth_openshift-machine-config-operator(bd11045a-d746-4b42-872c-8b8d1dd2d515)\"" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" Mar 09 09:35:30 crc kubenswrapper[4792]: I0309 09:35:30.662811 4792 scope.go:117] "RemoveContainer" containerID="f7dc0a0360eecc4b3d3577ec00ef1a00d7c8c9a5f8b5910ab6ece3541b62af92" Mar 09 09:35:30 crc kubenswrapper[4792]: E0309 09:35:30.663659 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-97tth_openshift-machine-config-operator(bd11045a-d746-4b42-872c-8b8d1dd2d515)\"" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" Mar 09 09:35:36 crc kubenswrapper[4792]: I0309 09:35:36.953522 4792 scope.go:117] "RemoveContainer" containerID="913d0de3b6f9b44689feb2573bf2725dab6c0f3e7a2266f132e60d88a150a4f2" Mar 09 09:35:41 crc kubenswrapper[4792]: I0309 09:35:41.662384 4792 scope.go:117] "RemoveContainer" containerID="f7dc0a0360eecc4b3d3577ec00ef1a00d7c8c9a5f8b5910ab6ece3541b62af92" Mar 09 09:35:41 crc kubenswrapper[4792]: E0309 09:35:41.663383 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-97tth_openshift-machine-config-operator(bd11045a-d746-4b42-872c-8b8d1dd2d515)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" Mar 09 09:35:55 crc kubenswrapper[4792]: I0309 09:35:55.054310 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-a9dd-account-create-update-4b996"] Mar 09 09:35:55 crc kubenswrapper[4792]: I0309 09:35:55.062548 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-8rmrl"] Mar 09 09:35:55 crc kubenswrapper[4792]: I0309 09:35:55.070659 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-a9dd-account-create-update-4b996"] Mar 09 09:35:55 crc kubenswrapper[4792]: I0309 09:35:55.081423 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-8rmrl"] Mar 09 09:35:55 crc kubenswrapper[4792]: I0309 09:35:55.667818 4792 scope.go:117] "RemoveContainer" containerID="f7dc0a0360eecc4b3d3577ec00ef1a00d7c8c9a5f8b5910ab6ece3541b62af92" Mar 09 09:35:55 crc kubenswrapper[4792]: E0309 09:35:55.668327 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-97tth_openshift-machine-config-operator(bd11045a-d746-4b42-872c-8b8d1dd2d515)\"" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" Mar 09 09:35:55 crc kubenswrapper[4792]: I0309 09:35:55.672579 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d3c50ddf-5317-4bd4-9fb2-c3a69a41b3d8" path="/var/lib/kubelet/pods/d3c50ddf-5317-4bd4-9fb2-c3a69a41b3d8/volumes" Mar 09 09:35:55 crc kubenswrapper[4792]: I0309 09:35:55.673517 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f020a8c2-5769-444a-96ee-21af21655306" path="/var/lib/kubelet/pods/f020a8c2-5769-444a-96ee-21af21655306/volumes" Mar 09 09:36:00 crc kubenswrapper[4792]: I0309 
09:36:00.142227 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550816-4qnsg"] Mar 09 09:36:00 crc kubenswrapper[4792]: I0309 09:36:00.144418 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550816-4qnsg" Mar 09 09:36:00 crc kubenswrapper[4792]: I0309 09:36:00.148138 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fwclj" Mar 09 09:36:00 crc kubenswrapper[4792]: I0309 09:36:00.149135 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 09:36:00 crc kubenswrapper[4792]: I0309 09:36:00.151924 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550816-4qnsg"] Mar 09 09:36:00 crc kubenswrapper[4792]: I0309 09:36:00.152907 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 09:36:00 crc kubenswrapper[4792]: I0309 09:36:00.244330 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kmk2g\" (UniqueName: \"kubernetes.io/projected/d07d2f67-727f-4c79-85c1-f24a90dd73e5-kube-api-access-kmk2g\") pod \"auto-csr-approver-29550816-4qnsg\" (UID: \"d07d2f67-727f-4c79-85c1-f24a90dd73e5\") " pod="openshift-infra/auto-csr-approver-29550816-4qnsg" Mar 09 09:36:00 crc kubenswrapper[4792]: I0309 09:36:00.346228 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kmk2g\" (UniqueName: \"kubernetes.io/projected/d07d2f67-727f-4c79-85c1-f24a90dd73e5-kube-api-access-kmk2g\") pod \"auto-csr-approver-29550816-4qnsg\" (UID: \"d07d2f67-727f-4c79-85c1-f24a90dd73e5\") " pod="openshift-infra/auto-csr-approver-29550816-4qnsg" Mar 09 09:36:00 crc kubenswrapper[4792]: I0309 09:36:00.370881 4792 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"kube-api-access-kmk2g\" (UniqueName: \"kubernetes.io/projected/d07d2f67-727f-4c79-85c1-f24a90dd73e5-kube-api-access-kmk2g\") pod \"auto-csr-approver-29550816-4qnsg\" (UID: \"d07d2f67-727f-4c79-85c1-f24a90dd73e5\") " pod="openshift-infra/auto-csr-approver-29550816-4qnsg" Mar 09 09:36:00 crc kubenswrapper[4792]: I0309 09:36:00.464739 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550816-4qnsg" Mar 09 09:36:00 crc kubenswrapper[4792]: I0309 09:36:00.893573 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550816-4qnsg"] Mar 09 09:36:01 crc kubenswrapper[4792]: I0309 09:36:01.520511 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550816-4qnsg" event={"ID":"d07d2f67-727f-4c79-85c1-f24a90dd73e5","Type":"ContainerStarted","Data":"8c764a0f9cfdf6c8bad16948eae7b3cabf5faf3961d2feea3c8cfee61460ab04"} Mar 09 09:36:02 crc kubenswrapper[4792]: I0309 09:36:02.051815 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-s8k8r"] Mar 09 09:36:02 crc kubenswrapper[4792]: I0309 09:36:02.074610 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-eaf0-account-create-update-b2b57"] Mar 09 09:36:02 crc kubenswrapper[4792]: I0309 09:36:02.092423 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-eaf0-account-create-update-b2b57"] Mar 09 09:36:02 crc kubenswrapper[4792]: I0309 09:36:02.105946 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-s8k8r"] Mar 09 09:36:02 crc kubenswrapper[4792]: I0309 09:36:02.116761 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-zzfgc"] Mar 09 09:36:02 crc kubenswrapper[4792]: I0309 09:36:02.126595 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-zzfgc"] Mar 09 09:36:02 crc 
kubenswrapper[4792]: I0309 09:36:02.136920 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-71f9-account-create-update-5wltv"] Mar 09 09:36:02 crc kubenswrapper[4792]: I0309 09:36:02.148202 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-71f9-account-create-update-5wltv"] Mar 09 09:36:02 crc kubenswrapper[4792]: I0309 09:36:02.530280 4792 generic.go:334] "Generic (PLEG): container finished" podID="d07d2f67-727f-4c79-85c1-f24a90dd73e5" containerID="dabbbeb457dc928eb9c1c64fecfffb01b1a1347ba11a1fc7130704452148f72d" exitCode=0 Mar 09 09:36:02 crc kubenswrapper[4792]: I0309 09:36:02.530461 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550816-4qnsg" event={"ID":"d07d2f67-727f-4c79-85c1-f24a90dd73e5","Type":"ContainerDied","Data":"dabbbeb457dc928eb9c1c64fecfffb01b1a1347ba11a1fc7130704452148f72d"} Mar 09 09:36:03 crc kubenswrapper[4792]: I0309 09:36:03.675282 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0e9e2cd8-39ac-46f3-aef1-3c4a81b4a144" path="/var/lib/kubelet/pods/0e9e2cd8-39ac-46f3-aef1-3c4a81b4a144/volumes" Mar 09 09:36:03 crc kubenswrapper[4792]: I0309 09:36:03.676194 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0f813716-691b-4f49-bbb6-e0486b2d2b31" path="/var/lib/kubelet/pods/0f813716-691b-4f49-bbb6-e0486b2d2b31/volumes" Mar 09 09:36:03 crc kubenswrapper[4792]: I0309 09:36:03.676823 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="856fa1f5-ef4a-4122-9a28-cabfe353eaeb" path="/var/lib/kubelet/pods/856fa1f5-ef4a-4122-9a28-cabfe353eaeb/volumes" Mar 09 09:36:03 crc kubenswrapper[4792]: I0309 09:36:03.679606 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd58f0df-aee5-44be-ae2a-33033696a043" path="/var/lib/kubelet/pods/cd58f0df-aee5-44be-ae2a-33033696a043/volumes" Mar 09 09:36:03 crc kubenswrapper[4792]: I0309 09:36:03.840020 4792 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550816-4qnsg" Mar 09 09:36:04 crc kubenswrapper[4792]: I0309 09:36:04.011443 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kmk2g\" (UniqueName: \"kubernetes.io/projected/d07d2f67-727f-4c79-85c1-f24a90dd73e5-kube-api-access-kmk2g\") pod \"d07d2f67-727f-4c79-85c1-f24a90dd73e5\" (UID: \"d07d2f67-727f-4c79-85c1-f24a90dd73e5\") " Mar 09 09:36:04 crc kubenswrapper[4792]: I0309 09:36:04.018560 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d07d2f67-727f-4c79-85c1-f24a90dd73e5-kube-api-access-kmk2g" (OuterVolumeSpecName: "kube-api-access-kmk2g") pod "d07d2f67-727f-4c79-85c1-f24a90dd73e5" (UID: "d07d2f67-727f-4c79-85c1-f24a90dd73e5"). InnerVolumeSpecName "kube-api-access-kmk2g". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:36:04 crc kubenswrapper[4792]: I0309 09:36:04.115113 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kmk2g\" (UniqueName: \"kubernetes.io/projected/d07d2f67-727f-4c79-85c1-f24a90dd73e5-kube-api-access-kmk2g\") on node \"crc\" DevicePath \"\"" Mar 09 09:36:04 crc kubenswrapper[4792]: I0309 09:36:04.548628 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550816-4qnsg" event={"ID":"d07d2f67-727f-4c79-85c1-f24a90dd73e5","Type":"ContainerDied","Data":"8c764a0f9cfdf6c8bad16948eae7b3cabf5faf3961d2feea3c8cfee61460ab04"} Mar 09 09:36:04 crc kubenswrapper[4792]: I0309 09:36:04.548674 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8c764a0f9cfdf6c8bad16948eae7b3cabf5faf3961d2feea3c8cfee61460ab04" Mar 09 09:36:04 crc kubenswrapper[4792]: I0309 09:36:04.548710 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550816-4qnsg" Mar 09 09:36:04 crc kubenswrapper[4792]: I0309 09:36:04.904206 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550810-8zk6b"] Mar 09 09:36:04 crc kubenswrapper[4792]: I0309 09:36:04.940770 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550810-8zk6b"] Mar 09 09:36:05 crc kubenswrapper[4792]: I0309 09:36:05.672935 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ac96eb08-404b-4da1-aecd-ae86494f9165" path="/var/lib/kubelet/pods/ac96eb08-404b-4da1-aecd-ae86494f9165/volumes" Mar 09 09:36:08 crc kubenswrapper[4792]: I0309 09:36:08.663149 4792 scope.go:117] "RemoveContainer" containerID="f7dc0a0360eecc4b3d3577ec00ef1a00d7c8c9a5f8b5910ab6ece3541b62af92" Mar 09 09:36:08 crc kubenswrapper[4792]: E0309 09:36:08.663657 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-97tth_openshift-machine-config-operator(bd11045a-d746-4b42-872c-8b8d1dd2d515)\"" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" Mar 09 09:36:15 crc kubenswrapper[4792]: I0309 09:36:15.031810 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-zknkp"] Mar 09 09:36:15 crc kubenswrapper[4792]: I0309 09:36:15.040713 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-zknkp"] Mar 09 09:36:15 crc kubenswrapper[4792]: I0309 09:36:15.673537 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="baba995c-6694-4351-9a7d-3078f73581e8" path="/var/lib/kubelet/pods/baba995c-6694-4351-9a7d-3078f73581e8/volumes" Mar 09 09:36:21 crc kubenswrapper[4792]: I0309 09:36:21.046655 
4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-tknqj"] Mar 09 09:36:21 crc kubenswrapper[4792]: I0309 09:36:21.055526 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-tknqj"] Mar 09 09:36:21 crc kubenswrapper[4792]: I0309 09:36:21.673008 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efc711fe-a152-4902-af60-09a6bed9344a" path="/var/lib/kubelet/pods/efc711fe-a152-4902-af60-09a6bed9344a/volumes" Mar 09 09:36:23 crc kubenswrapper[4792]: I0309 09:36:23.662207 4792 scope.go:117] "RemoveContainer" containerID="f7dc0a0360eecc4b3d3577ec00ef1a00d7c8c9a5f8b5910ab6ece3541b62af92" Mar 09 09:36:23 crc kubenswrapper[4792]: E0309 09:36:23.662750 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-97tth_openshift-machine-config-operator(bd11045a-d746-4b42-872c-8b8d1dd2d515)\"" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" Mar 09 09:36:31 crc kubenswrapper[4792]: I0309 09:36:31.034024 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-hwgng"] Mar 09 09:36:31 crc kubenswrapper[4792]: I0309 09:36:31.045473 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-hwgng"] Mar 09 09:36:31 crc kubenswrapper[4792]: I0309 09:36:31.671875 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dda26326-39c5-4d44-9b41-7b9cf206b4ac" path="/var/lib/kubelet/pods/dda26326-39c5-4d44-9b41-7b9cf206b4ac/volumes" Mar 09 09:36:34 crc kubenswrapper[4792]: I0309 09:36:34.031831 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-9w295"] Mar 09 09:36:34 crc kubenswrapper[4792]: I0309 09:36:34.039331 4792 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openstack/cinder-8a45-account-create-update-hss2q"] Mar 09 09:36:34 crc kubenswrapper[4792]: I0309 09:36:34.046885 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-8a45-account-create-update-hss2q"] Mar 09 09:36:34 crc kubenswrapper[4792]: I0309 09:36:34.054956 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-9w295"] Mar 09 09:36:35 crc kubenswrapper[4792]: I0309 09:36:35.029608 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-kcl6t"] Mar 09 09:36:35 crc kubenswrapper[4792]: I0309 09:36:35.041395 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-97cb-account-create-update-kfnzz"] Mar 09 09:36:35 crc kubenswrapper[4792]: I0309 09:36:35.054720 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-97cb-account-create-update-kfnzz"] Mar 09 09:36:35 crc kubenswrapper[4792]: I0309 09:36:35.065234 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-a92c-account-create-update-5547j"] Mar 09 09:36:35 crc kubenswrapper[4792]: I0309 09:36:35.072443 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-kcl6t"] Mar 09 09:36:35 crc kubenswrapper[4792]: I0309 09:36:35.079967 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-a92c-account-create-update-5547j"] Mar 09 09:36:35 crc kubenswrapper[4792]: I0309 09:36:35.668518 4792 scope.go:117] "RemoveContainer" containerID="f7dc0a0360eecc4b3d3577ec00ef1a00d7c8c9a5f8b5910ab6ece3541b62af92" Mar 09 09:36:35 crc kubenswrapper[4792]: E0309 09:36:35.668784 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-97tth_openshift-machine-config-operator(bd11045a-d746-4b42-872c-8b8d1dd2d515)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" Mar 09 09:36:35 crc kubenswrapper[4792]: I0309 09:36:35.672411 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="21c7264c-a882-4413-8e6a-c42ca77b7153" path="/var/lib/kubelet/pods/21c7264c-a882-4413-8e6a-c42ca77b7153/volumes" Mar 09 09:36:35 crc kubenswrapper[4792]: I0309 09:36:35.673180 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7ba4128e-9cd3-4ec9-a955-230ca12f4fed" path="/var/lib/kubelet/pods/7ba4128e-9cd3-4ec9-a955-230ca12f4fed/volumes" Mar 09 09:36:35 crc kubenswrapper[4792]: I0309 09:36:35.673775 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="93a03de8-9516-4678-aafa-e4ce0b5c6e2b" path="/var/lib/kubelet/pods/93a03de8-9516-4678-aafa-e4ce0b5c6e2b/volumes" Mar 09 09:36:35 crc kubenswrapper[4792]: I0309 09:36:35.674508 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ced6ea9f-bc40-4e2d-a7ba-6f1ef808548c" path="/var/lib/kubelet/pods/ced6ea9f-bc40-4e2d-a7ba-6f1ef808548c/volumes" Mar 09 09:36:35 crc kubenswrapper[4792]: I0309 09:36:35.675668 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff5b2f92-52e4-4f31-a3a5-75f94a5cac77" path="/var/lib/kubelet/pods/ff5b2f92-52e4-4f31-a3a5-75f94a5cac77/volumes" Mar 09 09:36:37 crc kubenswrapper[4792]: I0309 09:36:37.000890 4792 scope.go:117] "RemoveContainer" containerID="b24252fb970fc808687181defaae9df9662e3593786a5d40acd86f1a852d7188" Mar 09 09:36:37 crc kubenswrapper[4792]: I0309 09:36:37.024028 4792 scope.go:117] "RemoveContainer" containerID="1ce2f0989d353fbf89243b76a683bd93a6057b364bb790ff3c931ade1fc5a8c5" Mar 09 09:36:37 crc kubenswrapper[4792]: I0309 09:36:37.081762 4792 scope.go:117] "RemoveContainer" containerID="1faa98db718e8b2c1afb8d6384f790c449345fd1741e2bfac48a559c6b9e1c5a" Mar 09 09:36:37 crc kubenswrapper[4792]: I0309 09:36:37.129134 4792 scope.go:117] 
"RemoveContainer" containerID="c92ddaa7c82d4d86f3f797b790568d0792df8e9869f6b5a008140042de6024ca" Mar 09 09:36:37 crc kubenswrapper[4792]: I0309 09:36:37.179091 4792 scope.go:117] "RemoveContainer" containerID="2e3d6b7392ac5cc440f301bb1dfbdd7f36a6534e039d0779198434011c366841" Mar 09 09:36:37 crc kubenswrapper[4792]: I0309 09:36:37.230495 4792 scope.go:117] "RemoveContainer" containerID="7cd96cf84084a79b039ba22f8c9a2a17dfae2180410aa03d3bb80fc4d555f917" Mar 09 09:36:37 crc kubenswrapper[4792]: I0309 09:36:37.249580 4792 scope.go:117] "RemoveContainer" containerID="9b851087551136349181e1004ac0ab3cd58c966eb6ba358b55755d0b33558b72" Mar 09 09:36:37 crc kubenswrapper[4792]: I0309 09:36:37.268270 4792 scope.go:117] "RemoveContainer" containerID="434f86a5bfe02916bbcdda6126e7faef064706b34062cc52af17a905e7d4557c" Mar 09 09:36:37 crc kubenswrapper[4792]: I0309 09:36:37.289651 4792 scope.go:117] "RemoveContainer" containerID="e690e01c1bda4fa80fa320895f2ddbaa3a88f6ac109e0fae86eab32e4b37c482" Mar 09 09:36:37 crc kubenswrapper[4792]: I0309 09:36:37.310424 4792 scope.go:117] "RemoveContainer" containerID="4924ec075293347d1f95edfef61d222d2858972f95097e529d02087fead3cdb3" Mar 09 09:36:37 crc kubenswrapper[4792]: I0309 09:36:37.334905 4792 scope.go:117] "RemoveContainer" containerID="5fd8b27c01a11def787b3e7d0415a5f4c7b3f3fe4c170c7692b8cac930f3e052" Mar 09 09:36:37 crc kubenswrapper[4792]: I0309 09:36:37.355702 4792 scope.go:117] "RemoveContainer" containerID="1626f0438cbc7e0d86aae5e1aa7304448648a0c8624c8d7f1a59e46cc1c74e68" Mar 09 09:36:37 crc kubenswrapper[4792]: I0309 09:36:37.377886 4792 scope.go:117] "RemoveContainer" containerID="e71af4a647d960250e360fd6ab2fe5dda9c3ec25e26b9e22634df30d9890218a" Mar 09 09:36:37 crc kubenswrapper[4792]: I0309 09:36:37.433349 4792 scope.go:117] "RemoveContainer" containerID="0aad0fe7657f7fc5d17dac4f6c6fbf72129ee4b6a2e989da744288b54468e4ba" Mar 09 09:36:37 crc kubenswrapper[4792]: I0309 09:36:37.454418 4792 scope.go:117] "RemoveContainer" 
containerID="3979e6dbc46490911e92e4ca52d70831161c415e9a0d2b8d7030d57ba6b3b7b5" Mar 09 09:36:40 crc kubenswrapper[4792]: I0309 09:36:40.027222 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-267hg"] Mar 09 09:36:40 crc kubenswrapper[4792]: I0309 09:36:40.040195 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-267hg"] Mar 09 09:36:41 crc kubenswrapper[4792]: I0309 09:36:41.673210 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3dfbc031-90cc-42ac-b543-febe1469f699" path="/var/lib/kubelet/pods/3dfbc031-90cc-42ac-b543-febe1469f699/volumes" Mar 09 09:36:46 crc kubenswrapper[4792]: I0309 09:36:46.902583 4792 generic.go:334] "Generic (PLEG): container finished" podID="7e3791a8-646a-4beb-a1b5-fa3390e5a9c9" containerID="37dcf57975b6d388083f0f16b20d85f151d1702d8448266cf9ef436edcae57e2" exitCode=0 Mar 09 09:36:46 crc kubenswrapper[4792]: I0309 09:36:46.902696 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-z8wb9" event={"ID":"7e3791a8-646a-4beb-a1b5-fa3390e5a9c9","Type":"ContainerDied","Data":"37dcf57975b6d388083f0f16b20d85f151d1702d8448266cf9ef436edcae57e2"} Mar 09 09:36:48 crc kubenswrapper[4792]: I0309 09:36:48.346489 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-z8wb9" Mar 09 09:36:48 crc kubenswrapper[4792]: I0309 09:36:48.503058 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dj5ft\" (UniqueName: \"kubernetes.io/projected/7e3791a8-646a-4beb-a1b5-fa3390e5a9c9-kube-api-access-dj5ft\") pod \"7e3791a8-646a-4beb-a1b5-fa3390e5a9c9\" (UID: \"7e3791a8-646a-4beb-a1b5-fa3390e5a9c9\") " Mar 09 09:36:48 crc kubenswrapper[4792]: I0309 09:36:48.503153 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7e3791a8-646a-4beb-a1b5-fa3390e5a9c9-inventory\") pod \"7e3791a8-646a-4beb-a1b5-fa3390e5a9c9\" (UID: \"7e3791a8-646a-4beb-a1b5-fa3390e5a9c9\") " Mar 09 09:36:48 crc kubenswrapper[4792]: I0309 09:36:48.503351 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7e3791a8-646a-4beb-a1b5-fa3390e5a9c9-ssh-key-openstack-edpm-ipam\") pod \"7e3791a8-646a-4beb-a1b5-fa3390e5a9c9\" (UID: \"7e3791a8-646a-4beb-a1b5-fa3390e5a9c9\") " Mar 09 09:36:48 crc kubenswrapper[4792]: I0309 09:36:48.512933 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e3791a8-646a-4beb-a1b5-fa3390e5a9c9-kube-api-access-dj5ft" (OuterVolumeSpecName: "kube-api-access-dj5ft") pod "7e3791a8-646a-4beb-a1b5-fa3390e5a9c9" (UID: "7e3791a8-646a-4beb-a1b5-fa3390e5a9c9"). InnerVolumeSpecName "kube-api-access-dj5ft". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:36:48 crc kubenswrapper[4792]: I0309 09:36:48.532092 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e3791a8-646a-4beb-a1b5-fa3390e5a9c9-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "7e3791a8-646a-4beb-a1b5-fa3390e5a9c9" (UID: "7e3791a8-646a-4beb-a1b5-fa3390e5a9c9"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:36:48 crc kubenswrapper[4792]: I0309 09:36:48.549305 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e3791a8-646a-4beb-a1b5-fa3390e5a9c9-inventory" (OuterVolumeSpecName: "inventory") pod "7e3791a8-646a-4beb-a1b5-fa3390e5a9c9" (UID: "7e3791a8-646a-4beb-a1b5-fa3390e5a9c9"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:36:48 crc kubenswrapper[4792]: I0309 09:36:48.605385 4792 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7e3791a8-646a-4beb-a1b5-fa3390e5a9c9-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 09 09:36:48 crc kubenswrapper[4792]: I0309 09:36:48.605587 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dj5ft\" (UniqueName: \"kubernetes.io/projected/7e3791a8-646a-4beb-a1b5-fa3390e5a9c9-kube-api-access-dj5ft\") on node \"crc\" DevicePath \"\"" Mar 09 09:36:48 crc kubenswrapper[4792]: I0309 09:36:48.605666 4792 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7e3791a8-646a-4beb-a1b5-fa3390e5a9c9-inventory\") on node \"crc\" DevicePath \"\"" Mar 09 09:36:48 crc kubenswrapper[4792]: I0309 09:36:48.920717 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-z8wb9" 
event={"ID":"7e3791a8-646a-4beb-a1b5-fa3390e5a9c9","Type":"ContainerDied","Data":"1ea4180711328817187d8478a647cfe8f084dc5222d808a6491708da606910ad"} Mar 09 09:36:48 crc kubenswrapper[4792]: I0309 09:36:48.920765 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1ea4180711328817187d8478a647cfe8f084dc5222d808a6491708da606910ad" Mar 09 09:36:48 crc kubenswrapper[4792]: I0309 09:36:48.920763 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-z8wb9" Mar 09 09:36:49 crc kubenswrapper[4792]: I0309 09:36:49.007944 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-h7rdt"] Mar 09 09:36:49 crc kubenswrapper[4792]: E0309 09:36:49.008472 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d07d2f67-727f-4c79-85c1-f24a90dd73e5" containerName="oc" Mar 09 09:36:49 crc kubenswrapper[4792]: I0309 09:36:49.008496 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="d07d2f67-727f-4c79-85c1-f24a90dd73e5" containerName="oc" Mar 09 09:36:49 crc kubenswrapper[4792]: E0309 09:36:49.008525 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e3791a8-646a-4beb-a1b5-fa3390e5a9c9" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Mar 09 09:36:49 crc kubenswrapper[4792]: I0309 09:36:49.008537 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e3791a8-646a-4beb-a1b5-fa3390e5a9c9" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Mar 09 09:36:49 crc kubenswrapper[4792]: I0309 09:36:49.008773 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e3791a8-646a-4beb-a1b5-fa3390e5a9c9" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Mar 09 09:36:49 crc kubenswrapper[4792]: I0309 09:36:49.008818 4792 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="d07d2f67-727f-4c79-85c1-f24a90dd73e5" containerName="oc" Mar 09 09:36:49 crc kubenswrapper[4792]: I0309 09:36:49.009578 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-h7rdt" Mar 09 09:36:49 crc kubenswrapper[4792]: I0309 09:36:49.016744 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 09 09:36:49 crc kubenswrapper[4792]: I0309 09:36:49.016950 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-4g5l6" Mar 09 09:36:49 crc kubenswrapper[4792]: I0309 09:36:49.016745 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 09 09:36:49 crc kubenswrapper[4792]: I0309 09:36:49.017243 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 09 09:36:49 crc kubenswrapper[4792]: I0309 09:36:49.021970 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-h7rdt"] Mar 09 09:36:49 crc kubenswrapper[4792]: I0309 09:36:49.118797 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gqc8w\" (UniqueName: \"kubernetes.io/projected/dcfea667-10ed-44bd-8bf0-c41767b68d61-kube-api-access-gqc8w\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-h7rdt\" (UID: \"dcfea667-10ed-44bd-8bf0-c41767b68d61\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-h7rdt" Mar 09 09:36:49 crc kubenswrapper[4792]: I0309 09:36:49.118913 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/dcfea667-10ed-44bd-8bf0-c41767b68d61-ssh-key-openstack-edpm-ipam\") pod 
\"validate-network-edpm-deployment-openstack-edpm-ipam-h7rdt\" (UID: \"dcfea667-10ed-44bd-8bf0-c41767b68d61\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-h7rdt" Mar 09 09:36:49 crc kubenswrapper[4792]: I0309 09:36:49.118944 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dcfea667-10ed-44bd-8bf0-c41767b68d61-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-h7rdt\" (UID: \"dcfea667-10ed-44bd-8bf0-c41767b68d61\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-h7rdt" Mar 09 09:36:49 crc kubenswrapper[4792]: I0309 09:36:49.220175 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dcfea667-10ed-44bd-8bf0-c41767b68d61-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-h7rdt\" (UID: \"dcfea667-10ed-44bd-8bf0-c41767b68d61\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-h7rdt" Mar 09 09:36:49 crc kubenswrapper[4792]: I0309 09:36:49.220319 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gqc8w\" (UniqueName: \"kubernetes.io/projected/dcfea667-10ed-44bd-8bf0-c41767b68d61-kube-api-access-gqc8w\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-h7rdt\" (UID: \"dcfea667-10ed-44bd-8bf0-c41767b68d61\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-h7rdt" Mar 09 09:36:49 crc kubenswrapper[4792]: I0309 09:36:49.220367 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/dcfea667-10ed-44bd-8bf0-c41767b68d61-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-h7rdt\" (UID: \"dcfea667-10ed-44bd-8bf0-c41767b68d61\") " 
pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-h7rdt" Mar 09 09:36:49 crc kubenswrapper[4792]: I0309 09:36:49.226754 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dcfea667-10ed-44bd-8bf0-c41767b68d61-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-h7rdt\" (UID: \"dcfea667-10ed-44bd-8bf0-c41767b68d61\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-h7rdt" Mar 09 09:36:49 crc kubenswrapper[4792]: I0309 09:36:49.227440 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/dcfea667-10ed-44bd-8bf0-c41767b68d61-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-h7rdt\" (UID: \"dcfea667-10ed-44bd-8bf0-c41767b68d61\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-h7rdt" Mar 09 09:36:49 crc kubenswrapper[4792]: I0309 09:36:49.246769 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gqc8w\" (UniqueName: \"kubernetes.io/projected/dcfea667-10ed-44bd-8bf0-c41767b68d61-kube-api-access-gqc8w\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-h7rdt\" (UID: \"dcfea667-10ed-44bd-8bf0-c41767b68d61\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-h7rdt" Mar 09 09:36:49 crc kubenswrapper[4792]: I0309 09:36:49.329434 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-h7rdt" Mar 09 09:36:49 crc kubenswrapper[4792]: I0309 09:36:49.663079 4792 scope.go:117] "RemoveContainer" containerID="f7dc0a0360eecc4b3d3577ec00ef1a00d7c8c9a5f8b5910ab6ece3541b62af92" Mar 09 09:36:49 crc kubenswrapper[4792]: E0309 09:36:49.663497 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-97tth_openshift-machine-config-operator(bd11045a-d746-4b42-872c-8b8d1dd2d515)\"" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" Mar 09 09:36:49 crc kubenswrapper[4792]: I0309 09:36:49.914520 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-h7rdt"] Mar 09 09:36:49 crc kubenswrapper[4792]: W0309 09:36:49.928197 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddcfea667_10ed_44bd_8bf0_c41767b68d61.slice/crio-ba4c117e038aa8df856983a353f489ea79c162ea04108bde8c3c065513931d59 WatchSource:0}: Error finding container ba4c117e038aa8df856983a353f489ea79c162ea04108bde8c3c065513931d59: Status 404 returned error can't find the container with id ba4c117e038aa8df856983a353f489ea79c162ea04108bde8c3c065513931d59 Mar 09 09:36:50 crc kubenswrapper[4792]: I0309 09:36:50.944850 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-h7rdt" event={"ID":"dcfea667-10ed-44bd-8bf0-c41767b68d61","Type":"ContainerStarted","Data":"76e6e59a6cbe9324a897226295db80b5750be1f56f01c7c67683d9eb2261880a"} Mar 09 09:36:50 crc kubenswrapper[4792]: I0309 09:36:50.945222 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-h7rdt" event={"ID":"dcfea667-10ed-44bd-8bf0-c41767b68d61","Type":"ContainerStarted","Data":"ba4c117e038aa8df856983a353f489ea79c162ea04108bde8c3c065513931d59"} Mar 09 09:36:50 crc kubenswrapper[4792]: I0309 09:36:50.966531 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-h7rdt" podStartSLOduration=2.428610745 podStartE2EDuration="2.966510161s" podCreationTimestamp="2026-03-09 09:36:48 +0000 UTC" firstStartedPulling="2026-03-09 09:36:49.935756777 +0000 UTC m=+1774.965957529" lastFinishedPulling="2026-03-09 09:36:50.473656193 +0000 UTC m=+1775.503856945" observedRunningTime="2026-03-09 09:36:50.960110357 +0000 UTC m=+1775.990311109" watchObservedRunningTime="2026-03-09 09:36:50.966510161 +0000 UTC m=+1775.996710913" Mar 09 09:36:55 crc kubenswrapper[4792]: I0309 09:36:55.980848 4792 generic.go:334] "Generic (PLEG): container finished" podID="dcfea667-10ed-44bd-8bf0-c41767b68d61" containerID="76e6e59a6cbe9324a897226295db80b5750be1f56f01c7c67683d9eb2261880a" exitCode=0 Mar 09 09:36:55 crc kubenswrapper[4792]: I0309 09:36:55.980930 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-h7rdt" event={"ID":"dcfea667-10ed-44bd-8bf0-c41767b68d61","Type":"ContainerDied","Data":"76e6e59a6cbe9324a897226295db80b5750be1f56f01c7c67683d9eb2261880a"} Mar 09 09:36:57 crc kubenswrapper[4792]: I0309 09:36:57.505271 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-h7rdt" Mar 09 09:36:57 crc kubenswrapper[4792]: I0309 09:36:57.682391 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/dcfea667-10ed-44bd-8bf0-c41767b68d61-ssh-key-openstack-edpm-ipam\") pod \"dcfea667-10ed-44bd-8bf0-c41767b68d61\" (UID: \"dcfea667-10ed-44bd-8bf0-c41767b68d61\") " Mar 09 09:36:57 crc kubenswrapper[4792]: I0309 09:36:57.682574 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gqc8w\" (UniqueName: \"kubernetes.io/projected/dcfea667-10ed-44bd-8bf0-c41767b68d61-kube-api-access-gqc8w\") pod \"dcfea667-10ed-44bd-8bf0-c41767b68d61\" (UID: \"dcfea667-10ed-44bd-8bf0-c41767b68d61\") " Mar 09 09:36:57 crc kubenswrapper[4792]: I0309 09:36:57.683372 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dcfea667-10ed-44bd-8bf0-c41767b68d61-inventory\") pod \"dcfea667-10ed-44bd-8bf0-c41767b68d61\" (UID: \"dcfea667-10ed-44bd-8bf0-c41767b68d61\") " Mar 09 09:36:57 crc kubenswrapper[4792]: I0309 09:36:57.695302 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dcfea667-10ed-44bd-8bf0-c41767b68d61-kube-api-access-gqc8w" (OuterVolumeSpecName: "kube-api-access-gqc8w") pod "dcfea667-10ed-44bd-8bf0-c41767b68d61" (UID: "dcfea667-10ed-44bd-8bf0-c41767b68d61"). InnerVolumeSpecName "kube-api-access-gqc8w". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:36:57 crc kubenswrapper[4792]: I0309 09:36:57.710561 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dcfea667-10ed-44bd-8bf0-c41767b68d61-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "dcfea667-10ed-44bd-8bf0-c41767b68d61" (UID: "dcfea667-10ed-44bd-8bf0-c41767b68d61"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:36:57 crc kubenswrapper[4792]: I0309 09:36:57.711460 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dcfea667-10ed-44bd-8bf0-c41767b68d61-inventory" (OuterVolumeSpecName: "inventory") pod "dcfea667-10ed-44bd-8bf0-c41767b68d61" (UID: "dcfea667-10ed-44bd-8bf0-c41767b68d61"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:36:57 crc kubenswrapper[4792]: I0309 09:36:57.785709 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gqc8w\" (UniqueName: \"kubernetes.io/projected/dcfea667-10ed-44bd-8bf0-c41767b68d61-kube-api-access-gqc8w\") on node \"crc\" DevicePath \"\"" Mar 09 09:36:57 crc kubenswrapper[4792]: I0309 09:36:57.785750 4792 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dcfea667-10ed-44bd-8bf0-c41767b68d61-inventory\") on node \"crc\" DevicePath \"\"" Mar 09 09:36:57 crc kubenswrapper[4792]: I0309 09:36:57.785765 4792 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/dcfea667-10ed-44bd-8bf0-c41767b68d61-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 09 09:36:57 crc kubenswrapper[4792]: I0309 09:36:57.997632 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-h7rdt" 
event={"ID":"dcfea667-10ed-44bd-8bf0-c41767b68d61","Type":"ContainerDied","Data":"ba4c117e038aa8df856983a353f489ea79c162ea04108bde8c3c065513931d59"} Mar 09 09:36:57 crc kubenswrapper[4792]: I0309 09:36:57.998100 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ba4c117e038aa8df856983a353f489ea79c162ea04108bde8c3c065513931d59" Mar 09 09:36:57 crc kubenswrapper[4792]: I0309 09:36:57.997859 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-h7rdt" Mar 09 09:36:58 crc kubenswrapper[4792]: I0309 09:36:58.091706 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-h7mbm"] Mar 09 09:36:58 crc kubenswrapper[4792]: E0309 09:36:58.092108 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dcfea667-10ed-44bd-8bf0-c41767b68d61" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Mar 09 09:36:58 crc kubenswrapper[4792]: I0309 09:36:58.092130 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="dcfea667-10ed-44bd-8bf0-c41767b68d61" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Mar 09 09:36:58 crc kubenswrapper[4792]: I0309 09:36:58.092315 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="dcfea667-10ed-44bd-8bf0-c41767b68d61" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Mar 09 09:36:58 crc kubenswrapper[4792]: I0309 09:36:58.092884 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-h7mbm" Mar 09 09:36:58 crc kubenswrapper[4792]: I0309 09:36:58.095220 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 09 09:36:58 crc kubenswrapper[4792]: I0309 09:36:58.095310 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-4g5l6" Mar 09 09:36:58 crc kubenswrapper[4792]: I0309 09:36:58.095557 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 09 09:36:58 crc kubenswrapper[4792]: I0309 09:36:58.097160 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 09 09:36:58 crc kubenswrapper[4792]: I0309 09:36:58.109340 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-h7mbm"] Mar 09 09:36:58 crc kubenswrapper[4792]: I0309 09:36:58.191377 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/30cac2f1-8f76-42a8-a5dd-e828ed1d7430-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-h7mbm\" (UID: \"30cac2f1-8f76-42a8-a5dd-e828ed1d7430\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-h7mbm" Mar 09 09:36:58 crc kubenswrapper[4792]: I0309 09:36:58.191501 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/30cac2f1-8f76-42a8-a5dd-e828ed1d7430-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-h7mbm\" (UID: \"30cac2f1-8f76-42a8-a5dd-e828ed1d7430\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-h7mbm" Mar 09 09:36:58 crc kubenswrapper[4792]: I0309 09:36:58.191583 4792 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2d96m\" (UniqueName: \"kubernetes.io/projected/30cac2f1-8f76-42a8-a5dd-e828ed1d7430-kube-api-access-2d96m\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-h7mbm\" (UID: \"30cac2f1-8f76-42a8-a5dd-e828ed1d7430\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-h7mbm" Mar 09 09:36:58 crc kubenswrapper[4792]: I0309 09:36:58.293581 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/30cac2f1-8f76-42a8-a5dd-e828ed1d7430-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-h7mbm\" (UID: \"30cac2f1-8f76-42a8-a5dd-e828ed1d7430\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-h7mbm" Mar 09 09:36:58 crc kubenswrapper[4792]: I0309 09:36:58.293703 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2d96m\" (UniqueName: \"kubernetes.io/projected/30cac2f1-8f76-42a8-a5dd-e828ed1d7430-kube-api-access-2d96m\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-h7mbm\" (UID: \"30cac2f1-8f76-42a8-a5dd-e828ed1d7430\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-h7mbm" Mar 09 09:36:58 crc kubenswrapper[4792]: I0309 09:36:58.293760 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/30cac2f1-8f76-42a8-a5dd-e828ed1d7430-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-h7mbm\" (UID: \"30cac2f1-8f76-42a8-a5dd-e828ed1d7430\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-h7mbm" Mar 09 09:36:58 crc kubenswrapper[4792]: I0309 09:36:58.323033 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/30cac2f1-8f76-42a8-a5dd-e828ed1d7430-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-h7mbm\" (UID: \"30cac2f1-8f76-42a8-a5dd-e828ed1d7430\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-h7mbm" Mar 09 09:36:58 crc kubenswrapper[4792]: I0309 09:36:58.323642 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/30cac2f1-8f76-42a8-a5dd-e828ed1d7430-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-h7mbm\" (UID: \"30cac2f1-8f76-42a8-a5dd-e828ed1d7430\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-h7mbm" Mar 09 09:36:58 crc kubenswrapper[4792]: I0309 09:36:58.340388 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2d96m\" (UniqueName: \"kubernetes.io/projected/30cac2f1-8f76-42a8-a5dd-e828ed1d7430-kube-api-access-2d96m\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-h7mbm\" (UID: \"30cac2f1-8f76-42a8-a5dd-e828ed1d7430\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-h7mbm" Mar 09 09:36:58 crc kubenswrapper[4792]: I0309 09:36:58.417649 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-h7mbm" Mar 09 09:36:59 crc kubenswrapper[4792]: I0309 09:36:59.095180 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-h7mbm"] Mar 09 09:36:59 crc kubenswrapper[4792]: I0309 09:36:59.103383 4792 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 09 09:37:00 crc kubenswrapper[4792]: I0309 09:37:00.013276 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-h7mbm" event={"ID":"30cac2f1-8f76-42a8-a5dd-e828ed1d7430","Type":"ContainerStarted","Data":"2765bf8e7416ce9a5c4bc8417a0fd5123045d8266916cf4788f20e8d189e18c7"} Mar 09 09:37:00 crc kubenswrapper[4792]: I0309 09:37:00.013323 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-h7mbm" event={"ID":"30cac2f1-8f76-42a8-a5dd-e828ed1d7430","Type":"ContainerStarted","Data":"52b078be12db7451e5bbbd0a694df3c73c31ce0439c4ff990be98625b00a0f7b"} Mar 09 09:37:00 crc kubenswrapper[4792]: I0309 09:37:00.032534 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-h7mbm" podStartSLOduration=1.571495973 podStartE2EDuration="2.032512994s" podCreationTimestamp="2026-03-09 09:36:58 +0000 UTC" firstStartedPulling="2026-03-09 09:36:59.103117881 +0000 UTC m=+1784.133318633" lastFinishedPulling="2026-03-09 09:36:59.564134892 +0000 UTC m=+1784.594335654" observedRunningTime="2026-03-09 09:37:00.026619314 +0000 UTC m=+1785.056820086" watchObservedRunningTime="2026-03-09 09:37:00.032512994 +0000 UTC m=+1785.062713746" Mar 09 09:37:02 crc kubenswrapper[4792]: I0309 09:37:02.662745 4792 scope.go:117] "RemoveContainer" containerID="f7dc0a0360eecc4b3d3577ec00ef1a00d7c8c9a5f8b5910ab6ece3541b62af92" Mar 09 09:37:02 crc kubenswrapper[4792]: E0309 
09:37:02.663480 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-97tth_openshift-machine-config-operator(bd11045a-d746-4b42-872c-8b8d1dd2d515)\"" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" Mar 09 09:37:13 crc kubenswrapper[4792]: I0309 09:37:13.034722 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-rwg59"] Mar 09 09:37:13 crc kubenswrapper[4792]: I0309 09:37:13.048918 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-rwg59"] Mar 09 09:37:13 crc kubenswrapper[4792]: I0309 09:37:13.674253 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="77fab42c-43ad-48e7-bbe0-d3698e3aea96" path="/var/lib/kubelet/pods/77fab42c-43ad-48e7-bbe0-d3698e3aea96/volumes" Mar 09 09:37:14 crc kubenswrapper[4792]: I0309 09:37:14.028904 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-fcfwq"] Mar 09 09:37:14 crc kubenswrapper[4792]: I0309 09:37:14.037127 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-k9lg4"] Mar 09 09:37:14 crc kubenswrapper[4792]: I0309 09:37:14.048780 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-fcfwq"] Mar 09 09:37:14 crc kubenswrapper[4792]: I0309 09:37:14.056184 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-k9lg4"] Mar 09 09:37:14 crc kubenswrapper[4792]: I0309 09:37:14.662456 4792 scope.go:117] "RemoveContainer" containerID="f7dc0a0360eecc4b3d3577ec00ef1a00d7c8c9a5f8b5910ab6ece3541b62af92" Mar 09 09:37:14 crc kubenswrapper[4792]: E0309 09:37:14.662651 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-97tth_openshift-machine-config-operator(bd11045a-d746-4b42-872c-8b8d1dd2d515)\"" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" Mar 09 09:37:15 crc kubenswrapper[4792]: I0309 09:37:15.673868 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="90f2a896-0e4a-4a08-b6be-16ba96a75fcf" path="/var/lib/kubelet/pods/90f2a896-0e4a-4a08-b6be-16ba96a75fcf/volumes" Mar 09 09:37:15 crc kubenswrapper[4792]: I0309 09:37:15.675214 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf438875-5be3-489e-8626-da673a088bef" path="/var/lib/kubelet/pods/bf438875-5be3-489e-8626-da673a088bef/volumes" Mar 09 09:37:25 crc kubenswrapper[4792]: I0309 09:37:25.676403 4792 scope.go:117] "RemoveContainer" containerID="f7dc0a0360eecc4b3d3577ec00ef1a00d7c8c9a5f8b5910ab6ece3541b62af92" Mar 09 09:37:25 crc kubenswrapper[4792]: E0309 09:37:25.685933 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-97tth_openshift-machine-config-operator(bd11045a-d746-4b42-872c-8b8d1dd2d515)\"" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" Mar 09 09:37:32 crc kubenswrapper[4792]: I0309 09:37:32.056259 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-xxwfp"] Mar 09 09:37:32 crc kubenswrapper[4792]: I0309 09:37:32.063411 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-xxwfp"] Mar 09 09:37:33 crc kubenswrapper[4792]: I0309 09:37:33.687965 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d0e4182c-e32b-443d-85ea-0e5737c3fd1e" 
path="/var/lib/kubelet/pods/d0e4182c-e32b-443d-85ea-0e5737c3fd1e/volumes" Mar 09 09:37:36 crc kubenswrapper[4792]: I0309 09:37:36.033388 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-gkkjx"] Mar 09 09:37:36 crc kubenswrapper[4792]: I0309 09:37:36.045022 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-gkkjx"] Mar 09 09:37:37 crc kubenswrapper[4792]: I0309 09:37:37.673505 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6f4bcabb-7f34-423b-a653-bd785eba0978" path="/var/lib/kubelet/pods/6f4bcabb-7f34-423b-a653-bd785eba0978/volumes" Mar 09 09:37:37 crc kubenswrapper[4792]: I0309 09:37:37.700357 4792 scope.go:117] "RemoveContainer" containerID="5808485cd53cce072aa9d3ac5b5d61bc5aa0dd2076fbbbf7beccf8213d4211c2" Mar 09 09:37:37 crc kubenswrapper[4792]: I0309 09:37:37.729518 4792 scope.go:117] "RemoveContainer" containerID="61a61a611bc9aa8ff5b2a1f63fad4a53679b2714420f1ba9bcab5a31648848a4" Mar 09 09:37:37 crc kubenswrapper[4792]: I0309 09:37:37.793332 4792 scope.go:117] "RemoveContainer" containerID="a1179b82779e6892111d8b44fd02a95030fabdd17e9af21a4abe98e7fa23b850" Mar 09 09:37:37 crc kubenswrapper[4792]: I0309 09:37:37.838861 4792 scope.go:117] "RemoveContainer" containerID="6a30ed979dd2c1330f83e0f87a1fb178c826aabe2ece0d8af575e24bfbe65594" Mar 09 09:37:37 crc kubenswrapper[4792]: I0309 09:37:37.884627 4792 scope.go:117] "RemoveContainer" containerID="b6991cc23ca69c37a721663b4fc8f9dbe832d06c036627daa7ea233a1170dd0c" Mar 09 09:37:37 crc kubenswrapper[4792]: I0309 09:37:37.915016 4792 scope.go:117] "RemoveContainer" containerID="77d22be3d55c6ddd33e6d3b099584d93b83a1675c5c974830e1becbc6abb866e" Mar 09 09:37:38 crc kubenswrapper[4792]: I0309 09:37:38.662343 4792 scope.go:117] "RemoveContainer" containerID="f7dc0a0360eecc4b3d3577ec00ef1a00d7c8c9a5f8b5910ab6ece3541b62af92" Mar 09 09:37:38 crc kubenswrapper[4792]: E0309 09:37:38.662543 4792 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-97tth_openshift-machine-config-operator(bd11045a-d746-4b42-872c-8b8d1dd2d515)\"" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" Mar 09 09:37:41 crc kubenswrapper[4792]: I0309 09:37:41.380417 4792 generic.go:334] "Generic (PLEG): container finished" podID="30cac2f1-8f76-42a8-a5dd-e828ed1d7430" containerID="2765bf8e7416ce9a5c4bc8417a0fd5123045d8266916cf4788f20e8d189e18c7" exitCode=0 Mar 09 09:37:41 crc kubenswrapper[4792]: I0309 09:37:41.380598 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-h7mbm" event={"ID":"30cac2f1-8f76-42a8-a5dd-e828ed1d7430","Type":"ContainerDied","Data":"2765bf8e7416ce9a5c4bc8417a0fd5123045d8266916cf4788f20e8d189e18c7"} Mar 09 09:37:42 crc kubenswrapper[4792]: I0309 09:37:42.847575 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-h7mbm" Mar 09 09:37:42 crc kubenswrapper[4792]: I0309 09:37:42.999020 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d96m\" (UniqueName: \"kubernetes.io/projected/30cac2f1-8f76-42a8-a5dd-e828ed1d7430-kube-api-access-2d96m\") pod \"30cac2f1-8f76-42a8-a5dd-e828ed1d7430\" (UID: \"30cac2f1-8f76-42a8-a5dd-e828ed1d7430\") " Mar 09 09:37:42 crc kubenswrapper[4792]: I0309 09:37:42.999220 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/30cac2f1-8f76-42a8-a5dd-e828ed1d7430-ssh-key-openstack-edpm-ipam\") pod \"30cac2f1-8f76-42a8-a5dd-e828ed1d7430\" (UID: \"30cac2f1-8f76-42a8-a5dd-e828ed1d7430\") " Mar 09 09:37:42 crc kubenswrapper[4792]: I0309 09:37:42.999258 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/30cac2f1-8f76-42a8-a5dd-e828ed1d7430-inventory\") pod \"30cac2f1-8f76-42a8-a5dd-e828ed1d7430\" (UID: \"30cac2f1-8f76-42a8-a5dd-e828ed1d7430\") " Mar 09 09:37:43 crc kubenswrapper[4792]: I0309 09:37:43.013347 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30cac2f1-8f76-42a8-a5dd-e828ed1d7430-kube-api-access-2d96m" (OuterVolumeSpecName: "kube-api-access-2d96m") pod "30cac2f1-8f76-42a8-a5dd-e828ed1d7430" (UID: "30cac2f1-8f76-42a8-a5dd-e828ed1d7430"). InnerVolumeSpecName "kube-api-access-2d96m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:37:43 crc kubenswrapper[4792]: I0309 09:37:43.022642 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30cac2f1-8f76-42a8-a5dd-e828ed1d7430-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "30cac2f1-8f76-42a8-a5dd-e828ed1d7430" (UID: "30cac2f1-8f76-42a8-a5dd-e828ed1d7430"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:37:43 crc kubenswrapper[4792]: I0309 09:37:43.037841 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30cac2f1-8f76-42a8-a5dd-e828ed1d7430-inventory" (OuterVolumeSpecName: "inventory") pod "30cac2f1-8f76-42a8-a5dd-e828ed1d7430" (UID: "30cac2f1-8f76-42a8-a5dd-e828ed1d7430"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:37:43 crc kubenswrapper[4792]: I0309 09:37:43.100923 4792 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/30cac2f1-8f76-42a8-a5dd-e828ed1d7430-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 09 09:37:43 crc kubenswrapper[4792]: I0309 09:37:43.101305 4792 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/30cac2f1-8f76-42a8-a5dd-e828ed1d7430-inventory\") on node \"crc\" DevicePath \"\"" Mar 09 09:37:43 crc kubenswrapper[4792]: I0309 09:37:43.101410 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d96m\" (UniqueName: \"kubernetes.io/projected/30cac2f1-8f76-42a8-a5dd-e828ed1d7430-kube-api-access-2d96m\") on node \"crc\" DevicePath \"\"" Mar 09 09:37:43 crc kubenswrapper[4792]: I0309 09:37:43.398565 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-h7mbm" 
event={"ID":"30cac2f1-8f76-42a8-a5dd-e828ed1d7430","Type":"ContainerDied","Data":"52b078be12db7451e5bbbd0a694df3c73c31ce0439c4ff990be98625b00a0f7b"} Mar 09 09:37:43 crc kubenswrapper[4792]: I0309 09:37:43.398826 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="52b078be12db7451e5bbbd0a694df3c73c31ce0439c4ff990be98625b00a0f7b" Mar 09 09:37:43 crc kubenswrapper[4792]: I0309 09:37:43.398605 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-h7mbm" Mar 09 09:37:43 crc kubenswrapper[4792]: I0309 09:37:43.487401 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-2tj6r"] Mar 09 09:37:43 crc kubenswrapper[4792]: E0309 09:37:43.487762 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30cac2f1-8f76-42a8-a5dd-e828ed1d7430" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Mar 09 09:37:43 crc kubenswrapper[4792]: I0309 09:37:43.487785 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="30cac2f1-8f76-42a8-a5dd-e828ed1d7430" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Mar 09 09:37:43 crc kubenswrapper[4792]: I0309 09:37:43.487977 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="30cac2f1-8f76-42a8-a5dd-e828ed1d7430" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Mar 09 09:37:43 crc kubenswrapper[4792]: I0309 09:37:43.488563 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-2tj6r" Mar 09 09:37:43 crc kubenswrapper[4792]: I0309 09:37:43.494135 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-4g5l6" Mar 09 09:37:43 crc kubenswrapper[4792]: I0309 09:37:43.494144 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 09 09:37:43 crc kubenswrapper[4792]: I0309 09:37:43.494522 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 09 09:37:43 crc kubenswrapper[4792]: I0309 09:37:43.500114 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 09 09:37:43 crc kubenswrapper[4792]: I0309 09:37:43.521252 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-2tj6r"] Mar 09 09:37:43 crc kubenswrapper[4792]: I0309 09:37:43.609198 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9659141d-9a06-4963-b1ec-5982b06ade1c-ssh-key-openstack-edpm-ipam\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-2tj6r\" (UID: \"9659141d-9a06-4963-b1ec-5982b06ade1c\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-2tj6r" Mar 09 09:37:43 crc kubenswrapper[4792]: I0309 09:37:43.609302 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ph6kj\" (UniqueName: \"kubernetes.io/projected/9659141d-9a06-4963-b1ec-5982b06ade1c-kube-api-access-ph6kj\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-2tj6r\" (UID: \"9659141d-9a06-4963-b1ec-5982b06ade1c\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-2tj6r" Mar 09 09:37:43 crc 
kubenswrapper[4792]: I0309 09:37:43.609508 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9659141d-9a06-4963-b1ec-5982b06ade1c-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-2tj6r\" (UID: \"9659141d-9a06-4963-b1ec-5982b06ade1c\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-2tj6r" Mar 09 09:37:43 crc kubenswrapper[4792]: I0309 09:37:43.711216 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ph6kj\" (UniqueName: \"kubernetes.io/projected/9659141d-9a06-4963-b1ec-5982b06ade1c-kube-api-access-ph6kj\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-2tj6r\" (UID: \"9659141d-9a06-4963-b1ec-5982b06ade1c\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-2tj6r" Mar 09 09:37:43 crc kubenswrapper[4792]: I0309 09:37:43.711347 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9659141d-9a06-4963-b1ec-5982b06ade1c-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-2tj6r\" (UID: \"9659141d-9a06-4963-b1ec-5982b06ade1c\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-2tj6r" Mar 09 09:37:43 crc kubenswrapper[4792]: I0309 09:37:43.711464 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9659141d-9a06-4963-b1ec-5982b06ade1c-ssh-key-openstack-edpm-ipam\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-2tj6r\" (UID: \"9659141d-9a06-4963-b1ec-5982b06ade1c\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-2tj6r" Mar 09 09:37:43 crc kubenswrapper[4792]: I0309 09:37:43.717926 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/9659141d-9a06-4963-b1ec-5982b06ade1c-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-2tj6r\" (UID: \"9659141d-9a06-4963-b1ec-5982b06ade1c\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-2tj6r" Mar 09 09:37:43 crc kubenswrapper[4792]: I0309 09:37:43.718313 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9659141d-9a06-4963-b1ec-5982b06ade1c-ssh-key-openstack-edpm-ipam\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-2tj6r\" (UID: \"9659141d-9a06-4963-b1ec-5982b06ade1c\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-2tj6r" Mar 09 09:37:43 crc kubenswrapper[4792]: I0309 09:37:43.730307 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ph6kj\" (UniqueName: \"kubernetes.io/projected/9659141d-9a06-4963-b1ec-5982b06ade1c-kube-api-access-ph6kj\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-2tj6r\" (UID: \"9659141d-9a06-4963-b1ec-5982b06ade1c\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-2tj6r" Mar 09 09:37:43 crc kubenswrapper[4792]: I0309 09:37:43.814564 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-2tj6r" Mar 09 09:37:44 crc kubenswrapper[4792]: I0309 09:37:44.311131 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-2tj6r"] Mar 09 09:37:44 crc kubenswrapper[4792]: I0309 09:37:44.407424 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-2tj6r" event={"ID":"9659141d-9a06-4963-b1ec-5982b06ade1c","Type":"ContainerStarted","Data":"1330bf7c653409d8869b82b36e6f511d35b1816be0068187e8643078f5a3440d"} Mar 09 09:37:45 crc kubenswrapper[4792]: I0309 09:37:45.416289 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-2tj6r" event={"ID":"9659141d-9a06-4963-b1ec-5982b06ade1c","Type":"ContainerStarted","Data":"f9b8075982af5100f219dd254519620c93554beab3eda315267691126a6c860e"} Mar 09 09:37:45 crc kubenswrapper[4792]: I0309 09:37:45.433539 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-2tj6r" podStartSLOduration=2.009905378 podStartE2EDuration="2.433523001s" podCreationTimestamp="2026-03-09 09:37:43 +0000 UTC" firstStartedPulling="2026-03-09 09:37:44.326196579 +0000 UTC m=+1829.356397331" lastFinishedPulling="2026-03-09 09:37:44.749814202 +0000 UTC m=+1829.780014954" observedRunningTime="2026-03-09 09:37:45.4282799 +0000 UTC m=+1830.458480652" watchObservedRunningTime="2026-03-09 09:37:45.433523001 +0000 UTC m=+1830.463723753" Mar 09 09:37:49 crc kubenswrapper[4792]: I0309 09:37:49.446121 4792 generic.go:334] "Generic (PLEG): container finished" podID="9659141d-9a06-4963-b1ec-5982b06ade1c" containerID="f9b8075982af5100f219dd254519620c93554beab3eda315267691126a6c860e" exitCode=0 Mar 09 09:37:49 crc kubenswrapper[4792]: I0309 09:37:49.446514 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-2tj6r" event={"ID":"9659141d-9a06-4963-b1ec-5982b06ade1c","Type":"ContainerDied","Data":"f9b8075982af5100f219dd254519620c93554beab3eda315267691126a6c860e"} Mar 09 09:37:50 crc kubenswrapper[4792]: I0309 09:37:50.813039 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-2tj6r" Mar 09 09:37:50 crc kubenswrapper[4792]: I0309 09:37:50.943019 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9659141d-9a06-4963-b1ec-5982b06ade1c-ssh-key-openstack-edpm-ipam\") pod \"9659141d-9a06-4963-b1ec-5982b06ade1c\" (UID: \"9659141d-9a06-4963-b1ec-5982b06ade1c\") " Mar 09 09:37:50 crc kubenswrapper[4792]: I0309 09:37:50.943236 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ph6kj\" (UniqueName: \"kubernetes.io/projected/9659141d-9a06-4963-b1ec-5982b06ade1c-kube-api-access-ph6kj\") pod \"9659141d-9a06-4963-b1ec-5982b06ade1c\" (UID: \"9659141d-9a06-4963-b1ec-5982b06ade1c\") " Mar 09 09:37:50 crc kubenswrapper[4792]: I0309 09:37:50.943409 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9659141d-9a06-4963-b1ec-5982b06ade1c-inventory\") pod \"9659141d-9a06-4963-b1ec-5982b06ade1c\" (UID: \"9659141d-9a06-4963-b1ec-5982b06ade1c\") " Mar 09 09:37:50 crc kubenswrapper[4792]: I0309 09:37:50.948261 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9659141d-9a06-4963-b1ec-5982b06ade1c-kube-api-access-ph6kj" (OuterVolumeSpecName: "kube-api-access-ph6kj") pod "9659141d-9a06-4963-b1ec-5982b06ade1c" (UID: "9659141d-9a06-4963-b1ec-5982b06ade1c"). InnerVolumeSpecName "kube-api-access-ph6kj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:37:50 crc kubenswrapper[4792]: I0309 09:37:50.967801 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9659141d-9a06-4963-b1ec-5982b06ade1c-inventory" (OuterVolumeSpecName: "inventory") pod "9659141d-9a06-4963-b1ec-5982b06ade1c" (UID: "9659141d-9a06-4963-b1ec-5982b06ade1c"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:37:50 crc kubenswrapper[4792]: I0309 09:37:50.968235 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9659141d-9a06-4963-b1ec-5982b06ade1c-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "9659141d-9a06-4963-b1ec-5982b06ade1c" (UID: "9659141d-9a06-4963-b1ec-5982b06ade1c"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:37:51 crc kubenswrapper[4792]: I0309 09:37:51.047349 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ph6kj\" (UniqueName: \"kubernetes.io/projected/9659141d-9a06-4963-b1ec-5982b06ade1c-kube-api-access-ph6kj\") on node \"crc\" DevicePath \"\"" Mar 09 09:37:51 crc kubenswrapper[4792]: I0309 09:37:51.047471 4792 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9659141d-9a06-4963-b1ec-5982b06ade1c-inventory\") on node \"crc\" DevicePath \"\"" Mar 09 09:37:51 crc kubenswrapper[4792]: I0309 09:37:51.047488 4792 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9659141d-9a06-4963-b1ec-5982b06ade1c-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 09 09:37:51 crc kubenswrapper[4792]: I0309 09:37:51.461669 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-2tj6r" 
event={"ID":"9659141d-9a06-4963-b1ec-5982b06ade1c","Type":"ContainerDied","Data":"1330bf7c653409d8869b82b36e6f511d35b1816be0068187e8643078f5a3440d"} Mar 09 09:37:51 crc kubenswrapper[4792]: I0309 09:37:51.461933 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1330bf7c653409d8869b82b36e6f511d35b1816be0068187e8643078f5a3440d" Mar 09 09:37:51 crc kubenswrapper[4792]: I0309 09:37:51.461738 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-2tj6r" Mar 09 09:37:51 crc kubenswrapper[4792]: I0309 09:37:51.532020 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vqt6h"] Mar 09 09:37:51 crc kubenswrapper[4792]: E0309 09:37:51.532381 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9659141d-9a06-4963-b1ec-5982b06ade1c" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Mar 09 09:37:51 crc kubenswrapper[4792]: I0309 09:37:51.532399 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="9659141d-9a06-4963-b1ec-5982b06ade1c" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Mar 09 09:37:51 crc kubenswrapper[4792]: I0309 09:37:51.532569 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="9659141d-9a06-4963-b1ec-5982b06ade1c" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Mar 09 09:37:51 crc kubenswrapper[4792]: I0309 09:37:51.533099 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vqt6h" Mar 09 09:37:51 crc kubenswrapper[4792]: I0309 09:37:51.534641 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-4g5l6" Mar 09 09:37:51 crc kubenswrapper[4792]: I0309 09:37:51.534818 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 09 09:37:51 crc kubenswrapper[4792]: I0309 09:37:51.535467 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 09 09:37:51 crc kubenswrapper[4792]: I0309 09:37:51.540528 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 09 09:37:51 crc kubenswrapper[4792]: I0309 09:37:51.551058 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vqt6h"] Mar 09 09:37:51 crc kubenswrapper[4792]: I0309 09:37:51.560409 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c745abb2-062a-4a6f-9360-e06bf80639d0-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-vqt6h\" (UID: \"c745abb2-062a-4a6f-9360-e06bf80639d0\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vqt6h" Mar 09 09:37:51 crc kubenswrapper[4792]: I0309 09:37:51.560465 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2ppk7\" (UniqueName: \"kubernetes.io/projected/c745abb2-062a-4a6f-9360-e06bf80639d0-kube-api-access-2ppk7\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-vqt6h\" (UID: \"c745abb2-062a-4a6f-9360-e06bf80639d0\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vqt6h" Mar 09 09:37:51 crc kubenswrapper[4792]: I0309 09:37:51.560538 4792 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c745abb2-062a-4a6f-9360-e06bf80639d0-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-vqt6h\" (UID: \"c745abb2-062a-4a6f-9360-e06bf80639d0\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vqt6h" Mar 09 09:37:51 crc kubenswrapper[4792]: I0309 09:37:51.662140 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c745abb2-062a-4a6f-9360-e06bf80639d0-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-vqt6h\" (UID: \"c745abb2-062a-4a6f-9360-e06bf80639d0\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vqt6h" Mar 09 09:37:51 crc kubenswrapper[4792]: I0309 09:37:51.662234 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2ppk7\" (UniqueName: \"kubernetes.io/projected/c745abb2-062a-4a6f-9360-e06bf80639d0-kube-api-access-2ppk7\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-vqt6h\" (UID: \"c745abb2-062a-4a6f-9360-e06bf80639d0\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vqt6h" Mar 09 09:37:51 crc kubenswrapper[4792]: I0309 09:37:51.662315 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c745abb2-062a-4a6f-9360-e06bf80639d0-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-vqt6h\" (UID: \"c745abb2-062a-4a6f-9360-e06bf80639d0\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vqt6h" Mar 09 09:37:51 crc kubenswrapper[4792]: I0309 09:37:51.668451 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c745abb2-062a-4a6f-9360-e06bf80639d0-inventory\") 
pod \"configure-os-edpm-deployment-openstack-edpm-ipam-vqt6h\" (UID: \"c745abb2-062a-4a6f-9360-e06bf80639d0\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vqt6h" Mar 09 09:37:51 crc kubenswrapper[4792]: I0309 09:37:51.675861 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c745abb2-062a-4a6f-9360-e06bf80639d0-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-vqt6h\" (UID: \"c745abb2-062a-4a6f-9360-e06bf80639d0\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vqt6h" Mar 09 09:37:51 crc kubenswrapper[4792]: I0309 09:37:51.683343 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2ppk7\" (UniqueName: \"kubernetes.io/projected/c745abb2-062a-4a6f-9360-e06bf80639d0-kube-api-access-2ppk7\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-vqt6h\" (UID: \"c745abb2-062a-4a6f-9360-e06bf80639d0\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vqt6h" Mar 09 09:37:51 crc kubenswrapper[4792]: I0309 09:37:51.859674 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vqt6h" Mar 09 09:37:52 crc kubenswrapper[4792]: I0309 09:37:52.353935 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vqt6h"] Mar 09 09:37:52 crc kubenswrapper[4792]: I0309 09:37:52.469514 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vqt6h" event={"ID":"c745abb2-062a-4a6f-9360-e06bf80639d0","Type":"ContainerStarted","Data":"34640203c454a7c2b37141cf94800569fe0d1091c9bd840e826f94f6906639e4"} Mar 09 09:37:53 crc kubenswrapper[4792]: I0309 09:37:53.478123 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vqt6h" event={"ID":"c745abb2-062a-4a6f-9360-e06bf80639d0","Type":"ContainerStarted","Data":"7ef4eff7b492037a81e29ce5a84bb0e36f8ba627b5bc92b5d5936075c95f43ef"} Mar 09 09:37:53 crc kubenswrapper[4792]: I0309 09:37:53.493273 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vqt6h" podStartSLOduration=2.063706832 podStartE2EDuration="2.493254625s" podCreationTimestamp="2026-03-09 09:37:51 +0000 UTC" firstStartedPulling="2026-03-09 09:37:52.37122355 +0000 UTC m=+1837.401424302" lastFinishedPulling="2026-03-09 09:37:52.800771333 +0000 UTC m=+1837.830972095" observedRunningTime="2026-03-09 09:37:53.49169092 +0000 UTC m=+1838.521891672" watchObservedRunningTime="2026-03-09 09:37:53.493254625 +0000 UTC m=+1838.523455377" Mar 09 09:37:53 crc kubenswrapper[4792]: I0309 09:37:53.663433 4792 scope.go:117] "RemoveContainer" containerID="f7dc0a0360eecc4b3d3577ec00ef1a00d7c8c9a5f8b5910ab6ece3541b62af92" Mar 09 09:37:53 crc kubenswrapper[4792]: E0309 09:37:53.663672 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-97tth_openshift-machine-config-operator(bd11045a-d746-4b42-872c-8b8d1dd2d515)\"" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" Mar 09 09:38:00 crc kubenswrapper[4792]: I0309 09:38:00.129160 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550818-jtvhj"] Mar 09 09:38:00 crc kubenswrapper[4792]: I0309 09:38:00.130802 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550818-jtvhj" Mar 09 09:38:00 crc kubenswrapper[4792]: I0309 09:38:00.134316 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fwclj" Mar 09 09:38:00 crc kubenswrapper[4792]: I0309 09:38:00.134877 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 09:38:00 crc kubenswrapper[4792]: I0309 09:38:00.141736 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 09:38:00 crc kubenswrapper[4792]: I0309 09:38:00.146780 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550818-jtvhj"] Mar 09 09:38:00 crc kubenswrapper[4792]: I0309 09:38:00.220008 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bmmbd\" (UniqueName: \"kubernetes.io/projected/42fd3efa-7e1b-4ff6-a3c6-9947f805675c-kube-api-access-bmmbd\") pod \"auto-csr-approver-29550818-jtvhj\" (UID: \"42fd3efa-7e1b-4ff6-a3c6-9947f805675c\") " pod="openshift-infra/auto-csr-approver-29550818-jtvhj" Mar 09 09:38:00 crc kubenswrapper[4792]: I0309 09:38:00.321994 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bmmbd\" (UniqueName: 
\"kubernetes.io/projected/42fd3efa-7e1b-4ff6-a3c6-9947f805675c-kube-api-access-bmmbd\") pod \"auto-csr-approver-29550818-jtvhj\" (UID: \"42fd3efa-7e1b-4ff6-a3c6-9947f805675c\") " pod="openshift-infra/auto-csr-approver-29550818-jtvhj" Mar 09 09:38:00 crc kubenswrapper[4792]: I0309 09:38:00.347483 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bmmbd\" (UniqueName: \"kubernetes.io/projected/42fd3efa-7e1b-4ff6-a3c6-9947f805675c-kube-api-access-bmmbd\") pod \"auto-csr-approver-29550818-jtvhj\" (UID: \"42fd3efa-7e1b-4ff6-a3c6-9947f805675c\") " pod="openshift-infra/auto-csr-approver-29550818-jtvhj" Mar 09 09:38:00 crc kubenswrapper[4792]: I0309 09:38:00.448881 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550818-jtvhj" Mar 09 09:38:00 crc kubenswrapper[4792]: I0309 09:38:00.880424 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550818-jtvhj"] Mar 09 09:38:01 crc kubenswrapper[4792]: I0309 09:38:01.546369 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550818-jtvhj" event={"ID":"42fd3efa-7e1b-4ff6-a3c6-9947f805675c","Type":"ContainerStarted","Data":"34ce6e23153824550b44480f3adc2866c6860f22fb2fae0b12ed7ae31503b882"} Mar 09 09:38:02 crc kubenswrapper[4792]: I0309 09:38:02.560970 4792 generic.go:334] "Generic (PLEG): container finished" podID="42fd3efa-7e1b-4ff6-a3c6-9947f805675c" containerID="bb5c7c41e531ab9fb4118589739801dfee310d40a7851a9f88de1b9d86798ab4" exitCode=0 Mar 09 09:38:02 crc kubenswrapper[4792]: I0309 09:38:02.561248 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550818-jtvhj" event={"ID":"42fd3efa-7e1b-4ff6-a3c6-9947f805675c","Type":"ContainerDied","Data":"bb5c7c41e531ab9fb4118589739801dfee310d40a7851a9f88de1b9d86798ab4"} Mar 09 09:38:03 crc kubenswrapper[4792]: I0309 09:38:03.827458 4792 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550818-jtvhj" Mar 09 09:38:03 crc kubenswrapper[4792]: I0309 09:38:03.989770 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bmmbd\" (UniqueName: \"kubernetes.io/projected/42fd3efa-7e1b-4ff6-a3c6-9947f805675c-kube-api-access-bmmbd\") pod \"42fd3efa-7e1b-4ff6-a3c6-9947f805675c\" (UID: \"42fd3efa-7e1b-4ff6-a3c6-9947f805675c\") " Mar 09 09:38:04 crc kubenswrapper[4792]: I0309 09:38:04.000917 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42fd3efa-7e1b-4ff6-a3c6-9947f805675c-kube-api-access-bmmbd" (OuterVolumeSpecName: "kube-api-access-bmmbd") pod "42fd3efa-7e1b-4ff6-a3c6-9947f805675c" (UID: "42fd3efa-7e1b-4ff6-a3c6-9947f805675c"). InnerVolumeSpecName "kube-api-access-bmmbd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:38:04 crc kubenswrapper[4792]: I0309 09:38:04.091742 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bmmbd\" (UniqueName: \"kubernetes.io/projected/42fd3efa-7e1b-4ff6-a3c6-9947f805675c-kube-api-access-bmmbd\") on node \"crc\" DevicePath \"\"" Mar 09 09:38:04 crc kubenswrapper[4792]: I0309 09:38:04.577589 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550818-jtvhj" event={"ID":"42fd3efa-7e1b-4ff6-a3c6-9947f805675c","Type":"ContainerDied","Data":"34ce6e23153824550b44480f3adc2866c6860f22fb2fae0b12ed7ae31503b882"} Mar 09 09:38:04 crc kubenswrapper[4792]: I0309 09:38:04.577639 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="34ce6e23153824550b44480f3adc2866c6860f22fb2fae0b12ed7ae31503b882" Mar 09 09:38:04 crc kubenswrapper[4792]: I0309 09:38:04.577646 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550818-jtvhj" Mar 09 09:38:04 crc kubenswrapper[4792]: I0309 09:38:04.895487 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550812-9npxv"] Mar 09 09:38:04 crc kubenswrapper[4792]: I0309 09:38:04.903197 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550812-9npxv"] Mar 09 09:38:05 crc kubenswrapper[4792]: I0309 09:38:05.679526 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="38252055-92e3-4cf5-8f1a-78ad77a78f62" path="/var/lib/kubelet/pods/38252055-92e3-4cf5-8f1a-78ad77a78f62/volumes" Mar 09 09:38:06 crc kubenswrapper[4792]: I0309 09:38:06.663232 4792 scope.go:117] "RemoveContainer" containerID="f7dc0a0360eecc4b3d3577ec00ef1a00d7c8c9a5f8b5910ab6ece3541b62af92" Mar 09 09:38:06 crc kubenswrapper[4792]: E0309 09:38:06.664515 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-97tth_openshift-machine-config-operator(bd11045a-d746-4b42-872c-8b8d1dd2d515)\"" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" Mar 09 09:38:15 crc kubenswrapper[4792]: I0309 09:38:15.035728 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-4584-account-create-update-8wfnh"] Mar 09 09:38:15 crc kubenswrapper[4792]: I0309 09:38:15.042548 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-zrzlq"] Mar 09 09:38:15 crc kubenswrapper[4792]: I0309 09:38:15.049565 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-j85dq"] Mar 09 09:38:15 crc kubenswrapper[4792]: I0309 09:38:15.059608 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/nova-cell1-4584-account-create-update-8wfnh"] Mar 09 09:38:15 crc kubenswrapper[4792]: I0309 09:38:15.073466 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-zrzlq"] Mar 09 09:38:15 crc kubenswrapper[4792]: I0309 09:38:15.081744 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-j85dq"] Mar 09 09:38:15 crc kubenswrapper[4792]: I0309 09:38:15.692251 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ce1d921-17fb-427e-bc2b-9df3487b0e5b" path="/var/lib/kubelet/pods/3ce1d921-17fb-427e-bc2b-9df3487b0e5b/volumes" Mar 09 09:38:15 crc kubenswrapper[4792]: I0309 09:38:15.693887 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d28e5b5b-e371-4a9d-8725-017aa98ac944" path="/var/lib/kubelet/pods/d28e5b5b-e371-4a9d-8725-017aa98ac944/volumes" Mar 09 09:38:15 crc kubenswrapper[4792]: I0309 09:38:15.694951 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e6bda2b4-31d6-4059-a494-330f9fa2c1f9" path="/var/lib/kubelet/pods/e6bda2b4-31d6-4059-a494-330f9fa2c1f9/volumes" Mar 09 09:38:16 crc kubenswrapper[4792]: I0309 09:38:16.042342 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-fba7-account-create-update-lf5dc"] Mar 09 09:38:16 crc kubenswrapper[4792]: I0309 09:38:16.056881 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-8785d"] Mar 09 09:38:16 crc kubenswrapper[4792]: I0309 09:38:16.064537 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-5698-account-create-update-twzbh"] Mar 09 09:38:16 crc kubenswrapper[4792]: I0309 09:38:16.071844 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-fba7-account-create-update-lf5dc"] Mar 09 09:38:16 crc kubenswrapper[4792]: I0309 09:38:16.080052 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-8785d"] Mar 
09 09:38:16 crc kubenswrapper[4792]: I0309 09:38:16.087890 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-5698-account-create-update-twzbh"] Mar 09 09:38:17 crc kubenswrapper[4792]: I0309 09:38:17.677733 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1927dc12-81aa-463b-b356-3784c72f3245" path="/var/lib/kubelet/pods/1927dc12-81aa-463b-b356-3784c72f3245/volumes" Mar 09 09:38:17 crc kubenswrapper[4792]: I0309 09:38:17.679140 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="68422fad-6d4a-4d8a-ac66-b68dd7486525" path="/var/lib/kubelet/pods/68422fad-6d4a-4d8a-ac66-b68dd7486525/volumes" Mar 09 09:38:17 crc kubenswrapper[4792]: I0309 09:38:17.679865 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e5d6bee9-1576-4852-a0ca-f4da5d2930f9" path="/var/lib/kubelet/pods/e5d6bee9-1576-4852-a0ca-f4da5d2930f9/volumes" Mar 09 09:38:20 crc kubenswrapper[4792]: I0309 09:38:20.662764 4792 scope.go:117] "RemoveContainer" containerID="f7dc0a0360eecc4b3d3577ec00ef1a00d7c8c9a5f8b5910ab6ece3541b62af92" Mar 09 09:38:20 crc kubenswrapper[4792]: E0309 09:38:20.663415 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-97tth_openshift-machine-config-operator(bd11045a-d746-4b42-872c-8b8d1dd2d515)\"" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" Mar 09 09:38:33 crc kubenswrapper[4792]: I0309 09:38:33.662801 4792 scope.go:117] "RemoveContainer" containerID="f7dc0a0360eecc4b3d3577ec00ef1a00d7c8c9a5f8b5910ab6ece3541b62af92" Mar 09 09:38:33 crc kubenswrapper[4792]: E0309 09:38:33.663636 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-97tth_openshift-machine-config-operator(bd11045a-d746-4b42-872c-8b8d1dd2d515)\"" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" Mar 09 09:38:38 crc kubenswrapper[4792]: I0309 09:38:38.054427 4792 scope.go:117] "RemoveContainer" containerID="1d9d918c338f3ee7fa4a3ec993e09b177e706742352483714152175b9bce5112" Mar 09 09:38:38 crc kubenswrapper[4792]: I0309 09:38:38.086042 4792 scope.go:117] "RemoveContainer" containerID="3ceba6ca0d35b2da076cc69ac19e717443ca1b7d82eb6e577eb1dac2b2517c61" Mar 09 09:38:38 crc kubenswrapper[4792]: I0309 09:38:38.117447 4792 scope.go:117] "RemoveContainer" containerID="b76b345f35a9ad624368a0287e6fdbad3e9ae18371832947178fe555f8c84e3e" Mar 09 09:38:38 crc kubenswrapper[4792]: I0309 09:38:38.164384 4792 scope.go:117] "RemoveContainer" containerID="98444155077eba96cc75a0a872affcea3af54654aaacdc1f3df45ee6a747a600" Mar 09 09:38:38 crc kubenswrapper[4792]: I0309 09:38:38.193038 4792 scope.go:117] "RemoveContainer" containerID="b2f8ea7a4938181d7530ee3ceecb654d54cfb49874f2ba7fcdf1718cf22dc91b" Mar 09 09:38:38 crc kubenswrapper[4792]: I0309 09:38:38.237274 4792 scope.go:117] "RemoveContainer" containerID="f1d51af9d96a53944f9002089c2cff0830481349e3a0fd1845e4e9b7d1cfa5a2" Mar 09 09:38:38 crc kubenswrapper[4792]: I0309 09:38:38.272158 4792 scope.go:117] "RemoveContainer" containerID="4a87b2d0dd2216cb73b6d522b398544e2869d69b53f4b248198f871f2cfdfd8e" Mar 09 09:38:39 crc kubenswrapper[4792]: I0309 09:38:39.852135 4792 generic.go:334] "Generic (PLEG): container finished" podID="c745abb2-062a-4a6f-9360-e06bf80639d0" containerID="7ef4eff7b492037a81e29ce5a84bb0e36f8ba627b5bc92b5d5936075c95f43ef" exitCode=0 Mar 09 09:38:39 crc kubenswrapper[4792]: I0309 09:38:39.852174 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vqt6h" 
event={"ID":"c745abb2-062a-4a6f-9360-e06bf80639d0","Type":"ContainerDied","Data":"7ef4eff7b492037a81e29ce5a84bb0e36f8ba627b5bc92b5d5936075c95f43ef"} Mar 09 09:38:41 crc kubenswrapper[4792]: I0309 09:38:41.234832 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vqt6h" Mar 09 09:38:41 crc kubenswrapper[4792]: I0309 09:38:41.403751 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c745abb2-062a-4a6f-9360-e06bf80639d0-ssh-key-openstack-edpm-ipam\") pod \"c745abb2-062a-4a6f-9360-e06bf80639d0\" (UID: \"c745abb2-062a-4a6f-9360-e06bf80639d0\") " Mar 09 09:38:41 crc kubenswrapper[4792]: I0309 09:38:41.403951 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2ppk7\" (UniqueName: \"kubernetes.io/projected/c745abb2-062a-4a6f-9360-e06bf80639d0-kube-api-access-2ppk7\") pod \"c745abb2-062a-4a6f-9360-e06bf80639d0\" (UID: \"c745abb2-062a-4a6f-9360-e06bf80639d0\") " Mar 09 09:38:41 crc kubenswrapper[4792]: I0309 09:38:41.404029 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c745abb2-062a-4a6f-9360-e06bf80639d0-inventory\") pod \"c745abb2-062a-4a6f-9360-e06bf80639d0\" (UID: \"c745abb2-062a-4a6f-9360-e06bf80639d0\") " Mar 09 09:38:41 crc kubenswrapper[4792]: I0309 09:38:41.418245 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c745abb2-062a-4a6f-9360-e06bf80639d0-kube-api-access-2ppk7" (OuterVolumeSpecName: "kube-api-access-2ppk7") pod "c745abb2-062a-4a6f-9360-e06bf80639d0" (UID: "c745abb2-062a-4a6f-9360-e06bf80639d0"). InnerVolumeSpecName "kube-api-access-2ppk7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:38:41 crc kubenswrapper[4792]: I0309 09:38:41.432194 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c745abb2-062a-4a6f-9360-e06bf80639d0-inventory" (OuterVolumeSpecName: "inventory") pod "c745abb2-062a-4a6f-9360-e06bf80639d0" (UID: "c745abb2-062a-4a6f-9360-e06bf80639d0"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:38:41 crc kubenswrapper[4792]: I0309 09:38:41.442240 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c745abb2-062a-4a6f-9360-e06bf80639d0-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "c745abb2-062a-4a6f-9360-e06bf80639d0" (UID: "c745abb2-062a-4a6f-9360-e06bf80639d0"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:38:41 crc kubenswrapper[4792]: I0309 09:38:41.505681 4792 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c745abb2-062a-4a6f-9360-e06bf80639d0-inventory\") on node \"crc\" DevicePath \"\"" Mar 09 09:38:41 crc kubenswrapper[4792]: I0309 09:38:41.505715 4792 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c745abb2-062a-4a6f-9360-e06bf80639d0-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 09 09:38:41 crc kubenswrapper[4792]: I0309 09:38:41.505726 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2ppk7\" (UniqueName: \"kubernetes.io/projected/c745abb2-062a-4a6f-9360-e06bf80639d0-kube-api-access-2ppk7\") on node \"crc\" DevicePath \"\"" Mar 09 09:38:41 crc kubenswrapper[4792]: I0309 09:38:41.874566 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vqt6h" 
event={"ID":"c745abb2-062a-4a6f-9360-e06bf80639d0","Type":"ContainerDied","Data":"34640203c454a7c2b37141cf94800569fe0d1091c9bd840e826f94f6906639e4"} Mar 09 09:38:41 crc kubenswrapper[4792]: I0309 09:38:41.874604 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="34640203c454a7c2b37141cf94800569fe0d1091c9bd840e826f94f6906639e4" Mar 09 09:38:41 crc kubenswrapper[4792]: I0309 09:38:41.874661 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vqt6h" Mar 09 09:38:41 crc kubenswrapper[4792]: I0309 09:38:41.956933 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-4qk8n"] Mar 09 09:38:41 crc kubenswrapper[4792]: E0309 09:38:41.957358 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c745abb2-062a-4a6f-9360-e06bf80639d0" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Mar 09 09:38:41 crc kubenswrapper[4792]: I0309 09:38:41.957376 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="c745abb2-062a-4a6f-9360-e06bf80639d0" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Mar 09 09:38:41 crc kubenswrapper[4792]: E0309 09:38:41.957413 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42fd3efa-7e1b-4ff6-a3c6-9947f805675c" containerName="oc" Mar 09 09:38:41 crc kubenswrapper[4792]: I0309 09:38:41.957422 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="42fd3efa-7e1b-4ff6-a3c6-9947f805675c" containerName="oc" Mar 09 09:38:41 crc kubenswrapper[4792]: I0309 09:38:41.957626 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="c745abb2-062a-4a6f-9360-e06bf80639d0" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Mar 09 09:38:41 crc kubenswrapper[4792]: I0309 09:38:41.957656 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="42fd3efa-7e1b-4ff6-a3c6-9947f805675c" 
containerName="oc" Mar 09 09:38:41 crc kubenswrapper[4792]: I0309 09:38:41.958954 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-4qk8n" Mar 09 09:38:41 crc kubenswrapper[4792]: I0309 09:38:41.960861 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 09 09:38:41 crc kubenswrapper[4792]: I0309 09:38:41.961165 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-4g5l6" Mar 09 09:38:41 crc kubenswrapper[4792]: I0309 09:38:41.961415 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 09 09:38:41 crc kubenswrapper[4792]: I0309 09:38:41.961582 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 09 09:38:41 crc kubenswrapper[4792]: I0309 09:38:41.974574 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-4qk8n"] Mar 09 09:38:42 crc kubenswrapper[4792]: I0309 09:38:42.116793 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/b16ebd2e-e43b-40cb-9b63-76707c9a25d6-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-4qk8n\" (UID: \"b16ebd2e-e43b-40cb-9b63-76707c9a25d6\") " pod="openstack/ssh-known-hosts-edpm-deployment-4qk8n" Mar 09 09:38:42 crc kubenswrapper[4792]: I0309 09:38:42.117329 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b16ebd2e-e43b-40cb-9b63-76707c9a25d6-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-4qk8n\" (UID: \"b16ebd2e-e43b-40cb-9b63-76707c9a25d6\") " pod="openstack/ssh-known-hosts-edpm-deployment-4qk8n" Mar 09 09:38:42 crc 
kubenswrapper[4792]: I0309 09:38:42.117412 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7wxx5\" (UniqueName: \"kubernetes.io/projected/b16ebd2e-e43b-40cb-9b63-76707c9a25d6-kube-api-access-7wxx5\") pod \"ssh-known-hosts-edpm-deployment-4qk8n\" (UID: \"b16ebd2e-e43b-40cb-9b63-76707c9a25d6\") " pod="openstack/ssh-known-hosts-edpm-deployment-4qk8n" Mar 09 09:38:42 crc kubenswrapper[4792]: I0309 09:38:42.218885 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b16ebd2e-e43b-40cb-9b63-76707c9a25d6-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-4qk8n\" (UID: \"b16ebd2e-e43b-40cb-9b63-76707c9a25d6\") " pod="openstack/ssh-known-hosts-edpm-deployment-4qk8n" Mar 09 09:38:42 crc kubenswrapper[4792]: I0309 09:38:42.218963 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7wxx5\" (UniqueName: \"kubernetes.io/projected/b16ebd2e-e43b-40cb-9b63-76707c9a25d6-kube-api-access-7wxx5\") pod \"ssh-known-hosts-edpm-deployment-4qk8n\" (UID: \"b16ebd2e-e43b-40cb-9b63-76707c9a25d6\") " pod="openstack/ssh-known-hosts-edpm-deployment-4qk8n" Mar 09 09:38:42 crc kubenswrapper[4792]: I0309 09:38:42.219002 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/b16ebd2e-e43b-40cb-9b63-76707c9a25d6-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-4qk8n\" (UID: \"b16ebd2e-e43b-40cb-9b63-76707c9a25d6\") " pod="openstack/ssh-known-hosts-edpm-deployment-4qk8n" Mar 09 09:38:42 crc kubenswrapper[4792]: I0309 09:38:42.224032 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/b16ebd2e-e43b-40cb-9b63-76707c9a25d6-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-4qk8n\" (UID: 
\"b16ebd2e-e43b-40cb-9b63-76707c9a25d6\") " pod="openstack/ssh-known-hosts-edpm-deployment-4qk8n" Mar 09 09:38:42 crc kubenswrapper[4792]: I0309 09:38:42.224178 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b16ebd2e-e43b-40cb-9b63-76707c9a25d6-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-4qk8n\" (UID: \"b16ebd2e-e43b-40cb-9b63-76707c9a25d6\") " pod="openstack/ssh-known-hosts-edpm-deployment-4qk8n" Mar 09 09:38:42 crc kubenswrapper[4792]: I0309 09:38:42.236381 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7wxx5\" (UniqueName: \"kubernetes.io/projected/b16ebd2e-e43b-40cb-9b63-76707c9a25d6-kube-api-access-7wxx5\") pod \"ssh-known-hosts-edpm-deployment-4qk8n\" (UID: \"b16ebd2e-e43b-40cb-9b63-76707c9a25d6\") " pod="openstack/ssh-known-hosts-edpm-deployment-4qk8n" Mar 09 09:38:42 crc kubenswrapper[4792]: I0309 09:38:42.298466 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-4qk8n" Mar 09 09:38:42 crc kubenswrapper[4792]: I0309 09:38:42.824860 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-4qk8n"] Mar 09 09:38:42 crc kubenswrapper[4792]: W0309 09:38:42.833209 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb16ebd2e_e43b_40cb_9b63_76707c9a25d6.slice/crio-9a96f0a09f5dc4a2fd557cbb8dccf081a025c16bd6af874172ad0f4e4b4176fb WatchSource:0}: Error finding container 9a96f0a09f5dc4a2fd557cbb8dccf081a025c16bd6af874172ad0f4e4b4176fb: Status 404 returned error can't find the container with id 9a96f0a09f5dc4a2fd557cbb8dccf081a025c16bd6af874172ad0f4e4b4176fb Mar 09 09:38:42 crc kubenswrapper[4792]: I0309 09:38:42.884277 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-4qk8n" event={"ID":"b16ebd2e-e43b-40cb-9b63-76707c9a25d6","Type":"ContainerStarted","Data":"9a96f0a09f5dc4a2fd557cbb8dccf081a025c16bd6af874172ad0f4e4b4176fb"} Mar 09 09:38:43 crc kubenswrapper[4792]: I0309 09:38:43.895046 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-4qk8n" event={"ID":"b16ebd2e-e43b-40cb-9b63-76707c9a25d6","Type":"ContainerStarted","Data":"73caaf373715300262a340afc5b3fd4cd5f21307a60c1cc6a6e77576ee1cd749"} Mar 09 09:38:43 crc kubenswrapper[4792]: I0309 09:38:43.925375 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-4qk8n" podStartSLOduration=2.523699652 podStartE2EDuration="2.92534168s" podCreationTimestamp="2026-03-09 09:38:41 +0000 UTC" firstStartedPulling="2026-03-09 09:38:42.835009849 +0000 UTC m=+1887.865210601" lastFinishedPulling="2026-03-09 09:38:43.236651877 +0000 UTC m=+1888.266852629" observedRunningTime="2026-03-09 09:38:43.919970425 +0000 UTC m=+1888.950171177" 
watchObservedRunningTime="2026-03-09 09:38:43.92534168 +0000 UTC m=+1888.955542432" Mar 09 09:38:44 crc kubenswrapper[4792]: I0309 09:38:44.662571 4792 scope.go:117] "RemoveContainer" containerID="f7dc0a0360eecc4b3d3577ec00ef1a00d7c8c9a5f8b5910ab6ece3541b62af92" Mar 09 09:38:44 crc kubenswrapper[4792]: I0309 09:38:44.904206 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-97tth" event={"ID":"bd11045a-d746-4b42-872c-8b8d1dd2d515","Type":"ContainerStarted","Data":"6f5e8f72786750c34390c3921b89a9937cb9dce363f8ebe7772b408843cd4da4"} Mar 09 09:38:49 crc kubenswrapper[4792]: I0309 09:38:49.958658 4792 generic.go:334] "Generic (PLEG): container finished" podID="b16ebd2e-e43b-40cb-9b63-76707c9a25d6" containerID="73caaf373715300262a340afc5b3fd4cd5f21307a60c1cc6a6e77576ee1cd749" exitCode=0 Mar 09 09:38:49 crc kubenswrapper[4792]: I0309 09:38:49.959437 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-4qk8n" event={"ID":"b16ebd2e-e43b-40cb-9b63-76707c9a25d6","Type":"ContainerDied","Data":"73caaf373715300262a340afc5b3fd4cd5f21307a60c1cc6a6e77576ee1cd749"} Mar 09 09:38:51 crc kubenswrapper[4792]: I0309 09:38:51.345862 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-4qk8n" Mar 09 09:38:51 crc kubenswrapper[4792]: I0309 09:38:51.484846 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/b16ebd2e-e43b-40cb-9b63-76707c9a25d6-inventory-0\") pod \"b16ebd2e-e43b-40cb-9b63-76707c9a25d6\" (UID: \"b16ebd2e-e43b-40cb-9b63-76707c9a25d6\") " Mar 09 09:38:51 crc kubenswrapper[4792]: I0309 09:38:51.485044 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7wxx5\" (UniqueName: \"kubernetes.io/projected/b16ebd2e-e43b-40cb-9b63-76707c9a25d6-kube-api-access-7wxx5\") pod \"b16ebd2e-e43b-40cb-9b63-76707c9a25d6\" (UID: \"b16ebd2e-e43b-40cb-9b63-76707c9a25d6\") " Mar 09 09:38:51 crc kubenswrapper[4792]: I0309 09:38:51.485131 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b16ebd2e-e43b-40cb-9b63-76707c9a25d6-ssh-key-openstack-edpm-ipam\") pod \"b16ebd2e-e43b-40cb-9b63-76707c9a25d6\" (UID: \"b16ebd2e-e43b-40cb-9b63-76707c9a25d6\") " Mar 09 09:38:51 crc kubenswrapper[4792]: I0309 09:38:51.491884 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b16ebd2e-e43b-40cb-9b63-76707c9a25d6-kube-api-access-7wxx5" (OuterVolumeSpecName: "kube-api-access-7wxx5") pod "b16ebd2e-e43b-40cb-9b63-76707c9a25d6" (UID: "b16ebd2e-e43b-40cb-9b63-76707c9a25d6"). InnerVolumeSpecName "kube-api-access-7wxx5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:38:51 crc kubenswrapper[4792]: I0309 09:38:51.509287 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b16ebd2e-e43b-40cb-9b63-76707c9a25d6-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "b16ebd2e-e43b-40cb-9b63-76707c9a25d6" (UID: "b16ebd2e-e43b-40cb-9b63-76707c9a25d6"). 
InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:38:51 crc kubenswrapper[4792]: I0309 09:38:51.512670 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b16ebd2e-e43b-40cb-9b63-76707c9a25d6-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "b16ebd2e-e43b-40cb-9b63-76707c9a25d6" (UID: "b16ebd2e-e43b-40cb-9b63-76707c9a25d6"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:38:51 crc kubenswrapper[4792]: I0309 09:38:51.587549 4792 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/b16ebd2e-e43b-40cb-9b63-76707c9a25d6-inventory-0\") on node \"crc\" DevicePath \"\"" Mar 09 09:38:51 crc kubenswrapper[4792]: I0309 09:38:51.587587 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7wxx5\" (UniqueName: \"kubernetes.io/projected/b16ebd2e-e43b-40cb-9b63-76707c9a25d6-kube-api-access-7wxx5\") on node \"crc\" DevicePath \"\"" Mar 09 09:38:51 crc kubenswrapper[4792]: I0309 09:38:51.587600 4792 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b16ebd2e-e43b-40cb-9b63-76707c9a25d6-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 09 09:38:51 crc kubenswrapper[4792]: I0309 09:38:51.975246 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-4qk8n" event={"ID":"b16ebd2e-e43b-40cb-9b63-76707c9a25d6","Type":"ContainerDied","Data":"9a96f0a09f5dc4a2fd557cbb8dccf081a025c16bd6af874172ad0f4e4b4176fb"} Mar 09 09:38:51 crc kubenswrapper[4792]: I0309 09:38:51.975579 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9a96f0a09f5dc4a2fd557cbb8dccf081a025c16bd6af874172ad0f4e4b4176fb" Mar 09 09:38:51 crc kubenswrapper[4792]: I0309 09:38:51.975294 
4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-4qk8n" Mar 09 09:38:52 crc kubenswrapper[4792]: I0309 09:38:52.089786 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-57vwq"] Mar 09 09:38:52 crc kubenswrapper[4792]: I0309 09:38:52.101922 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-57vwq"] Mar 09 09:38:52 crc kubenswrapper[4792]: I0309 09:38:52.109906 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-pd6cm"] Mar 09 09:38:52 crc kubenswrapper[4792]: E0309 09:38:52.110419 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b16ebd2e-e43b-40cb-9b63-76707c9a25d6" containerName="ssh-known-hosts-edpm-deployment" Mar 09 09:38:52 crc kubenswrapper[4792]: I0309 09:38:52.110441 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="b16ebd2e-e43b-40cb-9b63-76707c9a25d6" containerName="ssh-known-hosts-edpm-deployment" Mar 09 09:38:52 crc kubenswrapper[4792]: I0309 09:38:52.110684 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="b16ebd2e-e43b-40cb-9b63-76707c9a25d6" containerName="ssh-known-hosts-edpm-deployment" Mar 09 09:38:52 crc kubenswrapper[4792]: I0309 09:38:52.111330 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-pd6cm" Mar 09 09:38:52 crc kubenswrapper[4792]: I0309 09:38:52.114250 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 09 09:38:52 crc kubenswrapper[4792]: I0309 09:38:52.114298 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 09 09:38:52 crc kubenswrapper[4792]: I0309 09:38:52.114448 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 09 09:38:52 crc kubenswrapper[4792]: I0309 09:38:52.114535 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-4g5l6" Mar 09 09:38:52 crc kubenswrapper[4792]: I0309 09:38:52.117867 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-pd6cm"] Mar 09 09:38:52 crc kubenswrapper[4792]: I0309 09:38:52.197048 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mvmf4\" (UniqueName: \"kubernetes.io/projected/b9bb4ffd-65c0-45ae-b2b8-7f8c718500de-kube-api-access-mvmf4\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-pd6cm\" (UID: \"b9bb4ffd-65c0-45ae-b2b8-7f8c718500de\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-pd6cm" Mar 09 09:38:52 crc kubenswrapper[4792]: I0309 09:38:52.197133 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b9bb4ffd-65c0-45ae-b2b8-7f8c718500de-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-pd6cm\" (UID: \"b9bb4ffd-65c0-45ae-b2b8-7f8c718500de\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-pd6cm" Mar 09 09:38:52 crc kubenswrapper[4792]: I0309 09:38:52.197195 4792 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b9bb4ffd-65c0-45ae-b2b8-7f8c718500de-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-pd6cm\" (UID: \"b9bb4ffd-65c0-45ae-b2b8-7f8c718500de\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-pd6cm" Mar 09 09:38:52 crc kubenswrapper[4792]: I0309 09:38:52.298940 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b9bb4ffd-65c0-45ae-b2b8-7f8c718500de-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-pd6cm\" (UID: \"b9bb4ffd-65c0-45ae-b2b8-7f8c718500de\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-pd6cm" Mar 09 09:38:52 crc kubenswrapper[4792]: I0309 09:38:52.299028 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b9bb4ffd-65c0-45ae-b2b8-7f8c718500de-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-pd6cm\" (UID: \"b9bb4ffd-65c0-45ae-b2b8-7f8c718500de\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-pd6cm" Mar 09 09:38:52 crc kubenswrapper[4792]: I0309 09:38:52.299158 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mvmf4\" (UniqueName: \"kubernetes.io/projected/b9bb4ffd-65c0-45ae-b2b8-7f8c718500de-kube-api-access-mvmf4\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-pd6cm\" (UID: \"b9bb4ffd-65c0-45ae-b2b8-7f8c718500de\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-pd6cm" Mar 09 09:38:52 crc kubenswrapper[4792]: I0309 09:38:52.307730 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b9bb4ffd-65c0-45ae-b2b8-7f8c718500de-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-pd6cm\" (UID: 
\"b9bb4ffd-65c0-45ae-b2b8-7f8c718500de\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-pd6cm" Mar 09 09:38:52 crc kubenswrapper[4792]: I0309 09:38:52.316567 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b9bb4ffd-65c0-45ae-b2b8-7f8c718500de-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-pd6cm\" (UID: \"b9bb4ffd-65c0-45ae-b2b8-7f8c718500de\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-pd6cm" Mar 09 09:38:52 crc kubenswrapper[4792]: I0309 09:38:52.316587 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mvmf4\" (UniqueName: \"kubernetes.io/projected/b9bb4ffd-65c0-45ae-b2b8-7f8c718500de-kube-api-access-mvmf4\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-pd6cm\" (UID: \"b9bb4ffd-65c0-45ae-b2b8-7f8c718500de\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-pd6cm" Mar 09 09:38:52 crc kubenswrapper[4792]: I0309 09:38:52.426563 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-pd6cm" Mar 09 09:38:53 crc kubenswrapper[4792]: W0309 09:38:53.018917 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb9bb4ffd_65c0_45ae_b2b8_7f8c718500de.slice/crio-a13c1e65fff1234d415131eeb28021b554e8d566ab8a222f02d9abeda2873d12 WatchSource:0}: Error finding container a13c1e65fff1234d415131eeb28021b554e8d566ab8a222f02d9abeda2873d12: Status 404 returned error can't find the container with id a13c1e65fff1234d415131eeb28021b554e8d566ab8a222f02d9abeda2873d12 Mar 09 09:38:53 crc kubenswrapper[4792]: I0309 09:38:53.027913 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-pd6cm"] Mar 09 09:38:53 crc kubenswrapper[4792]: I0309 09:38:53.679687 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="65c96452-6003-4585-b71c-dbb398600617" path="/var/lib/kubelet/pods/65c96452-6003-4585-b71c-dbb398600617/volumes" Mar 09 09:38:53 crc kubenswrapper[4792]: I0309 09:38:53.992811 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-pd6cm" event={"ID":"b9bb4ffd-65c0-45ae-b2b8-7f8c718500de","Type":"ContainerStarted","Data":"3af016df023633048f7a8fd280fffb4aa745eb6a9f65fc6a9b4c578654cfb6cf"} Mar 09 09:38:53 crc kubenswrapper[4792]: I0309 09:38:53.993201 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-pd6cm" event={"ID":"b9bb4ffd-65c0-45ae-b2b8-7f8c718500de","Type":"ContainerStarted","Data":"a13c1e65fff1234d415131eeb28021b554e8d566ab8a222f02d9abeda2873d12"} Mar 09 09:38:54 crc kubenswrapper[4792]: I0309 09:38:54.019670 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-pd6cm" podStartSLOduration=1.423014875 
podStartE2EDuration="2.019653096s" podCreationTimestamp="2026-03-09 09:38:52 +0000 UTC" firstStartedPulling="2026-03-09 09:38:53.021867832 +0000 UTC m=+1898.052068584" lastFinishedPulling="2026-03-09 09:38:53.618506053 +0000 UTC m=+1898.648706805" observedRunningTime="2026-03-09 09:38:54.011343587 +0000 UTC m=+1899.041544359" watchObservedRunningTime="2026-03-09 09:38:54.019653096 +0000 UTC m=+1899.049853848" Mar 09 09:39:02 crc kubenswrapper[4792]: I0309 09:39:02.057246 4792 generic.go:334] "Generic (PLEG): container finished" podID="b9bb4ffd-65c0-45ae-b2b8-7f8c718500de" containerID="3af016df023633048f7a8fd280fffb4aa745eb6a9f65fc6a9b4c578654cfb6cf" exitCode=0 Mar 09 09:39:02 crc kubenswrapper[4792]: I0309 09:39:02.057402 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-pd6cm" event={"ID":"b9bb4ffd-65c0-45ae-b2b8-7f8c718500de","Type":"ContainerDied","Data":"3af016df023633048f7a8fd280fffb4aa745eb6a9f65fc6a9b4c578654cfb6cf"} Mar 09 09:39:03 crc kubenswrapper[4792]: I0309 09:39:03.658664 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-pd6cm" Mar 09 09:39:03 crc kubenswrapper[4792]: I0309 09:39:03.726364 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b9bb4ffd-65c0-45ae-b2b8-7f8c718500de-inventory\") pod \"b9bb4ffd-65c0-45ae-b2b8-7f8c718500de\" (UID: \"b9bb4ffd-65c0-45ae-b2b8-7f8c718500de\") " Mar 09 09:39:03 crc kubenswrapper[4792]: I0309 09:39:03.727427 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b9bb4ffd-65c0-45ae-b2b8-7f8c718500de-ssh-key-openstack-edpm-ipam\") pod \"b9bb4ffd-65c0-45ae-b2b8-7f8c718500de\" (UID: \"b9bb4ffd-65c0-45ae-b2b8-7f8c718500de\") " Mar 09 09:39:03 crc kubenswrapper[4792]: I0309 09:39:03.727482 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mvmf4\" (UniqueName: \"kubernetes.io/projected/b9bb4ffd-65c0-45ae-b2b8-7f8c718500de-kube-api-access-mvmf4\") pod \"b9bb4ffd-65c0-45ae-b2b8-7f8c718500de\" (UID: \"b9bb4ffd-65c0-45ae-b2b8-7f8c718500de\") " Mar 09 09:39:03 crc kubenswrapper[4792]: I0309 09:39:03.753307 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b9bb4ffd-65c0-45ae-b2b8-7f8c718500de-kube-api-access-mvmf4" (OuterVolumeSpecName: "kube-api-access-mvmf4") pod "b9bb4ffd-65c0-45ae-b2b8-7f8c718500de" (UID: "b9bb4ffd-65c0-45ae-b2b8-7f8c718500de"). InnerVolumeSpecName "kube-api-access-mvmf4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:39:03 crc kubenswrapper[4792]: I0309 09:39:03.783180 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b9bb4ffd-65c0-45ae-b2b8-7f8c718500de-inventory" (OuterVolumeSpecName: "inventory") pod "b9bb4ffd-65c0-45ae-b2b8-7f8c718500de" (UID: "b9bb4ffd-65c0-45ae-b2b8-7f8c718500de"). 
InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:39:03 crc kubenswrapper[4792]: I0309 09:39:03.786188 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b9bb4ffd-65c0-45ae-b2b8-7f8c718500de-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "b9bb4ffd-65c0-45ae-b2b8-7f8c718500de" (UID: "b9bb4ffd-65c0-45ae-b2b8-7f8c718500de"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:39:03 crc kubenswrapper[4792]: I0309 09:39:03.830054 4792 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b9bb4ffd-65c0-45ae-b2b8-7f8c718500de-inventory\") on node \"crc\" DevicePath \"\"" Mar 09 09:39:03 crc kubenswrapper[4792]: I0309 09:39:03.830329 4792 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b9bb4ffd-65c0-45ae-b2b8-7f8c718500de-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 09 09:39:03 crc kubenswrapper[4792]: I0309 09:39:03.830386 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mvmf4\" (UniqueName: \"kubernetes.io/projected/b9bb4ffd-65c0-45ae-b2b8-7f8c718500de-kube-api-access-mvmf4\") on node \"crc\" DevicePath \"\"" Mar 09 09:39:04 crc kubenswrapper[4792]: I0309 09:39:04.076662 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-pd6cm" event={"ID":"b9bb4ffd-65c0-45ae-b2b8-7f8c718500de","Type":"ContainerDied","Data":"a13c1e65fff1234d415131eeb28021b554e8d566ab8a222f02d9abeda2873d12"} Mar 09 09:39:04 crc kubenswrapper[4792]: I0309 09:39:04.076712 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a13c1e65fff1234d415131eeb28021b554e8d566ab8a222f02d9abeda2873d12" Mar 09 09:39:04 crc kubenswrapper[4792]: I0309 
09:39:04.076747 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-pd6cm" Mar 09 09:39:04 crc kubenswrapper[4792]: I0309 09:39:04.221775 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lx47x"] Mar 09 09:39:04 crc kubenswrapper[4792]: E0309 09:39:04.222290 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9bb4ffd-65c0-45ae-b2b8-7f8c718500de" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Mar 09 09:39:04 crc kubenswrapper[4792]: I0309 09:39:04.222309 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9bb4ffd-65c0-45ae-b2b8-7f8c718500de" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Mar 09 09:39:04 crc kubenswrapper[4792]: I0309 09:39:04.222485 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="b9bb4ffd-65c0-45ae-b2b8-7f8c718500de" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Mar 09 09:39:04 crc kubenswrapper[4792]: I0309 09:39:04.223100 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lx47x" Mar 09 09:39:04 crc kubenswrapper[4792]: I0309 09:39:04.226450 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 09 09:39:04 crc kubenswrapper[4792]: I0309 09:39:04.226570 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 09 09:39:04 crc kubenswrapper[4792]: I0309 09:39:04.226818 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-4g5l6" Mar 09 09:39:04 crc kubenswrapper[4792]: I0309 09:39:04.228282 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 09 09:39:04 crc kubenswrapper[4792]: I0309 09:39:04.233030 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lx47x"] Mar 09 09:39:04 crc kubenswrapper[4792]: I0309 09:39:04.340570 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3169dde4-b88a-4a42-b22c-452ed9d2b945-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-lx47x\" (UID: \"3169dde4-b88a-4a42-b22c-452ed9d2b945\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lx47x" Mar 09 09:39:04 crc kubenswrapper[4792]: I0309 09:39:04.340905 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vbkfv\" (UniqueName: \"kubernetes.io/projected/3169dde4-b88a-4a42-b22c-452ed9d2b945-kube-api-access-vbkfv\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-lx47x\" (UID: \"3169dde4-b88a-4a42-b22c-452ed9d2b945\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lx47x" Mar 09 09:39:04 crc kubenswrapper[4792]: I0309 
09:39:04.341171 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3169dde4-b88a-4a42-b22c-452ed9d2b945-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-lx47x\" (UID: \"3169dde4-b88a-4a42-b22c-452ed9d2b945\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lx47x" Mar 09 09:39:04 crc kubenswrapper[4792]: I0309 09:39:04.443389 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3169dde4-b88a-4a42-b22c-452ed9d2b945-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-lx47x\" (UID: \"3169dde4-b88a-4a42-b22c-452ed9d2b945\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lx47x" Mar 09 09:39:04 crc kubenswrapper[4792]: I0309 09:39:04.443500 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vbkfv\" (UniqueName: \"kubernetes.io/projected/3169dde4-b88a-4a42-b22c-452ed9d2b945-kube-api-access-vbkfv\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-lx47x\" (UID: \"3169dde4-b88a-4a42-b22c-452ed9d2b945\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lx47x" Mar 09 09:39:04 crc kubenswrapper[4792]: I0309 09:39:04.443550 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3169dde4-b88a-4a42-b22c-452ed9d2b945-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-lx47x\" (UID: \"3169dde4-b88a-4a42-b22c-452ed9d2b945\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lx47x" Mar 09 09:39:04 crc kubenswrapper[4792]: I0309 09:39:04.447918 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3169dde4-b88a-4a42-b22c-452ed9d2b945-inventory\") pod 
\"reboot-os-edpm-deployment-openstack-edpm-ipam-lx47x\" (UID: \"3169dde4-b88a-4a42-b22c-452ed9d2b945\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lx47x" Mar 09 09:39:04 crc kubenswrapper[4792]: I0309 09:39:04.448013 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3169dde4-b88a-4a42-b22c-452ed9d2b945-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-lx47x\" (UID: \"3169dde4-b88a-4a42-b22c-452ed9d2b945\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lx47x" Mar 09 09:39:04 crc kubenswrapper[4792]: I0309 09:39:04.460272 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vbkfv\" (UniqueName: \"kubernetes.io/projected/3169dde4-b88a-4a42-b22c-452ed9d2b945-kube-api-access-vbkfv\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-lx47x\" (UID: \"3169dde4-b88a-4a42-b22c-452ed9d2b945\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lx47x" Mar 09 09:39:04 crc kubenswrapper[4792]: I0309 09:39:04.540057 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lx47x" Mar 09 09:39:05 crc kubenswrapper[4792]: I0309 09:39:05.108559 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lx47x"] Mar 09 09:39:05 crc kubenswrapper[4792]: W0309 09:39:05.132680 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3169dde4_b88a_4a42_b22c_452ed9d2b945.slice/crio-48e76db52b0eb6c1695d8a20be6eb82906f8e2f1a4aa7ecb3d5b2385ba1498d4 WatchSource:0}: Error finding container 48e76db52b0eb6c1695d8a20be6eb82906f8e2f1a4aa7ecb3d5b2385ba1498d4: Status 404 returned error can't find the container with id 48e76db52b0eb6c1695d8a20be6eb82906f8e2f1a4aa7ecb3d5b2385ba1498d4 Mar 09 09:39:06 crc kubenswrapper[4792]: I0309 09:39:06.094604 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lx47x" event={"ID":"3169dde4-b88a-4a42-b22c-452ed9d2b945","Type":"ContainerStarted","Data":"9a835914b405660246ab4fb83ec4285a3ed5ea8d17add8fc5583b2c3f86e82f8"} Mar 09 09:39:06 crc kubenswrapper[4792]: I0309 09:39:06.095015 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lx47x" event={"ID":"3169dde4-b88a-4a42-b22c-452ed9d2b945","Type":"ContainerStarted","Data":"48e76db52b0eb6c1695d8a20be6eb82906f8e2f1a4aa7ecb3d5b2385ba1498d4"} Mar 09 09:39:06 crc kubenswrapper[4792]: I0309 09:39:06.113187 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lx47x" podStartSLOduration=1.713562566 podStartE2EDuration="2.113167425s" podCreationTimestamp="2026-03-09 09:39:04 +0000 UTC" firstStartedPulling="2026-03-09 09:39:05.135238734 +0000 UTC m=+1910.165439486" lastFinishedPulling="2026-03-09 09:39:05.534843593 +0000 UTC m=+1910.565044345" 
observedRunningTime="2026-03-09 09:39:06.108172912 +0000 UTC m=+1911.138373684" watchObservedRunningTime="2026-03-09 09:39:06.113167425 +0000 UTC m=+1911.143368177" Mar 09 09:39:16 crc kubenswrapper[4792]: I0309 09:39:16.169960 4792 generic.go:334] "Generic (PLEG): container finished" podID="3169dde4-b88a-4a42-b22c-452ed9d2b945" containerID="9a835914b405660246ab4fb83ec4285a3ed5ea8d17add8fc5583b2c3f86e82f8" exitCode=0 Mar 09 09:39:16 crc kubenswrapper[4792]: I0309 09:39:16.170044 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lx47x" event={"ID":"3169dde4-b88a-4a42-b22c-452ed9d2b945","Type":"ContainerDied","Data":"9a835914b405660246ab4fb83ec4285a3ed5ea8d17add8fc5583b2c3f86e82f8"} Mar 09 09:39:17 crc kubenswrapper[4792]: I0309 09:39:17.541500 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lx47x" Mar 09 09:39:17 crc kubenswrapper[4792]: I0309 09:39:17.639331 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vbkfv\" (UniqueName: \"kubernetes.io/projected/3169dde4-b88a-4a42-b22c-452ed9d2b945-kube-api-access-vbkfv\") pod \"3169dde4-b88a-4a42-b22c-452ed9d2b945\" (UID: \"3169dde4-b88a-4a42-b22c-452ed9d2b945\") " Mar 09 09:39:17 crc kubenswrapper[4792]: I0309 09:39:17.639535 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3169dde4-b88a-4a42-b22c-452ed9d2b945-inventory\") pod \"3169dde4-b88a-4a42-b22c-452ed9d2b945\" (UID: \"3169dde4-b88a-4a42-b22c-452ed9d2b945\") " Mar 09 09:39:17 crc kubenswrapper[4792]: I0309 09:39:17.640396 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3169dde4-b88a-4a42-b22c-452ed9d2b945-ssh-key-openstack-edpm-ipam\") pod 
\"3169dde4-b88a-4a42-b22c-452ed9d2b945\" (UID: \"3169dde4-b88a-4a42-b22c-452ed9d2b945\") " Mar 09 09:39:17 crc kubenswrapper[4792]: I0309 09:39:17.644907 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3169dde4-b88a-4a42-b22c-452ed9d2b945-kube-api-access-vbkfv" (OuterVolumeSpecName: "kube-api-access-vbkfv") pod "3169dde4-b88a-4a42-b22c-452ed9d2b945" (UID: "3169dde4-b88a-4a42-b22c-452ed9d2b945"). InnerVolumeSpecName "kube-api-access-vbkfv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:39:17 crc kubenswrapper[4792]: I0309 09:39:17.664044 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3169dde4-b88a-4a42-b22c-452ed9d2b945-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "3169dde4-b88a-4a42-b22c-452ed9d2b945" (UID: "3169dde4-b88a-4a42-b22c-452ed9d2b945"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:39:17 crc kubenswrapper[4792]: I0309 09:39:17.679679 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3169dde4-b88a-4a42-b22c-452ed9d2b945-inventory" (OuterVolumeSpecName: "inventory") pod "3169dde4-b88a-4a42-b22c-452ed9d2b945" (UID: "3169dde4-b88a-4a42-b22c-452ed9d2b945"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:39:17 crc kubenswrapper[4792]: I0309 09:39:17.744455 4792 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3169dde4-b88a-4a42-b22c-452ed9d2b945-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 09 09:39:17 crc kubenswrapper[4792]: I0309 09:39:17.744683 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vbkfv\" (UniqueName: \"kubernetes.io/projected/3169dde4-b88a-4a42-b22c-452ed9d2b945-kube-api-access-vbkfv\") on node \"crc\" DevicePath \"\"" Mar 09 09:39:17 crc kubenswrapper[4792]: I0309 09:39:17.744760 4792 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3169dde4-b88a-4a42-b22c-452ed9d2b945-inventory\") on node \"crc\" DevicePath \"\"" Mar 09 09:39:18 crc kubenswrapper[4792]: I0309 09:39:18.197965 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lx47x" event={"ID":"3169dde4-b88a-4a42-b22c-452ed9d2b945","Type":"ContainerDied","Data":"48e76db52b0eb6c1695d8a20be6eb82906f8e2f1a4aa7ecb3d5b2385ba1498d4"} Mar 09 09:39:18 crc kubenswrapper[4792]: I0309 09:39:18.198035 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="48e76db52b0eb6c1695d8a20be6eb82906f8e2f1a4aa7ecb3d5b2385ba1498d4" Mar 09 09:39:18 crc kubenswrapper[4792]: I0309 09:39:18.198005 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lx47x" Mar 09 09:39:22 crc kubenswrapper[4792]: I0309 09:39:22.053290 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-xv7wk"] Mar 09 09:39:22 crc kubenswrapper[4792]: I0309 09:39:22.064846 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-xv7wk"] Mar 09 09:39:23 crc kubenswrapper[4792]: I0309 09:39:23.676670 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b5277744-0423-4554-8512-da2a35eaaafd" path="/var/lib/kubelet/pods/b5277744-0423-4554-8512-da2a35eaaafd/volumes" Mar 09 09:39:37 crc kubenswrapper[4792]: I0309 09:39:37.050467 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-622mt"] Mar 09 09:39:37 crc kubenswrapper[4792]: I0309 09:39:37.064624 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-622mt"] Mar 09 09:39:37 crc kubenswrapper[4792]: I0309 09:39:37.674766 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f5f70ffc-88a1-470d-923c-78a1cc6aba7b" path="/var/lib/kubelet/pods/f5f70ffc-88a1-470d-923c-78a1cc6aba7b/volumes" Mar 09 09:39:38 crc kubenswrapper[4792]: I0309 09:39:38.402983 4792 scope.go:117] "RemoveContainer" containerID="4a974df632b929a8818d4b5a23dffeee8227f916cb8145de0821d944b5d816a5" Mar 09 09:39:38 crc kubenswrapper[4792]: I0309 09:39:38.458307 4792 scope.go:117] "RemoveContainer" containerID="55b4abdb4e5c3b239676c5bb57c8cabb00688f5febc4fdf58e595a1c329a41e7" Mar 09 09:39:38 crc kubenswrapper[4792]: I0309 09:39:38.497121 4792 scope.go:117] "RemoveContainer" containerID="cb03b28b5d07fd12acb7d4043a1662fbb06abb286134f85927a0d0cf4166e655" Mar 09 09:40:00 crc kubenswrapper[4792]: I0309 09:40:00.154599 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550820-lnfhn"] Mar 09 09:40:00 crc 
kubenswrapper[4792]: E0309 09:40:00.156177 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3169dde4-b88a-4a42-b22c-452ed9d2b945" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Mar 09 09:40:00 crc kubenswrapper[4792]: I0309 09:40:00.156198 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="3169dde4-b88a-4a42-b22c-452ed9d2b945" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Mar 09 09:40:00 crc kubenswrapper[4792]: I0309 09:40:00.156417 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="3169dde4-b88a-4a42-b22c-452ed9d2b945" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Mar 09 09:40:00 crc kubenswrapper[4792]: I0309 09:40:00.158084 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550820-lnfhn" Mar 09 09:40:00 crc kubenswrapper[4792]: I0309 09:40:00.165828 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 09:40:00 crc kubenswrapper[4792]: I0309 09:40:00.165907 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 09:40:00 crc kubenswrapper[4792]: I0309 09:40:00.166093 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fwclj" Mar 09 09:40:00 crc kubenswrapper[4792]: I0309 09:40:00.175745 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550820-lnfhn"] Mar 09 09:40:00 crc kubenswrapper[4792]: I0309 09:40:00.228904 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5gk6w\" (UniqueName: \"kubernetes.io/projected/ea6cb8b6-189a-41ff-aed8-95729f1ca01c-kube-api-access-5gk6w\") pod \"auto-csr-approver-29550820-lnfhn\" (UID: \"ea6cb8b6-189a-41ff-aed8-95729f1ca01c\") " 
pod="openshift-infra/auto-csr-approver-29550820-lnfhn" Mar 09 09:40:00 crc kubenswrapper[4792]: I0309 09:40:00.331135 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5gk6w\" (UniqueName: \"kubernetes.io/projected/ea6cb8b6-189a-41ff-aed8-95729f1ca01c-kube-api-access-5gk6w\") pod \"auto-csr-approver-29550820-lnfhn\" (UID: \"ea6cb8b6-189a-41ff-aed8-95729f1ca01c\") " pod="openshift-infra/auto-csr-approver-29550820-lnfhn" Mar 09 09:40:00 crc kubenswrapper[4792]: I0309 09:40:00.357954 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5gk6w\" (UniqueName: \"kubernetes.io/projected/ea6cb8b6-189a-41ff-aed8-95729f1ca01c-kube-api-access-5gk6w\") pod \"auto-csr-approver-29550820-lnfhn\" (UID: \"ea6cb8b6-189a-41ff-aed8-95729f1ca01c\") " pod="openshift-infra/auto-csr-approver-29550820-lnfhn" Mar 09 09:40:00 crc kubenswrapper[4792]: I0309 09:40:00.480835 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550820-lnfhn" Mar 09 09:40:00 crc kubenswrapper[4792]: I0309 09:40:00.909676 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550820-lnfhn"] Mar 09 09:40:01 crc kubenswrapper[4792]: I0309 09:40:01.560738 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550820-lnfhn" event={"ID":"ea6cb8b6-189a-41ff-aed8-95729f1ca01c","Type":"ContainerStarted","Data":"7ad4f930d18c102500bbc77b4fa693796fbf54cd311ab4a0f6f29995317e2a83"} Mar 09 09:40:02 crc kubenswrapper[4792]: I0309 09:40:02.568503 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550820-lnfhn" event={"ID":"ea6cb8b6-189a-41ff-aed8-95729f1ca01c","Type":"ContainerStarted","Data":"712c4d8c4bf174cf9a0a715557728bf065ae207a6fef40b6db706c5fae954134"} Mar 09 09:40:02 crc kubenswrapper[4792]: I0309 09:40:02.584525 4792 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29550820-lnfhn" podStartSLOduration=1.376457375 podStartE2EDuration="2.58450483s" podCreationTimestamp="2026-03-09 09:40:00 +0000 UTC" firstStartedPulling="2026-03-09 09:40:00.915544868 +0000 UTC m=+1965.945745620" lastFinishedPulling="2026-03-09 09:40:02.123592323 +0000 UTC m=+1967.153793075" observedRunningTime="2026-03-09 09:40:02.583546322 +0000 UTC m=+1967.613747074" watchObservedRunningTime="2026-03-09 09:40:02.58450483 +0000 UTC m=+1967.614705582" Mar 09 09:40:03 crc kubenswrapper[4792]: I0309 09:40:03.576486 4792 generic.go:334] "Generic (PLEG): container finished" podID="ea6cb8b6-189a-41ff-aed8-95729f1ca01c" containerID="712c4d8c4bf174cf9a0a715557728bf065ae207a6fef40b6db706c5fae954134" exitCode=0 Mar 09 09:40:03 crc kubenswrapper[4792]: I0309 09:40:03.576633 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550820-lnfhn" event={"ID":"ea6cb8b6-189a-41ff-aed8-95729f1ca01c","Type":"ContainerDied","Data":"712c4d8c4bf174cf9a0a715557728bf065ae207a6fef40b6db706c5fae954134"} Mar 09 09:40:04 crc kubenswrapper[4792]: I0309 09:40:04.898679 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550820-lnfhn" Mar 09 09:40:05 crc kubenswrapper[4792]: I0309 09:40:05.022265 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5gk6w\" (UniqueName: \"kubernetes.io/projected/ea6cb8b6-189a-41ff-aed8-95729f1ca01c-kube-api-access-5gk6w\") pod \"ea6cb8b6-189a-41ff-aed8-95729f1ca01c\" (UID: \"ea6cb8b6-189a-41ff-aed8-95729f1ca01c\") " Mar 09 09:40:05 crc kubenswrapper[4792]: I0309 09:40:05.031032 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea6cb8b6-189a-41ff-aed8-95729f1ca01c-kube-api-access-5gk6w" (OuterVolumeSpecName: "kube-api-access-5gk6w") pod "ea6cb8b6-189a-41ff-aed8-95729f1ca01c" (UID: "ea6cb8b6-189a-41ff-aed8-95729f1ca01c"). InnerVolumeSpecName "kube-api-access-5gk6w". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:40:05 crc kubenswrapper[4792]: I0309 09:40:05.128024 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5gk6w\" (UniqueName: \"kubernetes.io/projected/ea6cb8b6-189a-41ff-aed8-95729f1ca01c-kube-api-access-5gk6w\") on node \"crc\" DevicePath \"\"" Mar 09 09:40:05 crc kubenswrapper[4792]: I0309 09:40:05.624878 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550820-lnfhn" event={"ID":"ea6cb8b6-189a-41ff-aed8-95729f1ca01c","Type":"ContainerDied","Data":"7ad4f930d18c102500bbc77b4fa693796fbf54cd311ab4a0f6f29995317e2a83"} Mar 09 09:40:05 crc kubenswrapper[4792]: I0309 09:40:05.624924 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7ad4f930d18c102500bbc77b4fa693796fbf54cd311ab4a0f6f29995317e2a83" Mar 09 09:40:05 crc kubenswrapper[4792]: I0309 09:40:05.624942 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550820-lnfhn" Mar 09 09:40:05 crc kubenswrapper[4792]: I0309 09:40:05.672407 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550814-5vplj"] Mar 09 09:40:05 crc kubenswrapper[4792]: I0309 09:40:05.672443 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550814-5vplj"] Mar 09 09:40:06 crc kubenswrapper[4792]: I0309 09:40:06.030864 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-dn7pc"] Mar 09 09:40:06 crc kubenswrapper[4792]: I0309 09:40:06.037201 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-dn7pc"] Mar 09 09:40:07 crc kubenswrapper[4792]: I0309 09:40:07.677395 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="afe8b396-f7ab-49c1-af21-5583ce0a3342" path="/var/lib/kubelet/pods/afe8b396-f7ab-49c1-af21-5583ce0a3342/volumes" Mar 09 09:40:07 crc kubenswrapper[4792]: I0309 09:40:07.678500 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f1399ead-26d7-4512-bbf0-f004f5b95c70" path="/var/lib/kubelet/pods/f1399ead-26d7-4512-bbf0-f004f5b95c70/volumes" Mar 09 09:40:17 crc kubenswrapper[4792]: I0309 09:40:17.799679 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-qqqgl"] Mar 09 09:40:17 crc kubenswrapper[4792]: E0309 09:40:17.800841 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea6cb8b6-189a-41ff-aed8-95729f1ca01c" containerName="oc" Mar 09 09:40:17 crc kubenswrapper[4792]: I0309 09:40:17.800856 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea6cb8b6-189a-41ff-aed8-95729f1ca01c" containerName="oc" Mar 09 09:40:17 crc kubenswrapper[4792]: I0309 09:40:17.801098 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea6cb8b6-189a-41ff-aed8-95729f1ca01c" containerName="oc" Mar 09 
09:40:17 crc kubenswrapper[4792]: I0309 09:40:17.806978 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qqqgl" Mar 09 09:40:17 crc kubenswrapper[4792]: I0309 09:40:17.815133 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qqqgl"] Mar 09 09:40:17 crc kubenswrapper[4792]: I0309 09:40:17.981471 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2xsbm\" (UniqueName: \"kubernetes.io/projected/fa90a757-6de5-4da7-876c-b4ed527602c6-kube-api-access-2xsbm\") pod \"redhat-operators-qqqgl\" (UID: \"fa90a757-6de5-4da7-876c-b4ed527602c6\") " pod="openshift-marketplace/redhat-operators-qqqgl" Mar 09 09:40:17 crc kubenswrapper[4792]: I0309 09:40:17.981529 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa90a757-6de5-4da7-876c-b4ed527602c6-catalog-content\") pod \"redhat-operators-qqqgl\" (UID: \"fa90a757-6de5-4da7-876c-b4ed527602c6\") " pod="openshift-marketplace/redhat-operators-qqqgl" Mar 09 09:40:17 crc kubenswrapper[4792]: I0309 09:40:17.981570 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa90a757-6de5-4da7-876c-b4ed527602c6-utilities\") pod \"redhat-operators-qqqgl\" (UID: \"fa90a757-6de5-4da7-876c-b4ed527602c6\") " pod="openshift-marketplace/redhat-operators-qqqgl" Mar 09 09:40:18 crc kubenswrapper[4792]: I0309 09:40:18.082904 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2xsbm\" (UniqueName: \"kubernetes.io/projected/fa90a757-6de5-4da7-876c-b4ed527602c6-kube-api-access-2xsbm\") pod \"redhat-operators-qqqgl\" (UID: \"fa90a757-6de5-4da7-876c-b4ed527602c6\") " pod="openshift-marketplace/redhat-operators-qqqgl" Mar 09 
09:40:18 crc kubenswrapper[4792]: I0309 09:40:18.082967 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa90a757-6de5-4da7-876c-b4ed527602c6-catalog-content\") pod \"redhat-operators-qqqgl\" (UID: \"fa90a757-6de5-4da7-876c-b4ed527602c6\") " pod="openshift-marketplace/redhat-operators-qqqgl" Mar 09 09:40:18 crc kubenswrapper[4792]: I0309 09:40:18.083012 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa90a757-6de5-4da7-876c-b4ed527602c6-utilities\") pod \"redhat-operators-qqqgl\" (UID: \"fa90a757-6de5-4da7-876c-b4ed527602c6\") " pod="openshift-marketplace/redhat-operators-qqqgl" Mar 09 09:40:18 crc kubenswrapper[4792]: I0309 09:40:18.084442 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa90a757-6de5-4da7-876c-b4ed527602c6-utilities\") pod \"redhat-operators-qqqgl\" (UID: \"fa90a757-6de5-4da7-876c-b4ed527602c6\") " pod="openshift-marketplace/redhat-operators-qqqgl" Mar 09 09:40:18 crc kubenswrapper[4792]: I0309 09:40:18.084485 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa90a757-6de5-4da7-876c-b4ed527602c6-catalog-content\") pod \"redhat-operators-qqqgl\" (UID: \"fa90a757-6de5-4da7-876c-b4ed527602c6\") " pod="openshift-marketplace/redhat-operators-qqqgl" Mar 09 09:40:18 crc kubenswrapper[4792]: I0309 09:40:18.105843 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2xsbm\" (UniqueName: \"kubernetes.io/projected/fa90a757-6de5-4da7-876c-b4ed527602c6-kube-api-access-2xsbm\") pod \"redhat-operators-qqqgl\" (UID: \"fa90a757-6de5-4da7-876c-b4ed527602c6\") " pod="openshift-marketplace/redhat-operators-qqqgl" Mar 09 09:40:18 crc kubenswrapper[4792]: I0309 09:40:18.128340 4792 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qqqgl" Mar 09 09:40:18 crc kubenswrapper[4792]: I0309 09:40:18.593258 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qqqgl"] Mar 09 09:40:18 crc kubenswrapper[4792]: W0309 09:40:18.600440 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfa90a757_6de5_4da7_876c_b4ed527602c6.slice/crio-45b0c997afe89c7a8c9809d89b17b37af556fb57f9c73dedfc23287f9255f5af WatchSource:0}: Error finding container 45b0c997afe89c7a8c9809d89b17b37af556fb57f9c73dedfc23287f9255f5af: Status 404 returned error can't find the container with id 45b0c997afe89c7a8c9809d89b17b37af556fb57f9c73dedfc23287f9255f5af Mar 09 09:40:18 crc kubenswrapper[4792]: I0309 09:40:18.781665 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qqqgl" event={"ID":"fa90a757-6de5-4da7-876c-b4ed527602c6","Type":"ContainerStarted","Data":"45b0c997afe89c7a8c9809d89b17b37af556fb57f9c73dedfc23287f9255f5af"} Mar 09 09:40:19 crc kubenswrapper[4792]: I0309 09:40:19.793181 4792 generic.go:334] "Generic (PLEG): container finished" podID="fa90a757-6de5-4da7-876c-b4ed527602c6" containerID="ee66ff44186e06e73fffc6f094847d4c53018050fbe609e44f9c7badfc1beb48" exitCode=0 Mar 09 09:40:19 crc kubenswrapper[4792]: I0309 09:40:19.793248 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qqqgl" event={"ID":"fa90a757-6de5-4da7-876c-b4ed527602c6","Type":"ContainerDied","Data":"ee66ff44186e06e73fffc6f094847d4c53018050fbe609e44f9c7badfc1beb48"} Mar 09 09:40:21 crc kubenswrapper[4792]: I0309 09:40:21.810179 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qqqgl" 
event={"ID":"fa90a757-6de5-4da7-876c-b4ed527602c6","Type":"ContainerStarted","Data":"99621d0898b35df010dc5dd5153370dbb65a518cc5c1f546513d057c9bbc8b7f"} Mar 09 09:40:25 crc kubenswrapper[4792]: I0309 09:40:25.853759 4792 generic.go:334] "Generic (PLEG): container finished" podID="fa90a757-6de5-4da7-876c-b4ed527602c6" containerID="99621d0898b35df010dc5dd5153370dbb65a518cc5c1f546513d057c9bbc8b7f" exitCode=0 Mar 09 09:40:25 crc kubenswrapper[4792]: I0309 09:40:25.853828 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qqqgl" event={"ID":"fa90a757-6de5-4da7-876c-b4ed527602c6","Type":"ContainerDied","Data":"99621d0898b35df010dc5dd5153370dbb65a518cc5c1f546513d057c9bbc8b7f"} Mar 09 09:40:26 crc kubenswrapper[4792]: I0309 09:40:26.863671 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qqqgl" event={"ID":"fa90a757-6de5-4da7-876c-b4ed527602c6","Type":"ContainerStarted","Data":"2060b72bae44a41f8bc31a870035de90f8d16ad0697a44ee3535196e57ff53ee"} Mar 09 09:40:26 crc kubenswrapper[4792]: I0309 09:40:26.889462 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-qqqgl" podStartSLOduration=3.328901292 podStartE2EDuration="9.889425306s" podCreationTimestamp="2026-03-09 09:40:17 +0000 UTC" firstStartedPulling="2026-03-09 09:40:19.795405462 +0000 UTC m=+1984.825606214" lastFinishedPulling="2026-03-09 09:40:26.355929476 +0000 UTC m=+1991.386130228" observedRunningTime="2026-03-09 09:40:26.881876557 +0000 UTC m=+1991.912077329" watchObservedRunningTime="2026-03-09 09:40:26.889425306 +0000 UTC m=+1991.919626058" Mar 09 09:40:28 crc kubenswrapper[4792]: I0309 09:40:28.129044 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-qqqgl" Mar 09 09:40:28 crc kubenswrapper[4792]: I0309 09:40:28.129382 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/redhat-operators-qqqgl" Mar 09 09:40:29 crc kubenswrapper[4792]: I0309 09:40:29.171400 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-qqqgl" podUID="fa90a757-6de5-4da7-876c-b4ed527602c6" containerName="registry-server" probeResult="failure" output=< Mar 09 09:40:29 crc kubenswrapper[4792]: timeout: failed to connect service ":50051" within 1s Mar 09 09:40:29 crc kubenswrapper[4792]: > Mar 09 09:40:38 crc kubenswrapper[4792]: I0309 09:40:38.200179 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-qqqgl" Mar 09 09:40:38 crc kubenswrapper[4792]: I0309 09:40:38.265125 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-qqqgl" Mar 09 09:40:38 crc kubenswrapper[4792]: I0309 09:40:38.452553 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-qqqgl"] Mar 09 09:40:38 crc kubenswrapper[4792]: I0309 09:40:38.594033 4792 scope.go:117] "RemoveContainer" containerID="833acc7a1c95bf382ddc6bf917ea00d42caa207f55f8a92ccc8390d630b3e9a2" Mar 09 09:40:38 crc kubenswrapper[4792]: I0309 09:40:38.636499 4792 scope.go:117] "RemoveContainer" containerID="5b97c962d6c8dfe919ecfc90faca3416e3dd5be66b006567cd5445fa68fc9ed2" Mar 09 09:40:39 crc kubenswrapper[4792]: I0309 09:40:39.965747 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-qqqgl" podUID="fa90a757-6de5-4da7-876c-b4ed527602c6" containerName="registry-server" containerID="cri-o://2060b72bae44a41f8bc31a870035de90f8d16ad0697a44ee3535196e57ff53ee" gracePeriod=2 Mar 09 09:40:40 crc kubenswrapper[4792]: I0309 09:40:40.389656 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-qqqgl" Mar 09 09:40:40 crc kubenswrapper[4792]: I0309 09:40:40.590874 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2xsbm\" (UniqueName: \"kubernetes.io/projected/fa90a757-6de5-4da7-876c-b4ed527602c6-kube-api-access-2xsbm\") pod \"fa90a757-6de5-4da7-876c-b4ed527602c6\" (UID: \"fa90a757-6de5-4da7-876c-b4ed527602c6\") " Mar 09 09:40:40 crc kubenswrapper[4792]: I0309 09:40:40.591241 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa90a757-6de5-4da7-876c-b4ed527602c6-catalog-content\") pod \"fa90a757-6de5-4da7-876c-b4ed527602c6\" (UID: \"fa90a757-6de5-4da7-876c-b4ed527602c6\") " Mar 09 09:40:40 crc kubenswrapper[4792]: I0309 09:40:40.591376 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa90a757-6de5-4da7-876c-b4ed527602c6-utilities\") pod \"fa90a757-6de5-4da7-876c-b4ed527602c6\" (UID: \"fa90a757-6de5-4da7-876c-b4ed527602c6\") " Mar 09 09:40:40 crc kubenswrapper[4792]: I0309 09:40:40.592561 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fa90a757-6de5-4da7-876c-b4ed527602c6-utilities" (OuterVolumeSpecName: "utilities") pod "fa90a757-6de5-4da7-876c-b4ed527602c6" (UID: "fa90a757-6de5-4da7-876c-b4ed527602c6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:40:40 crc kubenswrapper[4792]: I0309 09:40:40.597712 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa90a757-6de5-4da7-876c-b4ed527602c6-kube-api-access-2xsbm" (OuterVolumeSpecName: "kube-api-access-2xsbm") pod "fa90a757-6de5-4da7-876c-b4ed527602c6" (UID: "fa90a757-6de5-4da7-876c-b4ed527602c6"). InnerVolumeSpecName "kube-api-access-2xsbm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:40:40 crc kubenswrapper[4792]: I0309 09:40:40.694691 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2xsbm\" (UniqueName: \"kubernetes.io/projected/fa90a757-6de5-4da7-876c-b4ed527602c6-kube-api-access-2xsbm\") on node \"crc\" DevicePath \"\"" Mar 09 09:40:40 crc kubenswrapper[4792]: I0309 09:40:40.694734 4792 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa90a757-6de5-4da7-876c-b4ed527602c6-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 09:40:40 crc kubenswrapper[4792]: I0309 09:40:40.742716 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fa90a757-6de5-4da7-876c-b4ed527602c6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fa90a757-6de5-4da7-876c-b4ed527602c6" (UID: "fa90a757-6de5-4da7-876c-b4ed527602c6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:40:40 crc kubenswrapper[4792]: I0309 09:40:40.795711 4792 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa90a757-6de5-4da7-876c-b4ed527602c6-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 09:40:40 crc kubenswrapper[4792]: I0309 09:40:40.974058 4792 generic.go:334] "Generic (PLEG): container finished" podID="fa90a757-6de5-4da7-876c-b4ed527602c6" containerID="2060b72bae44a41f8bc31a870035de90f8d16ad0697a44ee3535196e57ff53ee" exitCode=0 Mar 09 09:40:40 crc kubenswrapper[4792]: I0309 09:40:40.974113 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qqqgl" event={"ID":"fa90a757-6de5-4da7-876c-b4ed527602c6","Type":"ContainerDied","Data":"2060b72bae44a41f8bc31a870035de90f8d16ad0697a44ee3535196e57ff53ee"} Mar 09 09:40:40 crc kubenswrapper[4792]: I0309 09:40:40.974146 4792 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-qqqgl" event={"ID":"fa90a757-6de5-4da7-876c-b4ed527602c6","Type":"ContainerDied","Data":"45b0c997afe89c7a8c9809d89b17b37af556fb57f9c73dedfc23287f9255f5af"} Mar 09 09:40:40 crc kubenswrapper[4792]: I0309 09:40:40.974165 4792 scope.go:117] "RemoveContainer" containerID="2060b72bae44a41f8bc31a870035de90f8d16ad0697a44ee3535196e57ff53ee" Mar 09 09:40:40 crc kubenswrapper[4792]: I0309 09:40:40.974173 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qqqgl" Mar 09 09:40:40 crc kubenswrapper[4792]: I0309 09:40:40.989494 4792 scope.go:117] "RemoveContainer" containerID="99621d0898b35df010dc5dd5153370dbb65a518cc5c1f546513d057c9bbc8b7f" Mar 09 09:40:41 crc kubenswrapper[4792]: I0309 09:40:41.016768 4792 scope.go:117] "RemoveContainer" containerID="ee66ff44186e06e73fffc6f094847d4c53018050fbe609e44f9c7badfc1beb48" Mar 09 09:40:41 crc kubenswrapper[4792]: I0309 09:40:41.016787 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-qqqgl"] Mar 09 09:40:41 crc kubenswrapper[4792]: I0309 09:40:41.020929 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-qqqgl"] Mar 09 09:40:41 crc kubenswrapper[4792]: I0309 09:40:41.075020 4792 scope.go:117] "RemoveContainer" containerID="2060b72bae44a41f8bc31a870035de90f8d16ad0697a44ee3535196e57ff53ee" Mar 09 09:40:41 crc kubenswrapper[4792]: E0309 09:40:41.075470 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2060b72bae44a41f8bc31a870035de90f8d16ad0697a44ee3535196e57ff53ee\": container with ID starting with 2060b72bae44a41f8bc31a870035de90f8d16ad0697a44ee3535196e57ff53ee not found: ID does not exist" containerID="2060b72bae44a41f8bc31a870035de90f8d16ad0697a44ee3535196e57ff53ee" Mar 09 09:40:41 crc kubenswrapper[4792]: I0309 09:40:41.075502 4792 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2060b72bae44a41f8bc31a870035de90f8d16ad0697a44ee3535196e57ff53ee"} err="failed to get container status \"2060b72bae44a41f8bc31a870035de90f8d16ad0697a44ee3535196e57ff53ee\": rpc error: code = NotFound desc = could not find container \"2060b72bae44a41f8bc31a870035de90f8d16ad0697a44ee3535196e57ff53ee\": container with ID starting with 2060b72bae44a41f8bc31a870035de90f8d16ad0697a44ee3535196e57ff53ee not found: ID does not exist" Mar 09 09:40:41 crc kubenswrapper[4792]: I0309 09:40:41.075525 4792 scope.go:117] "RemoveContainer" containerID="99621d0898b35df010dc5dd5153370dbb65a518cc5c1f546513d057c9bbc8b7f" Mar 09 09:40:41 crc kubenswrapper[4792]: E0309 09:40:41.076283 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"99621d0898b35df010dc5dd5153370dbb65a518cc5c1f546513d057c9bbc8b7f\": container with ID starting with 99621d0898b35df010dc5dd5153370dbb65a518cc5c1f546513d057c9bbc8b7f not found: ID does not exist" containerID="99621d0898b35df010dc5dd5153370dbb65a518cc5c1f546513d057c9bbc8b7f" Mar 09 09:40:41 crc kubenswrapper[4792]: I0309 09:40:41.076346 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"99621d0898b35df010dc5dd5153370dbb65a518cc5c1f546513d057c9bbc8b7f"} err="failed to get container status \"99621d0898b35df010dc5dd5153370dbb65a518cc5c1f546513d057c9bbc8b7f\": rpc error: code = NotFound desc = could not find container \"99621d0898b35df010dc5dd5153370dbb65a518cc5c1f546513d057c9bbc8b7f\": container with ID starting with 99621d0898b35df010dc5dd5153370dbb65a518cc5c1f546513d057c9bbc8b7f not found: ID does not exist" Mar 09 09:40:41 crc kubenswrapper[4792]: I0309 09:40:41.076373 4792 scope.go:117] "RemoveContainer" containerID="ee66ff44186e06e73fffc6f094847d4c53018050fbe609e44f9c7badfc1beb48" Mar 09 09:40:41 crc kubenswrapper[4792]: E0309 
09:40:41.076757 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ee66ff44186e06e73fffc6f094847d4c53018050fbe609e44f9c7badfc1beb48\": container with ID starting with ee66ff44186e06e73fffc6f094847d4c53018050fbe609e44f9c7badfc1beb48 not found: ID does not exist" containerID="ee66ff44186e06e73fffc6f094847d4c53018050fbe609e44f9c7badfc1beb48" Mar 09 09:40:41 crc kubenswrapper[4792]: I0309 09:40:41.076779 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee66ff44186e06e73fffc6f094847d4c53018050fbe609e44f9c7badfc1beb48"} err="failed to get container status \"ee66ff44186e06e73fffc6f094847d4c53018050fbe609e44f9c7badfc1beb48\": rpc error: code = NotFound desc = could not find container \"ee66ff44186e06e73fffc6f094847d4c53018050fbe609e44f9c7badfc1beb48\": container with ID starting with ee66ff44186e06e73fffc6f094847d4c53018050fbe609e44f9c7badfc1beb48 not found: ID does not exist" Mar 09 09:40:41 crc kubenswrapper[4792]: I0309 09:40:41.672627 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fa90a757-6de5-4da7-876c-b4ed527602c6" path="/var/lib/kubelet/pods/fa90a757-6de5-4da7-876c-b4ed527602c6/volumes" Mar 09 09:41:13 crc kubenswrapper[4792]: I0309 09:41:13.213983 4792 patch_prober.go:28] interesting pod/machine-config-daemon-97tth container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 09:41:13 crc kubenswrapper[4792]: I0309 09:41:13.214584 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Mar 09 09:41:43 crc kubenswrapper[4792]: I0309 09:41:43.214920 4792 patch_prober.go:28] interesting pod/machine-config-daemon-97tth container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 09:41:43 crc kubenswrapper[4792]: I0309 09:41:43.215530 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 09:42:00 crc kubenswrapper[4792]: I0309 09:42:00.144483 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550822-57bjc"] Mar 09 09:42:00 crc kubenswrapper[4792]: E0309 09:42:00.146957 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa90a757-6de5-4da7-876c-b4ed527602c6" containerName="extract-utilities" Mar 09 09:42:00 crc kubenswrapper[4792]: I0309 09:42:00.147092 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa90a757-6de5-4da7-876c-b4ed527602c6" containerName="extract-utilities" Mar 09 09:42:00 crc kubenswrapper[4792]: E0309 09:42:00.147190 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa90a757-6de5-4da7-876c-b4ed527602c6" containerName="extract-content" Mar 09 09:42:00 crc kubenswrapper[4792]: I0309 09:42:00.147262 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa90a757-6de5-4da7-876c-b4ed527602c6" containerName="extract-content" Mar 09 09:42:00 crc kubenswrapper[4792]: E0309 09:42:00.147349 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa90a757-6de5-4da7-876c-b4ed527602c6" containerName="registry-server" Mar 09 09:42:00 crc kubenswrapper[4792]: I0309 09:42:00.147425 4792 
state_mem.go:107] "Deleted CPUSet assignment" podUID="fa90a757-6de5-4da7-876c-b4ed527602c6" containerName="registry-server" Mar 09 09:42:00 crc kubenswrapper[4792]: I0309 09:42:00.147715 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa90a757-6de5-4da7-876c-b4ed527602c6" containerName="registry-server" Mar 09 09:42:00 crc kubenswrapper[4792]: I0309 09:42:00.148543 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550822-57bjc" Mar 09 09:42:00 crc kubenswrapper[4792]: I0309 09:42:00.151562 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fwclj" Mar 09 09:42:00 crc kubenswrapper[4792]: I0309 09:42:00.151722 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 09:42:00 crc kubenswrapper[4792]: I0309 09:42:00.153871 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550822-57bjc"] Mar 09 09:42:00 crc kubenswrapper[4792]: I0309 09:42:00.155314 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 09:42:00 crc kubenswrapper[4792]: I0309 09:42:00.250466 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bjcq5\" (UniqueName: \"kubernetes.io/projected/91a30492-5834-4ceb-88c2-24fb04d95752-kube-api-access-bjcq5\") pod \"auto-csr-approver-29550822-57bjc\" (UID: \"91a30492-5834-4ceb-88c2-24fb04d95752\") " pod="openshift-infra/auto-csr-approver-29550822-57bjc" Mar 09 09:42:00 crc kubenswrapper[4792]: I0309 09:42:00.353102 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bjcq5\" (UniqueName: \"kubernetes.io/projected/91a30492-5834-4ceb-88c2-24fb04d95752-kube-api-access-bjcq5\") pod \"auto-csr-approver-29550822-57bjc\" (UID: 
\"91a30492-5834-4ceb-88c2-24fb04d95752\") " pod="openshift-infra/auto-csr-approver-29550822-57bjc" Mar 09 09:42:00 crc kubenswrapper[4792]: I0309 09:42:00.374896 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bjcq5\" (UniqueName: \"kubernetes.io/projected/91a30492-5834-4ceb-88c2-24fb04d95752-kube-api-access-bjcq5\") pod \"auto-csr-approver-29550822-57bjc\" (UID: \"91a30492-5834-4ceb-88c2-24fb04d95752\") " pod="openshift-infra/auto-csr-approver-29550822-57bjc" Mar 09 09:42:00 crc kubenswrapper[4792]: I0309 09:42:00.472809 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550822-57bjc" Mar 09 09:42:00 crc kubenswrapper[4792]: I0309 09:42:00.891503 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550822-57bjc"] Mar 09 09:42:00 crc kubenswrapper[4792]: I0309 09:42:00.905060 4792 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 09 09:42:01 crc kubenswrapper[4792]: I0309 09:42:01.613698 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550822-57bjc" event={"ID":"91a30492-5834-4ceb-88c2-24fb04d95752","Type":"ContainerStarted","Data":"4d19c16b1a9da8393b9035863c4bc9da747dcb8b3fb6df25dae81b8bbf99e87f"} Mar 09 09:42:02 crc kubenswrapper[4792]: I0309 09:42:02.622804 4792 generic.go:334] "Generic (PLEG): container finished" podID="91a30492-5834-4ceb-88c2-24fb04d95752" containerID="2bbfbd384276a37a6f469736a325d0cd09b308faf21b64aff8876d3792f96374" exitCode=0 Mar 09 09:42:02 crc kubenswrapper[4792]: I0309 09:42:02.622846 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550822-57bjc" event={"ID":"91a30492-5834-4ceb-88c2-24fb04d95752","Type":"ContainerDied","Data":"2bbfbd384276a37a6f469736a325d0cd09b308faf21b64aff8876d3792f96374"} Mar 09 09:42:03 crc kubenswrapper[4792]: I0309 
09:42:03.924684 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550822-57bjc" Mar 09 09:42:04 crc kubenswrapper[4792]: I0309 09:42:04.115099 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bjcq5\" (UniqueName: \"kubernetes.io/projected/91a30492-5834-4ceb-88c2-24fb04d95752-kube-api-access-bjcq5\") pod \"91a30492-5834-4ceb-88c2-24fb04d95752\" (UID: \"91a30492-5834-4ceb-88c2-24fb04d95752\") " Mar 09 09:42:04 crc kubenswrapper[4792]: I0309 09:42:04.120010 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/91a30492-5834-4ceb-88c2-24fb04d95752-kube-api-access-bjcq5" (OuterVolumeSpecName: "kube-api-access-bjcq5") pod "91a30492-5834-4ceb-88c2-24fb04d95752" (UID: "91a30492-5834-4ceb-88c2-24fb04d95752"). InnerVolumeSpecName "kube-api-access-bjcq5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:42:04 crc kubenswrapper[4792]: I0309 09:42:04.217593 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bjcq5\" (UniqueName: \"kubernetes.io/projected/91a30492-5834-4ceb-88c2-24fb04d95752-kube-api-access-bjcq5\") on node \"crc\" DevicePath \"\"" Mar 09 09:42:04 crc kubenswrapper[4792]: I0309 09:42:04.639201 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550822-57bjc" event={"ID":"91a30492-5834-4ceb-88c2-24fb04d95752","Type":"ContainerDied","Data":"4d19c16b1a9da8393b9035863c4bc9da747dcb8b3fb6df25dae81b8bbf99e87f"} Mar 09 09:42:04 crc kubenswrapper[4792]: I0309 09:42:04.639288 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4d19c16b1a9da8393b9035863c4bc9da747dcb8b3fb6df25dae81b8bbf99e87f" Mar 09 09:42:04 crc kubenswrapper[4792]: I0309 09:42:04.639348 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550822-57bjc" Mar 09 09:42:04 crc kubenswrapper[4792]: E0309 09:42:04.777194 4792 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod91a30492_5834_4ceb_88c2_24fb04d95752.slice\": RecentStats: unable to find data in memory cache]" Mar 09 09:42:04 crc kubenswrapper[4792]: I0309 09:42:04.991692 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550816-4qnsg"] Mar 09 09:42:05 crc kubenswrapper[4792]: I0309 09:42:05.001662 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550816-4qnsg"] Mar 09 09:42:05 crc kubenswrapper[4792]: I0309 09:42:05.676961 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d07d2f67-727f-4c79-85c1-f24a90dd73e5" path="/var/lib/kubelet/pods/d07d2f67-727f-4c79-85c1-f24a90dd73e5/volumes" Mar 09 09:42:13 crc kubenswrapper[4792]: I0309 09:42:13.213772 4792 patch_prober.go:28] interesting pod/machine-config-daemon-97tth container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 09:42:13 crc kubenswrapper[4792]: I0309 09:42:13.214305 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 09:42:13 crc kubenswrapper[4792]: I0309 09:42:13.214352 4792 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-97tth" Mar 09 09:42:13 crc 
kubenswrapper[4792]: I0309 09:42:13.215061 4792 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6f5e8f72786750c34390c3921b89a9937cb9dce363f8ebe7772b408843cd4da4"} pod="openshift-machine-config-operator/machine-config-daemon-97tth" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 09 09:42:13 crc kubenswrapper[4792]: I0309 09:42:13.215139 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" containerName="machine-config-daemon" containerID="cri-o://6f5e8f72786750c34390c3921b89a9937cb9dce363f8ebe7772b408843cd4da4" gracePeriod=600 Mar 09 09:42:13 crc kubenswrapper[4792]: I0309 09:42:13.738975 4792 generic.go:334] "Generic (PLEG): container finished" podID="bd11045a-d746-4b42-872c-8b8d1dd2d515" containerID="6f5e8f72786750c34390c3921b89a9937cb9dce363f8ebe7772b408843cd4da4" exitCode=0 Mar 09 09:42:13 crc kubenswrapper[4792]: I0309 09:42:13.739060 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-97tth" event={"ID":"bd11045a-d746-4b42-872c-8b8d1dd2d515","Type":"ContainerDied","Data":"6f5e8f72786750c34390c3921b89a9937cb9dce363f8ebe7772b408843cd4da4"} Mar 09 09:42:13 crc kubenswrapper[4792]: I0309 09:42:13.739583 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-97tth" event={"ID":"bd11045a-d746-4b42-872c-8b8d1dd2d515","Type":"ContainerStarted","Data":"24d9d0dcf9edc8581a732f09072e639940f290776165f66acae7f86c2095d368"} Mar 09 09:42:13 crc kubenswrapper[4792]: I0309 09:42:13.739606 4792 scope.go:117] "RemoveContainer" containerID="f7dc0a0360eecc4b3d3577ec00ef1a00d7c8c9a5f8b5910ab6ece3541b62af92" Mar 09 09:42:38 crc kubenswrapper[4792]: I0309 09:42:38.769573 4792 scope.go:117] 
"RemoveContainer" containerID="dabbbeb457dc928eb9c1c64fecfffb01b1a1347ba11a1fc7130704452148f72d" Mar 09 09:42:40 crc kubenswrapper[4792]: I0309 09:42:40.289059 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-8wfq2"] Mar 09 09:42:40 crc kubenswrapper[4792]: E0309 09:42:40.289739 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91a30492-5834-4ceb-88c2-24fb04d95752" containerName="oc" Mar 09 09:42:40 crc kubenswrapper[4792]: I0309 09:42:40.289757 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="91a30492-5834-4ceb-88c2-24fb04d95752" containerName="oc" Mar 09 09:42:40 crc kubenswrapper[4792]: I0309 09:42:40.290023 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="91a30492-5834-4ceb-88c2-24fb04d95752" containerName="oc" Mar 09 09:42:40 crc kubenswrapper[4792]: I0309 09:42:40.291643 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8wfq2" Mar 09 09:42:40 crc kubenswrapper[4792]: I0309 09:42:40.303332 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8wfq2"] Mar 09 09:42:40 crc kubenswrapper[4792]: I0309 09:42:40.402545 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/30a80ee8-88bf-4139-a8e8-bf9e027fd396-utilities\") pod \"certified-operators-8wfq2\" (UID: \"30a80ee8-88bf-4139-a8e8-bf9e027fd396\") " pod="openshift-marketplace/certified-operators-8wfq2" Mar 09 09:42:40 crc kubenswrapper[4792]: I0309 09:42:40.402757 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/30a80ee8-88bf-4139-a8e8-bf9e027fd396-catalog-content\") pod \"certified-operators-8wfq2\" (UID: \"30a80ee8-88bf-4139-a8e8-bf9e027fd396\") " 
pod="openshift-marketplace/certified-operators-8wfq2" Mar 09 09:42:40 crc kubenswrapper[4792]: I0309 09:42:40.402786 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t9ztj\" (UniqueName: \"kubernetes.io/projected/30a80ee8-88bf-4139-a8e8-bf9e027fd396-kube-api-access-t9ztj\") pod \"certified-operators-8wfq2\" (UID: \"30a80ee8-88bf-4139-a8e8-bf9e027fd396\") " pod="openshift-marketplace/certified-operators-8wfq2" Mar 09 09:42:40 crc kubenswrapper[4792]: I0309 09:42:40.504518 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/30a80ee8-88bf-4139-a8e8-bf9e027fd396-utilities\") pod \"certified-operators-8wfq2\" (UID: \"30a80ee8-88bf-4139-a8e8-bf9e027fd396\") " pod="openshift-marketplace/certified-operators-8wfq2" Mar 09 09:42:40 crc kubenswrapper[4792]: I0309 09:42:40.504614 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/30a80ee8-88bf-4139-a8e8-bf9e027fd396-catalog-content\") pod \"certified-operators-8wfq2\" (UID: \"30a80ee8-88bf-4139-a8e8-bf9e027fd396\") " pod="openshift-marketplace/certified-operators-8wfq2" Mar 09 09:42:40 crc kubenswrapper[4792]: I0309 09:42:40.504634 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t9ztj\" (UniqueName: \"kubernetes.io/projected/30a80ee8-88bf-4139-a8e8-bf9e027fd396-kube-api-access-t9ztj\") pod \"certified-operators-8wfq2\" (UID: \"30a80ee8-88bf-4139-a8e8-bf9e027fd396\") " pod="openshift-marketplace/certified-operators-8wfq2" Mar 09 09:42:40 crc kubenswrapper[4792]: I0309 09:42:40.505113 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/30a80ee8-88bf-4139-a8e8-bf9e027fd396-utilities\") pod \"certified-operators-8wfq2\" (UID: \"30a80ee8-88bf-4139-a8e8-bf9e027fd396\") " 
pod="openshift-marketplace/certified-operators-8wfq2" Mar 09 09:42:40 crc kubenswrapper[4792]: I0309 09:42:40.505190 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/30a80ee8-88bf-4139-a8e8-bf9e027fd396-catalog-content\") pod \"certified-operators-8wfq2\" (UID: \"30a80ee8-88bf-4139-a8e8-bf9e027fd396\") " pod="openshift-marketplace/certified-operators-8wfq2" Mar 09 09:42:40 crc kubenswrapper[4792]: I0309 09:42:40.524784 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t9ztj\" (UniqueName: \"kubernetes.io/projected/30a80ee8-88bf-4139-a8e8-bf9e027fd396-kube-api-access-t9ztj\") pod \"certified-operators-8wfq2\" (UID: \"30a80ee8-88bf-4139-a8e8-bf9e027fd396\") " pod="openshift-marketplace/certified-operators-8wfq2" Mar 09 09:42:40 crc kubenswrapper[4792]: I0309 09:42:40.609928 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8wfq2" Mar 09 09:42:41 crc kubenswrapper[4792]: I0309 09:42:41.205266 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8wfq2"] Mar 09 09:42:42 crc kubenswrapper[4792]: I0309 09:42:42.005280 4792 generic.go:334] "Generic (PLEG): container finished" podID="30a80ee8-88bf-4139-a8e8-bf9e027fd396" containerID="8fe6bfb6915cd3c2cf00bfde8acec8261d78d667f36297e4d1c39f3b0e005990" exitCode=0 Mar 09 09:42:42 crc kubenswrapper[4792]: I0309 09:42:42.005663 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8wfq2" event={"ID":"30a80ee8-88bf-4139-a8e8-bf9e027fd396","Type":"ContainerDied","Data":"8fe6bfb6915cd3c2cf00bfde8acec8261d78d667f36297e4d1c39f3b0e005990"} Mar 09 09:42:42 crc kubenswrapper[4792]: I0309 09:42:42.005698 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8wfq2" 
event={"ID":"30a80ee8-88bf-4139-a8e8-bf9e027fd396","Type":"ContainerStarted","Data":"afbd3beb1ed1b6e5d8ff3b670996d8344d04331e57923e7532c5c9fc6ce4981e"} Mar 09 09:42:44 crc kubenswrapper[4792]: I0309 09:42:44.023771 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8wfq2" event={"ID":"30a80ee8-88bf-4139-a8e8-bf9e027fd396","Type":"ContainerStarted","Data":"ba7ba386a0d73e49a92ce28eca011dd30d460a3d28a56e6c7fd754d6ddb09a31"} Mar 09 09:42:46 crc kubenswrapper[4792]: I0309 09:42:46.047438 4792 generic.go:334] "Generic (PLEG): container finished" podID="30a80ee8-88bf-4139-a8e8-bf9e027fd396" containerID="ba7ba386a0d73e49a92ce28eca011dd30d460a3d28a56e6c7fd754d6ddb09a31" exitCode=0 Mar 09 09:42:46 crc kubenswrapper[4792]: I0309 09:42:46.047514 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8wfq2" event={"ID":"30a80ee8-88bf-4139-a8e8-bf9e027fd396","Type":"ContainerDied","Data":"ba7ba386a0d73e49a92ce28eca011dd30d460a3d28a56e6c7fd754d6ddb09a31"} Mar 09 09:42:47 crc kubenswrapper[4792]: I0309 09:42:47.060147 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8wfq2" event={"ID":"30a80ee8-88bf-4139-a8e8-bf9e027fd396","Type":"ContainerStarted","Data":"46dda2164aa0ecb6c3d3aafcc3123f68d8df94f574051061e065593e0f1a9aa7"} Mar 09 09:42:47 crc kubenswrapper[4792]: I0309 09:42:47.085536 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-8wfq2" podStartSLOduration=2.630566247 podStartE2EDuration="7.085516404s" podCreationTimestamp="2026-03-09 09:42:40 +0000 UTC" firstStartedPulling="2026-03-09 09:42:42.008177346 +0000 UTC m=+2127.038378098" lastFinishedPulling="2026-03-09 09:42:46.463127503 +0000 UTC m=+2131.493328255" observedRunningTime="2026-03-09 09:42:47.082034874 +0000 UTC m=+2132.112235656" watchObservedRunningTime="2026-03-09 09:42:47.085516404 +0000 UTC 
m=+2132.115717156" Mar 09 09:42:50 crc kubenswrapper[4792]: I0309 09:42:50.610286 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-8wfq2" Mar 09 09:42:50 crc kubenswrapper[4792]: I0309 09:42:50.611371 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-8wfq2" Mar 09 09:42:50 crc kubenswrapper[4792]: I0309 09:42:50.660684 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-8wfq2" Mar 09 09:42:51 crc kubenswrapper[4792]: I0309 09:42:51.134327 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-8wfq2" Mar 09 09:42:51 crc kubenswrapper[4792]: I0309 09:42:51.181779 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8wfq2"] Mar 09 09:42:53 crc kubenswrapper[4792]: I0309 09:42:53.104323 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-8wfq2" podUID="30a80ee8-88bf-4139-a8e8-bf9e027fd396" containerName="registry-server" containerID="cri-o://46dda2164aa0ecb6c3d3aafcc3123f68d8df94f574051061e065593e0f1a9aa7" gracePeriod=2 Mar 09 09:42:53 crc kubenswrapper[4792]: I0309 09:42:53.719245 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8wfq2" Mar 09 09:42:53 crc kubenswrapper[4792]: I0309 09:42:53.893180 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t9ztj\" (UniqueName: \"kubernetes.io/projected/30a80ee8-88bf-4139-a8e8-bf9e027fd396-kube-api-access-t9ztj\") pod \"30a80ee8-88bf-4139-a8e8-bf9e027fd396\" (UID: \"30a80ee8-88bf-4139-a8e8-bf9e027fd396\") " Mar 09 09:42:53 crc kubenswrapper[4792]: I0309 09:42:53.893528 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/30a80ee8-88bf-4139-a8e8-bf9e027fd396-catalog-content\") pod \"30a80ee8-88bf-4139-a8e8-bf9e027fd396\" (UID: \"30a80ee8-88bf-4139-a8e8-bf9e027fd396\") " Mar 09 09:42:53 crc kubenswrapper[4792]: I0309 09:42:53.893692 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/30a80ee8-88bf-4139-a8e8-bf9e027fd396-utilities\") pod \"30a80ee8-88bf-4139-a8e8-bf9e027fd396\" (UID: \"30a80ee8-88bf-4139-a8e8-bf9e027fd396\") " Mar 09 09:42:53 crc kubenswrapper[4792]: I0309 09:42:53.894918 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/30a80ee8-88bf-4139-a8e8-bf9e027fd396-utilities" (OuterVolumeSpecName: "utilities") pod "30a80ee8-88bf-4139-a8e8-bf9e027fd396" (UID: "30a80ee8-88bf-4139-a8e8-bf9e027fd396"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:42:53 crc kubenswrapper[4792]: I0309 09:42:53.900833 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30a80ee8-88bf-4139-a8e8-bf9e027fd396-kube-api-access-t9ztj" (OuterVolumeSpecName: "kube-api-access-t9ztj") pod "30a80ee8-88bf-4139-a8e8-bf9e027fd396" (UID: "30a80ee8-88bf-4139-a8e8-bf9e027fd396"). InnerVolumeSpecName "kube-api-access-t9ztj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:42:53 crc kubenswrapper[4792]: I0309 09:42:53.957865 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/30a80ee8-88bf-4139-a8e8-bf9e027fd396-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "30a80ee8-88bf-4139-a8e8-bf9e027fd396" (UID: "30a80ee8-88bf-4139-a8e8-bf9e027fd396"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:42:53 crc kubenswrapper[4792]: I0309 09:42:53.995885 4792 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/30a80ee8-88bf-4139-a8e8-bf9e027fd396-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 09:42:53 crc kubenswrapper[4792]: I0309 09:42:53.995920 4792 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/30a80ee8-88bf-4139-a8e8-bf9e027fd396-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 09:42:53 crc kubenswrapper[4792]: I0309 09:42:53.995930 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t9ztj\" (UniqueName: \"kubernetes.io/projected/30a80ee8-88bf-4139-a8e8-bf9e027fd396-kube-api-access-t9ztj\") on node \"crc\" DevicePath \"\"" Mar 09 09:42:54 crc kubenswrapper[4792]: I0309 09:42:54.115296 4792 generic.go:334] "Generic (PLEG): container finished" podID="30a80ee8-88bf-4139-a8e8-bf9e027fd396" containerID="46dda2164aa0ecb6c3d3aafcc3123f68d8df94f574051061e065593e0f1a9aa7" exitCode=0 Mar 09 09:42:54 crc kubenswrapper[4792]: I0309 09:42:54.115345 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8wfq2" event={"ID":"30a80ee8-88bf-4139-a8e8-bf9e027fd396","Type":"ContainerDied","Data":"46dda2164aa0ecb6c3d3aafcc3123f68d8df94f574051061e065593e0f1a9aa7"} Mar 09 09:42:54 crc kubenswrapper[4792]: I0309 09:42:54.115378 4792 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-8wfq2" event={"ID":"30a80ee8-88bf-4139-a8e8-bf9e027fd396","Type":"ContainerDied","Data":"afbd3beb1ed1b6e5d8ff3b670996d8344d04331e57923e7532c5c9fc6ce4981e"} Mar 09 09:42:54 crc kubenswrapper[4792]: I0309 09:42:54.115401 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8wfq2" Mar 09 09:42:54 crc kubenswrapper[4792]: I0309 09:42:54.115411 4792 scope.go:117] "RemoveContainer" containerID="46dda2164aa0ecb6c3d3aafcc3123f68d8df94f574051061e065593e0f1a9aa7" Mar 09 09:42:54 crc kubenswrapper[4792]: I0309 09:42:54.135543 4792 scope.go:117] "RemoveContainer" containerID="ba7ba386a0d73e49a92ce28eca011dd30d460a3d28a56e6c7fd754d6ddb09a31" Mar 09 09:42:54 crc kubenswrapper[4792]: I0309 09:42:54.151036 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8wfq2"] Mar 09 09:42:54 crc kubenswrapper[4792]: I0309 09:42:54.162528 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-8wfq2"] Mar 09 09:42:54 crc kubenswrapper[4792]: I0309 09:42:54.181229 4792 scope.go:117] "RemoveContainer" containerID="8fe6bfb6915cd3c2cf00bfde8acec8261d78d667f36297e4d1c39f3b0e005990" Mar 09 09:42:54 crc kubenswrapper[4792]: I0309 09:42:54.208096 4792 scope.go:117] "RemoveContainer" containerID="46dda2164aa0ecb6c3d3aafcc3123f68d8df94f574051061e065593e0f1a9aa7" Mar 09 09:42:54 crc kubenswrapper[4792]: E0309 09:42:54.208526 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"46dda2164aa0ecb6c3d3aafcc3123f68d8df94f574051061e065593e0f1a9aa7\": container with ID starting with 46dda2164aa0ecb6c3d3aafcc3123f68d8df94f574051061e065593e0f1a9aa7 not found: ID does not exist" containerID="46dda2164aa0ecb6c3d3aafcc3123f68d8df94f574051061e065593e0f1a9aa7" Mar 09 09:42:54 crc kubenswrapper[4792]: I0309 
09:42:54.208560 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"46dda2164aa0ecb6c3d3aafcc3123f68d8df94f574051061e065593e0f1a9aa7"} err="failed to get container status \"46dda2164aa0ecb6c3d3aafcc3123f68d8df94f574051061e065593e0f1a9aa7\": rpc error: code = NotFound desc = could not find container \"46dda2164aa0ecb6c3d3aafcc3123f68d8df94f574051061e065593e0f1a9aa7\": container with ID starting with 46dda2164aa0ecb6c3d3aafcc3123f68d8df94f574051061e065593e0f1a9aa7 not found: ID does not exist" Mar 09 09:42:54 crc kubenswrapper[4792]: I0309 09:42:54.208581 4792 scope.go:117] "RemoveContainer" containerID="ba7ba386a0d73e49a92ce28eca011dd30d460a3d28a56e6c7fd754d6ddb09a31" Mar 09 09:42:54 crc kubenswrapper[4792]: E0309 09:42:54.208886 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ba7ba386a0d73e49a92ce28eca011dd30d460a3d28a56e6c7fd754d6ddb09a31\": container with ID starting with ba7ba386a0d73e49a92ce28eca011dd30d460a3d28a56e6c7fd754d6ddb09a31 not found: ID does not exist" containerID="ba7ba386a0d73e49a92ce28eca011dd30d460a3d28a56e6c7fd754d6ddb09a31" Mar 09 09:42:54 crc kubenswrapper[4792]: I0309 09:42:54.208927 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ba7ba386a0d73e49a92ce28eca011dd30d460a3d28a56e6c7fd754d6ddb09a31"} err="failed to get container status \"ba7ba386a0d73e49a92ce28eca011dd30d460a3d28a56e6c7fd754d6ddb09a31\": rpc error: code = NotFound desc = could not find container \"ba7ba386a0d73e49a92ce28eca011dd30d460a3d28a56e6c7fd754d6ddb09a31\": container with ID starting with ba7ba386a0d73e49a92ce28eca011dd30d460a3d28a56e6c7fd754d6ddb09a31 not found: ID does not exist" Mar 09 09:42:54 crc kubenswrapper[4792]: I0309 09:42:54.208953 4792 scope.go:117] "RemoveContainer" containerID="8fe6bfb6915cd3c2cf00bfde8acec8261d78d667f36297e4d1c39f3b0e005990" Mar 09 09:42:54 crc 
kubenswrapper[4792]: E0309 09:42:54.209418 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8fe6bfb6915cd3c2cf00bfde8acec8261d78d667f36297e4d1c39f3b0e005990\": container with ID starting with 8fe6bfb6915cd3c2cf00bfde8acec8261d78d667f36297e4d1c39f3b0e005990 not found: ID does not exist" containerID="8fe6bfb6915cd3c2cf00bfde8acec8261d78d667f36297e4d1c39f3b0e005990" Mar 09 09:42:54 crc kubenswrapper[4792]: I0309 09:42:54.209450 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8fe6bfb6915cd3c2cf00bfde8acec8261d78d667f36297e4d1c39f3b0e005990"} err="failed to get container status \"8fe6bfb6915cd3c2cf00bfde8acec8261d78d667f36297e4d1c39f3b0e005990\": rpc error: code = NotFound desc = could not find container \"8fe6bfb6915cd3c2cf00bfde8acec8261d78d667f36297e4d1c39f3b0e005990\": container with ID starting with 8fe6bfb6915cd3c2cf00bfde8acec8261d78d667f36297e4d1c39f3b0e005990 not found: ID does not exist" Mar 09 09:42:55 crc kubenswrapper[4792]: I0309 09:42:55.671232 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="30a80ee8-88bf-4139-a8e8-bf9e027fd396" path="/var/lib/kubelet/pods/30a80ee8-88bf-4139-a8e8-bf9e027fd396/volumes" Mar 09 09:43:00 crc kubenswrapper[4792]: I0309 09:43:00.538843 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-545sv"] Mar 09 09:43:00 crc kubenswrapper[4792]: E0309 09:43:00.539731 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30a80ee8-88bf-4139-a8e8-bf9e027fd396" containerName="extract-content" Mar 09 09:43:00 crc kubenswrapper[4792]: I0309 09:43:00.539745 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="30a80ee8-88bf-4139-a8e8-bf9e027fd396" containerName="extract-content" Mar 09 09:43:00 crc kubenswrapper[4792]: E0309 09:43:00.539772 4792 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="30a80ee8-88bf-4139-a8e8-bf9e027fd396" containerName="registry-server" Mar 09 09:43:00 crc kubenswrapper[4792]: I0309 09:43:00.539778 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="30a80ee8-88bf-4139-a8e8-bf9e027fd396" containerName="registry-server" Mar 09 09:43:00 crc kubenswrapper[4792]: E0309 09:43:00.539792 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30a80ee8-88bf-4139-a8e8-bf9e027fd396" containerName="extract-utilities" Mar 09 09:43:00 crc kubenswrapper[4792]: I0309 09:43:00.539798 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="30a80ee8-88bf-4139-a8e8-bf9e027fd396" containerName="extract-utilities" Mar 09 09:43:00 crc kubenswrapper[4792]: I0309 09:43:00.539989 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="30a80ee8-88bf-4139-a8e8-bf9e027fd396" containerName="registry-server" Mar 09 09:43:00 crc kubenswrapper[4792]: I0309 09:43:00.541157 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-545sv" Mar 09 09:43:00 crc kubenswrapper[4792]: I0309 09:43:00.549046 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-545sv"] Mar 09 09:43:00 crc kubenswrapper[4792]: I0309 09:43:00.714030 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e00fdfe8-7ba5-419c-8329-78f8fae238c5-utilities\") pod \"redhat-marketplace-545sv\" (UID: \"e00fdfe8-7ba5-419c-8329-78f8fae238c5\") " pod="openshift-marketplace/redhat-marketplace-545sv" Mar 09 09:43:00 crc kubenswrapper[4792]: I0309 09:43:00.714176 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e00fdfe8-7ba5-419c-8329-78f8fae238c5-catalog-content\") pod \"redhat-marketplace-545sv\" (UID: \"e00fdfe8-7ba5-419c-8329-78f8fae238c5\") 
" pod="openshift-marketplace/redhat-marketplace-545sv" Mar 09 09:43:00 crc kubenswrapper[4792]: I0309 09:43:00.714206 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b89gm\" (UniqueName: \"kubernetes.io/projected/e00fdfe8-7ba5-419c-8329-78f8fae238c5-kube-api-access-b89gm\") pod \"redhat-marketplace-545sv\" (UID: \"e00fdfe8-7ba5-419c-8329-78f8fae238c5\") " pod="openshift-marketplace/redhat-marketplace-545sv" Mar 09 09:43:00 crc kubenswrapper[4792]: I0309 09:43:00.815960 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e00fdfe8-7ba5-419c-8329-78f8fae238c5-catalog-content\") pod \"redhat-marketplace-545sv\" (UID: \"e00fdfe8-7ba5-419c-8329-78f8fae238c5\") " pod="openshift-marketplace/redhat-marketplace-545sv" Mar 09 09:43:00 crc kubenswrapper[4792]: I0309 09:43:00.816010 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b89gm\" (UniqueName: \"kubernetes.io/projected/e00fdfe8-7ba5-419c-8329-78f8fae238c5-kube-api-access-b89gm\") pod \"redhat-marketplace-545sv\" (UID: \"e00fdfe8-7ba5-419c-8329-78f8fae238c5\") " pod="openshift-marketplace/redhat-marketplace-545sv" Mar 09 09:43:00 crc kubenswrapper[4792]: I0309 09:43:00.816082 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e00fdfe8-7ba5-419c-8329-78f8fae238c5-utilities\") pod \"redhat-marketplace-545sv\" (UID: \"e00fdfe8-7ba5-419c-8329-78f8fae238c5\") " pod="openshift-marketplace/redhat-marketplace-545sv" Mar 09 09:43:00 crc kubenswrapper[4792]: I0309 09:43:00.816564 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e00fdfe8-7ba5-419c-8329-78f8fae238c5-utilities\") pod \"redhat-marketplace-545sv\" (UID: \"e00fdfe8-7ba5-419c-8329-78f8fae238c5\") " 
pod="openshift-marketplace/redhat-marketplace-545sv" Mar 09 09:43:00 crc kubenswrapper[4792]: I0309 09:43:00.816794 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e00fdfe8-7ba5-419c-8329-78f8fae238c5-catalog-content\") pod \"redhat-marketplace-545sv\" (UID: \"e00fdfe8-7ba5-419c-8329-78f8fae238c5\") " pod="openshift-marketplace/redhat-marketplace-545sv" Mar 09 09:43:00 crc kubenswrapper[4792]: I0309 09:43:00.840064 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b89gm\" (UniqueName: \"kubernetes.io/projected/e00fdfe8-7ba5-419c-8329-78f8fae238c5-kube-api-access-b89gm\") pod \"redhat-marketplace-545sv\" (UID: \"e00fdfe8-7ba5-419c-8329-78f8fae238c5\") " pod="openshift-marketplace/redhat-marketplace-545sv" Mar 09 09:43:00 crc kubenswrapper[4792]: I0309 09:43:00.862002 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-545sv" Mar 09 09:43:01 crc kubenswrapper[4792]: I0309 09:43:01.396806 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-545sv"] Mar 09 09:43:02 crc kubenswrapper[4792]: I0309 09:43:02.171963 4792 generic.go:334] "Generic (PLEG): container finished" podID="e00fdfe8-7ba5-419c-8329-78f8fae238c5" containerID="7f2ae84ce44039aec72a9c2b2559e73622ec5cbed48f9df3a63e18865377f986" exitCode=0 Mar 09 09:43:02 crc kubenswrapper[4792]: I0309 09:43:02.172261 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-545sv" event={"ID":"e00fdfe8-7ba5-419c-8329-78f8fae238c5","Type":"ContainerDied","Data":"7f2ae84ce44039aec72a9c2b2559e73622ec5cbed48f9df3a63e18865377f986"} Mar 09 09:43:02 crc kubenswrapper[4792]: I0309 09:43:02.172285 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-545sv" 
event={"ID":"e00fdfe8-7ba5-419c-8329-78f8fae238c5","Type":"ContainerStarted","Data":"74eda2ac33ee85a6a1656b7f5c2768dd64c14796413452f397adcd5de3b8c1d6"} Mar 09 09:43:03 crc kubenswrapper[4792]: I0309 09:43:03.181201 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-545sv" event={"ID":"e00fdfe8-7ba5-419c-8329-78f8fae238c5","Type":"ContainerStarted","Data":"d91601cde8ce064f094f3666b10cae048a8937784bde8dd5f3441a7708df5426"} Mar 09 09:43:04 crc kubenswrapper[4792]: I0309 09:43:04.190001 4792 generic.go:334] "Generic (PLEG): container finished" podID="e00fdfe8-7ba5-419c-8329-78f8fae238c5" containerID="d91601cde8ce064f094f3666b10cae048a8937784bde8dd5f3441a7708df5426" exitCode=0 Mar 09 09:43:04 crc kubenswrapper[4792]: I0309 09:43:04.190104 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-545sv" event={"ID":"e00fdfe8-7ba5-419c-8329-78f8fae238c5","Type":"ContainerDied","Data":"d91601cde8ce064f094f3666b10cae048a8937784bde8dd5f3441a7708df5426"} Mar 09 09:43:05 crc kubenswrapper[4792]: I0309 09:43:05.201358 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-545sv" event={"ID":"e00fdfe8-7ba5-419c-8329-78f8fae238c5","Type":"ContainerStarted","Data":"9764e1f8c36c0573587cc7eeb437447b9ae57d625672ccd242fcfed7aa524442"} Mar 09 09:43:05 crc kubenswrapper[4792]: I0309 09:43:05.228269 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-545sv" podStartSLOduration=2.781565094 podStartE2EDuration="5.228245676s" podCreationTimestamp="2026-03-09 09:43:00 +0000 UTC" firstStartedPulling="2026-03-09 09:43:02.173825027 +0000 UTC m=+2147.204025779" lastFinishedPulling="2026-03-09 09:43:04.620505609 +0000 UTC m=+2149.650706361" observedRunningTime="2026-03-09 09:43:05.220845428 +0000 UTC m=+2150.251046200" watchObservedRunningTime="2026-03-09 09:43:05.228245676 +0000 UTC 
m=+2150.258446418" Mar 09 09:43:10 crc kubenswrapper[4792]: I0309 09:43:10.863021 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-545sv" Mar 09 09:43:10 crc kubenswrapper[4792]: I0309 09:43:10.864103 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-545sv" Mar 09 09:43:10 crc kubenswrapper[4792]: I0309 09:43:10.913952 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-545sv" Mar 09 09:43:11 crc kubenswrapper[4792]: I0309 09:43:11.314610 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-545sv" Mar 09 09:43:11 crc kubenswrapper[4792]: I0309 09:43:11.492793 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-545sv"] Mar 09 09:43:13 crc kubenswrapper[4792]: I0309 09:43:13.291407 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-545sv" podUID="e00fdfe8-7ba5-419c-8329-78f8fae238c5" containerName="registry-server" containerID="cri-o://9764e1f8c36c0573587cc7eeb437447b9ae57d625672ccd242fcfed7aa524442" gracePeriod=2 Mar 09 09:43:13 crc kubenswrapper[4792]: I0309 09:43:13.755036 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-545sv" Mar 09 09:43:13 crc kubenswrapper[4792]: I0309 09:43:13.857964 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e00fdfe8-7ba5-419c-8329-78f8fae238c5-catalog-content\") pod \"e00fdfe8-7ba5-419c-8329-78f8fae238c5\" (UID: \"e00fdfe8-7ba5-419c-8329-78f8fae238c5\") " Mar 09 09:43:13 crc kubenswrapper[4792]: I0309 09:43:13.858029 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e00fdfe8-7ba5-419c-8329-78f8fae238c5-utilities\") pod \"e00fdfe8-7ba5-419c-8329-78f8fae238c5\" (UID: \"e00fdfe8-7ba5-419c-8329-78f8fae238c5\") " Mar 09 09:43:13 crc kubenswrapper[4792]: I0309 09:43:13.858122 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b89gm\" (UniqueName: \"kubernetes.io/projected/e00fdfe8-7ba5-419c-8329-78f8fae238c5-kube-api-access-b89gm\") pod \"e00fdfe8-7ba5-419c-8329-78f8fae238c5\" (UID: \"e00fdfe8-7ba5-419c-8329-78f8fae238c5\") " Mar 09 09:43:13 crc kubenswrapper[4792]: I0309 09:43:13.859134 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e00fdfe8-7ba5-419c-8329-78f8fae238c5-utilities" (OuterVolumeSpecName: "utilities") pod "e00fdfe8-7ba5-419c-8329-78f8fae238c5" (UID: "e00fdfe8-7ba5-419c-8329-78f8fae238c5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:43:13 crc kubenswrapper[4792]: I0309 09:43:13.866271 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e00fdfe8-7ba5-419c-8329-78f8fae238c5-kube-api-access-b89gm" (OuterVolumeSpecName: "kube-api-access-b89gm") pod "e00fdfe8-7ba5-419c-8329-78f8fae238c5" (UID: "e00fdfe8-7ba5-419c-8329-78f8fae238c5"). InnerVolumeSpecName "kube-api-access-b89gm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:43:13 crc kubenswrapper[4792]: I0309 09:43:13.885000 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e00fdfe8-7ba5-419c-8329-78f8fae238c5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e00fdfe8-7ba5-419c-8329-78f8fae238c5" (UID: "e00fdfe8-7ba5-419c-8329-78f8fae238c5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:43:13 crc kubenswrapper[4792]: I0309 09:43:13.960361 4792 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e00fdfe8-7ba5-419c-8329-78f8fae238c5-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 09:43:13 crc kubenswrapper[4792]: I0309 09:43:13.960399 4792 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e00fdfe8-7ba5-419c-8329-78f8fae238c5-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 09:43:13 crc kubenswrapper[4792]: I0309 09:43:13.960411 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b89gm\" (UniqueName: \"kubernetes.io/projected/e00fdfe8-7ba5-419c-8329-78f8fae238c5-kube-api-access-b89gm\") on node \"crc\" DevicePath \"\"" Mar 09 09:43:14 crc kubenswrapper[4792]: I0309 09:43:14.311597 4792 generic.go:334] "Generic (PLEG): container finished" podID="e00fdfe8-7ba5-419c-8329-78f8fae238c5" containerID="9764e1f8c36c0573587cc7eeb437447b9ae57d625672ccd242fcfed7aa524442" exitCode=0 Mar 09 09:43:14 crc kubenswrapper[4792]: I0309 09:43:14.311652 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-545sv" event={"ID":"e00fdfe8-7ba5-419c-8329-78f8fae238c5","Type":"ContainerDied","Data":"9764e1f8c36c0573587cc7eeb437447b9ae57d625672ccd242fcfed7aa524442"} Mar 09 09:43:14 crc kubenswrapper[4792]: I0309 09:43:14.311662 4792 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-545sv" Mar 09 09:43:14 crc kubenswrapper[4792]: I0309 09:43:14.311685 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-545sv" event={"ID":"e00fdfe8-7ba5-419c-8329-78f8fae238c5","Type":"ContainerDied","Data":"74eda2ac33ee85a6a1656b7f5c2768dd64c14796413452f397adcd5de3b8c1d6"} Mar 09 09:43:14 crc kubenswrapper[4792]: I0309 09:43:14.311712 4792 scope.go:117] "RemoveContainer" containerID="9764e1f8c36c0573587cc7eeb437447b9ae57d625672ccd242fcfed7aa524442" Mar 09 09:43:14 crc kubenswrapper[4792]: I0309 09:43:14.348845 4792 scope.go:117] "RemoveContainer" containerID="d91601cde8ce064f094f3666b10cae048a8937784bde8dd5f3441a7708df5426" Mar 09 09:43:14 crc kubenswrapper[4792]: I0309 09:43:14.353455 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-545sv"] Mar 09 09:43:14 crc kubenswrapper[4792]: I0309 09:43:14.363087 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-545sv"] Mar 09 09:43:14 crc kubenswrapper[4792]: I0309 09:43:14.384382 4792 scope.go:117] "RemoveContainer" containerID="7f2ae84ce44039aec72a9c2b2559e73622ec5cbed48f9df3a63e18865377f986" Mar 09 09:43:14 crc kubenswrapper[4792]: I0309 09:43:14.414878 4792 scope.go:117] "RemoveContainer" containerID="9764e1f8c36c0573587cc7eeb437447b9ae57d625672ccd242fcfed7aa524442" Mar 09 09:43:14 crc kubenswrapper[4792]: E0309 09:43:14.415394 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9764e1f8c36c0573587cc7eeb437447b9ae57d625672ccd242fcfed7aa524442\": container with ID starting with 9764e1f8c36c0573587cc7eeb437447b9ae57d625672ccd242fcfed7aa524442 not found: ID does not exist" containerID="9764e1f8c36c0573587cc7eeb437447b9ae57d625672ccd242fcfed7aa524442" Mar 09 09:43:14 crc kubenswrapper[4792]: I0309 09:43:14.415453 4792 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9764e1f8c36c0573587cc7eeb437447b9ae57d625672ccd242fcfed7aa524442"} err="failed to get container status \"9764e1f8c36c0573587cc7eeb437447b9ae57d625672ccd242fcfed7aa524442\": rpc error: code = NotFound desc = could not find container \"9764e1f8c36c0573587cc7eeb437447b9ae57d625672ccd242fcfed7aa524442\": container with ID starting with 9764e1f8c36c0573587cc7eeb437447b9ae57d625672ccd242fcfed7aa524442 not found: ID does not exist" Mar 09 09:43:14 crc kubenswrapper[4792]: I0309 09:43:14.415486 4792 scope.go:117] "RemoveContainer" containerID="d91601cde8ce064f094f3666b10cae048a8937784bde8dd5f3441a7708df5426" Mar 09 09:43:14 crc kubenswrapper[4792]: E0309 09:43:14.415797 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d91601cde8ce064f094f3666b10cae048a8937784bde8dd5f3441a7708df5426\": container with ID starting with d91601cde8ce064f094f3666b10cae048a8937784bde8dd5f3441a7708df5426 not found: ID does not exist" containerID="d91601cde8ce064f094f3666b10cae048a8937784bde8dd5f3441a7708df5426" Mar 09 09:43:14 crc kubenswrapper[4792]: I0309 09:43:14.415822 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d91601cde8ce064f094f3666b10cae048a8937784bde8dd5f3441a7708df5426"} err="failed to get container status \"d91601cde8ce064f094f3666b10cae048a8937784bde8dd5f3441a7708df5426\": rpc error: code = NotFound desc = could not find container \"d91601cde8ce064f094f3666b10cae048a8937784bde8dd5f3441a7708df5426\": container with ID starting with d91601cde8ce064f094f3666b10cae048a8937784bde8dd5f3441a7708df5426 not found: ID does not exist" Mar 09 09:43:14 crc kubenswrapper[4792]: I0309 09:43:14.415846 4792 scope.go:117] "RemoveContainer" containerID="7f2ae84ce44039aec72a9c2b2559e73622ec5cbed48f9df3a63e18865377f986" Mar 09 09:43:14 crc kubenswrapper[4792]: E0309 
09:43:14.416235 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7f2ae84ce44039aec72a9c2b2559e73622ec5cbed48f9df3a63e18865377f986\": container with ID starting with 7f2ae84ce44039aec72a9c2b2559e73622ec5cbed48f9df3a63e18865377f986 not found: ID does not exist" containerID="7f2ae84ce44039aec72a9c2b2559e73622ec5cbed48f9df3a63e18865377f986" Mar 09 09:43:14 crc kubenswrapper[4792]: I0309 09:43:14.416265 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f2ae84ce44039aec72a9c2b2559e73622ec5cbed48f9df3a63e18865377f986"} err="failed to get container status \"7f2ae84ce44039aec72a9c2b2559e73622ec5cbed48f9df3a63e18865377f986\": rpc error: code = NotFound desc = could not find container \"7f2ae84ce44039aec72a9c2b2559e73622ec5cbed48f9df3a63e18865377f986\": container with ID starting with 7f2ae84ce44039aec72a9c2b2559e73622ec5cbed48f9df3a63e18865377f986 not found: ID does not exist" Mar 09 09:43:15 crc kubenswrapper[4792]: I0309 09:43:15.674094 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e00fdfe8-7ba5-419c-8329-78f8fae238c5" path="/var/lib/kubelet/pods/e00fdfe8-7ba5-419c-8329-78f8fae238c5/volumes" Mar 09 09:44:00 crc kubenswrapper[4792]: I0309 09:44:00.146418 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550824-wdgmk"] Mar 09 09:44:00 crc kubenswrapper[4792]: E0309 09:44:00.147352 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e00fdfe8-7ba5-419c-8329-78f8fae238c5" containerName="extract-content" Mar 09 09:44:00 crc kubenswrapper[4792]: I0309 09:44:00.147368 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="e00fdfe8-7ba5-419c-8329-78f8fae238c5" containerName="extract-content" Mar 09 09:44:00 crc kubenswrapper[4792]: E0309 09:44:00.147388 4792 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="e00fdfe8-7ba5-419c-8329-78f8fae238c5" containerName="registry-server" Mar 09 09:44:00 crc kubenswrapper[4792]: I0309 09:44:00.147396 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="e00fdfe8-7ba5-419c-8329-78f8fae238c5" containerName="registry-server" Mar 09 09:44:00 crc kubenswrapper[4792]: E0309 09:44:00.147415 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e00fdfe8-7ba5-419c-8329-78f8fae238c5" containerName="extract-utilities" Mar 09 09:44:00 crc kubenswrapper[4792]: I0309 09:44:00.147423 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="e00fdfe8-7ba5-419c-8329-78f8fae238c5" containerName="extract-utilities" Mar 09 09:44:00 crc kubenswrapper[4792]: I0309 09:44:00.147655 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="e00fdfe8-7ba5-419c-8329-78f8fae238c5" containerName="registry-server" Mar 09 09:44:00 crc kubenswrapper[4792]: I0309 09:44:00.148342 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550824-wdgmk" Mar 09 09:44:00 crc kubenswrapper[4792]: I0309 09:44:00.151022 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 09:44:00 crc kubenswrapper[4792]: I0309 09:44:00.151699 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 09:44:00 crc kubenswrapper[4792]: I0309 09:44:00.152449 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fwclj" Mar 09 09:44:00 crc kubenswrapper[4792]: I0309 09:44:00.179356 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550824-wdgmk"] Mar 09 09:44:00 crc kubenswrapper[4792]: I0309 09:44:00.345634 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l4rtw\" (UniqueName: 
\"kubernetes.io/projected/e0cb70dd-e163-45b2-958a-6976abab327b-kube-api-access-l4rtw\") pod \"auto-csr-approver-29550824-wdgmk\" (UID: \"e0cb70dd-e163-45b2-958a-6976abab327b\") " pod="openshift-infra/auto-csr-approver-29550824-wdgmk" Mar 09 09:44:00 crc kubenswrapper[4792]: I0309 09:44:00.446612 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l4rtw\" (UniqueName: \"kubernetes.io/projected/e0cb70dd-e163-45b2-958a-6976abab327b-kube-api-access-l4rtw\") pod \"auto-csr-approver-29550824-wdgmk\" (UID: \"e0cb70dd-e163-45b2-958a-6976abab327b\") " pod="openshift-infra/auto-csr-approver-29550824-wdgmk" Mar 09 09:44:00 crc kubenswrapper[4792]: I0309 09:44:00.473820 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l4rtw\" (UniqueName: \"kubernetes.io/projected/e0cb70dd-e163-45b2-958a-6976abab327b-kube-api-access-l4rtw\") pod \"auto-csr-approver-29550824-wdgmk\" (UID: \"e0cb70dd-e163-45b2-958a-6976abab327b\") " pod="openshift-infra/auto-csr-approver-29550824-wdgmk" Mar 09 09:44:00 crc kubenswrapper[4792]: I0309 09:44:00.485547 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550824-wdgmk" Mar 09 09:44:00 crc kubenswrapper[4792]: I0309 09:44:00.979812 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550824-wdgmk"] Mar 09 09:44:01 crc kubenswrapper[4792]: I0309 09:44:01.713586 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550824-wdgmk" event={"ID":"e0cb70dd-e163-45b2-958a-6976abab327b","Type":"ContainerStarted","Data":"47743b0b362f7c286fc45e5e380a2f9e4349d194edbf12feb3b705ae378814c4"} Mar 09 09:44:02 crc kubenswrapper[4792]: I0309 09:44:02.722390 4792 generic.go:334] "Generic (PLEG): container finished" podID="e0cb70dd-e163-45b2-958a-6976abab327b" containerID="1076a39049e092b567394b27ac970575a352fa705c4b4b1f5ff883dc912e1025" exitCode=0 Mar 09 09:44:02 crc kubenswrapper[4792]: I0309 09:44:02.722433 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550824-wdgmk" event={"ID":"e0cb70dd-e163-45b2-958a-6976abab327b","Type":"ContainerDied","Data":"1076a39049e092b567394b27ac970575a352fa705c4b4b1f5ff883dc912e1025"} Mar 09 09:44:04 crc kubenswrapper[4792]: I0309 09:44:04.088116 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550824-wdgmk" Mar 09 09:44:04 crc kubenswrapper[4792]: I0309 09:44:04.212709 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l4rtw\" (UniqueName: \"kubernetes.io/projected/e0cb70dd-e163-45b2-958a-6976abab327b-kube-api-access-l4rtw\") pod \"e0cb70dd-e163-45b2-958a-6976abab327b\" (UID: \"e0cb70dd-e163-45b2-958a-6976abab327b\") " Mar 09 09:44:04 crc kubenswrapper[4792]: I0309 09:44:04.219357 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0cb70dd-e163-45b2-958a-6976abab327b-kube-api-access-l4rtw" (OuterVolumeSpecName: "kube-api-access-l4rtw") pod "e0cb70dd-e163-45b2-958a-6976abab327b" (UID: "e0cb70dd-e163-45b2-958a-6976abab327b"). InnerVolumeSpecName "kube-api-access-l4rtw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:44:04 crc kubenswrapper[4792]: I0309 09:44:04.315584 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l4rtw\" (UniqueName: \"kubernetes.io/projected/e0cb70dd-e163-45b2-958a-6976abab327b-kube-api-access-l4rtw\") on node \"crc\" DevicePath \"\"" Mar 09 09:44:04 crc kubenswrapper[4792]: I0309 09:44:04.739598 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550824-wdgmk" event={"ID":"e0cb70dd-e163-45b2-958a-6976abab327b","Type":"ContainerDied","Data":"47743b0b362f7c286fc45e5e380a2f9e4349d194edbf12feb3b705ae378814c4"} Mar 09 09:44:04 crc kubenswrapper[4792]: I0309 09:44:04.739874 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="47743b0b362f7c286fc45e5e380a2f9e4349d194edbf12feb3b705ae378814c4" Mar 09 09:44:04 crc kubenswrapper[4792]: I0309 09:44:04.739640 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550824-wdgmk" Mar 09 09:44:05 crc kubenswrapper[4792]: I0309 09:44:05.203713 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550818-jtvhj"] Mar 09 09:44:05 crc kubenswrapper[4792]: I0309 09:44:05.210594 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550818-jtvhj"] Mar 09 09:44:05 crc kubenswrapper[4792]: I0309 09:44:05.673352 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="42fd3efa-7e1b-4ff6-a3c6-9947f805675c" path="/var/lib/kubelet/pods/42fd3efa-7e1b-4ff6-a3c6-9947f805675c/volumes" Mar 09 09:44:13 crc kubenswrapper[4792]: I0309 09:44:13.220580 4792 patch_prober.go:28] interesting pod/machine-config-daemon-97tth container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 09:44:13 crc kubenswrapper[4792]: I0309 09:44:13.221196 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 09:44:22 crc kubenswrapper[4792]: I0309 09:44:22.317698 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-q2dh9"] Mar 09 09:44:22 crc kubenswrapper[4792]: E0309 09:44:22.318710 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0cb70dd-e163-45b2-958a-6976abab327b" containerName="oc" Mar 09 09:44:22 crc kubenswrapper[4792]: I0309 09:44:22.318726 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0cb70dd-e163-45b2-958a-6976abab327b" containerName="oc" Mar 09 09:44:22 crc 
kubenswrapper[4792]: I0309 09:44:22.318928 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0cb70dd-e163-45b2-958a-6976abab327b" containerName="oc" Mar 09 09:44:22 crc kubenswrapper[4792]: I0309 09:44:22.320124 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-q2dh9" Mar 09 09:44:22 crc kubenswrapper[4792]: I0309 09:44:22.342740 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-q2dh9"] Mar 09 09:44:22 crc kubenswrapper[4792]: I0309 09:44:22.456591 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fmgzr\" (UniqueName: \"kubernetes.io/projected/0930c22c-ea43-4769-9ab1-231efd72eb68-kube-api-access-fmgzr\") pod \"community-operators-q2dh9\" (UID: \"0930c22c-ea43-4769-9ab1-231efd72eb68\") " pod="openshift-marketplace/community-operators-q2dh9" Mar 09 09:44:22 crc kubenswrapper[4792]: I0309 09:44:22.456647 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0930c22c-ea43-4769-9ab1-231efd72eb68-catalog-content\") pod \"community-operators-q2dh9\" (UID: \"0930c22c-ea43-4769-9ab1-231efd72eb68\") " pod="openshift-marketplace/community-operators-q2dh9" Mar 09 09:44:22 crc kubenswrapper[4792]: I0309 09:44:22.456688 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0930c22c-ea43-4769-9ab1-231efd72eb68-utilities\") pod \"community-operators-q2dh9\" (UID: \"0930c22c-ea43-4769-9ab1-231efd72eb68\") " pod="openshift-marketplace/community-operators-q2dh9" Mar 09 09:44:22 crc kubenswrapper[4792]: I0309 09:44:22.558727 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/0930c22c-ea43-4769-9ab1-231efd72eb68-utilities\") pod \"community-operators-q2dh9\" (UID: \"0930c22c-ea43-4769-9ab1-231efd72eb68\") " pod="openshift-marketplace/community-operators-q2dh9" Mar 09 09:44:22 crc kubenswrapper[4792]: I0309 09:44:22.558858 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fmgzr\" (UniqueName: \"kubernetes.io/projected/0930c22c-ea43-4769-9ab1-231efd72eb68-kube-api-access-fmgzr\") pod \"community-operators-q2dh9\" (UID: \"0930c22c-ea43-4769-9ab1-231efd72eb68\") " pod="openshift-marketplace/community-operators-q2dh9" Mar 09 09:44:22 crc kubenswrapper[4792]: I0309 09:44:22.558895 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0930c22c-ea43-4769-9ab1-231efd72eb68-catalog-content\") pod \"community-operators-q2dh9\" (UID: \"0930c22c-ea43-4769-9ab1-231efd72eb68\") " pod="openshift-marketplace/community-operators-q2dh9" Mar 09 09:44:22 crc kubenswrapper[4792]: I0309 09:44:22.559303 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0930c22c-ea43-4769-9ab1-231efd72eb68-catalog-content\") pod \"community-operators-q2dh9\" (UID: \"0930c22c-ea43-4769-9ab1-231efd72eb68\") " pod="openshift-marketplace/community-operators-q2dh9" Mar 09 09:44:22 crc kubenswrapper[4792]: I0309 09:44:22.559659 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0930c22c-ea43-4769-9ab1-231efd72eb68-utilities\") pod \"community-operators-q2dh9\" (UID: \"0930c22c-ea43-4769-9ab1-231efd72eb68\") " pod="openshift-marketplace/community-operators-q2dh9" Mar 09 09:44:22 crc kubenswrapper[4792]: I0309 09:44:22.584611 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fmgzr\" (UniqueName: 
\"kubernetes.io/projected/0930c22c-ea43-4769-9ab1-231efd72eb68-kube-api-access-fmgzr\") pod \"community-operators-q2dh9\" (UID: \"0930c22c-ea43-4769-9ab1-231efd72eb68\") " pod="openshift-marketplace/community-operators-q2dh9" Mar 09 09:44:22 crc kubenswrapper[4792]: I0309 09:44:22.657371 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-q2dh9" Mar 09 09:44:23 crc kubenswrapper[4792]: I0309 09:44:23.324884 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-q2dh9"] Mar 09 09:44:23 crc kubenswrapper[4792]: I0309 09:44:23.909909 4792 generic.go:334] "Generic (PLEG): container finished" podID="0930c22c-ea43-4769-9ab1-231efd72eb68" containerID="473da03fe00d5a2e327f5004b8336ffd125cfe3cb6934522c880d0530943cb4d" exitCode=0 Mar 09 09:44:23 crc kubenswrapper[4792]: I0309 09:44:23.910198 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q2dh9" event={"ID":"0930c22c-ea43-4769-9ab1-231efd72eb68","Type":"ContainerDied","Data":"473da03fe00d5a2e327f5004b8336ffd125cfe3cb6934522c880d0530943cb4d"} Mar 09 09:44:23 crc kubenswrapper[4792]: I0309 09:44:23.910258 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q2dh9" event={"ID":"0930c22c-ea43-4769-9ab1-231efd72eb68","Type":"ContainerStarted","Data":"66f7e4dec0283bdbc2c8976526d265ba25a93069e5a1c910949296a8fb2d331a"} Mar 09 09:44:24 crc kubenswrapper[4792]: I0309 09:44:24.920183 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q2dh9" event={"ID":"0930c22c-ea43-4769-9ab1-231efd72eb68","Type":"ContainerStarted","Data":"47f4815466706a5a5ad50b086eb590fffd4278b21cf495c4a3cb7acaa7d4c290"} Mar 09 09:44:26 crc kubenswrapper[4792]: I0309 09:44:26.936459 4792 generic.go:334] "Generic (PLEG): container finished" podID="0930c22c-ea43-4769-9ab1-231efd72eb68" 
containerID="47f4815466706a5a5ad50b086eb590fffd4278b21cf495c4a3cb7acaa7d4c290" exitCode=0 Mar 09 09:44:26 crc kubenswrapper[4792]: I0309 09:44:26.936549 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q2dh9" event={"ID":"0930c22c-ea43-4769-9ab1-231efd72eb68","Type":"ContainerDied","Data":"47f4815466706a5a5ad50b086eb590fffd4278b21cf495c4a3cb7acaa7d4c290"} Mar 09 09:44:27 crc kubenswrapper[4792]: I0309 09:44:27.948148 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q2dh9" event={"ID":"0930c22c-ea43-4769-9ab1-231efd72eb68","Type":"ContainerStarted","Data":"c861baa7768caa589cdd46574cd7d32b43a53b62373a4871f721f88ddfe9cf3e"} Mar 09 09:44:27 crc kubenswrapper[4792]: I0309 09:44:27.978808 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-q2dh9" podStartSLOduration=2.557645901 podStartE2EDuration="5.978787986s" podCreationTimestamp="2026-03-09 09:44:22 +0000 UTC" firstStartedPulling="2026-03-09 09:44:23.912127647 +0000 UTC m=+2228.942328399" lastFinishedPulling="2026-03-09 09:44:27.333269732 +0000 UTC m=+2232.363470484" observedRunningTime="2026-03-09 09:44:27.968193669 +0000 UTC m=+2232.998394451" watchObservedRunningTime="2026-03-09 09:44:27.978787986 +0000 UTC m=+2233.008988748" Mar 09 09:44:32 crc kubenswrapper[4792]: I0309 09:44:32.657522 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-q2dh9" Mar 09 09:44:32 crc kubenswrapper[4792]: I0309 09:44:32.658273 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-q2dh9" Mar 09 09:44:32 crc kubenswrapper[4792]: I0309 09:44:32.712030 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-q2dh9" Mar 09 09:44:33 crc kubenswrapper[4792]: I0309 
09:44:33.053771 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-q2dh9" Mar 09 09:44:33 crc kubenswrapper[4792]: I0309 09:44:33.111290 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-q2dh9"] Mar 09 09:44:35 crc kubenswrapper[4792]: I0309 09:44:35.011585 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-q2dh9" podUID="0930c22c-ea43-4769-9ab1-231efd72eb68" containerName="registry-server" containerID="cri-o://c861baa7768caa589cdd46574cd7d32b43a53b62373a4871f721f88ddfe9cf3e" gracePeriod=2 Mar 09 09:44:35 crc kubenswrapper[4792]: I0309 09:44:35.446663 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-q2dh9" Mar 09 09:44:35 crc kubenswrapper[4792]: I0309 09:44:35.521400 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0930c22c-ea43-4769-9ab1-231efd72eb68-utilities\") pod \"0930c22c-ea43-4769-9ab1-231efd72eb68\" (UID: \"0930c22c-ea43-4769-9ab1-231efd72eb68\") " Mar 09 09:44:35 crc kubenswrapper[4792]: I0309 09:44:35.521459 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0930c22c-ea43-4769-9ab1-231efd72eb68-catalog-content\") pod \"0930c22c-ea43-4769-9ab1-231efd72eb68\" (UID: \"0930c22c-ea43-4769-9ab1-231efd72eb68\") " Mar 09 09:44:35 crc kubenswrapper[4792]: I0309 09:44:35.521550 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fmgzr\" (UniqueName: \"kubernetes.io/projected/0930c22c-ea43-4769-9ab1-231efd72eb68-kube-api-access-fmgzr\") pod \"0930c22c-ea43-4769-9ab1-231efd72eb68\" (UID: \"0930c22c-ea43-4769-9ab1-231efd72eb68\") " Mar 09 09:44:35 crc kubenswrapper[4792]: 
I0309 09:44:35.524024 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0930c22c-ea43-4769-9ab1-231efd72eb68-utilities" (OuterVolumeSpecName: "utilities") pod "0930c22c-ea43-4769-9ab1-231efd72eb68" (UID: "0930c22c-ea43-4769-9ab1-231efd72eb68"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:44:35 crc kubenswrapper[4792]: I0309 09:44:35.542894 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0930c22c-ea43-4769-9ab1-231efd72eb68-kube-api-access-fmgzr" (OuterVolumeSpecName: "kube-api-access-fmgzr") pod "0930c22c-ea43-4769-9ab1-231efd72eb68" (UID: "0930c22c-ea43-4769-9ab1-231efd72eb68"). InnerVolumeSpecName "kube-api-access-fmgzr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:44:35 crc kubenswrapper[4792]: I0309 09:44:35.583179 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0930c22c-ea43-4769-9ab1-231efd72eb68-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0930c22c-ea43-4769-9ab1-231efd72eb68" (UID: "0930c22c-ea43-4769-9ab1-231efd72eb68"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:44:35 crc kubenswrapper[4792]: I0309 09:44:35.623779 4792 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0930c22c-ea43-4769-9ab1-231efd72eb68-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 09:44:35 crc kubenswrapper[4792]: I0309 09:44:35.623829 4792 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0930c22c-ea43-4769-9ab1-231efd72eb68-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 09:44:35 crc kubenswrapper[4792]: I0309 09:44:35.623846 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fmgzr\" (UniqueName: \"kubernetes.io/projected/0930c22c-ea43-4769-9ab1-231efd72eb68-kube-api-access-fmgzr\") on node \"crc\" DevicePath \"\"" Mar 09 09:44:36 crc kubenswrapper[4792]: I0309 09:44:36.020609 4792 generic.go:334] "Generic (PLEG): container finished" podID="0930c22c-ea43-4769-9ab1-231efd72eb68" containerID="c861baa7768caa589cdd46574cd7d32b43a53b62373a4871f721f88ddfe9cf3e" exitCode=0 Mar 09 09:44:36 crc kubenswrapper[4792]: I0309 09:44:36.020688 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q2dh9" event={"ID":"0930c22c-ea43-4769-9ab1-231efd72eb68","Type":"ContainerDied","Data":"c861baa7768caa589cdd46574cd7d32b43a53b62373a4871f721f88ddfe9cf3e"} Mar 09 09:44:36 crc kubenswrapper[4792]: I0309 09:44:36.020948 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q2dh9" event={"ID":"0930c22c-ea43-4769-9ab1-231efd72eb68","Type":"ContainerDied","Data":"66f7e4dec0283bdbc2c8976526d265ba25a93069e5a1c910949296a8fb2d331a"} Mar 09 09:44:36 crc kubenswrapper[4792]: I0309 09:44:36.020972 4792 scope.go:117] "RemoveContainer" containerID="c861baa7768caa589cdd46574cd7d32b43a53b62373a4871f721f88ddfe9cf3e" Mar 09 09:44:36 crc kubenswrapper[4792]: I0309 
09:44:36.020738 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-q2dh9" Mar 09 09:44:36 crc kubenswrapper[4792]: I0309 09:44:36.051972 4792 scope.go:117] "RemoveContainer" containerID="47f4815466706a5a5ad50b086eb590fffd4278b21cf495c4a3cb7acaa7d4c290" Mar 09 09:44:36 crc kubenswrapper[4792]: I0309 09:44:36.052317 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-q2dh9"] Mar 09 09:44:36 crc kubenswrapper[4792]: I0309 09:44:36.064624 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-q2dh9"] Mar 09 09:44:36 crc kubenswrapper[4792]: I0309 09:44:36.080597 4792 scope.go:117] "RemoveContainer" containerID="473da03fe00d5a2e327f5004b8336ffd125cfe3cb6934522c880d0530943cb4d" Mar 09 09:44:36 crc kubenswrapper[4792]: I0309 09:44:36.110754 4792 scope.go:117] "RemoveContainer" containerID="c861baa7768caa589cdd46574cd7d32b43a53b62373a4871f721f88ddfe9cf3e" Mar 09 09:44:36 crc kubenswrapper[4792]: E0309 09:44:36.111479 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c861baa7768caa589cdd46574cd7d32b43a53b62373a4871f721f88ddfe9cf3e\": container with ID starting with c861baa7768caa589cdd46574cd7d32b43a53b62373a4871f721f88ddfe9cf3e not found: ID does not exist" containerID="c861baa7768caa589cdd46574cd7d32b43a53b62373a4871f721f88ddfe9cf3e" Mar 09 09:44:36 crc kubenswrapper[4792]: I0309 09:44:36.111521 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c861baa7768caa589cdd46574cd7d32b43a53b62373a4871f721f88ddfe9cf3e"} err="failed to get container status \"c861baa7768caa589cdd46574cd7d32b43a53b62373a4871f721f88ddfe9cf3e\": rpc error: code = NotFound desc = could not find container \"c861baa7768caa589cdd46574cd7d32b43a53b62373a4871f721f88ddfe9cf3e\": container with ID starting with 
c861baa7768caa589cdd46574cd7d32b43a53b62373a4871f721f88ddfe9cf3e not found: ID does not exist" Mar 09 09:44:36 crc kubenswrapper[4792]: I0309 09:44:36.111548 4792 scope.go:117] "RemoveContainer" containerID="47f4815466706a5a5ad50b086eb590fffd4278b21cf495c4a3cb7acaa7d4c290" Mar 09 09:44:36 crc kubenswrapper[4792]: E0309 09:44:36.112178 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"47f4815466706a5a5ad50b086eb590fffd4278b21cf495c4a3cb7acaa7d4c290\": container with ID starting with 47f4815466706a5a5ad50b086eb590fffd4278b21cf495c4a3cb7acaa7d4c290 not found: ID does not exist" containerID="47f4815466706a5a5ad50b086eb590fffd4278b21cf495c4a3cb7acaa7d4c290" Mar 09 09:44:36 crc kubenswrapper[4792]: I0309 09:44:36.112211 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"47f4815466706a5a5ad50b086eb590fffd4278b21cf495c4a3cb7acaa7d4c290"} err="failed to get container status \"47f4815466706a5a5ad50b086eb590fffd4278b21cf495c4a3cb7acaa7d4c290\": rpc error: code = NotFound desc = could not find container \"47f4815466706a5a5ad50b086eb590fffd4278b21cf495c4a3cb7acaa7d4c290\": container with ID starting with 47f4815466706a5a5ad50b086eb590fffd4278b21cf495c4a3cb7acaa7d4c290 not found: ID does not exist" Mar 09 09:44:36 crc kubenswrapper[4792]: I0309 09:44:36.112236 4792 scope.go:117] "RemoveContainer" containerID="473da03fe00d5a2e327f5004b8336ffd125cfe3cb6934522c880d0530943cb4d" Mar 09 09:44:36 crc kubenswrapper[4792]: E0309 09:44:36.112538 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"473da03fe00d5a2e327f5004b8336ffd125cfe3cb6934522c880d0530943cb4d\": container with ID starting with 473da03fe00d5a2e327f5004b8336ffd125cfe3cb6934522c880d0530943cb4d not found: ID does not exist" containerID="473da03fe00d5a2e327f5004b8336ffd125cfe3cb6934522c880d0530943cb4d" Mar 09 09:44:36 crc 
kubenswrapper[4792]: I0309 09:44:36.112562 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"473da03fe00d5a2e327f5004b8336ffd125cfe3cb6934522c880d0530943cb4d"} err="failed to get container status \"473da03fe00d5a2e327f5004b8336ffd125cfe3cb6934522c880d0530943cb4d\": rpc error: code = NotFound desc = could not find container \"473da03fe00d5a2e327f5004b8336ffd125cfe3cb6934522c880d0530943cb4d\": container with ID starting with 473da03fe00d5a2e327f5004b8336ffd125cfe3cb6934522c880d0530943cb4d not found: ID does not exist" Mar 09 09:44:37 crc kubenswrapper[4792]: I0309 09:44:37.673171 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0930c22c-ea43-4769-9ab1-231efd72eb68" path="/var/lib/kubelet/pods/0930c22c-ea43-4769-9ab1-231efd72eb68/volumes" Mar 09 09:44:38 crc kubenswrapper[4792]: I0309 09:44:38.895722 4792 scope.go:117] "RemoveContainer" containerID="bb5c7c41e531ab9fb4118589739801dfee310d40a7851a9f88de1b9d86798ab4" Mar 09 09:44:43 crc kubenswrapper[4792]: I0309 09:44:43.214192 4792 patch_prober.go:28] interesting pod/machine-config-daemon-97tth container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 09:44:43 crc kubenswrapper[4792]: I0309 09:44:43.214774 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 09:44:45 crc kubenswrapper[4792]: I0309 09:44:45.084756 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vqt6h"] Mar 09 09:44:45 crc kubenswrapper[4792]: I0309 
09:44:45.101915 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vqt6h"] Mar 09 09:44:45 crc kubenswrapper[4792]: I0309 09:44:45.169231 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-4qk8n"] Mar 09 09:44:45 crc kubenswrapper[4792]: I0309 09:44:45.180236 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-gbs7v"] Mar 09 09:44:45 crc kubenswrapper[4792]: I0309 09:44:45.192648 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-pd6cm"] Mar 09 09:44:45 crc kubenswrapper[4792]: I0309 09:44:45.200632 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-gbs7v"] Mar 09 09:44:45 crc kubenswrapper[4792]: I0309 09:44:45.209559 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-h7mbm"] Mar 09 09:44:45 crc kubenswrapper[4792]: I0309 09:44:45.217545 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-h7rdt"] Mar 09 09:44:45 crc kubenswrapper[4792]: I0309 09:44:45.224180 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-4qk8n"] Mar 09 09:44:45 crc kubenswrapper[4792]: I0309 09:44:45.231442 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lx47x"] Mar 09 09:44:45 crc kubenswrapper[4792]: I0309 09:44:45.242994 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lx47x"] Mar 09 09:44:45 crc kubenswrapper[4792]: I0309 09:44:45.251144 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-h7rdt"] 
Mar 09 09:44:45 crc kubenswrapper[4792]: I0309 09:44:45.260169 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-7jfkr"]
Mar 09 09:44:45 crc kubenswrapper[4792]: I0309 09:44:45.265150 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-h7mbm"]
Mar 09 09:44:45 crc kubenswrapper[4792]: I0309 09:44:45.274430 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-pd6cm"]
Mar 09 09:44:45 crc kubenswrapper[4792]: I0309 09:44:45.284164 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-7jfkr"]
Mar 09 09:44:45 crc kubenswrapper[4792]: I0309 09:44:45.294607 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-2tj6r"]
Mar 09 09:44:45 crc kubenswrapper[4792]: I0309 09:44:45.304180 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-2tj6r"]
Mar 09 09:44:45 crc kubenswrapper[4792]: I0309 09:44:45.314284 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-z8wb9"]
Mar 09 09:44:45 crc kubenswrapper[4792]: I0309 09:44:45.319527 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-z8wb9"]
Mar 09 09:44:45 crc kubenswrapper[4792]: I0309 09:44:45.677848 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1c47d4d4-f315-4f19-8d62-dbc17c64a39a" path="/var/lib/kubelet/pods/1c47d4d4-f315-4f19-8d62-dbc17c64a39a/volumes"
Mar 09 09:44:45 crc kubenswrapper[4792]: I0309 09:44:45.678617 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2acce3b0-ccfb-48f6-af71-ecaa5b820874" path="/var/lib/kubelet/pods/2acce3b0-ccfb-48f6-af71-ecaa5b820874/volumes"
Mar 09 09:44:45 crc kubenswrapper[4792]: I0309 09:44:45.679190 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="30cac2f1-8f76-42a8-a5dd-e828ed1d7430" path="/var/lib/kubelet/pods/30cac2f1-8f76-42a8-a5dd-e828ed1d7430/volumes"
Mar 09 09:44:45 crc kubenswrapper[4792]: I0309 09:44:45.679725 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3169dde4-b88a-4a42-b22c-452ed9d2b945" path="/var/lib/kubelet/pods/3169dde4-b88a-4a42-b22c-452ed9d2b945/volumes"
Mar 09 09:44:45 crc kubenswrapper[4792]: I0309 09:44:45.680884 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7e3791a8-646a-4beb-a1b5-fa3390e5a9c9" path="/var/lib/kubelet/pods/7e3791a8-646a-4beb-a1b5-fa3390e5a9c9/volumes"
Mar 09 09:44:45 crc kubenswrapper[4792]: I0309 09:44:45.681497 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9659141d-9a06-4963-b1ec-5982b06ade1c" path="/var/lib/kubelet/pods/9659141d-9a06-4963-b1ec-5982b06ade1c/volumes"
Mar 09 09:44:45 crc kubenswrapper[4792]: I0309 09:44:45.682084 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b16ebd2e-e43b-40cb-9b63-76707c9a25d6" path="/var/lib/kubelet/pods/b16ebd2e-e43b-40cb-9b63-76707c9a25d6/volumes"
Mar 09 09:44:45 crc kubenswrapper[4792]: I0309 09:44:45.683031 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b9bb4ffd-65c0-45ae-b2b8-7f8c718500de" path="/var/lib/kubelet/pods/b9bb4ffd-65c0-45ae-b2b8-7f8c718500de/volumes"
Mar 09 09:44:45 crc kubenswrapper[4792]: I0309 09:44:45.683680 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c745abb2-062a-4a6f-9360-e06bf80639d0" path="/var/lib/kubelet/pods/c745abb2-062a-4a6f-9360-e06bf80639d0/volumes"
Mar 09 09:44:45 crc kubenswrapper[4792]: I0309 09:44:45.684325 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dcfea667-10ed-44bd-8bf0-c41767b68d61" path="/var/lib/kubelet/pods/dcfea667-10ed-44bd-8bf0-c41767b68d61/volumes"
Mar 09 09:44:58 crc kubenswrapper[4792]: I0309 09:44:58.168776 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-fnr4l"]
Mar 09 09:44:58 crc kubenswrapper[4792]: E0309 09:44:58.169627 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0930c22c-ea43-4769-9ab1-231efd72eb68" containerName="extract-content"
Mar 09 09:44:58 crc kubenswrapper[4792]: I0309 09:44:58.169640 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="0930c22c-ea43-4769-9ab1-231efd72eb68" containerName="extract-content"
Mar 09 09:44:58 crc kubenswrapper[4792]: E0309 09:44:58.169658 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0930c22c-ea43-4769-9ab1-231efd72eb68" containerName="registry-server"
Mar 09 09:44:58 crc kubenswrapper[4792]: I0309 09:44:58.169664 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="0930c22c-ea43-4769-9ab1-231efd72eb68" containerName="registry-server"
Mar 09 09:44:58 crc kubenswrapper[4792]: E0309 09:44:58.169676 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0930c22c-ea43-4769-9ab1-231efd72eb68" containerName="extract-utilities"
Mar 09 09:44:58 crc kubenswrapper[4792]: I0309 09:44:58.169682 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="0930c22c-ea43-4769-9ab1-231efd72eb68" containerName="extract-utilities"
Mar 09 09:44:58 crc kubenswrapper[4792]: I0309 09:44:58.169840 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="0930c22c-ea43-4769-9ab1-231efd72eb68" containerName="registry-server"
Mar 09 09:44:58 crc kubenswrapper[4792]: I0309 09:44:58.172208 4792 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-fnr4l"
Mar 09 09:44:58 crc kubenswrapper[4792]: I0309 09:44:58.175154 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Mar 09 09:44:58 crc kubenswrapper[4792]: I0309 09:44:58.175254 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files"
Mar 09 09:44:58 crc kubenswrapper[4792]: I0309 09:44:58.175343 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Mar 09 09:44:58 crc kubenswrapper[4792]: I0309 09:44:58.175349 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Mar 09 09:44:58 crc kubenswrapper[4792]: I0309 09:44:58.182925 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-4g5l6"
Mar 09 09:44:58 crc kubenswrapper[4792]: I0309 09:44:58.185631 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-fnr4l"]
Mar 09 09:44:58 crc kubenswrapper[4792]: I0309 09:44:58.309692 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bdb1fadb-a5cc-48c4-b3e4-77f0af224e2a-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-fnr4l\" (UID: \"bdb1fadb-a5cc-48c4-b3e4-77f0af224e2a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-fnr4l"
Mar 09 09:44:58 crc kubenswrapper[4792]: I0309 09:44:58.309775 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bdb1fadb-a5cc-48c4-b3e4-77f0af224e2a-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-fnr4l\" (UID: \"bdb1fadb-a5cc-48c4-b3e4-77f0af224e2a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-fnr4l"
Mar 09 09:44:58 crc kubenswrapper[4792]: I0309 09:44:58.309811 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/bdb1fadb-a5cc-48c4-b3e4-77f0af224e2a-ceph\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-fnr4l\" (UID: \"bdb1fadb-a5cc-48c4-b3e4-77f0af224e2a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-fnr4l"
Mar 09 09:44:58 crc kubenswrapper[4792]: I0309 09:44:58.309869 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v2jtv\" (UniqueName: \"kubernetes.io/projected/bdb1fadb-a5cc-48c4-b3e4-77f0af224e2a-kube-api-access-v2jtv\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-fnr4l\" (UID: \"bdb1fadb-a5cc-48c4-b3e4-77f0af224e2a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-fnr4l"
Mar 09 09:44:58 crc kubenswrapper[4792]: I0309 09:44:58.309940 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bdb1fadb-a5cc-48c4-b3e4-77f0af224e2a-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-fnr4l\" (UID: \"bdb1fadb-a5cc-48c4-b3e4-77f0af224e2a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-fnr4l"
Mar 09 09:44:58 crc kubenswrapper[4792]: I0309 09:44:58.411603 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v2jtv\" (UniqueName: \"kubernetes.io/projected/bdb1fadb-a5cc-48c4-b3e4-77f0af224e2a-kube-api-access-v2jtv\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-fnr4l\" (UID: \"bdb1fadb-a5cc-48c4-b3e4-77f0af224e2a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-fnr4l"
Mar 09 09:44:58 crc kubenswrapper[4792]: I0309 09:44:58.411764 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bdb1fadb-a5cc-48c4-b3e4-77f0af224e2a-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-fnr4l\" (UID: \"bdb1fadb-a5cc-48c4-b3e4-77f0af224e2a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-fnr4l"
Mar 09 09:44:58 crc kubenswrapper[4792]: I0309 09:44:58.411861 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bdb1fadb-a5cc-48c4-b3e4-77f0af224e2a-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-fnr4l\" (UID: \"bdb1fadb-a5cc-48c4-b3e4-77f0af224e2a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-fnr4l"
Mar 09 09:44:58 crc kubenswrapper[4792]: I0309 09:44:58.411927 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bdb1fadb-a5cc-48c4-b3e4-77f0af224e2a-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-fnr4l\" (UID: \"bdb1fadb-a5cc-48c4-b3e4-77f0af224e2a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-fnr4l"
Mar 09 09:44:58 crc kubenswrapper[4792]: I0309 09:44:58.411964 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/bdb1fadb-a5cc-48c4-b3e4-77f0af224e2a-ceph\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-fnr4l\" (UID: \"bdb1fadb-a5cc-48c4-b3e4-77f0af224e2a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-fnr4l"
Mar 09 09:44:58 crc kubenswrapper[4792]: I0309 09:44:58.417641 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bdb1fadb-a5cc-48c4-b3e4-77f0af224e2a-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-fnr4l\" (UID: \"bdb1fadb-a5cc-48c4-b3e4-77f0af224e2a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-fnr4l"
Mar 09 09:44:58 crc kubenswrapper[4792]: I0309 09:44:58.417823 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/bdb1fadb-a5cc-48c4-b3e4-77f0af224e2a-ceph\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-fnr4l\" (UID: \"bdb1fadb-a5cc-48c4-b3e4-77f0af224e2a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-fnr4l"
Mar 09 09:44:58 crc kubenswrapper[4792]: I0309 09:44:58.418158 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bdb1fadb-a5cc-48c4-b3e4-77f0af224e2a-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-fnr4l\" (UID: \"bdb1fadb-a5cc-48c4-b3e4-77f0af224e2a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-fnr4l"
Mar 09 09:44:58 crc kubenswrapper[4792]: I0309 09:44:58.419600 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bdb1fadb-a5cc-48c4-b3e4-77f0af224e2a-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-fnr4l\" (UID: \"bdb1fadb-a5cc-48c4-b3e4-77f0af224e2a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-fnr4l"
Mar 09 09:44:58 crc kubenswrapper[4792]: I0309 09:44:58.430892 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v2jtv\" (UniqueName: \"kubernetes.io/projected/bdb1fadb-a5cc-48c4-b3e4-77f0af224e2a-kube-api-access-v2jtv\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-fnr4l\" (UID: \"bdb1fadb-a5cc-48c4-b3e4-77f0af224e2a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-fnr4l"
Mar 09
09:44:58 crc kubenswrapper[4792]: I0309 09:44:58.493890 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-fnr4l"
Mar 09 09:44:59 crc kubenswrapper[4792]: I0309 09:44:59.002060 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-fnr4l"]
Mar 09 09:44:59 crc kubenswrapper[4792]: I0309 09:44:59.202090 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-fnr4l" event={"ID":"bdb1fadb-a5cc-48c4-b3e4-77f0af224e2a","Type":"ContainerStarted","Data":"b224047b930d4997375d35b53f5a3201325a6c46e54da9e55f5de91ad2d4ca75"}
Mar 09 09:45:00 crc kubenswrapper[4792]: I0309 09:45:00.131282 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29550825-c687k"]
Mar 09 09:45:00 crc kubenswrapper[4792]: I0309 09:45:00.132571 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29550825-c687k"
Mar 09 09:45:00 crc kubenswrapper[4792]: I0309 09:45:00.136493 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Mar 09 09:45:00 crc kubenswrapper[4792]: I0309 09:45:00.136544 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Mar 09 09:45:00 crc kubenswrapper[4792]: I0309 09:45:00.155791 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29550825-c687k"]
Mar 09 09:45:00 crc kubenswrapper[4792]: I0309 09:45:00.212996 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-fnr4l" event={"ID":"bdb1fadb-a5cc-48c4-b3e4-77f0af224e2a","Type":"ContainerStarted","Data":"b55d1d4d92cc0fa449923074665ecd59b184346d06f8970ac5bea0fc63bb74b2"}
Mar 09 09:45:00 crc kubenswrapper[4792]: I0309 09:45:00.236713 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-fnr4l" podStartSLOduration=1.374957614 podStartE2EDuration="2.236691029s" podCreationTimestamp="2026-03-09 09:44:58 +0000 UTC" firstStartedPulling="2026-03-09 09:44:59.006663594 +0000 UTC m=+2264.036864346" lastFinishedPulling="2026-03-09 09:44:59.868397019 +0000 UTC m=+2264.898597761" observedRunningTime="2026-03-09 09:45:00.229485705 +0000 UTC m=+2265.259686467" watchObservedRunningTime="2026-03-09 09:45:00.236691029 +0000 UTC m=+2265.266891781"
Mar 09 09:45:00 crc kubenswrapper[4792]: I0309 09:45:00.250636 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/850e1ee7-b846-45b3-97ff-b33b9a1c6c93-secret-volume\") pod \"collect-profiles-29550825-c687k\" (UID: \"850e1ee7-b846-45b3-97ff-b33b9a1c6c93\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550825-c687k"
Mar 09 09:45:00 crc kubenswrapper[4792]: I0309 09:45:00.250696 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ggjtf\" (UniqueName: \"kubernetes.io/projected/850e1ee7-b846-45b3-97ff-b33b9a1c6c93-kube-api-access-ggjtf\") pod \"collect-profiles-29550825-c687k\" (UID: \"850e1ee7-b846-45b3-97ff-b33b9a1c6c93\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550825-c687k"
Mar 09 09:45:00 crc kubenswrapper[4792]: I0309 09:45:00.250795 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/850e1ee7-b846-45b3-97ff-b33b9a1c6c93-config-volume\") pod \"collect-profiles-29550825-c687k\" (UID: \"850e1ee7-b846-45b3-97ff-b33b9a1c6c93\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550825-c687k"
Mar 09 09:45:00 crc kubenswrapper[4792]: I0309 09:45:00.352943 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/850e1ee7-b846-45b3-97ff-b33b9a1c6c93-secret-volume\") pod \"collect-profiles-29550825-c687k\" (UID: \"850e1ee7-b846-45b3-97ff-b33b9a1c6c93\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550825-c687k"
Mar 09 09:45:00 crc kubenswrapper[4792]: I0309 09:45:00.353313 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ggjtf\" (UniqueName: \"kubernetes.io/projected/850e1ee7-b846-45b3-97ff-b33b9a1c6c93-kube-api-access-ggjtf\") pod \"collect-profiles-29550825-c687k\" (UID: \"850e1ee7-b846-45b3-97ff-b33b9a1c6c93\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550825-c687k"
Mar 09 09:45:00 crc kubenswrapper[4792]: I0309 09:45:00.353384 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/850e1ee7-b846-45b3-97ff-b33b9a1c6c93-config-volume\") pod \"collect-profiles-29550825-c687k\" (UID: \"850e1ee7-b846-45b3-97ff-b33b9a1c6c93\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550825-c687k"
Mar 09 09:45:00 crc kubenswrapper[4792]: I0309 09:45:00.355628 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/850e1ee7-b846-45b3-97ff-b33b9a1c6c93-config-volume\") pod \"collect-profiles-29550825-c687k\" (UID: \"850e1ee7-b846-45b3-97ff-b33b9a1c6c93\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550825-c687k"
Mar 09 09:45:00 crc kubenswrapper[4792]: I0309 09:45:00.360764 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/850e1ee7-b846-45b3-97ff-b33b9a1c6c93-secret-volume\") pod \"collect-profiles-29550825-c687k\" (UID: \"850e1ee7-b846-45b3-97ff-b33b9a1c6c93\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550825-c687k"
Mar 09 09:45:00 crc kubenswrapper[4792]: I0309 09:45:00.376841 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ggjtf\" (UniqueName: \"kubernetes.io/projected/850e1ee7-b846-45b3-97ff-b33b9a1c6c93-kube-api-access-ggjtf\") pod \"collect-profiles-29550825-c687k\" (UID: \"850e1ee7-b846-45b3-97ff-b33b9a1c6c93\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550825-c687k"
Mar 09 09:45:00 crc kubenswrapper[4792]: I0309 09:45:00.488211 4792 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29550825-c687k"
Mar 09 09:45:00 crc kubenswrapper[4792]: I0309 09:45:00.931992 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29550825-c687k"]
Mar 09 09:45:01 crc kubenswrapper[4792]: I0309 09:45:01.222692 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29550825-c687k" event={"ID":"850e1ee7-b846-45b3-97ff-b33b9a1c6c93","Type":"ContainerStarted","Data":"cbc19e876a72df0f2d6103262413afd747151063a3bf2ac2163c9a2e003ee718"}
Mar 09 09:45:01 crc kubenswrapper[4792]: I0309 09:45:01.223955 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29550825-c687k" event={"ID":"850e1ee7-b846-45b3-97ff-b33b9a1c6c93","Type":"ContainerStarted","Data":"2857a801a47eedc9a8de8d543a2131ad3aa61830b239479bab1a4b213f695799"}
Mar 09 09:45:01 crc kubenswrapper[4792]: I0309 09:45:01.246820 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29550825-c687k" podStartSLOduration=1.246802503 podStartE2EDuration="1.246802503s" podCreationTimestamp="2026-03-09 09:45:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:45:01.244519455 +0000 UTC m=+2266.274720207" watchObservedRunningTime="2026-03-09 09:45:01.246802503 +0000 UTC m=+2266.277003255"
Mar 09 09:45:02 crc kubenswrapper[4792]: I0309 09:45:02.230967 4792 generic.go:334] "Generic (PLEG): container finished" podID="850e1ee7-b846-45b3-97ff-b33b9a1c6c93" containerID="cbc19e876a72df0f2d6103262413afd747151063a3bf2ac2163c9a2e003ee718" exitCode=0
Mar 09 09:45:02 crc kubenswrapper[4792]: I0309 09:45:02.231018 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29550825-c687k" event={"ID":"850e1ee7-b846-45b3-97ff-b33b9a1c6c93","Type":"ContainerDied","Data":"cbc19e876a72df0f2d6103262413afd747151063a3bf2ac2163c9a2e003ee718"}
Mar 09 09:45:03 crc kubenswrapper[4792]: I0309 09:45:03.618249 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29550825-c687k"
Mar 09 09:45:03 crc kubenswrapper[4792]: I0309 09:45:03.714858 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/850e1ee7-b846-45b3-97ff-b33b9a1c6c93-secret-volume\") pod \"850e1ee7-b846-45b3-97ff-b33b9a1c6c93\" (UID: \"850e1ee7-b846-45b3-97ff-b33b9a1c6c93\") "
Mar 09 09:45:03 crc kubenswrapper[4792]: I0309 09:45:03.714956 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/850e1ee7-b846-45b3-97ff-b33b9a1c6c93-config-volume\") pod \"850e1ee7-b846-45b3-97ff-b33b9a1c6c93\" (UID: \"850e1ee7-b846-45b3-97ff-b33b9a1c6c93\") "
Mar 09 09:45:03 crc kubenswrapper[4792]: I0309 09:45:03.715036 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ggjtf\" (UniqueName: \"kubernetes.io/projected/850e1ee7-b846-45b3-97ff-b33b9a1c6c93-kube-api-access-ggjtf\") pod \"850e1ee7-b846-45b3-97ff-b33b9a1c6c93\" (UID: \"850e1ee7-b846-45b3-97ff-b33b9a1c6c93\") "
Mar 09 09:45:03 crc kubenswrapper[4792]: I0309 09:45:03.716723 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/850e1ee7-b846-45b3-97ff-b33b9a1c6c93-config-volume" (OuterVolumeSpecName: "config-volume") pod "850e1ee7-b846-45b3-97ff-b33b9a1c6c93" (UID: "850e1ee7-b846-45b3-97ff-b33b9a1c6c93"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:45:03 crc kubenswrapper[4792]: I0309 09:45:03.717215 4792 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/850e1ee7-b846-45b3-97ff-b33b9a1c6c93-config-volume\") on node \"crc\" DevicePath \"\""
Mar 09 09:45:03 crc kubenswrapper[4792]: I0309 09:45:03.722895 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/850e1ee7-b846-45b3-97ff-b33b9a1c6c93-kube-api-access-ggjtf" (OuterVolumeSpecName: "kube-api-access-ggjtf") pod "850e1ee7-b846-45b3-97ff-b33b9a1c6c93" (UID: "850e1ee7-b846-45b3-97ff-b33b9a1c6c93"). InnerVolumeSpecName "kube-api-access-ggjtf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:45:03 crc kubenswrapper[4792]: I0309 09:45:03.725548 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/850e1ee7-b846-45b3-97ff-b33b9a1c6c93-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "850e1ee7-b846-45b3-97ff-b33b9a1c6c93" (UID: "850e1ee7-b846-45b3-97ff-b33b9a1c6c93"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:45:03 crc kubenswrapper[4792]: I0309 09:45:03.820592 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ggjtf\" (UniqueName: \"kubernetes.io/projected/850e1ee7-b846-45b3-97ff-b33b9a1c6c93-kube-api-access-ggjtf\") on node \"crc\" DevicePath \"\""
Mar 09 09:45:03 crc kubenswrapper[4792]: I0309 09:45:03.820625 4792 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/850e1ee7-b846-45b3-97ff-b33b9a1c6c93-secret-volume\") on node \"crc\" DevicePath \"\""
Mar 09 09:45:04 crc kubenswrapper[4792]: I0309 09:45:04.249385 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29550825-c687k" event={"ID":"850e1ee7-b846-45b3-97ff-b33b9a1c6c93","Type":"ContainerDied","Data":"2857a801a47eedc9a8de8d543a2131ad3aa61830b239479bab1a4b213f695799"}
Mar 09 09:45:04 crc kubenswrapper[4792]: I0309 09:45:04.249451 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2857a801a47eedc9a8de8d543a2131ad3aa61830b239479bab1a4b213f695799"
Mar 09 09:45:04 crc kubenswrapper[4792]: I0309 09:45:04.249533 4792 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29550825-c687k"
Mar 09 09:45:04 crc kubenswrapper[4792]: I0309 09:45:04.332249 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29550780-48757"]
Mar 09 09:45:04 crc kubenswrapper[4792]: I0309 09:45:04.341216 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29550780-48757"]
Mar 09 09:45:05 crc kubenswrapper[4792]: I0309 09:45:05.678671 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aadae5cd-e840-4618-b021-d8ca0e9169bd" path="/var/lib/kubelet/pods/aadae5cd-e840-4618-b021-d8ca0e9169bd/volumes"
Mar 09 09:45:13 crc kubenswrapper[4792]: I0309 09:45:13.214149 4792 patch_prober.go:28] interesting pod/machine-config-daemon-97tth container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 09 09:45:13 crc kubenswrapper[4792]: I0309 09:45:13.214799 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 09 09:45:13 crc kubenswrapper[4792]: I0309 09:45:13.214852 4792 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-97tth"
Mar 09 09:45:13 crc kubenswrapper[4792]: I0309 09:45:13.215493 4792 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"24d9d0dcf9edc8581a732f09072e639940f290776165f66acae7f86c2095d368"} pod="openshift-machine-config-operator/machine-config-daemon-97tth" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 09 09:45:13 crc kubenswrapper[4792]: I0309 09:45:13.215562 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" containerName="machine-config-daemon" containerID="cri-o://24d9d0dcf9edc8581a732f09072e639940f290776165f66acae7f86c2095d368" gracePeriod=600
Mar 09 09:45:13 crc kubenswrapper[4792]: E0309 09:45:13.349970 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-97tth_openshift-machine-config-operator(bd11045a-d746-4b42-872c-8b8d1dd2d515)\"" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515"
Mar 09 09:45:14 crc kubenswrapper[4792]: I0309 09:45:14.332831 4792 generic.go:334] "Generic (PLEG): container finished" podID="bd11045a-d746-4b42-872c-8b8d1dd2d515" containerID="24d9d0dcf9edc8581a732f09072e639940f290776165f66acae7f86c2095d368" exitCode=0
Mar 09 09:45:14 crc kubenswrapper[4792]: I0309 09:45:14.332900 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-97tth" event={"ID":"bd11045a-d746-4b42-872c-8b8d1dd2d515","Type":"ContainerDied","Data":"24d9d0dcf9edc8581a732f09072e639940f290776165f66acae7f86c2095d368"}
Mar 09 09:45:14 crc kubenswrapper[4792]: I0309 09:45:14.332936 4792 scope.go:117] "RemoveContainer" containerID="6f5e8f72786750c34390c3921b89a9937cb9dce363f8ebe7772b408843cd4da4"
Mar 09 09:45:14 crc kubenswrapper[4792]: I0309 09:45:14.333493 4792 scope.go:117] "RemoveContainer" containerID="24d9d0dcf9edc8581a732f09072e639940f290776165f66acae7f86c2095d368"
Mar 09 09:45:14 crc kubenswrapper[4792]: E0309 09:45:14.333764 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-97tth_openshift-machine-config-operator(bd11045a-d746-4b42-872c-8b8d1dd2d515)\"" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515"
Mar 09 09:45:14 crc kubenswrapper[4792]: I0309 09:45:14.336863 4792 generic.go:334] "Generic (PLEG): container finished" podID="bdb1fadb-a5cc-48c4-b3e4-77f0af224e2a" containerID="b55d1d4d92cc0fa449923074665ecd59b184346d06f8970ac5bea0fc63bb74b2" exitCode=0
Mar 09 09:45:14 crc kubenswrapper[4792]: I0309 09:45:14.336909 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-fnr4l" event={"ID":"bdb1fadb-a5cc-48c4-b3e4-77f0af224e2a","Type":"ContainerDied","Data":"b55d1d4d92cc0fa449923074665ecd59b184346d06f8970ac5bea0fc63bb74b2"}
Mar 09 09:45:15 crc kubenswrapper[4792]: I0309 09:45:15.732798 4792 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-fnr4l"
Mar 09 09:45:15 crc kubenswrapper[4792]: I0309 09:45:15.842877 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/bdb1fadb-a5cc-48c4-b3e4-77f0af224e2a-ceph\") pod \"bdb1fadb-a5cc-48c4-b3e4-77f0af224e2a\" (UID: \"bdb1fadb-a5cc-48c4-b3e4-77f0af224e2a\") "
Mar 09 09:45:15 crc kubenswrapper[4792]: I0309 09:45:15.842962 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bdb1fadb-a5cc-48c4-b3e4-77f0af224e2a-repo-setup-combined-ca-bundle\") pod \"bdb1fadb-a5cc-48c4-b3e4-77f0af224e2a\" (UID: \"bdb1fadb-a5cc-48c4-b3e4-77f0af224e2a\") "
Mar 09 09:45:15 crc kubenswrapper[4792]: I0309 09:45:15.843124 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v2jtv\" (UniqueName: \"kubernetes.io/projected/bdb1fadb-a5cc-48c4-b3e4-77f0af224e2a-kube-api-access-v2jtv\") pod \"bdb1fadb-a5cc-48c4-b3e4-77f0af224e2a\" (UID: \"bdb1fadb-a5cc-48c4-b3e4-77f0af224e2a\") "
Mar 09 09:45:15 crc kubenswrapper[4792]: I0309 09:45:15.843180 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bdb1fadb-a5cc-48c4-b3e4-77f0af224e2a-inventory\") pod \"bdb1fadb-a5cc-48c4-b3e4-77f0af224e2a\" (UID: \"bdb1fadb-a5cc-48c4-b3e4-77f0af224e2a\") "
Mar 09 09:45:15 crc kubenswrapper[4792]: I0309 09:45:15.843306 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bdb1fadb-a5cc-48c4-b3e4-77f0af224e2a-ssh-key-openstack-edpm-ipam\") pod \"bdb1fadb-a5cc-48c4-b3e4-77f0af224e2a\" (UID: \"bdb1fadb-a5cc-48c4-b3e4-77f0af224e2a\") "
Mar 09 09:45:15 crc kubenswrapper[4792]: I0309 09:45:15.848355 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bdb1fadb-a5cc-48c4-b3e4-77f0af224e2a-kube-api-access-v2jtv" (OuterVolumeSpecName: "kube-api-access-v2jtv") pod "bdb1fadb-a5cc-48c4-b3e4-77f0af224e2a" (UID: "bdb1fadb-a5cc-48c4-b3e4-77f0af224e2a"). InnerVolumeSpecName "kube-api-access-v2jtv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:45:15 crc kubenswrapper[4792]: I0309 09:45:15.848904 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bdb1fadb-a5cc-48c4-b3e4-77f0af224e2a-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "bdb1fadb-a5cc-48c4-b3e4-77f0af224e2a" (UID: "bdb1fadb-a5cc-48c4-b3e4-77f0af224e2a"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:45:15 crc kubenswrapper[4792]: I0309 09:45:15.857401 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bdb1fadb-a5cc-48c4-b3e4-77f0af224e2a-ceph" (OuterVolumeSpecName: "ceph") pod "bdb1fadb-a5cc-48c4-b3e4-77f0af224e2a" (UID: "bdb1fadb-a5cc-48c4-b3e4-77f0af224e2a"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:45:15 crc kubenswrapper[4792]: I0309 09:45:15.871082 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bdb1fadb-a5cc-48c4-b3e4-77f0af224e2a-inventory" (OuterVolumeSpecName: "inventory") pod "bdb1fadb-a5cc-48c4-b3e4-77f0af224e2a" (UID: "bdb1fadb-a5cc-48c4-b3e4-77f0af224e2a"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:45:15 crc kubenswrapper[4792]: I0309 09:45:15.872516 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bdb1fadb-a5cc-48c4-b3e4-77f0af224e2a-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "bdb1fadb-a5cc-48c4-b3e4-77f0af224e2a" (UID: "bdb1fadb-a5cc-48c4-b3e4-77f0af224e2a"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:45:15 crc kubenswrapper[4792]: I0309 09:45:15.946408 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v2jtv\" (UniqueName: \"kubernetes.io/projected/bdb1fadb-a5cc-48c4-b3e4-77f0af224e2a-kube-api-access-v2jtv\") on node \"crc\" DevicePath \"\""
Mar 09 09:45:15 crc kubenswrapper[4792]: I0309 09:45:15.946446 4792 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bdb1fadb-a5cc-48c4-b3e4-77f0af224e2a-inventory\") on node \"crc\" DevicePath \"\""
Mar 09 09:45:15 crc kubenswrapper[4792]: I0309 09:45:15.946459 4792 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bdb1fadb-a5cc-48c4-b3e4-77f0af224e2a-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Mar 09 09:45:15 crc kubenswrapper[4792]: I0309 09:45:15.946502 4792 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/bdb1fadb-a5cc-48c4-b3e4-77f0af224e2a-ceph\") on node \"crc\" DevicePath \"\""
Mar 09 09:45:15 crc kubenswrapper[4792]: I0309 09:45:15.946517 4792 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bdb1fadb-a5cc-48c4-b3e4-77f0af224e2a-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 09 09:45:16 crc kubenswrapper[4792]: I0309 09:45:16.362140 4792 kubelet.go:2453] "SyncLoop (PLEG):
event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-fnr4l" event={"ID":"bdb1fadb-a5cc-48c4-b3e4-77f0af224e2a","Type":"ContainerDied","Data":"b224047b930d4997375d35b53f5a3201325a6c46e54da9e55f5de91ad2d4ca75"} Mar 09 09:45:16 crc kubenswrapper[4792]: I0309 09:45:16.362180 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b224047b930d4997375d35b53f5a3201325a6c46e54da9e55f5de91ad2d4ca75" Mar 09 09:45:16 crc kubenswrapper[4792]: I0309 09:45:16.362230 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-fnr4l" Mar 09 09:45:16 crc kubenswrapper[4792]: I0309 09:45:16.451861 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-vndqm"] Mar 09 09:45:16 crc kubenswrapper[4792]: E0309 09:45:16.452293 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="850e1ee7-b846-45b3-97ff-b33b9a1c6c93" containerName="collect-profiles" Mar 09 09:45:16 crc kubenswrapper[4792]: I0309 09:45:16.452310 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="850e1ee7-b846-45b3-97ff-b33b9a1c6c93" containerName="collect-profiles" Mar 09 09:45:16 crc kubenswrapper[4792]: E0309 09:45:16.452336 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bdb1fadb-a5cc-48c4-b3e4-77f0af224e2a" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Mar 09 09:45:16 crc kubenswrapper[4792]: I0309 09:45:16.452346 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="bdb1fadb-a5cc-48c4-b3e4-77f0af224e2a" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Mar 09 09:45:16 crc kubenswrapper[4792]: I0309 09:45:16.452497 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="850e1ee7-b846-45b3-97ff-b33b9a1c6c93" containerName="collect-profiles" Mar 09 09:45:16 crc kubenswrapper[4792]: I0309 09:45:16.452516 4792 
memory_manager.go:354] "RemoveStaleState removing state" podUID="bdb1fadb-a5cc-48c4-b3e4-77f0af224e2a" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Mar 09 09:45:16 crc kubenswrapper[4792]: I0309 09:45:16.453089 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-vndqm" Mar 09 09:45:16 crc kubenswrapper[4792]: I0309 09:45:16.457695 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-4g5l6" Mar 09 09:45:16 crc kubenswrapper[4792]: I0309 09:45:16.457794 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Mar 09 09:45:16 crc kubenswrapper[4792]: I0309 09:45:16.457942 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 09 09:45:16 crc kubenswrapper[4792]: I0309 09:45:16.458224 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 09 09:45:16 crc kubenswrapper[4792]: I0309 09:45:16.460210 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 09 09:45:16 crc kubenswrapper[4792]: I0309 09:45:16.464312 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-vndqm"] Mar 09 09:45:16 crc kubenswrapper[4792]: I0309 09:45:16.557054 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nkj5x\" (UniqueName: \"kubernetes.io/projected/90585516-71d1-4289-8f0d-43884caee227-kube-api-access-nkj5x\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-vndqm\" (UID: \"90585516-71d1-4289-8f0d-43884caee227\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-vndqm" Mar 09 09:45:16 crc kubenswrapper[4792]: I0309 09:45:16.557236 4792 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/90585516-71d1-4289-8f0d-43884caee227-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-vndqm\" (UID: \"90585516-71d1-4289-8f0d-43884caee227\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-vndqm" Mar 09 09:45:16 crc kubenswrapper[4792]: I0309 09:45:16.557483 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/90585516-71d1-4289-8f0d-43884caee227-ceph\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-vndqm\" (UID: \"90585516-71d1-4289-8f0d-43884caee227\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-vndqm" Mar 09 09:45:16 crc kubenswrapper[4792]: I0309 09:45:16.557641 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90585516-71d1-4289-8f0d-43884caee227-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-vndqm\" (UID: \"90585516-71d1-4289-8f0d-43884caee227\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-vndqm" Mar 09 09:45:16 crc kubenswrapper[4792]: I0309 09:45:16.557752 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/90585516-71d1-4289-8f0d-43884caee227-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-vndqm\" (UID: \"90585516-71d1-4289-8f0d-43884caee227\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-vndqm" Mar 09 09:45:16 crc kubenswrapper[4792]: I0309 09:45:16.659197 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/90585516-71d1-4289-8f0d-43884caee227-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-vndqm\" (UID: \"90585516-71d1-4289-8f0d-43884caee227\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-vndqm" Mar 09 09:45:16 crc kubenswrapper[4792]: I0309 09:45:16.659321 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/90585516-71d1-4289-8f0d-43884caee227-ceph\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-vndqm\" (UID: \"90585516-71d1-4289-8f0d-43884caee227\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-vndqm" Mar 09 09:45:16 crc kubenswrapper[4792]: I0309 09:45:16.659384 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90585516-71d1-4289-8f0d-43884caee227-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-vndqm\" (UID: \"90585516-71d1-4289-8f0d-43884caee227\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-vndqm" Mar 09 09:45:16 crc kubenswrapper[4792]: I0309 09:45:16.659443 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/90585516-71d1-4289-8f0d-43884caee227-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-vndqm\" (UID: \"90585516-71d1-4289-8f0d-43884caee227\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-vndqm" Mar 09 09:45:16 crc kubenswrapper[4792]: I0309 09:45:16.659523 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nkj5x\" (UniqueName: \"kubernetes.io/projected/90585516-71d1-4289-8f0d-43884caee227-kube-api-access-nkj5x\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-vndqm\" (UID: \"90585516-71d1-4289-8f0d-43884caee227\") " 
pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-vndqm" Mar 09 09:45:16 crc kubenswrapper[4792]: I0309 09:45:16.663666 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/90585516-71d1-4289-8f0d-43884caee227-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-vndqm\" (UID: \"90585516-71d1-4289-8f0d-43884caee227\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-vndqm" Mar 09 09:45:16 crc kubenswrapper[4792]: I0309 09:45:16.663745 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/90585516-71d1-4289-8f0d-43884caee227-ceph\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-vndqm\" (UID: \"90585516-71d1-4289-8f0d-43884caee227\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-vndqm" Mar 09 09:45:16 crc kubenswrapper[4792]: I0309 09:45:16.664679 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/90585516-71d1-4289-8f0d-43884caee227-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-vndqm\" (UID: \"90585516-71d1-4289-8f0d-43884caee227\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-vndqm" Mar 09 09:45:16 crc kubenswrapper[4792]: I0309 09:45:16.665885 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90585516-71d1-4289-8f0d-43884caee227-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-vndqm\" (UID: \"90585516-71d1-4289-8f0d-43884caee227\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-vndqm" Mar 09 09:45:16 crc kubenswrapper[4792]: I0309 09:45:16.678574 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nkj5x\" (UniqueName: 
\"kubernetes.io/projected/90585516-71d1-4289-8f0d-43884caee227-kube-api-access-nkj5x\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-vndqm\" (UID: \"90585516-71d1-4289-8f0d-43884caee227\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-vndqm" Mar 09 09:45:16 crc kubenswrapper[4792]: I0309 09:45:16.771441 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-vndqm" Mar 09 09:45:17 crc kubenswrapper[4792]: I0309 09:45:17.128276 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-vndqm"] Mar 09 09:45:17 crc kubenswrapper[4792]: I0309 09:45:17.370393 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-vndqm" event={"ID":"90585516-71d1-4289-8f0d-43884caee227","Type":"ContainerStarted","Data":"19636c9c7496b14f139921a335d5b3defb327b01ca29ad60573fe3918faeef60"} Mar 09 09:45:18 crc kubenswrapper[4792]: I0309 09:45:18.380928 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-vndqm" event={"ID":"90585516-71d1-4289-8f0d-43884caee227","Type":"ContainerStarted","Data":"46944a8932d6c36012dc01be189ed92a3266a383f41aaa8e93d0bc8171e2f172"} Mar 09 09:45:18 crc kubenswrapper[4792]: I0309 09:45:18.405201 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-vndqm" podStartSLOduration=1.957525923 podStartE2EDuration="2.405181369s" podCreationTimestamp="2026-03-09 09:45:16 +0000 UTC" firstStartedPulling="2026-03-09 09:45:17.138515972 +0000 UTC m=+2282.168716724" lastFinishedPulling="2026-03-09 09:45:17.586171418 +0000 UTC m=+2282.616372170" observedRunningTime="2026-03-09 09:45:18.400361776 +0000 UTC m=+2283.430562528" watchObservedRunningTime="2026-03-09 09:45:18.405181369 +0000 UTC m=+2283.435382121" Mar 09 
09:45:25 crc kubenswrapper[4792]: I0309 09:45:25.668056 4792 scope.go:117] "RemoveContainer" containerID="24d9d0dcf9edc8581a732f09072e639940f290776165f66acae7f86c2095d368" Mar 09 09:45:25 crc kubenswrapper[4792]: E0309 09:45:25.669125 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-97tth_openshift-machine-config-operator(bd11045a-d746-4b42-872c-8b8d1dd2d515)\"" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" Mar 09 09:45:36 crc kubenswrapper[4792]: I0309 09:45:36.672744 4792 scope.go:117] "RemoveContainer" containerID="24d9d0dcf9edc8581a732f09072e639940f290776165f66acae7f86c2095d368" Mar 09 09:45:36 crc kubenswrapper[4792]: E0309 09:45:36.673769 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-97tth_openshift-machine-config-operator(bd11045a-d746-4b42-872c-8b8d1dd2d515)\"" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" Mar 09 09:45:39 crc kubenswrapper[4792]: I0309 09:45:39.011021 4792 scope.go:117] "RemoveContainer" containerID="3af016df023633048f7a8fd280fffb4aa745eb6a9f65fc6a9b4c578654cfb6cf" Mar 09 09:45:39 crc kubenswrapper[4792]: I0309 09:45:39.054967 4792 scope.go:117] "RemoveContainer" containerID="0d2ba3a4a18a680ca58f3850cfa2b34550efc1260f9d446afb07dca113766b9b" Mar 09 09:45:39 crc kubenswrapper[4792]: I0309 09:45:39.152560 4792 scope.go:117] "RemoveContainer" containerID="73caaf373715300262a340afc5b3fd4cd5f21307a60c1cc6a6e77576ee1cd749" Mar 09 09:45:39 crc kubenswrapper[4792]: I0309 09:45:39.182939 4792 scope.go:117] "RemoveContainer" 
containerID="283b5025e57ea97c4dd7b984e88288fbfd73dbe246ca3559abdea0175d56c3b4" Mar 09 09:45:39 crc kubenswrapper[4792]: I0309 09:45:39.221215 4792 scope.go:117] "RemoveContainer" containerID="9a835914b405660246ab4fb83ec4285a3ed5ea8d17add8fc5583b2c3f86e82f8" Mar 09 09:45:39 crc kubenswrapper[4792]: I0309 09:45:39.277099 4792 scope.go:117] "RemoveContainer" containerID="f9b8075982af5100f219dd254519620c93554beab3eda315267691126a6c860e" Mar 09 09:45:39 crc kubenswrapper[4792]: I0309 09:45:39.310093 4792 scope.go:117] "RemoveContainer" containerID="7ef4eff7b492037a81e29ce5a84bb0e36f8ba627b5bc92b5d5936075c95f43ef" Mar 09 09:45:39 crc kubenswrapper[4792]: I0309 09:45:39.372295 4792 scope.go:117] "RemoveContainer" containerID="2765bf8e7416ce9a5c4bc8417a0fd5123045d8266916cf4788f20e8d189e18c7" Mar 09 09:45:39 crc kubenswrapper[4792]: I0309 09:45:39.440351 4792 scope.go:117] "RemoveContainer" containerID="76e6e59a6cbe9324a897226295db80b5750be1f56f01c7c67683d9eb2261880a" Mar 09 09:45:39 crc kubenswrapper[4792]: I0309 09:45:39.481451 4792 scope.go:117] "RemoveContainer" containerID="d104d2dc86f9cb4cd3a61c24abc309fe0728bfe4b26d245e57c4e0a1793f065c" Mar 09 09:45:39 crc kubenswrapper[4792]: I0309 09:45:39.501442 4792 scope.go:117] "RemoveContainer" containerID="37dcf57975b6d388083f0f16b20d85f151d1702d8448266cf9ef436edcae57e2" Mar 09 09:45:51 crc kubenswrapper[4792]: I0309 09:45:51.662398 4792 scope.go:117] "RemoveContainer" containerID="24d9d0dcf9edc8581a732f09072e639940f290776165f66acae7f86c2095d368" Mar 09 09:45:51 crc kubenswrapper[4792]: E0309 09:45:51.663165 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-97tth_openshift-machine-config-operator(bd11045a-d746-4b42-872c-8b8d1dd2d515)\"" pod="openshift-machine-config-operator/machine-config-daemon-97tth" 
podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" Mar 09 09:46:00 crc kubenswrapper[4792]: I0309 09:46:00.150309 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550826-xl478"] Mar 09 09:46:00 crc kubenswrapper[4792]: I0309 09:46:00.153721 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550826-xl478" Mar 09 09:46:00 crc kubenswrapper[4792]: I0309 09:46:00.156476 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 09:46:00 crc kubenswrapper[4792]: I0309 09:46:00.156477 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 09:46:00 crc kubenswrapper[4792]: I0309 09:46:00.157293 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fwclj" Mar 09 09:46:00 crc kubenswrapper[4792]: I0309 09:46:00.163138 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550826-xl478"] Mar 09 09:46:00 crc kubenswrapper[4792]: I0309 09:46:00.245391 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nfb5f\" (UniqueName: \"kubernetes.io/projected/37698dc4-643b-47b2-94a7-7a6be39c9bd2-kube-api-access-nfb5f\") pod \"auto-csr-approver-29550826-xl478\" (UID: \"37698dc4-643b-47b2-94a7-7a6be39c9bd2\") " pod="openshift-infra/auto-csr-approver-29550826-xl478" Mar 09 09:46:00 crc kubenswrapper[4792]: I0309 09:46:00.347184 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nfb5f\" (UniqueName: \"kubernetes.io/projected/37698dc4-643b-47b2-94a7-7a6be39c9bd2-kube-api-access-nfb5f\") pod \"auto-csr-approver-29550826-xl478\" (UID: \"37698dc4-643b-47b2-94a7-7a6be39c9bd2\") " pod="openshift-infra/auto-csr-approver-29550826-xl478" Mar 09 09:46:00 crc 
kubenswrapper[4792]: I0309 09:46:00.367021 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nfb5f\" (UniqueName: \"kubernetes.io/projected/37698dc4-643b-47b2-94a7-7a6be39c9bd2-kube-api-access-nfb5f\") pod \"auto-csr-approver-29550826-xl478\" (UID: \"37698dc4-643b-47b2-94a7-7a6be39c9bd2\") " pod="openshift-infra/auto-csr-approver-29550826-xl478" Mar 09 09:46:00 crc kubenswrapper[4792]: I0309 09:46:00.476903 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550826-xl478" Mar 09 09:46:00 crc kubenswrapper[4792]: I0309 09:46:00.902233 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550826-xl478"] Mar 09 09:46:01 crc kubenswrapper[4792]: I0309 09:46:01.722005 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550826-xl478" event={"ID":"37698dc4-643b-47b2-94a7-7a6be39c9bd2","Type":"ContainerStarted","Data":"806572e5019c3476068733c3061e8a9d0321be83015f4698df3d6d71cf2385b8"} Mar 09 09:46:02 crc kubenswrapper[4792]: I0309 09:46:02.730278 4792 generic.go:334] "Generic (PLEG): container finished" podID="37698dc4-643b-47b2-94a7-7a6be39c9bd2" containerID="53ecba3a9eb9472e76a04b73068aa4a370428c39f236c74a55bd9ce0a0302e23" exitCode=0 Mar 09 09:46:02 crc kubenswrapper[4792]: I0309 09:46:02.730466 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550826-xl478" event={"ID":"37698dc4-643b-47b2-94a7-7a6be39c9bd2","Type":"ContainerDied","Data":"53ecba3a9eb9472e76a04b73068aa4a370428c39f236c74a55bd9ce0a0302e23"} Mar 09 09:46:03 crc kubenswrapper[4792]: I0309 09:46:03.663326 4792 scope.go:117] "RemoveContainer" containerID="24d9d0dcf9edc8581a732f09072e639940f290776165f66acae7f86c2095d368" Mar 09 09:46:03 crc kubenswrapper[4792]: E0309 09:46:03.663891 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-97tth_openshift-machine-config-operator(bd11045a-d746-4b42-872c-8b8d1dd2d515)\"" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" Mar 09 09:46:04 crc kubenswrapper[4792]: I0309 09:46:04.101295 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550826-xl478" Mar 09 09:46:04 crc kubenswrapper[4792]: I0309 09:46:04.216509 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nfb5f\" (UniqueName: \"kubernetes.io/projected/37698dc4-643b-47b2-94a7-7a6be39c9bd2-kube-api-access-nfb5f\") pod \"37698dc4-643b-47b2-94a7-7a6be39c9bd2\" (UID: \"37698dc4-643b-47b2-94a7-7a6be39c9bd2\") " Mar 09 09:46:04 crc kubenswrapper[4792]: I0309 09:46:04.223563 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/37698dc4-643b-47b2-94a7-7a6be39c9bd2-kube-api-access-nfb5f" (OuterVolumeSpecName: "kube-api-access-nfb5f") pod "37698dc4-643b-47b2-94a7-7a6be39c9bd2" (UID: "37698dc4-643b-47b2-94a7-7a6be39c9bd2"). InnerVolumeSpecName "kube-api-access-nfb5f". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:46:04 crc kubenswrapper[4792]: I0309 09:46:04.319103 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nfb5f\" (UniqueName: \"kubernetes.io/projected/37698dc4-643b-47b2-94a7-7a6be39c9bd2-kube-api-access-nfb5f\") on node \"crc\" DevicePath \"\"" Mar 09 09:46:04 crc kubenswrapper[4792]: I0309 09:46:04.750868 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550826-xl478" event={"ID":"37698dc4-643b-47b2-94a7-7a6be39c9bd2","Type":"ContainerDied","Data":"806572e5019c3476068733c3061e8a9d0321be83015f4698df3d6d71cf2385b8"} Mar 09 09:46:04 crc kubenswrapper[4792]: I0309 09:46:04.750911 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550826-xl478" Mar 09 09:46:04 crc kubenswrapper[4792]: I0309 09:46:04.750922 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="806572e5019c3476068733c3061e8a9d0321be83015f4698df3d6d71cf2385b8" Mar 09 09:46:05 crc kubenswrapper[4792]: I0309 09:46:05.179516 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550820-lnfhn"] Mar 09 09:46:05 crc kubenswrapper[4792]: I0309 09:46:05.187190 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550820-lnfhn"] Mar 09 09:46:05 crc kubenswrapper[4792]: I0309 09:46:05.671907 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ea6cb8b6-189a-41ff-aed8-95729f1ca01c" path="/var/lib/kubelet/pods/ea6cb8b6-189a-41ff-aed8-95729f1ca01c/volumes" Mar 09 09:46:18 crc kubenswrapper[4792]: I0309 09:46:18.662997 4792 scope.go:117] "RemoveContainer" containerID="24d9d0dcf9edc8581a732f09072e639940f290776165f66acae7f86c2095d368" Mar 09 09:46:18 crc kubenswrapper[4792]: E0309 09:46:18.663876 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-97tth_openshift-machine-config-operator(bd11045a-d746-4b42-872c-8b8d1dd2d515)\"" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" Mar 09 09:46:32 crc kubenswrapper[4792]: I0309 09:46:32.662785 4792 scope.go:117] "RemoveContainer" containerID="24d9d0dcf9edc8581a732f09072e639940f290776165f66acae7f86c2095d368" Mar 09 09:46:32 crc kubenswrapper[4792]: E0309 09:46:32.664601 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-97tth_openshift-machine-config-operator(bd11045a-d746-4b42-872c-8b8d1dd2d515)\"" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" Mar 09 09:46:39 crc kubenswrapper[4792]: I0309 09:46:39.726227 4792 scope.go:117] "RemoveContainer" containerID="712c4d8c4bf174cf9a0a715557728bf065ae207a6fef40b6db706c5fae954134" Mar 09 09:46:45 crc kubenswrapper[4792]: I0309 09:46:45.669803 4792 scope.go:117] "RemoveContainer" containerID="24d9d0dcf9edc8581a732f09072e639940f290776165f66acae7f86c2095d368" Mar 09 09:46:45 crc kubenswrapper[4792]: E0309 09:46:45.670793 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-97tth_openshift-machine-config-operator(bd11045a-d746-4b42-872c-8b8d1dd2d515)\"" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" Mar 09 09:46:56 crc kubenswrapper[4792]: I0309 09:46:56.662543 4792 scope.go:117] "RemoveContainer" 
containerID="24d9d0dcf9edc8581a732f09072e639940f290776165f66acae7f86c2095d368"
Mar 09 09:46:56 crc kubenswrapper[4792]: E0309 09:46:56.663464 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-97tth_openshift-machine-config-operator(bd11045a-d746-4b42-872c-8b8d1dd2d515)\"" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515"
Mar 09 09:47:07 crc kubenswrapper[4792]: I0309 09:47:07.663780 4792 scope.go:117] "RemoveContainer" containerID="24d9d0dcf9edc8581a732f09072e639940f290776165f66acae7f86c2095d368"
Mar 09 09:47:07 crc kubenswrapper[4792]: E0309 09:47:07.664673 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-97tth_openshift-machine-config-operator(bd11045a-d746-4b42-872c-8b8d1dd2d515)\"" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515"
Mar 09 09:47:22 crc kubenswrapper[4792]: I0309 09:47:22.662585 4792 scope.go:117] "RemoveContainer" containerID="24d9d0dcf9edc8581a732f09072e639940f290776165f66acae7f86c2095d368"
Mar 09 09:47:22 crc kubenswrapper[4792]: E0309 09:47:22.663431 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-97tth_openshift-machine-config-operator(bd11045a-d746-4b42-872c-8b8d1dd2d515)\"" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515"
Mar 09 09:47:28 crc kubenswrapper[4792]: I0309 09:47:28.393173 4792 generic.go:334] "Generic (PLEG): container finished" podID="90585516-71d1-4289-8f0d-43884caee227" containerID="46944a8932d6c36012dc01be189ed92a3266a383f41aaa8e93d0bc8171e2f172" exitCode=0
Mar 09 09:47:28 crc kubenswrapper[4792]: I0309 09:47:28.393264 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-vndqm" event={"ID":"90585516-71d1-4289-8f0d-43884caee227","Type":"ContainerDied","Data":"46944a8932d6c36012dc01be189ed92a3266a383f41aaa8e93d0bc8171e2f172"}
Mar 09 09:47:29 crc kubenswrapper[4792]: I0309 09:47:29.826838 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-vndqm"
Mar 09 09:47:29 crc kubenswrapper[4792]: I0309 09:47:29.948490 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/90585516-71d1-4289-8f0d-43884caee227-inventory\") pod \"90585516-71d1-4289-8f0d-43884caee227\" (UID: \"90585516-71d1-4289-8f0d-43884caee227\") "
Mar 09 09:47:29 crc kubenswrapper[4792]: I0309 09:47:29.948588 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/90585516-71d1-4289-8f0d-43884caee227-ssh-key-openstack-edpm-ipam\") pod \"90585516-71d1-4289-8f0d-43884caee227\" (UID: \"90585516-71d1-4289-8f0d-43884caee227\") "
Mar 09 09:47:29 crc kubenswrapper[4792]: I0309 09:47:29.948651 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/90585516-71d1-4289-8f0d-43884caee227-ceph\") pod \"90585516-71d1-4289-8f0d-43884caee227\" (UID: \"90585516-71d1-4289-8f0d-43884caee227\") "
Mar 09 09:47:29 crc kubenswrapper[4792]: I0309 09:47:29.948698 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nkj5x\" (UniqueName: \"kubernetes.io/projected/90585516-71d1-4289-8f0d-43884caee227-kube-api-access-nkj5x\") pod \"90585516-71d1-4289-8f0d-43884caee227\" (UID: \"90585516-71d1-4289-8f0d-43884caee227\") "
Mar 09 09:47:29 crc kubenswrapper[4792]: I0309 09:47:29.948775 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90585516-71d1-4289-8f0d-43884caee227-bootstrap-combined-ca-bundle\") pod \"90585516-71d1-4289-8f0d-43884caee227\" (UID: \"90585516-71d1-4289-8f0d-43884caee227\") "
Mar 09 09:47:29 crc kubenswrapper[4792]: I0309 09:47:29.955570 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90585516-71d1-4289-8f0d-43884caee227-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "90585516-71d1-4289-8f0d-43884caee227" (UID: "90585516-71d1-4289-8f0d-43884caee227"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:47:29 crc kubenswrapper[4792]: I0309 09:47:29.956692 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90585516-71d1-4289-8f0d-43884caee227-kube-api-access-nkj5x" (OuterVolumeSpecName: "kube-api-access-nkj5x") pod "90585516-71d1-4289-8f0d-43884caee227" (UID: "90585516-71d1-4289-8f0d-43884caee227"). InnerVolumeSpecName "kube-api-access-nkj5x". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:47:29 crc kubenswrapper[4792]: I0309 09:47:29.961271 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90585516-71d1-4289-8f0d-43884caee227-ceph" (OuterVolumeSpecName: "ceph") pod "90585516-71d1-4289-8f0d-43884caee227" (UID: "90585516-71d1-4289-8f0d-43884caee227"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:47:29 crc kubenswrapper[4792]: I0309 09:47:29.983095 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90585516-71d1-4289-8f0d-43884caee227-inventory" (OuterVolumeSpecName: "inventory") pod "90585516-71d1-4289-8f0d-43884caee227" (UID: "90585516-71d1-4289-8f0d-43884caee227"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:47:29 crc kubenswrapper[4792]: I0309 09:47:29.996152 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90585516-71d1-4289-8f0d-43884caee227-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "90585516-71d1-4289-8f0d-43884caee227" (UID: "90585516-71d1-4289-8f0d-43884caee227"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:47:30 crc kubenswrapper[4792]: I0309 09:47:30.051692 4792 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/90585516-71d1-4289-8f0d-43884caee227-inventory\") on node \"crc\" DevicePath \"\""
Mar 09 09:47:30 crc kubenswrapper[4792]: I0309 09:47:30.051742 4792 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/90585516-71d1-4289-8f0d-43884caee227-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Mar 09 09:47:30 crc kubenswrapper[4792]: I0309 09:47:30.051758 4792 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/90585516-71d1-4289-8f0d-43884caee227-ceph\") on node \"crc\" DevicePath \"\""
Mar 09 09:47:30 crc kubenswrapper[4792]: I0309 09:47:30.051769 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nkj5x\" (UniqueName: \"kubernetes.io/projected/90585516-71d1-4289-8f0d-43884caee227-kube-api-access-nkj5x\") on node \"crc\" DevicePath \"\""
Mar 09 09:47:30 crc kubenswrapper[4792]: I0309 09:47:30.051782 4792 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90585516-71d1-4289-8f0d-43884caee227-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 09 09:47:30 crc kubenswrapper[4792]: I0309 09:47:30.408092 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-vndqm" event={"ID":"90585516-71d1-4289-8f0d-43884caee227","Type":"ContainerDied","Data":"19636c9c7496b14f139921a335d5b3defb327b01ca29ad60573fe3918faeef60"}
Mar 09 09:47:30 crc kubenswrapper[4792]: I0309 09:47:30.408132 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="19636c9c7496b14f139921a335d5b3defb327b01ca29ad60573fe3918faeef60"
Mar 09 09:47:30 crc kubenswrapper[4792]: I0309 09:47:30.408184 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-vndqm"
Mar 09 09:47:30 crc kubenswrapper[4792]: I0309 09:47:30.514626 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-nhjgt"]
Mar 09 09:47:30 crc kubenswrapper[4792]: E0309 09:47:30.514995 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37698dc4-643b-47b2-94a7-7a6be39c9bd2" containerName="oc"
Mar 09 09:47:30 crc kubenswrapper[4792]: I0309 09:47:30.515013 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="37698dc4-643b-47b2-94a7-7a6be39c9bd2" containerName="oc"
Mar 09 09:47:30 crc kubenswrapper[4792]: E0309 09:47:30.515033 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90585516-71d1-4289-8f0d-43884caee227" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam"
Mar 09 09:47:30 crc kubenswrapper[4792]: I0309 09:47:30.515040 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="90585516-71d1-4289-8f0d-43884caee227" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam"
Mar 09 09:47:30 crc kubenswrapper[4792]: I0309 09:47:30.515204 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="37698dc4-643b-47b2-94a7-7a6be39c9bd2" containerName="oc"
Mar 09 09:47:30 crc kubenswrapper[4792]: I0309 09:47:30.515217 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="90585516-71d1-4289-8f0d-43884caee227" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam"
Mar 09 09:47:30 crc kubenswrapper[4792]: I0309 09:47:30.515772 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-nhjgt"
Mar 09 09:47:30 crc kubenswrapper[4792]: I0309 09:47:30.517825 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files"
Mar 09 09:47:30 crc kubenswrapper[4792]: I0309 09:47:30.524520 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Mar 09 09:47:30 crc kubenswrapper[4792]: I0309 09:47:30.524809 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Mar 09 09:47:30 crc kubenswrapper[4792]: I0309 09:47:30.524931 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Mar 09 09:47:30 crc kubenswrapper[4792]: I0309 09:47:30.525042 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-4g5l6"
Mar 09 09:47:30 crc kubenswrapper[4792]: I0309 09:47:30.539319 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-nhjgt"]
Mar 09 09:47:30 crc kubenswrapper[4792]: I0309 09:47:30.661964 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1bedab89-65bc-478a-abf1-3e3429951e71-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-nhjgt\" (UID: \"1bedab89-65bc-478a-abf1-3e3429951e71\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-nhjgt"
Mar 09 09:47:30 crc kubenswrapper[4792]: I0309 09:47:30.662055 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7tmnt\" (UniqueName: \"kubernetes.io/projected/1bedab89-65bc-478a-abf1-3e3429951e71-kube-api-access-7tmnt\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-nhjgt\" (UID: \"1bedab89-65bc-478a-abf1-3e3429951e71\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-nhjgt"
Mar 09 09:47:30 crc kubenswrapper[4792]: I0309 09:47:30.662100 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/1bedab89-65bc-478a-abf1-3e3429951e71-ceph\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-nhjgt\" (UID: \"1bedab89-65bc-478a-abf1-3e3429951e71\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-nhjgt"
Mar 09 09:47:30 crc kubenswrapper[4792]: I0309 09:47:30.664738 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1bedab89-65bc-478a-abf1-3e3429951e71-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-nhjgt\" (UID: \"1bedab89-65bc-478a-abf1-3e3429951e71\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-nhjgt"
Mar 09 09:47:30 crc kubenswrapper[4792]: I0309 09:47:30.766687 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1bedab89-65bc-478a-abf1-3e3429951e71-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-nhjgt\" (UID: \"1bedab89-65bc-478a-abf1-3e3429951e71\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-nhjgt"
Mar 09 09:47:30 crc kubenswrapper[4792]: I0309 09:47:30.766811 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7tmnt\" (UniqueName: \"kubernetes.io/projected/1bedab89-65bc-478a-abf1-3e3429951e71-kube-api-access-7tmnt\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-nhjgt\" (UID: \"1bedab89-65bc-478a-abf1-3e3429951e71\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-nhjgt"
Mar 09 09:47:30 crc kubenswrapper[4792]: I0309 09:47:30.766852 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/1bedab89-65bc-478a-abf1-3e3429951e71-ceph\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-nhjgt\" (UID: \"1bedab89-65bc-478a-abf1-3e3429951e71\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-nhjgt"
Mar 09 09:47:30 crc kubenswrapper[4792]: I0309 09:47:30.766887 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1bedab89-65bc-478a-abf1-3e3429951e71-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-nhjgt\" (UID: \"1bedab89-65bc-478a-abf1-3e3429951e71\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-nhjgt"
Mar 09 09:47:30 crc kubenswrapper[4792]: I0309 09:47:30.775809 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/1bedab89-65bc-478a-abf1-3e3429951e71-ceph\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-nhjgt\" (UID: \"1bedab89-65bc-478a-abf1-3e3429951e71\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-nhjgt"
Mar 09 09:47:30 crc kubenswrapper[4792]: I0309 09:47:30.776050 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1bedab89-65bc-478a-abf1-3e3429951e71-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-nhjgt\" (UID: \"1bedab89-65bc-478a-abf1-3e3429951e71\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-nhjgt"
Mar 09 09:47:30 crc kubenswrapper[4792]: I0309 09:47:30.781611 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1bedab89-65bc-478a-abf1-3e3429951e71-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-nhjgt\" (UID: \"1bedab89-65bc-478a-abf1-3e3429951e71\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-nhjgt"
Mar 09 09:47:30 crc kubenswrapper[4792]: I0309 09:47:30.786568 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7tmnt\" (UniqueName: \"kubernetes.io/projected/1bedab89-65bc-478a-abf1-3e3429951e71-kube-api-access-7tmnt\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-nhjgt\" (UID: \"1bedab89-65bc-478a-abf1-3e3429951e71\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-nhjgt"
Mar 09 09:47:30 crc kubenswrapper[4792]: I0309 09:47:30.831543 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-nhjgt"
Mar 09 09:47:31 crc kubenswrapper[4792]: I0309 09:47:31.343817 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-nhjgt"]
Mar 09 09:47:31 crc kubenswrapper[4792]: I0309 09:47:31.351618 4792 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 09 09:47:31 crc kubenswrapper[4792]: I0309 09:47:31.416638 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-nhjgt" event={"ID":"1bedab89-65bc-478a-abf1-3e3429951e71","Type":"ContainerStarted","Data":"5b70590477662309001630a6e59f0c587d46db2139a8041b4375cd477000c8da"}
Mar 09 09:47:32 crc kubenswrapper[4792]: I0309 09:47:32.425479 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-nhjgt" event={"ID":"1bedab89-65bc-478a-abf1-3e3429951e71","Type":"ContainerStarted","Data":"8c398fd1fc6f71e9e4fd7b4625c181b5e6e2bf00c9d8d0bd48fa1e508f2c84cd"}
Mar 09 09:47:32 crc kubenswrapper[4792]: I0309 09:47:32.450401 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-nhjgt" podStartSLOduration=1.939806231 podStartE2EDuration="2.450379016s" podCreationTimestamp="2026-03-09 09:47:30 +0000 UTC" firstStartedPulling="2026-03-09 09:47:31.351367341 +0000 UTC m=+2416.381568093" lastFinishedPulling="2026-03-09 09:47:31.861940126 +0000 UTC m=+2416.892140878" observedRunningTime="2026-03-09 09:47:32.440997174 +0000 UTC m=+2417.471197926" watchObservedRunningTime="2026-03-09 09:47:32.450379016 +0000 UTC m=+2417.480579778"
Mar 09 09:47:36 crc kubenswrapper[4792]: I0309 09:47:36.663409 4792 scope.go:117] "RemoveContainer" containerID="24d9d0dcf9edc8581a732f09072e639940f290776165f66acae7f86c2095d368"
Mar 09 09:47:36 crc kubenswrapper[4792]: E0309 09:47:36.664778 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-97tth_openshift-machine-config-operator(bd11045a-d746-4b42-872c-8b8d1dd2d515)\"" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515"
Mar 09 09:47:49 crc kubenswrapper[4792]: I0309 09:47:49.663201 4792 scope.go:117] "RemoveContainer" containerID="24d9d0dcf9edc8581a732f09072e639940f290776165f66acae7f86c2095d368"
Mar 09 09:47:49 crc kubenswrapper[4792]: E0309 09:47:49.663999 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-97tth_openshift-machine-config-operator(bd11045a-d746-4b42-872c-8b8d1dd2d515)\"" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515"
Mar 09 09:47:58 crc kubenswrapper[4792]: I0309 09:47:58.626141 4792 generic.go:334] "Generic (PLEG): container finished" podID="1bedab89-65bc-478a-abf1-3e3429951e71" containerID="8c398fd1fc6f71e9e4fd7b4625c181b5e6e2bf00c9d8d0bd48fa1e508f2c84cd" exitCode=0
Mar 09 09:47:58 crc kubenswrapper[4792]: I0309 09:47:58.626250 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-nhjgt" event={"ID":"1bedab89-65bc-478a-abf1-3e3429951e71","Type":"ContainerDied","Data":"8c398fd1fc6f71e9e4fd7b4625c181b5e6e2bf00c9d8d0bd48fa1e508f2c84cd"}
Mar 09 09:47:59 crc kubenswrapper[4792]: I0309 09:47:59.998852 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-nhjgt"
Mar 09 09:48:00 crc kubenswrapper[4792]: I0309 09:48:00.081823 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7tmnt\" (UniqueName: \"kubernetes.io/projected/1bedab89-65bc-478a-abf1-3e3429951e71-kube-api-access-7tmnt\") pod \"1bedab89-65bc-478a-abf1-3e3429951e71\" (UID: \"1bedab89-65bc-478a-abf1-3e3429951e71\") "
Mar 09 09:48:00 crc kubenswrapper[4792]: I0309 09:48:00.081940 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1bedab89-65bc-478a-abf1-3e3429951e71-ssh-key-openstack-edpm-ipam\") pod \"1bedab89-65bc-478a-abf1-3e3429951e71\" (UID: \"1bedab89-65bc-478a-abf1-3e3429951e71\") "
Mar 09 09:48:00 crc kubenswrapper[4792]: I0309 09:48:00.082048 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/1bedab89-65bc-478a-abf1-3e3429951e71-ceph\") pod \"1bedab89-65bc-478a-abf1-3e3429951e71\" (UID: \"1bedab89-65bc-478a-abf1-3e3429951e71\") "
Mar 09 09:48:00 crc kubenswrapper[4792]: I0309 09:48:00.082148 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1bedab89-65bc-478a-abf1-3e3429951e71-inventory\") pod \"1bedab89-65bc-478a-abf1-3e3429951e71\" (UID: \"1bedab89-65bc-478a-abf1-3e3429951e71\") "
Mar 09 09:48:00 crc kubenswrapper[4792]: I0309 09:48:00.088379 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bedab89-65bc-478a-abf1-3e3429951e71-ceph" (OuterVolumeSpecName: "ceph") pod "1bedab89-65bc-478a-abf1-3e3429951e71" (UID: "1bedab89-65bc-478a-abf1-3e3429951e71"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:48:00 crc kubenswrapper[4792]: I0309 09:48:00.088798 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bedab89-65bc-478a-abf1-3e3429951e71-kube-api-access-7tmnt" (OuterVolumeSpecName: "kube-api-access-7tmnt") pod "1bedab89-65bc-478a-abf1-3e3429951e71" (UID: "1bedab89-65bc-478a-abf1-3e3429951e71"). InnerVolumeSpecName "kube-api-access-7tmnt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:48:00 crc kubenswrapper[4792]: I0309 09:48:00.110678 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bedab89-65bc-478a-abf1-3e3429951e71-inventory" (OuterVolumeSpecName: "inventory") pod "1bedab89-65bc-478a-abf1-3e3429951e71" (UID: "1bedab89-65bc-478a-abf1-3e3429951e71"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:48:00 crc kubenswrapper[4792]: I0309 09:48:00.112340 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bedab89-65bc-478a-abf1-3e3429951e71-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "1bedab89-65bc-478a-abf1-3e3429951e71" (UID: "1bedab89-65bc-478a-abf1-3e3429951e71"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:48:00 crc kubenswrapper[4792]: I0309 09:48:00.147432 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550828-c7c44"]
Mar 09 09:48:00 crc kubenswrapper[4792]: E0309 09:48:00.147946 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1bedab89-65bc-478a-abf1-3e3429951e71" containerName="configure-network-edpm-deployment-openstack-edpm-ipam"
Mar 09 09:48:00 crc kubenswrapper[4792]: I0309 09:48:00.147989 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="1bedab89-65bc-478a-abf1-3e3429951e71" containerName="configure-network-edpm-deployment-openstack-edpm-ipam"
Mar 09 09:48:00 crc kubenswrapper[4792]: I0309 09:48:00.148237 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="1bedab89-65bc-478a-abf1-3e3429951e71" containerName="configure-network-edpm-deployment-openstack-edpm-ipam"
Mar 09 09:48:00 crc kubenswrapper[4792]: I0309 09:48:00.148985 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550828-c7c44"
Mar 09 09:48:00 crc kubenswrapper[4792]: I0309 09:48:00.152024 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 09 09:48:00 crc kubenswrapper[4792]: I0309 09:48:00.152177 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 09 09:48:00 crc kubenswrapper[4792]: I0309 09:48:00.152938 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fwclj"
Mar 09 09:48:00 crc kubenswrapper[4792]: I0309 09:48:00.170295 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550828-c7c44"]
Mar 09 09:48:00 crc kubenswrapper[4792]: I0309 09:48:00.186698 4792 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/1bedab89-65bc-478a-abf1-3e3429951e71-ceph\") on node \"crc\" DevicePath \"\""
Mar 09 09:48:00 crc kubenswrapper[4792]: I0309 09:48:00.186726 4792 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1bedab89-65bc-478a-abf1-3e3429951e71-inventory\") on node \"crc\" DevicePath \"\""
Mar 09 09:48:00 crc kubenswrapper[4792]: I0309 09:48:00.186737 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7tmnt\" (UniqueName: \"kubernetes.io/projected/1bedab89-65bc-478a-abf1-3e3429951e71-kube-api-access-7tmnt\") on node \"crc\" DevicePath \"\""
Mar 09 09:48:00 crc kubenswrapper[4792]: I0309 09:48:00.186746 4792 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1bedab89-65bc-478a-abf1-3e3429951e71-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Mar 09 09:48:00 crc kubenswrapper[4792]: I0309 09:48:00.288978 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qc4nz\" (UniqueName: \"kubernetes.io/projected/13d484cf-4b10-4b25-978f-8149fd45aa5b-kube-api-access-qc4nz\") pod \"auto-csr-approver-29550828-c7c44\" (UID: \"13d484cf-4b10-4b25-978f-8149fd45aa5b\") " pod="openshift-infra/auto-csr-approver-29550828-c7c44"
Mar 09 09:48:00 crc kubenswrapper[4792]: I0309 09:48:00.390553 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qc4nz\" (UniqueName: \"kubernetes.io/projected/13d484cf-4b10-4b25-978f-8149fd45aa5b-kube-api-access-qc4nz\") pod \"auto-csr-approver-29550828-c7c44\" (UID: \"13d484cf-4b10-4b25-978f-8149fd45aa5b\") " pod="openshift-infra/auto-csr-approver-29550828-c7c44"
Mar 09 09:48:00 crc kubenswrapper[4792]: I0309 09:48:00.415848 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qc4nz\" (UniqueName: \"kubernetes.io/projected/13d484cf-4b10-4b25-978f-8149fd45aa5b-kube-api-access-qc4nz\") pod \"auto-csr-approver-29550828-c7c44\" (UID: \"13d484cf-4b10-4b25-978f-8149fd45aa5b\") " pod="openshift-infra/auto-csr-approver-29550828-c7c44"
Mar 09 09:48:00 crc kubenswrapper[4792]: I0309 09:48:00.497062 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550828-c7c44"
Mar 09 09:48:00 crc kubenswrapper[4792]: I0309 09:48:00.649089 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-nhjgt" event={"ID":"1bedab89-65bc-478a-abf1-3e3429951e71","Type":"ContainerDied","Data":"5b70590477662309001630a6e59f0c587d46db2139a8041b4375cd477000c8da"}
Mar 09 09:48:00 crc kubenswrapper[4792]: I0309 09:48:00.649134 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5b70590477662309001630a6e59f0c587d46db2139a8041b4375cd477000c8da"
Mar 09 09:48:00 crc kubenswrapper[4792]: I0309 09:48:00.649187 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-nhjgt"
Mar 09 09:48:00 crc kubenswrapper[4792]: I0309 09:48:00.742404 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-vwq86"]
Mar 09 09:48:00 crc kubenswrapper[4792]: I0309 09:48:00.750281 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-vwq86"
Mar 09 09:48:00 crc kubenswrapper[4792]: I0309 09:48:00.754629 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Mar 09 09:48:00 crc kubenswrapper[4792]: I0309 09:48:00.754914 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Mar 09 09:48:00 crc kubenswrapper[4792]: I0309 09:48:00.755207 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-4g5l6"
Mar 09 09:48:00 crc kubenswrapper[4792]: I0309 09:48:00.755334 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Mar 09 09:48:00 crc kubenswrapper[4792]: I0309 09:48:00.758782 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-vwq86"]
Mar 09 09:48:00 crc kubenswrapper[4792]: I0309 09:48:00.759210 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files"
Mar 09 09:48:00 crc kubenswrapper[4792]: I0309 09:48:00.799050 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/c83b88c6-39ae-4077-84b9-e10f71a53d6e-ceph\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-vwq86\" (UID: \"c83b88c6-39ae-4077-84b9-e10f71a53d6e\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-vwq86"
Mar 09 09:48:00 crc kubenswrapper[4792]: I0309 09:48:00.799227 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c83b88c6-39ae-4077-84b9-e10f71a53d6e-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-vwq86\" (UID: \"c83b88c6-39ae-4077-84b9-e10f71a53d6e\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-vwq86"
Mar 09 09:48:00 crc kubenswrapper[4792]: I0309 09:48:00.799315 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c83b88c6-39ae-4077-84b9-e10f71a53d6e-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-vwq86\" (UID: \"c83b88c6-39ae-4077-84b9-e10f71a53d6e\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-vwq86"
Mar 09 09:48:00 crc kubenswrapper[4792]: I0309 09:48:00.799391 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vsxlk\" (UniqueName: \"kubernetes.io/projected/c83b88c6-39ae-4077-84b9-e10f71a53d6e-kube-api-access-vsxlk\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-vwq86\" (UID: \"c83b88c6-39ae-4077-84b9-e10f71a53d6e\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-vwq86"
Mar 09 09:48:00 crc kubenswrapper[4792]: I0309 09:48:00.901036 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c83b88c6-39ae-4077-84b9-e10f71a53d6e-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-vwq86\" (UID: \"c83b88c6-39ae-4077-84b9-e10f71a53d6e\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-vwq86"
Mar 09 09:48:00 crc kubenswrapper[4792]: I0309 09:48:00.901164 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vsxlk\" (UniqueName: \"kubernetes.io/projected/c83b88c6-39ae-4077-84b9-e10f71a53d6e-kube-api-access-vsxlk\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-vwq86\" (UID: \"c83b88c6-39ae-4077-84b9-e10f71a53d6e\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-vwq86"
Mar 09 09:48:00 crc kubenswrapper[4792]: I0309 09:48:00.901604 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/c83b88c6-39ae-4077-84b9-e10f71a53d6e-ceph\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-vwq86\" (UID: \"c83b88c6-39ae-4077-84b9-e10f71a53d6e\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-vwq86"
Mar 09 09:48:00 crc kubenswrapper[4792]: I0309 09:48:00.902110 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c83b88c6-39ae-4077-84b9-e10f71a53d6e-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-vwq86\" (UID: \"c83b88c6-39ae-4077-84b9-e10f71a53d6e\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-vwq86"
Mar 09 09:48:00 crc kubenswrapper[4792]: I0309 09:48:00.906806 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c83b88c6-39ae-4077-84b9-e10f71a53d6e-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-vwq86\" (UID: \"c83b88c6-39ae-4077-84b9-e10f71a53d6e\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-vwq86"
Mar 09 09:48:00 crc kubenswrapper[4792]: I0309 09:48:00.906826 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/c83b88c6-39ae-4077-84b9-e10f71a53d6e-ceph\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-vwq86\" (UID: \"c83b88c6-39ae-4077-84b9-e10f71a53d6e\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-vwq86"
Mar 09 09:48:00 crc kubenswrapper[4792]: I0309 09:48:00.906857 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c83b88c6-39ae-4077-84b9-e10f71a53d6e-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-vwq86\" (UID: \"c83b88c6-39ae-4077-84b9-e10f71a53d6e\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-vwq86"
Mar 09 09:48:00 crc kubenswrapper[4792]: I0309 09:48:00.921546 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vsxlk\" (UniqueName: \"kubernetes.io/projected/c83b88c6-39ae-4077-84b9-e10f71a53d6e-kube-api-access-vsxlk\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-vwq86\" (UID: \"c83b88c6-39ae-4077-84b9-e10f71a53d6e\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-vwq86"
Mar 09 09:48:00 crc kubenswrapper[4792]: I0309 09:48:00.938006 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550828-c7c44"]
Mar 09 09:48:01 crc kubenswrapper[4792]: I0309 09:48:01.076289 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-vwq86"
Mar 09 09:48:01 crc kubenswrapper[4792]: I0309 09:48:01.626347 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-vwq86"]
Mar 09 09:48:01 crc kubenswrapper[4792]: I0309 09:48:01.659143 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550828-c7c44" event={"ID":"13d484cf-4b10-4b25-978f-8149fd45aa5b","Type":"ContainerStarted","Data":"d608a74fed2bbbb1a23d6877852b3587d484c0be8aae81d556775b9657a3f6c3"}
Mar 09 09:48:01 crc kubenswrapper[4792]: I0309 09:48:01.660566 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-vwq86" event={"ID":"c83b88c6-39ae-4077-84b9-e10f71a53d6e","Type":"ContainerStarted","Data":"b7357492fbc6d42a45b66156cf069b369410892943e45a9b9dc56323280d89d0"}
Mar 09 09:48:02 crc kubenswrapper[4792]: I0309 09:48:02.675231 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-vwq86" event={"ID":"c83b88c6-39ae-4077-84b9-e10f71a53d6e","Type":"ContainerStarted","Data":"9fc5ff55b7895a755319b0492abfdb4800c674cd6f56b0ff453ceeaacec076e1"}
Mar 09 09:48:02 crc kubenswrapper[4792]: I0309 09:48:02.704675 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-vwq86" podStartSLOduration=2.143692714 podStartE2EDuration="2.704651856s" podCreationTimestamp="2026-03-09 09:48:00 +0000 UTC" firstStartedPulling="2026-03-09 09:48:01.636663421 +0000 UTC m=+2446.666864173" lastFinishedPulling="2026-03-09 09:48:02.197622563 +0000 UTC m=+2447.227823315" observedRunningTime="2026-03-09 09:48:02.691992591 +0000 UTC m=+2447.722193363" watchObservedRunningTime="2026-03-09 09:48:02.704651856 +0000 UTC m=+2447.734852608"
Mar 09 09:48:03 crc kubenswrapper[4792]: I0309 09:48:03.662231 4792 scope.go:117] "RemoveContainer" containerID="24d9d0dcf9edc8581a732f09072e639940f290776165f66acae7f86c2095d368"
Mar 09 09:48:03 crc kubenswrapper[4792]: E0309 09:48:03.662682 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-97tth_openshift-machine-config-operator(bd11045a-d746-4b42-872c-8b8d1dd2d515)\"" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515"
Mar 09 09:48:03 crc kubenswrapper[4792]: I0309 09:48:03.683801 4792 generic.go:334] "Generic (PLEG): container finished" podID="13d484cf-4b10-4b25-978f-8149fd45aa5b" containerID="c8eaf420631c9857a7de5ece3313e25cf0ae9629e901d3a967bbcdd325f9e344" exitCode=0
Mar 09 09:48:03 crc kubenswrapper[4792]: I0309 09:48:03.683839 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550828-c7c44"
event={"ID":"13d484cf-4b10-4b25-978f-8149fd45aa5b","Type":"ContainerDied","Data":"c8eaf420631c9857a7de5ece3313e25cf0ae9629e901d3a967bbcdd325f9e344"} Mar 09 09:48:05 crc kubenswrapper[4792]: I0309 09:48:05.053548 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550828-c7c44" Mar 09 09:48:05 crc kubenswrapper[4792]: I0309 09:48:05.189607 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qc4nz\" (UniqueName: \"kubernetes.io/projected/13d484cf-4b10-4b25-978f-8149fd45aa5b-kube-api-access-qc4nz\") pod \"13d484cf-4b10-4b25-978f-8149fd45aa5b\" (UID: \"13d484cf-4b10-4b25-978f-8149fd45aa5b\") " Mar 09 09:48:05 crc kubenswrapper[4792]: I0309 09:48:05.197440 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13d484cf-4b10-4b25-978f-8149fd45aa5b-kube-api-access-qc4nz" (OuterVolumeSpecName: "kube-api-access-qc4nz") pod "13d484cf-4b10-4b25-978f-8149fd45aa5b" (UID: "13d484cf-4b10-4b25-978f-8149fd45aa5b"). InnerVolumeSpecName "kube-api-access-qc4nz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:48:05 crc kubenswrapper[4792]: I0309 09:48:05.291367 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qc4nz\" (UniqueName: \"kubernetes.io/projected/13d484cf-4b10-4b25-978f-8149fd45aa5b-kube-api-access-qc4nz\") on node \"crc\" DevicePath \"\"" Mar 09 09:48:05 crc kubenswrapper[4792]: I0309 09:48:05.699622 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550828-c7c44" event={"ID":"13d484cf-4b10-4b25-978f-8149fd45aa5b","Type":"ContainerDied","Data":"d608a74fed2bbbb1a23d6877852b3587d484c0be8aae81d556775b9657a3f6c3"} Mar 09 09:48:05 crc kubenswrapper[4792]: I0309 09:48:05.699953 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d608a74fed2bbbb1a23d6877852b3587d484c0be8aae81d556775b9657a3f6c3" Mar 09 09:48:05 crc kubenswrapper[4792]: I0309 09:48:05.699664 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550828-c7c44" Mar 09 09:48:06 crc kubenswrapper[4792]: I0309 09:48:06.132142 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550822-57bjc"] Mar 09 09:48:06 crc kubenswrapper[4792]: I0309 09:48:06.139837 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550822-57bjc"] Mar 09 09:48:07 crc kubenswrapper[4792]: I0309 09:48:07.695916 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="91a30492-5834-4ceb-88c2-24fb04d95752" path="/var/lib/kubelet/pods/91a30492-5834-4ceb-88c2-24fb04d95752/volumes" Mar 09 09:48:07 crc kubenswrapper[4792]: I0309 09:48:07.716737 4792 generic.go:334] "Generic (PLEG): container finished" podID="c83b88c6-39ae-4077-84b9-e10f71a53d6e" containerID="9fc5ff55b7895a755319b0492abfdb4800c674cd6f56b0ff453ceeaacec076e1" exitCode=0 Mar 09 09:48:07 crc kubenswrapper[4792]: I0309 09:48:07.716786 4792 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-vwq86" event={"ID":"c83b88c6-39ae-4077-84b9-e10f71a53d6e","Type":"ContainerDied","Data":"9fc5ff55b7895a755319b0492abfdb4800c674cd6f56b0ff453ceeaacec076e1"} Mar 09 09:48:09 crc kubenswrapper[4792]: I0309 09:48:09.118583 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-vwq86" Mar 09 09:48:09 crc kubenswrapper[4792]: I0309 09:48:09.287674 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vsxlk\" (UniqueName: \"kubernetes.io/projected/c83b88c6-39ae-4077-84b9-e10f71a53d6e-kube-api-access-vsxlk\") pod \"c83b88c6-39ae-4077-84b9-e10f71a53d6e\" (UID: \"c83b88c6-39ae-4077-84b9-e10f71a53d6e\") " Mar 09 09:48:09 crc kubenswrapper[4792]: I0309 09:48:09.287747 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c83b88c6-39ae-4077-84b9-e10f71a53d6e-ssh-key-openstack-edpm-ipam\") pod \"c83b88c6-39ae-4077-84b9-e10f71a53d6e\" (UID: \"c83b88c6-39ae-4077-84b9-e10f71a53d6e\") " Mar 09 09:48:09 crc kubenswrapper[4792]: I0309 09:48:09.287794 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/c83b88c6-39ae-4077-84b9-e10f71a53d6e-ceph\") pod \"c83b88c6-39ae-4077-84b9-e10f71a53d6e\" (UID: \"c83b88c6-39ae-4077-84b9-e10f71a53d6e\") " Mar 09 09:48:09 crc kubenswrapper[4792]: I0309 09:48:09.287883 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c83b88c6-39ae-4077-84b9-e10f71a53d6e-inventory\") pod \"c83b88c6-39ae-4077-84b9-e10f71a53d6e\" (UID: \"c83b88c6-39ae-4077-84b9-e10f71a53d6e\") " Mar 09 09:48:09 crc kubenswrapper[4792]: I0309 09:48:09.293722 4792 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c83b88c6-39ae-4077-84b9-e10f71a53d6e-ceph" (OuterVolumeSpecName: "ceph") pod "c83b88c6-39ae-4077-84b9-e10f71a53d6e" (UID: "c83b88c6-39ae-4077-84b9-e10f71a53d6e"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:48:09 crc kubenswrapper[4792]: I0309 09:48:09.294774 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c83b88c6-39ae-4077-84b9-e10f71a53d6e-kube-api-access-vsxlk" (OuterVolumeSpecName: "kube-api-access-vsxlk") pod "c83b88c6-39ae-4077-84b9-e10f71a53d6e" (UID: "c83b88c6-39ae-4077-84b9-e10f71a53d6e"). InnerVolumeSpecName "kube-api-access-vsxlk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:48:09 crc kubenswrapper[4792]: I0309 09:48:09.313898 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c83b88c6-39ae-4077-84b9-e10f71a53d6e-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "c83b88c6-39ae-4077-84b9-e10f71a53d6e" (UID: "c83b88c6-39ae-4077-84b9-e10f71a53d6e"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:48:09 crc kubenswrapper[4792]: I0309 09:48:09.314988 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c83b88c6-39ae-4077-84b9-e10f71a53d6e-inventory" (OuterVolumeSpecName: "inventory") pod "c83b88c6-39ae-4077-84b9-e10f71a53d6e" (UID: "c83b88c6-39ae-4077-84b9-e10f71a53d6e"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:48:09 crc kubenswrapper[4792]: I0309 09:48:09.389873 4792 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c83b88c6-39ae-4077-84b9-e10f71a53d6e-inventory\") on node \"crc\" DevicePath \"\"" Mar 09 09:48:09 crc kubenswrapper[4792]: I0309 09:48:09.390158 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vsxlk\" (UniqueName: \"kubernetes.io/projected/c83b88c6-39ae-4077-84b9-e10f71a53d6e-kube-api-access-vsxlk\") on node \"crc\" DevicePath \"\"" Mar 09 09:48:09 crc kubenswrapper[4792]: I0309 09:48:09.390272 4792 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c83b88c6-39ae-4077-84b9-e10f71a53d6e-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 09 09:48:09 crc kubenswrapper[4792]: I0309 09:48:09.390331 4792 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/c83b88c6-39ae-4077-84b9-e10f71a53d6e-ceph\") on node \"crc\" DevicePath \"\"" Mar 09 09:48:09 crc kubenswrapper[4792]: I0309 09:48:09.735357 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-vwq86" event={"ID":"c83b88c6-39ae-4077-84b9-e10f71a53d6e","Type":"ContainerDied","Data":"b7357492fbc6d42a45b66156cf069b369410892943e45a9b9dc56323280d89d0"} Mar 09 09:48:09 crc kubenswrapper[4792]: I0309 09:48:09.735424 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b7357492fbc6d42a45b66156cf069b369410892943e45a9b9dc56323280d89d0" Mar 09 09:48:09 crc kubenswrapper[4792]: I0309 09:48:09.735684 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-vwq86" Mar 09 09:48:09 crc kubenswrapper[4792]: I0309 09:48:09.820469 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-xwl2x"] Mar 09 09:48:09 crc kubenswrapper[4792]: E0309 09:48:09.820852 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c83b88c6-39ae-4077-84b9-e10f71a53d6e" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Mar 09 09:48:09 crc kubenswrapper[4792]: I0309 09:48:09.820869 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="c83b88c6-39ae-4077-84b9-e10f71a53d6e" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Mar 09 09:48:09 crc kubenswrapper[4792]: E0309 09:48:09.820883 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13d484cf-4b10-4b25-978f-8149fd45aa5b" containerName="oc" Mar 09 09:48:09 crc kubenswrapper[4792]: I0309 09:48:09.820889 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="13d484cf-4b10-4b25-978f-8149fd45aa5b" containerName="oc" Mar 09 09:48:09 crc kubenswrapper[4792]: I0309 09:48:09.821076 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="13d484cf-4b10-4b25-978f-8149fd45aa5b" containerName="oc" Mar 09 09:48:09 crc kubenswrapper[4792]: I0309 09:48:09.821134 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="c83b88c6-39ae-4077-84b9-e10f71a53d6e" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Mar 09 09:48:09 crc kubenswrapper[4792]: I0309 09:48:09.821672 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xwl2x" Mar 09 09:48:09 crc kubenswrapper[4792]: I0309 09:48:09.824021 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Mar 09 09:48:09 crc kubenswrapper[4792]: I0309 09:48:09.825584 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 09 09:48:09 crc kubenswrapper[4792]: I0309 09:48:09.825693 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 09 09:48:09 crc kubenswrapper[4792]: I0309 09:48:09.826590 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 09 09:48:09 crc kubenswrapper[4792]: I0309 09:48:09.835556 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-xwl2x"] Mar 09 09:48:09 crc kubenswrapper[4792]: I0309 09:48:09.837273 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-4g5l6" Mar 09 09:48:09 crc kubenswrapper[4792]: I0309 09:48:09.899703 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/58614884-08cd-4ea5-b45e-45a6157f16aa-ceph\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-xwl2x\" (UID: \"58614884-08cd-4ea5-b45e-45a6157f16aa\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xwl2x" Mar 09 09:48:09 crc kubenswrapper[4792]: I0309 09:48:09.900087 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/58614884-08cd-4ea5-b45e-45a6157f16aa-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-xwl2x\" (UID: 
\"58614884-08cd-4ea5-b45e-45a6157f16aa\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xwl2x" Mar 09 09:48:09 crc kubenswrapper[4792]: I0309 09:48:09.900167 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/58614884-08cd-4ea5-b45e-45a6157f16aa-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-xwl2x\" (UID: \"58614884-08cd-4ea5-b45e-45a6157f16aa\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xwl2x" Mar 09 09:48:09 crc kubenswrapper[4792]: I0309 09:48:09.900216 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-grfpl\" (UniqueName: \"kubernetes.io/projected/58614884-08cd-4ea5-b45e-45a6157f16aa-kube-api-access-grfpl\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-xwl2x\" (UID: \"58614884-08cd-4ea5-b45e-45a6157f16aa\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xwl2x" Mar 09 09:48:10 crc kubenswrapper[4792]: I0309 09:48:10.001533 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/58614884-08cd-4ea5-b45e-45a6157f16aa-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-xwl2x\" (UID: \"58614884-08cd-4ea5-b45e-45a6157f16aa\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xwl2x" Mar 09 09:48:10 crc kubenswrapper[4792]: I0309 09:48:10.001618 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-grfpl\" (UniqueName: \"kubernetes.io/projected/58614884-08cd-4ea5-b45e-45a6157f16aa-kube-api-access-grfpl\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-xwl2x\" (UID: \"58614884-08cd-4ea5-b45e-45a6157f16aa\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xwl2x" Mar 09 09:48:10 crc kubenswrapper[4792]: I0309 09:48:10.001728 4792 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/58614884-08cd-4ea5-b45e-45a6157f16aa-ceph\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-xwl2x\" (UID: \"58614884-08cd-4ea5-b45e-45a6157f16aa\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xwl2x" Mar 09 09:48:10 crc kubenswrapper[4792]: I0309 09:48:10.001784 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/58614884-08cd-4ea5-b45e-45a6157f16aa-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-xwl2x\" (UID: \"58614884-08cd-4ea5-b45e-45a6157f16aa\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xwl2x" Mar 09 09:48:10 crc kubenswrapper[4792]: I0309 09:48:10.008431 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/58614884-08cd-4ea5-b45e-45a6157f16aa-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-xwl2x\" (UID: \"58614884-08cd-4ea5-b45e-45a6157f16aa\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xwl2x" Mar 09 09:48:10 crc kubenswrapper[4792]: I0309 09:48:10.009240 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/58614884-08cd-4ea5-b45e-45a6157f16aa-ceph\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-xwl2x\" (UID: \"58614884-08cd-4ea5-b45e-45a6157f16aa\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xwl2x" Mar 09 09:48:10 crc kubenswrapper[4792]: I0309 09:48:10.011470 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/58614884-08cd-4ea5-b45e-45a6157f16aa-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-xwl2x\" (UID: \"58614884-08cd-4ea5-b45e-45a6157f16aa\") " 
pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xwl2x" Mar 09 09:48:10 crc kubenswrapper[4792]: I0309 09:48:10.023459 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-grfpl\" (UniqueName: \"kubernetes.io/projected/58614884-08cd-4ea5-b45e-45a6157f16aa-kube-api-access-grfpl\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-xwl2x\" (UID: \"58614884-08cd-4ea5-b45e-45a6157f16aa\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xwl2x" Mar 09 09:48:10 crc kubenswrapper[4792]: I0309 09:48:10.137841 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xwl2x" Mar 09 09:48:10 crc kubenswrapper[4792]: I0309 09:48:10.867337 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-xwl2x"] Mar 09 09:48:11 crc kubenswrapper[4792]: I0309 09:48:11.751476 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xwl2x" event={"ID":"58614884-08cd-4ea5-b45e-45a6157f16aa","Type":"ContainerStarted","Data":"2dd1083e74c09e8d30c2dbea6376d147f63e10e8b6e2e4d3281294355d4646b1"} Mar 09 09:48:13 crc kubenswrapper[4792]: I0309 09:48:13.790273 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xwl2x" event={"ID":"58614884-08cd-4ea5-b45e-45a6157f16aa","Type":"ContainerStarted","Data":"e37796ead1724c3bbfac84ae589e029c3c55c2f94b1af1857124064fe359e5b5"} Mar 09 09:48:13 crc kubenswrapper[4792]: I0309 09:48:13.804804 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xwl2x" podStartSLOduration=3.005239074 podStartE2EDuration="4.804789001s" podCreationTimestamp="2026-03-09 09:48:09 +0000 UTC" firstStartedPulling="2026-03-09 09:48:10.873351807 +0000 UTC m=+2455.903552559" 
lastFinishedPulling="2026-03-09 09:48:12.672901734 +0000 UTC m=+2457.703102486" observedRunningTime="2026-03-09 09:48:13.804420343 +0000 UTC m=+2458.834621095" watchObservedRunningTime="2026-03-09 09:48:13.804789001 +0000 UTC m=+2458.834989753" Mar 09 09:48:16 crc kubenswrapper[4792]: I0309 09:48:16.662605 4792 scope.go:117] "RemoveContainer" containerID="24d9d0dcf9edc8581a732f09072e639940f290776165f66acae7f86c2095d368" Mar 09 09:48:16 crc kubenswrapper[4792]: E0309 09:48:16.663421 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-97tth_openshift-machine-config-operator(bd11045a-d746-4b42-872c-8b8d1dd2d515)\"" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" Mar 09 09:48:29 crc kubenswrapper[4792]: I0309 09:48:29.662747 4792 scope.go:117] "RemoveContainer" containerID="24d9d0dcf9edc8581a732f09072e639940f290776165f66acae7f86c2095d368" Mar 09 09:48:29 crc kubenswrapper[4792]: E0309 09:48:29.663416 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-97tth_openshift-machine-config-operator(bd11045a-d746-4b42-872c-8b8d1dd2d515)\"" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" Mar 09 09:48:39 crc kubenswrapper[4792]: I0309 09:48:39.806157 4792 scope.go:117] "RemoveContainer" containerID="2bbfbd384276a37a6f469736a325d0cd09b308faf21b64aff8876d3792f96374" Mar 09 09:48:43 crc kubenswrapper[4792]: I0309 09:48:43.662304 4792 scope.go:117] "RemoveContainer" containerID="24d9d0dcf9edc8581a732f09072e639940f290776165f66acae7f86c2095d368" Mar 09 09:48:43 crc kubenswrapper[4792]: E0309 
09:48:43.663313 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-97tth_openshift-machine-config-operator(bd11045a-d746-4b42-872c-8b8d1dd2d515)\"" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" Mar 09 09:48:51 crc kubenswrapper[4792]: I0309 09:48:51.057120 4792 generic.go:334] "Generic (PLEG): container finished" podID="58614884-08cd-4ea5-b45e-45a6157f16aa" containerID="e37796ead1724c3bbfac84ae589e029c3c55c2f94b1af1857124064fe359e5b5" exitCode=0 Mar 09 09:48:51 crc kubenswrapper[4792]: I0309 09:48:51.057198 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xwl2x" event={"ID":"58614884-08cd-4ea5-b45e-45a6157f16aa","Type":"ContainerDied","Data":"e37796ead1724c3bbfac84ae589e029c3c55c2f94b1af1857124064fe359e5b5"} Mar 09 09:48:52 crc kubenswrapper[4792]: I0309 09:48:52.477321 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xwl2x" Mar 09 09:48:52 crc kubenswrapper[4792]: I0309 09:48:52.588423 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-grfpl\" (UniqueName: \"kubernetes.io/projected/58614884-08cd-4ea5-b45e-45a6157f16aa-kube-api-access-grfpl\") pod \"58614884-08cd-4ea5-b45e-45a6157f16aa\" (UID: \"58614884-08cd-4ea5-b45e-45a6157f16aa\") " Mar 09 09:48:52 crc kubenswrapper[4792]: I0309 09:48:52.588557 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/58614884-08cd-4ea5-b45e-45a6157f16aa-inventory\") pod \"58614884-08cd-4ea5-b45e-45a6157f16aa\" (UID: \"58614884-08cd-4ea5-b45e-45a6157f16aa\") " Mar 09 09:48:52 crc kubenswrapper[4792]: I0309 09:48:52.588601 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/58614884-08cd-4ea5-b45e-45a6157f16aa-ceph\") pod \"58614884-08cd-4ea5-b45e-45a6157f16aa\" (UID: \"58614884-08cd-4ea5-b45e-45a6157f16aa\") " Mar 09 09:48:52 crc kubenswrapper[4792]: I0309 09:48:52.588701 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/58614884-08cd-4ea5-b45e-45a6157f16aa-ssh-key-openstack-edpm-ipam\") pod \"58614884-08cd-4ea5-b45e-45a6157f16aa\" (UID: \"58614884-08cd-4ea5-b45e-45a6157f16aa\") " Mar 09 09:48:52 crc kubenswrapper[4792]: I0309 09:48:52.596293 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/58614884-08cd-4ea5-b45e-45a6157f16aa-kube-api-access-grfpl" (OuterVolumeSpecName: "kube-api-access-grfpl") pod "58614884-08cd-4ea5-b45e-45a6157f16aa" (UID: "58614884-08cd-4ea5-b45e-45a6157f16aa"). InnerVolumeSpecName "kube-api-access-grfpl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:48:52 crc kubenswrapper[4792]: I0309 09:48:52.609358 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58614884-08cd-4ea5-b45e-45a6157f16aa-ceph" (OuterVolumeSpecName: "ceph") pod "58614884-08cd-4ea5-b45e-45a6157f16aa" (UID: "58614884-08cd-4ea5-b45e-45a6157f16aa"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:48:52 crc kubenswrapper[4792]: I0309 09:48:52.615591 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58614884-08cd-4ea5-b45e-45a6157f16aa-inventory" (OuterVolumeSpecName: "inventory") pod "58614884-08cd-4ea5-b45e-45a6157f16aa" (UID: "58614884-08cd-4ea5-b45e-45a6157f16aa"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:48:52 crc kubenswrapper[4792]: I0309 09:48:52.626319 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58614884-08cd-4ea5-b45e-45a6157f16aa-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "58614884-08cd-4ea5-b45e-45a6157f16aa" (UID: "58614884-08cd-4ea5-b45e-45a6157f16aa"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:48:52 crc kubenswrapper[4792]: I0309 09:48:52.692235 4792 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/58614884-08cd-4ea5-b45e-45a6157f16aa-ceph\") on node \"crc\" DevicePath \"\"" Mar 09 09:48:52 crc kubenswrapper[4792]: I0309 09:48:52.692271 4792 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/58614884-08cd-4ea5-b45e-45a6157f16aa-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 09 09:48:52 crc kubenswrapper[4792]: I0309 09:48:52.692282 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-grfpl\" (UniqueName: \"kubernetes.io/projected/58614884-08cd-4ea5-b45e-45a6157f16aa-kube-api-access-grfpl\") on node \"crc\" DevicePath \"\"" Mar 09 09:48:52 crc kubenswrapper[4792]: I0309 09:48:52.692293 4792 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/58614884-08cd-4ea5-b45e-45a6157f16aa-inventory\") on node \"crc\" DevicePath \"\"" Mar 09 09:48:53 crc kubenswrapper[4792]: I0309 09:48:53.073226 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xwl2x" event={"ID":"58614884-08cd-4ea5-b45e-45a6157f16aa","Type":"ContainerDied","Data":"2dd1083e74c09e8d30c2dbea6376d147f63e10e8b6e2e4d3281294355d4646b1"} Mar 09 09:48:53 crc kubenswrapper[4792]: I0309 09:48:53.073554 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2dd1083e74c09e8d30c2dbea6376d147f63e10e8b6e2e4d3281294355d4646b1" Mar 09 09:48:53 crc kubenswrapper[4792]: I0309 09:48:53.073533 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xwl2x" Mar 09 09:48:53 crc kubenswrapper[4792]: I0309 09:48:53.157522 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-ckfdh"] Mar 09 09:48:53 crc kubenswrapper[4792]: E0309 09:48:53.157952 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58614884-08cd-4ea5-b45e-45a6157f16aa" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Mar 09 09:48:53 crc kubenswrapper[4792]: I0309 09:48:53.157975 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="58614884-08cd-4ea5-b45e-45a6157f16aa" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Mar 09 09:48:53 crc kubenswrapper[4792]: I0309 09:48:53.158227 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="58614884-08cd-4ea5-b45e-45a6157f16aa" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Mar 09 09:48:53 crc kubenswrapper[4792]: I0309 09:48:53.159764 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-ckfdh" Mar 09 09:48:53 crc kubenswrapper[4792]: I0309 09:48:53.162039 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 09 09:48:53 crc kubenswrapper[4792]: I0309 09:48:53.162118 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Mar 09 09:48:53 crc kubenswrapper[4792]: I0309 09:48:53.163188 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-4g5l6" Mar 09 09:48:53 crc kubenswrapper[4792]: I0309 09:48:53.164730 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 09 09:48:53 crc kubenswrapper[4792]: I0309 09:48:53.164934 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 09 09:48:53 crc kubenswrapper[4792]: I0309 09:48:53.175924 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-ckfdh"] Mar 09 09:48:53 crc kubenswrapper[4792]: I0309 09:48:53.303740 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hbsvn\" (UniqueName: \"kubernetes.io/projected/ff236fbb-03e6-4227-b10c-9dfeac266de8-kube-api-access-hbsvn\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-ckfdh\" (UID: \"ff236fbb-03e6-4227-b10c-9dfeac266de8\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-ckfdh" Mar 09 09:48:53 crc kubenswrapper[4792]: I0309 09:48:53.303810 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ff236fbb-03e6-4227-b10c-9dfeac266de8-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-ckfdh\" (UID: 
\"ff236fbb-03e6-4227-b10c-9dfeac266de8\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-ckfdh" Mar 09 09:48:53 crc kubenswrapper[4792]: I0309 09:48:53.304043 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ff236fbb-03e6-4227-b10c-9dfeac266de8-ssh-key-openstack-edpm-ipam\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-ckfdh\" (UID: \"ff236fbb-03e6-4227-b10c-9dfeac266de8\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-ckfdh" Mar 09 09:48:53 crc kubenswrapper[4792]: I0309 09:48:53.304223 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ff236fbb-03e6-4227-b10c-9dfeac266de8-ceph\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-ckfdh\" (UID: \"ff236fbb-03e6-4227-b10c-9dfeac266de8\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-ckfdh" Mar 09 09:48:53 crc kubenswrapper[4792]: I0309 09:48:53.405965 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ff236fbb-03e6-4227-b10c-9dfeac266de8-ceph\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-ckfdh\" (UID: \"ff236fbb-03e6-4227-b10c-9dfeac266de8\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-ckfdh" Mar 09 09:48:53 crc kubenswrapper[4792]: I0309 09:48:53.406038 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hbsvn\" (UniqueName: \"kubernetes.io/projected/ff236fbb-03e6-4227-b10c-9dfeac266de8-kube-api-access-hbsvn\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-ckfdh\" (UID: \"ff236fbb-03e6-4227-b10c-9dfeac266de8\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-ckfdh" Mar 09 09:48:53 crc kubenswrapper[4792]: I0309 09:48:53.406103 4792 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ff236fbb-03e6-4227-b10c-9dfeac266de8-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-ckfdh\" (UID: \"ff236fbb-03e6-4227-b10c-9dfeac266de8\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-ckfdh" Mar 09 09:48:53 crc kubenswrapper[4792]: I0309 09:48:53.406173 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ff236fbb-03e6-4227-b10c-9dfeac266de8-ssh-key-openstack-edpm-ipam\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-ckfdh\" (UID: \"ff236fbb-03e6-4227-b10c-9dfeac266de8\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-ckfdh" Mar 09 09:48:53 crc kubenswrapper[4792]: I0309 09:48:53.413165 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ff236fbb-03e6-4227-b10c-9dfeac266de8-ceph\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-ckfdh\" (UID: \"ff236fbb-03e6-4227-b10c-9dfeac266de8\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-ckfdh" Mar 09 09:48:53 crc kubenswrapper[4792]: I0309 09:48:53.415728 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ff236fbb-03e6-4227-b10c-9dfeac266de8-ssh-key-openstack-edpm-ipam\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-ckfdh\" (UID: \"ff236fbb-03e6-4227-b10c-9dfeac266de8\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-ckfdh" Mar 09 09:48:53 crc kubenswrapper[4792]: I0309 09:48:53.416169 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ff236fbb-03e6-4227-b10c-9dfeac266de8-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-ckfdh\" (UID: 
\"ff236fbb-03e6-4227-b10c-9dfeac266de8\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-ckfdh" Mar 09 09:48:53 crc kubenswrapper[4792]: I0309 09:48:53.427134 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hbsvn\" (UniqueName: \"kubernetes.io/projected/ff236fbb-03e6-4227-b10c-9dfeac266de8-kube-api-access-hbsvn\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-ckfdh\" (UID: \"ff236fbb-03e6-4227-b10c-9dfeac266de8\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-ckfdh" Mar 09 09:48:53 crc kubenswrapper[4792]: I0309 09:48:53.486188 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-ckfdh" Mar 09 09:48:54 crc kubenswrapper[4792]: I0309 09:48:54.000551 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-ckfdh"] Mar 09 09:48:54 crc kubenswrapper[4792]: I0309 09:48:54.082363 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-ckfdh" event={"ID":"ff236fbb-03e6-4227-b10c-9dfeac266de8","Type":"ContainerStarted","Data":"b61f2ac3b7cd743607d5530753c11960d368a982cd7422b050e1f56918cd5918"} Mar 09 09:48:54 crc kubenswrapper[4792]: I0309 09:48:54.663105 4792 scope.go:117] "RemoveContainer" containerID="24d9d0dcf9edc8581a732f09072e639940f290776165f66acae7f86c2095d368" Mar 09 09:48:54 crc kubenswrapper[4792]: E0309 09:48:54.663494 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-97tth_openshift-machine-config-operator(bd11045a-d746-4b42-872c-8b8d1dd2d515)\"" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" Mar 09 09:48:55 crc 
kubenswrapper[4792]: I0309 09:48:55.092302 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-ckfdh" event={"ID":"ff236fbb-03e6-4227-b10c-9dfeac266de8","Type":"ContainerStarted","Data":"a5bc045994e751b36ae9470f862eaa922077eeae74a73f9e5171a457087b35a3"} Mar 09 09:48:55 crc kubenswrapper[4792]: I0309 09:48:55.112815 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-ckfdh" podStartSLOduration=1.4557598889999999 podStartE2EDuration="2.112798169s" podCreationTimestamp="2026-03-09 09:48:53 +0000 UTC" firstStartedPulling="2026-03-09 09:48:53.993563477 +0000 UTC m=+2499.023764229" lastFinishedPulling="2026-03-09 09:48:54.650601757 +0000 UTC m=+2499.680802509" observedRunningTime="2026-03-09 09:48:55.110147659 +0000 UTC m=+2500.140348421" watchObservedRunningTime="2026-03-09 09:48:55.112798169 +0000 UTC m=+2500.142998931" Mar 09 09:48:59 crc kubenswrapper[4792]: I0309 09:48:59.123362 4792 generic.go:334] "Generic (PLEG): container finished" podID="ff236fbb-03e6-4227-b10c-9dfeac266de8" containerID="a5bc045994e751b36ae9470f862eaa922077eeae74a73f9e5171a457087b35a3" exitCode=0 Mar 09 09:48:59 crc kubenswrapper[4792]: I0309 09:48:59.123383 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-ckfdh" event={"ID":"ff236fbb-03e6-4227-b10c-9dfeac266de8","Type":"ContainerDied","Data":"a5bc045994e751b36ae9470f862eaa922077eeae74a73f9e5171a457087b35a3"} Mar 09 09:49:00 crc kubenswrapper[4792]: I0309 09:49:00.531041 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-ckfdh" Mar 09 09:49:00 crc kubenswrapper[4792]: I0309 09:49:00.632339 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hbsvn\" (UniqueName: \"kubernetes.io/projected/ff236fbb-03e6-4227-b10c-9dfeac266de8-kube-api-access-hbsvn\") pod \"ff236fbb-03e6-4227-b10c-9dfeac266de8\" (UID: \"ff236fbb-03e6-4227-b10c-9dfeac266de8\") " Mar 09 09:49:00 crc kubenswrapper[4792]: I0309 09:49:00.632442 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ff236fbb-03e6-4227-b10c-9dfeac266de8-ceph\") pod \"ff236fbb-03e6-4227-b10c-9dfeac266de8\" (UID: \"ff236fbb-03e6-4227-b10c-9dfeac266de8\") " Mar 09 09:49:00 crc kubenswrapper[4792]: I0309 09:49:00.632480 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ff236fbb-03e6-4227-b10c-9dfeac266de8-ssh-key-openstack-edpm-ipam\") pod \"ff236fbb-03e6-4227-b10c-9dfeac266de8\" (UID: \"ff236fbb-03e6-4227-b10c-9dfeac266de8\") " Mar 09 09:49:00 crc kubenswrapper[4792]: I0309 09:49:00.632526 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ff236fbb-03e6-4227-b10c-9dfeac266de8-inventory\") pod \"ff236fbb-03e6-4227-b10c-9dfeac266de8\" (UID: \"ff236fbb-03e6-4227-b10c-9dfeac266de8\") " Mar 09 09:49:00 crc kubenswrapper[4792]: I0309 09:49:00.643992 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff236fbb-03e6-4227-b10c-9dfeac266de8-kube-api-access-hbsvn" (OuterVolumeSpecName: "kube-api-access-hbsvn") pod "ff236fbb-03e6-4227-b10c-9dfeac266de8" (UID: "ff236fbb-03e6-4227-b10c-9dfeac266de8"). InnerVolumeSpecName "kube-api-access-hbsvn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:49:00 crc kubenswrapper[4792]: I0309 09:49:00.652271 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff236fbb-03e6-4227-b10c-9dfeac266de8-ceph" (OuterVolumeSpecName: "ceph") pod "ff236fbb-03e6-4227-b10c-9dfeac266de8" (UID: "ff236fbb-03e6-4227-b10c-9dfeac266de8"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:49:00 crc kubenswrapper[4792]: I0309 09:49:00.691222 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff236fbb-03e6-4227-b10c-9dfeac266de8-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "ff236fbb-03e6-4227-b10c-9dfeac266de8" (UID: "ff236fbb-03e6-4227-b10c-9dfeac266de8"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:49:00 crc kubenswrapper[4792]: I0309 09:49:00.731427 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff236fbb-03e6-4227-b10c-9dfeac266de8-inventory" (OuterVolumeSpecName: "inventory") pod "ff236fbb-03e6-4227-b10c-9dfeac266de8" (UID: "ff236fbb-03e6-4227-b10c-9dfeac266de8"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:49:00 crc kubenswrapper[4792]: I0309 09:49:00.737446 4792 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ff236fbb-03e6-4227-b10c-9dfeac266de8-ceph\") on node \"crc\" DevicePath \"\"" Mar 09 09:49:00 crc kubenswrapper[4792]: I0309 09:49:00.737486 4792 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ff236fbb-03e6-4227-b10c-9dfeac266de8-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 09 09:49:00 crc kubenswrapper[4792]: I0309 09:49:00.737499 4792 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ff236fbb-03e6-4227-b10c-9dfeac266de8-inventory\") on node \"crc\" DevicePath \"\"" Mar 09 09:49:00 crc kubenswrapper[4792]: I0309 09:49:00.737510 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hbsvn\" (UniqueName: \"kubernetes.io/projected/ff236fbb-03e6-4227-b10c-9dfeac266de8-kube-api-access-hbsvn\") on node \"crc\" DevicePath \"\"" Mar 09 09:49:01 crc kubenswrapper[4792]: I0309 09:49:01.140176 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-ckfdh" event={"ID":"ff236fbb-03e6-4227-b10c-9dfeac266de8","Type":"ContainerDied","Data":"b61f2ac3b7cd743607d5530753c11960d368a982cd7422b050e1f56918cd5918"} Mar 09 09:49:01 crc kubenswrapper[4792]: I0309 09:49:01.140219 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b61f2ac3b7cd743607d5530753c11960d368a982cd7422b050e1f56918cd5918" Mar 09 09:49:01 crc kubenswrapper[4792]: I0309 09:49:01.140226 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-ckfdh" Mar 09 09:49:01 crc kubenswrapper[4792]: I0309 09:49:01.226027 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7r8kk"] Mar 09 09:49:01 crc kubenswrapper[4792]: E0309 09:49:01.226757 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff236fbb-03e6-4227-b10c-9dfeac266de8" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Mar 09 09:49:01 crc kubenswrapper[4792]: I0309 09:49:01.226790 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff236fbb-03e6-4227-b10c-9dfeac266de8" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Mar 09 09:49:01 crc kubenswrapper[4792]: I0309 09:49:01.227029 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff236fbb-03e6-4227-b10c-9dfeac266de8" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Mar 09 09:49:01 crc kubenswrapper[4792]: I0309 09:49:01.228104 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7r8kk" Mar 09 09:49:01 crc kubenswrapper[4792]: I0309 09:49:01.230998 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Mar 09 09:49:01 crc kubenswrapper[4792]: I0309 09:49:01.231256 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 09 09:49:01 crc kubenswrapper[4792]: I0309 09:49:01.231560 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-4g5l6" Mar 09 09:49:01 crc kubenswrapper[4792]: I0309 09:49:01.232361 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 09 09:49:01 crc kubenswrapper[4792]: I0309 09:49:01.232596 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 09 09:49:01 crc kubenswrapper[4792]: I0309 09:49:01.251931 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7r8kk"] Mar 09 09:49:01 crc kubenswrapper[4792]: I0309 09:49:01.347454 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/1bb0a43c-0d2a-4d09-8f65-4a9e9aa048c2-ceph\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-7r8kk\" (UID: \"1bb0a43c-0d2a-4d09-8f65-4a9e9aa048c2\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7r8kk" Mar 09 09:49:01 crc kubenswrapper[4792]: I0309 09:49:01.347697 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1bb0a43c-0d2a-4d09-8f65-4a9e9aa048c2-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-7r8kk\" (UID: \"1bb0a43c-0d2a-4d09-8f65-4a9e9aa048c2\") " 
pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7r8kk" Mar 09 09:49:01 crc kubenswrapper[4792]: I0309 09:49:01.348820 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9qlqg\" (UniqueName: \"kubernetes.io/projected/1bb0a43c-0d2a-4d09-8f65-4a9e9aa048c2-kube-api-access-9qlqg\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-7r8kk\" (UID: \"1bb0a43c-0d2a-4d09-8f65-4a9e9aa048c2\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7r8kk" Mar 09 09:49:01 crc kubenswrapper[4792]: I0309 09:49:01.348849 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1bb0a43c-0d2a-4d09-8f65-4a9e9aa048c2-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-7r8kk\" (UID: \"1bb0a43c-0d2a-4d09-8f65-4a9e9aa048c2\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7r8kk" Mar 09 09:49:01 crc kubenswrapper[4792]: I0309 09:49:01.450514 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/1bb0a43c-0d2a-4d09-8f65-4a9e9aa048c2-ceph\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-7r8kk\" (UID: \"1bb0a43c-0d2a-4d09-8f65-4a9e9aa048c2\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7r8kk" Mar 09 09:49:01 crc kubenswrapper[4792]: I0309 09:49:01.450575 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1bb0a43c-0d2a-4d09-8f65-4a9e9aa048c2-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-7r8kk\" (UID: \"1bb0a43c-0d2a-4d09-8f65-4a9e9aa048c2\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7r8kk" Mar 09 09:49:01 crc kubenswrapper[4792]: I0309 09:49:01.450651 4792 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-9qlqg\" (UniqueName: \"kubernetes.io/projected/1bb0a43c-0d2a-4d09-8f65-4a9e9aa048c2-kube-api-access-9qlqg\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-7r8kk\" (UID: \"1bb0a43c-0d2a-4d09-8f65-4a9e9aa048c2\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7r8kk" Mar 09 09:49:01 crc kubenswrapper[4792]: I0309 09:49:01.450680 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1bb0a43c-0d2a-4d09-8f65-4a9e9aa048c2-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-7r8kk\" (UID: \"1bb0a43c-0d2a-4d09-8f65-4a9e9aa048c2\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7r8kk" Mar 09 09:49:01 crc kubenswrapper[4792]: I0309 09:49:01.456000 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/1bb0a43c-0d2a-4d09-8f65-4a9e9aa048c2-ceph\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-7r8kk\" (UID: \"1bb0a43c-0d2a-4d09-8f65-4a9e9aa048c2\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7r8kk" Mar 09 09:49:01 crc kubenswrapper[4792]: I0309 09:49:01.464731 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1bb0a43c-0d2a-4d09-8f65-4a9e9aa048c2-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-7r8kk\" (UID: \"1bb0a43c-0d2a-4d09-8f65-4a9e9aa048c2\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7r8kk" Mar 09 09:49:01 crc kubenswrapper[4792]: I0309 09:49:01.465714 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1bb0a43c-0d2a-4d09-8f65-4a9e9aa048c2-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-7r8kk\" (UID: 
\"1bb0a43c-0d2a-4d09-8f65-4a9e9aa048c2\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7r8kk" Mar 09 09:49:01 crc kubenswrapper[4792]: I0309 09:49:01.468634 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9qlqg\" (UniqueName: \"kubernetes.io/projected/1bb0a43c-0d2a-4d09-8f65-4a9e9aa048c2-kube-api-access-9qlqg\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-7r8kk\" (UID: \"1bb0a43c-0d2a-4d09-8f65-4a9e9aa048c2\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7r8kk" Mar 09 09:49:01 crc kubenswrapper[4792]: I0309 09:49:01.545284 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7r8kk" Mar 09 09:49:02 crc kubenswrapper[4792]: I0309 09:49:02.161648 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7r8kk"] Mar 09 09:49:03 crc kubenswrapper[4792]: I0309 09:49:03.168363 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7r8kk" event={"ID":"1bb0a43c-0d2a-4d09-8f65-4a9e9aa048c2","Type":"ContainerStarted","Data":"db995ab05297d13bc43b5989e2e74bcd0a3037bfa8cfcc344b81a6d3af780ba9"} Mar 09 09:49:03 crc kubenswrapper[4792]: I0309 09:49:03.169537 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7r8kk" event={"ID":"1bb0a43c-0d2a-4d09-8f65-4a9e9aa048c2","Type":"ContainerStarted","Data":"2a6faca0e36f9337c0a6b693c45449f3de46a5bf035059d76dc5f80f14680d57"} Mar 09 09:49:03 crc kubenswrapper[4792]: I0309 09:49:03.187719 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7r8kk" podStartSLOduration=1.7041984079999999 podStartE2EDuration="2.187701031s" podCreationTimestamp="2026-03-09 09:49:01 +0000 UTC" 
firstStartedPulling="2026-03-09 09:49:02.176942088 +0000 UTC m=+2507.207142840" lastFinishedPulling="2026-03-09 09:49:02.660444711 +0000 UTC m=+2507.690645463" observedRunningTime="2026-03-09 09:49:03.187601789 +0000 UTC m=+2508.217802551" watchObservedRunningTime="2026-03-09 09:49:03.187701031 +0000 UTC m=+2508.217901783" Mar 09 09:49:08 crc kubenswrapper[4792]: I0309 09:49:08.663399 4792 scope.go:117] "RemoveContainer" containerID="24d9d0dcf9edc8581a732f09072e639940f290776165f66acae7f86c2095d368" Mar 09 09:49:08 crc kubenswrapper[4792]: E0309 09:49:08.664316 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-97tth_openshift-machine-config-operator(bd11045a-d746-4b42-872c-8b8d1dd2d515)\"" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" Mar 09 09:49:23 crc kubenswrapper[4792]: I0309 09:49:23.662806 4792 scope.go:117] "RemoveContainer" containerID="24d9d0dcf9edc8581a732f09072e639940f290776165f66acae7f86c2095d368" Mar 09 09:49:23 crc kubenswrapper[4792]: E0309 09:49:23.664639 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-97tth_openshift-machine-config-operator(bd11045a-d746-4b42-872c-8b8d1dd2d515)\"" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" Mar 09 09:49:37 crc kubenswrapper[4792]: I0309 09:49:37.662087 4792 scope.go:117] "RemoveContainer" containerID="24d9d0dcf9edc8581a732f09072e639940f290776165f66acae7f86c2095d368" Mar 09 09:49:37 crc kubenswrapper[4792]: E0309 09:49:37.662816 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-97tth_openshift-machine-config-operator(bd11045a-d746-4b42-872c-8b8d1dd2d515)\"" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" Mar 09 09:49:43 crc kubenswrapper[4792]: I0309 09:49:43.505714 4792 generic.go:334] "Generic (PLEG): container finished" podID="1bb0a43c-0d2a-4d09-8f65-4a9e9aa048c2" containerID="db995ab05297d13bc43b5989e2e74bcd0a3037bfa8cfcc344b81a6d3af780ba9" exitCode=0 Mar 09 09:49:43 crc kubenswrapper[4792]: I0309 09:49:43.505796 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7r8kk" event={"ID":"1bb0a43c-0d2a-4d09-8f65-4a9e9aa048c2","Type":"ContainerDied","Data":"db995ab05297d13bc43b5989e2e74bcd0a3037bfa8cfcc344b81a6d3af780ba9"} Mar 09 09:49:44 crc kubenswrapper[4792]: I0309 09:49:44.919879 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7r8kk" Mar 09 09:49:45 crc kubenswrapper[4792]: I0309 09:49:45.034966 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9qlqg\" (UniqueName: \"kubernetes.io/projected/1bb0a43c-0d2a-4d09-8f65-4a9e9aa048c2-kube-api-access-9qlqg\") pod \"1bb0a43c-0d2a-4d09-8f65-4a9e9aa048c2\" (UID: \"1bb0a43c-0d2a-4d09-8f65-4a9e9aa048c2\") " Mar 09 09:49:45 crc kubenswrapper[4792]: I0309 09:49:45.035239 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1bb0a43c-0d2a-4d09-8f65-4a9e9aa048c2-ssh-key-openstack-edpm-ipam\") pod \"1bb0a43c-0d2a-4d09-8f65-4a9e9aa048c2\" (UID: \"1bb0a43c-0d2a-4d09-8f65-4a9e9aa048c2\") " Mar 09 09:49:45 crc kubenswrapper[4792]: I0309 09:49:45.035341 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/1bb0a43c-0d2a-4d09-8f65-4a9e9aa048c2-ceph\") pod \"1bb0a43c-0d2a-4d09-8f65-4a9e9aa048c2\" (UID: \"1bb0a43c-0d2a-4d09-8f65-4a9e9aa048c2\") " Mar 09 09:49:45 crc kubenswrapper[4792]: I0309 09:49:45.035460 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1bb0a43c-0d2a-4d09-8f65-4a9e9aa048c2-inventory\") pod \"1bb0a43c-0d2a-4d09-8f65-4a9e9aa048c2\" (UID: \"1bb0a43c-0d2a-4d09-8f65-4a9e9aa048c2\") " Mar 09 09:49:45 crc kubenswrapper[4792]: I0309 09:49:45.046263 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bb0a43c-0d2a-4d09-8f65-4a9e9aa048c2-ceph" (OuterVolumeSpecName: "ceph") pod "1bb0a43c-0d2a-4d09-8f65-4a9e9aa048c2" (UID: "1bb0a43c-0d2a-4d09-8f65-4a9e9aa048c2"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:49:45 crc kubenswrapper[4792]: I0309 09:49:45.047868 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bb0a43c-0d2a-4d09-8f65-4a9e9aa048c2-kube-api-access-9qlqg" (OuterVolumeSpecName: "kube-api-access-9qlqg") pod "1bb0a43c-0d2a-4d09-8f65-4a9e9aa048c2" (UID: "1bb0a43c-0d2a-4d09-8f65-4a9e9aa048c2"). InnerVolumeSpecName "kube-api-access-9qlqg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:49:45 crc kubenswrapper[4792]: I0309 09:49:45.058446 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bb0a43c-0d2a-4d09-8f65-4a9e9aa048c2-inventory" (OuterVolumeSpecName: "inventory") pod "1bb0a43c-0d2a-4d09-8f65-4a9e9aa048c2" (UID: "1bb0a43c-0d2a-4d09-8f65-4a9e9aa048c2"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:49:45 crc kubenswrapper[4792]: I0309 09:49:45.061594 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bb0a43c-0d2a-4d09-8f65-4a9e9aa048c2-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "1bb0a43c-0d2a-4d09-8f65-4a9e9aa048c2" (UID: "1bb0a43c-0d2a-4d09-8f65-4a9e9aa048c2"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:49:45 crc kubenswrapper[4792]: I0309 09:49:45.138129 4792 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/1bb0a43c-0d2a-4d09-8f65-4a9e9aa048c2-ceph\") on node \"crc\" DevicePath \"\"" Mar 09 09:49:45 crc kubenswrapper[4792]: I0309 09:49:45.138178 4792 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1bb0a43c-0d2a-4d09-8f65-4a9e9aa048c2-inventory\") on node \"crc\" DevicePath \"\"" Mar 09 09:49:45 crc kubenswrapper[4792]: I0309 09:49:45.138200 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9qlqg\" (UniqueName: \"kubernetes.io/projected/1bb0a43c-0d2a-4d09-8f65-4a9e9aa048c2-kube-api-access-9qlqg\") on node \"crc\" DevicePath \"\"" Mar 09 09:49:45 crc kubenswrapper[4792]: I0309 09:49:45.138218 4792 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1bb0a43c-0d2a-4d09-8f65-4a9e9aa048c2-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 09 09:49:45 crc kubenswrapper[4792]: I0309 09:49:45.521033 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7r8kk" event={"ID":"1bb0a43c-0d2a-4d09-8f65-4a9e9aa048c2","Type":"ContainerDied","Data":"2a6faca0e36f9337c0a6b693c45449f3de46a5bf035059d76dc5f80f14680d57"} Mar 09 09:49:45 crc kubenswrapper[4792]: I0309 09:49:45.521187 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2a6faca0e36f9337c0a6b693c45449f3de46a5bf035059d76dc5f80f14680d57" Mar 09 09:49:45 crc kubenswrapper[4792]: I0309 09:49:45.521111 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7r8kk" Mar 09 09:49:45 crc kubenswrapper[4792]: I0309 09:49:45.609398 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-jzwbl"] Mar 09 09:49:45 crc kubenswrapper[4792]: E0309 09:49:45.609832 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1bb0a43c-0d2a-4d09-8f65-4a9e9aa048c2" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Mar 09 09:49:45 crc kubenswrapper[4792]: I0309 09:49:45.609854 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="1bb0a43c-0d2a-4d09-8f65-4a9e9aa048c2" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Mar 09 09:49:45 crc kubenswrapper[4792]: I0309 09:49:45.610101 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="1bb0a43c-0d2a-4d09-8f65-4a9e9aa048c2" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Mar 09 09:49:45 crc kubenswrapper[4792]: I0309 09:49:45.610864 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-jzwbl" Mar 09 09:49:45 crc kubenswrapper[4792]: I0309 09:49:45.616518 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 09 09:49:45 crc kubenswrapper[4792]: I0309 09:49:45.616743 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 09 09:49:45 crc kubenswrapper[4792]: I0309 09:49:45.616857 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Mar 09 09:49:45 crc kubenswrapper[4792]: I0309 09:49:45.617157 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 09 09:49:45 crc kubenswrapper[4792]: I0309 09:49:45.617308 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-4g5l6" Mar 09 09:49:45 crc kubenswrapper[4792]: I0309 09:49:45.656972 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-jzwbl"] Mar 09 09:49:45 crc kubenswrapper[4792]: I0309 09:49:45.758963 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/48ec7142-4c09-4a5f-8202-aaf16bb97b26-ceph\") pod \"ssh-known-hosts-edpm-deployment-jzwbl\" (UID: \"48ec7142-4c09-4a5f-8202-aaf16bb97b26\") " pod="openstack/ssh-known-hosts-edpm-deployment-jzwbl" Mar 09 09:49:45 crc kubenswrapper[4792]: I0309 09:49:45.759471 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k256k\" (UniqueName: \"kubernetes.io/projected/48ec7142-4c09-4a5f-8202-aaf16bb97b26-kube-api-access-k256k\") pod \"ssh-known-hosts-edpm-deployment-jzwbl\" (UID: \"48ec7142-4c09-4a5f-8202-aaf16bb97b26\") " pod="openstack/ssh-known-hosts-edpm-deployment-jzwbl" Mar 09 09:49:45 crc 
kubenswrapper[4792]: I0309 09:49:45.759545 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/48ec7142-4c09-4a5f-8202-aaf16bb97b26-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-jzwbl\" (UID: \"48ec7142-4c09-4a5f-8202-aaf16bb97b26\") " pod="openstack/ssh-known-hosts-edpm-deployment-jzwbl" Mar 09 09:49:45 crc kubenswrapper[4792]: I0309 09:49:45.759634 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/48ec7142-4c09-4a5f-8202-aaf16bb97b26-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-jzwbl\" (UID: \"48ec7142-4c09-4a5f-8202-aaf16bb97b26\") " pod="openstack/ssh-known-hosts-edpm-deployment-jzwbl" Mar 09 09:49:45 crc kubenswrapper[4792]: I0309 09:49:45.860567 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/48ec7142-4c09-4a5f-8202-aaf16bb97b26-ceph\") pod \"ssh-known-hosts-edpm-deployment-jzwbl\" (UID: \"48ec7142-4c09-4a5f-8202-aaf16bb97b26\") " pod="openstack/ssh-known-hosts-edpm-deployment-jzwbl" Mar 09 09:49:45 crc kubenswrapper[4792]: I0309 09:49:45.860633 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k256k\" (UniqueName: \"kubernetes.io/projected/48ec7142-4c09-4a5f-8202-aaf16bb97b26-kube-api-access-k256k\") pod \"ssh-known-hosts-edpm-deployment-jzwbl\" (UID: \"48ec7142-4c09-4a5f-8202-aaf16bb97b26\") " pod="openstack/ssh-known-hosts-edpm-deployment-jzwbl" Mar 09 09:49:45 crc kubenswrapper[4792]: I0309 09:49:45.860684 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/48ec7142-4c09-4a5f-8202-aaf16bb97b26-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-jzwbl\" (UID: 
\"48ec7142-4c09-4a5f-8202-aaf16bb97b26\") " pod="openstack/ssh-known-hosts-edpm-deployment-jzwbl" Mar 09 09:49:45 crc kubenswrapper[4792]: I0309 09:49:45.860736 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/48ec7142-4c09-4a5f-8202-aaf16bb97b26-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-jzwbl\" (UID: \"48ec7142-4c09-4a5f-8202-aaf16bb97b26\") " pod="openstack/ssh-known-hosts-edpm-deployment-jzwbl" Mar 09 09:49:45 crc kubenswrapper[4792]: I0309 09:49:45.871088 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/48ec7142-4c09-4a5f-8202-aaf16bb97b26-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-jzwbl\" (UID: \"48ec7142-4c09-4a5f-8202-aaf16bb97b26\") " pod="openstack/ssh-known-hosts-edpm-deployment-jzwbl" Mar 09 09:49:45 crc kubenswrapper[4792]: I0309 09:49:45.871988 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/48ec7142-4c09-4a5f-8202-aaf16bb97b26-ceph\") pod \"ssh-known-hosts-edpm-deployment-jzwbl\" (UID: \"48ec7142-4c09-4a5f-8202-aaf16bb97b26\") " pod="openstack/ssh-known-hosts-edpm-deployment-jzwbl" Mar 09 09:49:45 crc kubenswrapper[4792]: I0309 09:49:45.883687 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/48ec7142-4c09-4a5f-8202-aaf16bb97b26-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-jzwbl\" (UID: \"48ec7142-4c09-4a5f-8202-aaf16bb97b26\") " pod="openstack/ssh-known-hosts-edpm-deployment-jzwbl" Mar 09 09:49:45 crc kubenswrapper[4792]: I0309 09:49:45.892250 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k256k\" (UniqueName: \"kubernetes.io/projected/48ec7142-4c09-4a5f-8202-aaf16bb97b26-kube-api-access-k256k\") pod \"ssh-known-hosts-edpm-deployment-jzwbl\" (UID: 
\"48ec7142-4c09-4a5f-8202-aaf16bb97b26\") " pod="openstack/ssh-known-hosts-edpm-deployment-jzwbl" Mar 09 09:49:45 crc kubenswrapper[4792]: I0309 09:49:45.939984 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-jzwbl" Mar 09 09:49:46 crc kubenswrapper[4792]: I0309 09:49:46.517238 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-jzwbl"] Mar 09 09:49:46 crc kubenswrapper[4792]: I0309 09:49:46.533179 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-jzwbl" event={"ID":"48ec7142-4c09-4a5f-8202-aaf16bb97b26","Type":"ContainerStarted","Data":"ac4533fa29a3be9364a02fa7c718163e540b1c26b32011d635362871bf108e03"} Mar 09 09:49:47 crc kubenswrapper[4792]: I0309 09:49:47.545623 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-jzwbl" event={"ID":"48ec7142-4c09-4a5f-8202-aaf16bb97b26","Type":"ContainerStarted","Data":"07151c95b92cf4d1e9cfbd2442780f150ece9443d3d45a116983a5730edcc724"} Mar 09 09:49:47 crc kubenswrapper[4792]: I0309 09:49:47.571445 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-jzwbl" podStartSLOduration=2.055937636 podStartE2EDuration="2.571425501s" podCreationTimestamp="2026-03-09 09:49:45 +0000 UTC" firstStartedPulling="2026-03-09 09:49:46.523775814 +0000 UTC m=+2551.553976586" lastFinishedPulling="2026-03-09 09:49:47.039263699 +0000 UTC m=+2552.069464451" observedRunningTime="2026-03-09 09:49:47.565453875 +0000 UTC m=+2552.595654647" watchObservedRunningTime="2026-03-09 09:49:47.571425501 +0000 UTC m=+2552.601626253" Mar 09 09:49:51 crc kubenswrapper[4792]: I0309 09:49:51.664119 4792 scope.go:117] "RemoveContainer" containerID="24d9d0dcf9edc8581a732f09072e639940f290776165f66acae7f86c2095d368" Mar 09 09:49:51 crc kubenswrapper[4792]: E0309 09:49:51.664705 4792 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-97tth_openshift-machine-config-operator(bd11045a-d746-4b42-872c-8b8d1dd2d515)\"" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" Mar 09 09:49:56 crc kubenswrapper[4792]: I0309 09:49:56.617186 4792 generic.go:334] "Generic (PLEG): container finished" podID="48ec7142-4c09-4a5f-8202-aaf16bb97b26" containerID="07151c95b92cf4d1e9cfbd2442780f150ece9443d3d45a116983a5730edcc724" exitCode=0 Mar 09 09:49:56 crc kubenswrapper[4792]: I0309 09:49:56.617277 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-jzwbl" event={"ID":"48ec7142-4c09-4a5f-8202-aaf16bb97b26","Type":"ContainerDied","Data":"07151c95b92cf4d1e9cfbd2442780f150ece9443d3d45a116983a5730edcc724"} Mar 09 09:49:58 crc kubenswrapper[4792]: I0309 09:49:58.020961 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-jzwbl" Mar 09 09:49:58 crc kubenswrapper[4792]: I0309 09:49:58.183101 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/48ec7142-4c09-4a5f-8202-aaf16bb97b26-ssh-key-openstack-edpm-ipam\") pod \"48ec7142-4c09-4a5f-8202-aaf16bb97b26\" (UID: \"48ec7142-4c09-4a5f-8202-aaf16bb97b26\") " Mar 09 09:49:58 crc kubenswrapper[4792]: I0309 09:49:58.183155 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k256k\" (UniqueName: \"kubernetes.io/projected/48ec7142-4c09-4a5f-8202-aaf16bb97b26-kube-api-access-k256k\") pod \"48ec7142-4c09-4a5f-8202-aaf16bb97b26\" (UID: \"48ec7142-4c09-4a5f-8202-aaf16bb97b26\") " Mar 09 09:49:58 crc kubenswrapper[4792]: I0309 09:49:58.183267 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/48ec7142-4c09-4a5f-8202-aaf16bb97b26-inventory-0\") pod \"48ec7142-4c09-4a5f-8202-aaf16bb97b26\" (UID: \"48ec7142-4c09-4a5f-8202-aaf16bb97b26\") " Mar 09 09:49:58 crc kubenswrapper[4792]: I0309 09:49:58.183395 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/48ec7142-4c09-4a5f-8202-aaf16bb97b26-ceph\") pod \"48ec7142-4c09-4a5f-8202-aaf16bb97b26\" (UID: \"48ec7142-4c09-4a5f-8202-aaf16bb97b26\") " Mar 09 09:49:58 crc kubenswrapper[4792]: I0309 09:49:58.190989 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/48ec7142-4c09-4a5f-8202-aaf16bb97b26-kube-api-access-k256k" (OuterVolumeSpecName: "kube-api-access-k256k") pod "48ec7142-4c09-4a5f-8202-aaf16bb97b26" (UID: "48ec7142-4c09-4a5f-8202-aaf16bb97b26"). InnerVolumeSpecName "kube-api-access-k256k". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:49:58 crc kubenswrapper[4792]: I0309 09:49:58.197272 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48ec7142-4c09-4a5f-8202-aaf16bb97b26-ceph" (OuterVolumeSpecName: "ceph") pod "48ec7142-4c09-4a5f-8202-aaf16bb97b26" (UID: "48ec7142-4c09-4a5f-8202-aaf16bb97b26"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:49:58 crc kubenswrapper[4792]: I0309 09:49:58.222492 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48ec7142-4c09-4a5f-8202-aaf16bb97b26-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "48ec7142-4c09-4a5f-8202-aaf16bb97b26" (UID: "48ec7142-4c09-4a5f-8202-aaf16bb97b26"). InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:49:58 crc kubenswrapper[4792]: I0309 09:49:58.232104 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48ec7142-4c09-4a5f-8202-aaf16bb97b26-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "48ec7142-4c09-4a5f-8202-aaf16bb97b26" (UID: "48ec7142-4c09-4a5f-8202-aaf16bb97b26"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:49:58 crc kubenswrapper[4792]: I0309 09:49:58.285465 4792 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/48ec7142-4c09-4a5f-8202-aaf16bb97b26-inventory-0\") on node \"crc\" DevicePath \"\"" Mar 09 09:49:58 crc kubenswrapper[4792]: I0309 09:49:58.285501 4792 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/48ec7142-4c09-4a5f-8202-aaf16bb97b26-ceph\") on node \"crc\" DevicePath \"\"" Mar 09 09:49:58 crc kubenswrapper[4792]: I0309 09:49:58.285512 4792 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/48ec7142-4c09-4a5f-8202-aaf16bb97b26-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 09 09:49:58 crc kubenswrapper[4792]: I0309 09:49:58.285526 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k256k\" (UniqueName: \"kubernetes.io/projected/48ec7142-4c09-4a5f-8202-aaf16bb97b26-kube-api-access-k256k\") on node \"crc\" DevicePath \"\"" Mar 09 09:49:58 crc kubenswrapper[4792]: I0309 09:49:58.642759 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-jzwbl" event={"ID":"48ec7142-4c09-4a5f-8202-aaf16bb97b26","Type":"ContainerDied","Data":"ac4533fa29a3be9364a02fa7c718163e540b1c26b32011d635362871bf108e03"} Mar 09 09:49:58 crc kubenswrapper[4792]: I0309 09:49:58.643113 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ac4533fa29a3be9364a02fa7c718163e540b1c26b32011d635362871bf108e03" Mar 09 09:49:58 crc kubenswrapper[4792]: I0309 09:49:58.643235 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-jzwbl" Mar 09 09:49:58 crc kubenswrapper[4792]: I0309 09:49:58.737837 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-67gl6"] Mar 09 09:49:58 crc kubenswrapper[4792]: E0309 09:49:58.738337 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48ec7142-4c09-4a5f-8202-aaf16bb97b26" containerName="ssh-known-hosts-edpm-deployment" Mar 09 09:49:58 crc kubenswrapper[4792]: I0309 09:49:58.738379 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="48ec7142-4c09-4a5f-8202-aaf16bb97b26" containerName="ssh-known-hosts-edpm-deployment" Mar 09 09:49:58 crc kubenswrapper[4792]: I0309 09:49:58.738612 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="48ec7142-4c09-4a5f-8202-aaf16bb97b26" containerName="ssh-known-hosts-edpm-deployment" Mar 09 09:49:58 crc kubenswrapper[4792]: I0309 09:49:58.739326 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-67gl6" Mar 09 09:49:58 crc kubenswrapper[4792]: I0309 09:49:58.745539 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Mar 09 09:49:58 crc kubenswrapper[4792]: I0309 09:49:58.745652 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-4g5l6" Mar 09 09:49:58 crc kubenswrapper[4792]: I0309 09:49:58.746129 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 09 09:49:58 crc kubenswrapper[4792]: I0309 09:49:58.746318 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 09 09:49:58 crc kubenswrapper[4792]: I0309 09:49:58.754658 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-67gl6"] Mar 09 09:49:58 crc kubenswrapper[4792]: I0309 09:49:58.775681 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 09 09:49:58 crc kubenswrapper[4792]: I0309 09:49:58.904003 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/afd25149-8416-4a5c-a84a-b63961a5e1f9-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-67gl6\" (UID: \"afd25149-8416-4a5c-a84a-b63961a5e1f9\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-67gl6" Mar 09 09:49:58 crc kubenswrapper[4792]: I0309 09:49:58.904107 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/afd25149-8416-4a5c-a84a-b63961a5e1f9-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-67gl6\" (UID: \"afd25149-8416-4a5c-a84a-b63961a5e1f9\") " 
pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-67gl6" Mar 09 09:49:58 crc kubenswrapper[4792]: I0309 09:49:58.904277 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qn6m5\" (UniqueName: \"kubernetes.io/projected/afd25149-8416-4a5c-a84a-b63961a5e1f9-kube-api-access-qn6m5\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-67gl6\" (UID: \"afd25149-8416-4a5c-a84a-b63961a5e1f9\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-67gl6" Mar 09 09:49:58 crc kubenswrapper[4792]: I0309 09:49:58.904413 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/afd25149-8416-4a5c-a84a-b63961a5e1f9-ceph\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-67gl6\" (UID: \"afd25149-8416-4a5c-a84a-b63961a5e1f9\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-67gl6" Mar 09 09:49:59 crc kubenswrapper[4792]: I0309 09:49:59.006541 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qn6m5\" (UniqueName: \"kubernetes.io/projected/afd25149-8416-4a5c-a84a-b63961a5e1f9-kube-api-access-qn6m5\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-67gl6\" (UID: \"afd25149-8416-4a5c-a84a-b63961a5e1f9\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-67gl6" Mar 09 09:49:59 crc kubenswrapper[4792]: I0309 09:49:59.006613 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/afd25149-8416-4a5c-a84a-b63961a5e1f9-ceph\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-67gl6\" (UID: \"afd25149-8416-4a5c-a84a-b63961a5e1f9\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-67gl6" Mar 09 09:49:59 crc kubenswrapper[4792]: I0309 09:49:59.006781 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/afd25149-8416-4a5c-a84a-b63961a5e1f9-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-67gl6\" (UID: \"afd25149-8416-4a5c-a84a-b63961a5e1f9\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-67gl6" Mar 09 09:49:59 crc kubenswrapper[4792]: I0309 09:49:59.006835 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/afd25149-8416-4a5c-a84a-b63961a5e1f9-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-67gl6\" (UID: \"afd25149-8416-4a5c-a84a-b63961a5e1f9\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-67gl6" Mar 09 09:49:59 crc kubenswrapper[4792]: I0309 09:49:59.011273 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/afd25149-8416-4a5c-a84a-b63961a5e1f9-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-67gl6\" (UID: \"afd25149-8416-4a5c-a84a-b63961a5e1f9\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-67gl6" Mar 09 09:49:59 crc kubenswrapper[4792]: I0309 09:49:59.011638 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/afd25149-8416-4a5c-a84a-b63961a5e1f9-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-67gl6\" (UID: \"afd25149-8416-4a5c-a84a-b63961a5e1f9\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-67gl6" Mar 09 09:49:59 crc kubenswrapper[4792]: I0309 09:49:59.017365 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/afd25149-8416-4a5c-a84a-b63961a5e1f9-ceph\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-67gl6\" (UID: \"afd25149-8416-4a5c-a84a-b63961a5e1f9\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-67gl6" Mar 09 09:49:59 crc kubenswrapper[4792]: I0309 09:49:59.025594 4792 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qn6m5\" (UniqueName: \"kubernetes.io/projected/afd25149-8416-4a5c-a84a-b63961a5e1f9-kube-api-access-qn6m5\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-67gl6\" (UID: \"afd25149-8416-4a5c-a84a-b63961a5e1f9\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-67gl6" Mar 09 09:49:59 crc kubenswrapper[4792]: I0309 09:49:59.089435 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-67gl6" Mar 09 09:49:59 crc kubenswrapper[4792]: I0309 09:49:59.611583 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-67gl6"] Mar 09 09:49:59 crc kubenswrapper[4792]: I0309 09:49:59.651450 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-67gl6" event={"ID":"afd25149-8416-4a5c-a84a-b63961a5e1f9","Type":"ContainerStarted","Data":"8e734a51983c661ad7ee9d3ce1ccfc03523ab138824279f9b9e0911e61af4b12"} Mar 09 09:50:00 crc kubenswrapper[4792]: I0309 09:50:00.146152 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550830-jwzpt"] Mar 09 09:50:00 crc kubenswrapper[4792]: I0309 09:50:00.150026 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550830-jwzpt" Mar 09 09:50:00 crc kubenswrapper[4792]: I0309 09:50:00.153988 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 09:50:00 crc kubenswrapper[4792]: I0309 09:50:00.154307 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 09:50:00 crc kubenswrapper[4792]: I0309 09:50:00.154396 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fwclj" Mar 09 09:50:00 crc kubenswrapper[4792]: I0309 09:50:00.164337 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550830-jwzpt"] Mar 09 09:50:00 crc kubenswrapper[4792]: I0309 09:50:00.230845 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-svw9g\" (UniqueName: \"kubernetes.io/projected/e64c4eca-1bc2-4122-b966-cf8709fdf457-kube-api-access-svw9g\") pod \"auto-csr-approver-29550830-jwzpt\" (UID: \"e64c4eca-1bc2-4122-b966-cf8709fdf457\") " pod="openshift-infra/auto-csr-approver-29550830-jwzpt" Mar 09 09:50:00 crc kubenswrapper[4792]: I0309 09:50:00.333419 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-svw9g\" (UniqueName: \"kubernetes.io/projected/e64c4eca-1bc2-4122-b966-cf8709fdf457-kube-api-access-svw9g\") pod \"auto-csr-approver-29550830-jwzpt\" (UID: \"e64c4eca-1bc2-4122-b966-cf8709fdf457\") " pod="openshift-infra/auto-csr-approver-29550830-jwzpt" Mar 09 09:50:00 crc kubenswrapper[4792]: I0309 09:50:00.355316 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-svw9g\" (UniqueName: \"kubernetes.io/projected/e64c4eca-1bc2-4122-b966-cf8709fdf457-kube-api-access-svw9g\") pod \"auto-csr-approver-29550830-jwzpt\" (UID: \"e64c4eca-1bc2-4122-b966-cf8709fdf457\") " 
pod="openshift-infra/auto-csr-approver-29550830-jwzpt" Mar 09 09:50:00 crc kubenswrapper[4792]: I0309 09:50:00.592407 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550830-jwzpt" Mar 09 09:50:00 crc kubenswrapper[4792]: I0309 09:50:00.670840 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-67gl6" event={"ID":"afd25149-8416-4a5c-a84a-b63961a5e1f9","Type":"ContainerStarted","Data":"df1c7f37f523ce99b5f538b840c4b4bf3c6dce6e278d9b1b596218afedb23f31"} Mar 09 09:50:00 crc kubenswrapper[4792]: I0309 09:50:00.699742 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-67gl6" podStartSLOduration=2.24862363 podStartE2EDuration="2.699725031s" podCreationTimestamp="2026-03-09 09:49:58 +0000 UTC" firstStartedPulling="2026-03-09 09:49:59.620455871 +0000 UTC m=+2564.650656633" lastFinishedPulling="2026-03-09 09:50:00.071557282 +0000 UTC m=+2565.101758034" observedRunningTime="2026-03-09 09:50:00.696222051 +0000 UTC m=+2565.726422803" watchObservedRunningTime="2026-03-09 09:50:00.699725031 +0000 UTC m=+2565.729925783" Mar 09 09:50:01 crc kubenswrapper[4792]: I0309 09:50:01.065540 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550830-jwzpt"] Mar 09 09:50:01 crc kubenswrapper[4792]: I0309 09:50:01.680969 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550830-jwzpt" event={"ID":"e64c4eca-1bc2-4122-b966-cf8709fdf457","Type":"ContainerStarted","Data":"3bc2243198c9f177c64ad13519e3ba2d02fa8bcdd50710f0cae7f36193281e46"} Mar 09 09:50:02 crc kubenswrapper[4792]: I0309 09:50:02.662620 4792 scope.go:117] "RemoveContainer" containerID="24d9d0dcf9edc8581a732f09072e639940f290776165f66acae7f86c2095d368" Mar 09 09:50:02 crc kubenswrapper[4792]: E0309 09:50:02.663345 4792 pod_workers.go:1301] 
"Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-97tth_openshift-machine-config-operator(bd11045a-d746-4b42-872c-8b8d1dd2d515)\"" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" Mar 09 09:50:02 crc kubenswrapper[4792]: I0309 09:50:02.703021 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550830-jwzpt" event={"ID":"e64c4eca-1bc2-4122-b966-cf8709fdf457","Type":"ContainerStarted","Data":"34b65862816b457534aeb66a742e8c75aa34907a5ac5adf505d2e248bbfa9f87"} Mar 09 09:50:02 crc kubenswrapper[4792]: I0309 09:50:02.730576 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29550830-jwzpt" podStartSLOduration=1.447367265 podStartE2EDuration="2.730539616s" podCreationTimestamp="2026-03-09 09:50:00 +0000 UTC" firstStartedPulling="2026-03-09 09:50:01.059560942 +0000 UTC m=+2566.089761694" lastFinishedPulling="2026-03-09 09:50:02.342733303 +0000 UTC m=+2567.372934045" observedRunningTime="2026-03-09 09:50:02.723754974 +0000 UTC m=+2567.753955726" watchObservedRunningTime="2026-03-09 09:50:02.730539616 +0000 UTC m=+2567.760740368" Mar 09 09:50:03 crc kubenswrapper[4792]: I0309 09:50:03.715193 4792 generic.go:334] "Generic (PLEG): container finished" podID="e64c4eca-1bc2-4122-b966-cf8709fdf457" containerID="34b65862816b457534aeb66a742e8c75aa34907a5ac5adf505d2e248bbfa9f87" exitCode=0 Mar 09 09:50:03 crc kubenswrapper[4792]: I0309 09:50:03.715346 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550830-jwzpt" event={"ID":"e64c4eca-1bc2-4122-b966-cf8709fdf457","Type":"ContainerDied","Data":"34b65862816b457534aeb66a742e8c75aa34907a5ac5adf505d2e248bbfa9f87"} Mar 09 09:50:05 crc kubenswrapper[4792]: I0309 
09:50:05.077027 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550830-jwzpt" Mar 09 09:50:05 crc kubenswrapper[4792]: I0309 09:50:05.227776 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-svw9g\" (UniqueName: \"kubernetes.io/projected/e64c4eca-1bc2-4122-b966-cf8709fdf457-kube-api-access-svw9g\") pod \"e64c4eca-1bc2-4122-b966-cf8709fdf457\" (UID: \"e64c4eca-1bc2-4122-b966-cf8709fdf457\") " Mar 09 09:50:05 crc kubenswrapper[4792]: I0309 09:50:05.234455 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e64c4eca-1bc2-4122-b966-cf8709fdf457-kube-api-access-svw9g" (OuterVolumeSpecName: "kube-api-access-svw9g") pod "e64c4eca-1bc2-4122-b966-cf8709fdf457" (UID: "e64c4eca-1bc2-4122-b966-cf8709fdf457"). InnerVolumeSpecName "kube-api-access-svw9g". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:50:05 crc kubenswrapper[4792]: I0309 09:50:05.330519 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-svw9g\" (UniqueName: \"kubernetes.io/projected/e64c4eca-1bc2-4122-b966-cf8709fdf457-kube-api-access-svw9g\") on node \"crc\" DevicePath \"\"" Mar 09 09:50:05 crc kubenswrapper[4792]: I0309 09:50:05.735398 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550830-jwzpt" event={"ID":"e64c4eca-1bc2-4122-b966-cf8709fdf457","Type":"ContainerDied","Data":"3bc2243198c9f177c64ad13519e3ba2d02fa8bcdd50710f0cae7f36193281e46"} Mar 09 09:50:05 crc kubenswrapper[4792]: I0309 09:50:05.735712 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3bc2243198c9f177c64ad13519e3ba2d02fa8bcdd50710f0cae7f36193281e46" Mar 09 09:50:05 crc kubenswrapper[4792]: I0309 09:50:05.735459 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550830-jwzpt" Mar 09 09:50:05 crc kubenswrapper[4792]: I0309 09:50:05.804884 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550824-wdgmk"] Mar 09 09:50:05 crc kubenswrapper[4792]: I0309 09:50:05.812204 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550824-wdgmk"] Mar 09 09:50:07 crc kubenswrapper[4792]: I0309 09:50:07.677576 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e0cb70dd-e163-45b2-958a-6976abab327b" path="/var/lib/kubelet/pods/e0cb70dd-e163-45b2-958a-6976abab327b/volumes" Mar 09 09:50:07 crc kubenswrapper[4792]: I0309 09:50:07.758020 4792 generic.go:334] "Generic (PLEG): container finished" podID="afd25149-8416-4a5c-a84a-b63961a5e1f9" containerID="df1c7f37f523ce99b5f538b840c4b4bf3c6dce6e278d9b1b596218afedb23f31" exitCode=0 Mar 09 09:50:07 crc kubenswrapper[4792]: I0309 09:50:07.758090 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-67gl6" event={"ID":"afd25149-8416-4a5c-a84a-b63961a5e1f9","Type":"ContainerDied","Data":"df1c7f37f523ce99b5f538b840c4b4bf3c6dce6e278d9b1b596218afedb23f31"} Mar 09 09:50:09 crc kubenswrapper[4792]: I0309 09:50:09.188921 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-67gl6" Mar 09 09:50:09 crc kubenswrapper[4792]: I0309 09:50:09.296887 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qn6m5\" (UniqueName: \"kubernetes.io/projected/afd25149-8416-4a5c-a84a-b63961a5e1f9-kube-api-access-qn6m5\") pod \"afd25149-8416-4a5c-a84a-b63961a5e1f9\" (UID: \"afd25149-8416-4a5c-a84a-b63961a5e1f9\") " Mar 09 09:50:09 crc kubenswrapper[4792]: I0309 09:50:09.297010 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/afd25149-8416-4a5c-a84a-b63961a5e1f9-inventory\") pod \"afd25149-8416-4a5c-a84a-b63961a5e1f9\" (UID: \"afd25149-8416-4a5c-a84a-b63961a5e1f9\") " Mar 09 09:50:09 crc kubenswrapper[4792]: I0309 09:50:09.297038 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/afd25149-8416-4a5c-a84a-b63961a5e1f9-ssh-key-openstack-edpm-ipam\") pod \"afd25149-8416-4a5c-a84a-b63961a5e1f9\" (UID: \"afd25149-8416-4a5c-a84a-b63961a5e1f9\") " Mar 09 09:50:09 crc kubenswrapper[4792]: I0309 09:50:09.297115 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/afd25149-8416-4a5c-a84a-b63961a5e1f9-ceph\") pod \"afd25149-8416-4a5c-a84a-b63961a5e1f9\" (UID: \"afd25149-8416-4a5c-a84a-b63961a5e1f9\") " Mar 09 09:50:09 crc kubenswrapper[4792]: I0309 09:50:09.304190 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/afd25149-8416-4a5c-a84a-b63961a5e1f9-ceph" (OuterVolumeSpecName: "ceph") pod "afd25149-8416-4a5c-a84a-b63961a5e1f9" (UID: "afd25149-8416-4a5c-a84a-b63961a5e1f9"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:50:09 crc kubenswrapper[4792]: I0309 09:50:09.304784 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/afd25149-8416-4a5c-a84a-b63961a5e1f9-kube-api-access-qn6m5" (OuterVolumeSpecName: "kube-api-access-qn6m5") pod "afd25149-8416-4a5c-a84a-b63961a5e1f9" (UID: "afd25149-8416-4a5c-a84a-b63961a5e1f9"). InnerVolumeSpecName "kube-api-access-qn6m5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:50:09 crc kubenswrapper[4792]: I0309 09:50:09.326232 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/afd25149-8416-4a5c-a84a-b63961a5e1f9-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "afd25149-8416-4a5c-a84a-b63961a5e1f9" (UID: "afd25149-8416-4a5c-a84a-b63961a5e1f9"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:50:09 crc kubenswrapper[4792]: I0309 09:50:09.330715 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/afd25149-8416-4a5c-a84a-b63961a5e1f9-inventory" (OuterVolumeSpecName: "inventory") pod "afd25149-8416-4a5c-a84a-b63961a5e1f9" (UID: "afd25149-8416-4a5c-a84a-b63961a5e1f9"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:50:09 crc kubenswrapper[4792]: I0309 09:50:09.399248 4792 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/afd25149-8416-4a5c-a84a-b63961a5e1f9-inventory\") on node \"crc\" DevicePath \"\"" Mar 09 09:50:09 crc kubenswrapper[4792]: I0309 09:50:09.399280 4792 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/afd25149-8416-4a5c-a84a-b63961a5e1f9-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 09 09:50:09 crc kubenswrapper[4792]: I0309 09:50:09.399290 4792 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/afd25149-8416-4a5c-a84a-b63961a5e1f9-ceph\") on node \"crc\" DevicePath \"\"" Mar 09 09:50:09 crc kubenswrapper[4792]: I0309 09:50:09.399299 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qn6m5\" (UniqueName: \"kubernetes.io/projected/afd25149-8416-4a5c-a84a-b63961a5e1f9-kube-api-access-qn6m5\") on node \"crc\" DevicePath \"\"" Mar 09 09:50:09 crc kubenswrapper[4792]: I0309 09:50:09.779765 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-67gl6" event={"ID":"afd25149-8416-4a5c-a84a-b63961a5e1f9","Type":"ContainerDied","Data":"8e734a51983c661ad7ee9d3ce1ccfc03523ab138824279f9b9e0911e61af4b12"} Mar 09 09:50:09 crc kubenswrapper[4792]: I0309 09:50:09.779806 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-67gl6" Mar 09 09:50:09 crc kubenswrapper[4792]: I0309 09:50:09.779808 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8e734a51983c661ad7ee9d3ce1ccfc03523ab138824279f9b9e0911e61af4b12" Mar 09 09:50:09 crc kubenswrapper[4792]: I0309 09:50:09.846060 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9b8w5"] Mar 09 09:50:09 crc kubenswrapper[4792]: E0309 09:50:09.846548 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e64c4eca-1bc2-4122-b966-cf8709fdf457" containerName="oc" Mar 09 09:50:09 crc kubenswrapper[4792]: I0309 09:50:09.846572 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="e64c4eca-1bc2-4122-b966-cf8709fdf457" containerName="oc" Mar 09 09:50:09 crc kubenswrapper[4792]: E0309 09:50:09.846615 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="afd25149-8416-4a5c-a84a-b63961a5e1f9" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Mar 09 09:50:09 crc kubenswrapper[4792]: I0309 09:50:09.846625 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="afd25149-8416-4a5c-a84a-b63961a5e1f9" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Mar 09 09:50:09 crc kubenswrapper[4792]: I0309 09:50:09.846842 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="afd25149-8416-4a5c-a84a-b63961a5e1f9" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Mar 09 09:50:09 crc kubenswrapper[4792]: I0309 09:50:09.846863 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="e64c4eca-1bc2-4122-b966-cf8709fdf457" containerName="oc" Mar 09 09:50:09 crc kubenswrapper[4792]: I0309 09:50:09.848338 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9b8w5" Mar 09 09:50:09 crc kubenswrapper[4792]: I0309 09:50:09.851185 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-4g5l6" Mar 09 09:50:09 crc kubenswrapper[4792]: I0309 09:50:09.851185 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 09 09:50:09 crc kubenswrapper[4792]: I0309 09:50:09.851194 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Mar 09 09:50:09 crc kubenswrapper[4792]: I0309 09:50:09.851680 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 09 09:50:09 crc kubenswrapper[4792]: I0309 09:50:09.852188 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 09 09:50:09 crc kubenswrapper[4792]: I0309 09:50:09.856904 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9b8w5"] Mar 09 09:50:09 crc kubenswrapper[4792]: I0309 09:50:09.910470 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f84f1271-7155-48fc-a6f0-d1777cb75ac5-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-9b8w5\" (UID: \"f84f1271-7155-48fc-a6f0-d1777cb75ac5\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9b8w5" Mar 09 09:50:09 crc kubenswrapper[4792]: I0309 09:50:09.910526 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f84f1271-7155-48fc-a6f0-d1777cb75ac5-ceph\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-9b8w5\" (UID: \"f84f1271-7155-48fc-a6f0-d1777cb75ac5\") " 
pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9b8w5" Mar 09 09:50:09 crc kubenswrapper[4792]: I0309 09:50:09.910563 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f84f1271-7155-48fc-a6f0-d1777cb75ac5-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-9b8w5\" (UID: \"f84f1271-7155-48fc-a6f0-d1777cb75ac5\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9b8w5" Mar 09 09:50:09 crc kubenswrapper[4792]: I0309 09:50:09.910640 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z67fc\" (UniqueName: \"kubernetes.io/projected/f84f1271-7155-48fc-a6f0-d1777cb75ac5-kube-api-access-z67fc\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-9b8w5\" (UID: \"f84f1271-7155-48fc-a6f0-d1777cb75ac5\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9b8w5" Mar 09 09:50:10 crc kubenswrapper[4792]: I0309 09:50:10.011707 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f84f1271-7155-48fc-a6f0-d1777cb75ac5-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-9b8w5\" (UID: \"f84f1271-7155-48fc-a6f0-d1777cb75ac5\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9b8w5" Mar 09 09:50:10 crc kubenswrapper[4792]: I0309 09:50:10.011824 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z67fc\" (UniqueName: \"kubernetes.io/projected/f84f1271-7155-48fc-a6f0-d1777cb75ac5-kube-api-access-z67fc\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-9b8w5\" (UID: \"f84f1271-7155-48fc-a6f0-d1777cb75ac5\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9b8w5" Mar 09 09:50:10 crc kubenswrapper[4792]: I0309 09:50:10.011937 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f84f1271-7155-48fc-a6f0-d1777cb75ac5-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-9b8w5\" (UID: \"f84f1271-7155-48fc-a6f0-d1777cb75ac5\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9b8w5" Mar 09 09:50:10 crc kubenswrapper[4792]: I0309 09:50:10.011986 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f84f1271-7155-48fc-a6f0-d1777cb75ac5-ceph\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-9b8w5\" (UID: \"f84f1271-7155-48fc-a6f0-d1777cb75ac5\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9b8w5" Mar 09 09:50:10 crc kubenswrapper[4792]: I0309 09:50:10.016707 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f84f1271-7155-48fc-a6f0-d1777cb75ac5-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-9b8w5\" (UID: \"f84f1271-7155-48fc-a6f0-d1777cb75ac5\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9b8w5" Mar 09 09:50:10 crc kubenswrapper[4792]: I0309 09:50:10.019638 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f84f1271-7155-48fc-a6f0-d1777cb75ac5-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-9b8w5\" (UID: \"f84f1271-7155-48fc-a6f0-d1777cb75ac5\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9b8w5" Mar 09 09:50:10 crc kubenswrapper[4792]: I0309 09:50:10.021723 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f84f1271-7155-48fc-a6f0-d1777cb75ac5-ceph\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-9b8w5\" (UID: \"f84f1271-7155-48fc-a6f0-d1777cb75ac5\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9b8w5" Mar 
09 09:50:10 crc kubenswrapper[4792]: I0309 09:50:10.032713 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z67fc\" (UniqueName: \"kubernetes.io/projected/f84f1271-7155-48fc-a6f0-d1777cb75ac5-kube-api-access-z67fc\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-9b8w5\" (UID: \"f84f1271-7155-48fc-a6f0-d1777cb75ac5\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9b8w5" Mar 09 09:50:10 crc kubenswrapper[4792]: I0309 09:50:10.164432 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9b8w5" Mar 09 09:50:10 crc kubenswrapper[4792]: I0309 09:50:10.753315 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9b8w5"] Mar 09 09:50:10 crc kubenswrapper[4792]: W0309 09:50:10.758372 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf84f1271_7155_48fc_a6f0_d1777cb75ac5.slice/crio-5c2ef62ac5cdc26531c4e260d90fcd8b8154ae738aa0f72e2cff216ebf07ca93 WatchSource:0}: Error finding container 5c2ef62ac5cdc26531c4e260d90fcd8b8154ae738aa0f72e2cff216ebf07ca93: Status 404 returned error can't find the container with id 5c2ef62ac5cdc26531c4e260d90fcd8b8154ae738aa0f72e2cff216ebf07ca93 Mar 09 09:50:10 crc kubenswrapper[4792]: I0309 09:50:10.790422 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9b8w5" event={"ID":"f84f1271-7155-48fc-a6f0-d1777cb75ac5","Type":"ContainerStarted","Data":"5c2ef62ac5cdc26531c4e260d90fcd8b8154ae738aa0f72e2cff216ebf07ca93"} Mar 09 09:50:11 crc kubenswrapper[4792]: I0309 09:50:11.807959 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9b8w5" 
event={"ID":"f84f1271-7155-48fc-a6f0-d1777cb75ac5","Type":"ContainerStarted","Data":"1f9ae3fe5083c02880ead317c6a491b71f08dc5a4f119619164127fd4093d98a"} Mar 09 09:50:11 crc kubenswrapper[4792]: I0309 09:50:11.828276 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9b8w5" podStartSLOduration=2.34377665 podStartE2EDuration="2.828260335s" podCreationTimestamp="2026-03-09 09:50:09 +0000 UTC" firstStartedPulling="2026-03-09 09:50:10.76027869 +0000 UTC m=+2575.790479442" lastFinishedPulling="2026-03-09 09:50:11.244762375 +0000 UTC m=+2576.274963127" observedRunningTime="2026-03-09 09:50:11.82583633 +0000 UTC m=+2576.856037082" watchObservedRunningTime="2026-03-09 09:50:11.828260335 +0000 UTC m=+2576.858461087" Mar 09 09:50:13 crc kubenswrapper[4792]: I0309 09:50:13.664485 4792 scope.go:117] "RemoveContainer" containerID="24d9d0dcf9edc8581a732f09072e639940f290776165f66acae7f86c2095d368" Mar 09 09:50:14 crc kubenswrapper[4792]: I0309 09:50:14.832644 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-97tth" event={"ID":"bd11045a-d746-4b42-872c-8b8d1dd2d515","Type":"ContainerStarted","Data":"862129b30016327d77e586feefb2263203c7afdd585cf1571fd31502117efa31"} Mar 09 09:50:20 crc kubenswrapper[4792]: I0309 09:50:20.879153 4792 generic.go:334] "Generic (PLEG): container finished" podID="f84f1271-7155-48fc-a6f0-d1777cb75ac5" containerID="1f9ae3fe5083c02880ead317c6a491b71f08dc5a4f119619164127fd4093d98a" exitCode=0 Mar 09 09:50:20 crc kubenswrapper[4792]: I0309 09:50:20.879226 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9b8w5" event={"ID":"f84f1271-7155-48fc-a6f0-d1777cb75ac5","Type":"ContainerDied","Data":"1f9ae3fe5083c02880ead317c6a491b71f08dc5a4f119619164127fd4093d98a"} Mar 09 09:50:22 crc kubenswrapper[4792]: I0309 09:50:22.294293 4792 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9b8w5" Mar 09 09:50:22 crc kubenswrapper[4792]: I0309 09:50:22.357468 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f84f1271-7155-48fc-a6f0-d1777cb75ac5-ssh-key-openstack-edpm-ipam\") pod \"f84f1271-7155-48fc-a6f0-d1777cb75ac5\" (UID: \"f84f1271-7155-48fc-a6f0-d1777cb75ac5\") " Mar 09 09:50:22 crc kubenswrapper[4792]: I0309 09:50:22.357615 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f84f1271-7155-48fc-a6f0-d1777cb75ac5-ceph\") pod \"f84f1271-7155-48fc-a6f0-d1777cb75ac5\" (UID: \"f84f1271-7155-48fc-a6f0-d1777cb75ac5\") " Mar 09 09:50:22 crc kubenswrapper[4792]: I0309 09:50:22.357768 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z67fc\" (UniqueName: \"kubernetes.io/projected/f84f1271-7155-48fc-a6f0-d1777cb75ac5-kube-api-access-z67fc\") pod \"f84f1271-7155-48fc-a6f0-d1777cb75ac5\" (UID: \"f84f1271-7155-48fc-a6f0-d1777cb75ac5\") " Mar 09 09:50:22 crc kubenswrapper[4792]: I0309 09:50:22.358378 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f84f1271-7155-48fc-a6f0-d1777cb75ac5-inventory\") pod \"f84f1271-7155-48fc-a6f0-d1777cb75ac5\" (UID: \"f84f1271-7155-48fc-a6f0-d1777cb75ac5\") " Mar 09 09:50:22 crc kubenswrapper[4792]: I0309 09:50:22.364453 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f84f1271-7155-48fc-a6f0-d1777cb75ac5-kube-api-access-z67fc" (OuterVolumeSpecName: "kube-api-access-z67fc") pod "f84f1271-7155-48fc-a6f0-d1777cb75ac5" (UID: "f84f1271-7155-48fc-a6f0-d1777cb75ac5"). InnerVolumeSpecName "kube-api-access-z67fc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:50:22 crc kubenswrapper[4792]: I0309 09:50:22.364453 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f84f1271-7155-48fc-a6f0-d1777cb75ac5-ceph" (OuterVolumeSpecName: "ceph") pod "f84f1271-7155-48fc-a6f0-d1777cb75ac5" (UID: "f84f1271-7155-48fc-a6f0-d1777cb75ac5"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:50:22 crc kubenswrapper[4792]: I0309 09:50:22.383038 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f84f1271-7155-48fc-a6f0-d1777cb75ac5-inventory" (OuterVolumeSpecName: "inventory") pod "f84f1271-7155-48fc-a6f0-d1777cb75ac5" (UID: "f84f1271-7155-48fc-a6f0-d1777cb75ac5"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:50:22 crc kubenswrapper[4792]: I0309 09:50:22.388790 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f84f1271-7155-48fc-a6f0-d1777cb75ac5-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "f84f1271-7155-48fc-a6f0-d1777cb75ac5" (UID: "f84f1271-7155-48fc-a6f0-d1777cb75ac5"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:50:22 crc kubenswrapper[4792]: I0309 09:50:22.461166 4792 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f84f1271-7155-48fc-a6f0-d1777cb75ac5-inventory\") on node \"crc\" DevicePath \"\"" Mar 09 09:50:22 crc kubenswrapper[4792]: I0309 09:50:22.461197 4792 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f84f1271-7155-48fc-a6f0-d1777cb75ac5-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 09 09:50:22 crc kubenswrapper[4792]: I0309 09:50:22.461208 4792 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f84f1271-7155-48fc-a6f0-d1777cb75ac5-ceph\") on node \"crc\" DevicePath \"\"" Mar 09 09:50:22 crc kubenswrapper[4792]: I0309 09:50:22.461217 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z67fc\" (UniqueName: \"kubernetes.io/projected/f84f1271-7155-48fc-a6f0-d1777cb75ac5-kube-api-access-z67fc\") on node \"crc\" DevicePath \"\"" Mar 09 09:50:22 crc kubenswrapper[4792]: I0309 09:50:22.898149 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9b8w5" event={"ID":"f84f1271-7155-48fc-a6f0-d1777cb75ac5","Type":"ContainerDied","Data":"5c2ef62ac5cdc26531c4e260d90fcd8b8154ae738aa0f72e2cff216ebf07ca93"} Mar 09 09:50:22 crc kubenswrapper[4792]: I0309 09:50:22.898192 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5c2ef62ac5cdc26531c4e260d90fcd8b8154ae738aa0f72e2cff216ebf07ca93" Mar 09 09:50:22 crc kubenswrapper[4792]: I0309 09:50:22.898307 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9b8w5" Mar 09 09:50:23 crc kubenswrapper[4792]: I0309 09:50:23.046446 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7qdcq"] Mar 09 09:50:23 crc kubenswrapper[4792]: E0309 09:50:23.047681 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f84f1271-7155-48fc-a6f0-d1777cb75ac5" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Mar 09 09:50:23 crc kubenswrapper[4792]: I0309 09:50:23.047709 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="f84f1271-7155-48fc-a6f0-d1777cb75ac5" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Mar 09 09:50:23 crc kubenswrapper[4792]: I0309 09:50:23.048044 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="f84f1271-7155-48fc-a6f0-d1777cb75ac5" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Mar 09 09:50:23 crc kubenswrapper[4792]: I0309 09:50:23.049169 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7qdcq" Mar 09 09:50:23 crc kubenswrapper[4792]: I0309 09:50:23.052789 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Mar 09 09:50:23 crc kubenswrapper[4792]: I0309 09:50:23.053025 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 09 09:50:23 crc kubenswrapper[4792]: I0309 09:50:23.053059 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 09 09:50:23 crc kubenswrapper[4792]: I0309 09:50:23.053187 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0" Mar 09 09:50:23 crc kubenswrapper[4792]: I0309 09:50:23.053400 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 09 09:50:23 crc kubenswrapper[4792]: I0309 09:50:23.055719 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-4g5l6" Mar 09 09:50:23 crc kubenswrapper[4792]: I0309 09:50:23.055914 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0" Mar 09 09:50:23 crc kubenswrapper[4792]: I0309 09:50:23.057455 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0" Mar 09 09:50:23 crc kubenswrapper[4792]: I0309 09:50:23.082799 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7qdcq"] Mar 09 09:50:23 crc kubenswrapper[4792]: I0309 09:50:23.098919 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/9ff0f9ff-023a-4679-a084-1d4ae368e02d-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-7qdcq\" (UID: \"9ff0f9ff-023a-4679-a084-1d4ae368e02d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7qdcq" Mar 09 09:50:23 crc kubenswrapper[4792]: I0309 09:50:23.099012 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ff0f9ff-023a-4679-a084-1d4ae368e02d-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-7qdcq\" (UID: \"9ff0f9ff-023a-4679-a084-1d4ae368e02d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7qdcq" Mar 09 09:50:23 crc kubenswrapper[4792]: I0309 09:50:23.099112 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9ff0f9ff-023a-4679-a084-1d4ae368e02d-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-7qdcq\" (UID: \"9ff0f9ff-023a-4679-a084-1d4ae368e02d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7qdcq" Mar 09 09:50:23 crc kubenswrapper[4792]: I0309 09:50:23.099149 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ff0f9ff-023a-4679-a084-1d4ae368e02d-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-7qdcq\" (UID: \"9ff0f9ff-023a-4679-a084-1d4ae368e02d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7qdcq" Mar 09 09:50:23 crc kubenswrapper[4792]: I0309 09:50:23.099251 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/9ff0f9ff-023a-4679-a084-1d4ae368e02d-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-7qdcq\" (UID: \"9ff0f9ff-023a-4679-a084-1d4ae368e02d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7qdcq" Mar 09 09:50:23 crc kubenswrapper[4792]: I0309 09:50:23.099368 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/9ff0f9ff-023a-4679-a084-1d4ae368e02d-ceph\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-7qdcq\" (UID: \"9ff0f9ff-023a-4679-a084-1d4ae368e02d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7qdcq" Mar 09 09:50:23 crc kubenswrapper[4792]: I0309 09:50:23.099398 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ff0f9ff-023a-4679-a084-1d4ae368e02d-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-7qdcq\" (UID: \"9ff0f9ff-023a-4679-a084-1d4ae368e02d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7qdcq" Mar 09 09:50:23 crc kubenswrapper[4792]: I0309 09:50:23.099439 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qm2sg\" (UniqueName: \"kubernetes.io/projected/9ff0f9ff-023a-4679-a084-1d4ae368e02d-kube-api-access-qm2sg\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-7qdcq\" (UID: \"9ff0f9ff-023a-4679-a084-1d4ae368e02d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7qdcq" Mar 09 09:50:23 crc kubenswrapper[4792]: I0309 09:50:23.099474 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ff0f9ff-023a-4679-a084-1d4ae368e02d-bootstrap-combined-ca-bundle\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-7qdcq\" (UID: \"9ff0f9ff-023a-4679-a084-1d4ae368e02d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7qdcq" Mar 09 09:50:23 crc kubenswrapper[4792]: I0309 09:50:23.099509 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ff0f9ff-023a-4679-a084-1d4ae368e02d-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-7qdcq\" (UID: \"9ff0f9ff-023a-4679-a084-1d4ae368e02d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7qdcq" Mar 09 09:50:23 crc kubenswrapper[4792]: I0309 09:50:23.099586 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9ff0f9ff-023a-4679-a084-1d4ae368e02d-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-7qdcq\" (UID: \"9ff0f9ff-023a-4679-a084-1d4ae368e02d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7qdcq" Mar 09 09:50:23 crc kubenswrapper[4792]: I0309 09:50:23.099644 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ff0f9ff-023a-4679-a084-1d4ae368e02d-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-7qdcq\" (UID: \"9ff0f9ff-023a-4679-a084-1d4ae368e02d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7qdcq" Mar 09 09:50:23 crc kubenswrapper[4792]: I0309 09:50:23.099679 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9ff0f9ff-023a-4679-a084-1d4ae368e02d-inventory\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-7qdcq\" (UID: \"9ff0f9ff-023a-4679-a084-1d4ae368e02d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7qdcq" Mar 09 09:50:23 crc kubenswrapper[4792]: I0309 09:50:23.203415 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/9ff0f9ff-023a-4679-a084-1d4ae368e02d-ceph\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-7qdcq\" (UID: \"9ff0f9ff-023a-4679-a084-1d4ae368e02d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7qdcq" Mar 09 09:50:23 crc kubenswrapper[4792]: I0309 09:50:23.203770 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ff0f9ff-023a-4679-a084-1d4ae368e02d-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-7qdcq\" (UID: \"9ff0f9ff-023a-4679-a084-1d4ae368e02d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7qdcq" Mar 09 09:50:23 crc kubenswrapper[4792]: I0309 09:50:23.203829 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qm2sg\" (UniqueName: \"kubernetes.io/projected/9ff0f9ff-023a-4679-a084-1d4ae368e02d-kube-api-access-qm2sg\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-7qdcq\" (UID: \"9ff0f9ff-023a-4679-a084-1d4ae368e02d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7qdcq" Mar 09 09:50:23 crc kubenswrapper[4792]: I0309 09:50:23.203867 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ff0f9ff-023a-4679-a084-1d4ae368e02d-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-7qdcq\" (UID: \"9ff0f9ff-023a-4679-a084-1d4ae368e02d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7qdcq" Mar 09 
09:50:23 crc kubenswrapper[4792]: I0309 09:50:23.203909 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ff0f9ff-023a-4679-a084-1d4ae368e02d-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-7qdcq\" (UID: \"9ff0f9ff-023a-4679-a084-1d4ae368e02d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7qdcq" Mar 09 09:50:23 crc kubenswrapper[4792]: I0309 09:50:23.203997 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9ff0f9ff-023a-4679-a084-1d4ae368e02d-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-7qdcq\" (UID: \"9ff0f9ff-023a-4679-a084-1d4ae368e02d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7qdcq" Mar 09 09:50:23 crc kubenswrapper[4792]: I0309 09:50:23.204070 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ff0f9ff-023a-4679-a084-1d4ae368e02d-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-7qdcq\" (UID: \"9ff0f9ff-023a-4679-a084-1d4ae368e02d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7qdcq" Mar 09 09:50:23 crc kubenswrapper[4792]: I0309 09:50:23.204134 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9ff0f9ff-023a-4679-a084-1d4ae368e02d-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-7qdcq\" (UID: \"9ff0f9ff-023a-4679-a084-1d4ae368e02d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7qdcq" Mar 09 09:50:23 crc kubenswrapper[4792]: I0309 09:50:23.204259 4792 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9ff0f9ff-023a-4679-a084-1d4ae368e02d-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-7qdcq\" (UID: \"9ff0f9ff-023a-4679-a084-1d4ae368e02d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7qdcq" Mar 09 09:50:23 crc kubenswrapper[4792]: I0309 09:50:23.204307 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ff0f9ff-023a-4679-a084-1d4ae368e02d-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-7qdcq\" (UID: \"9ff0f9ff-023a-4679-a084-1d4ae368e02d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7qdcq" Mar 09 09:50:23 crc kubenswrapper[4792]: I0309 09:50:23.204394 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9ff0f9ff-023a-4679-a084-1d4ae368e02d-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-7qdcq\" (UID: \"9ff0f9ff-023a-4679-a084-1d4ae368e02d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7qdcq" Mar 09 09:50:23 crc kubenswrapper[4792]: I0309 09:50:23.204436 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ff0f9ff-023a-4679-a084-1d4ae368e02d-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-7qdcq\" (UID: \"9ff0f9ff-023a-4679-a084-1d4ae368e02d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7qdcq" Mar 09 09:50:23 crc kubenswrapper[4792]: I0309 09:50:23.204478 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/9ff0f9ff-023a-4679-a084-1d4ae368e02d-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-7qdcq\" (UID: \"9ff0f9ff-023a-4679-a084-1d4ae368e02d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7qdcq" Mar 09 09:50:23 crc kubenswrapper[4792]: I0309 09:50:23.212591 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ff0f9ff-023a-4679-a084-1d4ae368e02d-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-7qdcq\" (UID: \"9ff0f9ff-023a-4679-a084-1d4ae368e02d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7qdcq" Mar 09 09:50:23 crc kubenswrapper[4792]: I0309 09:50:23.212624 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9ff0f9ff-023a-4679-a084-1d4ae368e02d-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-7qdcq\" (UID: \"9ff0f9ff-023a-4679-a084-1d4ae368e02d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7qdcq" Mar 09 09:50:23 crc kubenswrapper[4792]: I0309 09:50:23.212734 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/9ff0f9ff-023a-4679-a084-1d4ae368e02d-ceph\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-7qdcq\" (UID: \"9ff0f9ff-023a-4679-a084-1d4ae368e02d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7qdcq" Mar 09 09:50:23 crc kubenswrapper[4792]: I0309 09:50:23.212939 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ff0f9ff-023a-4679-a084-1d4ae368e02d-repo-setup-combined-ca-bundle\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-7qdcq\" (UID: \"9ff0f9ff-023a-4679-a084-1d4ae368e02d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7qdcq" Mar 09 09:50:23 crc kubenswrapper[4792]: I0309 09:50:23.215453 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9ff0f9ff-023a-4679-a084-1d4ae368e02d-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-7qdcq\" (UID: \"9ff0f9ff-023a-4679-a084-1d4ae368e02d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7qdcq" Mar 09 09:50:23 crc kubenswrapper[4792]: I0309 09:50:23.215774 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9ff0f9ff-023a-4679-a084-1d4ae368e02d-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-7qdcq\" (UID: \"9ff0f9ff-023a-4679-a084-1d4ae368e02d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7qdcq" Mar 09 09:50:23 crc kubenswrapper[4792]: I0309 09:50:23.216233 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9ff0f9ff-023a-4679-a084-1d4ae368e02d-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-7qdcq\" (UID: \"9ff0f9ff-023a-4679-a084-1d4ae368e02d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7qdcq" Mar 09 09:50:23 crc kubenswrapper[4792]: I0309 09:50:23.216677 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ff0f9ff-023a-4679-a084-1d4ae368e02d-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-7qdcq\" (UID: \"9ff0f9ff-023a-4679-a084-1d4ae368e02d\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7qdcq" Mar 09 09:50:23 crc kubenswrapper[4792]: I0309 09:50:23.217015 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ff0f9ff-023a-4679-a084-1d4ae368e02d-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-7qdcq\" (UID: \"9ff0f9ff-023a-4679-a084-1d4ae368e02d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7qdcq" Mar 09 09:50:23 crc kubenswrapper[4792]: I0309 09:50:23.221634 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9ff0f9ff-023a-4679-a084-1d4ae368e02d-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-7qdcq\" (UID: \"9ff0f9ff-023a-4679-a084-1d4ae368e02d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7qdcq" Mar 09 09:50:23 crc kubenswrapper[4792]: I0309 09:50:23.221876 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ff0f9ff-023a-4679-a084-1d4ae368e02d-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-7qdcq\" (UID: \"9ff0f9ff-023a-4679-a084-1d4ae368e02d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7qdcq" Mar 09 09:50:23 crc kubenswrapper[4792]: I0309 09:50:23.225409 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qm2sg\" (UniqueName: \"kubernetes.io/projected/9ff0f9ff-023a-4679-a084-1d4ae368e02d-kube-api-access-qm2sg\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-7qdcq\" (UID: \"9ff0f9ff-023a-4679-a084-1d4ae368e02d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7qdcq" Mar 09 09:50:23 crc kubenswrapper[4792]: I0309 09:50:23.225560 4792 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ff0f9ff-023a-4679-a084-1d4ae368e02d-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-7qdcq\" (UID: \"9ff0f9ff-023a-4679-a084-1d4ae368e02d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7qdcq" Mar 09 09:50:23 crc kubenswrapper[4792]: I0309 09:50:23.384604 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7qdcq" Mar 09 09:50:23 crc kubenswrapper[4792]: I0309 09:50:23.891184 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7qdcq"] Mar 09 09:50:23 crc kubenswrapper[4792]: I0309 09:50:23.908599 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7qdcq" event={"ID":"9ff0f9ff-023a-4679-a084-1d4ae368e02d","Type":"ContainerStarted","Data":"4062771c21233b47ddfa29982155724f0d61751eb588232ad4a657eb0a2fe38f"} Mar 09 09:50:24 crc kubenswrapper[4792]: I0309 09:50:24.919352 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7qdcq" event={"ID":"9ff0f9ff-023a-4679-a084-1d4ae368e02d","Type":"ContainerStarted","Data":"9143663ee641988b11eaec0b150a8a8220d28e8d57dbf35f32cbe783136a254b"} Mar 09 09:50:24 crc kubenswrapper[4792]: I0309 09:50:24.936269 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7qdcq" podStartSLOduration=2.563912309 podStartE2EDuration="2.936251079s" podCreationTimestamp="2026-03-09 09:50:22 +0000 UTC" firstStartedPulling="2026-03-09 09:50:23.899275727 +0000 UTC m=+2588.929476479" lastFinishedPulling="2026-03-09 09:50:24.271614496 +0000 UTC m=+2589.301815249" observedRunningTime="2026-03-09 
09:50:24.932768984 +0000 UTC m=+2589.962969746" watchObservedRunningTime="2026-03-09 09:50:24.936251079 +0000 UTC m=+2589.966451841" Mar 09 09:50:39 crc kubenswrapper[4792]: I0309 09:50:39.884804 4792 scope.go:117] "RemoveContainer" containerID="1076a39049e092b567394b27ac970575a352fa705c4b4b1f5ff883dc912e1025" Mar 09 09:50:52 crc kubenswrapper[4792]: I0309 09:50:52.190265 4792 generic.go:334] "Generic (PLEG): container finished" podID="9ff0f9ff-023a-4679-a084-1d4ae368e02d" containerID="9143663ee641988b11eaec0b150a8a8220d28e8d57dbf35f32cbe783136a254b" exitCode=0 Mar 09 09:50:52 crc kubenswrapper[4792]: I0309 09:50:52.190363 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7qdcq" event={"ID":"9ff0f9ff-023a-4679-a084-1d4ae368e02d","Type":"ContainerDied","Data":"9143663ee641988b11eaec0b150a8a8220d28e8d57dbf35f32cbe783136a254b"} Mar 09 09:50:53 crc kubenswrapper[4792]: I0309 09:50:53.612391 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7qdcq" Mar 09 09:50:53 crc kubenswrapper[4792]: I0309 09:50:53.694953 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ff0f9ff-023a-4679-a084-1d4ae368e02d-nova-combined-ca-bundle\") pod \"9ff0f9ff-023a-4679-a084-1d4ae368e02d\" (UID: \"9ff0f9ff-023a-4679-a084-1d4ae368e02d\") " Mar 09 09:50:53 crc kubenswrapper[4792]: I0309 09:50:53.695596 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9ff0f9ff-023a-4679-a084-1d4ae368e02d-openstack-edpm-ipam-ovn-default-certs-0\") pod \"9ff0f9ff-023a-4679-a084-1d4ae368e02d\" (UID: \"9ff0f9ff-023a-4679-a084-1d4ae368e02d\") " Mar 09 09:50:53 crc kubenswrapper[4792]: I0309 09:50:53.695731 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ff0f9ff-023a-4679-a084-1d4ae368e02d-libvirt-combined-ca-bundle\") pod \"9ff0f9ff-023a-4679-a084-1d4ae368e02d\" (UID: \"9ff0f9ff-023a-4679-a084-1d4ae368e02d\") " Mar 09 09:50:53 crc kubenswrapper[4792]: I0309 09:50:53.695876 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9ff0f9ff-023a-4679-a084-1d4ae368e02d-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"9ff0f9ff-023a-4679-a084-1d4ae368e02d\" (UID: \"9ff0f9ff-023a-4679-a084-1d4ae368e02d\") " Mar 09 09:50:53 crc kubenswrapper[4792]: I0309 09:50:53.696019 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ff0f9ff-023a-4679-a084-1d4ae368e02d-repo-setup-combined-ca-bundle\") pod 
\"9ff0f9ff-023a-4679-a084-1d4ae368e02d\" (UID: \"9ff0f9ff-023a-4679-a084-1d4ae368e02d\") " Mar 09 09:50:53 crc kubenswrapper[4792]: I0309 09:50:53.696148 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qm2sg\" (UniqueName: \"kubernetes.io/projected/9ff0f9ff-023a-4679-a084-1d4ae368e02d-kube-api-access-qm2sg\") pod \"9ff0f9ff-023a-4679-a084-1d4ae368e02d\" (UID: \"9ff0f9ff-023a-4679-a084-1d4ae368e02d\") " Mar 09 09:50:53 crc kubenswrapper[4792]: I0309 09:50:53.696636 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ff0f9ff-023a-4679-a084-1d4ae368e02d-ovn-combined-ca-bundle\") pod \"9ff0f9ff-023a-4679-a084-1d4ae368e02d\" (UID: \"9ff0f9ff-023a-4679-a084-1d4ae368e02d\") " Mar 09 09:50:53 crc kubenswrapper[4792]: I0309 09:50:53.696727 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/9ff0f9ff-023a-4679-a084-1d4ae368e02d-ceph\") pod \"9ff0f9ff-023a-4679-a084-1d4ae368e02d\" (UID: \"9ff0f9ff-023a-4679-a084-1d4ae368e02d\") " Mar 09 09:50:53 crc kubenswrapper[4792]: I0309 09:50:53.696828 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9ff0f9ff-023a-4679-a084-1d4ae368e02d-ssh-key-openstack-edpm-ipam\") pod \"9ff0f9ff-023a-4679-a084-1d4ae368e02d\" (UID: \"9ff0f9ff-023a-4679-a084-1d4ae368e02d\") " Mar 09 09:50:53 crc kubenswrapper[4792]: I0309 09:50:53.696979 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ff0f9ff-023a-4679-a084-1d4ae368e02d-neutron-metadata-combined-ca-bundle\") pod \"9ff0f9ff-023a-4679-a084-1d4ae368e02d\" (UID: \"9ff0f9ff-023a-4679-a084-1d4ae368e02d\") " Mar 09 09:50:53 crc kubenswrapper[4792]: I0309 09:50:53.697089 4792 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9ff0f9ff-023a-4679-a084-1d4ae368e02d-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"9ff0f9ff-023a-4679-a084-1d4ae368e02d\" (UID: \"9ff0f9ff-023a-4679-a084-1d4ae368e02d\") " Mar 09 09:50:53 crc kubenswrapper[4792]: I0309 09:50:53.697213 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ff0f9ff-023a-4679-a084-1d4ae368e02d-bootstrap-combined-ca-bundle\") pod \"9ff0f9ff-023a-4679-a084-1d4ae368e02d\" (UID: \"9ff0f9ff-023a-4679-a084-1d4ae368e02d\") " Mar 09 09:50:53 crc kubenswrapper[4792]: I0309 09:50:53.697348 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9ff0f9ff-023a-4679-a084-1d4ae368e02d-inventory\") pod \"9ff0f9ff-023a-4679-a084-1d4ae368e02d\" (UID: \"9ff0f9ff-023a-4679-a084-1d4ae368e02d\") " Mar 09 09:50:53 crc kubenswrapper[4792]: I0309 09:50:53.702575 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ff0f9ff-023a-4679-a084-1d4ae368e02d-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "9ff0f9ff-023a-4679-a084-1d4ae368e02d" (UID: "9ff0f9ff-023a-4679-a084-1d4ae368e02d"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:50:53 crc kubenswrapper[4792]: I0309 09:50:53.703005 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ff0f9ff-023a-4679-a084-1d4ae368e02d-ceph" (OuterVolumeSpecName: "ceph") pod "9ff0f9ff-023a-4679-a084-1d4ae368e02d" (UID: "9ff0f9ff-023a-4679-a084-1d4ae368e02d"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:50:53 crc kubenswrapper[4792]: I0309 09:50:53.703310 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ff0f9ff-023a-4679-a084-1d4ae368e02d-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "9ff0f9ff-023a-4679-a084-1d4ae368e02d" (UID: "9ff0f9ff-023a-4679-a084-1d4ae368e02d"). InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:50:53 crc kubenswrapper[4792]: I0309 09:50:53.703512 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ff0f9ff-023a-4679-a084-1d4ae368e02d-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "9ff0f9ff-023a-4679-a084-1d4ae368e02d" (UID: "9ff0f9ff-023a-4679-a084-1d4ae368e02d"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:50:53 crc kubenswrapper[4792]: I0309 09:50:53.703945 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ff0f9ff-023a-4679-a084-1d4ae368e02d-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "9ff0f9ff-023a-4679-a084-1d4ae368e02d" (UID: "9ff0f9ff-023a-4679-a084-1d4ae368e02d"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:50:53 crc kubenswrapper[4792]: I0309 09:50:53.705720 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ff0f9ff-023a-4679-a084-1d4ae368e02d-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "9ff0f9ff-023a-4679-a084-1d4ae368e02d" (UID: "9ff0f9ff-023a-4679-a084-1d4ae368e02d"). 
InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:50:53 crc kubenswrapper[4792]: I0309 09:50:53.705823 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ff0f9ff-023a-4679-a084-1d4ae368e02d-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "9ff0f9ff-023a-4679-a084-1d4ae368e02d" (UID: "9ff0f9ff-023a-4679-a084-1d4ae368e02d"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:50:53 crc kubenswrapper[4792]: I0309 09:50:53.705947 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ff0f9ff-023a-4679-a084-1d4ae368e02d-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "9ff0f9ff-023a-4679-a084-1d4ae368e02d" (UID: "9ff0f9ff-023a-4679-a084-1d4ae368e02d"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:50:53 crc kubenswrapper[4792]: I0309 09:50:53.706402 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ff0f9ff-023a-4679-a084-1d4ae368e02d-kube-api-access-qm2sg" (OuterVolumeSpecName: "kube-api-access-qm2sg") pod "9ff0f9ff-023a-4679-a084-1d4ae368e02d" (UID: "9ff0f9ff-023a-4679-a084-1d4ae368e02d"). InnerVolumeSpecName "kube-api-access-qm2sg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:50:53 crc kubenswrapper[4792]: I0309 09:50:53.706835 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ff0f9ff-023a-4679-a084-1d4ae368e02d-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "9ff0f9ff-023a-4679-a084-1d4ae368e02d" (UID: "9ff0f9ff-023a-4679-a084-1d4ae368e02d"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:50:53 crc kubenswrapper[4792]: I0309 09:50:53.707140 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ff0f9ff-023a-4679-a084-1d4ae368e02d-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "9ff0f9ff-023a-4679-a084-1d4ae368e02d" (UID: "9ff0f9ff-023a-4679-a084-1d4ae368e02d"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:50:53 crc kubenswrapper[4792]: I0309 09:50:53.727396 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ff0f9ff-023a-4679-a084-1d4ae368e02d-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "9ff0f9ff-023a-4679-a084-1d4ae368e02d" (UID: "9ff0f9ff-023a-4679-a084-1d4ae368e02d"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:50:53 crc kubenswrapper[4792]: I0309 09:50:53.729482 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ff0f9ff-023a-4679-a084-1d4ae368e02d-inventory" (OuterVolumeSpecName: "inventory") pod "9ff0f9ff-023a-4679-a084-1d4ae368e02d" (UID: "9ff0f9ff-023a-4679-a084-1d4ae368e02d"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:50:53 crc kubenswrapper[4792]: I0309 09:50:53.800624 4792 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9ff0f9ff-023a-4679-a084-1d4ae368e02d-inventory\") on node \"crc\" DevicePath \"\"" Mar 09 09:50:53 crc kubenswrapper[4792]: I0309 09:50:53.800674 4792 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ff0f9ff-023a-4679-a084-1d4ae368e02d-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 09:50:53 crc kubenswrapper[4792]: I0309 09:50:53.800688 4792 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9ff0f9ff-023a-4679-a084-1d4ae368e02d-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 09 09:50:53 crc kubenswrapper[4792]: I0309 09:50:53.800700 4792 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ff0f9ff-023a-4679-a084-1d4ae368e02d-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 09:50:53 crc kubenswrapper[4792]: I0309 09:50:53.800714 4792 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9ff0f9ff-023a-4679-a084-1d4ae368e02d-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 09 09:50:53 crc kubenswrapper[4792]: I0309 09:50:53.800726 4792 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ff0f9ff-023a-4679-a084-1d4ae368e02d-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 09:50:53 crc kubenswrapper[4792]: I0309 09:50:53.800736 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qm2sg\" (UniqueName: 
\"kubernetes.io/projected/9ff0f9ff-023a-4679-a084-1d4ae368e02d-kube-api-access-qm2sg\") on node \"crc\" DevicePath \"\"" Mar 09 09:50:53 crc kubenswrapper[4792]: I0309 09:50:53.800750 4792 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ff0f9ff-023a-4679-a084-1d4ae368e02d-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 09:50:53 crc kubenswrapper[4792]: I0309 09:50:53.800762 4792 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/9ff0f9ff-023a-4679-a084-1d4ae368e02d-ceph\") on node \"crc\" DevicePath \"\"" Mar 09 09:50:53 crc kubenswrapper[4792]: I0309 09:50:53.800773 4792 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9ff0f9ff-023a-4679-a084-1d4ae368e02d-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 09 09:50:53 crc kubenswrapper[4792]: I0309 09:50:53.800782 4792 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ff0f9ff-023a-4679-a084-1d4ae368e02d-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 09:50:53 crc kubenswrapper[4792]: I0309 09:50:53.800793 4792 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9ff0f9ff-023a-4679-a084-1d4ae368e02d-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 09 09:50:53 crc kubenswrapper[4792]: I0309 09:50:53.800804 4792 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ff0f9ff-023a-4679-a084-1d4ae368e02d-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 09:50:54 crc kubenswrapper[4792]: I0309 09:50:54.206877 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7qdcq" event={"ID":"9ff0f9ff-023a-4679-a084-1d4ae368e02d","Type":"ContainerDied","Data":"4062771c21233b47ddfa29982155724f0d61751eb588232ad4a657eb0a2fe38f"} Mar 09 09:50:54 crc kubenswrapper[4792]: I0309 09:50:54.206919 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4062771c21233b47ddfa29982155724f0d61751eb588232ad4a657eb0a2fe38f" Mar 09 09:50:54 crc kubenswrapper[4792]: I0309 09:50:54.206972 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-7qdcq" Mar 09 09:50:54 crc kubenswrapper[4792]: I0309 09:50:54.306721 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-x9rrm"] Mar 09 09:50:54 crc kubenswrapper[4792]: E0309 09:50:54.307149 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ff0f9ff-023a-4679-a084-1d4ae368e02d" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Mar 09 09:50:54 crc kubenswrapper[4792]: I0309 09:50:54.307169 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ff0f9ff-023a-4679-a084-1d4ae368e02d" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Mar 09 09:50:54 crc kubenswrapper[4792]: I0309 09:50:54.307376 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ff0f9ff-023a-4679-a084-1d4ae368e02d" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Mar 09 09:50:54 crc kubenswrapper[4792]: I0309 09:50:54.308040 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-x9rrm" Mar 09 09:50:54 crc kubenswrapper[4792]: I0309 09:50:54.310445 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Mar 09 09:50:54 crc kubenswrapper[4792]: I0309 09:50:54.310505 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 09 09:50:54 crc kubenswrapper[4792]: I0309 09:50:54.311379 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 09 09:50:54 crc kubenswrapper[4792]: I0309 09:50:54.311870 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-4g5l6" Mar 09 09:50:54 crc kubenswrapper[4792]: I0309 09:50:54.312505 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 09 09:50:54 crc kubenswrapper[4792]: I0309 09:50:54.326644 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-x9rrm"] Mar 09 09:50:54 crc kubenswrapper[4792]: I0309 09:50:54.410620 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wkvrf\" (UniqueName: \"kubernetes.io/projected/e8ade593-bf65-47e0-8be9-76c8fedc40a1-kube-api-access-wkvrf\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-x9rrm\" (UID: \"e8ade593-bf65-47e0-8be9-76c8fedc40a1\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-x9rrm" Mar 09 09:50:54 crc kubenswrapper[4792]: I0309 09:50:54.410702 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e8ade593-bf65-47e0-8be9-76c8fedc40a1-inventory\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-x9rrm\" (UID: 
\"e8ade593-bf65-47e0-8be9-76c8fedc40a1\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-x9rrm" Mar 09 09:50:54 crc kubenswrapper[4792]: I0309 09:50:54.410748 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e8ade593-bf65-47e0-8be9-76c8fedc40a1-ceph\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-x9rrm\" (UID: \"e8ade593-bf65-47e0-8be9-76c8fedc40a1\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-x9rrm" Mar 09 09:50:54 crc kubenswrapper[4792]: I0309 09:50:54.410766 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e8ade593-bf65-47e0-8be9-76c8fedc40a1-ssh-key-openstack-edpm-ipam\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-x9rrm\" (UID: \"e8ade593-bf65-47e0-8be9-76c8fedc40a1\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-x9rrm" Mar 09 09:50:54 crc kubenswrapper[4792]: I0309 09:50:54.512388 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wkvrf\" (UniqueName: \"kubernetes.io/projected/e8ade593-bf65-47e0-8be9-76c8fedc40a1-kube-api-access-wkvrf\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-x9rrm\" (UID: \"e8ade593-bf65-47e0-8be9-76c8fedc40a1\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-x9rrm" Mar 09 09:50:54 crc kubenswrapper[4792]: I0309 09:50:54.512484 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e8ade593-bf65-47e0-8be9-76c8fedc40a1-inventory\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-x9rrm\" (UID: \"e8ade593-bf65-47e0-8be9-76c8fedc40a1\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-x9rrm" Mar 09 09:50:54 crc kubenswrapper[4792]: I0309 09:50:54.512550 4792 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e8ade593-bf65-47e0-8be9-76c8fedc40a1-ceph\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-x9rrm\" (UID: \"e8ade593-bf65-47e0-8be9-76c8fedc40a1\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-x9rrm" Mar 09 09:50:54 crc kubenswrapper[4792]: I0309 09:50:54.512574 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e8ade593-bf65-47e0-8be9-76c8fedc40a1-ssh-key-openstack-edpm-ipam\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-x9rrm\" (UID: \"e8ade593-bf65-47e0-8be9-76c8fedc40a1\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-x9rrm" Mar 09 09:50:54 crc kubenswrapper[4792]: I0309 09:50:54.533961 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e8ade593-bf65-47e0-8be9-76c8fedc40a1-ssh-key-openstack-edpm-ipam\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-x9rrm\" (UID: \"e8ade593-bf65-47e0-8be9-76c8fedc40a1\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-x9rrm" Mar 09 09:50:54 crc kubenswrapper[4792]: I0309 09:50:54.533981 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e8ade593-bf65-47e0-8be9-76c8fedc40a1-ceph\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-x9rrm\" (UID: \"e8ade593-bf65-47e0-8be9-76c8fedc40a1\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-x9rrm" Mar 09 09:50:54 crc kubenswrapper[4792]: I0309 09:50:54.534466 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e8ade593-bf65-47e0-8be9-76c8fedc40a1-inventory\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-x9rrm\" (UID: 
\"e8ade593-bf65-47e0-8be9-76c8fedc40a1\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-x9rrm" Mar 09 09:50:54 crc kubenswrapper[4792]: I0309 09:50:54.539159 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wkvrf\" (UniqueName: \"kubernetes.io/projected/e8ade593-bf65-47e0-8be9-76c8fedc40a1-kube-api-access-wkvrf\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-x9rrm\" (UID: \"e8ade593-bf65-47e0-8be9-76c8fedc40a1\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-x9rrm" Mar 09 09:50:54 crc kubenswrapper[4792]: I0309 09:50:54.630301 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-x9rrm" Mar 09 09:50:55 crc kubenswrapper[4792]: I0309 09:50:55.255009 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-x9rrm"] Mar 09 09:50:56 crc kubenswrapper[4792]: I0309 09:50:56.222258 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-x9rrm" event={"ID":"e8ade593-bf65-47e0-8be9-76c8fedc40a1","Type":"ContainerStarted","Data":"443bd41bcd97c2be1e145e464d6644236856e7c876d2f5e0b00ef88b73a8d971"} Mar 09 09:50:56 crc kubenswrapper[4792]: I0309 09:50:56.222823 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-x9rrm" event={"ID":"e8ade593-bf65-47e0-8be9-76c8fedc40a1","Type":"ContainerStarted","Data":"e2b85616ab5060edbc450a1c1799bdcd06793b84d01e2d9f3889f0aa41b66052"} Mar 09 09:50:56 crc kubenswrapper[4792]: I0309 09:50:56.239444 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-x9rrm" podStartSLOduration=1.839644445 podStartE2EDuration="2.239429814s" podCreationTimestamp="2026-03-09 09:50:54 +0000 UTC" firstStartedPulling="2026-03-09 
09:50:55.258083262 +0000 UTC m=+2620.288284014" lastFinishedPulling="2026-03-09 09:50:55.657868631 +0000 UTC m=+2620.688069383" observedRunningTime="2026-03-09 09:50:56.238761386 +0000 UTC m=+2621.268962138" watchObservedRunningTime="2026-03-09 09:50:56.239429814 +0000 UTC m=+2621.269630566" Mar 09 09:51:01 crc kubenswrapper[4792]: I0309 09:51:01.261721 4792 generic.go:334] "Generic (PLEG): container finished" podID="e8ade593-bf65-47e0-8be9-76c8fedc40a1" containerID="443bd41bcd97c2be1e145e464d6644236856e7c876d2f5e0b00ef88b73a8d971" exitCode=0 Mar 09 09:51:01 crc kubenswrapper[4792]: I0309 09:51:01.261796 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-x9rrm" event={"ID":"e8ade593-bf65-47e0-8be9-76c8fedc40a1","Type":"ContainerDied","Data":"443bd41bcd97c2be1e145e464d6644236856e7c876d2f5e0b00ef88b73a8d971"} Mar 09 09:51:02 crc kubenswrapper[4792]: I0309 09:51:02.772722 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-x9rrm" Mar 09 09:51:02 crc kubenswrapper[4792]: I0309 09:51:02.878378 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e8ade593-bf65-47e0-8be9-76c8fedc40a1-inventory\") pod \"e8ade593-bf65-47e0-8be9-76c8fedc40a1\" (UID: \"e8ade593-bf65-47e0-8be9-76c8fedc40a1\") " Mar 09 09:51:02 crc kubenswrapper[4792]: I0309 09:51:02.878559 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e8ade593-bf65-47e0-8be9-76c8fedc40a1-ceph\") pod \"e8ade593-bf65-47e0-8be9-76c8fedc40a1\" (UID: \"e8ade593-bf65-47e0-8be9-76c8fedc40a1\") " Mar 09 09:51:02 crc kubenswrapper[4792]: I0309 09:51:02.878828 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wkvrf\" (UniqueName: \"kubernetes.io/projected/e8ade593-bf65-47e0-8be9-76c8fedc40a1-kube-api-access-wkvrf\") pod \"e8ade593-bf65-47e0-8be9-76c8fedc40a1\" (UID: \"e8ade593-bf65-47e0-8be9-76c8fedc40a1\") " Mar 09 09:51:02 crc kubenswrapper[4792]: I0309 09:51:02.878895 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e8ade593-bf65-47e0-8be9-76c8fedc40a1-ssh-key-openstack-edpm-ipam\") pod \"e8ade593-bf65-47e0-8be9-76c8fedc40a1\" (UID: \"e8ade593-bf65-47e0-8be9-76c8fedc40a1\") " Mar 09 09:51:02 crc kubenswrapper[4792]: I0309 09:51:02.886895 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e8ade593-bf65-47e0-8be9-76c8fedc40a1-kube-api-access-wkvrf" (OuterVolumeSpecName: "kube-api-access-wkvrf") pod "e8ade593-bf65-47e0-8be9-76c8fedc40a1" (UID: "e8ade593-bf65-47e0-8be9-76c8fedc40a1"). InnerVolumeSpecName "kube-api-access-wkvrf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:51:02 crc kubenswrapper[4792]: I0309 09:51:02.888334 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8ade593-bf65-47e0-8be9-76c8fedc40a1-ceph" (OuterVolumeSpecName: "ceph") pod "e8ade593-bf65-47e0-8be9-76c8fedc40a1" (UID: "e8ade593-bf65-47e0-8be9-76c8fedc40a1"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:51:02 crc kubenswrapper[4792]: I0309 09:51:02.911487 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8ade593-bf65-47e0-8be9-76c8fedc40a1-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "e8ade593-bf65-47e0-8be9-76c8fedc40a1" (UID: "e8ade593-bf65-47e0-8be9-76c8fedc40a1"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:51:02 crc kubenswrapper[4792]: I0309 09:51:02.913676 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8ade593-bf65-47e0-8be9-76c8fedc40a1-inventory" (OuterVolumeSpecName: "inventory") pod "e8ade593-bf65-47e0-8be9-76c8fedc40a1" (UID: "e8ade593-bf65-47e0-8be9-76c8fedc40a1"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:51:02 crc kubenswrapper[4792]: I0309 09:51:02.981864 4792 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e8ade593-bf65-47e0-8be9-76c8fedc40a1-ceph\") on node \"crc\" DevicePath \"\"" Mar 09 09:51:02 crc kubenswrapper[4792]: I0309 09:51:02.982209 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wkvrf\" (UniqueName: \"kubernetes.io/projected/e8ade593-bf65-47e0-8be9-76c8fedc40a1-kube-api-access-wkvrf\") on node \"crc\" DevicePath \"\"" Mar 09 09:51:02 crc kubenswrapper[4792]: I0309 09:51:02.982228 4792 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e8ade593-bf65-47e0-8be9-76c8fedc40a1-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 09 09:51:02 crc kubenswrapper[4792]: I0309 09:51:02.982238 4792 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e8ade593-bf65-47e0-8be9-76c8fedc40a1-inventory\") on node \"crc\" DevicePath \"\"" Mar 09 09:51:03 crc kubenswrapper[4792]: I0309 09:51:03.279899 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-x9rrm" event={"ID":"e8ade593-bf65-47e0-8be9-76c8fedc40a1","Type":"ContainerDied","Data":"e2b85616ab5060edbc450a1c1799bdcd06793b84d01e2d9f3889f0aa41b66052"} Mar 09 09:51:03 crc kubenswrapper[4792]: I0309 09:51:03.279948 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e2b85616ab5060edbc450a1c1799bdcd06793b84d01e2d9f3889f0aa41b66052" Mar 09 09:51:03 crc kubenswrapper[4792]: I0309 09:51:03.280011 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-x9rrm" Mar 09 09:51:03 crc kubenswrapper[4792]: I0309 09:51:03.393072 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-7c2rh"] Mar 09 09:51:03 crc kubenswrapper[4792]: E0309 09:51:03.393512 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8ade593-bf65-47e0-8be9-76c8fedc40a1" containerName="ceph-client-edpm-deployment-openstack-edpm-ipam" Mar 09 09:51:03 crc kubenswrapper[4792]: I0309 09:51:03.393533 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8ade593-bf65-47e0-8be9-76c8fedc40a1" containerName="ceph-client-edpm-deployment-openstack-edpm-ipam" Mar 09 09:51:03 crc kubenswrapper[4792]: I0309 09:51:03.393748 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8ade593-bf65-47e0-8be9-76c8fedc40a1" containerName="ceph-client-edpm-deployment-openstack-edpm-ipam" Mar 09 09:51:03 crc kubenswrapper[4792]: I0309 09:51:03.394434 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7c2rh" Mar 09 09:51:03 crc kubenswrapper[4792]: I0309 09:51:03.397556 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 09 09:51:03 crc kubenswrapper[4792]: I0309 09:51:03.397712 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 09 09:51:03 crc kubenswrapper[4792]: I0309 09:51:03.398267 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Mar 09 09:51:03 crc kubenswrapper[4792]: I0309 09:51:03.398500 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Mar 09 09:51:03 crc kubenswrapper[4792]: I0309 09:51:03.399909 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 09 09:51:03 crc kubenswrapper[4792]: I0309 09:51:03.401757 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-4g5l6" Mar 09 09:51:03 crc kubenswrapper[4792]: I0309 09:51:03.414918 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-7c2rh"] Mar 09 09:51:03 crc kubenswrapper[4792]: I0309 09:51:03.491795 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5a32778-1a93-440b-9f56-d0bded50a725-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-7c2rh\" (UID: \"c5a32778-1a93-440b-9f56-d0bded50a725\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7c2rh" Mar 09 09:51:03 crc kubenswrapper[4792]: I0309 09:51:03.492163 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: 
\"kubernetes.io/secret/c5a32778-1a93-440b-9f56-d0bded50a725-ceph\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-7c2rh\" (UID: \"c5a32778-1a93-440b-9f56-d0bded50a725\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7c2rh" Mar 09 09:51:03 crc kubenswrapper[4792]: I0309 09:51:03.492481 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/c5a32778-1a93-440b-9f56-d0bded50a725-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-7c2rh\" (UID: \"c5a32778-1a93-440b-9f56-d0bded50a725\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7c2rh" Mar 09 09:51:03 crc kubenswrapper[4792]: I0309 09:51:03.492697 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c5a32778-1a93-440b-9f56-d0bded50a725-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-7c2rh\" (UID: \"c5a32778-1a93-440b-9f56-d0bded50a725\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7c2rh" Mar 09 09:51:03 crc kubenswrapper[4792]: I0309 09:51:03.492859 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nzhk9\" (UniqueName: \"kubernetes.io/projected/c5a32778-1a93-440b-9f56-d0bded50a725-kube-api-access-nzhk9\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-7c2rh\" (UID: \"c5a32778-1a93-440b-9f56-d0bded50a725\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7c2rh" Mar 09 09:51:03 crc kubenswrapper[4792]: I0309 09:51:03.493205 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c5a32778-1a93-440b-9f56-d0bded50a725-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-7c2rh\" (UID: \"c5a32778-1a93-440b-9f56-d0bded50a725\") " 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7c2rh" Mar 09 09:51:03 crc kubenswrapper[4792]: I0309 09:51:03.594989 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nzhk9\" (UniqueName: \"kubernetes.io/projected/c5a32778-1a93-440b-9f56-d0bded50a725-kube-api-access-nzhk9\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-7c2rh\" (UID: \"c5a32778-1a93-440b-9f56-d0bded50a725\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7c2rh" Mar 09 09:51:03 crc kubenswrapper[4792]: I0309 09:51:03.595392 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c5a32778-1a93-440b-9f56-d0bded50a725-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-7c2rh\" (UID: \"c5a32778-1a93-440b-9f56-d0bded50a725\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7c2rh" Mar 09 09:51:03 crc kubenswrapper[4792]: I0309 09:51:03.595419 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5a32778-1a93-440b-9f56-d0bded50a725-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-7c2rh\" (UID: \"c5a32778-1a93-440b-9f56-d0bded50a725\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7c2rh" Mar 09 09:51:03 crc kubenswrapper[4792]: I0309 09:51:03.595453 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/c5a32778-1a93-440b-9f56-d0bded50a725-ceph\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-7c2rh\" (UID: \"c5a32778-1a93-440b-9f56-d0bded50a725\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7c2rh" Mar 09 09:51:03 crc kubenswrapper[4792]: I0309 09:51:03.595499 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: 
\"kubernetes.io/configmap/c5a32778-1a93-440b-9f56-d0bded50a725-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-7c2rh\" (UID: \"c5a32778-1a93-440b-9f56-d0bded50a725\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7c2rh" Mar 09 09:51:03 crc kubenswrapper[4792]: I0309 09:51:03.595533 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c5a32778-1a93-440b-9f56-d0bded50a725-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-7c2rh\" (UID: \"c5a32778-1a93-440b-9f56-d0bded50a725\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7c2rh" Mar 09 09:51:03 crc kubenswrapper[4792]: I0309 09:51:03.597376 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/c5a32778-1a93-440b-9f56-d0bded50a725-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-7c2rh\" (UID: \"c5a32778-1a93-440b-9f56-d0bded50a725\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7c2rh" Mar 09 09:51:03 crc kubenswrapper[4792]: I0309 09:51:03.601553 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c5a32778-1a93-440b-9f56-d0bded50a725-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-7c2rh\" (UID: \"c5a32778-1a93-440b-9f56-d0bded50a725\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7c2rh" Mar 09 09:51:03 crc kubenswrapper[4792]: I0309 09:51:03.601738 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5a32778-1a93-440b-9f56-d0bded50a725-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-7c2rh\" (UID: \"c5a32778-1a93-440b-9f56-d0bded50a725\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7c2rh" Mar 09 09:51:03 crc 
kubenswrapper[4792]: I0309 09:51:03.601938 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/c5a32778-1a93-440b-9f56-d0bded50a725-ceph\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-7c2rh\" (UID: \"c5a32778-1a93-440b-9f56-d0bded50a725\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7c2rh" Mar 09 09:51:03 crc kubenswrapper[4792]: I0309 09:51:03.602653 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c5a32778-1a93-440b-9f56-d0bded50a725-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-7c2rh\" (UID: \"c5a32778-1a93-440b-9f56-d0bded50a725\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7c2rh" Mar 09 09:51:03 crc kubenswrapper[4792]: I0309 09:51:03.618014 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nzhk9\" (UniqueName: \"kubernetes.io/projected/c5a32778-1a93-440b-9f56-d0bded50a725-kube-api-access-nzhk9\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-7c2rh\" (UID: \"c5a32778-1a93-440b-9f56-d0bded50a725\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7c2rh" Mar 09 09:51:03 crc kubenswrapper[4792]: I0309 09:51:03.717501 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7c2rh" Mar 09 09:51:04 crc kubenswrapper[4792]: I0309 09:51:04.461852 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-7c2rh"] Mar 09 09:51:05 crc kubenswrapper[4792]: I0309 09:51:05.299224 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7c2rh" event={"ID":"c5a32778-1a93-440b-9f56-d0bded50a725","Type":"ContainerStarted","Data":"5bdbf5e95c7da63dcf87670d059c7238af157fe480689c0200eefb49df90998a"} Mar 09 09:51:06 crc kubenswrapper[4792]: I0309 09:51:06.314618 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7c2rh" event={"ID":"c5a32778-1a93-440b-9f56-d0bded50a725","Type":"ContainerStarted","Data":"b1e8a3d51d053da599390e2b780e3ca312329877273b78bb5263562584788469"} Mar 09 09:51:36 crc kubenswrapper[4792]: I0309 09:51:36.934788 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7c2rh" podStartSLOduration=32.799807595 podStartE2EDuration="33.934768191s" podCreationTimestamp="2026-03-09 09:51:03 +0000 UTC" firstStartedPulling="2026-03-09 09:51:04.474682736 +0000 UTC m=+2629.504883488" lastFinishedPulling="2026-03-09 09:51:05.609643342 +0000 UTC m=+2630.639844084" observedRunningTime="2026-03-09 09:51:06.334388335 +0000 UTC m=+2631.364589097" watchObservedRunningTime="2026-03-09 09:51:36.934768191 +0000 UTC m=+2661.964968943" Mar 09 09:51:36 crc kubenswrapper[4792]: I0309 09:51:36.939547 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-qbtrp"] Mar 09 09:51:36 crc kubenswrapper[4792]: I0309 09:51:36.941360 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-qbtrp" Mar 09 09:51:36 crc kubenswrapper[4792]: I0309 09:51:36.950536 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qbtrp"] Mar 09 09:51:37 crc kubenswrapper[4792]: I0309 09:51:37.038788 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0fb276d4-26dd-43a7-84d2-dba1f81f5bd0-catalog-content\") pod \"redhat-operators-qbtrp\" (UID: \"0fb276d4-26dd-43a7-84d2-dba1f81f5bd0\") " pod="openshift-marketplace/redhat-operators-qbtrp" Mar 09 09:51:37 crc kubenswrapper[4792]: I0309 09:51:37.038847 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rjblk\" (UniqueName: \"kubernetes.io/projected/0fb276d4-26dd-43a7-84d2-dba1f81f5bd0-kube-api-access-rjblk\") pod \"redhat-operators-qbtrp\" (UID: \"0fb276d4-26dd-43a7-84d2-dba1f81f5bd0\") " pod="openshift-marketplace/redhat-operators-qbtrp" Mar 09 09:51:37 crc kubenswrapper[4792]: I0309 09:51:37.038992 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0fb276d4-26dd-43a7-84d2-dba1f81f5bd0-utilities\") pod \"redhat-operators-qbtrp\" (UID: \"0fb276d4-26dd-43a7-84d2-dba1f81f5bd0\") " pod="openshift-marketplace/redhat-operators-qbtrp" Mar 09 09:51:37 crc kubenswrapper[4792]: I0309 09:51:37.140566 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0fb276d4-26dd-43a7-84d2-dba1f81f5bd0-utilities\") pod \"redhat-operators-qbtrp\" (UID: \"0fb276d4-26dd-43a7-84d2-dba1f81f5bd0\") " pod="openshift-marketplace/redhat-operators-qbtrp" Mar 09 09:51:37 crc kubenswrapper[4792]: I0309 09:51:37.140695 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0fb276d4-26dd-43a7-84d2-dba1f81f5bd0-catalog-content\") pod \"redhat-operators-qbtrp\" (UID: \"0fb276d4-26dd-43a7-84d2-dba1f81f5bd0\") " pod="openshift-marketplace/redhat-operators-qbtrp" Mar 09 09:51:37 crc kubenswrapper[4792]: I0309 09:51:37.140726 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rjblk\" (UniqueName: \"kubernetes.io/projected/0fb276d4-26dd-43a7-84d2-dba1f81f5bd0-kube-api-access-rjblk\") pod \"redhat-operators-qbtrp\" (UID: \"0fb276d4-26dd-43a7-84d2-dba1f81f5bd0\") " pod="openshift-marketplace/redhat-operators-qbtrp" Mar 09 09:51:37 crc kubenswrapper[4792]: I0309 09:51:37.141400 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0fb276d4-26dd-43a7-84d2-dba1f81f5bd0-utilities\") pod \"redhat-operators-qbtrp\" (UID: \"0fb276d4-26dd-43a7-84d2-dba1f81f5bd0\") " pod="openshift-marketplace/redhat-operators-qbtrp" Mar 09 09:51:37 crc kubenswrapper[4792]: I0309 09:51:37.141400 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0fb276d4-26dd-43a7-84d2-dba1f81f5bd0-catalog-content\") pod \"redhat-operators-qbtrp\" (UID: \"0fb276d4-26dd-43a7-84d2-dba1f81f5bd0\") " pod="openshift-marketplace/redhat-operators-qbtrp" Mar 09 09:51:37 crc kubenswrapper[4792]: I0309 09:51:37.169002 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rjblk\" (UniqueName: \"kubernetes.io/projected/0fb276d4-26dd-43a7-84d2-dba1f81f5bd0-kube-api-access-rjblk\") pod \"redhat-operators-qbtrp\" (UID: \"0fb276d4-26dd-43a7-84d2-dba1f81f5bd0\") " pod="openshift-marketplace/redhat-operators-qbtrp" Mar 09 09:51:37 crc kubenswrapper[4792]: I0309 09:51:37.266270 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-qbtrp" Mar 09 09:51:37 crc kubenswrapper[4792]: I0309 09:51:37.913157 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qbtrp"] Mar 09 09:51:38 crc kubenswrapper[4792]: I0309 09:51:38.634224 4792 generic.go:334] "Generic (PLEG): container finished" podID="0fb276d4-26dd-43a7-84d2-dba1f81f5bd0" containerID="274a142324d47aa861d5dcc16aef19b614eac660c9d9a47c19b44ccff56ffad3" exitCode=0 Mar 09 09:51:38 crc kubenswrapper[4792]: I0309 09:51:38.634481 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qbtrp" event={"ID":"0fb276d4-26dd-43a7-84d2-dba1f81f5bd0","Type":"ContainerDied","Data":"274a142324d47aa861d5dcc16aef19b614eac660c9d9a47c19b44ccff56ffad3"} Mar 09 09:51:38 crc kubenswrapper[4792]: I0309 09:51:38.634605 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qbtrp" event={"ID":"0fb276d4-26dd-43a7-84d2-dba1f81f5bd0","Type":"ContainerStarted","Data":"6db5a9fe79759af4c6b4eaa7eac010050c12fdf3811c200e050666a3978e5177"} Mar 09 09:51:39 crc kubenswrapper[4792]: I0309 09:51:39.646165 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qbtrp" event={"ID":"0fb276d4-26dd-43a7-84d2-dba1f81f5bd0","Type":"ContainerStarted","Data":"50a6072c2456dd6a72b5e769f33bbf4ac021687990f5d22c2099261842f97901"} Mar 09 09:51:44 crc kubenswrapper[4792]: I0309 09:51:44.695508 4792 generic.go:334] "Generic (PLEG): container finished" podID="0fb276d4-26dd-43a7-84d2-dba1f81f5bd0" containerID="50a6072c2456dd6a72b5e769f33bbf4ac021687990f5d22c2099261842f97901" exitCode=0 Mar 09 09:51:44 crc kubenswrapper[4792]: I0309 09:51:44.695616 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qbtrp" 
event={"ID":"0fb276d4-26dd-43a7-84d2-dba1f81f5bd0","Type":"ContainerDied","Data":"50a6072c2456dd6a72b5e769f33bbf4ac021687990f5d22c2099261842f97901"} Mar 09 09:51:46 crc kubenswrapper[4792]: I0309 09:51:46.720502 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qbtrp" event={"ID":"0fb276d4-26dd-43a7-84d2-dba1f81f5bd0","Type":"ContainerStarted","Data":"1ab259a4c7c167f94d1d8b13c7088cab31186b9f66b2ee02297477306123c4d9"} Mar 09 09:51:46 crc kubenswrapper[4792]: I0309 09:51:46.754565 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-qbtrp" podStartSLOduration=3.247846372 podStartE2EDuration="10.75454472s" podCreationTimestamp="2026-03-09 09:51:36 +0000 UTC" firstStartedPulling="2026-03-09 09:51:38.635881448 +0000 UTC m=+2663.666082200" lastFinishedPulling="2026-03-09 09:51:46.142579786 +0000 UTC m=+2671.172780548" observedRunningTime="2026-03-09 09:51:46.746773838 +0000 UTC m=+2671.776974600" watchObservedRunningTime="2026-03-09 09:51:46.75454472 +0000 UTC m=+2671.784745472" Mar 09 09:51:47 crc kubenswrapper[4792]: I0309 09:51:47.266562 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-qbtrp" Mar 09 09:51:47 crc kubenswrapper[4792]: I0309 09:51:47.266619 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-qbtrp" Mar 09 09:51:48 crc kubenswrapper[4792]: I0309 09:51:48.316009 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-qbtrp" podUID="0fb276d4-26dd-43a7-84d2-dba1f81f5bd0" containerName="registry-server" probeResult="failure" output=< Mar 09 09:51:48 crc kubenswrapper[4792]: timeout: failed to connect service ":50051" within 1s Mar 09 09:51:48 crc kubenswrapper[4792]: > Mar 09 09:51:58 crc kubenswrapper[4792]: I0309 09:51:58.314753 4792 prober.go:107] "Probe failed" 
probeType="Startup" pod="openshift-marketplace/redhat-operators-qbtrp" podUID="0fb276d4-26dd-43a7-84d2-dba1f81f5bd0" containerName="registry-server" probeResult="failure" output=< Mar 09 09:51:58 crc kubenswrapper[4792]: timeout: failed to connect service ":50051" within 1s Mar 09 09:51:58 crc kubenswrapper[4792]: > Mar 09 09:52:00 crc kubenswrapper[4792]: I0309 09:52:00.142842 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550832-xbtmw"] Mar 09 09:52:00 crc kubenswrapper[4792]: I0309 09:52:00.144422 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550832-xbtmw" Mar 09 09:52:00 crc kubenswrapper[4792]: I0309 09:52:00.153522 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fwclj" Mar 09 09:52:00 crc kubenswrapper[4792]: I0309 09:52:00.153805 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 09:52:00 crc kubenswrapper[4792]: I0309 09:52:00.154165 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 09:52:00 crc kubenswrapper[4792]: I0309 09:52:00.154208 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550832-xbtmw"] Mar 09 09:52:00 crc kubenswrapper[4792]: I0309 09:52:00.276523 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rxnzv\" (UniqueName: \"kubernetes.io/projected/268467a6-7249-4a89-ba21-f60f71ec2336-kube-api-access-rxnzv\") pod \"auto-csr-approver-29550832-xbtmw\" (UID: \"268467a6-7249-4a89-ba21-f60f71ec2336\") " pod="openshift-infra/auto-csr-approver-29550832-xbtmw" Mar 09 09:52:00 crc kubenswrapper[4792]: I0309 09:52:00.377916 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rxnzv\" 
(UniqueName: \"kubernetes.io/projected/268467a6-7249-4a89-ba21-f60f71ec2336-kube-api-access-rxnzv\") pod \"auto-csr-approver-29550832-xbtmw\" (UID: \"268467a6-7249-4a89-ba21-f60f71ec2336\") " pod="openshift-infra/auto-csr-approver-29550832-xbtmw" Mar 09 09:52:00 crc kubenswrapper[4792]: I0309 09:52:00.401509 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rxnzv\" (UniqueName: \"kubernetes.io/projected/268467a6-7249-4a89-ba21-f60f71ec2336-kube-api-access-rxnzv\") pod \"auto-csr-approver-29550832-xbtmw\" (UID: \"268467a6-7249-4a89-ba21-f60f71ec2336\") " pod="openshift-infra/auto-csr-approver-29550832-xbtmw" Mar 09 09:52:00 crc kubenswrapper[4792]: I0309 09:52:00.473254 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550832-xbtmw" Mar 09 09:52:00 crc kubenswrapper[4792]: I0309 09:52:00.952802 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550832-xbtmw"] Mar 09 09:52:01 crc kubenswrapper[4792]: I0309 09:52:01.861782 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550832-xbtmw" event={"ID":"268467a6-7249-4a89-ba21-f60f71ec2336","Type":"ContainerStarted","Data":"5d3efb3eb889bd42baa5e45ac2d9e74b5ca3bf4a3886e710e5582e5b9137c7cb"} Mar 09 09:52:02 crc kubenswrapper[4792]: I0309 09:52:02.871419 4792 generic.go:334] "Generic (PLEG): container finished" podID="268467a6-7249-4a89-ba21-f60f71ec2336" containerID="18df2e4013f6415c9778a13958cdced71b608bf187983fd5da52144d930f82de" exitCode=0 Mar 09 09:52:02 crc kubenswrapper[4792]: I0309 09:52:02.871881 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550832-xbtmw" event={"ID":"268467a6-7249-4a89-ba21-f60f71ec2336","Type":"ContainerDied","Data":"18df2e4013f6415c9778a13958cdced71b608bf187983fd5da52144d930f82de"} Mar 09 09:52:04 crc kubenswrapper[4792]: I0309 09:52:04.211703 4792 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550832-xbtmw" Mar 09 09:52:04 crc kubenswrapper[4792]: I0309 09:52:04.358369 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rxnzv\" (UniqueName: \"kubernetes.io/projected/268467a6-7249-4a89-ba21-f60f71ec2336-kube-api-access-rxnzv\") pod \"268467a6-7249-4a89-ba21-f60f71ec2336\" (UID: \"268467a6-7249-4a89-ba21-f60f71ec2336\") " Mar 09 09:52:04 crc kubenswrapper[4792]: I0309 09:52:04.365818 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/268467a6-7249-4a89-ba21-f60f71ec2336-kube-api-access-rxnzv" (OuterVolumeSpecName: "kube-api-access-rxnzv") pod "268467a6-7249-4a89-ba21-f60f71ec2336" (UID: "268467a6-7249-4a89-ba21-f60f71ec2336"). InnerVolumeSpecName "kube-api-access-rxnzv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:52:04 crc kubenswrapper[4792]: I0309 09:52:04.460309 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rxnzv\" (UniqueName: \"kubernetes.io/projected/268467a6-7249-4a89-ba21-f60f71ec2336-kube-api-access-rxnzv\") on node \"crc\" DevicePath \"\"" Mar 09 09:52:04 crc kubenswrapper[4792]: I0309 09:52:04.889434 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550832-xbtmw" event={"ID":"268467a6-7249-4a89-ba21-f60f71ec2336","Type":"ContainerDied","Data":"5d3efb3eb889bd42baa5e45ac2d9e74b5ca3bf4a3886e710e5582e5b9137c7cb"} Mar 09 09:52:04 crc kubenswrapper[4792]: I0309 09:52:04.889476 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5d3efb3eb889bd42baa5e45ac2d9e74b5ca3bf4a3886e710e5582e5b9137c7cb" Mar 09 09:52:04 crc kubenswrapper[4792]: I0309 09:52:04.889490 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550832-xbtmw" Mar 09 09:52:05 crc kubenswrapper[4792]: I0309 09:52:05.278921 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550826-xl478"] Mar 09 09:52:05 crc kubenswrapper[4792]: I0309 09:52:05.288469 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550826-xl478"] Mar 09 09:52:05 crc kubenswrapper[4792]: I0309 09:52:05.673503 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="37698dc4-643b-47b2-94a7-7a6be39c9bd2" path="/var/lib/kubelet/pods/37698dc4-643b-47b2-94a7-7a6be39c9bd2/volumes" Mar 09 09:52:08 crc kubenswrapper[4792]: I0309 09:52:08.314869 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-qbtrp" podUID="0fb276d4-26dd-43a7-84d2-dba1f81f5bd0" containerName="registry-server" probeResult="failure" output=< Mar 09 09:52:08 crc kubenswrapper[4792]: timeout: failed to connect service ":50051" within 1s Mar 09 09:52:08 crc kubenswrapper[4792]: > Mar 09 09:52:13 crc kubenswrapper[4792]: I0309 09:52:13.214304 4792 patch_prober.go:28] interesting pod/machine-config-daemon-97tth container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 09:52:13 crc kubenswrapper[4792]: I0309 09:52:13.214669 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 09:52:18 crc kubenswrapper[4792]: I0309 09:52:18.310051 4792 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-marketplace/redhat-operators-qbtrp" podUID="0fb276d4-26dd-43a7-84d2-dba1f81f5bd0" containerName="registry-server" probeResult="failure" output=< Mar 09 09:52:18 crc kubenswrapper[4792]: timeout: failed to connect service ":50051" within 1s Mar 09 09:52:18 crc kubenswrapper[4792]: > Mar 09 09:52:19 crc kubenswrapper[4792]: I0309 09:52:19.016422 4792 generic.go:334] "Generic (PLEG): container finished" podID="c5a32778-1a93-440b-9f56-d0bded50a725" containerID="b1e8a3d51d053da599390e2b780e3ca312329877273b78bb5263562584788469" exitCode=0 Mar 09 09:52:19 crc kubenswrapper[4792]: I0309 09:52:19.016475 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7c2rh" event={"ID":"c5a32778-1a93-440b-9f56-d0bded50a725","Type":"ContainerDied","Data":"b1e8a3d51d053da599390e2b780e3ca312329877273b78bb5263562584788469"} Mar 09 09:52:20 crc kubenswrapper[4792]: I0309 09:52:20.449082 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7c2rh" Mar 09 09:52:20 crc kubenswrapper[4792]: I0309 09:52:20.470912 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c5a32778-1a93-440b-9f56-d0bded50a725-inventory\") pod \"c5a32778-1a93-440b-9f56-d0bded50a725\" (UID: \"c5a32778-1a93-440b-9f56-d0bded50a725\") " Mar 09 09:52:20 crc kubenswrapper[4792]: I0309 09:52:20.471097 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c5a32778-1a93-440b-9f56-d0bded50a725-ssh-key-openstack-edpm-ipam\") pod \"c5a32778-1a93-440b-9f56-d0bded50a725\" (UID: \"c5a32778-1a93-440b-9f56-d0bded50a725\") " Mar 09 09:52:20 crc kubenswrapper[4792]: I0309 09:52:20.471254 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/c5a32778-1a93-440b-9f56-d0bded50a725-ceph\") pod \"c5a32778-1a93-440b-9f56-d0bded50a725\" (UID: \"c5a32778-1a93-440b-9f56-d0bded50a725\") " Mar 09 09:52:20 crc kubenswrapper[4792]: I0309 09:52:20.471295 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/c5a32778-1a93-440b-9f56-d0bded50a725-ovncontroller-config-0\") pod \"c5a32778-1a93-440b-9f56-d0bded50a725\" (UID: \"c5a32778-1a93-440b-9f56-d0bded50a725\") " Mar 09 09:52:20 crc kubenswrapper[4792]: I0309 09:52:20.471346 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5a32778-1a93-440b-9f56-d0bded50a725-ovn-combined-ca-bundle\") pod \"c5a32778-1a93-440b-9f56-d0bded50a725\" (UID: \"c5a32778-1a93-440b-9f56-d0bded50a725\") " Mar 09 09:52:20 crc kubenswrapper[4792]: I0309 09:52:20.471479 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"kube-api-access-nzhk9\" (UniqueName: \"kubernetes.io/projected/c5a32778-1a93-440b-9f56-d0bded50a725-kube-api-access-nzhk9\") pod \"c5a32778-1a93-440b-9f56-d0bded50a725\" (UID: \"c5a32778-1a93-440b-9f56-d0bded50a725\") " Mar 09 09:52:20 crc kubenswrapper[4792]: I0309 09:52:20.490997 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5a32778-1a93-440b-9f56-d0bded50a725-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "c5a32778-1a93-440b-9f56-d0bded50a725" (UID: "c5a32778-1a93-440b-9f56-d0bded50a725"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:52:20 crc kubenswrapper[4792]: I0309 09:52:20.494151 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5a32778-1a93-440b-9f56-d0bded50a725-kube-api-access-nzhk9" (OuterVolumeSpecName: "kube-api-access-nzhk9") pod "c5a32778-1a93-440b-9f56-d0bded50a725" (UID: "c5a32778-1a93-440b-9f56-d0bded50a725"). InnerVolumeSpecName "kube-api-access-nzhk9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:52:20 crc kubenswrapper[4792]: I0309 09:52:20.494681 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5a32778-1a93-440b-9f56-d0bded50a725-ceph" (OuterVolumeSpecName: "ceph") pod "c5a32778-1a93-440b-9f56-d0bded50a725" (UID: "c5a32778-1a93-440b-9f56-d0bded50a725"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:52:20 crc kubenswrapper[4792]: I0309 09:52:20.506021 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5a32778-1a93-440b-9f56-d0bded50a725-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "c5a32778-1a93-440b-9f56-d0bded50a725" (UID: "c5a32778-1a93-440b-9f56-d0bded50a725"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:52:20 crc kubenswrapper[4792]: I0309 09:52:20.513337 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5a32778-1a93-440b-9f56-d0bded50a725-inventory" (OuterVolumeSpecName: "inventory") pod "c5a32778-1a93-440b-9f56-d0bded50a725" (UID: "c5a32778-1a93-440b-9f56-d0bded50a725"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:52:20 crc kubenswrapper[4792]: I0309 09:52:20.520376 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c5a32778-1a93-440b-9f56-d0bded50a725-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "c5a32778-1a93-440b-9f56-d0bded50a725" (UID: "c5a32778-1a93-440b-9f56-d0bded50a725"). InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:52:20 crc kubenswrapper[4792]: I0309 09:52:20.584929 4792 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/c5a32778-1a93-440b-9f56-d0bded50a725-ceph\") on node \"crc\" DevicePath \"\"" Mar 09 09:52:20 crc kubenswrapper[4792]: I0309 09:52:20.585006 4792 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/c5a32778-1a93-440b-9f56-d0bded50a725-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Mar 09 09:52:20 crc kubenswrapper[4792]: I0309 09:52:20.585023 4792 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5a32778-1a93-440b-9f56-d0bded50a725-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 09:52:20 crc kubenswrapper[4792]: I0309 09:52:20.585046 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzhk9\" (UniqueName: \"kubernetes.io/projected/c5a32778-1a93-440b-9f56-d0bded50a725-kube-api-access-nzhk9\") 
on node \"crc\" DevicePath \"\"" Mar 09 09:52:20 crc kubenswrapper[4792]: I0309 09:52:20.585058 4792 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c5a32778-1a93-440b-9f56-d0bded50a725-inventory\") on node \"crc\" DevicePath \"\"" Mar 09 09:52:20 crc kubenswrapper[4792]: I0309 09:52:20.585088 4792 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c5a32778-1a93-440b-9f56-d0bded50a725-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 09 09:52:21 crc kubenswrapper[4792]: I0309 09:52:21.033573 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7c2rh" event={"ID":"c5a32778-1a93-440b-9f56-d0bded50a725","Type":"ContainerDied","Data":"5bdbf5e95c7da63dcf87670d059c7238af157fe480689c0200eefb49df90998a"} Mar 09 09:52:21 crc kubenswrapper[4792]: I0309 09:52:21.033628 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5bdbf5e95c7da63dcf87670d059c7238af157fe480689c0200eefb49df90998a" Mar 09 09:52:21 crc kubenswrapper[4792]: I0309 09:52:21.033665 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7c2rh" Mar 09 09:52:21 crc kubenswrapper[4792]: I0309 09:52:21.156882 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-fb88h"] Mar 09 09:52:21 crc kubenswrapper[4792]: E0309 09:52:21.157364 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="268467a6-7249-4a89-ba21-f60f71ec2336" containerName="oc" Mar 09 09:52:21 crc kubenswrapper[4792]: I0309 09:52:21.157386 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="268467a6-7249-4a89-ba21-f60f71ec2336" containerName="oc" Mar 09 09:52:21 crc kubenswrapper[4792]: E0309 09:52:21.157411 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5a32778-1a93-440b-9f56-d0bded50a725" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Mar 09 09:52:21 crc kubenswrapper[4792]: I0309 09:52:21.157420 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5a32778-1a93-440b-9f56-d0bded50a725" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Mar 09 09:52:21 crc kubenswrapper[4792]: I0309 09:52:21.157815 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="268467a6-7249-4a89-ba21-f60f71ec2336" containerName="oc" Mar 09 09:52:21 crc kubenswrapper[4792]: I0309 09:52:21.157869 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5a32778-1a93-440b-9f56-d0bded50a725" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Mar 09 09:52:21 crc kubenswrapper[4792]: I0309 09:52:21.158963 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-fb88h" Mar 09 09:52:21 crc kubenswrapper[4792]: I0309 09:52:21.164695 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 09 09:52:21 crc kubenswrapper[4792]: I0309 09:52:21.164766 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 09 09:52:21 crc kubenswrapper[4792]: I0309 09:52:21.166293 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Mar 09 09:52:21 crc kubenswrapper[4792]: I0309 09:52:21.166363 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Mar 09 09:52:21 crc kubenswrapper[4792]: I0309 09:52:21.166425 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 09 09:52:21 crc kubenswrapper[4792]: I0309 09:52:21.166526 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Mar 09 09:52:21 crc kubenswrapper[4792]: I0309 09:52:21.166999 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-fb88h"] Mar 09 09:52:21 crc kubenswrapper[4792]: I0309 09:52:21.172558 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-4g5l6" Mar 09 09:52:21 crc kubenswrapper[4792]: I0309 09:52:21.199817 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2e835834-d2ca-414a-b567-8364c4b208e5-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-fb88h\" (UID: \"2e835834-d2ca-414a-b567-8364c4b208e5\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-fb88h" Mar 09 
09:52:21 crc kubenswrapper[4792]: I0309 09:52:21.199961 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e835834-d2ca-414a-b567-8364c4b208e5-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-fb88h\" (UID: \"2e835834-d2ca-414a-b567-8364c4b208e5\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-fb88h" Mar 09 09:52:21 crc kubenswrapper[4792]: I0309 09:52:21.199992 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/2e835834-d2ca-414a-b567-8364c4b208e5-ceph\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-fb88h\" (UID: \"2e835834-d2ca-414a-b567-8364c4b208e5\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-fb88h" Mar 09 09:52:21 crc kubenswrapper[4792]: I0309 09:52:21.200100 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2e835834-d2ca-414a-b567-8364c4b208e5-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-fb88h\" (UID: \"2e835834-d2ca-414a-b567-8364c4b208e5\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-fb88h" Mar 09 09:52:21 crc kubenswrapper[4792]: I0309 09:52:21.200368 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/2e835834-d2ca-414a-b567-8364c4b208e5-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-fb88h\" (UID: \"2e835834-d2ca-414a-b567-8364c4b208e5\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-fb88h" Mar 09 09:52:21 crc kubenswrapper[4792]: 
I0309 09:52:21.200433 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hmbcz\" (UniqueName: \"kubernetes.io/projected/2e835834-d2ca-414a-b567-8364c4b208e5-kube-api-access-hmbcz\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-fb88h\" (UID: \"2e835834-d2ca-414a-b567-8364c4b208e5\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-fb88h" Mar 09 09:52:21 crc kubenswrapper[4792]: I0309 09:52:21.200520 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/2e835834-d2ca-414a-b567-8364c4b208e5-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-fb88h\" (UID: \"2e835834-d2ca-414a-b567-8364c4b208e5\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-fb88h" Mar 09 09:52:21 crc kubenswrapper[4792]: I0309 09:52:21.302874 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2e835834-d2ca-414a-b567-8364c4b208e5-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-fb88h\" (UID: \"2e835834-d2ca-414a-b567-8364c4b208e5\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-fb88h" Mar 09 09:52:21 crc kubenswrapper[4792]: I0309 09:52:21.303455 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/2e835834-d2ca-414a-b567-8364c4b208e5-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-fb88h\" (UID: \"2e835834-d2ca-414a-b567-8364c4b208e5\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-fb88h" Mar 09 09:52:21 crc kubenswrapper[4792]: I0309 09:52:21.303564 4792 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hmbcz\" (UniqueName: \"kubernetes.io/projected/2e835834-d2ca-414a-b567-8364c4b208e5-kube-api-access-hmbcz\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-fb88h\" (UID: \"2e835834-d2ca-414a-b567-8364c4b208e5\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-fb88h" Mar 09 09:52:21 crc kubenswrapper[4792]: I0309 09:52:21.303631 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/2e835834-d2ca-414a-b567-8364c4b208e5-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-fb88h\" (UID: \"2e835834-d2ca-414a-b567-8364c4b208e5\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-fb88h" Mar 09 09:52:21 crc kubenswrapper[4792]: I0309 09:52:21.303671 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2e835834-d2ca-414a-b567-8364c4b208e5-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-fb88h\" (UID: \"2e835834-d2ca-414a-b567-8364c4b208e5\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-fb88h" Mar 09 09:52:21 crc kubenswrapper[4792]: I0309 09:52:21.303719 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e835834-d2ca-414a-b567-8364c4b208e5-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-fb88h\" (UID: \"2e835834-d2ca-414a-b567-8364c4b208e5\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-fb88h" Mar 09 09:52:21 crc kubenswrapper[4792]: I0309 09:52:21.303743 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: 
\"kubernetes.io/secret/2e835834-d2ca-414a-b567-8364c4b208e5-ceph\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-fb88h\" (UID: \"2e835834-d2ca-414a-b567-8364c4b208e5\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-fb88h" Mar 09 09:52:21 crc kubenswrapper[4792]: I0309 09:52:21.307625 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/2e835834-d2ca-414a-b567-8364c4b208e5-ceph\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-fb88h\" (UID: \"2e835834-d2ca-414a-b567-8364c4b208e5\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-fb88h" Mar 09 09:52:21 crc kubenswrapper[4792]: I0309 09:52:21.308310 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2e835834-d2ca-414a-b567-8364c4b208e5-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-fb88h\" (UID: \"2e835834-d2ca-414a-b567-8364c4b208e5\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-fb88h" Mar 09 09:52:21 crc kubenswrapper[4792]: I0309 09:52:21.312639 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2e835834-d2ca-414a-b567-8364c4b208e5-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-fb88h\" (UID: \"2e835834-d2ca-414a-b567-8364c4b208e5\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-fb88h" Mar 09 09:52:21 crc kubenswrapper[4792]: I0309 09:52:21.317355 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/2e835834-d2ca-414a-b567-8364c4b208e5-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-fb88h\" (UID: \"2e835834-d2ca-414a-b567-8364c4b208e5\") " 
pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-fb88h" Mar 09 09:52:21 crc kubenswrapper[4792]: I0309 09:52:21.317605 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/2e835834-d2ca-414a-b567-8364c4b208e5-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-fb88h\" (UID: \"2e835834-d2ca-414a-b567-8364c4b208e5\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-fb88h" Mar 09 09:52:21 crc kubenswrapper[4792]: I0309 09:52:21.323035 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e835834-d2ca-414a-b567-8364c4b208e5-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-fb88h\" (UID: \"2e835834-d2ca-414a-b567-8364c4b208e5\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-fb88h" Mar 09 09:52:21 crc kubenswrapper[4792]: I0309 09:52:21.340232 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hmbcz\" (UniqueName: \"kubernetes.io/projected/2e835834-d2ca-414a-b567-8364c4b208e5-kube-api-access-hmbcz\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-fb88h\" (UID: \"2e835834-d2ca-414a-b567-8364c4b208e5\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-fb88h" Mar 09 09:52:21 crc kubenswrapper[4792]: I0309 09:52:21.475963 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-fb88h" Mar 09 09:52:22 crc kubenswrapper[4792]: I0309 09:52:22.099728 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-fb88h"] Mar 09 09:52:23 crc kubenswrapper[4792]: I0309 09:52:23.063318 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-fb88h" event={"ID":"2e835834-d2ca-414a-b567-8364c4b208e5","Type":"ContainerStarted","Data":"d3ff501b87b23e3cfdefd8e45b4455fcee47804459677a74fa0f76f1e2435791"} Mar 09 09:52:23 crc kubenswrapper[4792]: I0309 09:52:23.064016 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-fb88h" event={"ID":"2e835834-d2ca-414a-b567-8364c4b208e5","Type":"ContainerStarted","Data":"6e6f3b80f519350c01f44fca82efcc70a1dea06f7dbdf4ff19f2d68da411c99c"} Mar 09 09:52:23 crc kubenswrapper[4792]: I0309 09:52:23.092431 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-fb88h" podStartSLOduration=1.6291546110000001 podStartE2EDuration="2.09240978s" podCreationTimestamp="2026-03-09 09:52:21 +0000 UTC" firstStartedPulling="2026-03-09 09:52:22.110335019 +0000 UTC m=+2707.140535761" lastFinishedPulling="2026-03-09 09:52:22.573590178 +0000 UTC m=+2707.603790930" observedRunningTime="2026-03-09 09:52:23.080785723 +0000 UTC m=+2708.110986495" watchObservedRunningTime="2026-03-09 09:52:23.09240978 +0000 UTC m=+2708.122610532" Mar 09 09:52:27 crc kubenswrapper[4792]: I0309 09:52:27.316932 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-qbtrp" Mar 09 09:52:27 crc kubenswrapper[4792]: I0309 09:52:27.378863 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/redhat-operators-qbtrp" Mar 09 09:52:27 crc kubenswrapper[4792]: I0309 09:52:27.565186 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-qbtrp"] Mar 09 09:52:29 crc kubenswrapper[4792]: I0309 09:52:29.119970 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-qbtrp" podUID="0fb276d4-26dd-43a7-84d2-dba1f81f5bd0" containerName="registry-server" containerID="cri-o://1ab259a4c7c167f94d1d8b13c7088cab31186b9f66b2ee02297477306123c4d9" gracePeriod=2 Mar 09 09:52:29 crc kubenswrapper[4792]: I0309 09:52:29.533520 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qbtrp" Mar 09 09:52:29 crc kubenswrapper[4792]: I0309 09:52:29.651559 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rjblk\" (UniqueName: \"kubernetes.io/projected/0fb276d4-26dd-43a7-84d2-dba1f81f5bd0-kube-api-access-rjblk\") pod \"0fb276d4-26dd-43a7-84d2-dba1f81f5bd0\" (UID: \"0fb276d4-26dd-43a7-84d2-dba1f81f5bd0\") " Mar 09 09:52:29 crc kubenswrapper[4792]: I0309 09:52:29.651888 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0fb276d4-26dd-43a7-84d2-dba1f81f5bd0-catalog-content\") pod \"0fb276d4-26dd-43a7-84d2-dba1f81f5bd0\" (UID: \"0fb276d4-26dd-43a7-84d2-dba1f81f5bd0\") " Mar 09 09:52:29 crc kubenswrapper[4792]: I0309 09:52:29.651979 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0fb276d4-26dd-43a7-84d2-dba1f81f5bd0-utilities\") pod \"0fb276d4-26dd-43a7-84d2-dba1f81f5bd0\" (UID: \"0fb276d4-26dd-43a7-84d2-dba1f81f5bd0\") " Mar 09 09:52:29 crc kubenswrapper[4792]: I0309 09:52:29.652740 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/0fb276d4-26dd-43a7-84d2-dba1f81f5bd0-utilities" (OuterVolumeSpecName: "utilities") pod "0fb276d4-26dd-43a7-84d2-dba1f81f5bd0" (UID: "0fb276d4-26dd-43a7-84d2-dba1f81f5bd0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:52:29 crc kubenswrapper[4792]: I0309 09:52:29.657661 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0fb276d4-26dd-43a7-84d2-dba1f81f5bd0-kube-api-access-rjblk" (OuterVolumeSpecName: "kube-api-access-rjblk") pod "0fb276d4-26dd-43a7-84d2-dba1f81f5bd0" (UID: "0fb276d4-26dd-43a7-84d2-dba1f81f5bd0"). InnerVolumeSpecName "kube-api-access-rjblk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:52:29 crc kubenswrapper[4792]: I0309 09:52:29.754681 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rjblk\" (UniqueName: \"kubernetes.io/projected/0fb276d4-26dd-43a7-84d2-dba1f81f5bd0-kube-api-access-rjblk\") on node \"crc\" DevicePath \"\"" Mar 09 09:52:29 crc kubenswrapper[4792]: I0309 09:52:29.754743 4792 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0fb276d4-26dd-43a7-84d2-dba1f81f5bd0-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 09:52:29 crc kubenswrapper[4792]: I0309 09:52:29.805298 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0fb276d4-26dd-43a7-84d2-dba1f81f5bd0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0fb276d4-26dd-43a7-84d2-dba1f81f5bd0" (UID: "0fb276d4-26dd-43a7-84d2-dba1f81f5bd0"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:52:29 crc kubenswrapper[4792]: I0309 09:52:29.862130 4792 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0fb276d4-26dd-43a7-84d2-dba1f81f5bd0-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 09:52:30 crc kubenswrapper[4792]: I0309 09:52:30.130675 4792 generic.go:334] "Generic (PLEG): container finished" podID="0fb276d4-26dd-43a7-84d2-dba1f81f5bd0" containerID="1ab259a4c7c167f94d1d8b13c7088cab31186b9f66b2ee02297477306123c4d9" exitCode=0 Mar 09 09:52:30 crc kubenswrapper[4792]: I0309 09:52:30.130722 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qbtrp" event={"ID":"0fb276d4-26dd-43a7-84d2-dba1f81f5bd0","Type":"ContainerDied","Data":"1ab259a4c7c167f94d1d8b13c7088cab31186b9f66b2ee02297477306123c4d9"} Mar 09 09:52:30 crc kubenswrapper[4792]: I0309 09:52:30.130731 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-qbtrp" Mar 09 09:52:30 crc kubenswrapper[4792]: I0309 09:52:30.130753 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qbtrp" event={"ID":"0fb276d4-26dd-43a7-84d2-dba1f81f5bd0","Type":"ContainerDied","Data":"6db5a9fe79759af4c6b4eaa7eac010050c12fdf3811c200e050666a3978e5177"} Mar 09 09:52:30 crc kubenswrapper[4792]: I0309 09:52:30.130772 4792 scope.go:117] "RemoveContainer" containerID="1ab259a4c7c167f94d1d8b13c7088cab31186b9f66b2ee02297477306123c4d9" Mar 09 09:52:30 crc kubenswrapper[4792]: I0309 09:52:30.162440 4792 scope.go:117] "RemoveContainer" containerID="50a6072c2456dd6a72b5e769f33bbf4ac021687990f5d22c2099261842f97901" Mar 09 09:52:30 crc kubenswrapper[4792]: I0309 09:52:30.168915 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-qbtrp"] Mar 09 09:52:30 crc kubenswrapper[4792]: I0309 09:52:30.176325 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-qbtrp"] Mar 09 09:52:30 crc kubenswrapper[4792]: I0309 09:52:30.185380 4792 scope.go:117] "RemoveContainer" containerID="274a142324d47aa861d5dcc16aef19b614eac660c9d9a47c19b44ccff56ffad3" Mar 09 09:52:30 crc kubenswrapper[4792]: I0309 09:52:30.224147 4792 scope.go:117] "RemoveContainer" containerID="1ab259a4c7c167f94d1d8b13c7088cab31186b9f66b2ee02297477306123c4d9" Mar 09 09:52:30 crc kubenswrapper[4792]: E0309 09:52:30.224573 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ab259a4c7c167f94d1d8b13c7088cab31186b9f66b2ee02297477306123c4d9\": container with ID starting with 1ab259a4c7c167f94d1d8b13c7088cab31186b9f66b2ee02297477306123c4d9 not found: ID does not exist" containerID="1ab259a4c7c167f94d1d8b13c7088cab31186b9f66b2ee02297477306123c4d9" Mar 09 09:52:30 crc kubenswrapper[4792]: I0309 09:52:30.224606 4792 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ab259a4c7c167f94d1d8b13c7088cab31186b9f66b2ee02297477306123c4d9"} err="failed to get container status \"1ab259a4c7c167f94d1d8b13c7088cab31186b9f66b2ee02297477306123c4d9\": rpc error: code = NotFound desc = could not find container \"1ab259a4c7c167f94d1d8b13c7088cab31186b9f66b2ee02297477306123c4d9\": container with ID starting with 1ab259a4c7c167f94d1d8b13c7088cab31186b9f66b2ee02297477306123c4d9 not found: ID does not exist" Mar 09 09:52:30 crc kubenswrapper[4792]: I0309 09:52:30.224626 4792 scope.go:117] "RemoveContainer" containerID="50a6072c2456dd6a72b5e769f33bbf4ac021687990f5d22c2099261842f97901" Mar 09 09:52:30 crc kubenswrapper[4792]: E0309 09:52:30.225250 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"50a6072c2456dd6a72b5e769f33bbf4ac021687990f5d22c2099261842f97901\": container with ID starting with 50a6072c2456dd6a72b5e769f33bbf4ac021687990f5d22c2099261842f97901 not found: ID does not exist" containerID="50a6072c2456dd6a72b5e769f33bbf4ac021687990f5d22c2099261842f97901" Mar 09 09:52:30 crc kubenswrapper[4792]: I0309 09:52:30.225377 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50a6072c2456dd6a72b5e769f33bbf4ac021687990f5d22c2099261842f97901"} err="failed to get container status \"50a6072c2456dd6a72b5e769f33bbf4ac021687990f5d22c2099261842f97901\": rpc error: code = NotFound desc = could not find container \"50a6072c2456dd6a72b5e769f33bbf4ac021687990f5d22c2099261842f97901\": container with ID starting with 50a6072c2456dd6a72b5e769f33bbf4ac021687990f5d22c2099261842f97901 not found: ID does not exist" Mar 09 09:52:30 crc kubenswrapper[4792]: I0309 09:52:30.225478 4792 scope.go:117] "RemoveContainer" containerID="274a142324d47aa861d5dcc16aef19b614eac660c9d9a47c19b44ccff56ffad3" Mar 09 09:52:30 crc kubenswrapper[4792]: E0309 
09:52:30.225922 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"274a142324d47aa861d5dcc16aef19b614eac660c9d9a47c19b44ccff56ffad3\": container with ID starting with 274a142324d47aa861d5dcc16aef19b614eac660c9d9a47c19b44ccff56ffad3 not found: ID does not exist" containerID="274a142324d47aa861d5dcc16aef19b614eac660c9d9a47c19b44ccff56ffad3" Mar 09 09:52:30 crc kubenswrapper[4792]: I0309 09:52:30.225951 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"274a142324d47aa861d5dcc16aef19b614eac660c9d9a47c19b44ccff56ffad3"} err="failed to get container status \"274a142324d47aa861d5dcc16aef19b614eac660c9d9a47c19b44ccff56ffad3\": rpc error: code = NotFound desc = could not find container \"274a142324d47aa861d5dcc16aef19b614eac660c9d9a47c19b44ccff56ffad3\": container with ID starting with 274a142324d47aa861d5dcc16aef19b614eac660c9d9a47c19b44ccff56ffad3 not found: ID does not exist" Mar 09 09:52:31 crc kubenswrapper[4792]: I0309 09:52:31.679140 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0fb276d4-26dd-43a7-84d2-dba1f81f5bd0" path="/var/lib/kubelet/pods/0fb276d4-26dd-43a7-84d2-dba1f81f5bd0/volumes" Mar 09 09:52:39 crc kubenswrapper[4792]: I0309 09:52:39.994149 4792 scope.go:117] "RemoveContainer" containerID="53ecba3a9eb9472e76a04b73068aa4a370428c39f236c74a55bd9ce0a0302e23" Mar 09 09:52:43 crc kubenswrapper[4792]: I0309 09:52:43.214715 4792 patch_prober.go:28] interesting pod/machine-config-daemon-97tth container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 09:52:43 crc kubenswrapper[4792]: I0309 09:52:43.215321 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-97tth" 
podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 09:52:52 crc kubenswrapper[4792]: I0309 09:52:52.868648 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-rvrlq"] Mar 09 09:52:52 crc kubenswrapper[4792]: E0309 09:52:52.869612 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0fb276d4-26dd-43a7-84d2-dba1f81f5bd0" containerName="extract-utilities" Mar 09 09:52:52 crc kubenswrapper[4792]: I0309 09:52:52.869631 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="0fb276d4-26dd-43a7-84d2-dba1f81f5bd0" containerName="extract-utilities" Mar 09 09:52:52 crc kubenswrapper[4792]: E0309 09:52:52.869667 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0fb276d4-26dd-43a7-84d2-dba1f81f5bd0" containerName="extract-content" Mar 09 09:52:52 crc kubenswrapper[4792]: I0309 09:52:52.869676 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="0fb276d4-26dd-43a7-84d2-dba1f81f5bd0" containerName="extract-content" Mar 09 09:52:52 crc kubenswrapper[4792]: E0309 09:52:52.869697 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0fb276d4-26dd-43a7-84d2-dba1f81f5bd0" containerName="registry-server" Mar 09 09:52:52 crc kubenswrapper[4792]: I0309 09:52:52.869704 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="0fb276d4-26dd-43a7-84d2-dba1f81f5bd0" containerName="registry-server" Mar 09 09:52:52 crc kubenswrapper[4792]: I0309 09:52:52.869882 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="0fb276d4-26dd-43a7-84d2-dba1f81f5bd0" containerName="registry-server" Mar 09 09:52:52 crc kubenswrapper[4792]: I0309 09:52:52.871127 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-rvrlq" Mar 09 09:52:52 crc kubenswrapper[4792]: I0309 09:52:52.887403 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rvrlq"] Mar 09 09:52:52 crc kubenswrapper[4792]: I0309 09:52:52.961749 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/91196eb5-73f9-45bc-815e-deab952b3d26-utilities\") pod \"certified-operators-rvrlq\" (UID: \"91196eb5-73f9-45bc-815e-deab952b3d26\") " pod="openshift-marketplace/certified-operators-rvrlq" Mar 09 09:52:52 crc kubenswrapper[4792]: I0309 09:52:52.962227 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-26hml\" (UniqueName: \"kubernetes.io/projected/91196eb5-73f9-45bc-815e-deab952b3d26-kube-api-access-26hml\") pod \"certified-operators-rvrlq\" (UID: \"91196eb5-73f9-45bc-815e-deab952b3d26\") " pod="openshift-marketplace/certified-operators-rvrlq" Mar 09 09:52:52 crc kubenswrapper[4792]: I0309 09:52:52.962306 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/91196eb5-73f9-45bc-815e-deab952b3d26-catalog-content\") pod \"certified-operators-rvrlq\" (UID: \"91196eb5-73f9-45bc-815e-deab952b3d26\") " pod="openshift-marketplace/certified-operators-rvrlq" Mar 09 09:52:53 crc kubenswrapper[4792]: I0309 09:52:53.064086 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/91196eb5-73f9-45bc-815e-deab952b3d26-catalog-content\") pod \"certified-operators-rvrlq\" (UID: \"91196eb5-73f9-45bc-815e-deab952b3d26\") " pod="openshift-marketplace/certified-operators-rvrlq" Mar 09 09:52:53 crc kubenswrapper[4792]: I0309 09:52:53.064172 4792 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/91196eb5-73f9-45bc-815e-deab952b3d26-utilities\") pod \"certified-operators-rvrlq\" (UID: \"91196eb5-73f9-45bc-815e-deab952b3d26\") " pod="openshift-marketplace/certified-operators-rvrlq" Mar 09 09:52:53 crc kubenswrapper[4792]: I0309 09:52:53.064292 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-26hml\" (UniqueName: \"kubernetes.io/projected/91196eb5-73f9-45bc-815e-deab952b3d26-kube-api-access-26hml\") pod \"certified-operators-rvrlq\" (UID: \"91196eb5-73f9-45bc-815e-deab952b3d26\") " pod="openshift-marketplace/certified-operators-rvrlq" Mar 09 09:52:53 crc kubenswrapper[4792]: I0309 09:52:53.064646 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/91196eb5-73f9-45bc-815e-deab952b3d26-catalog-content\") pod \"certified-operators-rvrlq\" (UID: \"91196eb5-73f9-45bc-815e-deab952b3d26\") " pod="openshift-marketplace/certified-operators-rvrlq" Mar 09 09:52:53 crc kubenswrapper[4792]: I0309 09:52:53.064748 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/91196eb5-73f9-45bc-815e-deab952b3d26-utilities\") pod \"certified-operators-rvrlq\" (UID: \"91196eb5-73f9-45bc-815e-deab952b3d26\") " pod="openshift-marketplace/certified-operators-rvrlq" Mar 09 09:52:53 crc kubenswrapper[4792]: I0309 09:52:53.090742 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-26hml\" (UniqueName: \"kubernetes.io/projected/91196eb5-73f9-45bc-815e-deab952b3d26-kube-api-access-26hml\") pod \"certified-operators-rvrlq\" (UID: \"91196eb5-73f9-45bc-815e-deab952b3d26\") " pod="openshift-marketplace/certified-operators-rvrlq" Mar 09 09:52:53 crc kubenswrapper[4792]: I0309 09:52:53.198039 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-rvrlq" Mar 09 09:52:53 crc kubenswrapper[4792]: I0309 09:52:53.731388 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rvrlq"] Mar 09 09:52:53 crc kubenswrapper[4792]: W0309 09:52:53.737453 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod91196eb5_73f9_45bc_815e_deab952b3d26.slice/crio-3469d2ac292fd68286078291b94f4bf089f14248da8cf2f3db99aa2360f36189 WatchSource:0}: Error finding container 3469d2ac292fd68286078291b94f4bf089f14248da8cf2f3db99aa2360f36189: Status 404 returned error can't find the container with id 3469d2ac292fd68286078291b94f4bf089f14248da8cf2f3db99aa2360f36189 Mar 09 09:52:54 crc kubenswrapper[4792]: I0309 09:52:54.378954 4792 generic.go:334] "Generic (PLEG): container finished" podID="91196eb5-73f9-45bc-815e-deab952b3d26" containerID="d14e7fa1cecac8fc06a8abeaa80259016caf151f852014a60f1172444dfad61a" exitCode=0 Mar 09 09:52:54 crc kubenswrapper[4792]: I0309 09:52:54.379201 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rvrlq" event={"ID":"91196eb5-73f9-45bc-815e-deab952b3d26","Type":"ContainerDied","Data":"d14e7fa1cecac8fc06a8abeaa80259016caf151f852014a60f1172444dfad61a"} Mar 09 09:52:54 crc kubenswrapper[4792]: I0309 09:52:54.379274 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rvrlq" event={"ID":"91196eb5-73f9-45bc-815e-deab952b3d26","Type":"ContainerStarted","Data":"3469d2ac292fd68286078291b94f4bf089f14248da8cf2f3db99aa2360f36189"} Mar 09 09:52:54 crc kubenswrapper[4792]: I0309 09:52:54.381133 4792 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 09 09:52:56 crc kubenswrapper[4792]: I0309 09:52:56.403590 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-rvrlq" event={"ID":"91196eb5-73f9-45bc-815e-deab952b3d26","Type":"ContainerStarted","Data":"d92a22529ad0ce95845d249c6291e5259c4316307b89dbb71d75679fa038bad5"} Mar 09 09:52:58 crc kubenswrapper[4792]: I0309 09:52:58.421603 4792 generic.go:334] "Generic (PLEG): container finished" podID="91196eb5-73f9-45bc-815e-deab952b3d26" containerID="d92a22529ad0ce95845d249c6291e5259c4316307b89dbb71d75679fa038bad5" exitCode=0 Mar 09 09:52:58 crc kubenswrapper[4792]: I0309 09:52:58.421686 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rvrlq" event={"ID":"91196eb5-73f9-45bc-815e-deab952b3d26","Type":"ContainerDied","Data":"d92a22529ad0ce95845d249c6291e5259c4316307b89dbb71d75679fa038bad5"} Mar 09 09:52:59 crc kubenswrapper[4792]: I0309 09:52:59.431360 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rvrlq" event={"ID":"91196eb5-73f9-45bc-815e-deab952b3d26","Type":"ContainerStarted","Data":"755eb087177ab5c413e4c5fd8dde6d047206ffeccf150d3f50305d8430c6fc81"} Mar 09 09:52:59 crc kubenswrapper[4792]: I0309 09:52:59.454957 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-rvrlq" podStartSLOduration=3.023630795 podStartE2EDuration="7.454935718s" podCreationTimestamp="2026-03-09 09:52:52 +0000 UTC" firstStartedPulling="2026-03-09 09:52:54.380902214 +0000 UTC m=+2739.411102966" lastFinishedPulling="2026-03-09 09:52:58.812207137 +0000 UTC m=+2743.842407889" observedRunningTime="2026-03-09 09:52:59.450160928 +0000 UTC m=+2744.480361700" watchObservedRunningTime="2026-03-09 09:52:59.454935718 +0000 UTC m=+2744.485136480" Mar 09 09:53:03 crc kubenswrapper[4792]: I0309 09:53:03.198413 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-rvrlq" Mar 09 09:53:03 crc kubenswrapper[4792]: I0309 09:53:03.200237 
4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-rvrlq" Mar 09 09:53:03 crc kubenswrapper[4792]: I0309 09:53:03.245467 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-rvrlq" Mar 09 09:53:03 crc kubenswrapper[4792]: I0309 09:53:03.517530 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-rvrlq" Mar 09 09:53:03 crc kubenswrapper[4792]: I0309 09:53:03.582030 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-rvrlq"] Mar 09 09:53:05 crc kubenswrapper[4792]: I0309 09:53:05.475572 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-rvrlq" podUID="91196eb5-73f9-45bc-815e-deab952b3d26" containerName="registry-server" containerID="cri-o://755eb087177ab5c413e4c5fd8dde6d047206ffeccf150d3f50305d8430c6fc81" gracePeriod=2 Mar 09 09:53:05 crc kubenswrapper[4792]: I0309 09:53:05.954715 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-rvrlq" Mar 09 09:53:06 crc kubenswrapper[4792]: I0309 09:53:06.114645 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/91196eb5-73f9-45bc-815e-deab952b3d26-catalog-content\") pod \"91196eb5-73f9-45bc-815e-deab952b3d26\" (UID: \"91196eb5-73f9-45bc-815e-deab952b3d26\") " Mar 09 09:53:06 crc kubenswrapper[4792]: I0309 09:53:06.114760 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-26hml\" (UniqueName: \"kubernetes.io/projected/91196eb5-73f9-45bc-815e-deab952b3d26-kube-api-access-26hml\") pod \"91196eb5-73f9-45bc-815e-deab952b3d26\" (UID: \"91196eb5-73f9-45bc-815e-deab952b3d26\") " Mar 09 09:53:06 crc kubenswrapper[4792]: I0309 09:53:06.114807 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/91196eb5-73f9-45bc-815e-deab952b3d26-utilities\") pod \"91196eb5-73f9-45bc-815e-deab952b3d26\" (UID: \"91196eb5-73f9-45bc-815e-deab952b3d26\") " Mar 09 09:53:06 crc kubenswrapper[4792]: I0309 09:53:06.116647 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/91196eb5-73f9-45bc-815e-deab952b3d26-utilities" (OuterVolumeSpecName: "utilities") pod "91196eb5-73f9-45bc-815e-deab952b3d26" (UID: "91196eb5-73f9-45bc-815e-deab952b3d26"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:53:06 crc kubenswrapper[4792]: I0309 09:53:06.124447 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/91196eb5-73f9-45bc-815e-deab952b3d26-kube-api-access-26hml" (OuterVolumeSpecName: "kube-api-access-26hml") pod "91196eb5-73f9-45bc-815e-deab952b3d26" (UID: "91196eb5-73f9-45bc-815e-deab952b3d26"). InnerVolumeSpecName "kube-api-access-26hml". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:53:06 crc kubenswrapper[4792]: I0309 09:53:06.216921 4792 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/91196eb5-73f9-45bc-815e-deab952b3d26-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 09:53:06 crc kubenswrapper[4792]: I0309 09:53:06.216967 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-26hml\" (UniqueName: \"kubernetes.io/projected/91196eb5-73f9-45bc-815e-deab952b3d26-kube-api-access-26hml\") on node \"crc\" DevicePath \"\"" Mar 09 09:53:06 crc kubenswrapper[4792]: I0309 09:53:06.395206 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/91196eb5-73f9-45bc-815e-deab952b3d26-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "91196eb5-73f9-45bc-815e-deab952b3d26" (UID: "91196eb5-73f9-45bc-815e-deab952b3d26"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:53:06 crc kubenswrapper[4792]: I0309 09:53:06.421124 4792 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/91196eb5-73f9-45bc-815e-deab952b3d26-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 09:53:06 crc kubenswrapper[4792]: I0309 09:53:06.487424 4792 generic.go:334] "Generic (PLEG): container finished" podID="91196eb5-73f9-45bc-815e-deab952b3d26" containerID="755eb087177ab5c413e4c5fd8dde6d047206ffeccf150d3f50305d8430c6fc81" exitCode=0 Mar 09 09:53:06 crc kubenswrapper[4792]: I0309 09:53:06.487469 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rvrlq" event={"ID":"91196eb5-73f9-45bc-815e-deab952b3d26","Type":"ContainerDied","Data":"755eb087177ab5c413e4c5fd8dde6d047206ffeccf150d3f50305d8430c6fc81"} Mar 09 09:53:06 crc kubenswrapper[4792]: I0309 09:53:06.487502 4792 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-rvrlq" event={"ID":"91196eb5-73f9-45bc-815e-deab952b3d26","Type":"ContainerDied","Data":"3469d2ac292fd68286078291b94f4bf089f14248da8cf2f3db99aa2360f36189"} Mar 09 09:53:06 crc kubenswrapper[4792]: I0309 09:53:06.487522 4792 scope.go:117] "RemoveContainer" containerID="755eb087177ab5c413e4c5fd8dde6d047206ffeccf150d3f50305d8430c6fc81" Mar 09 09:53:06 crc kubenswrapper[4792]: I0309 09:53:06.488930 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rvrlq" Mar 09 09:53:06 crc kubenswrapper[4792]: I0309 09:53:06.508899 4792 scope.go:117] "RemoveContainer" containerID="d92a22529ad0ce95845d249c6291e5259c4316307b89dbb71d75679fa038bad5" Mar 09 09:53:06 crc kubenswrapper[4792]: I0309 09:53:06.544024 4792 scope.go:117] "RemoveContainer" containerID="d14e7fa1cecac8fc06a8abeaa80259016caf151f852014a60f1172444dfad61a" Mar 09 09:53:06 crc kubenswrapper[4792]: I0309 09:53:06.545093 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-rvrlq"] Mar 09 09:53:06 crc kubenswrapper[4792]: I0309 09:53:06.554328 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-rvrlq"] Mar 09 09:53:06 crc kubenswrapper[4792]: I0309 09:53:06.578060 4792 scope.go:117] "RemoveContainer" containerID="755eb087177ab5c413e4c5fd8dde6d047206ffeccf150d3f50305d8430c6fc81" Mar 09 09:53:06 crc kubenswrapper[4792]: E0309 09:53:06.578944 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"755eb087177ab5c413e4c5fd8dde6d047206ffeccf150d3f50305d8430c6fc81\": container with ID starting with 755eb087177ab5c413e4c5fd8dde6d047206ffeccf150d3f50305d8430c6fc81 not found: ID does not exist" containerID="755eb087177ab5c413e4c5fd8dde6d047206ffeccf150d3f50305d8430c6fc81" Mar 09 09:53:06 crc kubenswrapper[4792]: I0309 
09:53:06.578987 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"755eb087177ab5c413e4c5fd8dde6d047206ffeccf150d3f50305d8430c6fc81"} err="failed to get container status \"755eb087177ab5c413e4c5fd8dde6d047206ffeccf150d3f50305d8430c6fc81\": rpc error: code = NotFound desc = could not find container \"755eb087177ab5c413e4c5fd8dde6d047206ffeccf150d3f50305d8430c6fc81\": container with ID starting with 755eb087177ab5c413e4c5fd8dde6d047206ffeccf150d3f50305d8430c6fc81 not found: ID does not exist" Mar 09 09:53:06 crc kubenswrapper[4792]: I0309 09:53:06.579022 4792 scope.go:117] "RemoveContainer" containerID="d92a22529ad0ce95845d249c6291e5259c4316307b89dbb71d75679fa038bad5" Mar 09 09:53:06 crc kubenswrapper[4792]: E0309 09:53:06.579591 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d92a22529ad0ce95845d249c6291e5259c4316307b89dbb71d75679fa038bad5\": container with ID starting with d92a22529ad0ce95845d249c6291e5259c4316307b89dbb71d75679fa038bad5 not found: ID does not exist" containerID="d92a22529ad0ce95845d249c6291e5259c4316307b89dbb71d75679fa038bad5" Mar 09 09:53:06 crc kubenswrapper[4792]: I0309 09:53:06.579626 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d92a22529ad0ce95845d249c6291e5259c4316307b89dbb71d75679fa038bad5"} err="failed to get container status \"d92a22529ad0ce95845d249c6291e5259c4316307b89dbb71d75679fa038bad5\": rpc error: code = NotFound desc = could not find container \"d92a22529ad0ce95845d249c6291e5259c4316307b89dbb71d75679fa038bad5\": container with ID starting with d92a22529ad0ce95845d249c6291e5259c4316307b89dbb71d75679fa038bad5 not found: ID does not exist" Mar 09 09:53:06 crc kubenswrapper[4792]: I0309 09:53:06.579649 4792 scope.go:117] "RemoveContainer" containerID="d14e7fa1cecac8fc06a8abeaa80259016caf151f852014a60f1172444dfad61a" Mar 09 09:53:06 crc 
kubenswrapper[4792]: E0309 09:53:06.579956 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d14e7fa1cecac8fc06a8abeaa80259016caf151f852014a60f1172444dfad61a\": container with ID starting with d14e7fa1cecac8fc06a8abeaa80259016caf151f852014a60f1172444dfad61a not found: ID does not exist" containerID="d14e7fa1cecac8fc06a8abeaa80259016caf151f852014a60f1172444dfad61a" Mar 09 09:53:06 crc kubenswrapper[4792]: I0309 09:53:06.579983 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d14e7fa1cecac8fc06a8abeaa80259016caf151f852014a60f1172444dfad61a"} err="failed to get container status \"d14e7fa1cecac8fc06a8abeaa80259016caf151f852014a60f1172444dfad61a\": rpc error: code = NotFound desc = could not find container \"d14e7fa1cecac8fc06a8abeaa80259016caf151f852014a60f1172444dfad61a\": container with ID starting with d14e7fa1cecac8fc06a8abeaa80259016caf151f852014a60f1172444dfad61a not found: ID does not exist" Mar 09 09:53:07 crc kubenswrapper[4792]: I0309 09:53:07.674746 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="91196eb5-73f9-45bc-815e-deab952b3d26" path="/var/lib/kubelet/pods/91196eb5-73f9-45bc-815e-deab952b3d26/volumes" Mar 09 09:53:13 crc kubenswrapper[4792]: I0309 09:53:13.213752 4792 patch_prober.go:28] interesting pod/machine-config-daemon-97tth container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 09:53:13 crc kubenswrapper[4792]: I0309 09:53:13.214146 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" Mar 09 09:53:13 crc kubenswrapper[4792]: I0309 09:53:13.214185 4792 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-97tth" Mar 09 09:53:13 crc kubenswrapper[4792]: I0309 09:53:13.215212 4792 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"862129b30016327d77e586feefb2263203c7afdd585cf1571fd31502117efa31"} pod="openshift-machine-config-operator/machine-config-daemon-97tth" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 09 09:53:13 crc kubenswrapper[4792]: I0309 09:53:13.215348 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" containerName="machine-config-daemon" containerID="cri-o://862129b30016327d77e586feefb2263203c7afdd585cf1571fd31502117efa31" gracePeriod=600 Mar 09 09:53:13 crc kubenswrapper[4792]: I0309 09:53:13.568628 4792 generic.go:334] "Generic (PLEG): container finished" podID="bd11045a-d746-4b42-872c-8b8d1dd2d515" containerID="862129b30016327d77e586feefb2263203c7afdd585cf1571fd31502117efa31" exitCode=0 Mar 09 09:53:13 crc kubenswrapper[4792]: I0309 09:53:13.568692 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-97tth" event={"ID":"bd11045a-d746-4b42-872c-8b8d1dd2d515","Type":"ContainerDied","Data":"862129b30016327d77e586feefb2263203c7afdd585cf1571fd31502117efa31"} Mar 09 09:53:13 crc kubenswrapper[4792]: I0309 09:53:13.568985 4792 scope.go:117] "RemoveContainer" containerID="24d9d0dcf9edc8581a732f09072e639940f290776165f66acae7f86c2095d368" Mar 09 09:53:14 crc kubenswrapper[4792]: I0309 09:53:14.580702 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-97tth" event={"ID":"bd11045a-d746-4b42-872c-8b8d1dd2d515","Type":"ContainerStarted","Data":"d764681645ab8670f7435c3d7eeda989bbb6c0f3c40948420b0a6a2fc3dd7e93"} Mar 09 09:53:21 crc kubenswrapper[4792]: I0309 09:53:21.639871 4792 generic.go:334] "Generic (PLEG): container finished" podID="2e835834-d2ca-414a-b567-8364c4b208e5" containerID="d3ff501b87b23e3cfdefd8e45b4455fcee47804459677a74fa0f76f1e2435791" exitCode=0 Mar 09 09:53:21 crc kubenswrapper[4792]: I0309 09:53:21.640485 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-fb88h" event={"ID":"2e835834-d2ca-414a-b567-8364c4b208e5","Type":"ContainerDied","Data":"d3ff501b87b23e3cfdefd8e45b4455fcee47804459677a74fa0f76f1e2435791"} Mar 09 09:53:23 crc kubenswrapper[4792]: I0309 09:53:23.132673 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-fb88h" Mar 09 09:53:23 crc kubenswrapper[4792]: I0309 09:53:23.254654 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/2e835834-d2ca-414a-b567-8364c4b208e5-ceph\") pod \"2e835834-d2ca-414a-b567-8364c4b208e5\" (UID: \"2e835834-d2ca-414a-b567-8364c4b208e5\") " Mar 09 09:53:23 crc kubenswrapper[4792]: I0309 09:53:23.254718 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hmbcz\" (UniqueName: \"kubernetes.io/projected/2e835834-d2ca-414a-b567-8364c4b208e5-kube-api-access-hmbcz\") pod \"2e835834-d2ca-414a-b567-8364c4b208e5\" (UID: \"2e835834-d2ca-414a-b567-8364c4b208e5\") " Mar 09 09:53:23 crc kubenswrapper[4792]: I0309 09:53:23.254740 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/2e835834-d2ca-414a-b567-8364c4b208e5-neutron-metadata-combined-ca-bundle\") pod \"2e835834-d2ca-414a-b567-8364c4b208e5\" (UID: \"2e835834-d2ca-414a-b567-8364c4b208e5\") " Mar 09 09:53:23 crc kubenswrapper[4792]: I0309 09:53:23.254775 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2e835834-d2ca-414a-b567-8364c4b208e5-inventory\") pod \"2e835834-d2ca-414a-b567-8364c4b208e5\" (UID: \"2e835834-d2ca-414a-b567-8364c4b208e5\") " Mar 09 09:53:23 crc kubenswrapper[4792]: I0309 09:53:23.254840 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/2e835834-d2ca-414a-b567-8364c4b208e5-nova-metadata-neutron-config-0\") pod \"2e835834-d2ca-414a-b567-8364c4b208e5\" (UID: \"2e835834-d2ca-414a-b567-8364c4b208e5\") " Mar 09 09:53:23 crc kubenswrapper[4792]: I0309 09:53:23.254907 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/2e835834-d2ca-414a-b567-8364c4b208e5-neutron-ovn-metadata-agent-neutron-config-0\") pod \"2e835834-d2ca-414a-b567-8364c4b208e5\" (UID: \"2e835834-d2ca-414a-b567-8364c4b208e5\") " Mar 09 09:53:23 crc kubenswrapper[4792]: I0309 09:53:23.254984 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2e835834-d2ca-414a-b567-8364c4b208e5-ssh-key-openstack-edpm-ipam\") pod \"2e835834-d2ca-414a-b567-8364c4b208e5\" (UID: \"2e835834-d2ca-414a-b567-8364c4b208e5\") " Mar 09 09:53:23 crc kubenswrapper[4792]: I0309 09:53:23.262683 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e835834-d2ca-414a-b567-8364c4b208e5-ceph" (OuterVolumeSpecName: "ceph") pod "2e835834-d2ca-414a-b567-8364c4b208e5" (UID: 
"2e835834-d2ca-414a-b567-8364c4b208e5"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:53:23 crc kubenswrapper[4792]: I0309 09:53:23.269098 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e835834-d2ca-414a-b567-8364c4b208e5-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "2e835834-d2ca-414a-b567-8364c4b208e5" (UID: "2e835834-d2ca-414a-b567-8364c4b208e5"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:53:23 crc kubenswrapper[4792]: I0309 09:53:23.278138 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e835834-d2ca-414a-b567-8364c4b208e5-kube-api-access-hmbcz" (OuterVolumeSpecName: "kube-api-access-hmbcz") pod "2e835834-d2ca-414a-b567-8364c4b208e5" (UID: "2e835834-d2ca-414a-b567-8364c4b208e5"). InnerVolumeSpecName "kube-api-access-hmbcz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:53:23 crc kubenswrapper[4792]: I0309 09:53:23.303198 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e835834-d2ca-414a-b567-8364c4b208e5-inventory" (OuterVolumeSpecName: "inventory") pod "2e835834-d2ca-414a-b567-8364c4b208e5" (UID: "2e835834-d2ca-414a-b567-8364c4b208e5"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:53:23 crc kubenswrapper[4792]: I0309 09:53:23.304358 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e835834-d2ca-414a-b567-8364c4b208e5-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "2e835834-d2ca-414a-b567-8364c4b208e5" (UID: "2e835834-d2ca-414a-b567-8364c4b208e5"). InnerVolumeSpecName "nova-metadata-neutron-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:53:23 crc kubenswrapper[4792]: I0309 09:53:23.303322 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e835834-d2ca-414a-b567-8364c4b208e5-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "2e835834-d2ca-414a-b567-8364c4b208e5" (UID: "2e835834-d2ca-414a-b567-8364c4b208e5"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:53:23 crc kubenswrapper[4792]: I0309 09:53:23.317223 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e835834-d2ca-414a-b567-8364c4b208e5-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "2e835834-d2ca-414a-b567-8364c4b208e5" (UID: "2e835834-d2ca-414a-b567-8364c4b208e5"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:53:23 crc kubenswrapper[4792]: I0309 09:53:23.357062 4792 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2e835834-d2ca-414a-b567-8364c4b208e5-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 09 09:53:23 crc kubenswrapper[4792]: I0309 09:53:23.357118 4792 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/2e835834-d2ca-414a-b567-8364c4b208e5-ceph\") on node \"crc\" DevicePath \"\"" Mar 09 09:53:23 crc kubenswrapper[4792]: I0309 09:53:23.357128 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hmbcz\" (UniqueName: \"kubernetes.io/projected/2e835834-d2ca-414a-b567-8364c4b208e5-kube-api-access-hmbcz\") on node \"crc\" DevicePath \"\"" Mar 09 09:53:23 crc kubenswrapper[4792]: I0309 09:53:23.357137 4792 reconciler_common.go:293] "Volume detached for volume 
\"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e835834-d2ca-414a-b567-8364c4b208e5-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 09:53:23 crc kubenswrapper[4792]: I0309 09:53:23.357147 4792 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2e835834-d2ca-414a-b567-8364c4b208e5-inventory\") on node \"crc\" DevicePath \"\"" Mar 09 09:53:23 crc kubenswrapper[4792]: I0309 09:53:23.357156 4792 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/2e835834-d2ca-414a-b567-8364c4b208e5-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Mar 09 09:53:23 crc kubenswrapper[4792]: I0309 09:53:23.357166 4792 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/2e835834-d2ca-414a-b567-8364c4b208e5-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Mar 09 09:53:23 crc kubenswrapper[4792]: I0309 09:53:23.658947 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-fb88h" event={"ID":"2e835834-d2ca-414a-b567-8364c4b208e5","Type":"ContainerDied","Data":"6e6f3b80f519350c01f44fca82efcc70a1dea06f7dbdf4ff19f2d68da411c99c"} Mar 09 09:53:23 crc kubenswrapper[4792]: I0309 09:53:23.659002 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6e6f3b80f519350c01f44fca82efcc70a1dea06f7dbdf4ff19f2d68da411c99c" Mar 09 09:53:23 crc kubenswrapper[4792]: I0309 09:53:23.659050 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-fb88h" Mar 09 09:53:23 crc kubenswrapper[4792]: I0309 09:53:23.768561 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-4sx54"] Mar 09 09:53:23 crc kubenswrapper[4792]: E0309 09:53:23.768992 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91196eb5-73f9-45bc-815e-deab952b3d26" containerName="registry-server" Mar 09 09:53:23 crc kubenswrapper[4792]: I0309 09:53:23.769015 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="91196eb5-73f9-45bc-815e-deab952b3d26" containerName="registry-server" Mar 09 09:53:23 crc kubenswrapper[4792]: E0309 09:53:23.769045 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91196eb5-73f9-45bc-815e-deab952b3d26" containerName="extract-content" Mar 09 09:53:23 crc kubenswrapper[4792]: I0309 09:53:23.769054 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="91196eb5-73f9-45bc-815e-deab952b3d26" containerName="extract-content" Mar 09 09:53:23 crc kubenswrapper[4792]: E0309 09:53:23.769086 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91196eb5-73f9-45bc-815e-deab952b3d26" containerName="extract-utilities" Mar 09 09:53:23 crc kubenswrapper[4792]: I0309 09:53:23.769096 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="91196eb5-73f9-45bc-815e-deab952b3d26" containerName="extract-utilities" Mar 09 09:53:23 crc kubenswrapper[4792]: E0309 09:53:23.769130 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e835834-d2ca-414a-b567-8364c4b208e5" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Mar 09 09:53:23 crc kubenswrapper[4792]: I0309 09:53:23.769140 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e835834-d2ca-414a-b567-8364c4b208e5" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Mar 09 09:53:23 crc kubenswrapper[4792]: I0309 
09:53:23.769341 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="91196eb5-73f9-45bc-815e-deab952b3d26" containerName="registry-server" Mar 09 09:53:23 crc kubenswrapper[4792]: I0309 09:53:23.769372 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e835834-d2ca-414a-b567-8364c4b208e5" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Mar 09 09:53:23 crc kubenswrapper[4792]: I0309 09:53:23.770109 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-4sx54" Mar 09 09:53:23 crc kubenswrapper[4792]: I0309 09:53:23.772227 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Mar 09 09:53:23 crc kubenswrapper[4792]: I0309 09:53:23.773804 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Mar 09 09:53:23 crc kubenswrapper[4792]: I0309 09:53:23.773942 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 09 09:53:23 crc kubenswrapper[4792]: I0309 09:53:23.774099 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-4g5l6" Mar 09 09:53:23 crc kubenswrapper[4792]: I0309 09:53:23.774184 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 09 09:53:23 crc kubenswrapper[4792]: I0309 09:53:23.775005 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 09 09:53:23 crc kubenswrapper[4792]: I0309 09:53:23.786992 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-4sx54"] Mar 09 09:53:23 crc kubenswrapper[4792]: I0309 09:53:23.869287 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" 
(UniqueName: \"kubernetes.io/secret/047ab0a5-633d-4731-a534-fd2db3b65b43-ceph\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-4sx54\" (UID: \"047ab0a5-633d-4731-a534-fd2db3b65b43\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-4sx54" Mar 09 09:53:23 crc kubenswrapper[4792]: I0309 09:53:23.869476 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/047ab0a5-633d-4731-a534-fd2db3b65b43-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-4sx54\" (UID: \"047ab0a5-633d-4731-a534-fd2db3b65b43\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-4sx54" Mar 09 09:53:23 crc kubenswrapper[4792]: I0309 09:53:23.869517 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/047ab0a5-633d-4731-a534-fd2db3b65b43-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-4sx54\" (UID: \"047ab0a5-633d-4731-a534-fd2db3b65b43\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-4sx54" Mar 09 09:53:23 crc kubenswrapper[4792]: I0309 09:53:23.869554 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/047ab0a5-633d-4731-a534-fd2db3b65b43-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-4sx54\" (UID: \"047ab0a5-633d-4731-a534-fd2db3b65b43\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-4sx54" Mar 09 09:53:23 crc kubenswrapper[4792]: I0309 09:53:23.869587 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/047ab0a5-633d-4731-a534-fd2db3b65b43-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-4sx54\" (UID: 
\"047ab0a5-633d-4731-a534-fd2db3b65b43\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-4sx54" Mar 09 09:53:23 crc kubenswrapper[4792]: I0309 09:53:23.870242 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m56tr\" (UniqueName: \"kubernetes.io/projected/047ab0a5-633d-4731-a534-fd2db3b65b43-kube-api-access-m56tr\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-4sx54\" (UID: \"047ab0a5-633d-4731-a534-fd2db3b65b43\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-4sx54" Mar 09 09:53:23 crc kubenswrapper[4792]: I0309 09:53:23.971701 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/047ab0a5-633d-4731-a534-fd2db3b65b43-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-4sx54\" (UID: \"047ab0a5-633d-4731-a534-fd2db3b65b43\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-4sx54" Mar 09 09:53:23 crc kubenswrapper[4792]: I0309 09:53:23.972056 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/047ab0a5-633d-4731-a534-fd2db3b65b43-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-4sx54\" (UID: \"047ab0a5-633d-4731-a534-fd2db3b65b43\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-4sx54" Mar 09 09:53:23 crc kubenswrapper[4792]: I0309 09:53:23.972099 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/047ab0a5-633d-4731-a534-fd2db3b65b43-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-4sx54\" (UID: \"047ab0a5-633d-4731-a534-fd2db3b65b43\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-4sx54" Mar 09 09:53:23 crc kubenswrapper[4792]: I0309 09:53:23.972130 4792 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/047ab0a5-633d-4731-a534-fd2db3b65b43-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-4sx54\" (UID: \"047ab0a5-633d-4731-a534-fd2db3b65b43\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-4sx54" Mar 09 09:53:23 crc kubenswrapper[4792]: I0309 09:53:23.972195 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m56tr\" (UniqueName: \"kubernetes.io/projected/047ab0a5-633d-4731-a534-fd2db3b65b43-kube-api-access-m56tr\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-4sx54\" (UID: \"047ab0a5-633d-4731-a534-fd2db3b65b43\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-4sx54" Mar 09 09:53:23 crc kubenswrapper[4792]: I0309 09:53:23.972248 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/047ab0a5-633d-4731-a534-fd2db3b65b43-ceph\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-4sx54\" (UID: \"047ab0a5-633d-4731-a534-fd2db3b65b43\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-4sx54" Mar 09 09:53:23 crc kubenswrapper[4792]: I0309 09:53:23.976544 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/047ab0a5-633d-4731-a534-fd2db3b65b43-ceph\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-4sx54\" (UID: \"047ab0a5-633d-4731-a534-fd2db3b65b43\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-4sx54" Mar 09 09:53:23 crc kubenswrapper[4792]: I0309 09:53:23.976563 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/047ab0a5-633d-4731-a534-fd2db3b65b43-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-4sx54\" (UID: \"047ab0a5-633d-4731-a534-fd2db3b65b43\") " 
pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-4sx54" Mar 09 09:53:23 crc kubenswrapper[4792]: I0309 09:53:23.978643 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/047ab0a5-633d-4731-a534-fd2db3b65b43-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-4sx54\" (UID: \"047ab0a5-633d-4731-a534-fd2db3b65b43\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-4sx54" Mar 09 09:53:23 crc kubenswrapper[4792]: I0309 09:53:23.978645 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/047ab0a5-633d-4731-a534-fd2db3b65b43-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-4sx54\" (UID: \"047ab0a5-633d-4731-a534-fd2db3b65b43\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-4sx54" Mar 09 09:53:23 crc kubenswrapper[4792]: I0309 09:53:23.978682 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/047ab0a5-633d-4731-a534-fd2db3b65b43-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-4sx54\" (UID: \"047ab0a5-633d-4731-a534-fd2db3b65b43\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-4sx54" Mar 09 09:53:23 crc kubenswrapper[4792]: I0309 09:53:23.991042 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m56tr\" (UniqueName: \"kubernetes.io/projected/047ab0a5-633d-4731-a534-fd2db3b65b43-kube-api-access-m56tr\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-4sx54\" (UID: \"047ab0a5-633d-4731-a534-fd2db3b65b43\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-4sx54" Mar 09 09:53:24 crc kubenswrapper[4792]: I0309 09:53:24.088230 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-4sx54" Mar 09 09:53:24 crc kubenswrapper[4792]: I0309 09:53:24.689503 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-4sx54"] Mar 09 09:53:25 crc kubenswrapper[4792]: I0309 09:53:25.679609 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-4sx54" event={"ID":"047ab0a5-633d-4731-a534-fd2db3b65b43","Type":"ContainerStarted","Data":"2d48236a18b39f4c2fcba615fb26f34b1b175f78da28445890bdbd3645476929"} Mar 09 09:53:25 crc kubenswrapper[4792]: I0309 09:53:25.679858 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-4sx54" event={"ID":"047ab0a5-633d-4731-a534-fd2db3b65b43","Type":"ContainerStarted","Data":"6fe5c959e3c5d332c87bedfa42a63bcf093c7328d553eec7485f5b53b03b56d1"} Mar 09 09:53:25 crc kubenswrapper[4792]: I0309 09:53:25.726793 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-4sx54" podStartSLOduration=2.254634768 podStartE2EDuration="2.726647816s" podCreationTimestamp="2026-03-09 09:53:23 +0000 UTC" firstStartedPulling="2026-03-09 09:53:24.725777244 +0000 UTC m=+2769.755977996" lastFinishedPulling="2026-03-09 09:53:25.197790292 +0000 UTC m=+2770.227991044" observedRunningTime="2026-03-09 09:53:25.720734524 +0000 UTC m=+2770.750935276" watchObservedRunningTime="2026-03-09 09:53:25.726647816 +0000 UTC m=+2770.756848568" Mar 09 09:54:00 crc kubenswrapper[4792]: I0309 09:54:00.142246 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550834-fpmjc"] Mar 09 09:54:00 crc kubenswrapper[4792]: I0309 09:54:00.144038 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550834-fpmjc" Mar 09 09:54:00 crc kubenswrapper[4792]: I0309 09:54:00.150698 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 09:54:00 crc kubenswrapper[4792]: I0309 09:54:00.153375 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 09:54:00 crc kubenswrapper[4792]: I0309 09:54:00.153376 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fwclj" Mar 09 09:54:00 crc kubenswrapper[4792]: I0309 09:54:00.160364 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550834-fpmjc"] Mar 09 09:54:00 crc kubenswrapper[4792]: I0309 09:54:00.300897 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tws5z\" (UniqueName: \"kubernetes.io/projected/fc38964a-cf56-41dd-8e12-2897e356db7a-kube-api-access-tws5z\") pod \"auto-csr-approver-29550834-fpmjc\" (UID: \"fc38964a-cf56-41dd-8e12-2897e356db7a\") " pod="openshift-infra/auto-csr-approver-29550834-fpmjc" Mar 09 09:54:00 crc kubenswrapper[4792]: I0309 09:54:00.403010 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tws5z\" (UniqueName: \"kubernetes.io/projected/fc38964a-cf56-41dd-8e12-2897e356db7a-kube-api-access-tws5z\") pod \"auto-csr-approver-29550834-fpmjc\" (UID: \"fc38964a-cf56-41dd-8e12-2897e356db7a\") " pod="openshift-infra/auto-csr-approver-29550834-fpmjc" Mar 09 09:54:00 crc kubenswrapper[4792]: I0309 09:54:00.421311 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tws5z\" (UniqueName: \"kubernetes.io/projected/fc38964a-cf56-41dd-8e12-2897e356db7a-kube-api-access-tws5z\") pod \"auto-csr-approver-29550834-fpmjc\" (UID: \"fc38964a-cf56-41dd-8e12-2897e356db7a\") " 
pod="openshift-infra/auto-csr-approver-29550834-fpmjc" Mar 09 09:54:00 crc kubenswrapper[4792]: I0309 09:54:00.467945 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550834-fpmjc" Mar 09 09:54:00 crc kubenswrapper[4792]: I0309 09:54:00.945698 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550834-fpmjc"] Mar 09 09:54:00 crc kubenswrapper[4792]: I0309 09:54:00.981038 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550834-fpmjc" event={"ID":"fc38964a-cf56-41dd-8e12-2897e356db7a","Type":"ContainerStarted","Data":"ddef4631dbe5261c68415dbda87c7fbdc4dd4eaba5b02fb921c7adf2e9e69fb8"} Mar 09 09:54:03 crc kubenswrapper[4792]: I0309 09:54:03.039826 4792 generic.go:334] "Generic (PLEG): container finished" podID="fc38964a-cf56-41dd-8e12-2897e356db7a" containerID="062349ef731f6f884f801e39e8e2a755629822568a0b2473e1c61a28b2fdc96e" exitCode=0 Mar 09 09:54:03 crc kubenswrapper[4792]: I0309 09:54:03.039919 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550834-fpmjc" event={"ID":"fc38964a-cf56-41dd-8e12-2897e356db7a","Type":"ContainerDied","Data":"062349ef731f6f884f801e39e8e2a755629822568a0b2473e1c61a28b2fdc96e"} Mar 09 09:54:04 crc kubenswrapper[4792]: I0309 09:54:04.444905 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550834-fpmjc" Mar 09 09:54:04 crc kubenswrapper[4792]: I0309 09:54:04.594643 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tws5z\" (UniqueName: \"kubernetes.io/projected/fc38964a-cf56-41dd-8e12-2897e356db7a-kube-api-access-tws5z\") pod \"fc38964a-cf56-41dd-8e12-2897e356db7a\" (UID: \"fc38964a-cf56-41dd-8e12-2897e356db7a\") " Mar 09 09:54:04 crc kubenswrapper[4792]: I0309 09:54:04.602307 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc38964a-cf56-41dd-8e12-2897e356db7a-kube-api-access-tws5z" (OuterVolumeSpecName: "kube-api-access-tws5z") pod "fc38964a-cf56-41dd-8e12-2897e356db7a" (UID: "fc38964a-cf56-41dd-8e12-2897e356db7a"). InnerVolumeSpecName "kube-api-access-tws5z". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:54:04 crc kubenswrapper[4792]: I0309 09:54:04.696769 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tws5z\" (UniqueName: \"kubernetes.io/projected/fc38964a-cf56-41dd-8e12-2897e356db7a-kube-api-access-tws5z\") on node \"crc\" DevicePath \"\"" Mar 09 09:54:05 crc kubenswrapper[4792]: I0309 09:54:05.060152 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550834-fpmjc" event={"ID":"fc38964a-cf56-41dd-8e12-2897e356db7a","Type":"ContainerDied","Data":"ddef4631dbe5261c68415dbda87c7fbdc4dd4eaba5b02fb921c7adf2e9e69fb8"} Mar 09 09:54:05 crc kubenswrapper[4792]: I0309 09:54:05.060473 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ddef4631dbe5261c68415dbda87c7fbdc4dd4eaba5b02fb921c7adf2e9e69fb8" Mar 09 09:54:05 crc kubenswrapper[4792]: I0309 09:54:05.060262 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550834-fpmjc" Mar 09 09:54:05 crc kubenswrapper[4792]: I0309 09:54:05.515313 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550828-c7c44"] Mar 09 09:54:05 crc kubenswrapper[4792]: I0309 09:54:05.522663 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550828-c7c44"] Mar 09 09:54:05 crc kubenswrapper[4792]: I0309 09:54:05.674138 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="13d484cf-4b10-4b25-978f-8149fd45aa5b" path="/var/lib/kubelet/pods/13d484cf-4b10-4b25-978f-8149fd45aa5b/volumes" Mar 09 09:54:34 crc kubenswrapper[4792]: I0309 09:54:34.606461 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-mfrdf"] Mar 09 09:54:34 crc kubenswrapper[4792]: E0309 09:54:34.607632 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc38964a-cf56-41dd-8e12-2897e356db7a" containerName="oc" Mar 09 09:54:34 crc kubenswrapper[4792]: I0309 09:54:34.607645 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc38964a-cf56-41dd-8e12-2897e356db7a" containerName="oc" Mar 09 09:54:34 crc kubenswrapper[4792]: I0309 09:54:34.607830 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc38964a-cf56-41dd-8e12-2897e356db7a" containerName="oc" Mar 09 09:54:34 crc kubenswrapper[4792]: I0309 09:54:34.609121 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-mfrdf" Mar 09 09:54:34 crc kubenswrapper[4792]: I0309 09:54:34.620007 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mfrdf"] Mar 09 09:54:34 crc kubenswrapper[4792]: I0309 09:54:34.676448 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c3e71be5-52c4-44a0-91bf-90dd18b486a6-catalog-content\") pod \"community-operators-mfrdf\" (UID: \"c3e71be5-52c4-44a0-91bf-90dd18b486a6\") " pod="openshift-marketplace/community-operators-mfrdf" Mar 09 09:54:34 crc kubenswrapper[4792]: I0309 09:54:34.676531 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c3e71be5-52c4-44a0-91bf-90dd18b486a6-utilities\") pod \"community-operators-mfrdf\" (UID: \"c3e71be5-52c4-44a0-91bf-90dd18b486a6\") " pod="openshift-marketplace/community-operators-mfrdf" Mar 09 09:54:34 crc kubenswrapper[4792]: I0309 09:54:34.676819 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qc5cl\" (UniqueName: \"kubernetes.io/projected/c3e71be5-52c4-44a0-91bf-90dd18b486a6-kube-api-access-qc5cl\") pod \"community-operators-mfrdf\" (UID: \"c3e71be5-52c4-44a0-91bf-90dd18b486a6\") " pod="openshift-marketplace/community-operators-mfrdf" Mar 09 09:54:34 crc kubenswrapper[4792]: I0309 09:54:34.778179 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qc5cl\" (UniqueName: \"kubernetes.io/projected/c3e71be5-52c4-44a0-91bf-90dd18b486a6-kube-api-access-qc5cl\") pod \"community-operators-mfrdf\" (UID: \"c3e71be5-52c4-44a0-91bf-90dd18b486a6\") " pod="openshift-marketplace/community-operators-mfrdf" Mar 09 09:54:34 crc kubenswrapper[4792]: I0309 09:54:34.778277 4792 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c3e71be5-52c4-44a0-91bf-90dd18b486a6-catalog-content\") pod \"community-operators-mfrdf\" (UID: \"c3e71be5-52c4-44a0-91bf-90dd18b486a6\") " pod="openshift-marketplace/community-operators-mfrdf" Mar 09 09:54:34 crc kubenswrapper[4792]: I0309 09:54:34.778719 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c3e71be5-52c4-44a0-91bf-90dd18b486a6-utilities\") pod \"community-operators-mfrdf\" (UID: \"c3e71be5-52c4-44a0-91bf-90dd18b486a6\") " pod="openshift-marketplace/community-operators-mfrdf" Mar 09 09:54:34 crc kubenswrapper[4792]: I0309 09:54:34.779599 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c3e71be5-52c4-44a0-91bf-90dd18b486a6-utilities\") pod \"community-operators-mfrdf\" (UID: \"c3e71be5-52c4-44a0-91bf-90dd18b486a6\") " pod="openshift-marketplace/community-operators-mfrdf" Mar 09 09:54:34 crc kubenswrapper[4792]: I0309 09:54:34.780271 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c3e71be5-52c4-44a0-91bf-90dd18b486a6-catalog-content\") pod \"community-operators-mfrdf\" (UID: \"c3e71be5-52c4-44a0-91bf-90dd18b486a6\") " pod="openshift-marketplace/community-operators-mfrdf" Mar 09 09:54:34 crc kubenswrapper[4792]: I0309 09:54:34.804352 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qc5cl\" (UniqueName: \"kubernetes.io/projected/c3e71be5-52c4-44a0-91bf-90dd18b486a6-kube-api-access-qc5cl\") pod \"community-operators-mfrdf\" (UID: \"c3e71be5-52c4-44a0-91bf-90dd18b486a6\") " pod="openshift-marketplace/community-operators-mfrdf" Mar 09 09:54:34 crc kubenswrapper[4792]: I0309 09:54:34.940175 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-mfrdf" Mar 09 09:54:35 crc kubenswrapper[4792]: I0309 09:54:35.466389 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mfrdf"] Mar 09 09:54:36 crc kubenswrapper[4792]: I0309 09:54:36.313318 4792 generic.go:334] "Generic (PLEG): container finished" podID="c3e71be5-52c4-44a0-91bf-90dd18b486a6" containerID="fc571a2e060ecc0d884904a590b410a0d1359e8f6484b1bc1ee4a5579dfb1ae9" exitCode=0 Mar 09 09:54:36 crc kubenswrapper[4792]: I0309 09:54:36.313366 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mfrdf" event={"ID":"c3e71be5-52c4-44a0-91bf-90dd18b486a6","Type":"ContainerDied","Data":"fc571a2e060ecc0d884904a590b410a0d1359e8f6484b1bc1ee4a5579dfb1ae9"} Mar 09 09:54:36 crc kubenswrapper[4792]: I0309 09:54:36.313840 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mfrdf" event={"ID":"c3e71be5-52c4-44a0-91bf-90dd18b486a6","Type":"ContainerStarted","Data":"18631b42e6be3196fd06ce605f4da41f14a6380a64118d5ff14e4eba7a2c07d8"} Mar 09 09:54:37 crc kubenswrapper[4792]: I0309 09:54:37.325059 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mfrdf" event={"ID":"c3e71be5-52c4-44a0-91bf-90dd18b486a6","Type":"ContainerStarted","Data":"76678784818fd46aebf19f3bdc0d02b38648a6c7ef64746c6f6b0749b704153b"} Mar 09 09:54:39 crc kubenswrapper[4792]: I0309 09:54:39.345211 4792 generic.go:334] "Generic (PLEG): container finished" podID="c3e71be5-52c4-44a0-91bf-90dd18b486a6" containerID="76678784818fd46aebf19f3bdc0d02b38648a6c7ef64746c6f6b0749b704153b" exitCode=0 Mar 09 09:54:39 crc kubenswrapper[4792]: I0309 09:54:39.345253 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mfrdf" 
event={"ID":"c3e71be5-52c4-44a0-91bf-90dd18b486a6","Type":"ContainerDied","Data":"76678784818fd46aebf19f3bdc0d02b38648a6c7ef64746c6f6b0749b704153b"} Mar 09 09:54:40 crc kubenswrapper[4792]: I0309 09:54:40.135662 4792 scope.go:117] "RemoveContainer" containerID="c8eaf420631c9857a7de5ece3313e25cf0ae9629e901d3a967bbcdd325f9e344" Mar 09 09:54:40 crc kubenswrapper[4792]: I0309 09:54:40.356331 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mfrdf" event={"ID":"c3e71be5-52c4-44a0-91bf-90dd18b486a6","Type":"ContainerStarted","Data":"13c0a621813a95e9054d1985e71d9827ac9f1ee46bf6bbb53e5e6113ebd9953a"} Mar 09 09:54:40 crc kubenswrapper[4792]: I0309 09:54:40.378201 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-mfrdf" podStartSLOduration=2.9192759710000002 podStartE2EDuration="6.378180837s" podCreationTimestamp="2026-03-09 09:54:34 +0000 UTC" firstStartedPulling="2026-03-09 09:54:36.314725688 +0000 UTC m=+2841.344926440" lastFinishedPulling="2026-03-09 09:54:39.773630554 +0000 UTC m=+2844.803831306" observedRunningTime="2026-03-09 09:54:40.37425246 +0000 UTC m=+2845.404453222" watchObservedRunningTime="2026-03-09 09:54:40.378180837 +0000 UTC m=+2845.408381589" Mar 09 09:54:44 crc kubenswrapper[4792]: I0309 09:54:44.940326 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-mfrdf" Mar 09 09:54:44 crc kubenswrapper[4792]: I0309 09:54:44.940964 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-mfrdf" Mar 09 09:54:44 crc kubenswrapper[4792]: I0309 09:54:44.988003 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-mfrdf" Mar 09 09:54:45 crc kubenswrapper[4792]: I0309 09:54:45.447155 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/community-operators-mfrdf" Mar 09 09:54:45 crc kubenswrapper[4792]: I0309 09:54:45.495677 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-mfrdf"] Mar 09 09:54:47 crc kubenswrapper[4792]: I0309 09:54:47.414502 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-mfrdf" podUID="c3e71be5-52c4-44a0-91bf-90dd18b486a6" containerName="registry-server" containerID="cri-o://13c0a621813a95e9054d1985e71d9827ac9f1ee46bf6bbb53e5e6113ebd9953a" gracePeriod=2 Mar 09 09:54:47 crc kubenswrapper[4792]: I0309 09:54:47.888337 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mfrdf" Mar 09 09:54:48 crc kubenswrapper[4792]: I0309 09:54:48.014964 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c3e71be5-52c4-44a0-91bf-90dd18b486a6-catalog-content\") pod \"c3e71be5-52c4-44a0-91bf-90dd18b486a6\" (UID: \"c3e71be5-52c4-44a0-91bf-90dd18b486a6\") " Mar 09 09:54:48 crc kubenswrapper[4792]: I0309 09:54:48.015281 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qc5cl\" (UniqueName: \"kubernetes.io/projected/c3e71be5-52c4-44a0-91bf-90dd18b486a6-kube-api-access-qc5cl\") pod \"c3e71be5-52c4-44a0-91bf-90dd18b486a6\" (UID: \"c3e71be5-52c4-44a0-91bf-90dd18b486a6\") " Mar 09 09:54:48 crc kubenswrapper[4792]: I0309 09:54:48.015311 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c3e71be5-52c4-44a0-91bf-90dd18b486a6-utilities\") pod \"c3e71be5-52c4-44a0-91bf-90dd18b486a6\" (UID: \"c3e71be5-52c4-44a0-91bf-90dd18b486a6\") " Mar 09 09:54:48 crc kubenswrapper[4792]: I0309 09:54:48.016359 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/empty-dir/c3e71be5-52c4-44a0-91bf-90dd18b486a6-utilities" (OuterVolumeSpecName: "utilities") pod "c3e71be5-52c4-44a0-91bf-90dd18b486a6" (UID: "c3e71be5-52c4-44a0-91bf-90dd18b486a6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:54:48 crc kubenswrapper[4792]: I0309 09:54:48.022388 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c3e71be5-52c4-44a0-91bf-90dd18b486a6-kube-api-access-qc5cl" (OuterVolumeSpecName: "kube-api-access-qc5cl") pod "c3e71be5-52c4-44a0-91bf-90dd18b486a6" (UID: "c3e71be5-52c4-44a0-91bf-90dd18b486a6"). InnerVolumeSpecName "kube-api-access-qc5cl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:54:48 crc kubenswrapper[4792]: I0309 09:54:48.073919 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c3e71be5-52c4-44a0-91bf-90dd18b486a6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c3e71be5-52c4-44a0-91bf-90dd18b486a6" (UID: "c3e71be5-52c4-44a0-91bf-90dd18b486a6"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:54:48 crc kubenswrapper[4792]: I0309 09:54:48.117308 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qc5cl\" (UniqueName: \"kubernetes.io/projected/c3e71be5-52c4-44a0-91bf-90dd18b486a6-kube-api-access-qc5cl\") on node \"crc\" DevicePath \"\"" Mar 09 09:54:48 crc kubenswrapper[4792]: I0309 09:54:48.117350 4792 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c3e71be5-52c4-44a0-91bf-90dd18b486a6-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 09:54:48 crc kubenswrapper[4792]: I0309 09:54:48.117364 4792 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c3e71be5-52c4-44a0-91bf-90dd18b486a6-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 09:54:48 crc kubenswrapper[4792]: I0309 09:54:48.427145 4792 generic.go:334] "Generic (PLEG): container finished" podID="c3e71be5-52c4-44a0-91bf-90dd18b486a6" containerID="13c0a621813a95e9054d1985e71d9827ac9f1ee46bf6bbb53e5e6113ebd9953a" exitCode=0 Mar 09 09:54:48 crc kubenswrapper[4792]: I0309 09:54:48.427218 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mfrdf" event={"ID":"c3e71be5-52c4-44a0-91bf-90dd18b486a6","Type":"ContainerDied","Data":"13c0a621813a95e9054d1985e71d9827ac9f1ee46bf6bbb53e5e6113ebd9953a"} Mar 09 09:54:48 crc kubenswrapper[4792]: I0309 09:54:48.427280 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mfrdf" event={"ID":"c3e71be5-52c4-44a0-91bf-90dd18b486a6","Type":"ContainerDied","Data":"18631b42e6be3196fd06ce605f4da41f14a6380a64118d5ff14e4eba7a2c07d8"} Mar 09 09:54:48 crc kubenswrapper[4792]: I0309 09:54:48.427288 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-mfrdf" Mar 09 09:54:48 crc kubenswrapper[4792]: I0309 09:54:48.427302 4792 scope.go:117] "RemoveContainer" containerID="13c0a621813a95e9054d1985e71d9827ac9f1ee46bf6bbb53e5e6113ebd9953a" Mar 09 09:54:48 crc kubenswrapper[4792]: I0309 09:54:48.447992 4792 scope.go:117] "RemoveContainer" containerID="76678784818fd46aebf19f3bdc0d02b38648a6c7ef64746c6f6b0749b704153b" Mar 09 09:54:48 crc kubenswrapper[4792]: I0309 09:54:48.478485 4792 scope.go:117] "RemoveContainer" containerID="fc571a2e060ecc0d884904a590b410a0d1359e8f6484b1bc1ee4a5579dfb1ae9" Mar 09 09:54:48 crc kubenswrapper[4792]: I0309 09:54:48.480357 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-mfrdf"] Mar 09 09:54:48 crc kubenswrapper[4792]: I0309 09:54:48.491226 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-mfrdf"] Mar 09 09:54:48 crc kubenswrapper[4792]: I0309 09:54:48.511497 4792 scope.go:117] "RemoveContainer" containerID="13c0a621813a95e9054d1985e71d9827ac9f1ee46bf6bbb53e5e6113ebd9953a" Mar 09 09:54:48 crc kubenswrapper[4792]: E0309 09:54:48.511918 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"13c0a621813a95e9054d1985e71d9827ac9f1ee46bf6bbb53e5e6113ebd9953a\": container with ID starting with 13c0a621813a95e9054d1985e71d9827ac9f1ee46bf6bbb53e5e6113ebd9953a not found: ID does not exist" containerID="13c0a621813a95e9054d1985e71d9827ac9f1ee46bf6bbb53e5e6113ebd9953a" Mar 09 09:54:48 crc kubenswrapper[4792]: I0309 09:54:48.511960 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"13c0a621813a95e9054d1985e71d9827ac9f1ee46bf6bbb53e5e6113ebd9953a"} err="failed to get container status \"13c0a621813a95e9054d1985e71d9827ac9f1ee46bf6bbb53e5e6113ebd9953a\": rpc error: code = NotFound desc = could not find 
container \"13c0a621813a95e9054d1985e71d9827ac9f1ee46bf6bbb53e5e6113ebd9953a\": container with ID starting with 13c0a621813a95e9054d1985e71d9827ac9f1ee46bf6bbb53e5e6113ebd9953a not found: ID does not exist" Mar 09 09:54:48 crc kubenswrapper[4792]: I0309 09:54:48.511983 4792 scope.go:117] "RemoveContainer" containerID="76678784818fd46aebf19f3bdc0d02b38648a6c7ef64746c6f6b0749b704153b" Mar 09 09:54:48 crc kubenswrapper[4792]: E0309 09:54:48.512504 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"76678784818fd46aebf19f3bdc0d02b38648a6c7ef64746c6f6b0749b704153b\": container with ID starting with 76678784818fd46aebf19f3bdc0d02b38648a6c7ef64746c6f6b0749b704153b not found: ID does not exist" containerID="76678784818fd46aebf19f3bdc0d02b38648a6c7ef64746c6f6b0749b704153b" Mar 09 09:54:48 crc kubenswrapper[4792]: I0309 09:54:48.512539 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"76678784818fd46aebf19f3bdc0d02b38648a6c7ef64746c6f6b0749b704153b"} err="failed to get container status \"76678784818fd46aebf19f3bdc0d02b38648a6c7ef64746c6f6b0749b704153b\": rpc error: code = NotFound desc = could not find container \"76678784818fd46aebf19f3bdc0d02b38648a6c7ef64746c6f6b0749b704153b\": container with ID starting with 76678784818fd46aebf19f3bdc0d02b38648a6c7ef64746c6f6b0749b704153b not found: ID does not exist" Mar 09 09:54:48 crc kubenswrapper[4792]: I0309 09:54:48.512557 4792 scope.go:117] "RemoveContainer" containerID="fc571a2e060ecc0d884904a590b410a0d1359e8f6484b1bc1ee4a5579dfb1ae9" Mar 09 09:54:48 crc kubenswrapper[4792]: E0309 09:54:48.512800 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fc571a2e060ecc0d884904a590b410a0d1359e8f6484b1bc1ee4a5579dfb1ae9\": container with ID starting with fc571a2e060ecc0d884904a590b410a0d1359e8f6484b1bc1ee4a5579dfb1ae9 not found: ID does 
not exist" containerID="fc571a2e060ecc0d884904a590b410a0d1359e8f6484b1bc1ee4a5579dfb1ae9" Mar 09 09:54:48 crc kubenswrapper[4792]: I0309 09:54:48.512833 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc571a2e060ecc0d884904a590b410a0d1359e8f6484b1bc1ee4a5579dfb1ae9"} err="failed to get container status \"fc571a2e060ecc0d884904a590b410a0d1359e8f6484b1bc1ee4a5579dfb1ae9\": rpc error: code = NotFound desc = could not find container \"fc571a2e060ecc0d884904a590b410a0d1359e8f6484b1bc1ee4a5579dfb1ae9\": container with ID starting with fc571a2e060ecc0d884904a590b410a0d1359e8f6484b1bc1ee4a5579dfb1ae9 not found: ID does not exist" Mar 09 09:54:49 crc kubenswrapper[4792]: I0309 09:54:49.676032 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c3e71be5-52c4-44a0-91bf-90dd18b486a6" path="/var/lib/kubelet/pods/c3e71be5-52c4-44a0-91bf-90dd18b486a6/volumes" Mar 09 09:55:13 crc kubenswrapper[4792]: I0309 09:55:13.213701 4792 patch_prober.go:28] interesting pod/machine-config-daemon-97tth container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 09:55:13 crc kubenswrapper[4792]: I0309 09:55:13.214264 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 09:55:43 crc kubenswrapper[4792]: I0309 09:55:43.213868 4792 patch_prober.go:28] interesting pod/machine-config-daemon-97tth container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Mar 09 09:55:43 crc kubenswrapper[4792]: I0309 09:55:43.214494 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 09:56:00 crc kubenswrapper[4792]: I0309 09:56:00.159030 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550836-6xrj2"] Mar 09 09:56:00 crc kubenswrapper[4792]: E0309 09:56:00.160841 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3e71be5-52c4-44a0-91bf-90dd18b486a6" containerName="registry-server" Mar 09 09:56:00 crc kubenswrapper[4792]: I0309 09:56:00.160867 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3e71be5-52c4-44a0-91bf-90dd18b486a6" containerName="registry-server" Mar 09 09:56:00 crc kubenswrapper[4792]: E0309 09:56:00.160909 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3e71be5-52c4-44a0-91bf-90dd18b486a6" containerName="extract-utilities" Mar 09 09:56:00 crc kubenswrapper[4792]: I0309 09:56:00.160917 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3e71be5-52c4-44a0-91bf-90dd18b486a6" containerName="extract-utilities" Mar 09 09:56:00 crc kubenswrapper[4792]: E0309 09:56:00.160935 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3e71be5-52c4-44a0-91bf-90dd18b486a6" containerName="extract-content" Mar 09 09:56:00 crc kubenswrapper[4792]: I0309 09:56:00.160942 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3e71be5-52c4-44a0-91bf-90dd18b486a6" containerName="extract-content" Mar 09 09:56:00 crc kubenswrapper[4792]: I0309 09:56:00.161207 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="c3e71be5-52c4-44a0-91bf-90dd18b486a6" containerName="registry-server" 
Mar 09 09:56:00 crc kubenswrapper[4792]: I0309 09:56:00.162239 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550836-6xrj2" Mar 09 09:56:00 crc kubenswrapper[4792]: I0309 09:56:00.166194 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 09:56:00 crc kubenswrapper[4792]: I0309 09:56:00.166546 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 09:56:00 crc kubenswrapper[4792]: I0309 09:56:00.166986 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fwclj" Mar 09 09:56:00 crc kubenswrapper[4792]: I0309 09:56:00.172358 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550836-6xrj2"] Mar 09 09:56:00 crc kubenswrapper[4792]: I0309 09:56:00.287602 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-krb8g\" (UniqueName: \"kubernetes.io/projected/5108dd62-354b-4ee4-98d2-0a48df21d76c-kube-api-access-krb8g\") pod \"auto-csr-approver-29550836-6xrj2\" (UID: \"5108dd62-354b-4ee4-98d2-0a48df21d76c\") " pod="openshift-infra/auto-csr-approver-29550836-6xrj2" Mar 09 09:56:00 crc kubenswrapper[4792]: I0309 09:56:00.389236 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-krb8g\" (UniqueName: \"kubernetes.io/projected/5108dd62-354b-4ee4-98d2-0a48df21d76c-kube-api-access-krb8g\") pod \"auto-csr-approver-29550836-6xrj2\" (UID: \"5108dd62-354b-4ee4-98d2-0a48df21d76c\") " pod="openshift-infra/auto-csr-approver-29550836-6xrj2" Mar 09 09:56:00 crc kubenswrapper[4792]: I0309 09:56:00.419108 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-krb8g\" (UniqueName: 
\"kubernetes.io/projected/5108dd62-354b-4ee4-98d2-0a48df21d76c-kube-api-access-krb8g\") pod \"auto-csr-approver-29550836-6xrj2\" (UID: \"5108dd62-354b-4ee4-98d2-0a48df21d76c\") " pod="openshift-infra/auto-csr-approver-29550836-6xrj2" Mar 09 09:56:00 crc kubenswrapper[4792]: I0309 09:56:00.491123 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550836-6xrj2" Mar 09 09:56:00 crc kubenswrapper[4792]: I0309 09:56:00.967576 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550836-6xrj2"] Mar 09 09:56:01 crc kubenswrapper[4792]: I0309 09:56:01.022347 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550836-6xrj2" event={"ID":"5108dd62-354b-4ee4-98d2-0a48df21d76c","Type":"ContainerStarted","Data":"83fed1f8bdb2d3fbf2361493a9adb174b56d77b94875c79267bd4f9101c1518d"} Mar 09 09:56:03 crc kubenswrapper[4792]: I0309 09:56:03.045233 4792 generic.go:334] "Generic (PLEG): container finished" podID="5108dd62-354b-4ee4-98d2-0a48df21d76c" containerID="addaa2d57488cf2486de334fe002b7d168ddce48454fbcd434735a6ef143ac9d" exitCode=0 Mar 09 09:56:03 crc kubenswrapper[4792]: I0309 09:56:03.045334 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550836-6xrj2" event={"ID":"5108dd62-354b-4ee4-98d2-0a48df21d76c","Type":"ContainerDied","Data":"addaa2d57488cf2486de334fe002b7d168ddce48454fbcd434735a6ef143ac9d"} Mar 09 09:56:04 crc kubenswrapper[4792]: I0309 09:56:04.413456 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550836-6xrj2" Mar 09 09:56:04 crc kubenswrapper[4792]: I0309 09:56:04.589719 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-krb8g\" (UniqueName: \"kubernetes.io/projected/5108dd62-354b-4ee4-98d2-0a48df21d76c-kube-api-access-krb8g\") pod \"5108dd62-354b-4ee4-98d2-0a48df21d76c\" (UID: \"5108dd62-354b-4ee4-98d2-0a48df21d76c\") " Mar 09 09:56:04 crc kubenswrapper[4792]: I0309 09:56:04.596343 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5108dd62-354b-4ee4-98d2-0a48df21d76c-kube-api-access-krb8g" (OuterVolumeSpecName: "kube-api-access-krb8g") pod "5108dd62-354b-4ee4-98d2-0a48df21d76c" (UID: "5108dd62-354b-4ee4-98d2-0a48df21d76c"). InnerVolumeSpecName "kube-api-access-krb8g". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:56:04 crc kubenswrapper[4792]: I0309 09:56:04.694444 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-krb8g\" (UniqueName: \"kubernetes.io/projected/5108dd62-354b-4ee4-98d2-0a48df21d76c-kube-api-access-krb8g\") on node \"crc\" DevicePath \"\"" Mar 09 09:56:05 crc kubenswrapper[4792]: I0309 09:56:05.060553 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550836-6xrj2" event={"ID":"5108dd62-354b-4ee4-98d2-0a48df21d76c","Type":"ContainerDied","Data":"83fed1f8bdb2d3fbf2361493a9adb174b56d77b94875c79267bd4f9101c1518d"} Mar 09 09:56:05 crc kubenswrapper[4792]: I0309 09:56:05.060594 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="83fed1f8bdb2d3fbf2361493a9adb174b56d77b94875c79267bd4f9101c1518d" Mar 09 09:56:05 crc kubenswrapper[4792]: I0309 09:56:05.060567 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550836-6xrj2" Mar 09 09:56:05 crc kubenswrapper[4792]: I0309 09:56:05.494094 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550830-jwzpt"] Mar 09 09:56:05 crc kubenswrapper[4792]: I0309 09:56:05.504764 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550830-jwzpt"] Mar 09 09:56:05 crc kubenswrapper[4792]: I0309 09:56:05.676119 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e64c4eca-1bc2-4122-b966-cf8709fdf457" path="/var/lib/kubelet/pods/e64c4eca-1bc2-4122-b966-cf8709fdf457/volumes" Mar 09 09:56:13 crc kubenswrapper[4792]: I0309 09:56:13.214197 4792 patch_prober.go:28] interesting pod/machine-config-daemon-97tth container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 09:56:13 crc kubenswrapper[4792]: I0309 09:56:13.214780 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 09:56:13 crc kubenswrapper[4792]: I0309 09:56:13.214827 4792 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-97tth" Mar 09 09:56:13 crc kubenswrapper[4792]: I0309 09:56:13.215601 4792 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d764681645ab8670f7435c3d7eeda989bbb6c0f3c40948420b0a6a2fc3dd7e93"} pod="openshift-machine-config-operator/machine-config-daemon-97tth" 
containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 09 09:56:13 crc kubenswrapper[4792]: I0309 09:56:13.215663 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" containerName="machine-config-daemon" containerID="cri-o://d764681645ab8670f7435c3d7eeda989bbb6c0f3c40948420b0a6a2fc3dd7e93" gracePeriod=600 Mar 09 09:56:13 crc kubenswrapper[4792]: E0309 09:56:13.334689 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-97tth_openshift-machine-config-operator(bd11045a-d746-4b42-872c-8b8d1dd2d515)\"" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" Mar 09 09:56:14 crc kubenswrapper[4792]: I0309 09:56:14.130389 4792 generic.go:334] "Generic (PLEG): container finished" podID="bd11045a-d746-4b42-872c-8b8d1dd2d515" containerID="d764681645ab8670f7435c3d7eeda989bbb6c0f3c40948420b0a6a2fc3dd7e93" exitCode=0 Mar 09 09:56:14 crc kubenswrapper[4792]: I0309 09:56:14.130437 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-97tth" event={"ID":"bd11045a-d746-4b42-872c-8b8d1dd2d515","Type":"ContainerDied","Data":"d764681645ab8670f7435c3d7eeda989bbb6c0f3c40948420b0a6a2fc3dd7e93"} Mar 09 09:56:14 crc kubenswrapper[4792]: I0309 09:56:14.130473 4792 scope.go:117] "RemoveContainer" containerID="862129b30016327d77e586feefb2263203c7afdd585cf1571fd31502117efa31" Mar 09 09:56:14 crc kubenswrapper[4792]: I0309 09:56:14.131055 4792 scope.go:117] "RemoveContainer" containerID="d764681645ab8670f7435c3d7eeda989bbb6c0f3c40948420b0a6a2fc3dd7e93" Mar 09 09:56:14 crc kubenswrapper[4792]: E0309 09:56:14.131388 4792 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-97tth_openshift-machine-config-operator(bd11045a-d746-4b42-872c-8b8d1dd2d515)\"" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" Mar 09 09:56:28 crc kubenswrapper[4792]: I0309 09:56:28.662327 4792 scope.go:117] "RemoveContainer" containerID="d764681645ab8670f7435c3d7eeda989bbb6c0f3c40948420b0a6a2fc3dd7e93" Mar 09 09:56:28 crc kubenswrapper[4792]: E0309 09:56:28.663090 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-97tth_openshift-machine-config-operator(bd11045a-d746-4b42-872c-8b8d1dd2d515)\"" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" Mar 09 09:56:40 crc kubenswrapper[4792]: I0309 09:56:40.242933 4792 scope.go:117] "RemoveContainer" containerID="34b65862816b457534aeb66a742e8c75aa34907a5ac5adf505d2e248bbfa9f87" Mar 09 09:56:41 crc kubenswrapper[4792]: I0309 09:56:41.662762 4792 scope.go:117] "RemoveContainer" containerID="d764681645ab8670f7435c3d7eeda989bbb6c0f3c40948420b0a6a2fc3dd7e93" Mar 09 09:56:41 crc kubenswrapper[4792]: E0309 09:56:41.663427 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-97tth_openshift-machine-config-operator(bd11045a-d746-4b42-872c-8b8d1dd2d515)\"" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" Mar 09 09:56:55 crc kubenswrapper[4792]: I0309 
09:56:55.668890 4792 scope.go:117] "RemoveContainer" containerID="d764681645ab8670f7435c3d7eeda989bbb6c0f3c40948420b0a6a2fc3dd7e93" Mar 09 09:56:55 crc kubenswrapper[4792]: E0309 09:56:55.669749 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-97tth_openshift-machine-config-operator(bd11045a-d746-4b42-872c-8b8d1dd2d515)\"" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" Mar 09 09:56:58 crc kubenswrapper[4792]: I0309 09:56:58.897627 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-t5xwc"] Mar 09 09:56:58 crc kubenswrapper[4792]: E0309 09:56:58.898657 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5108dd62-354b-4ee4-98d2-0a48df21d76c" containerName="oc" Mar 09 09:56:58 crc kubenswrapper[4792]: I0309 09:56:58.898675 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="5108dd62-354b-4ee4-98d2-0a48df21d76c" containerName="oc" Mar 09 09:56:58 crc kubenswrapper[4792]: I0309 09:56:58.898885 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="5108dd62-354b-4ee4-98d2-0a48df21d76c" containerName="oc" Mar 09 09:56:58 crc kubenswrapper[4792]: I0309 09:56:58.900577 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-t5xwc" Mar 09 09:56:58 crc kubenswrapper[4792]: I0309 09:56:58.915535 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-t5xwc"] Mar 09 09:56:59 crc kubenswrapper[4792]: I0309 09:56:59.033486 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sh95k\" (UniqueName: \"kubernetes.io/projected/b2abf489-509d-489a-b7ee-3af10c7c5a4b-kube-api-access-sh95k\") pod \"redhat-marketplace-t5xwc\" (UID: \"b2abf489-509d-489a-b7ee-3af10c7c5a4b\") " pod="openshift-marketplace/redhat-marketplace-t5xwc" Mar 09 09:56:59 crc kubenswrapper[4792]: I0309 09:56:59.033781 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b2abf489-509d-489a-b7ee-3af10c7c5a4b-utilities\") pod \"redhat-marketplace-t5xwc\" (UID: \"b2abf489-509d-489a-b7ee-3af10c7c5a4b\") " pod="openshift-marketplace/redhat-marketplace-t5xwc" Mar 09 09:56:59 crc kubenswrapper[4792]: I0309 09:56:59.033955 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b2abf489-509d-489a-b7ee-3af10c7c5a4b-catalog-content\") pod \"redhat-marketplace-t5xwc\" (UID: \"b2abf489-509d-489a-b7ee-3af10c7c5a4b\") " pod="openshift-marketplace/redhat-marketplace-t5xwc" Mar 09 09:56:59 crc kubenswrapper[4792]: I0309 09:56:59.136278 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sh95k\" (UniqueName: \"kubernetes.io/projected/b2abf489-509d-489a-b7ee-3af10c7c5a4b-kube-api-access-sh95k\") pod \"redhat-marketplace-t5xwc\" (UID: \"b2abf489-509d-489a-b7ee-3af10c7c5a4b\") " pod="openshift-marketplace/redhat-marketplace-t5xwc" Mar 09 09:56:59 crc kubenswrapper[4792]: I0309 09:56:59.136403 4792 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b2abf489-509d-489a-b7ee-3af10c7c5a4b-utilities\") pod \"redhat-marketplace-t5xwc\" (UID: \"b2abf489-509d-489a-b7ee-3af10c7c5a4b\") " pod="openshift-marketplace/redhat-marketplace-t5xwc" Mar 09 09:56:59 crc kubenswrapper[4792]: I0309 09:56:59.136568 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b2abf489-509d-489a-b7ee-3af10c7c5a4b-catalog-content\") pod \"redhat-marketplace-t5xwc\" (UID: \"b2abf489-509d-489a-b7ee-3af10c7c5a4b\") " pod="openshift-marketplace/redhat-marketplace-t5xwc" Mar 09 09:56:59 crc kubenswrapper[4792]: I0309 09:56:59.136969 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b2abf489-509d-489a-b7ee-3af10c7c5a4b-utilities\") pod \"redhat-marketplace-t5xwc\" (UID: \"b2abf489-509d-489a-b7ee-3af10c7c5a4b\") " pod="openshift-marketplace/redhat-marketplace-t5xwc" Mar 09 09:56:59 crc kubenswrapper[4792]: I0309 09:56:59.137145 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b2abf489-509d-489a-b7ee-3af10c7c5a4b-catalog-content\") pod \"redhat-marketplace-t5xwc\" (UID: \"b2abf489-509d-489a-b7ee-3af10c7c5a4b\") " pod="openshift-marketplace/redhat-marketplace-t5xwc" Mar 09 09:56:59 crc kubenswrapper[4792]: I0309 09:56:59.165014 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sh95k\" (UniqueName: \"kubernetes.io/projected/b2abf489-509d-489a-b7ee-3af10c7c5a4b-kube-api-access-sh95k\") pod \"redhat-marketplace-t5xwc\" (UID: \"b2abf489-509d-489a-b7ee-3af10c7c5a4b\") " pod="openshift-marketplace/redhat-marketplace-t5xwc" Mar 09 09:56:59 crc kubenswrapper[4792]: I0309 09:56:59.266160 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-t5xwc" Mar 09 09:56:59 crc kubenswrapper[4792]: I0309 09:56:59.808934 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-t5xwc"] Mar 09 09:57:00 crc kubenswrapper[4792]: I0309 09:57:00.543526 4792 generic.go:334] "Generic (PLEG): container finished" podID="b2abf489-509d-489a-b7ee-3af10c7c5a4b" containerID="696a050624dd73e4afa3cd13e12295f2d37a7fa85e9ebd399ecc352c3edbc50f" exitCode=0 Mar 09 09:57:00 crc kubenswrapper[4792]: I0309 09:57:00.543867 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t5xwc" event={"ID":"b2abf489-509d-489a-b7ee-3af10c7c5a4b","Type":"ContainerDied","Data":"696a050624dd73e4afa3cd13e12295f2d37a7fa85e9ebd399ecc352c3edbc50f"} Mar 09 09:57:00 crc kubenswrapper[4792]: I0309 09:57:00.543897 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t5xwc" event={"ID":"b2abf489-509d-489a-b7ee-3af10c7c5a4b","Type":"ContainerStarted","Data":"7fe40f1c1c1cb889dbb2e4dc115a74df25ce758d86914db530fe119b40d462bc"} Mar 09 09:57:01 crc kubenswrapper[4792]: I0309 09:57:01.555135 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t5xwc" event={"ID":"b2abf489-509d-489a-b7ee-3af10c7c5a4b","Type":"ContainerStarted","Data":"b3862217b172bc4c13641b8d883d90cb2ec281f5f3306948d8d8a4428f85cec7"} Mar 09 09:57:02 crc kubenswrapper[4792]: I0309 09:57:02.565189 4792 generic.go:334] "Generic (PLEG): container finished" podID="b2abf489-509d-489a-b7ee-3af10c7c5a4b" containerID="b3862217b172bc4c13641b8d883d90cb2ec281f5f3306948d8d8a4428f85cec7" exitCode=0 Mar 09 09:57:02 crc kubenswrapper[4792]: I0309 09:57:02.565237 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t5xwc" 
event={"ID":"b2abf489-509d-489a-b7ee-3af10c7c5a4b","Type":"ContainerDied","Data":"b3862217b172bc4c13641b8d883d90cb2ec281f5f3306948d8d8a4428f85cec7"} Mar 09 09:57:03 crc kubenswrapper[4792]: I0309 09:57:03.583540 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t5xwc" event={"ID":"b2abf489-509d-489a-b7ee-3af10c7c5a4b","Type":"ContainerStarted","Data":"b5da6b7fccf418bb13a74a4ef781f1dd367c6610eed40848f19d6285274ca415"} Mar 09 09:57:03 crc kubenswrapper[4792]: I0309 09:57:03.603874 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-t5xwc" podStartSLOduration=3.172242008 podStartE2EDuration="5.60385501s" podCreationTimestamp="2026-03-09 09:56:58 +0000 UTC" firstStartedPulling="2026-03-09 09:57:00.546081568 +0000 UTC m=+2985.576282320" lastFinishedPulling="2026-03-09 09:57:02.97769457 +0000 UTC m=+2988.007895322" observedRunningTime="2026-03-09 09:57:03.600998553 +0000 UTC m=+2988.631199305" watchObservedRunningTime="2026-03-09 09:57:03.60385501 +0000 UTC m=+2988.634055762" Mar 09 09:57:08 crc kubenswrapper[4792]: I0309 09:57:08.662017 4792 scope.go:117] "RemoveContainer" containerID="d764681645ab8670f7435c3d7eeda989bbb6c0f3c40948420b0a6a2fc3dd7e93" Mar 09 09:57:08 crc kubenswrapper[4792]: E0309 09:57:08.662836 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-97tth_openshift-machine-config-operator(bd11045a-d746-4b42-872c-8b8d1dd2d515)\"" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" Mar 09 09:57:09 crc kubenswrapper[4792]: I0309 09:57:09.267332 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-t5xwc" Mar 09 09:57:09 crc 
kubenswrapper[4792]: I0309 09:57:09.267385 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-t5xwc" Mar 09 09:57:09 crc kubenswrapper[4792]: I0309 09:57:09.315487 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-t5xwc" Mar 09 09:57:09 crc kubenswrapper[4792]: I0309 09:57:09.675121 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-t5xwc" Mar 09 09:57:09 crc kubenswrapper[4792]: I0309 09:57:09.736702 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-t5xwc"] Mar 09 09:57:11 crc kubenswrapper[4792]: I0309 09:57:11.647452 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-t5xwc" podUID="b2abf489-509d-489a-b7ee-3af10c7c5a4b" containerName="registry-server" containerID="cri-o://b5da6b7fccf418bb13a74a4ef781f1dd367c6610eed40848f19d6285274ca415" gracePeriod=2 Mar 09 09:57:12 crc kubenswrapper[4792]: I0309 09:57:12.114215 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-t5xwc" Mar 09 09:57:12 crc kubenswrapper[4792]: I0309 09:57:12.306897 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sh95k\" (UniqueName: \"kubernetes.io/projected/b2abf489-509d-489a-b7ee-3af10c7c5a4b-kube-api-access-sh95k\") pod \"b2abf489-509d-489a-b7ee-3af10c7c5a4b\" (UID: \"b2abf489-509d-489a-b7ee-3af10c7c5a4b\") " Mar 09 09:57:12 crc kubenswrapper[4792]: I0309 09:57:12.307005 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b2abf489-509d-489a-b7ee-3af10c7c5a4b-utilities\") pod \"b2abf489-509d-489a-b7ee-3af10c7c5a4b\" (UID: \"b2abf489-509d-489a-b7ee-3af10c7c5a4b\") " Mar 09 09:57:12 crc kubenswrapper[4792]: I0309 09:57:12.307194 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b2abf489-509d-489a-b7ee-3af10c7c5a4b-catalog-content\") pod \"b2abf489-509d-489a-b7ee-3af10c7c5a4b\" (UID: \"b2abf489-509d-489a-b7ee-3af10c7c5a4b\") " Mar 09 09:57:12 crc kubenswrapper[4792]: I0309 09:57:12.308013 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b2abf489-509d-489a-b7ee-3af10c7c5a4b-utilities" (OuterVolumeSpecName: "utilities") pod "b2abf489-509d-489a-b7ee-3af10c7c5a4b" (UID: "b2abf489-509d-489a-b7ee-3af10c7c5a4b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:57:12 crc kubenswrapper[4792]: I0309 09:57:12.321423 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b2abf489-509d-489a-b7ee-3af10c7c5a4b-kube-api-access-sh95k" (OuterVolumeSpecName: "kube-api-access-sh95k") pod "b2abf489-509d-489a-b7ee-3af10c7c5a4b" (UID: "b2abf489-509d-489a-b7ee-3af10c7c5a4b"). InnerVolumeSpecName "kube-api-access-sh95k". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:57:12 crc kubenswrapper[4792]: I0309 09:57:12.338904 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b2abf489-509d-489a-b7ee-3af10c7c5a4b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b2abf489-509d-489a-b7ee-3af10c7c5a4b" (UID: "b2abf489-509d-489a-b7ee-3af10c7c5a4b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:57:12 crc kubenswrapper[4792]: I0309 09:57:12.410089 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sh95k\" (UniqueName: \"kubernetes.io/projected/b2abf489-509d-489a-b7ee-3af10c7c5a4b-kube-api-access-sh95k\") on node \"crc\" DevicePath \"\"" Mar 09 09:57:12 crc kubenswrapper[4792]: I0309 09:57:12.410148 4792 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b2abf489-509d-489a-b7ee-3af10c7c5a4b-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 09:57:12 crc kubenswrapper[4792]: I0309 09:57:12.410160 4792 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b2abf489-509d-489a-b7ee-3af10c7c5a4b-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 09:57:12 crc kubenswrapper[4792]: I0309 09:57:12.664272 4792 generic.go:334] "Generic (PLEG): container finished" podID="b2abf489-509d-489a-b7ee-3af10c7c5a4b" containerID="b5da6b7fccf418bb13a74a4ef781f1dd367c6610eed40848f19d6285274ca415" exitCode=0 Mar 09 09:57:12 crc kubenswrapper[4792]: I0309 09:57:12.664335 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t5xwc" event={"ID":"b2abf489-509d-489a-b7ee-3af10c7c5a4b","Type":"ContainerDied","Data":"b5da6b7fccf418bb13a74a4ef781f1dd367c6610eed40848f19d6285274ca415"} Mar 09 09:57:12 crc kubenswrapper[4792]: I0309 09:57:12.664345 4792 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-t5xwc" Mar 09 09:57:12 crc kubenswrapper[4792]: I0309 09:57:12.664379 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t5xwc" event={"ID":"b2abf489-509d-489a-b7ee-3af10c7c5a4b","Type":"ContainerDied","Data":"7fe40f1c1c1cb889dbb2e4dc115a74df25ce758d86914db530fe119b40d462bc"} Mar 09 09:57:12 crc kubenswrapper[4792]: I0309 09:57:12.664408 4792 scope.go:117] "RemoveContainer" containerID="b5da6b7fccf418bb13a74a4ef781f1dd367c6610eed40848f19d6285274ca415" Mar 09 09:57:12 crc kubenswrapper[4792]: I0309 09:57:12.689644 4792 scope.go:117] "RemoveContainer" containerID="b3862217b172bc4c13641b8d883d90cb2ec281f5f3306948d8d8a4428f85cec7" Mar 09 09:57:12 crc kubenswrapper[4792]: I0309 09:57:12.722393 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-t5xwc"] Mar 09 09:57:12 crc kubenswrapper[4792]: I0309 09:57:12.723744 4792 scope.go:117] "RemoveContainer" containerID="696a050624dd73e4afa3cd13e12295f2d37a7fa85e9ebd399ecc352c3edbc50f" Mar 09 09:57:12 crc kubenswrapper[4792]: I0309 09:57:12.738867 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-t5xwc"] Mar 09 09:57:12 crc kubenswrapper[4792]: I0309 09:57:12.771514 4792 scope.go:117] "RemoveContainer" containerID="b5da6b7fccf418bb13a74a4ef781f1dd367c6610eed40848f19d6285274ca415" Mar 09 09:57:12 crc kubenswrapper[4792]: E0309 09:57:12.777946 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b5da6b7fccf418bb13a74a4ef781f1dd367c6610eed40848f19d6285274ca415\": container with ID starting with b5da6b7fccf418bb13a74a4ef781f1dd367c6610eed40848f19d6285274ca415 not found: ID does not exist" containerID="b5da6b7fccf418bb13a74a4ef781f1dd367c6610eed40848f19d6285274ca415" Mar 09 09:57:12 crc kubenswrapper[4792]: I0309 09:57:12.778220 4792 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5da6b7fccf418bb13a74a4ef781f1dd367c6610eed40848f19d6285274ca415"} err="failed to get container status \"b5da6b7fccf418bb13a74a4ef781f1dd367c6610eed40848f19d6285274ca415\": rpc error: code = NotFound desc = could not find container \"b5da6b7fccf418bb13a74a4ef781f1dd367c6610eed40848f19d6285274ca415\": container with ID starting with b5da6b7fccf418bb13a74a4ef781f1dd367c6610eed40848f19d6285274ca415 not found: ID does not exist" Mar 09 09:57:12 crc kubenswrapper[4792]: I0309 09:57:12.778324 4792 scope.go:117] "RemoveContainer" containerID="b3862217b172bc4c13641b8d883d90cb2ec281f5f3306948d8d8a4428f85cec7" Mar 09 09:57:12 crc kubenswrapper[4792]: E0309 09:57:12.778957 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b3862217b172bc4c13641b8d883d90cb2ec281f5f3306948d8d8a4428f85cec7\": container with ID starting with b3862217b172bc4c13641b8d883d90cb2ec281f5f3306948d8d8a4428f85cec7 not found: ID does not exist" containerID="b3862217b172bc4c13641b8d883d90cb2ec281f5f3306948d8d8a4428f85cec7" Mar 09 09:57:12 crc kubenswrapper[4792]: I0309 09:57:12.779003 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b3862217b172bc4c13641b8d883d90cb2ec281f5f3306948d8d8a4428f85cec7"} err="failed to get container status \"b3862217b172bc4c13641b8d883d90cb2ec281f5f3306948d8d8a4428f85cec7\": rpc error: code = NotFound desc = could not find container \"b3862217b172bc4c13641b8d883d90cb2ec281f5f3306948d8d8a4428f85cec7\": container with ID starting with b3862217b172bc4c13641b8d883d90cb2ec281f5f3306948d8d8a4428f85cec7 not found: ID does not exist" Mar 09 09:57:12 crc kubenswrapper[4792]: I0309 09:57:12.779037 4792 scope.go:117] "RemoveContainer" containerID="696a050624dd73e4afa3cd13e12295f2d37a7fa85e9ebd399ecc352c3edbc50f" Mar 09 09:57:12 crc kubenswrapper[4792]: E0309 
09:57:12.779351 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"696a050624dd73e4afa3cd13e12295f2d37a7fa85e9ebd399ecc352c3edbc50f\": container with ID starting with 696a050624dd73e4afa3cd13e12295f2d37a7fa85e9ebd399ecc352c3edbc50f not found: ID does not exist" containerID="696a050624dd73e4afa3cd13e12295f2d37a7fa85e9ebd399ecc352c3edbc50f" Mar 09 09:57:12 crc kubenswrapper[4792]: I0309 09:57:12.779381 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"696a050624dd73e4afa3cd13e12295f2d37a7fa85e9ebd399ecc352c3edbc50f"} err="failed to get container status \"696a050624dd73e4afa3cd13e12295f2d37a7fa85e9ebd399ecc352c3edbc50f\": rpc error: code = NotFound desc = could not find container \"696a050624dd73e4afa3cd13e12295f2d37a7fa85e9ebd399ecc352c3edbc50f\": container with ID starting with 696a050624dd73e4afa3cd13e12295f2d37a7fa85e9ebd399ecc352c3edbc50f not found: ID does not exist" Mar 09 09:57:13 crc kubenswrapper[4792]: I0309 09:57:13.673624 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b2abf489-509d-489a-b7ee-3af10c7c5a4b" path="/var/lib/kubelet/pods/b2abf489-509d-489a-b7ee-3af10c7c5a4b/volumes" Mar 09 09:57:23 crc kubenswrapper[4792]: I0309 09:57:23.663310 4792 scope.go:117] "RemoveContainer" containerID="d764681645ab8670f7435c3d7eeda989bbb6c0f3c40948420b0a6a2fc3dd7e93" Mar 09 09:57:23 crc kubenswrapper[4792]: E0309 09:57:23.664203 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-97tth_openshift-machine-config-operator(bd11045a-d746-4b42-872c-8b8d1dd2d515)\"" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" Mar 09 09:57:34 crc kubenswrapper[4792]: I0309 09:57:34.837645 
4792 generic.go:334] "Generic (PLEG): container finished" podID="047ab0a5-633d-4731-a534-fd2db3b65b43" containerID="2d48236a18b39f4c2fcba615fb26f34b1b175f78da28445890bdbd3645476929" exitCode=0 Mar 09 09:57:34 crc kubenswrapper[4792]: I0309 09:57:34.837740 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-4sx54" event={"ID":"047ab0a5-633d-4731-a534-fd2db3b65b43","Type":"ContainerDied","Data":"2d48236a18b39f4c2fcba615fb26f34b1b175f78da28445890bdbd3645476929"} Mar 09 09:57:35 crc kubenswrapper[4792]: I0309 09:57:35.662915 4792 scope.go:117] "RemoveContainer" containerID="d764681645ab8670f7435c3d7eeda989bbb6c0f3c40948420b0a6a2fc3dd7e93" Mar 09 09:57:35 crc kubenswrapper[4792]: E0309 09:57:35.663210 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-97tth_openshift-machine-config-operator(bd11045a-d746-4b42-872c-8b8d1dd2d515)\"" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" Mar 09 09:57:36 crc kubenswrapper[4792]: I0309 09:57:36.329675 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-4sx54" Mar 09 09:57:36 crc kubenswrapper[4792]: I0309 09:57:36.460239 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/047ab0a5-633d-4731-a534-fd2db3b65b43-libvirt-secret-0\") pod \"047ab0a5-633d-4731-a534-fd2db3b65b43\" (UID: \"047ab0a5-633d-4731-a534-fd2db3b65b43\") " Mar 09 09:57:36 crc kubenswrapper[4792]: I0309 09:57:36.460619 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/047ab0a5-633d-4731-a534-fd2db3b65b43-libvirt-combined-ca-bundle\") pod \"047ab0a5-633d-4731-a534-fd2db3b65b43\" (UID: \"047ab0a5-633d-4731-a534-fd2db3b65b43\") " Mar 09 09:57:36 crc kubenswrapper[4792]: I0309 09:57:36.460686 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/047ab0a5-633d-4731-a534-fd2db3b65b43-ssh-key-openstack-edpm-ipam\") pod \"047ab0a5-633d-4731-a534-fd2db3b65b43\" (UID: \"047ab0a5-633d-4731-a534-fd2db3b65b43\") " Mar 09 09:57:36 crc kubenswrapper[4792]: I0309 09:57:36.460722 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/047ab0a5-633d-4731-a534-fd2db3b65b43-inventory\") pod \"047ab0a5-633d-4731-a534-fd2db3b65b43\" (UID: \"047ab0a5-633d-4731-a534-fd2db3b65b43\") " Mar 09 09:57:36 crc kubenswrapper[4792]: I0309 09:57:36.460775 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m56tr\" (UniqueName: \"kubernetes.io/projected/047ab0a5-633d-4731-a534-fd2db3b65b43-kube-api-access-m56tr\") pod \"047ab0a5-633d-4731-a534-fd2db3b65b43\" (UID: \"047ab0a5-633d-4731-a534-fd2db3b65b43\") " Mar 09 09:57:36 crc kubenswrapper[4792]: I0309 09:57:36.461521 4792 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/047ab0a5-633d-4731-a534-fd2db3b65b43-ceph\") pod \"047ab0a5-633d-4731-a534-fd2db3b65b43\" (UID: \"047ab0a5-633d-4731-a534-fd2db3b65b43\") " Mar 09 09:57:36 crc kubenswrapper[4792]: I0309 09:57:36.468044 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/047ab0a5-633d-4731-a534-fd2db3b65b43-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "047ab0a5-633d-4731-a534-fd2db3b65b43" (UID: "047ab0a5-633d-4731-a534-fd2db3b65b43"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:57:36 crc kubenswrapper[4792]: I0309 09:57:36.481313 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/047ab0a5-633d-4731-a534-fd2db3b65b43-ceph" (OuterVolumeSpecName: "ceph") pod "047ab0a5-633d-4731-a534-fd2db3b65b43" (UID: "047ab0a5-633d-4731-a534-fd2db3b65b43"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:57:36 crc kubenswrapper[4792]: I0309 09:57:36.485446 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/047ab0a5-633d-4731-a534-fd2db3b65b43-kube-api-access-m56tr" (OuterVolumeSpecName: "kube-api-access-m56tr") pod "047ab0a5-633d-4731-a534-fd2db3b65b43" (UID: "047ab0a5-633d-4731-a534-fd2db3b65b43"). InnerVolumeSpecName "kube-api-access-m56tr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:57:36 crc kubenswrapper[4792]: I0309 09:57:36.492473 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/047ab0a5-633d-4731-a534-fd2db3b65b43-inventory" (OuterVolumeSpecName: "inventory") pod "047ab0a5-633d-4731-a534-fd2db3b65b43" (UID: "047ab0a5-633d-4731-a534-fd2db3b65b43"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:57:36 crc kubenswrapper[4792]: I0309 09:57:36.494238 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/047ab0a5-633d-4731-a534-fd2db3b65b43-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "047ab0a5-633d-4731-a534-fd2db3b65b43" (UID: "047ab0a5-633d-4731-a534-fd2db3b65b43"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:57:36 crc kubenswrapper[4792]: I0309 09:57:36.494651 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/047ab0a5-633d-4731-a534-fd2db3b65b43-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "047ab0a5-633d-4731-a534-fd2db3b65b43" (UID: "047ab0a5-633d-4731-a534-fd2db3b65b43"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:57:36 crc kubenswrapper[4792]: I0309 09:57:36.564816 4792 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/047ab0a5-633d-4731-a534-fd2db3b65b43-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Mar 09 09:57:36 crc kubenswrapper[4792]: I0309 09:57:36.564885 4792 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/047ab0a5-633d-4731-a534-fd2db3b65b43-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 09:57:36 crc kubenswrapper[4792]: I0309 09:57:36.564900 4792 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/047ab0a5-633d-4731-a534-fd2db3b65b43-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 09 09:57:36 crc kubenswrapper[4792]: I0309 09:57:36.564912 4792 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/047ab0a5-633d-4731-a534-fd2db3b65b43-inventory\") on node \"crc\" DevicePath \"\"" Mar 09 09:57:36 crc kubenswrapper[4792]: I0309 09:57:36.564924 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m56tr\" (UniqueName: \"kubernetes.io/projected/047ab0a5-633d-4731-a534-fd2db3b65b43-kube-api-access-m56tr\") on node \"crc\" DevicePath \"\"" Mar 09 09:57:36 crc kubenswrapper[4792]: I0309 09:57:36.564934 4792 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/047ab0a5-633d-4731-a534-fd2db3b65b43-ceph\") on node \"crc\" DevicePath \"\"" Mar 09 09:57:36 crc kubenswrapper[4792]: I0309 09:57:36.856267 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-4sx54" event={"ID":"047ab0a5-633d-4731-a534-fd2db3b65b43","Type":"ContainerDied","Data":"6fe5c959e3c5d332c87bedfa42a63bcf093c7328d553eec7485f5b53b03b56d1"} Mar 09 09:57:36 crc kubenswrapper[4792]: I0309 09:57:36.856327 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6fe5c959e3c5d332c87bedfa42a63bcf093c7328d553eec7485f5b53b03b56d1" Mar 09 09:57:36 crc kubenswrapper[4792]: I0309 09:57:36.856366 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-4sx54" Mar 09 09:57:36 crc kubenswrapper[4792]: I0309 09:57:36.982034 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-fh9gr"] Mar 09 09:57:36 crc kubenswrapper[4792]: E0309 09:57:36.982478 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2abf489-509d-489a-b7ee-3af10c7c5a4b" containerName="extract-content" Mar 09 09:57:36 crc kubenswrapper[4792]: I0309 09:57:36.982501 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2abf489-509d-489a-b7ee-3af10c7c5a4b" containerName="extract-content" Mar 09 09:57:36 crc kubenswrapper[4792]: E0309 09:57:36.982524 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="047ab0a5-633d-4731-a534-fd2db3b65b43" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Mar 09 09:57:36 crc kubenswrapper[4792]: I0309 09:57:36.982534 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="047ab0a5-633d-4731-a534-fd2db3b65b43" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Mar 09 09:57:36 crc kubenswrapper[4792]: E0309 09:57:36.982565 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2abf489-509d-489a-b7ee-3af10c7c5a4b" containerName="extract-utilities" Mar 09 09:57:36 crc kubenswrapper[4792]: I0309 09:57:36.982575 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2abf489-509d-489a-b7ee-3af10c7c5a4b" containerName="extract-utilities" Mar 09 09:57:36 crc kubenswrapper[4792]: E0309 09:57:36.982593 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2abf489-509d-489a-b7ee-3af10c7c5a4b" containerName="registry-server" Mar 09 09:57:36 crc kubenswrapper[4792]: I0309 09:57:36.982600 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2abf489-509d-489a-b7ee-3af10c7c5a4b" containerName="registry-server" Mar 09 09:57:36 crc kubenswrapper[4792]: I0309 09:57:36.982946 4792 
memory_manager.go:354] "RemoveStaleState removing state" podUID="b2abf489-509d-489a-b7ee-3af10c7c5a4b" containerName="registry-server" Mar 09 09:57:36 crc kubenswrapper[4792]: I0309 09:57:36.982975 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="047ab0a5-633d-4731-a534-fd2db3b65b43" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Mar 09 09:57:36 crc kubenswrapper[4792]: I0309 09:57:36.983882 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-fh9gr" Mar 09 09:57:36 crc kubenswrapper[4792]: I0309 09:57:36.985871 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-4g5l6" Mar 09 09:57:36 crc kubenswrapper[4792]: I0309 09:57:36.986108 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Mar 09 09:57:36 crc kubenswrapper[4792]: I0309 09:57:36.988595 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ceph-nova" Mar 09 09:57:36 crc kubenswrapper[4792]: I0309 09:57:36.988635 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config" Mar 09 09:57:36 crc kubenswrapper[4792]: I0309 09:57:36.988607 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Mar 09 09:57:36 crc kubenswrapper[4792]: I0309 09:57:36.988776 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 09 09:57:36 crc kubenswrapper[4792]: I0309 09:57:36.989062 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 09 09:57:36 crc kubenswrapper[4792]: I0309 09:57:36.989454 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Mar 09 09:57:36 crc kubenswrapper[4792]: I0309 
09:57:36.989673 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 09 09:57:37 crc kubenswrapper[4792]: I0309 09:57:37.001818 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-fh9gr"] Mar 09 09:57:37 crc kubenswrapper[4792]: I0309 09:57:37.077215 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/50f74681-04e5-49c7-9d32-1e8841867bcb-nova-extra-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-fh9gr\" (UID: \"50f74681-04e5-49c7-9d32-1e8841867bcb\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-fh9gr" Mar 09 09:57:37 crc kubenswrapper[4792]: I0309 09:57:37.077270 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/50f74681-04e5-49c7-9d32-1e8841867bcb-ceph\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-fh9gr\" (UID: \"50f74681-04e5-49c7-9d32-1e8841867bcb\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-fh9gr" Mar 09 09:57:37 crc kubenswrapper[4792]: I0309 09:57:37.077415 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/50f74681-04e5-49c7-9d32-1e8841867bcb-nova-migration-ssh-key-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-fh9gr\" (UID: \"50f74681-04e5-49c7-9d32-1e8841867bcb\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-fh9gr" Mar 09 09:57:37 crc kubenswrapper[4792]: I0309 09:57:37.077498 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: 
\"kubernetes.io/secret/50f74681-04e5-49c7-9d32-1e8841867bcb-nova-cell1-compute-config-3\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-fh9gr\" (UID: \"50f74681-04e5-49c7-9d32-1e8841867bcb\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-fh9gr" Mar 09 09:57:37 crc kubenswrapper[4792]: I0309 09:57:37.077542 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/50f74681-04e5-49c7-9d32-1e8841867bcb-inventory\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-fh9gr\" (UID: \"50f74681-04e5-49c7-9d32-1e8841867bcb\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-fh9gr" Mar 09 09:57:37 crc kubenswrapper[4792]: I0309 09:57:37.077569 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/50f74681-04e5-49c7-9d32-1e8841867bcb-nova-cell1-compute-config-2\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-fh9gr\" (UID: \"50f74681-04e5-49c7-9d32-1e8841867bcb\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-fh9gr" Mar 09 09:57:37 crc kubenswrapper[4792]: I0309 09:57:37.077603 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2wklj\" (UniqueName: \"kubernetes.io/projected/50f74681-04e5-49c7-9d32-1e8841867bcb-kube-api-access-2wklj\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-fh9gr\" (UID: \"50f74681-04e5-49c7-9d32-1e8841867bcb\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-fh9gr" Mar 09 09:57:37 crc kubenswrapper[4792]: I0309 09:57:37.077644 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: 
\"kubernetes.io/secret/50f74681-04e5-49c7-9d32-1e8841867bcb-nova-cell1-compute-config-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-fh9gr\" (UID: \"50f74681-04e5-49c7-9d32-1e8841867bcb\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-fh9gr" Mar 09 09:57:37 crc kubenswrapper[4792]: I0309 09:57:37.077664 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph-nova-0\" (UniqueName: \"kubernetes.io/configmap/50f74681-04e5-49c7-9d32-1e8841867bcb-ceph-nova-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-fh9gr\" (UID: \"50f74681-04e5-49c7-9d32-1e8841867bcb\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-fh9gr" Mar 09 09:57:37 crc kubenswrapper[4792]: I0309 09:57:37.077713 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/50f74681-04e5-49c7-9d32-1e8841867bcb-nova-migration-ssh-key-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-fh9gr\" (UID: \"50f74681-04e5-49c7-9d32-1e8841867bcb\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-fh9gr" Mar 09 09:57:37 crc kubenswrapper[4792]: I0309 09:57:37.077735 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/50f74681-04e5-49c7-9d32-1e8841867bcb-nova-cell1-compute-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-fh9gr\" (UID: \"50f74681-04e5-49c7-9d32-1e8841867bcb\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-fh9gr" Mar 09 09:57:37 crc kubenswrapper[4792]: I0309 09:57:37.077764 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/50f74681-04e5-49c7-9d32-1e8841867bcb-ssh-key-openstack-edpm-ipam\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-fh9gr\" (UID: \"50f74681-04e5-49c7-9d32-1e8841867bcb\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-fh9gr" Mar 09 09:57:37 crc kubenswrapper[4792]: I0309 09:57:37.077784 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-custom-ceph-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50f74681-04e5-49c7-9d32-1e8841867bcb-nova-custom-ceph-combined-ca-bundle\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-fh9gr\" (UID: \"50f74681-04e5-49c7-9d32-1e8841867bcb\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-fh9gr" Mar 09 09:57:37 crc kubenswrapper[4792]: I0309 09:57:37.179218 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph-nova-0\" (UniqueName: \"kubernetes.io/configmap/50f74681-04e5-49c7-9d32-1e8841867bcb-ceph-nova-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-fh9gr\" (UID: \"50f74681-04e5-49c7-9d32-1e8841867bcb\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-fh9gr" Mar 09 09:57:37 crc kubenswrapper[4792]: I0309 09:57:37.179302 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/50f74681-04e5-49c7-9d32-1e8841867bcb-nova-migration-ssh-key-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-fh9gr\" (UID: \"50f74681-04e5-49c7-9d32-1e8841867bcb\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-fh9gr" Mar 09 09:57:37 crc kubenswrapper[4792]: I0309 09:57:37.179325 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/50f74681-04e5-49c7-9d32-1e8841867bcb-nova-cell1-compute-config-0\") pod 
\"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-fh9gr\" (UID: \"50f74681-04e5-49c7-9d32-1e8841867bcb\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-fh9gr" Mar 09 09:57:37 crc kubenswrapper[4792]: I0309 09:57:37.179354 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/50f74681-04e5-49c7-9d32-1e8841867bcb-ssh-key-openstack-edpm-ipam\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-fh9gr\" (UID: \"50f74681-04e5-49c7-9d32-1e8841867bcb\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-fh9gr" Mar 09 09:57:37 crc kubenswrapper[4792]: I0309 09:57:37.179377 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-custom-ceph-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50f74681-04e5-49c7-9d32-1e8841867bcb-nova-custom-ceph-combined-ca-bundle\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-fh9gr\" (UID: \"50f74681-04e5-49c7-9d32-1e8841867bcb\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-fh9gr" Mar 09 09:57:37 crc kubenswrapper[4792]: I0309 09:57:37.179398 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/50f74681-04e5-49c7-9d32-1e8841867bcb-nova-extra-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-fh9gr\" (UID: \"50f74681-04e5-49c7-9d32-1e8841867bcb\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-fh9gr" Mar 09 09:57:37 crc kubenswrapper[4792]: I0309 09:57:37.179420 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/50f74681-04e5-49c7-9d32-1e8841867bcb-ceph\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-fh9gr\" (UID: \"50f74681-04e5-49c7-9d32-1e8841867bcb\") " 
pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-fh9gr" Mar 09 09:57:37 crc kubenswrapper[4792]: I0309 09:57:37.179456 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/50f74681-04e5-49c7-9d32-1e8841867bcb-nova-migration-ssh-key-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-fh9gr\" (UID: \"50f74681-04e5-49c7-9d32-1e8841867bcb\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-fh9gr" Mar 09 09:57:37 crc kubenswrapper[4792]: I0309 09:57:37.179487 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/50f74681-04e5-49c7-9d32-1e8841867bcb-nova-cell1-compute-config-3\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-fh9gr\" (UID: \"50f74681-04e5-49c7-9d32-1e8841867bcb\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-fh9gr" Mar 09 09:57:37 crc kubenswrapper[4792]: I0309 09:57:37.179514 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/50f74681-04e5-49c7-9d32-1e8841867bcb-inventory\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-fh9gr\" (UID: \"50f74681-04e5-49c7-9d32-1e8841867bcb\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-fh9gr" Mar 09 09:57:37 crc kubenswrapper[4792]: I0309 09:57:37.179531 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/50f74681-04e5-49c7-9d32-1e8841867bcb-nova-cell1-compute-config-2\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-fh9gr\" (UID: \"50f74681-04e5-49c7-9d32-1e8841867bcb\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-fh9gr" Mar 09 09:57:37 crc kubenswrapper[4792]: I0309 09:57:37.179555 4792 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2wklj\" (UniqueName: \"kubernetes.io/projected/50f74681-04e5-49c7-9d32-1e8841867bcb-kube-api-access-2wklj\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-fh9gr\" (UID: \"50f74681-04e5-49c7-9d32-1e8841867bcb\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-fh9gr" Mar 09 09:57:37 crc kubenswrapper[4792]: I0309 09:57:37.179581 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/50f74681-04e5-49c7-9d32-1e8841867bcb-nova-cell1-compute-config-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-fh9gr\" (UID: \"50f74681-04e5-49c7-9d32-1e8841867bcb\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-fh9gr" Mar 09 09:57:37 crc kubenswrapper[4792]: I0309 09:57:37.180860 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/50f74681-04e5-49c7-9d32-1e8841867bcb-nova-extra-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-fh9gr\" (UID: \"50f74681-04e5-49c7-9d32-1e8841867bcb\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-fh9gr" Mar 09 09:57:37 crc kubenswrapper[4792]: I0309 09:57:37.181662 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph-nova-0\" (UniqueName: \"kubernetes.io/configmap/50f74681-04e5-49c7-9d32-1e8841867bcb-ceph-nova-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-fh9gr\" (UID: \"50f74681-04e5-49c7-9d32-1e8841867bcb\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-fh9gr" Mar 09 09:57:37 crc kubenswrapper[4792]: I0309 09:57:37.184417 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: 
\"kubernetes.io/secret/50f74681-04e5-49c7-9d32-1e8841867bcb-nova-cell1-compute-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-fh9gr\" (UID: \"50f74681-04e5-49c7-9d32-1e8841867bcb\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-fh9gr" Mar 09 09:57:37 crc kubenswrapper[4792]: I0309 09:57:37.185727 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-custom-ceph-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50f74681-04e5-49c7-9d32-1e8841867bcb-nova-custom-ceph-combined-ca-bundle\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-fh9gr\" (UID: \"50f74681-04e5-49c7-9d32-1e8841867bcb\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-fh9gr" Mar 09 09:57:37 crc kubenswrapper[4792]: I0309 09:57:37.186513 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/50f74681-04e5-49c7-9d32-1e8841867bcb-nova-cell1-compute-config-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-fh9gr\" (UID: \"50f74681-04e5-49c7-9d32-1e8841867bcb\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-fh9gr" Mar 09 09:57:37 crc kubenswrapper[4792]: I0309 09:57:37.186723 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/50f74681-04e5-49c7-9d32-1e8841867bcb-inventory\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-fh9gr\" (UID: \"50f74681-04e5-49c7-9d32-1e8841867bcb\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-fh9gr" Mar 09 09:57:37 crc kubenswrapper[4792]: I0309 09:57:37.187592 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/50f74681-04e5-49c7-9d32-1e8841867bcb-ssh-key-openstack-edpm-ipam\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-fh9gr\" (UID: 
\"50f74681-04e5-49c7-9d32-1e8841867bcb\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-fh9gr" Mar 09 09:57:37 crc kubenswrapper[4792]: I0309 09:57:37.188007 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/50f74681-04e5-49c7-9d32-1e8841867bcb-nova-migration-ssh-key-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-fh9gr\" (UID: \"50f74681-04e5-49c7-9d32-1e8841867bcb\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-fh9gr" Mar 09 09:57:37 crc kubenswrapper[4792]: I0309 09:57:37.189020 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/50f74681-04e5-49c7-9d32-1e8841867bcb-nova-cell1-compute-config-2\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-fh9gr\" (UID: \"50f74681-04e5-49c7-9d32-1e8841867bcb\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-fh9gr" Mar 09 09:57:37 crc kubenswrapper[4792]: I0309 09:57:37.190033 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/50f74681-04e5-49c7-9d32-1e8841867bcb-nova-cell1-compute-config-3\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-fh9gr\" (UID: \"50f74681-04e5-49c7-9d32-1e8841867bcb\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-fh9gr" Mar 09 09:57:37 crc kubenswrapper[4792]: I0309 09:57:37.198328 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/50f74681-04e5-49c7-9d32-1e8841867bcb-nova-migration-ssh-key-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-fh9gr\" (UID: \"50f74681-04e5-49c7-9d32-1e8841867bcb\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-fh9gr" Mar 09 09:57:37 crc 
kubenswrapper[4792]: I0309 09:57:37.201544 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/50f74681-04e5-49c7-9d32-1e8841867bcb-ceph\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-fh9gr\" (UID: \"50f74681-04e5-49c7-9d32-1e8841867bcb\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-fh9gr" Mar 09 09:57:37 crc kubenswrapper[4792]: I0309 09:57:37.201666 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2wklj\" (UniqueName: \"kubernetes.io/projected/50f74681-04e5-49c7-9d32-1e8841867bcb-kube-api-access-2wklj\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-fh9gr\" (UID: \"50f74681-04e5-49c7-9d32-1e8841867bcb\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-fh9gr" Mar 09 09:57:37 crc kubenswrapper[4792]: I0309 09:57:37.326093 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-fh9gr" Mar 09 09:57:37 crc kubenswrapper[4792]: I0309 09:57:37.910425 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-fh9gr"] Mar 09 09:57:38 crc kubenswrapper[4792]: I0309 09:57:38.872808 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-fh9gr" event={"ID":"50f74681-04e5-49c7-9d32-1e8841867bcb","Type":"ContainerStarted","Data":"8c860e3ec3d6342000310cb760f1cde91501a86fac3e2264c8fb6e3cb497006e"} Mar 09 09:57:39 crc kubenswrapper[4792]: I0309 09:57:39.882447 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-fh9gr" event={"ID":"50f74681-04e5-49c7-9d32-1e8841867bcb","Type":"ContainerStarted","Data":"55cc7790032fd61945c3d03eb102a03037f3250f1410dd91c9b62ba9cd932fae"} Mar 09 09:57:39 crc kubenswrapper[4792]: I0309 
09:57:39.912505 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-fh9gr" podStartSLOduration=3.204662377 podStartE2EDuration="3.912484245s" podCreationTimestamp="2026-03-09 09:57:36 +0000 UTC" firstStartedPulling="2026-03-09 09:57:37.915152306 +0000 UTC m=+3022.945353058" lastFinishedPulling="2026-03-09 09:57:38.622974174 +0000 UTC m=+3023.653174926" observedRunningTime="2026-03-09 09:57:39.909399892 +0000 UTC m=+3024.939600644" watchObservedRunningTime="2026-03-09 09:57:39.912484245 +0000 UTC m=+3024.942684997" Mar 09 09:57:48 crc kubenswrapper[4792]: I0309 09:57:48.662747 4792 scope.go:117] "RemoveContainer" containerID="d764681645ab8670f7435c3d7eeda989bbb6c0f3c40948420b0a6a2fc3dd7e93" Mar 09 09:57:48 crc kubenswrapper[4792]: E0309 09:57:48.663498 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-97tth_openshift-machine-config-operator(bd11045a-d746-4b42-872c-8b8d1dd2d515)\"" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" Mar 09 09:58:00 crc kubenswrapper[4792]: I0309 09:58:00.146141 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550838-p4hmx"] Mar 09 09:58:00 crc kubenswrapper[4792]: I0309 09:58:00.148207 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550838-p4hmx" Mar 09 09:58:00 crc kubenswrapper[4792]: I0309 09:58:00.153272 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 09:58:00 crc kubenswrapper[4792]: I0309 09:58:00.153703 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 09:58:00 crc kubenswrapper[4792]: I0309 09:58:00.154328 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fwclj" Mar 09 09:58:00 crc kubenswrapper[4792]: I0309 09:58:00.154738 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550838-p4hmx"] Mar 09 09:58:00 crc kubenswrapper[4792]: I0309 09:58:00.227615 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-trcqk\" (UniqueName: \"kubernetes.io/projected/32655fe9-88fd-433c-bdc9-21077ef6d0e2-kube-api-access-trcqk\") pod \"auto-csr-approver-29550838-p4hmx\" (UID: \"32655fe9-88fd-433c-bdc9-21077ef6d0e2\") " pod="openshift-infra/auto-csr-approver-29550838-p4hmx" Mar 09 09:58:00 crc kubenswrapper[4792]: I0309 09:58:00.329977 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-trcqk\" (UniqueName: \"kubernetes.io/projected/32655fe9-88fd-433c-bdc9-21077ef6d0e2-kube-api-access-trcqk\") pod \"auto-csr-approver-29550838-p4hmx\" (UID: \"32655fe9-88fd-433c-bdc9-21077ef6d0e2\") " pod="openshift-infra/auto-csr-approver-29550838-p4hmx" Mar 09 09:58:00 crc kubenswrapper[4792]: I0309 09:58:00.353240 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-trcqk\" (UniqueName: \"kubernetes.io/projected/32655fe9-88fd-433c-bdc9-21077ef6d0e2-kube-api-access-trcqk\") pod \"auto-csr-approver-29550838-p4hmx\" (UID: \"32655fe9-88fd-433c-bdc9-21077ef6d0e2\") " 
pod="openshift-infra/auto-csr-approver-29550838-p4hmx" Mar 09 09:58:00 crc kubenswrapper[4792]: I0309 09:58:00.467457 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550838-p4hmx" Mar 09 09:58:00 crc kubenswrapper[4792]: I0309 09:58:00.663399 4792 scope.go:117] "RemoveContainer" containerID="d764681645ab8670f7435c3d7eeda989bbb6c0f3c40948420b0a6a2fc3dd7e93" Mar 09 09:58:00 crc kubenswrapper[4792]: E0309 09:58:00.663966 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-97tth_openshift-machine-config-operator(bd11045a-d746-4b42-872c-8b8d1dd2d515)\"" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" Mar 09 09:58:00 crc kubenswrapper[4792]: I0309 09:58:00.931060 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550838-p4hmx"] Mar 09 09:58:00 crc kubenswrapper[4792]: I0309 09:58:00.937229 4792 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 09 09:58:01 crc kubenswrapper[4792]: I0309 09:58:01.047140 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550838-p4hmx" event={"ID":"32655fe9-88fd-433c-bdc9-21077ef6d0e2","Type":"ContainerStarted","Data":"72bb79be339de1fe4f3469944031c1f0c417211e2a1366046f3788b016786682"} Mar 09 09:58:03 crc kubenswrapper[4792]: I0309 09:58:03.070906 4792 generic.go:334] "Generic (PLEG): container finished" podID="32655fe9-88fd-433c-bdc9-21077ef6d0e2" containerID="5f4cd8094a9485ff642ff92b857161b5eebd3768a14918b766ee958496460106" exitCode=0 Mar 09 09:58:03 crc kubenswrapper[4792]: I0309 09:58:03.071023 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-infra/auto-csr-approver-29550838-p4hmx" event={"ID":"32655fe9-88fd-433c-bdc9-21077ef6d0e2","Type":"ContainerDied","Data":"5f4cd8094a9485ff642ff92b857161b5eebd3768a14918b766ee958496460106"} Mar 09 09:58:04 crc kubenswrapper[4792]: I0309 09:58:04.434559 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550838-p4hmx" Mar 09 09:58:04 crc kubenswrapper[4792]: I0309 09:58:04.511358 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-trcqk\" (UniqueName: \"kubernetes.io/projected/32655fe9-88fd-433c-bdc9-21077ef6d0e2-kube-api-access-trcqk\") pod \"32655fe9-88fd-433c-bdc9-21077ef6d0e2\" (UID: \"32655fe9-88fd-433c-bdc9-21077ef6d0e2\") " Mar 09 09:58:04 crc kubenswrapper[4792]: I0309 09:58:04.519452 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32655fe9-88fd-433c-bdc9-21077ef6d0e2-kube-api-access-trcqk" (OuterVolumeSpecName: "kube-api-access-trcqk") pod "32655fe9-88fd-433c-bdc9-21077ef6d0e2" (UID: "32655fe9-88fd-433c-bdc9-21077ef6d0e2"). InnerVolumeSpecName "kube-api-access-trcqk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:58:04 crc kubenswrapper[4792]: I0309 09:58:04.614342 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-trcqk\" (UniqueName: \"kubernetes.io/projected/32655fe9-88fd-433c-bdc9-21077ef6d0e2-kube-api-access-trcqk\") on node \"crc\" DevicePath \"\"" Mar 09 09:58:05 crc kubenswrapper[4792]: I0309 09:58:05.088229 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550838-p4hmx" event={"ID":"32655fe9-88fd-433c-bdc9-21077ef6d0e2","Type":"ContainerDied","Data":"72bb79be339de1fe4f3469944031c1f0c417211e2a1366046f3788b016786682"} Mar 09 09:58:05 crc kubenswrapper[4792]: I0309 09:58:05.088844 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="72bb79be339de1fe4f3469944031c1f0c417211e2a1366046f3788b016786682" Mar 09 09:58:05 crc kubenswrapper[4792]: I0309 09:58:05.088312 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550838-p4hmx" Mar 09 09:58:05 crc kubenswrapper[4792]: I0309 09:58:05.507785 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550832-xbtmw"] Mar 09 09:58:05 crc kubenswrapper[4792]: I0309 09:58:05.515769 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550832-xbtmw"] Mar 09 09:58:05 crc kubenswrapper[4792]: I0309 09:58:05.673243 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="268467a6-7249-4a89-ba21-f60f71ec2336" path="/var/lib/kubelet/pods/268467a6-7249-4a89-ba21-f60f71ec2336/volumes" Mar 09 09:58:13 crc kubenswrapper[4792]: I0309 09:58:13.663170 4792 scope.go:117] "RemoveContainer" containerID="d764681645ab8670f7435c3d7eeda989bbb6c0f3c40948420b0a6a2fc3dd7e93" Mar 09 09:58:13 crc kubenswrapper[4792]: E0309 09:58:13.664101 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-97tth_openshift-machine-config-operator(bd11045a-d746-4b42-872c-8b8d1dd2d515)\"" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" Mar 09 09:58:25 crc kubenswrapper[4792]: I0309 09:58:25.670517 4792 scope.go:117] "RemoveContainer" containerID="d764681645ab8670f7435c3d7eeda989bbb6c0f3c40948420b0a6a2fc3dd7e93" Mar 09 09:58:25 crc kubenswrapper[4792]: E0309 09:58:25.671730 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-97tth_openshift-machine-config-operator(bd11045a-d746-4b42-872c-8b8d1dd2d515)\"" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" Mar 09 09:58:40 crc kubenswrapper[4792]: I0309 09:58:40.335687 4792 scope.go:117] "RemoveContainer" containerID="18df2e4013f6415c9778a13958cdced71b608bf187983fd5da52144d930f82de" Mar 09 09:58:40 crc kubenswrapper[4792]: I0309 09:58:40.662800 4792 scope.go:117] "RemoveContainer" containerID="d764681645ab8670f7435c3d7eeda989bbb6c0f3c40948420b0a6a2fc3dd7e93" Mar 09 09:58:40 crc kubenswrapper[4792]: E0309 09:58:40.663433 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-97tth_openshift-machine-config-operator(bd11045a-d746-4b42-872c-8b8d1dd2d515)\"" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" Mar 09 09:58:53 crc kubenswrapper[4792]: I0309 09:58:53.662645 4792 scope.go:117] "RemoveContainer" 
containerID="d764681645ab8670f7435c3d7eeda989bbb6c0f3c40948420b0a6a2fc3dd7e93" Mar 09 09:58:53 crc kubenswrapper[4792]: E0309 09:58:53.663486 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-97tth_openshift-machine-config-operator(bd11045a-d746-4b42-872c-8b8d1dd2d515)\"" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" Mar 09 09:59:04 crc kubenswrapper[4792]: I0309 09:59:04.663910 4792 scope.go:117] "RemoveContainer" containerID="d764681645ab8670f7435c3d7eeda989bbb6c0f3c40948420b0a6a2fc3dd7e93" Mar 09 09:59:04 crc kubenswrapper[4792]: E0309 09:59:04.664613 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-97tth_openshift-machine-config-operator(bd11045a-d746-4b42-872c-8b8d1dd2d515)\"" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" Mar 09 09:59:19 crc kubenswrapper[4792]: I0309 09:59:19.663487 4792 scope.go:117] "RemoveContainer" containerID="d764681645ab8670f7435c3d7eeda989bbb6c0f3c40948420b0a6a2fc3dd7e93" Mar 09 09:59:19 crc kubenswrapper[4792]: E0309 09:59:19.664649 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-97tth_openshift-machine-config-operator(bd11045a-d746-4b42-872c-8b8d1dd2d515)\"" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" Mar 09 09:59:34 crc kubenswrapper[4792]: I0309 09:59:34.662895 4792 scope.go:117] 
"RemoveContainer" containerID="d764681645ab8670f7435c3d7eeda989bbb6c0f3c40948420b0a6a2fc3dd7e93" Mar 09 09:59:34 crc kubenswrapper[4792]: E0309 09:59:34.663557 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-97tth_openshift-machine-config-operator(bd11045a-d746-4b42-872c-8b8d1dd2d515)\"" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" Mar 09 09:59:48 crc kubenswrapper[4792]: I0309 09:59:48.663189 4792 scope.go:117] "RemoveContainer" containerID="d764681645ab8670f7435c3d7eeda989bbb6c0f3c40948420b0a6a2fc3dd7e93" Mar 09 09:59:48 crc kubenswrapper[4792]: E0309 09:59:48.663962 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-97tth_openshift-machine-config-operator(bd11045a-d746-4b42-872c-8b8d1dd2d515)\"" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" Mar 09 09:59:59 crc kubenswrapper[4792]: I0309 09:59:59.662394 4792 scope.go:117] "RemoveContainer" containerID="d764681645ab8670f7435c3d7eeda989bbb6c0f3c40948420b0a6a2fc3dd7e93" Mar 09 09:59:59 crc kubenswrapper[4792]: E0309 09:59:59.663231 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-97tth_openshift-machine-config-operator(bd11045a-d746-4b42-872c-8b8d1dd2d515)\"" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" Mar 09 10:00:00 crc kubenswrapper[4792]: I0309 10:00:00.147470 
4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550840-7q7wm"] Mar 09 10:00:00 crc kubenswrapper[4792]: E0309 10:00:00.148103 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32655fe9-88fd-433c-bdc9-21077ef6d0e2" containerName="oc" Mar 09 10:00:00 crc kubenswrapper[4792]: I0309 10:00:00.148124 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="32655fe9-88fd-433c-bdc9-21077ef6d0e2" containerName="oc" Mar 09 10:00:00 crc kubenswrapper[4792]: I0309 10:00:00.148364 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="32655fe9-88fd-433c-bdc9-21077ef6d0e2" containerName="oc" Mar 09 10:00:00 crc kubenswrapper[4792]: I0309 10:00:00.149112 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550840-7q7wm" Mar 09 10:00:00 crc kubenswrapper[4792]: I0309 10:00:00.152271 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 10:00:00 crc kubenswrapper[4792]: I0309 10:00:00.156140 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fwclj" Mar 09 10:00:00 crc kubenswrapper[4792]: I0309 10:00:00.156608 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 10:00:00 crc kubenswrapper[4792]: I0309 10:00:00.159090 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550840-7q7wm"] Mar 09 10:00:00 crc kubenswrapper[4792]: I0309 10:00:00.258315 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29550840-m88sj"] Mar 09 10:00:00 crc kubenswrapper[4792]: I0309 10:00:00.258469 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9lz67\" (UniqueName: 
\"kubernetes.io/projected/edc07ac6-aa9a-477f-a990-62d1cbb5b7d1-kube-api-access-9lz67\") pod \"auto-csr-approver-29550840-7q7wm\" (UID: \"edc07ac6-aa9a-477f-a990-62d1cbb5b7d1\") " pod="openshift-infra/auto-csr-approver-29550840-7q7wm" Mar 09 10:00:00 crc kubenswrapper[4792]: I0309 10:00:00.260099 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29550840-m88sj" Mar 09 10:00:00 crc kubenswrapper[4792]: I0309 10:00:00.265408 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 09 10:00:00 crc kubenswrapper[4792]: I0309 10:00:00.268491 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29550840-m88sj"] Mar 09 10:00:00 crc kubenswrapper[4792]: I0309 10:00:00.273678 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 09 10:00:00 crc kubenswrapper[4792]: I0309 10:00:00.360297 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fd250617-15c5-4d27-8f0d-a23f7ca702c9-secret-volume\") pod \"collect-profiles-29550840-m88sj\" (UID: \"fd250617-15c5-4d27-8f0d-a23f7ca702c9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550840-m88sj" Mar 09 10:00:00 crc kubenswrapper[4792]: I0309 10:00:00.360912 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fd250617-15c5-4d27-8f0d-a23f7ca702c9-config-volume\") pod \"collect-profiles-29550840-m88sj\" (UID: \"fd250617-15c5-4d27-8f0d-a23f7ca702c9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550840-m88sj" Mar 09 10:00:00 crc kubenswrapper[4792]: I0309 10:00:00.361063 4792 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f8bpt\" (UniqueName: \"kubernetes.io/projected/fd250617-15c5-4d27-8f0d-a23f7ca702c9-kube-api-access-f8bpt\") pod \"collect-profiles-29550840-m88sj\" (UID: \"fd250617-15c5-4d27-8f0d-a23f7ca702c9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550840-m88sj" Mar 09 10:00:00 crc kubenswrapper[4792]: I0309 10:00:00.361225 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9lz67\" (UniqueName: \"kubernetes.io/projected/edc07ac6-aa9a-477f-a990-62d1cbb5b7d1-kube-api-access-9lz67\") pod \"auto-csr-approver-29550840-7q7wm\" (UID: \"edc07ac6-aa9a-477f-a990-62d1cbb5b7d1\") " pod="openshift-infra/auto-csr-approver-29550840-7q7wm" Mar 09 10:00:00 crc kubenswrapper[4792]: I0309 10:00:00.382709 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9lz67\" (UniqueName: \"kubernetes.io/projected/edc07ac6-aa9a-477f-a990-62d1cbb5b7d1-kube-api-access-9lz67\") pod \"auto-csr-approver-29550840-7q7wm\" (UID: \"edc07ac6-aa9a-477f-a990-62d1cbb5b7d1\") " pod="openshift-infra/auto-csr-approver-29550840-7q7wm" Mar 09 10:00:00 crc kubenswrapper[4792]: I0309 10:00:00.463584 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fd250617-15c5-4d27-8f0d-a23f7ca702c9-secret-volume\") pod \"collect-profiles-29550840-m88sj\" (UID: \"fd250617-15c5-4d27-8f0d-a23f7ca702c9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550840-m88sj" Mar 09 10:00:00 crc kubenswrapper[4792]: I0309 10:00:00.463674 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fd250617-15c5-4d27-8f0d-a23f7ca702c9-config-volume\") pod \"collect-profiles-29550840-m88sj\" (UID: \"fd250617-15c5-4d27-8f0d-a23f7ca702c9\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29550840-m88sj" Mar 09 10:00:00 crc kubenswrapper[4792]: I0309 10:00:00.463721 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f8bpt\" (UniqueName: \"kubernetes.io/projected/fd250617-15c5-4d27-8f0d-a23f7ca702c9-kube-api-access-f8bpt\") pod \"collect-profiles-29550840-m88sj\" (UID: \"fd250617-15c5-4d27-8f0d-a23f7ca702c9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550840-m88sj" Mar 09 10:00:00 crc kubenswrapper[4792]: I0309 10:00:00.464794 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fd250617-15c5-4d27-8f0d-a23f7ca702c9-config-volume\") pod \"collect-profiles-29550840-m88sj\" (UID: \"fd250617-15c5-4d27-8f0d-a23f7ca702c9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550840-m88sj" Mar 09 10:00:00 crc kubenswrapper[4792]: I0309 10:00:00.468723 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fd250617-15c5-4d27-8f0d-a23f7ca702c9-secret-volume\") pod \"collect-profiles-29550840-m88sj\" (UID: \"fd250617-15c5-4d27-8f0d-a23f7ca702c9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550840-m88sj" Mar 09 10:00:00 crc kubenswrapper[4792]: I0309 10:00:00.479859 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550840-7q7wm" Mar 09 10:00:00 crc kubenswrapper[4792]: I0309 10:00:00.495607 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f8bpt\" (UniqueName: \"kubernetes.io/projected/fd250617-15c5-4d27-8f0d-a23f7ca702c9-kube-api-access-f8bpt\") pod \"collect-profiles-29550840-m88sj\" (UID: \"fd250617-15c5-4d27-8f0d-a23f7ca702c9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550840-m88sj" Mar 09 10:00:00 crc kubenswrapper[4792]: I0309 10:00:00.582331 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29550840-m88sj" Mar 09 10:00:01 crc kubenswrapper[4792]: I0309 10:00:00.997548 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550840-7q7wm"] Mar 09 10:00:01 crc kubenswrapper[4792]: I0309 10:00:01.076167 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550840-7q7wm" event={"ID":"edc07ac6-aa9a-477f-a990-62d1cbb5b7d1","Type":"ContainerStarted","Data":"b97c480456d048e6004b280371ff13454a90da2d3b83fe8866c56db62ef169ea"} Mar 09 10:00:01 crc kubenswrapper[4792]: I0309 10:00:01.122696 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29550840-m88sj"] Mar 09 10:00:01 crc kubenswrapper[4792]: W0309 10:00:01.130520 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfd250617_15c5_4d27_8f0d_a23f7ca702c9.slice/crio-c5ef022a0600b85d824b2cacccebbebd5a82f67b5576a207f19252984a48ab67 WatchSource:0}: Error finding container c5ef022a0600b85d824b2cacccebbebd5a82f67b5576a207f19252984a48ab67: Status 404 returned error can't find the container with id c5ef022a0600b85d824b2cacccebbebd5a82f67b5576a207f19252984a48ab67 Mar 09 10:00:02 crc kubenswrapper[4792]: 
I0309 10:00:02.086333 4792 generic.go:334] "Generic (PLEG): container finished" podID="fd250617-15c5-4d27-8f0d-a23f7ca702c9" containerID="854cbd63308bfa397eb46b6d929cd9aaca5fc9e28c84a88e72391f9a6870f812" exitCode=0 Mar 09 10:00:02 crc kubenswrapper[4792]: I0309 10:00:02.086432 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29550840-m88sj" event={"ID":"fd250617-15c5-4d27-8f0d-a23f7ca702c9","Type":"ContainerDied","Data":"854cbd63308bfa397eb46b6d929cd9aaca5fc9e28c84a88e72391f9a6870f812"} Mar 09 10:00:02 crc kubenswrapper[4792]: I0309 10:00:02.086961 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29550840-m88sj" event={"ID":"fd250617-15c5-4d27-8f0d-a23f7ca702c9","Type":"ContainerStarted","Data":"c5ef022a0600b85d824b2cacccebbebd5a82f67b5576a207f19252984a48ab67"} Mar 09 10:00:03 crc kubenswrapper[4792]: I0309 10:00:03.460020 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29550840-m88sj" Mar 09 10:00:03 crc kubenswrapper[4792]: I0309 10:00:03.622135 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fd250617-15c5-4d27-8f0d-a23f7ca702c9-secret-volume\") pod \"fd250617-15c5-4d27-8f0d-a23f7ca702c9\" (UID: \"fd250617-15c5-4d27-8f0d-a23f7ca702c9\") " Mar 09 10:00:03 crc kubenswrapper[4792]: I0309 10:00:03.622230 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fd250617-15c5-4d27-8f0d-a23f7ca702c9-config-volume\") pod \"fd250617-15c5-4d27-8f0d-a23f7ca702c9\" (UID: \"fd250617-15c5-4d27-8f0d-a23f7ca702c9\") " Mar 09 10:00:03 crc kubenswrapper[4792]: I0309 10:00:03.622305 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f8bpt\" (UniqueName: \"kubernetes.io/projected/fd250617-15c5-4d27-8f0d-a23f7ca702c9-kube-api-access-f8bpt\") pod \"fd250617-15c5-4d27-8f0d-a23f7ca702c9\" (UID: \"fd250617-15c5-4d27-8f0d-a23f7ca702c9\") " Mar 09 10:00:03 crc kubenswrapper[4792]: I0309 10:00:03.623923 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fd250617-15c5-4d27-8f0d-a23f7ca702c9-config-volume" (OuterVolumeSpecName: "config-volume") pod "fd250617-15c5-4d27-8f0d-a23f7ca702c9" (UID: "fd250617-15c5-4d27-8f0d-a23f7ca702c9"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 10:00:03 crc kubenswrapper[4792]: I0309 10:00:03.628959 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd250617-15c5-4d27-8f0d-a23f7ca702c9-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "fd250617-15c5-4d27-8f0d-a23f7ca702c9" (UID: "fd250617-15c5-4d27-8f0d-a23f7ca702c9"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 10:00:03 crc kubenswrapper[4792]: I0309 10:00:03.630248 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd250617-15c5-4d27-8f0d-a23f7ca702c9-kube-api-access-f8bpt" (OuterVolumeSpecName: "kube-api-access-f8bpt") pod "fd250617-15c5-4d27-8f0d-a23f7ca702c9" (UID: "fd250617-15c5-4d27-8f0d-a23f7ca702c9"). InnerVolumeSpecName "kube-api-access-f8bpt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 10:00:03 crc kubenswrapper[4792]: I0309 10:00:03.724430 4792 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fd250617-15c5-4d27-8f0d-a23f7ca702c9-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 09 10:00:03 crc kubenswrapper[4792]: I0309 10:00:03.724747 4792 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fd250617-15c5-4d27-8f0d-a23f7ca702c9-config-volume\") on node \"crc\" DevicePath \"\"" Mar 09 10:00:03 crc kubenswrapper[4792]: I0309 10:00:03.724758 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f8bpt\" (UniqueName: \"kubernetes.io/projected/fd250617-15c5-4d27-8f0d-a23f7ca702c9-kube-api-access-f8bpt\") on node \"crc\" DevicePath \"\"" Mar 09 10:00:04 crc kubenswrapper[4792]: I0309 10:00:04.102573 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29550840-m88sj" event={"ID":"fd250617-15c5-4d27-8f0d-a23f7ca702c9","Type":"ContainerDied","Data":"c5ef022a0600b85d824b2cacccebbebd5a82f67b5576a207f19252984a48ab67"} Mar 09 10:00:04 crc kubenswrapper[4792]: I0309 10:00:04.102612 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c5ef022a0600b85d824b2cacccebbebd5a82f67b5576a207f19252984a48ab67" Mar 09 10:00:04 crc kubenswrapper[4792]: I0309 10:00:04.102638 4792 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29550840-m88sj" Mar 09 10:00:04 crc kubenswrapper[4792]: I0309 10:00:04.553532 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29550795-8jtvz"] Mar 09 10:00:04 crc kubenswrapper[4792]: I0309 10:00:04.565206 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29550795-8jtvz"] Mar 09 10:00:05 crc kubenswrapper[4792]: I0309 10:00:05.113039 4792 generic.go:334] "Generic (PLEG): container finished" podID="edc07ac6-aa9a-477f-a990-62d1cbb5b7d1" containerID="ccb893011016c45ea0f02e0e75699909f8e01bae5f90cccb437e81aa97742ee6" exitCode=0 Mar 09 10:00:05 crc kubenswrapper[4792]: I0309 10:00:05.113106 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550840-7q7wm" event={"ID":"edc07ac6-aa9a-477f-a990-62d1cbb5b7d1","Type":"ContainerDied","Data":"ccb893011016c45ea0f02e0e75699909f8e01bae5f90cccb437e81aa97742ee6"} Mar 09 10:00:05 crc kubenswrapper[4792]: I0309 10:00:05.673898 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="37afd809-afca-4aa4-a9c9-3fe02f105c23" path="/var/lib/kubelet/pods/37afd809-afca-4aa4-a9c9-3fe02f105c23/volumes" Mar 09 10:00:06 crc kubenswrapper[4792]: I0309 10:00:06.547195 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550840-7q7wm" Mar 09 10:00:06 crc kubenswrapper[4792]: I0309 10:00:06.676768 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9lz67\" (UniqueName: \"kubernetes.io/projected/edc07ac6-aa9a-477f-a990-62d1cbb5b7d1-kube-api-access-9lz67\") pod \"edc07ac6-aa9a-477f-a990-62d1cbb5b7d1\" (UID: \"edc07ac6-aa9a-477f-a990-62d1cbb5b7d1\") " Mar 09 10:00:06 crc kubenswrapper[4792]: I0309 10:00:06.690405 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/edc07ac6-aa9a-477f-a990-62d1cbb5b7d1-kube-api-access-9lz67" (OuterVolumeSpecName: "kube-api-access-9lz67") pod "edc07ac6-aa9a-477f-a990-62d1cbb5b7d1" (UID: "edc07ac6-aa9a-477f-a990-62d1cbb5b7d1"). InnerVolumeSpecName "kube-api-access-9lz67". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 10:00:06 crc kubenswrapper[4792]: I0309 10:00:06.779527 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9lz67\" (UniqueName: \"kubernetes.io/projected/edc07ac6-aa9a-477f-a990-62d1cbb5b7d1-kube-api-access-9lz67\") on node \"crc\" DevicePath \"\"" Mar 09 10:00:07 crc kubenswrapper[4792]: I0309 10:00:07.146033 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550840-7q7wm" event={"ID":"edc07ac6-aa9a-477f-a990-62d1cbb5b7d1","Type":"ContainerDied","Data":"b97c480456d048e6004b280371ff13454a90da2d3b83fe8866c56db62ef169ea"} Mar 09 10:00:07 crc kubenswrapper[4792]: I0309 10:00:07.146388 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b97c480456d048e6004b280371ff13454a90da2d3b83fe8866c56db62ef169ea" Mar 09 10:00:07 crc kubenswrapper[4792]: I0309 10:00:07.146204 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550840-7q7wm" Mar 09 10:00:07 crc kubenswrapper[4792]: I0309 10:00:07.637824 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550834-fpmjc"] Mar 09 10:00:07 crc kubenswrapper[4792]: I0309 10:00:07.653354 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550834-fpmjc"] Mar 09 10:00:07 crc kubenswrapper[4792]: I0309 10:00:07.689597 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc38964a-cf56-41dd-8e12-2897e356db7a" path="/var/lib/kubelet/pods/fc38964a-cf56-41dd-8e12-2897e356db7a/volumes" Mar 09 10:00:10 crc kubenswrapper[4792]: I0309 10:00:10.663685 4792 scope.go:117] "RemoveContainer" containerID="d764681645ab8670f7435c3d7eeda989bbb6c0f3c40948420b0a6a2fc3dd7e93" Mar 09 10:00:10 crc kubenswrapper[4792]: E0309 10:00:10.664482 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-97tth_openshift-machine-config-operator(bd11045a-d746-4b42-872c-8b8d1dd2d515)\"" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" Mar 09 10:00:20 crc kubenswrapper[4792]: I0309 10:00:20.251299 4792 generic.go:334] "Generic (PLEG): container finished" podID="50f74681-04e5-49c7-9d32-1e8841867bcb" containerID="55cc7790032fd61945c3d03eb102a03037f3250f1410dd91c9b62ba9cd932fae" exitCode=0 Mar 09 10:00:20 crc kubenswrapper[4792]: I0309 10:00:20.251387 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-fh9gr" event={"ID":"50f74681-04e5-49c7-9d32-1e8841867bcb","Type":"ContainerDied","Data":"55cc7790032fd61945c3d03eb102a03037f3250f1410dd91c9b62ba9cd932fae"} Mar 09 10:00:21 crc kubenswrapper[4792]: I0309 
10:00:21.636837 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-fh9gr" Mar 09 10:00:21 crc kubenswrapper[4792]: I0309 10:00:21.755542 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/50f74681-04e5-49c7-9d32-1e8841867bcb-nova-extra-config-0\") pod \"50f74681-04e5-49c7-9d32-1e8841867bcb\" (UID: \"50f74681-04e5-49c7-9d32-1e8841867bcb\") " Mar 09 10:00:21 crc kubenswrapper[4792]: I0309 10:00:21.755597 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/50f74681-04e5-49c7-9d32-1e8841867bcb-ceph\") pod \"50f74681-04e5-49c7-9d32-1e8841867bcb\" (UID: \"50f74681-04e5-49c7-9d32-1e8841867bcb\") " Mar 09 10:00:21 crc kubenswrapper[4792]: I0309 10:00:21.755646 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph-nova-0\" (UniqueName: \"kubernetes.io/configmap/50f74681-04e5-49c7-9d32-1e8841867bcb-ceph-nova-0\") pod \"50f74681-04e5-49c7-9d32-1e8841867bcb\" (UID: \"50f74681-04e5-49c7-9d32-1e8841867bcb\") " Mar 09 10:00:21 crc kubenswrapper[4792]: I0309 10:00:21.755671 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/50f74681-04e5-49c7-9d32-1e8841867bcb-nova-migration-ssh-key-1\") pod \"50f74681-04e5-49c7-9d32-1e8841867bcb\" (UID: \"50f74681-04e5-49c7-9d32-1e8841867bcb\") " Mar 09 10:00:21 crc kubenswrapper[4792]: I0309 10:00:21.755711 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/50f74681-04e5-49c7-9d32-1e8841867bcb-ssh-key-openstack-edpm-ipam\") pod \"50f74681-04e5-49c7-9d32-1e8841867bcb\" (UID: \"50f74681-04e5-49c7-9d32-1e8841867bcb\") " Mar 09 10:00:21 crc 
kubenswrapper[4792]: I0309 10:00:21.755763 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/50f74681-04e5-49c7-9d32-1e8841867bcb-inventory\") pod \"50f74681-04e5-49c7-9d32-1e8841867bcb\" (UID: \"50f74681-04e5-49c7-9d32-1e8841867bcb\") " Mar 09 10:00:21 crc kubenswrapper[4792]: I0309 10:00:21.755810 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/50f74681-04e5-49c7-9d32-1e8841867bcb-nova-cell1-compute-config-1\") pod \"50f74681-04e5-49c7-9d32-1e8841867bcb\" (UID: \"50f74681-04e5-49c7-9d32-1e8841867bcb\") " Mar 09 10:00:21 crc kubenswrapper[4792]: I0309 10:00:21.755862 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2wklj\" (UniqueName: \"kubernetes.io/projected/50f74681-04e5-49c7-9d32-1e8841867bcb-kube-api-access-2wklj\") pod \"50f74681-04e5-49c7-9d32-1e8841867bcb\" (UID: \"50f74681-04e5-49c7-9d32-1e8841867bcb\") " Mar 09 10:00:21 crc kubenswrapper[4792]: I0309 10:00:21.755903 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/50f74681-04e5-49c7-9d32-1e8841867bcb-nova-cell1-compute-config-2\") pod \"50f74681-04e5-49c7-9d32-1e8841867bcb\" (UID: \"50f74681-04e5-49c7-9d32-1e8841867bcb\") " Mar 09 10:00:21 crc kubenswrapper[4792]: I0309 10:00:21.755945 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/50f74681-04e5-49c7-9d32-1e8841867bcb-nova-cell1-compute-config-3\") pod \"50f74681-04e5-49c7-9d32-1e8841867bcb\" (UID: \"50f74681-04e5-49c7-9d32-1e8841867bcb\") " Mar 09 10:00:21 crc kubenswrapper[4792]: I0309 10:00:21.756024 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" 
(UniqueName: \"kubernetes.io/secret/50f74681-04e5-49c7-9d32-1e8841867bcb-nova-migration-ssh-key-0\") pod \"50f74681-04e5-49c7-9d32-1e8841867bcb\" (UID: \"50f74681-04e5-49c7-9d32-1e8841867bcb\") " Mar 09 10:00:21 crc kubenswrapper[4792]: I0309 10:00:21.756056 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-custom-ceph-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50f74681-04e5-49c7-9d32-1e8841867bcb-nova-custom-ceph-combined-ca-bundle\") pod \"50f74681-04e5-49c7-9d32-1e8841867bcb\" (UID: \"50f74681-04e5-49c7-9d32-1e8841867bcb\") " Mar 09 10:00:21 crc kubenswrapper[4792]: I0309 10:00:21.756134 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/50f74681-04e5-49c7-9d32-1e8841867bcb-nova-cell1-compute-config-0\") pod \"50f74681-04e5-49c7-9d32-1e8841867bcb\" (UID: \"50f74681-04e5-49c7-9d32-1e8841867bcb\") " Mar 09 10:00:21 crc kubenswrapper[4792]: I0309 10:00:21.761161 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50f74681-04e5-49c7-9d32-1e8841867bcb-nova-custom-ceph-combined-ca-bundle" (OuterVolumeSpecName: "nova-custom-ceph-combined-ca-bundle") pod "50f74681-04e5-49c7-9d32-1e8841867bcb" (UID: "50f74681-04e5-49c7-9d32-1e8841867bcb"). InnerVolumeSpecName "nova-custom-ceph-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 10:00:21 crc kubenswrapper[4792]: I0309 10:00:21.778874 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/50f74681-04e5-49c7-9d32-1e8841867bcb-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "50f74681-04e5-49c7-9d32-1e8841867bcb" (UID: "50f74681-04e5-49c7-9d32-1e8841867bcb"). InnerVolumeSpecName "nova-extra-config-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 10:00:21 crc kubenswrapper[4792]: I0309 10:00:21.778610 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/50f74681-04e5-49c7-9d32-1e8841867bcb-kube-api-access-2wklj" (OuterVolumeSpecName: "kube-api-access-2wklj") pod "50f74681-04e5-49c7-9d32-1e8841867bcb" (UID: "50f74681-04e5-49c7-9d32-1e8841867bcb"). InnerVolumeSpecName "kube-api-access-2wklj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 10:00:21 crc kubenswrapper[4792]: I0309 10:00:21.783005 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50f74681-04e5-49c7-9d32-1e8841867bcb-ceph" (OuterVolumeSpecName: "ceph") pod "50f74681-04e5-49c7-9d32-1e8841867bcb" (UID: "50f74681-04e5-49c7-9d32-1e8841867bcb"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 10:00:21 crc kubenswrapper[4792]: I0309 10:00:21.783652 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50f74681-04e5-49c7-9d32-1e8841867bcb-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "50f74681-04e5-49c7-9d32-1e8841867bcb" (UID: "50f74681-04e5-49c7-9d32-1e8841867bcb"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 10:00:21 crc kubenswrapper[4792]: I0309 10:00:21.785636 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50f74681-04e5-49c7-9d32-1e8841867bcb-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "50f74681-04e5-49c7-9d32-1e8841867bcb" (UID: "50f74681-04e5-49c7-9d32-1e8841867bcb"). InnerVolumeSpecName "nova-migration-ssh-key-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 10:00:21 crc kubenswrapper[4792]: I0309 10:00:21.791468 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50f74681-04e5-49c7-9d32-1e8841867bcb-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "50f74681-04e5-49c7-9d32-1e8841867bcb" (UID: "50f74681-04e5-49c7-9d32-1e8841867bcb"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 10:00:21 crc kubenswrapper[4792]: I0309 10:00:21.791768 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50f74681-04e5-49c7-9d32-1e8841867bcb-nova-cell1-compute-config-2" (OuterVolumeSpecName: "nova-cell1-compute-config-2") pod "50f74681-04e5-49c7-9d32-1e8841867bcb" (UID: "50f74681-04e5-49c7-9d32-1e8841867bcb"). InnerVolumeSpecName "nova-cell1-compute-config-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 10:00:21 crc kubenswrapper[4792]: I0309 10:00:21.792044 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50f74681-04e5-49c7-9d32-1e8841867bcb-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "50f74681-04e5-49c7-9d32-1e8841867bcb" (UID: "50f74681-04e5-49c7-9d32-1e8841867bcb"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 10:00:21 crc kubenswrapper[4792]: I0309 10:00:21.806359 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/50f74681-04e5-49c7-9d32-1e8841867bcb-ceph-nova-0" (OuterVolumeSpecName: "ceph-nova-0") pod "50f74681-04e5-49c7-9d32-1e8841867bcb" (UID: "50f74681-04e5-49c7-9d32-1e8841867bcb"). InnerVolumeSpecName "ceph-nova-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 10:00:21 crc kubenswrapper[4792]: I0309 10:00:21.806595 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50f74681-04e5-49c7-9d32-1e8841867bcb-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "50f74681-04e5-49c7-9d32-1e8841867bcb" (UID: "50f74681-04e5-49c7-9d32-1e8841867bcb"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 10:00:21 crc kubenswrapper[4792]: I0309 10:00:21.809910 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50f74681-04e5-49c7-9d32-1e8841867bcb-inventory" (OuterVolumeSpecName: "inventory") pod "50f74681-04e5-49c7-9d32-1e8841867bcb" (UID: "50f74681-04e5-49c7-9d32-1e8841867bcb"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 10:00:21 crc kubenswrapper[4792]: I0309 10:00:21.823497 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50f74681-04e5-49c7-9d32-1e8841867bcb-nova-cell1-compute-config-3" (OuterVolumeSpecName: "nova-cell1-compute-config-3") pod "50f74681-04e5-49c7-9d32-1e8841867bcb" (UID: "50f74681-04e5-49c7-9d32-1e8841867bcb"). InnerVolumeSpecName "nova-cell1-compute-config-3". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 10:00:21 crc kubenswrapper[4792]: I0309 10:00:21.858933 4792 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/50f74681-04e5-49c7-9d32-1e8841867bcb-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Mar 09 10:00:21 crc kubenswrapper[4792]: I0309 10:00:21.858970 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2wklj\" (UniqueName: \"kubernetes.io/projected/50f74681-04e5-49c7-9d32-1e8841867bcb-kube-api-access-2wklj\") on node \"crc\" DevicePath \"\"" Mar 09 10:00:21 crc kubenswrapper[4792]: I0309 10:00:21.858980 4792 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/50f74681-04e5-49c7-9d32-1e8841867bcb-nova-cell1-compute-config-2\") on node \"crc\" DevicePath \"\"" Mar 09 10:00:21 crc kubenswrapper[4792]: I0309 10:00:21.858990 4792 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/50f74681-04e5-49c7-9d32-1e8841867bcb-nova-cell1-compute-config-3\") on node \"crc\" DevicePath \"\"" Mar 09 10:00:21 crc kubenswrapper[4792]: I0309 10:00:21.859000 4792 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/50f74681-04e5-49c7-9d32-1e8841867bcb-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Mar 09 10:00:21 crc kubenswrapper[4792]: I0309 10:00:21.859011 4792 reconciler_common.go:293] "Volume detached for volume \"nova-custom-ceph-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50f74681-04e5-49c7-9d32-1e8841867bcb-nova-custom-ceph-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 10:00:21 crc kubenswrapper[4792]: I0309 10:00:21.859020 4792 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: 
\"kubernetes.io/secret/50f74681-04e5-49c7-9d32-1e8841867bcb-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Mar 09 10:00:21 crc kubenswrapper[4792]: I0309 10:00:21.859127 4792 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/50f74681-04e5-49c7-9d32-1e8841867bcb-nova-extra-config-0\") on node \"crc\" DevicePath \"\"" Mar 09 10:00:21 crc kubenswrapper[4792]: I0309 10:00:21.859147 4792 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/50f74681-04e5-49c7-9d32-1e8841867bcb-ceph\") on node \"crc\" DevicePath \"\"" Mar 09 10:00:21 crc kubenswrapper[4792]: I0309 10:00:21.859209 4792 reconciler_common.go:293] "Volume detached for volume \"ceph-nova-0\" (UniqueName: \"kubernetes.io/configmap/50f74681-04e5-49c7-9d32-1e8841867bcb-ceph-nova-0\") on node \"crc\" DevicePath \"\"" Mar 09 10:00:21 crc kubenswrapper[4792]: I0309 10:00:21.859222 4792 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/50f74681-04e5-49c7-9d32-1e8841867bcb-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Mar 09 10:00:21 crc kubenswrapper[4792]: I0309 10:00:21.859232 4792 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/50f74681-04e5-49c7-9d32-1e8841867bcb-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 09 10:00:21 crc kubenswrapper[4792]: I0309 10:00:21.859244 4792 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/50f74681-04e5-49c7-9d32-1e8841867bcb-inventory\") on node \"crc\" DevicePath \"\"" Mar 09 10:00:22 crc kubenswrapper[4792]: I0309 10:00:22.270831 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-fh9gr" 
event={"ID":"50f74681-04e5-49c7-9d32-1e8841867bcb","Type":"ContainerDied","Data":"8c860e3ec3d6342000310cb760f1cde91501a86fac3e2264c8fb6e3cb497006e"} Mar 09 10:00:22 crc kubenswrapper[4792]: I0309 10:00:22.271199 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8c860e3ec3d6342000310cb760f1cde91501a86fac3e2264c8fb6e3cb497006e" Mar 09 10:00:22 crc kubenswrapper[4792]: I0309 10:00:22.270919 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-fh9gr" Mar 09 10:00:22 crc kubenswrapper[4792]: I0309 10:00:22.662433 4792 scope.go:117] "RemoveContainer" containerID="d764681645ab8670f7435c3d7eeda989bbb6c0f3c40948420b0a6a2fc3dd7e93" Mar 09 10:00:22 crc kubenswrapper[4792]: E0309 10:00:22.662759 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-97tth_openshift-machine-config-operator(bd11045a-d746-4b42-872c-8b8d1dd2d515)\"" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" Mar 09 10:00:35 crc kubenswrapper[4792]: I0309 10:00:35.668934 4792 scope.go:117] "RemoveContainer" containerID="d764681645ab8670f7435c3d7eeda989bbb6c0f3c40948420b0a6a2fc3dd7e93" Mar 09 10:00:35 crc kubenswrapper[4792]: E0309 10:00:35.669972 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-97tth_openshift-machine-config-operator(bd11045a-d746-4b42-872c-8b8d1dd2d515)\"" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" Mar 09 10:00:37 crc kubenswrapper[4792]: I0309 10:00:37.550817 4792 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-volume-volume1-0"] Mar 09 10:00:37 crc kubenswrapper[4792]: E0309 10:00:37.551569 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd250617-15c5-4d27-8f0d-a23f7ca702c9" containerName="collect-profiles" Mar 09 10:00:37 crc kubenswrapper[4792]: I0309 10:00:37.551586 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd250617-15c5-4d27-8f0d-a23f7ca702c9" containerName="collect-profiles" Mar 09 10:00:37 crc kubenswrapper[4792]: E0309 10:00:37.551611 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50f74681-04e5-49c7-9d32-1e8841867bcb" containerName="nova-custom-ceph-edpm-deployment-openstack-edpm-ipam" Mar 09 10:00:37 crc kubenswrapper[4792]: I0309 10:00:37.551620 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="50f74681-04e5-49c7-9d32-1e8841867bcb" containerName="nova-custom-ceph-edpm-deployment-openstack-edpm-ipam" Mar 09 10:00:37 crc kubenswrapper[4792]: E0309 10:00:37.551643 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="edc07ac6-aa9a-477f-a990-62d1cbb5b7d1" containerName="oc" Mar 09 10:00:37 crc kubenswrapper[4792]: I0309 10:00:37.551650 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="edc07ac6-aa9a-477f-a990-62d1cbb5b7d1" containerName="oc" Mar 09 10:00:37 crc kubenswrapper[4792]: I0309 10:00:37.551844 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd250617-15c5-4d27-8f0d-a23f7ca702c9" containerName="collect-profiles" Mar 09 10:00:37 crc kubenswrapper[4792]: I0309 10:00:37.551876 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="edc07ac6-aa9a-477f-a990-62d1cbb5b7d1" containerName="oc" Mar 09 10:00:37 crc kubenswrapper[4792]: I0309 10:00:37.551891 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="50f74681-04e5-49c7-9d32-1e8841867bcb" containerName="nova-custom-ceph-edpm-deployment-openstack-edpm-ipam" Mar 09 10:00:37 crc 
kubenswrapper[4792]: I0309 10:00:37.553148 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-volume-volume1-0"
Mar 09 10:00:37 crc kubenswrapper[4792]: I0309 10:00:37.559979 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files"
Mar 09 10:00:37 crc kubenswrapper[4792]: I0309 10:00:37.560236 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-volume-volume1-config-data"
Mar 09 10:00:37 crc kubenswrapper[4792]: I0309 10:00:37.591028 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-volume1-0"]
Mar 09 10:00:37 crc kubenswrapper[4792]: I0309 10:00:37.640099 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-backup-0"]
Mar 09 10:00:37 crc kubenswrapper[4792]: I0309 10:00:37.641745 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-backup-0"
Mar 09 10:00:37 crc kubenswrapper[4792]: I0309 10:00:37.646884 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-backup-config-data"
Mar 09 10:00:37 crc kubenswrapper[4792]: I0309 10:00:37.688240 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/61d989fe-045d-4c58-b660-f9d0e1a482f9-run\") pod \"cinder-volume-volume1-0\" (UID: \"61d989fe-045d-4c58-b660-f9d0e1a482f9\") " pod="openstack/cinder-volume-volume1-0"
Mar 09 10:00:37 crc kubenswrapper[4792]: I0309 10:00:37.688306 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/61d989fe-045d-4c58-b660-f9d0e1a482f9-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"61d989fe-045d-4c58-b660-f9d0e1a482f9\") " pod="openstack/cinder-volume-volume1-0"
Mar 09 10:00:37 crc kubenswrapper[4792]: I0309 10:00:37.688339 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/61d989fe-045d-4c58-b660-f9d0e1a482f9-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"61d989fe-045d-4c58-b660-f9d0e1a482f9\") " pod="openstack/cinder-volume-volume1-0"
Mar 09 10:00:37 crc kubenswrapper[4792]: I0309 10:00:37.688367 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/61d989fe-045d-4c58-b660-f9d0e1a482f9-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"61d989fe-045d-4c58-b660-f9d0e1a482f9\") " pod="openstack/cinder-volume-volume1-0"
Mar 09 10:00:37 crc kubenswrapper[4792]: I0309 10:00:37.688393 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/61d989fe-045d-4c58-b660-f9d0e1a482f9-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"61d989fe-045d-4c58-b660-f9d0e1a482f9\") " pod="openstack/cinder-volume-volume1-0"
Mar 09 10:00:37 crc kubenswrapper[4792]: I0309 10:00:37.688431 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/61d989fe-045d-4c58-b660-f9d0e1a482f9-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"61d989fe-045d-4c58-b660-f9d0e1a482f9\") " pod="openstack/cinder-volume-volume1-0"
Mar 09 10:00:37 crc kubenswrapper[4792]: I0309 10:00:37.688452 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/61d989fe-045d-4c58-b660-f9d0e1a482f9-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"61d989fe-045d-4c58-b660-f9d0e1a482f9\") " pod="openstack/cinder-volume-volume1-0"
Mar 09 10:00:37 crc kubenswrapper[4792]: I0309 10:00:37.688480 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/61d989fe-045d-4c58-b660-f9d0e1a482f9-sys\") pod \"cinder-volume-volume1-0\" (UID: \"61d989fe-045d-4c58-b660-f9d0e1a482f9\") " pod="openstack/cinder-volume-volume1-0"
Mar 09 10:00:37 crc kubenswrapper[4792]: I0309 10:00:37.688519 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-flhsz\" (UniqueName: \"kubernetes.io/projected/61d989fe-045d-4c58-b660-f9d0e1a482f9-kube-api-access-flhsz\") pod \"cinder-volume-volume1-0\" (UID: \"61d989fe-045d-4c58-b660-f9d0e1a482f9\") " pod="openstack/cinder-volume-volume1-0"
Mar 09 10:00:37 crc kubenswrapper[4792]: I0309 10:00:37.688589 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/61d989fe-045d-4c58-b660-f9d0e1a482f9-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"61d989fe-045d-4c58-b660-f9d0e1a482f9\") " pod="openstack/cinder-volume-volume1-0"
Mar 09 10:00:37 crc kubenswrapper[4792]: I0309 10:00:37.688620 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61d989fe-045d-4c58-b660-f9d0e1a482f9-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"61d989fe-045d-4c58-b660-f9d0e1a482f9\") " pod="openstack/cinder-volume-volume1-0"
Mar 09 10:00:37 crc kubenswrapper[4792]: I0309 10:00:37.688645 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/61d989fe-045d-4c58-b660-f9d0e1a482f9-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"61d989fe-045d-4c58-b660-f9d0e1a482f9\") " pod="openstack/cinder-volume-volume1-0"
Mar 09 10:00:37 crc kubenswrapper[4792]: I0309 10:00:37.688663 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/61d989fe-045d-4c58-b660-f9d0e1a482f9-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"61d989fe-045d-4c58-b660-f9d0e1a482f9\") " pod="openstack/cinder-volume-volume1-0"
Mar 09 10:00:37 crc kubenswrapper[4792]: I0309 10:00:37.688717 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61d989fe-045d-4c58-b660-f9d0e1a482f9-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"61d989fe-045d-4c58-b660-f9d0e1a482f9\") " pod="openstack/cinder-volume-volume1-0"
Mar 09 10:00:37 crc kubenswrapper[4792]: I0309 10:00:37.688740 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/61d989fe-045d-4c58-b660-f9d0e1a482f9-dev\") pod \"cinder-volume-volume1-0\" (UID: \"61d989fe-045d-4c58-b660-f9d0e1a482f9\") " pod="openstack/cinder-volume-volume1-0"
Mar 09 10:00:37 crc kubenswrapper[4792]: I0309 10:00:37.688763 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/61d989fe-045d-4c58-b660-f9d0e1a482f9-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"61d989fe-045d-4c58-b660-f9d0e1a482f9\") " pod="openstack/cinder-volume-volume1-0"
Mar 09 10:00:37 crc kubenswrapper[4792]: I0309 10:00:37.744234 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-backup-0"]
Mar 09 10:00:37 crc kubenswrapper[4792]: I0309 10:00:37.802197 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2eaee6b3-8397-430c-b799-7628762d1701-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"2eaee6b3-8397-430c-b799-7628762d1701\") " pod="openstack/cinder-backup-0"
Mar 09 10:00:37 crc kubenswrapper[4792]: I0309 10:00:37.802309 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/2eaee6b3-8397-430c-b799-7628762d1701-run\") pod \"cinder-backup-0\" (UID: \"2eaee6b3-8397-430c-b799-7628762d1701\") " pod="openstack/cinder-backup-0"
Mar 09 10:00:37 crc kubenswrapper[4792]: I0309 10:00:37.802337 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/2eaee6b3-8397-430c-b799-7628762d1701-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"2eaee6b3-8397-430c-b799-7628762d1701\") " pod="openstack/cinder-backup-0"
Mar 09 10:00:37 crc kubenswrapper[4792]: I0309 10:00:37.802366 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/61d989fe-045d-4c58-b660-f9d0e1a482f9-run\") pod \"cinder-volume-volume1-0\" (UID: \"61d989fe-045d-4c58-b660-f9d0e1a482f9\") " pod="openstack/cinder-volume-volume1-0"
Mar 09 10:00:37 crc kubenswrapper[4792]: I0309 10:00:37.802386 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2eaee6b3-8397-430c-b799-7628762d1701-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"2eaee6b3-8397-430c-b799-7628762d1701\") " pod="openstack/cinder-backup-0"
Mar 09 10:00:37 crc kubenswrapper[4792]: I0309 10:00:37.802419 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/61d989fe-045d-4c58-b660-f9d0e1a482f9-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"61d989fe-045d-4c58-b660-f9d0e1a482f9\") " pod="openstack/cinder-volume-volume1-0"
Mar 09 10:00:37 crc kubenswrapper[4792]: I0309 10:00:37.802440 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/2eaee6b3-8397-430c-b799-7628762d1701-lib-modules\") pod \"cinder-backup-0\" (UID: \"2eaee6b3-8397-430c-b799-7628762d1701\") " pod="openstack/cinder-backup-0"
Mar 09 10:00:37 crc kubenswrapper[4792]: I0309 10:00:37.802465 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/61d989fe-045d-4c58-b660-f9d0e1a482f9-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"61d989fe-045d-4c58-b660-f9d0e1a482f9\") " pod="openstack/cinder-volume-volume1-0"
Mar 09 10:00:37 crc kubenswrapper[4792]: I0309 10:00:37.802498 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/61d989fe-045d-4c58-b660-f9d0e1a482f9-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"61d989fe-045d-4c58-b660-f9d0e1a482f9\") " pod="openstack/cinder-volume-volume1-0"
Mar 09 10:00:37 crc kubenswrapper[4792]: I0309 10:00:37.802518 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2eaee6b3-8397-430c-b799-7628762d1701-scripts\") pod \"cinder-backup-0\" (UID: \"2eaee6b3-8397-430c-b799-7628762d1701\") " pod="openstack/cinder-backup-0"
Mar 09 10:00:37 crc kubenswrapper[4792]: I0309 10:00:37.802548 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2eaee6b3-8397-430c-b799-7628762d1701-sys\") pod \"cinder-backup-0\" (UID: \"2eaee6b3-8397-430c-b799-7628762d1701\") " pod="openstack/cinder-backup-0"
Mar 09 10:00:37 crc kubenswrapper[4792]: I0309 10:00:37.802575 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/61d989fe-045d-4c58-b660-f9d0e1a482f9-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"61d989fe-045d-4c58-b660-f9d0e1a482f9\") " pod="openstack/cinder-volume-volume1-0"
Mar 09 10:00:37 crc kubenswrapper[4792]: I0309 10:00:37.802597 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/2eaee6b3-8397-430c-b799-7628762d1701-etc-nvme\") pod \"cinder-backup-0\" (UID: \"2eaee6b3-8397-430c-b799-7628762d1701\") " pod="openstack/cinder-backup-0"
Mar 09 10:00:37 crc kubenswrapper[4792]: I0309 10:00:37.802622 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/2eaee6b3-8397-430c-b799-7628762d1701-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"2eaee6b3-8397-430c-b799-7628762d1701\") " pod="openstack/cinder-backup-0"
Mar 09 10:00:37 crc kubenswrapper[4792]: I0309 10:00:37.802689 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/61d989fe-045d-4c58-b660-f9d0e1a482f9-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"61d989fe-045d-4c58-b660-f9d0e1a482f9\") " pod="openstack/cinder-volume-volume1-0"
Mar 09 10:00:37 crc kubenswrapper[4792]: I0309 10:00:37.802726 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/61d989fe-045d-4c58-b660-f9d0e1a482f9-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"61d989fe-045d-4c58-b660-f9d0e1a482f9\") " pod="openstack/cinder-volume-volume1-0"
Mar 09 10:00:37 crc kubenswrapper[4792]: I0309 10:00:37.802755 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/61d989fe-045d-4c58-b660-f9d0e1a482f9-sys\") pod \"cinder-volume-volume1-0\" (UID: \"61d989fe-045d-4c58-b660-f9d0e1a482f9\") " pod="openstack/cinder-volume-volume1-0"
Mar 09 10:00:37 crc kubenswrapper[4792]: I0309 10:00:37.802803 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/2eaee6b3-8397-430c-b799-7628762d1701-ceph\") pod \"cinder-backup-0\" (UID: \"2eaee6b3-8397-430c-b799-7628762d1701\") " pod="openstack/cinder-backup-0"
Mar 09 10:00:37 crc kubenswrapper[4792]: I0309 10:00:37.802834 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-flhsz\" (UniqueName: \"kubernetes.io/projected/61d989fe-045d-4c58-b660-f9d0e1a482f9-kube-api-access-flhsz\") pod \"cinder-volume-volume1-0\" (UID: \"61d989fe-045d-4c58-b660-f9d0e1a482f9\") " pod="openstack/cinder-volume-volume1-0"
Mar 09 10:00:37 crc kubenswrapper[4792]: I0309 10:00:37.802896 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/61d989fe-045d-4c58-b660-f9d0e1a482f9-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"61d989fe-045d-4c58-b660-f9d0e1a482f9\") " pod="openstack/cinder-volume-volume1-0"
Mar 09 10:00:37 crc kubenswrapper[4792]: I0309 10:00:37.802925 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/2eaee6b3-8397-430c-b799-7628762d1701-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"2eaee6b3-8397-430c-b799-7628762d1701\") " pod="openstack/cinder-backup-0"
Mar 09 10:00:37 crc kubenswrapper[4792]: I0309 10:00:37.802950 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61d989fe-045d-4c58-b660-f9d0e1a482f9-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"61d989fe-045d-4c58-b660-f9d0e1a482f9\") " pod="openstack/cinder-volume-volume1-0"
Mar 09 10:00:37 crc kubenswrapper[4792]: I0309 10:00:37.802983 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/61d989fe-045d-4c58-b660-f9d0e1a482f9-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"61d989fe-045d-4c58-b660-f9d0e1a482f9\") " pod="openstack/cinder-volume-volume1-0"
Mar 09 10:00:37 crc kubenswrapper[4792]: I0309 10:00:37.803005 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/61d989fe-045d-4c58-b660-f9d0e1a482f9-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"61d989fe-045d-4c58-b660-f9d0e1a482f9\") " pod="openstack/cinder-volume-volume1-0"
Mar 09 10:00:37 crc kubenswrapper[4792]: I0309 10:00:37.803047 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xtgkn\" (UniqueName: \"kubernetes.io/projected/2eaee6b3-8397-430c-b799-7628762d1701-kube-api-access-xtgkn\") pod \"cinder-backup-0\" (UID: \"2eaee6b3-8397-430c-b799-7628762d1701\") " pod="openstack/cinder-backup-0"
Mar 09 10:00:37 crc kubenswrapper[4792]: I0309 10:00:37.803117 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/61d989fe-045d-4c58-b660-f9d0e1a482f9-run\") pod \"cinder-volume-volume1-0\" (UID: \"61d989fe-045d-4c58-b660-f9d0e1a482f9\") " pod="openstack/cinder-volume-volume1-0"
Mar 09 10:00:37 crc kubenswrapper[4792]: I0309 10:00:37.803163 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/61d989fe-045d-4c58-b660-f9d0e1a482f9-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"61d989fe-045d-4c58-b660-f9d0e1a482f9\") " pod="openstack/cinder-volume-volume1-0"
Mar 09 10:00:37 crc kubenswrapper[4792]: I0309 10:00:37.803227 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/61d989fe-045d-4c58-b660-f9d0e1a482f9-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"61d989fe-045d-4c58-b660-f9d0e1a482f9\") " pod="openstack/cinder-volume-volume1-0"
Mar 09 10:00:37 crc kubenswrapper[4792]: I0309 10:00:37.803048 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/61d989fe-045d-4c58-b660-f9d0e1a482f9-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"61d989fe-045d-4c58-b660-f9d0e1a482f9\") " pod="openstack/cinder-volume-volume1-0"
Mar 09 10:00:37 crc kubenswrapper[4792]: I0309 10:00:37.804205 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/61d989fe-045d-4c58-b660-f9d0e1a482f9-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"61d989fe-045d-4c58-b660-f9d0e1a482f9\") " pod="openstack/cinder-volume-volume1-0"
Mar 09 10:00:37 crc kubenswrapper[4792]: I0309 10:00:37.804285 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/61d989fe-045d-4c58-b660-f9d0e1a482f9-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"61d989fe-045d-4c58-b660-f9d0e1a482f9\") " pod="openstack/cinder-volume-volume1-0"
Mar 09 10:00:37 crc kubenswrapper[4792]: I0309 10:00:37.804315 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/61d989fe-045d-4c58-b660-f9d0e1a482f9-sys\") pod \"cinder-volume-volume1-0\" (UID: \"61d989fe-045d-4c58-b660-f9d0e1a482f9\") " pod="openstack/cinder-volume-volume1-0"
Mar 09 10:00:37 crc kubenswrapper[4792]: I0309 10:00:37.804330 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/61d989fe-045d-4c58-b660-f9d0e1a482f9-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"61d989fe-045d-4c58-b660-f9d0e1a482f9\") " pod="openstack/cinder-volume-volume1-0"
Mar 09 10:00:37 crc kubenswrapper[4792]: I0309 10:00:37.804450 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/61d989fe-045d-4c58-b660-f9d0e1a482f9-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"61d989fe-045d-4c58-b660-f9d0e1a482f9\") " pod="openstack/cinder-volume-volume1-0"
Mar 09 10:00:37 crc kubenswrapper[4792]: I0309 10:00:37.804627 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/2eaee6b3-8397-430c-b799-7628762d1701-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"2eaee6b3-8397-430c-b799-7628762d1701\") " pod="openstack/cinder-backup-0"
Mar 09 10:00:37 crc kubenswrapper[4792]: I0309 10:00:37.804663 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61d989fe-045d-4c58-b660-f9d0e1a482f9-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"61d989fe-045d-4c58-b660-f9d0e1a482f9\") " pod="openstack/cinder-volume-volume1-0"
Mar 09 10:00:37 crc kubenswrapper[4792]: I0309 10:00:37.807449 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/61d989fe-045d-4c58-b660-f9d0e1a482f9-dev\") pod \"cinder-volume-volume1-0\" (UID: \"61d989fe-045d-4c58-b660-f9d0e1a482f9\") " pod="openstack/cinder-volume-volume1-0"
Mar 09 10:00:37 crc kubenswrapper[4792]: I0309 10:00:37.807532 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/2eaee6b3-8397-430c-b799-7628762d1701-dev\") pod \"cinder-backup-0\" (UID: \"2eaee6b3-8397-430c-b799-7628762d1701\") " pod="openstack/cinder-backup-0"
Mar 09 10:00:37 crc kubenswrapper[4792]: I0309 10:00:37.807559 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/61d989fe-045d-4c58-b660-f9d0e1a482f9-dev\") pod \"cinder-volume-volume1-0\" (UID: \"61d989fe-045d-4c58-b660-f9d0e1a482f9\") " pod="openstack/cinder-volume-volume1-0"
Mar 09 10:00:37 crc kubenswrapper[4792]: I0309 10:00:37.807633 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2eaee6b3-8397-430c-b799-7628762d1701-config-data\") pod \"cinder-backup-0\" (UID: \"2eaee6b3-8397-430c-b799-7628762d1701\") " pod="openstack/cinder-backup-0"
Mar 09 10:00:37 crc kubenswrapper[4792]: I0309 10:00:37.807668 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2eaee6b3-8397-430c-b799-7628762d1701-config-data-custom\") pod \"cinder-backup-0\" (UID: \"2eaee6b3-8397-430c-b799-7628762d1701\") " pod="openstack/cinder-backup-0"
Mar 09 10:00:37 crc kubenswrapper[4792]: I0309 10:00:37.807725 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/61d989fe-045d-4c58-b660-f9d0e1a482f9-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"61d989fe-045d-4c58-b660-f9d0e1a482f9\") " pod="openstack/cinder-volume-volume1-0"
Mar 09 10:00:37 crc kubenswrapper[4792]: I0309 10:00:37.819731 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61d989fe-045d-4c58-b660-f9d0e1a482f9-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"61d989fe-045d-4c58-b660-f9d0e1a482f9\") " pod="openstack/cinder-volume-volume1-0"
Mar 09 10:00:37 crc kubenswrapper[4792]: I0309 10:00:37.819761 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/61d989fe-045d-4c58-b660-f9d0e1a482f9-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"61d989fe-045d-4c58-b660-f9d0e1a482f9\") " pod="openstack/cinder-volume-volume1-0"
Mar 09 10:00:37 crc kubenswrapper[4792]: I0309 10:00:37.820651 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/61d989fe-045d-4c58-b660-f9d0e1a482f9-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"61d989fe-045d-4c58-b660-f9d0e1a482f9\") " pod="openstack/cinder-volume-volume1-0"
Mar 09 10:00:37 crc kubenswrapper[4792]: I0309 10:00:37.822362 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61d989fe-045d-4c58-b660-f9d0e1a482f9-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"61d989fe-045d-4c58-b660-f9d0e1a482f9\") " pod="openstack/cinder-volume-volume1-0"
Mar 09 10:00:37 crc kubenswrapper[4792]: I0309 10:00:37.823936 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-flhsz\" (UniqueName: \"kubernetes.io/projected/61d989fe-045d-4c58-b660-f9d0e1a482f9-kube-api-access-flhsz\") pod \"cinder-volume-volume1-0\" (UID: \"61d989fe-045d-4c58-b660-f9d0e1a482f9\") " pod="openstack/cinder-volume-volume1-0"
Mar 09 10:00:37 crc kubenswrapper[4792]: I0309 10:00:37.833497 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/61d989fe-045d-4c58-b660-f9d0e1a482f9-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"61d989fe-045d-4c58-b660-f9d0e1a482f9\") " pod="openstack/cinder-volume-volume1-0"
Mar 09 10:00:37 crc kubenswrapper[4792]: I0309 10:00:37.875319 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-volume-volume1-0"
Mar 09 10:00:37 crc kubenswrapper[4792]: I0309 10:00:37.914452 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/2eaee6b3-8397-430c-b799-7628762d1701-ceph\") pod \"cinder-backup-0\" (UID: \"2eaee6b3-8397-430c-b799-7628762d1701\") " pod="openstack/cinder-backup-0"
Mar 09 10:00:37 crc kubenswrapper[4792]: I0309 10:00:37.914816 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/2eaee6b3-8397-430c-b799-7628762d1701-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"2eaee6b3-8397-430c-b799-7628762d1701\") " pod="openstack/cinder-backup-0"
Mar 09 10:00:37 crc kubenswrapper[4792]: I0309 10:00:37.914935 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xtgkn\" (UniqueName: \"kubernetes.io/projected/2eaee6b3-8397-430c-b799-7628762d1701-kube-api-access-xtgkn\") pod \"cinder-backup-0\" (UID: \"2eaee6b3-8397-430c-b799-7628762d1701\") " pod="openstack/cinder-backup-0"
Mar 09 10:00:37 crc kubenswrapper[4792]: I0309 10:00:37.914972 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/2eaee6b3-8397-430c-b799-7628762d1701-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"2eaee6b3-8397-430c-b799-7628762d1701\") " pod="openstack/cinder-backup-0"
Mar 09 10:00:37 crc kubenswrapper[4792]: I0309 10:00:37.914990 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/2eaee6b3-8397-430c-b799-7628762d1701-dev\") pod \"cinder-backup-0\" (UID: \"2eaee6b3-8397-430c-b799-7628762d1701\") " pod="openstack/cinder-backup-0"
Mar 09 10:00:37 crc kubenswrapper[4792]: I0309 10:00:37.915006 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2eaee6b3-8397-430c-b799-7628762d1701-config-data\") pod \"cinder-backup-0\" (UID: \"2eaee6b3-8397-430c-b799-7628762d1701\") " pod="openstack/cinder-backup-0"
Mar 09 10:00:37 crc kubenswrapper[4792]: I0309 10:00:37.915048 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2eaee6b3-8397-430c-b799-7628762d1701-config-data-custom\") pod \"cinder-backup-0\" (UID: \"2eaee6b3-8397-430c-b799-7628762d1701\") " pod="openstack/cinder-backup-0"
Mar 09 10:00:37 crc kubenswrapper[4792]: I0309 10:00:37.915095 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2eaee6b3-8397-430c-b799-7628762d1701-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"2eaee6b3-8397-430c-b799-7628762d1701\") " pod="openstack/cinder-backup-0"
Mar 09 10:00:37 crc kubenswrapper[4792]: I0309 10:00:37.915162 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/2eaee6b3-8397-430c-b799-7628762d1701-run\") pod \"cinder-backup-0\" (UID: \"2eaee6b3-8397-430c-b799-7628762d1701\") " pod="openstack/cinder-backup-0"
Mar 09 10:00:37 crc kubenswrapper[4792]: I0309 10:00:37.915225 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/2eaee6b3-8397-430c-b799-7628762d1701-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"2eaee6b3-8397-430c-b799-7628762d1701\") " pod="openstack/cinder-backup-0"
Mar 09 10:00:37 crc kubenswrapper[4792]: I0309 10:00:37.915257 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2eaee6b3-8397-430c-b799-7628762d1701-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"2eaee6b3-8397-430c-b799-7628762d1701\") " pod="openstack/cinder-backup-0"
Mar 09 10:00:37 crc kubenswrapper[4792]: I0309 10:00:37.915315 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/2eaee6b3-8397-430c-b799-7628762d1701-lib-modules\") pod \"cinder-backup-0\" (UID: \"2eaee6b3-8397-430c-b799-7628762d1701\") " pod="openstack/cinder-backup-0"
Mar 09 10:00:37 crc kubenswrapper[4792]: I0309 10:00:37.915341 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2eaee6b3-8397-430c-b799-7628762d1701-scripts\") pod \"cinder-backup-0\" (UID: \"2eaee6b3-8397-430c-b799-7628762d1701\") " pod="openstack/cinder-backup-0"
Mar 09 10:00:37 crc kubenswrapper[4792]: I0309 10:00:37.915361 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2eaee6b3-8397-430c-b799-7628762d1701-sys\") pod \"cinder-backup-0\" (UID: \"2eaee6b3-8397-430c-b799-7628762d1701\") " pod="openstack/cinder-backup-0"
Mar 09 10:00:37 crc kubenswrapper[4792]: I0309 10:00:37.915380 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/2eaee6b3-8397-430c-b799-7628762d1701-etc-nvme\") pod \"cinder-backup-0\" (UID: \"2eaee6b3-8397-430c-b799-7628762d1701\") " pod="openstack/cinder-backup-0"
Mar 09 10:00:37 crc kubenswrapper[4792]: I0309 10:00:37.915399 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/2eaee6b3-8397-430c-b799-7628762d1701-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"2eaee6b3-8397-430c-b799-7628762d1701\") " pod="openstack/cinder-backup-0"
Mar 09 10:00:37 crc kubenswrapper[4792]: I0309 10:00:37.916197 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/2eaee6b3-8397-430c-b799-7628762d1701-run\") pod \"cinder-backup-0\" (UID: \"2eaee6b3-8397-430c-b799-7628762d1701\") " pod="openstack/cinder-backup-0"
Mar 09 10:00:37 crc kubenswrapper[4792]: I0309 10:00:37.916667 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/2eaee6b3-8397-430c-b799-7628762d1701-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"2eaee6b3-8397-430c-b799-7628762d1701\") " pod="openstack/cinder-backup-0"
Mar 09 10:00:37 crc kubenswrapper[4792]: I0309 10:00:37.916958 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/2eaee6b3-8397-430c-b799-7628762d1701-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"2eaee6b3-8397-430c-b799-7628762d1701\") " pod="openstack/cinder-backup-0"
Mar 09 10:00:37 crc kubenswrapper[4792]: I0309 10:00:37.916991 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/2eaee6b3-8397-430c-b799-7628762d1701-dev\") pod \"cinder-backup-0\" (UID: \"2eaee6b3-8397-430c-b799-7628762d1701\") " pod="openstack/cinder-backup-0"
Mar 09 10:00:37 crc kubenswrapper[4792]: I0309 10:00:37.919661 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2eaee6b3-8397-430c-b799-7628762d1701-sys\") pod \"cinder-backup-0\" (UID: \"2eaee6b3-8397-430c-b799-7628762d1701\") " pod="openstack/cinder-backup-0"
Mar 09 10:00:37 crc kubenswrapper[4792]: I0309 10:00:37.919827 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2eaee6b3-8397-430c-b799-7628762d1701-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"2eaee6b3-8397-430c-b799-7628762d1701\") " pod="openstack/cinder-backup-0"
Mar 09 10:00:37 crc kubenswrapper[4792]: I0309 10:00:37.919949 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/2eaee6b3-8397-430c-b799-7628762d1701-lib-modules\") pod \"cinder-backup-0\" (UID: \"2eaee6b3-8397-430c-b799-7628762d1701\") " pod="openstack/cinder-backup-0"
Mar 09 10:00:37 crc kubenswrapper[4792]: I0309 10:00:37.920023 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/2eaee6b3-8397-430c-b799-7628762d1701-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"2eaee6b3-8397-430c-b799-7628762d1701\") " pod="openstack/cinder-backup-0"
Mar 09 10:00:37 crc kubenswrapper[4792]: I0309 10:00:37.936162 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/2eaee6b3-8397-430c-b799-7628762d1701-ceph\") pod \"cinder-backup-0\" (UID: \"2eaee6b3-8397-430c-b799-7628762d1701\") " pod="openstack/cinder-backup-0"
Mar 09 10:00:37 crc kubenswrapper[4792]: I0309 10:00:37.936425 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2eaee6b3-8397-430c-b799-7628762d1701-config-data\") pod \"cinder-backup-0\" (UID: \"2eaee6b3-8397-430c-b799-7628762d1701\") " pod="openstack/cinder-backup-0"
Mar 09 10:00:37 crc kubenswrapper[4792]: I0309 10:00:37.936566 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2eaee6b3-8397-430c-b799-7628762d1701-scripts\") pod \"cinder-backup-0\" (UID: \"2eaee6b3-8397-430c-b799-7628762d1701\") " pod="openstack/cinder-backup-0"
Mar 09 10:00:37 crc kubenswrapper[4792]: I0309 10:00:37.936781 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2eaee6b3-8397-430c-b799-7628762d1701-config-data-custom\") pod \"cinder-backup-0\" (UID: \"2eaee6b3-8397-430c-b799-7628762d1701\") " pod="openstack/cinder-backup-0"
Mar 09 10:00:37 crc kubenswrapper[4792]: I0309 10:00:37.939561 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/2eaee6b3-8397-430c-b799-7628762d1701-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"2eaee6b3-8397-430c-b799-7628762d1701\") " pod="openstack/cinder-backup-0"
Mar 09 10:00:37 crc kubenswrapper[4792]: I0309 10:00:37.939643 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/2eaee6b3-8397-430c-b799-7628762d1701-etc-nvme\") pod \"cinder-backup-0\" (UID: \"2eaee6b3-8397-430c-b799-7628762d1701\") " pod="openstack/cinder-backup-0"
Mar 09 10:00:37 crc kubenswrapper[4792]: I0309 10:00:37.943691 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2eaee6b3-8397-430c-b799-7628762d1701-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"2eaee6b3-8397-430c-b799-7628762d1701\") " pod="openstack/cinder-backup-0"
Mar 09 10:00:37 crc kubenswrapper[4792]: I0309 10:00:37.974586 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xtgkn\" (UniqueName: \"kubernetes.io/projected/2eaee6b3-8397-430c-b799-7628762d1701-kube-api-access-xtgkn\") pod \"cinder-backup-0\" (UID: \"2eaee6b3-8397-430c-b799-7628762d1701\") " pod="openstack/cinder-backup-0"
Mar 09 10:00:38 crc kubenswrapper[4792]: I0309 10:00:38.027777 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-backup-0"
Mar 09 10:00:38 crc kubenswrapper[4792]: I0309 10:00:38.425842 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-6f9cb7fbd9-gcq9k"]
Mar 09 10:00:38 crc kubenswrapper[4792]: I0309 10:00:38.428186 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6f9cb7fbd9-gcq9k"
Mar 09 10:00:38 crc kubenswrapper[4792]: I0309 10:00:38.439819 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-gfg2d"
Mar 09 10:00:38 crc kubenswrapper[4792]: I0309 10:00:38.448169 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon"
Mar 09 10:00:38 crc kubenswrapper[4792]: I0309 10:00:38.448427 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts"
Mar 09 10:00:38 crc kubenswrapper[4792]: I0309 10:00:38.448588 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data"
Mar 09 10:00:38 crc kubenswrapper[4792]: I0309 10:00:38.487148 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6f9cb7fbd9-gcq9k"]
Mar 09 10:00:38 crc kubenswrapper[4792]: I0309 10:00:38.508252 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 09 10:00:38 crc kubenswrapper[4792]: I0309 10:00:38.514881 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Mar 09 10:00:38 crc kubenswrapper[4792]: I0309 10:00:38.526640 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-t8kmn"
Mar 09 10:00:38 crc kubenswrapper[4792]: I0309 10:00:38.526874 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data"
Mar 09 10:00:38 crc kubenswrapper[4792]: I0309 10:00:38.526951 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts"
Mar 09 10:00:38 crc kubenswrapper[4792]: I0309 10:00:38.527021 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc"
Mar 09 10:00:38 crc kubenswrapper[4792]: I0309 10:00:38.530838 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8646be85-08e4-4f0d-9484-01456e453c36-config-data\") pod \"horizon-6f9cb7fbd9-gcq9k\" (UID: \"8646be85-08e4-4f0d-9484-01456e453c36\") " pod="openstack/horizon-6f9cb7fbd9-gcq9k"
Mar 09 10:00:38 crc kubenswrapper[4792]: I0309 10:00:38.530991 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8646be85-08e4-4f0d-9484-01456e453c36-scripts\") pod \"horizon-6f9cb7fbd9-gcq9k\" (UID: \"8646be85-08e4-4f0d-9484-01456e453c36\") " pod="openstack/horizon-6f9cb7fbd9-gcq9k"
Mar 09 10:00:38 crc kubenswrapper[4792]: I0309 10:00:38.531062 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8646be85-08e4-4f0d-9484-01456e453c36-logs\") pod \"horizon-6f9cb7fbd9-gcq9k\" (UID: \"8646be85-08e4-4f0d-9484-01456e453c36\") " pod="openstack/horizon-6f9cb7fbd9-gcq9k"
Mar 09 10:00:38 crc kubenswrapper[4792]: I0309 10:00:38.531187 4792
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9zcxv\" (UniqueName: \"kubernetes.io/projected/8646be85-08e4-4f0d-9484-01456e453c36-kube-api-access-9zcxv\") pod \"horizon-6f9cb7fbd9-gcq9k\" (UID: \"8646be85-08e4-4f0d-9484-01456e453c36\") " pod="openstack/horizon-6f9cb7fbd9-gcq9k" Mar 09 10:00:38 crc kubenswrapper[4792]: I0309 10:00:38.531224 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/8646be85-08e4-4f0d-9484-01456e453c36-horizon-secret-key\") pod \"horizon-6f9cb7fbd9-gcq9k\" (UID: \"8646be85-08e4-4f0d-9484-01456e453c36\") " pod="openstack/horizon-6f9cb7fbd9-gcq9k" Mar 09 10:00:38 crc kubenswrapper[4792]: I0309 10:00:38.563786 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 09 10:00:38 crc kubenswrapper[4792]: I0309 10:00:38.587155 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 09 10:00:38 crc kubenswrapper[4792]: E0309 10:00:38.587857 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[ceph combined-ca-bundle config-data glance httpd-run kube-api-access-94ltn logs public-tls-certs scripts], unattached volumes=[], failed to process volumes=[ceph combined-ca-bundle config-data glance httpd-run kube-api-access-94ltn logs public-tls-certs scripts]: context canceled" pod="openstack/glance-default-external-api-0" podUID="aa512688-fab8-4291-9260-52f1812c6b71" Mar 09 10:00:38 crc kubenswrapper[4792]: I0309 10:00:38.600546 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 09 10:00:38 crc kubenswrapper[4792]: I0309 10:00:38.602760 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 09 10:00:38 crc kubenswrapper[4792]: I0309 10:00:38.607690 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 09 10:00:38 crc kubenswrapper[4792]: I0309 10:00:38.607758 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 09 10:00:38 crc kubenswrapper[4792]: I0309 10:00:38.610565 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 09 10:00:38 crc kubenswrapper[4792]: I0309 10:00:38.634771 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/aa512688-fab8-4291-9260-52f1812c6b71-ceph\") pod \"glance-default-external-api-0\" (UID: \"aa512688-fab8-4291-9260-52f1812c6b71\") " pod="openstack/glance-default-external-api-0" Mar 09 10:00:38 crc kubenswrapper[4792]: I0309 10:00:38.634828 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/aa512688-fab8-4291-9260-52f1812c6b71-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"aa512688-fab8-4291-9260-52f1812c6b71\") " pod="openstack/glance-default-external-api-0" Mar 09 10:00:38 crc kubenswrapper[4792]: I0309 10:00:38.634875 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9zcxv\" (UniqueName: \"kubernetes.io/projected/8646be85-08e4-4f0d-9484-01456e453c36-kube-api-access-9zcxv\") pod \"horizon-6f9cb7fbd9-gcq9k\" (UID: \"8646be85-08e4-4f0d-9484-01456e453c36\") " pod="openstack/horizon-6f9cb7fbd9-gcq9k" Mar 09 10:00:38 crc kubenswrapper[4792]: I0309 10:00:38.634907 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: 
\"kubernetes.io/secret/8646be85-08e4-4f0d-9484-01456e453c36-horizon-secret-key\") pod \"horizon-6f9cb7fbd9-gcq9k\" (UID: \"8646be85-08e4-4f0d-9484-01456e453c36\") " pod="openstack/horizon-6f9cb7fbd9-gcq9k" Mar 09 10:00:38 crc kubenswrapper[4792]: I0309 10:00:38.634942 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aa512688-fab8-4291-9260-52f1812c6b71-logs\") pod \"glance-default-external-api-0\" (UID: \"aa512688-fab8-4291-9260-52f1812c6b71\") " pod="openstack/glance-default-external-api-0" Mar 09 10:00:38 crc kubenswrapper[4792]: I0309 10:00:38.634978 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa512688-fab8-4291-9260-52f1812c6b71-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"aa512688-fab8-4291-9260-52f1812c6b71\") " pod="openstack/glance-default-external-api-0" Mar 09 10:00:38 crc kubenswrapper[4792]: I0309 10:00:38.635027 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"aa512688-fab8-4291-9260-52f1812c6b71\") " pod="openstack/glance-default-external-api-0" Mar 09 10:00:38 crc kubenswrapper[4792]: I0309 10:00:38.635055 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/aa512688-fab8-4291-9260-52f1812c6b71-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"aa512688-fab8-4291-9260-52f1812c6b71\") " pod="openstack/glance-default-external-api-0" Mar 09 10:00:38 crc kubenswrapper[4792]: I0309 10:00:38.635124 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/aa512688-fab8-4291-9260-52f1812c6b71-scripts\") pod \"glance-default-external-api-0\" (UID: \"aa512688-fab8-4291-9260-52f1812c6b71\") " pod="openstack/glance-default-external-api-0" Mar 09 10:00:38 crc kubenswrapper[4792]: I0309 10:00:38.635148 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8646be85-08e4-4f0d-9484-01456e453c36-config-data\") pod \"horizon-6f9cb7fbd9-gcq9k\" (UID: \"8646be85-08e4-4f0d-9484-01456e453c36\") " pod="openstack/horizon-6f9cb7fbd9-gcq9k" Mar 09 10:00:38 crc kubenswrapper[4792]: I0309 10:00:38.635178 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-94ltn\" (UniqueName: \"kubernetes.io/projected/aa512688-fab8-4291-9260-52f1812c6b71-kube-api-access-94ltn\") pod \"glance-default-external-api-0\" (UID: \"aa512688-fab8-4291-9260-52f1812c6b71\") " pod="openstack/glance-default-external-api-0" Mar 09 10:00:38 crc kubenswrapper[4792]: I0309 10:00:38.635213 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa512688-fab8-4291-9260-52f1812c6b71-config-data\") pod \"glance-default-external-api-0\" (UID: \"aa512688-fab8-4291-9260-52f1812c6b71\") " pod="openstack/glance-default-external-api-0" Mar 09 10:00:38 crc kubenswrapper[4792]: I0309 10:00:38.635249 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8646be85-08e4-4f0d-9484-01456e453c36-scripts\") pod \"horizon-6f9cb7fbd9-gcq9k\" (UID: \"8646be85-08e4-4f0d-9484-01456e453c36\") " pod="openstack/horizon-6f9cb7fbd9-gcq9k" Mar 09 10:00:38 crc kubenswrapper[4792]: I0309 10:00:38.638408 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/8646be85-08e4-4f0d-9484-01456e453c36-scripts\") pod \"horizon-6f9cb7fbd9-gcq9k\" (UID: \"8646be85-08e4-4f0d-9484-01456e453c36\") " pod="openstack/horizon-6f9cb7fbd9-gcq9k" Mar 09 10:00:38 crc kubenswrapper[4792]: I0309 10:00:38.638617 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8646be85-08e4-4f0d-9484-01456e453c36-logs\") pod \"horizon-6f9cb7fbd9-gcq9k\" (UID: \"8646be85-08e4-4f0d-9484-01456e453c36\") " pod="openstack/horizon-6f9cb7fbd9-gcq9k" Mar 09 10:00:38 crc kubenswrapper[4792]: I0309 10:00:38.638718 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8646be85-08e4-4f0d-9484-01456e453c36-config-data\") pod \"horizon-6f9cb7fbd9-gcq9k\" (UID: \"8646be85-08e4-4f0d-9484-01456e453c36\") " pod="openstack/horizon-6f9cb7fbd9-gcq9k" Mar 09 10:00:38 crc kubenswrapper[4792]: I0309 10:00:38.639056 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8646be85-08e4-4f0d-9484-01456e453c36-logs\") pod \"horizon-6f9cb7fbd9-gcq9k\" (UID: \"8646be85-08e4-4f0d-9484-01456e453c36\") " pod="openstack/horizon-6f9cb7fbd9-gcq9k" Mar 09 10:00:38 crc kubenswrapper[4792]: I0309 10:00:38.658725 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/8646be85-08e4-4f0d-9484-01456e453c36-horizon-secret-key\") pod \"horizon-6f9cb7fbd9-gcq9k\" (UID: \"8646be85-08e4-4f0d-9484-01456e453c36\") " pod="openstack/horizon-6f9cb7fbd9-gcq9k" Mar 09 10:00:38 crc kubenswrapper[4792]: I0309 10:00:38.682196 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9zcxv\" (UniqueName: \"kubernetes.io/projected/8646be85-08e4-4f0d-9484-01456e453c36-kube-api-access-9zcxv\") pod \"horizon-6f9cb7fbd9-gcq9k\" (UID: 
\"8646be85-08e4-4f0d-9484-01456e453c36\") " pod="openstack/horizon-6f9cb7fbd9-gcq9k" Mar 09 10:00:38 crc kubenswrapper[4792]: I0309 10:00:38.717953 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-796cbc75d5-fwljj"] Mar 09 10:00:38 crc kubenswrapper[4792]: I0309 10:00:38.720117 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-796cbc75d5-fwljj" Mar 09 10:00:38 crc kubenswrapper[4792]: I0309 10:00:38.740314 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aa512688-fab8-4291-9260-52f1812c6b71-scripts\") pod \"glance-default-external-api-0\" (UID: \"aa512688-fab8-4291-9260-52f1812c6b71\") " pod="openstack/glance-default-external-api-0" Mar 09 10:00:38 crc kubenswrapper[4792]: I0309 10:00:38.740601 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-94ltn\" (UniqueName: \"kubernetes.io/projected/aa512688-fab8-4291-9260-52f1812c6b71-kube-api-access-94ltn\") pod \"glance-default-external-api-0\" (UID: \"aa512688-fab8-4291-9260-52f1812c6b71\") " pod="openstack/glance-default-external-api-0" Mar 09 10:00:38 crc kubenswrapper[4792]: I0309 10:00:38.740767 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa512688-fab8-4291-9260-52f1812c6b71-config-data\") pod \"glance-default-external-api-0\" (UID: \"aa512688-fab8-4291-9260-52f1812c6b71\") " pod="openstack/glance-default-external-api-0" Mar 09 10:00:38 crc kubenswrapper[4792]: I0309 10:00:38.740891 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/187135be-27c8-45e7-8a3c-ecd516507200-logs\") pod \"glance-default-internal-api-0\" (UID: \"187135be-27c8-45e7-8a3c-ecd516507200\") " pod="openstack/glance-default-internal-api-0" Mar 09 10:00:38 crc 
kubenswrapper[4792]: I0309 10:00:38.740998 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/187135be-27c8-45e7-8a3c-ecd516507200-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"187135be-27c8-45e7-8a3c-ecd516507200\") " pod="openstack/glance-default-internal-api-0" Mar 09 10:00:38 crc kubenswrapper[4792]: I0309 10:00:38.741142 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/aa512688-fab8-4291-9260-52f1812c6b71-ceph\") pod \"glance-default-external-api-0\" (UID: \"aa512688-fab8-4291-9260-52f1812c6b71\") " pod="openstack/glance-default-external-api-0" Mar 09 10:00:38 crc kubenswrapper[4792]: I0309 10:00:38.741231 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/aa512688-fab8-4291-9260-52f1812c6b71-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"aa512688-fab8-4291-9260-52f1812c6b71\") " pod="openstack/glance-default-external-api-0" Mar 09 10:00:38 crc kubenswrapper[4792]: I0309 10:00:38.741418 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-747zf\" (UniqueName: \"kubernetes.io/projected/187135be-27c8-45e7-8a3c-ecd516507200-kube-api-access-747zf\") pod \"glance-default-internal-api-0\" (UID: \"187135be-27c8-45e7-8a3c-ecd516507200\") " pod="openstack/glance-default-internal-api-0" Mar 09 10:00:38 crc kubenswrapper[4792]: I0309 10:00:38.741557 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aa512688-fab8-4291-9260-52f1812c6b71-logs\") pod \"glance-default-external-api-0\" (UID: \"aa512688-fab8-4291-9260-52f1812c6b71\") " pod="openstack/glance-default-external-api-0" Mar 09 10:00:38 crc kubenswrapper[4792]: I0309 
10:00:38.741682 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa512688-fab8-4291-9260-52f1812c6b71-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"aa512688-fab8-4291-9260-52f1812c6b71\") " pod="openstack/glance-default-external-api-0" Mar 09 10:00:38 crc kubenswrapper[4792]: I0309 10:00:38.741763 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/187135be-27c8-45e7-8a3c-ecd516507200-scripts\") pod \"glance-default-internal-api-0\" (UID: \"187135be-27c8-45e7-8a3c-ecd516507200\") " pod="openstack/glance-default-internal-api-0" Mar 09 10:00:38 crc kubenswrapper[4792]: I0309 10:00:38.741853 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/187135be-27c8-45e7-8a3c-ecd516507200-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"187135be-27c8-45e7-8a3c-ecd516507200\") " pod="openstack/glance-default-internal-api-0" Mar 09 10:00:38 crc kubenswrapper[4792]: I0309 10:00:38.741978 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"aa512688-fab8-4291-9260-52f1812c6b71\") " pod="openstack/glance-default-external-api-0" Mar 09 10:00:38 crc kubenswrapper[4792]: I0309 10:00:38.742089 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/187135be-27c8-45e7-8a3c-ecd516507200-ceph\") pod \"glance-default-internal-api-0\" (UID: \"187135be-27c8-45e7-8a3c-ecd516507200\") " pod="openstack/glance-default-internal-api-0" Mar 09 10:00:38 crc kubenswrapper[4792]: I0309 10:00:38.742186 4792 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/aa512688-fab8-4291-9260-52f1812c6b71-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"aa512688-fab8-4291-9260-52f1812c6b71\") " pod="openstack/glance-default-external-api-0" Mar 09 10:00:38 crc kubenswrapper[4792]: I0309 10:00:38.742311 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"187135be-27c8-45e7-8a3c-ecd516507200\") " pod="openstack/glance-default-internal-api-0" Mar 09 10:00:38 crc kubenswrapper[4792]: I0309 10:00:38.742404 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/187135be-27c8-45e7-8a3c-ecd516507200-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"187135be-27c8-45e7-8a3c-ecd516507200\") " pod="openstack/glance-default-internal-api-0" Mar 09 10:00:38 crc kubenswrapper[4792]: I0309 10:00:38.742502 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/187135be-27c8-45e7-8a3c-ecd516507200-config-data\") pod \"glance-default-internal-api-0\" (UID: \"187135be-27c8-45e7-8a3c-ecd516507200\") " pod="openstack/glance-default-internal-api-0" Mar 09 10:00:38 crc kubenswrapper[4792]: I0309 10:00:38.745679 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aa512688-fab8-4291-9260-52f1812c6b71-scripts\") pod \"glance-default-external-api-0\" (UID: \"aa512688-fab8-4291-9260-52f1812c6b71\") " pod="openstack/glance-default-external-api-0" Mar 09 10:00:38 crc kubenswrapper[4792]: I0309 10:00:38.746586 4792 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa512688-fab8-4291-9260-52f1812c6b71-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"aa512688-fab8-4291-9260-52f1812c6b71\") " pod="openstack/glance-default-external-api-0" Mar 09 10:00:38 crc kubenswrapper[4792]: I0309 10:00:38.749479 4792 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"aa512688-fab8-4291-9260-52f1812c6b71\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/glance-default-external-api-0" Mar 09 10:00:38 crc kubenswrapper[4792]: I0309 10:00:38.749619 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aa512688-fab8-4291-9260-52f1812c6b71-logs\") pod \"glance-default-external-api-0\" (UID: \"aa512688-fab8-4291-9260-52f1812c6b71\") " pod="openstack/glance-default-external-api-0" Mar 09 10:00:38 crc kubenswrapper[4792]: I0309 10:00:38.749669 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-796cbc75d5-fwljj"] Mar 09 10:00:38 crc kubenswrapper[4792]: I0309 10:00:38.750104 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/aa512688-fab8-4291-9260-52f1812c6b71-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"aa512688-fab8-4291-9260-52f1812c6b71\") " pod="openstack/glance-default-external-api-0" Mar 09 10:00:38 crc kubenswrapper[4792]: I0309 10:00:38.755177 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/aa512688-fab8-4291-9260-52f1812c6b71-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"aa512688-fab8-4291-9260-52f1812c6b71\") " pod="openstack/glance-default-external-api-0" Mar 09 10:00:38 crc kubenswrapper[4792]: 
I0309 10:00:38.764866 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6f9cb7fbd9-gcq9k" Mar 09 10:00:38 crc kubenswrapper[4792]: I0309 10:00:38.778655 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/aa512688-fab8-4291-9260-52f1812c6b71-ceph\") pod \"glance-default-external-api-0\" (UID: \"aa512688-fab8-4291-9260-52f1812c6b71\") " pod="openstack/glance-default-external-api-0" Mar 09 10:00:38 crc kubenswrapper[4792]: I0309 10:00:38.783328 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa512688-fab8-4291-9260-52f1812c6b71-config-data\") pod \"glance-default-external-api-0\" (UID: \"aa512688-fab8-4291-9260-52f1812c6b71\") " pod="openstack/glance-default-external-api-0" Mar 09 10:00:38 crc kubenswrapper[4792]: I0309 10:00:38.783566 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 09 10:00:38 crc kubenswrapper[4792]: E0309 10:00:38.788965 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[ceph combined-ca-bundle config-data glance httpd-run internal-tls-certs kube-api-access-747zf logs scripts], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/glance-default-internal-api-0" podUID="187135be-27c8-45e7-8a3c-ecd516507200" Mar 09 10:00:38 crc kubenswrapper[4792]: I0309 10:00:38.835147 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-94ltn\" (UniqueName: \"kubernetes.io/projected/aa512688-fab8-4291-9260-52f1812c6b71-kube-api-access-94ltn\") pod \"glance-default-external-api-0\" (UID: \"aa512688-fab8-4291-9260-52f1812c6b71\") " pod="openstack/glance-default-external-api-0" Mar 09 10:00:38 crc kubenswrapper[4792]: I0309 10:00:38.845151 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-4w2xn\" (UniqueName: \"kubernetes.io/projected/2c175458-c03c-4901-a2ec-6250b13a461b-kube-api-access-4w2xn\") pod \"horizon-796cbc75d5-fwljj\" (UID: \"2c175458-c03c-4901-a2ec-6250b13a461b\") " pod="openstack/horizon-796cbc75d5-fwljj" Mar 09 10:00:38 crc kubenswrapper[4792]: I0309 10:00:38.845438 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/187135be-27c8-45e7-8a3c-ecd516507200-scripts\") pod \"glance-default-internal-api-0\" (UID: \"187135be-27c8-45e7-8a3c-ecd516507200\") " pod="openstack/glance-default-internal-api-0" Mar 09 10:00:38 crc kubenswrapper[4792]: I0309 10:00:38.845530 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/187135be-27c8-45e7-8a3c-ecd516507200-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"187135be-27c8-45e7-8a3c-ecd516507200\") " pod="openstack/glance-default-internal-api-0" Mar 09 10:00:38 crc kubenswrapper[4792]: I0309 10:00:38.845609 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2c175458-c03c-4901-a2ec-6250b13a461b-scripts\") pod \"horizon-796cbc75d5-fwljj\" (UID: \"2c175458-c03c-4901-a2ec-6250b13a461b\") " pod="openstack/horizon-796cbc75d5-fwljj" Mar 09 10:00:38 crc kubenswrapper[4792]: I0309 10:00:38.845734 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/187135be-27c8-45e7-8a3c-ecd516507200-ceph\") pod \"glance-default-internal-api-0\" (UID: \"187135be-27c8-45e7-8a3c-ecd516507200\") " pod="openstack/glance-default-internal-api-0" Mar 09 10:00:38 crc kubenswrapper[4792]: I0309 10:00:38.845871 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"187135be-27c8-45e7-8a3c-ecd516507200\") " pod="openstack/glance-default-internal-api-0" Mar 09 10:00:38 crc kubenswrapper[4792]: I0309 10:00:38.845955 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/187135be-27c8-45e7-8a3c-ecd516507200-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"187135be-27c8-45e7-8a3c-ecd516507200\") " pod="openstack/glance-default-internal-api-0" Mar 09 10:00:38 crc kubenswrapper[4792]: I0309 10:00:38.846034 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/187135be-27c8-45e7-8a3c-ecd516507200-config-data\") pod \"glance-default-internal-api-0\" (UID: \"187135be-27c8-45e7-8a3c-ecd516507200\") " pod="openstack/glance-default-internal-api-0" Mar 09 10:00:38 crc kubenswrapper[4792]: I0309 10:00:38.846128 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2c175458-c03c-4901-a2ec-6250b13a461b-logs\") pod \"horizon-796cbc75d5-fwljj\" (UID: \"2c175458-c03c-4901-a2ec-6250b13a461b\") " pod="openstack/horizon-796cbc75d5-fwljj" Mar 09 10:00:38 crc kubenswrapper[4792]: I0309 10:00:38.846217 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2c175458-c03c-4901-a2ec-6250b13a461b-config-data\") pod \"horizon-796cbc75d5-fwljj\" (UID: \"2c175458-c03c-4901-a2ec-6250b13a461b\") " pod="openstack/horizon-796cbc75d5-fwljj" Mar 09 10:00:38 crc kubenswrapper[4792]: I0309 10:00:38.846297 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: 
\"kubernetes.io/secret/2c175458-c03c-4901-a2ec-6250b13a461b-horizon-secret-key\") pod \"horizon-796cbc75d5-fwljj\" (UID: \"2c175458-c03c-4901-a2ec-6250b13a461b\") " pod="openstack/horizon-796cbc75d5-fwljj" Mar 09 10:00:38 crc kubenswrapper[4792]: I0309 10:00:38.846399 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/187135be-27c8-45e7-8a3c-ecd516507200-logs\") pod \"glance-default-internal-api-0\" (UID: \"187135be-27c8-45e7-8a3c-ecd516507200\") " pod="openstack/glance-default-internal-api-0" Mar 09 10:00:38 crc kubenswrapper[4792]: I0309 10:00:38.846483 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/187135be-27c8-45e7-8a3c-ecd516507200-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"187135be-27c8-45e7-8a3c-ecd516507200\") " pod="openstack/glance-default-internal-api-0" Mar 09 10:00:38 crc kubenswrapper[4792]: I0309 10:00:38.846611 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-747zf\" (UniqueName: \"kubernetes.io/projected/187135be-27c8-45e7-8a3c-ecd516507200-kube-api-access-747zf\") pod \"glance-default-internal-api-0\" (UID: \"187135be-27c8-45e7-8a3c-ecd516507200\") " pod="openstack/glance-default-internal-api-0" Mar 09 10:00:38 crc kubenswrapper[4792]: I0309 10:00:38.856470 4792 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"187135be-27c8-45e7-8a3c-ecd516507200\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/glance-default-internal-api-0" Mar 09 10:00:38 crc kubenswrapper[4792]: I0309 10:00:38.870109 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/187135be-27c8-45e7-8a3c-ecd516507200-logs\") 
pod \"glance-default-internal-api-0\" (UID: \"187135be-27c8-45e7-8a3c-ecd516507200\") " pod="openstack/glance-default-internal-api-0" Mar 09 10:00:38 crc kubenswrapper[4792]: I0309 10:00:38.870387 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/187135be-27c8-45e7-8a3c-ecd516507200-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"187135be-27c8-45e7-8a3c-ecd516507200\") " pod="openstack/glance-default-internal-api-0" Mar 09 10:00:38 crc kubenswrapper[4792]: I0309 10:00:38.871092 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/187135be-27c8-45e7-8a3c-ecd516507200-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"187135be-27c8-45e7-8a3c-ecd516507200\") " pod="openstack/glance-default-internal-api-0" Mar 09 10:00:38 crc kubenswrapper[4792]: I0309 10:00:38.875352 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/187135be-27c8-45e7-8a3c-ecd516507200-ceph\") pod \"glance-default-internal-api-0\" (UID: \"187135be-27c8-45e7-8a3c-ecd516507200\") " pod="openstack/glance-default-internal-api-0" Mar 09 10:00:38 crc kubenswrapper[4792]: I0309 10:00:38.880650 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/187135be-27c8-45e7-8a3c-ecd516507200-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"187135be-27c8-45e7-8a3c-ecd516507200\") " pod="openstack/glance-default-internal-api-0" Mar 09 10:00:38 crc kubenswrapper[4792]: I0309 10:00:38.883252 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-291a-account-create-update-jcsfx"] Mar 09 10:00:38 crc kubenswrapper[4792]: I0309 10:00:38.884414 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-291a-account-create-update-jcsfx" Mar 09 10:00:38 crc kubenswrapper[4792]: I0309 10:00:38.888295 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/187135be-27c8-45e7-8a3c-ecd516507200-scripts\") pod \"glance-default-internal-api-0\" (UID: \"187135be-27c8-45e7-8a3c-ecd516507200\") " pod="openstack/glance-default-internal-api-0" Mar 09 10:00:38 crc kubenswrapper[4792]: I0309 10:00:38.890418 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-db-secret" Mar 09 10:00:38 crc kubenswrapper[4792]: I0309 10:00:38.892514 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/187135be-27c8-45e7-8a3c-ecd516507200-config-data\") pod \"glance-default-internal-api-0\" (UID: \"187135be-27c8-45e7-8a3c-ecd516507200\") " pod="openstack/glance-default-internal-api-0" Mar 09 10:00:38 crc kubenswrapper[4792]: I0309 10:00:38.906297 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-db-create-57k9s"] Mar 09 10:00:38 crc kubenswrapper[4792]: I0309 10:00:38.907668 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-create-57k9s" Mar 09 10:00:38 crc kubenswrapper[4792]: I0309 10:00:38.908600 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-747zf\" (UniqueName: \"kubernetes.io/projected/187135be-27c8-45e7-8a3c-ecd516507200-kube-api-access-747zf\") pod \"glance-default-internal-api-0\" (UID: \"187135be-27c8-45e7-8a3c-ecd516507200\") " pod="openstack/glance-default-internal-api-0" Mar 09 10:00:38 crc kubenswrapper[4792]: I0309 10:00:38.919670 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-291a-account-create-update-jcsfx"] Mar 09 10:00:38 crc kubenswrapper[4792]: I0309 10:00:38.939783 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-create-57k9s"] Mar 09 10:00:38 crc kubenswrapper[4792]: I0309 10:00:38.941227 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"aa512688-fab8-4291-9260-52f1812c6b71\") " pod="openstack/glance-default-external-api-0" Mar 09 10:00:38 crc kubenswrapper[4792]: I0309 10:00:38.950643 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4w2xn\" (UniqueName: \"kubernetes.io/projected/2c175458-c03c-4901-a2ec-6250b13a461b-kube-api-access-4w2xn\") pod \"horizon-796cbc75d5-fwljj\" (UID: \"2c175458-c03c-4901-a2ec-6250b13a461b\") " pod="openstack/horizon-796cbc75d5-fwljj" Mar 09 10:00:38 crc kubenswrapper[4792]: I0309 10:00:38.950731 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2c175458-c03c-4901-a2ec-6250b13a461b-scripts\") pod \"horizon-796cbc75d5-fwljj\" (UID: \"2c175458-c03c-4901-a2ec-6250b13a461b\") " pod="openstack/horizon-796cbc75d5-fwljj" Mar 09 10:00:38 crc kubenswrapper[4792]: I0309 10:00:38.950806 4792 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b6hkp\" (UniqueName: \"kubernetes.io/projected/6d8a53c4-679b-42ff-81a7-264b779e3486-kube-api-access-b6hkp\") pod \"manila-291a-account-create-update-jcsfx\" (UID: \"6d8a53c4-679b-42ff-81a7-264b779e3486\") " pod="openstack/manila-291a-account-create-update-jcsfx" Mar 09 10:00:38 crc kubenswrapper[4792]: I0309 10:00:38.950887 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rx5zn\" (UniqueName: \"kubernetes.io/projected/3823d2bd-f551-4107-a220-6e97655832e1-kube-api-access-rx5zn\") pod \"manila-db-create-57k9s\" (UID: \"3823d2bd-f551-4107-a220-6e97655832e1\") " pod="openstack/manila-db-create-57k9s" Mar 09 10:00:38 crc kubenswrapper[4792]: I0309 10:00:38.950948 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3823d2bd-f551-4107-a220-6e97655832e1-operator-scripts\") pod \"manila-db-create-57k9s\" (UID: \"3823d2bd-f551-4107-a220-6e97655832e1\") " pod="openstack/manila-db-create-57k9s" Mar 09 10:00:38 crc kubenswrapper[4792]: I0309 10:00:38.951115 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2c175458-c03c-4901-a2ec-6250b13a461b-logs\") pod \"horizon-796cbc75d5-fwljj\" (UID: \"2c175458-c03c-4901-a2ec-6250b13a461b\") " pod="openstack/horizon-796cbc75d5-fwljj" Mar 09 10:00:38 crc kubenswrapper[4792]: I0309 10:00:38.951196 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2c175458-c03c-4901-a2ec-6250b13a461b-config-data\") pod \"horizon-796cbc75d5-fwljj\" (UID: \"2c175458-c03c-4901-a2ec-6250b13a461b\") " pod="openstack/horizon-796cbc75d5-fwljj" Mar 09 10:00:38 crc kubenswrapper[4792]: I0309 10:00:38.951237 4792 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/2c175458-c03c-4901-a2ec-6250b13a461b-horizon-secret-key\") pod \"horizon-796cbc75d5-fwljj\" (UID: \"2c175458-c03c-4901-a2ec-6250b13a461b\") " pod="openstack/horizon-796cbc75d5-fwljj" Mar 09 10:00:38 crc kubenswrapper[4792]: I0309 10:00:38.951334 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6d8a53c4-679b-42ff-81a7-264b779e3486-operator-scripts\") pod \"manila-291a-account-create-update-jcsfx\" (UID: \"6d8a53c4-679b-42ff-81a7-264b779e3486\") " pod="openstack/manila-291a-account-create-update-jcsfx" Mar 09 10:00:38 crc kubenswrapper[4792]: I0309 10:00:38.952323 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2c175458-c03c-4901-a2ec-6250b13a461b-scripts\") pod \"horizon-796cbc75d5-fwljj\" (UID: \"2c175458-c03c-4901-a2ec-6250b13a461b\") " pod="openstack/horizon-796cbc75d5-fwljj" Mar 09 10:00:38 crc kubenswrapper[4792]: I0309 10:00:38.952371 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2c175458-c03c-4901-a2ec-6250b13a461b-logs\") pod \"horizon-796cbc75d5-fwljj\" (UID: \"2c175458-c03c-4901-a2ec-6250b13a461b\") " pod="openstack/horizon-796cbc75d5-fwljj" Mar 09 10:00:38 crc kubenswrapper[4792]: I0309 10:00:38.954012 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2c175458-c03c-4901-a2ec-6250b13a461b-config-data\") pod \"horizon-796cbc75d5-fwljj\" (UID: \"2c175458-c03c-4901-a2ec-6250b13a461b\") " pod="openstack/horizon-796cbc75d5-fwljj" Mar 09 10:00:38 crc kubenswrapper[4792]: I0309 10:00:38.969308 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" 
(UniqueName: \"kubernetes.io/secret/2c175458-c03c-4901-a2ec-6250b13a461b-horizon-secret-key\") pod \"horizon-796cbc75d5-fwljj\" (UID: \"2c175458-c03c-4901-a2ec-6250b13a461b\") " pod="openstack/horizon-796cbc75d5-fwljj" Mar 09 10:00:38 crc kubenswrapper[4792]: I0309 10:00:38.989016 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"187135be-27c8-45e7-8a3c-ecd516507200\") " pod="openstack/glance-default-internal-api-0" Mar 09 10:00:38 crc kubenswrapper[4792]: I0309 10:00:38.990956 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-volume1-0"] Mar 09 10:00:39 crc kubenswrapper[4792]: I0309 10:00:39.009827 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4w2xn\" (UniqueName: \"kubernetes.io/projected/2c175458-c03c-4901-a2ec-6250b13a461b-kube-api-access-4w2xn\") pod \"horizon-796cbc75d5-fwljj\" (UID: \"2c175458-c03c-4901-a2ec-6250b13a461b\") " pod="openstack/horizon-796cbc75d5-fwljj" Mar 09 10:00:39 crc kubenswrapper[4792]: I0309 10:00:39.047597 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-796cbc75d5-fwljj" Mar 09 10:00:39 crc kubenswrapper[4792]: I0309 10:00:39.070708 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b6hkp\" (UniqueName: \"kubernetes.io/projected/6d8a53c4-679b-42ff-81a7-264b779e3486-kube-api-access-b6hkp\") pod \"manila-291a-account-create-update-jcsfx\" (UID: \"6d8a53c4-679b-42ff-81a7-264b779e3486\") " pod="openstack/manila-291a-account-create-update-jcsfx" Mar 09 10:00:39 crc kubenswrapper[4792]: I0309 10:00:39.070795 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rx5zn\" (UniqueName: \"kubernetes.io/projected/3823d2bd-f551-4107-a220-6e97655832e1-kube-api-access-rx5zn\") pod \"manila-db-create-57k9s\" (UID: \"3823d2bd-f551-4107-a220-6e97655832e1\") " pod="openstack/manila-db-create-57k9s" Mar 09 10:00:39 crc kubenswrapper[4792]: I0309 10:00:39.070851 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3823d2bd-f551-4107-a220-6e97655832e1-operator-scripts\") pod \"manila-db-create-57k9s\" (UID: \"3823d2bd-f551-4107-a220-6e97655832e1\") " pod="openstack/manila-db-create-57k9s" Mar 09 10:00:39 crc kubenswrapper[4792]: I0309 10:00:39.071009 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6d8a53c4-679b-42ff-81a7-264b779e3486-operator-scripts\") pod \"manila-291a-account-create-update-jcsfx\" (UID: \"6d8a53c4-679b-42ff-81a7-264b779e3486\") " pod="openstack/manila-291a-account-create-update-jcsfx" Mar 09 10:00:39 crc kubenswrapper[4792]: I0309 10:00:39.073195 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6d8a53c4-679b-42ff-81a7-264b779e3486-operator-scripts\") pod \"manila-291a-account-create-update-jcsfx\" (UID: 
\"6d8a53c4-679b-42ff-81a7-264b779e3486\") " pod="openstack/manila-291a-account-create-update-jcsfx" Mar 09 10:00:39 crc kubenswrapper[4792]: I0309 10:00:39.075063 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3823d2bd-f551-4107-a220-6e97655832e1-operator-scripts\") pod \"manila-db-create-57k9s\" (UID: \"3823d2bd-f551-4107-a220-6e97655832e1\") " pod="openstack/manila-db-create-57k9s" Mar 09 10:00:39 crc kubenswrapper[4792]: I0309 10:00:39.098211 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b6hkp\" (UniqueName: \"kubernetes.io/projected/6d8a53c4-679b-42ff-81a7-264b779e3486-kube-api-access-b6hkp\") pod \"manila-291a-account-create-update-jcsfx\" (UID: \"6d8a53c4-679b-42ff-81a7-264b779e3486\") " pod="openstack/manila-291a-account-create-update-jcsfx" Mar 09 10:00:39 crc kubenswrapper[4792]: I0309 10:00:39.109483 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rx5zn\" (UniqueName: \"kubernetes.io/projected/3823d2bd-f551-4107-a220-6e97655832e1-kube-api-access-rx5zn\") pod \"manila-db-create-57k9s\" (UID: \"3823d2bd-f551-4107-a220-6e97655832e1\") " pod="openstack/manila-db-create-57k9s" Mar 09 10:00:39 crc kubenswrapper[4792]: I0309 10:00:39.119271 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-291a-account-create-update-jcsfx" Mar 09 10:00:39 crc kubenswrapper[4792]: I0309 10:00:39.152120 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-create-57k9s" Mar 09 10:00:39 crc kubenswrapper[4792]: I0309 10:00:39.159763 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-backup-0"] Mar 09 10:00:39 crc kubenswrapper[4792]: W0309 10:00:39.160727 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2eaee6b3_8397_430c_b799_7628762d1701.slice/crio-f554699f58bfd5b9a443a73f2f87c551abd430f064862a99168a388ab83a9b87 WatchSource:0}: Error finding container f554699f58bfd5b9a443a73f2f87c551abd430f064862a99168a388ab83a9b87: Status 404 returned error can't find the container with id f554699f58bfd5b9a443a73f2f87c551abd430f064862a99168a388ab83a9b87 Mar 09 10:00:39 crc kubenswrapper[4792]: I0309 10:00:39.440531 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"61d989fe-045d-4c58-b660-f9d0e1a482f9","Type":"ContainerStarted","Data":"ec19c3155dd07b48621a303a9fadc6ee3d7d80f356d2f8081aa487fa46030a88"} Mar 09 10:00:39 crc kubenswrapper[4792]: I0309 10:00:39.449477 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 09 10:00:39 crc kubenswrapper[4792]: I0309 10:00:39.449662 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"2eaee6b3-8397-430c-b799-7628762d1701","Type":"ContainerStarted","Data":"f554699f58bfd5b9a443a73f2f87c551abd430f064862a99168a388ab83a9b87"} Mar 09 10:00:39 crc kubenswrapper[4792]: I0309 10:00:39.450187 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 09 10:00:39 crc kubenswrapper[4792]: I0309 10:00:39.484043 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 09 10:00:39 crc kubenswrapper[4792]: I0309 10:00:39.496660 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 09 10:00:39 crc kubenswrapper[4792]: I0309 10:00:39.581199 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-796cbc75d5-fwljj"] Mar 09 10:00:39 crc kubenswrapper[4792]: I0309 10:00:39.620438 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa512688-fab8-4291-9260-52f1812c6b71-combined-ca-bundle\") pod \"aa512688-fab8-4291-9260-52f1812c6b71\" (UID: \"aa512688-fab8-4291-9260-52f1812c6b71\") " Mar 09 10:00:39 crc kubenswrapper[4792]: I0309 10:00:39.623164 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/187135be-27c8-45e7-8a3c-ecd516507200-combined-ca-bundle\") pod \"187135be-27c8-45e7-8a3c-ecd516507200\" (UID: \"187135be-27c8-45e7-8a3c-ecd516507200\") " Mar 09 10:00:39 crc kubenswrapper[4792]: I0309 10:00:39.624782 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/187135be-27c8-45e7-8a3c-ecd516507200-internal-tls-certs\") pod \"187135be-27c8-45e7-8a3c-ecd516507200\" (UID: \"187135be-27c8-45e7-8a3c-ecd516507200\") " Mar 09 10:00:39 crc kubenswrapper[4792]: I0309 10:00:39.625141 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/aa512688-fab8-4291-9260-52f1812c6b71-ceph\") pod \"aa512688-fab8-4291-9260-52f1812c6b71\" (UID: \"aa512688-fab8-4291-9260-52f1812c6b71\") " Mar 09 10:00:39 crc kubenswrapper[4792]: I0309 10:00:39.625207 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: 
\"kubernetes.io/projected/187135be-27c8-45e7-8a3c-ecd516507200-ceph\") pod \"187135be-27c8-45e7-8a3c-ecd516507200\" (UID: \"187135be-27c8-45e7-8a3c-ecd516507200\") " Mar 09 10:00:39 crc kubenswrapper[4792]: I0309 10:00:39.625431 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/187135be-27c8-45e7-8a3c-ecd516507200-logs\") pod \"187135be-27c8-45e7-8a3c-ecd516507200\" (UID: \"187135be-27c8-45e7-8a3c-ecd516507200\") " Mar 09 10:00:39 crc kubenswrapper[4792]: I0309 10:00:39.625571 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/187135be-27c8-45e7-8a3c-ecd516507200-scripts\") pod \"187135be-27c8-45e7-8a3c-ecd516507200\" (UID: \"187135be-27c8-45e7-8a3c-ecd516507200\") " Mar 09 10:00:39 crc kubenswrapper[4792]: I0309 10:00:39.625762 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/187135be-27c8-45e7-8a3c-ecd516507200-httpd-run\") pod \"187135be-27c8-45e7-8a3c-ecd516507200\" (UID: \"187135be-27c8-45e7-8a3c-ecd516507200\") " Mar 09 10:00:39 crc kubenswrapper[4792]: I0309 10:00:39.625919 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/aa512688-fab8-4291-9260-52f1812c6b71-httpd-run\") pod \"aa512688-fab8-4291-9260-52f1812c6b71\" (UID: \"aa512688-fab8-4291-9260-52f1812c6b71\") " Mar 09 10:00:39 crc kubenswrapper[4792]: I0309 10:00:39.625959 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"187135be-27c8-45e7-8a3c-ecd516507200\" (UID: \"187135be-27c8-45e7-8a3c-ecd516507200\") " Mar 09 10:00:39 crc kubenswrapper[4792]: I0309 10:00:39.630082 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/aa512688-fab8-4291-9260-52f1812c6b71-logs\") pod \"aa512688-fab8-4291-9260-52f1812c6b71\" (UID: \"aa512688-fab8-4291-9260-52f1812c6b71\") " Mar 09 10:00:39 crc kubenswrapper[4792]: I0309 10:00:39.630392 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-94ltn\" (UniqueName: \"kubernetes.io/projected/aa512688-fab8-4291-9260-52f1812c6b71-kube-api-access-94ltn\") pod \"aa512688-fab8-4291-9260-52f1812c6b71\" (UID: \"aa512688-fab8-4291-9260-52f1812c6b71\") " Mar 09 10:00:39 crc kubenswrapper[4792]: I0309 10:00:39.630721 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aa512688-fab8-4291-9260-52f1812c6b71-scripts\") pod \"aa512688-fab8-4291-9260-52f1812c6b71\" (UID: \"aa512688-fab8-4291-9260-52f1812c6b71\") " Mar 09 10:00:39 crc kubenswrapper[4792]: I0309 10:00:39.634404 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa512688-fab8-4291-9260-52f1812c6b71-config-data\") pod \"aa512688-fab8-4291-9260-52f1812c6b71\" (UID: \"aa512688-fab8-4291-9260-52f1812c6b71\") " Mar 09 10:00:39 crc kubenswrapper[4792]: I0309 10:00:39.634465 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"aa512688-fab8-4291-9260-52f1812c6b71\" (UID: \"aa512688-fab8-4291-9260-52f1812c6b71\") " Mar 09 10:00:39 crc kubenswrapper[4792]: I0309 10:00:39.634760 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-747zf\" (UniqueName: \"kubernetes.io/projected/187135be-27c8-45e7-8a3c-ecd516507200-kube-api-access-747zf\") pod \"187135be-27c8-45e7-8a3c-ecd516507200\" (UID: \"187135be-27c8-45e7-8a3c-ecd516507200\") " Mar 09 10:00:39 crc kubenswrapper[4792]: I0309 10:00:39.635014 4792 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/187135be-27c8-45e7-8a3c-ecd516507200-config-data\") pod \"187135be-27c8-45e7-8a3c-ecd516507200\" (UID: \"187135be-27c8-45e7-8a3c-ecd516507200\") " Mar 09 10:00:39 crc kubenswrapper[4792]: I0309 10:00:39.639313 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/aa512688-fab8-4291-9260-52f1812c6b71-public-tls-certs\") pod \"aa512688-fab8-4291-9260-52f1812c6b71\" (UID: \"aa512688-fab8-4291-9260-52f1812c6b71\") " Mar 09 10:00:39 crc kubenswrapper[4792]: I0309 10:00:39.641641 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/187135be-27c8-45e7-8a3c-ecd516507200-logs" (OuterVolumeSpecName: "logs") pod "187135be-27c8-45e7-8a3c-ecd516507200" (UID: "187135be-27c8-45e7-8a3c-ecd516507200"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 10:00:39 crc kubenswrapper[4792]: I0309 10:00:39.648172 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa512688-fab8-4291-9260-52f1812c6b71-config-data" (OuterVolumeSpecName: "config-data") pod "aa512688-fab8-4291-9260-52f1812c6b71" (UID: "aa512688-fab8-4291-9260-52f1812c6b71"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 10:00:39 crc kubenswrapper[4792]: I0309 10:00:39.652857 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/187135be-27c8-45e7-8a3c-ecd516507200-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "187135be-27c8-45e7-8a3c-ecd516507200" (UID: "187135be-27c8-45e7-8a3c-ecd516507200"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 10:00:39 crc kubenswrapper[4792]: I0309 10:00:39.653365 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aa512688-fab8-4291-9260-52f1812c6b71-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "aa512688-fab8-4291-9260-52f1812c6b71" (UID: "aa512688-fab8-4291-9260-52f1812c6b71"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 10:00:39 crc kubenswrapper[4792]: I0309 10:00:39.653957 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6f9cb7fbd9-gcq9k"] Mar 09 10:00:39 crc kubenswrapper[4792]: I0309 10:00:39.654744 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aa512688-fab8-4291-9260-52f1812c6b71-logs" (OuterVolumeSpecName: "logs") pod "aa512688-fab8-4291-9260-52f1812c6b71" (UID: "aa512688-fab8-4291-9260-52f1812c6b71"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 10:00:39 crc kubenswrapper[4792]: I0309 10:00:39.656585 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/187135be-27c8-45e7-8a3c-ecd516507200-ceph" (OuterVolumeSpecName: "ceph") pod "187135be-27c8-45e7-8a3c-ecd516507200" (UID: "187135be-27c8-45e7-8a3c-ecd516507200"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 10:00:39 crc kubenswrapper[4792]: I0309 10:00:39.656744 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/187135be-27c8-45e7-8a3c-ecd516507200-scripts" (OuterVolumeSpecName: "scripts") pod "187135be-27c8-45e7-8a3c-ecd516507200" (UID: "187135be-27c8-45e7-8a3c-ecd516507200"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 10:00:39 crc kubenswrapper[4792]: I0309 10:00:39.658755 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa512688-fab8-4291-9260-52f1812c6b71-kube-api-access-94ltn" (OuterVolumeSpecName: "kube-api-access-94ltn") pod "aa512688-fab8-4291-9260-52f1812c6b71" (UID: "aa512688-fab8-4291-9260-52f1812c6b71"). InnerVolumeSpecName "kube-api-access-94ltn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 10:00:39 crc kubenswrapper[4792]: I0309 10:00:39.659937 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa512688-fab8-4291-9260-52f1812c6b71-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "aa512688-fab8-4291-9260-52f1812c6b71" (UID: "aa512688-fab8-4291-9260-52f1812c6b71"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 10:00:39 crc kubenswrapper[4792]: I0309 10:00:39.665015 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/187135be-27c8-45e7-8a3c-ecd516507200-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "187135be-27c8-45e7-8a3c-ecd516507200" (UID: "187135be-27c8-45e7-8a3c-ecd516507200"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 10:00:39 crc kubenswrapper[4792]: I0309 10:00:39.667905 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/187135be-27c8-45e7-8a3c-ecd516507200-kube-api-access-747zf" (OuterVolumeSpecName: "kube-api-access-747zf") pod "187135be-27c8-45e7-8a3c-ecd516507200" (UID: "187135be-27c8-45e7-8a3c-ecd516507200"). InnerVolumeSpecName "kube-api-access-747zf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 10:00:39 crc kubenswrapper[4792]: I0309 10:00:39.674885 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa512688-fab8-4291-9260-52f1812c6b71-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "aa512688-fab8-4291-9260-52f1812c6b71" (UID: "aa512688-fab8-4291-9260-52f1812c6b71"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 10:00:39 crc kubenswrapper[4792]: I0309 10:00:39.683466 4792 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa512688-fab8-4291-9260-52f1812c6b71-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 10:00:39 crc kubenswrapper[4792]: I0309 10:00:39.684331 4792 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/187135be-27c8-45e7-8a3c-ecd516507200-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 09 10:00:39 crc kubenswrapper[4792]: I0309 10:00:39.692621 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa512688-fab8-4291-9260-52f1812c6b71-ceph" (OuterVolumeSpecName: "ceph") pod "aa512688-fab8-4291-9260-52f1812c6b71" (UID: "aa512688-fab8-4291-9260-52f1812c6b71"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 10:00:39 crc kubenswrapper[4792]: I0309 10:00:39.698468 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa512688-fab8-4291-9260-52f1812c6b71-scripts" (OuterVolumeSpecName: "scripts") pod "aa512688-fab8-4291-9260-52f1812c6b71" (UID: "aa512688-fab8-4291-9260-52f1812c6b71"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 10:00:39 crc kubenswrapper[4792]: I0309 10:00:39.688747 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "glance") pod "187135be-27c8-45e7-8a3c-ecd516507200" (UID: "187135be-27c8-45e7-8a3c-ecd516507200"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 09 10:00:39 crc kubenswrapper[4792]: I0309 10:00:39.699214 4792 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/187135be-27c8-45e7-8a3c-ecd516507200-ceph\") on node \"crc\" DevicePath \"\"" Mar 09 10:00:39 crc kubenswrapper[4792]: I0309 10:00:39.699307 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/187135be-27c8-45e7-8a3c-ecd516507200-config-data" (OuterVolumeSpecName: "config-data") pod "187135be-27c8-45e7-8a3c-ecd516507200" (UID: "187135be-27c8-45e7-8a3c-ecd516507200"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 10:00:39 crc kubenswrapper[4792]: I0309 10:00:39.713920 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/187135be-27c8-45e7-8a3c-ecd516507200-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "187135be-27c8-45e7-8a3c-ecd516507200" (UID: "187135be-27c8-45e7-8a3c-ecd516507200"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 10:00:39 crc kubenswrapper[4792]: I0309 10:00:39.726648 4792 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/187135be-27c8-45e7-8a3c-ecd516507200-logs\") on node \"crc\" DevicePath \"\"" Mar 09 10:00:39 crc kubenswrapper[4792]: I0309 10:00:39.730193 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "glance") pod "aa512688-fab8-4291-9260-52f1812c6b71" (UID: "aa512688-fab8-4291-9260-52f1812c6b71"). InnerVolumeSpecName "local-storage12-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 09 10:00:39 crc kubenswrapper[4792]: I0309 10:00:39.736711 4792 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/187135be-27c8-45e7-8a3c-ecd516507200-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 10:00:39 crc kubenswrapper[4792]: I0309 10:00:39.736750 4792 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/187135be-27c8-45e7-8a3c-ecd516507200-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 09 10:00:39 crc kubenswrapper[4792]: I0309 10:00:39.736761 4792 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/aa512688-fab8-4291-9260-52f1812c6b71-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 09 10:00:39 crc kubenswrapper[4792]: I0309 10:00:39.736770 4792 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aa512688-fab8-4291-9260-52f1812c6b71-logs\") on node \"crc\" DevicePath \"\"" Mar 09 10:00:39 crc kubenswrapper[4792]: I0309 10:00:39.736781 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-94ltn\" (UniqueName: \"kubernetes.io/projected/aa512688-fab8-4291-9260-52f1812c6b71-kube-api-access-94ltn\") on node \"crc\" 
DevicePath \"\"" Mar 09 10:00:39 crc kubenswrapper[4792]: I0309 10:00:39.736792 4792 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa512688-fab8-4291-9260-52f1812c6b71-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 10:00:39 crc kubenswrapper[4792]: I0309 10:00:39.736800 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-747zf\" (UniqueName: \"kubernetes.io/projected/187135be-27c8-45e7-8a3c-ecd516507200-kube-api-access-747zf\") on node \"crc\" DevicePath \"\"" Mar 09 10:00:39 crc kubenswrapper[4792]: I0309 10:00:39.838486 4792 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/187135be-27c8-45e7-8a3c-ecd516507200-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 10:00:39 crc kubenswrapper[4792]: I0309 10:00:39.838746 4792 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/aa512688-fab8-4291-9260-52f1812c6b71-ceph\") on node \"crc\" DevicePath \"\"" Mar 09 10:00:39 crc kubenswrapper[4792]: I0309 10:00:39.838851 4792 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Mar 09 10:00:39 crc kubenswrapper[4792]: I0309 10:00:39.838936 4792 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aa512688-fab8-4291-9260-52f1812c6b71-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 10:00:39 crc kubenswrapper[4792]: I0309 10:00:39.839061 4792 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" " Mar 09 10:00:39 crc kubenswrapper[4792]: I0309 10:00:39.839215 4792 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/187135be-27c8-45e7-8a3c-ecd516507200-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 10:00:39 crc kubenswrapper[4792]: I0309 10:00:39.839306 4792 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/aa512688-fab8-4291-9260-52f1812c6b71-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 09 10:00:39 crc kubenswrapper[4792]: I0309 10:00:39.893520 4792 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc" Mar 09 10:00:39 crc kubenswrapper[4792]: I0309 10:00:39.913557 4792 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Mar 09 10:00:39 crc kubenswrapper[4792]: I0309 10:00:39.926433 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-291a-account-create-update-jcsfx"] Mar 09 10:00:39 crc kubenswrapper[4792]: I0309 10:00:39.942076 4792 reconciler_common.go:293] "Volume detached for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\"" Mar 09 10:00:39 crc kubenswrapper[4792]: I0309 10:00:39.942104 4792 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Mar 09 10:00:39 crc kubenswrapper[4792]: I0309 10:00:39.954758 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-create-57k9s"] Mar 09 10:00:39 crc kubenswrapper[4792]: W0309 10:00:39.992881 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6d8a53c4_679b_42ff_81a7_264b779e3486.slice/crio-a7f2d01f23736d8da677b6ea38067d44dc317ef8fef08979de7d9c3a8d903bc6 WatchSource:0}: Error finding 
container a7f2d01f23736d8da677b6ea38067d44dc317ef8fef08979de7d9c3a8d903bc6: Status 404 returned error can't find the container with id a7f2d01f23736d8da677b6ea38067d44dc317ef8fef08979de7d9c3a8d903bc6 Mar 09 10:00:40 crc kubenswrapper[4792]: E0309 10:00:40.103725 4792 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod187135be_27c8_45e7_8a3c_ecd516507200.slice\": RecentStats: unable to find data in memory cache]" Mar 09 10:00:40 crc kubenswrapper[4792]: I0309 10:00:40.442474 4792 scope.go:117] "RemoveContainer" containerID="99ce20c6de306eeb3ba18702bf7780f93634cfb30b3a7af923b5c1e36f1cc492" Mar 09 10:00:40 crc kubenswrapper[4792]: I0309 10:00:40.467843 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-796cbc75d5-fwljj" event={"ID":"2c175458-c03c-4901-a2ec-6250b13a461b","Type":"ContainerStarted","Data":"3ef1a7af4eccbbad21e71fd0d5613264137a65de7258212f846d7dea9f5a727c"} Mar 09 10:00:40 crc kubenswrapper[4792]: I0309 10:00:40.469240 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-57k9s" event={"ID":"3823d2bd-f551-4107-a220-6e97655832e1","Type":"ContainerStarted","Data":"8c21817c12bb5b57cba154bf88e103c44b7bfd9171a8f0bdf4d4bd37aedc6a69"} Mar 09 10:00:40 crc kubenswrapper[4792]: I0309 10:00:40.479461 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6f9cb7fbd9-gcq9k" event={"ID":"8646be85-08e4-4f0d-9484-01456e453c36","Type":"ContainerStarted","Data":"6479a41ee22599d34c84e45233ec2770b8e4a73be9b7a0d768470c49371cc442"} Mar 09 10:00:40 crc kubenswrapper[4792]: I0309 10:00:40.487361 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-291a-account-create-update-jcsfx" event={"ID":"6d8a53c4-679b-42ff-81a7-264b779e3486","Type":"ContainerStarted","Data":"a7f2d01f23736d8da677b6ea38067d44dc317ef8fef08979de7d9c3a8d903bc6"} Mar 09 10:00:40 crc 
kubenswrapper[4792]: I0309 10:00:40.487419 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 09 10:00:40 crc kubenswrapper[4792]: I0309 10:00:40.487437 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 09 10:00:40 crc kubenswrapper[4792]: I0309 10:00:40.501164 4792 scope.go:117] "RemoveContainer" containerID="062349ef731f6f884f801e39e8e2a755629822568a0b2473e1c61a28b2fdc96e" Mar 09 10:00:40 crc kubenswrapper[4792]: I0309 10:00:40.628476 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 09 10:00:40 crc kubenswrapper[4792]: I0309 10:00:40.668989 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 09 10:00:40 crc kubenswrapper[4792]: I0309 10:00:40.727175 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 09 10:00:40 crc kubenswrapper[4792]: I0309 10:00:40.729174 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 09 10:00:40 crc kubenswrapper[4792]: I0309 10:00:40.745097 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-t8kmn" Mar 09 10:00:40 crc kubenswrapper[4792]: I0309 10:00:40.745409 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Mar 09 10:00:40 crc kubenswrapper[4792]: I0309 10:00:40.746059 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 09 10:00:40 crc kubenswrapper[4792]: I0309 10:00:40.746281 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 09 10:00:40 crc kubenswrapper[4792]: I0309 10:00:40.817108 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 09 10:00:40 crc kubenswrapper[4792]: I0309 10:00:40.867743 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 09 10:00:40 crc kubenswrapper[4792]: I0309 10:00:40.882534 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7a06a0b7-4efd-480b-9530-99425153e278-scripts\") pod \"glance-default-internal-api-0\" (UID: \"7a06a0b7-4efd-480b-9530-99425153e278\") " pod="openstack/glance-default-internal-api-0" Mar 09 10:00:40 crc kubenswrapper[4792]: I0309 10:00:40.882574 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a06a0b7-4efd-480b-9530-99425153e278-config-data\") pod \"glance-default-internal-api-0\" (UID: \"7a06a0b7-4efd-480b-9530-99425153e278\") " pod="openstack/glance-default-internal-api-0" Mar 09 10:00:40 crc kubenswrapper[4792]: I0309 10:00:40.882598 4792 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"7a06a0b7-4efd-480b-9530-99425153e278\") " pod="openstack/glance-default-internal-api-0" Mar 09 10:00:40 crc kubenswrapper[4792]: I0309 10:00:40.882630 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7a06a0b7-4efd-480b-9530-99425153e278-logs\") pod \"glance-default-internal-api-0\" (UID: \"7a06a0b7-4efd-480b-9530-99425153e278\") " pod="openstack/glance-default-internal-api-0" Mar 09 10:00:40 crc kubenswrapper[4792]: I0309 10:00:40.882673 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7a06a0b7-4efd-480b-9530-99425153e278-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"7a06a0b7-4efd-480b-9530-99425153e278\") " pod="openstack/glance-default-internal-api-0" Mar 09 10:00:40 crc kubenswrapper[4792]: I0309 10:00:40.882736 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a06a0b7-4efd-480b-9530-99425153e278-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"7a06a0b7-4efd-480b-9530-99425153e278\") " pod="openstack/glance-default-internal-api-0" Mar 09 10:00:40 crc kubenswrapper[4792]: I0309 10:00:40.882772 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7a06a0b7-4efd-480b-9530-99425153e278-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"7a06a0b7-4efd-480b-9530-99425153e278\") " pod="openstack/glance-default-internal-api-0" Mar 09 10:00:40 crc kubenswrapper[4792]: I0309 10:00:40.882788 4792 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9hdbr\" (UniqueName: \"kubernetes.io/projected/7a06a0b7-4efd-480b-9530-99425153e278-kube-api-access-9hdbr\") pod \"glance-default-internal-api-0\" (UID: \"7a06a0b7-4efd-480b-9530-99425153e278\") " pod="openstack/glance-default-internal-api-0" Mar 09 10:00:40 crc kubenswrapper[4792]: I0309 10:00:40.882804 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/7a06a0b7-4efd-480b-9530-99425153e278-ceph\") pod \"glance-default-internal-api-0\" (UID: \"7a06a0b7-4efd-480b-9530-99425153e278\") " pod="openstack/glance-default-internal-api-0" Mar 09 10:00:40 crc kubenswrapper[4792]: I0309 10:00:40.934103 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 09 10:00:40 crc kubenswrapper[4792]: I0309 10:00:40.944412 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 09 10:00:40 crc kubenswrapper[4792]: I0309 10:00:40.946277 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 09 10:00:40 crc kubenswrapper[4792]: I0309 10:00:40.951694 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 09 10:00:40 crc kubenswrapper[4792]: I0309 10:00:40.951914 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 09 10:00:40 crc kubenswrapper[4792]: I0309 10:00:40.953598 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 09 10:00:41 crc kubenswrapper[4792]: I0309 10:00:40.991686 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a06a0b7-4efd-480b-9530-99425153e278-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"7a06a0b7-4efd-480b-9530-99425153e278\") " pod="openstack/glance-default-internal-api-0" Mar 09 10:00:41 crc kubenswrapper[4792]: I0309 10:00:40.991773 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7a06a0b7-4efd-480b-9530-99425153e278-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"7a06a0b7-4efd-480b-9530-99425153e278\") " pod="openstack/glance-default-internal-api-0" Mar 09 10:00:41 crc kubenswrapper[4792]: I0309 10:00:40.991801 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9hdbr\" (UniqueName: \"kubernetes.io/projected/7a06a0b7-4efd-480b-9530-99425153e278-kube-api-access-9hdbr\") pod \"glance-default-internal-api-0\" (UID: \"7a06a0b7-4efd-480b-9530-99425153e278\") " pod="openstack/glance-default-internal-api-0" Mar 09 10:00:41 crc kubenswrapper[4792]: I0309 10:00:40.991829 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: 
\"kubernetes.io/projected/7a06a0b7-4efd-480b-9530-99425153e278-ceph\") pod \"glance-default-internal-api-0\" (UID: \"7a06a0b7-4efd-480b-9530-99425153e278\") " pod="openstack/glance-default-internal-api-0" Mar 09 10:00:41 crc kubenswrapper[4792]: I0309 10:00:40.991893 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7a06a0b7-4efd-480b-9530-99425153e278-scripts\") pod \"glance-default-internal-api-0\" (UID: \"7a06a0b7-4efd-480b-9530-99425153e278\") " pod="openstack/glance-default-internal-api-0" Mar 09 10:00:41 crc kubenswrapper[4792]: I0309 10:00:40.991922 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a06a0b7-4efd-480b-9530-99425153e278-config-data\") pod \"glance-default-internal-api-0\" (UID: \"7a06a0b7-4efd-480b-9530-99425153e278\") " pod="openstack/glance-default-internal-api-0" Mar 09 10:00:41 crc kubenswrapper[4792]: I0309 10:00:40.991956 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"7a06a0b7-4efd-480b-9530-99425153e278\") " pod="openstack/glance-default-internal-api-0" Mar 09 10:00:41 crc kubenswrapper[4792]: I0309 10:00:40.992003 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7a06a0b7-4efd-480b-9530-99425153e278-logs\") pod \"glance-default-internal-api-0\" (UID: \"7a06a0b7-4efd-480b-9530-99425153e278\") " pod="openstack/glance-default-internal-api-0" Mar 09 10:00:41 crc kubenswrapper[4792]: I0309 10:00:40.992515 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7a06a0b7-4efd-480b-9530-99425153e278-internal-tls-certs\") pod \"glance-default-internal-api-0\" 
(UID: \"7a06a0b7-4efd-480b-9530-99425153e278\") " pod="openstack/glance-default-internal-api-0" Mar 09 10:00:41 crc kubenswrapper[4792]: I0309 10:00:40.997047 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7a06a0b7-4efd-480b-9530-99425153e278-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"7a06a0b7-4efd-480b-9530-99425153e278\") " pod="openstack/glance-default-internal-api-0" Mar 09 10:00:41 crc kubenswrapper[4792]: I0309 10:00:40.998919 4792 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"7a06a0b7-4efd-480b-9530-99425153e278\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/glance-default-internal-api-0" Mar 09 10:00:41 crc kubenswrapper[4792]: I0309 10:00:41.002860 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7a06a0b7-4efd-480b-9530-99425153e278-logs\") pod \"glance-default-internal-api-0\" (UID: \"7a06a0b7-4efd-480b-9530-99425153e278\") " pod="openstack/glance-default-internal-api-0" Mar 09 10:00:41 crc kubenswrapper[4792]: I0309 10:00:41.006313 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/7a06a0b7-4efd-480b-9530-99425153e278-ceph\") pod \"glance-default-internal-api-0\" (UID: \"7a06a0b7-4efd-480b-9530-99425153e278\") " pod="openstack/glance-default-internal-api-0" Mar 09 10:00:41 crc kubenswrapper[4792]: I0309 10:00:41.023341 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a06a0b7-4efd-480b-9530-99425153e278-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"7a06a0b7-4efd-480b-9530-99425153e278\") " pod="openstack/glance-default-internal-api-0" Mar 09 
10:00:41 crc kubenswrapper[4792]: I0309 10:00:41.034525 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9hdbr\" (UniqueName: \"kubernetes.io/projected/7a06a0b7-4efd-480b-9530-99425153e278-kube-api-access-9hdbr\") pod \"glance-default-internal-api-0\" (UID: \"7a06a0b7-4efd-480b-9530-99425153e278\") " pod="openstack/glance-default-internal-api-0" Mar 09 10:00:41 crc kubenswrapper[4792]: I0309 10:00:41.054970 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7a06a0b7-4efd-480b-9530-99425153e278-scripts\") pod \"glance-default-internal-api-0\" (UID: \"7a06a0b7-4efd-480b-9530-99425153e278\") " pod="openstack/glance-default-internal-api-0" Mar 09 10:00:41 crc kubenswrapper[4792]: I0309 10:00:41.072233 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a06a0b7-4efd-480b-9530-99425153e278-config-data\") pod \"glance-default-internal-api-0\" (UID: \"7a06a0b7-4efd-480b-9530-99425153e278\") " pod="openstack/glance-default-internal-api-0" Mar 09 10:00:41 crc kubenswrapper[4792]: I0309 10:00:41.098060 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7a06a0b7-4efd-480b-9530-99425153e278-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"7a06a0b7-4efd-480b-9530-99425153e278\") " pod="openstack/glance-default-internal-api-0" Mar 09 10:00:41 crc kubenswrapper[4792]: I0309 10:00:41.198981 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1477887c-abbe-4df3-ac30-71aee75dce39-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"1477887c-abbe-4df3-ac30-71aee75dce39\") " pod="openstack/glance-default-external-api-0" Mar 09 10:00:41 crc kubenswrapper[4792]: I0309 10:00:41.199589 4792 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/1477887c-abbe-4df3-ac30-71aee75dce39-ceph\") pod \"glance-default-external-api-0\" (UID: \"1477887c-abbe-4df3-ac30-71aee75dce39\") " pod="openstack/glance-default-external-api-0" Mar 09 10:00:41 crc kubenswrapper[4792]: I0309 10:00:41.199724 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1477887c-abbe-4df3-ac30-71aee75dce39-scripts\") pod \"glance-default-external-api-0\" (UID: \"1477887c-abbe-4df3-ac30-71aee75dce39\") " pod="openstack/glance-default-external-api-0" Mar 09 10:00:41 crc kubenswrapper[4792]: I0309 10:00:41.200003 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1477887c-abbe-4df3-ac30-71aee75dce39-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"1477887c-abbe-4df3-ac30-71aee75dce39\") " pod="openstack/glance-default-external-api-0" Mar 09 10:00:41 crc kubenswrapper[4792]: I0309 10:00:41.200167 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1477887c-abbe-4df3-ac30-71aee75dce39-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"1477887c-abbe-4df3-ac30-71aee75dce39\") " pod="openstack/glance-default-external-api-0" Mar 09 10:00:41 crc kubenswrapper[4792]: I0309 10:00:41.200635 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"1477887c-abbe-4df3-ac30-71aee75dce39\") " pod="openstack/glance-default-external-api-0" Mar 09 10:00:41 crc kubenswrapper[4792]: I0309 10:00:41.200794 4792 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j69hm\" (UniqueName: \"kubernetes.io/projected/1477887c-abbe-4df3-ac30-71aee75dce39-kube-api-access-j69hm\") pod \"glance-default-external-api-0\" (UID: \"1477887c-abbe-4df3-ac30-71aee75dce39\") " pod="openstack/glance-default-external-api-0" Mar 09 10:00:41 crc kubenswrapper[4792]: I0309 10:00:41.200947 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1477887c-abbe-4df3-ac30-71aee75dce39-logs\") pod \"glance-default-external-api-0\" (UID: \"1477887c-abbe-4df3-ac30-71aee75dce39\") " pod="openstack/glance-default-external-api-0" Mar 09 10:00:41 crc kubenswrapper[4792]: I0309 10:00:41.201192 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1477887c-abbe-4df3-ac30-71aee75dce39-config-data\") pod \"glance-default-external-api-0\" (UID: \"1477887c-abbe-4df3-ac30-71aee75dce39\") " pod="openstack/glance-default-external-api-0" Mar 09 10:00:41 crc kubenswrapper[4792]: I0309 10:00:41.255983 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"7a06a0b7-4efd-480b-9530-99425153e278\") " pod="openstack/glance-default-internal-api-0" Mar 09 10:00:41 crc kubenswrapper[4792]: I0309 10:00:41.330928 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1477887c-abbe-4df3-ac30-71aee75dce39-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"1477887c-abbe-4df3-ac30-71aee75dce39\") " pod="openstack/glance-default-external-api-0" Mar 09 10:00:41 crc kubenswrapper[4792]: I0309 10:00:41.331282 4792 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/1477887c-abbe-4df3-ac30-71aee75dce39-ceph\") pod \"glance-default-external-api-0\" (UID: \"1477887c-abbe-4df3-ac30-71aee75dce39\") " pod="openstack/glance-default-external-api-0" Mar 09 10:00:41 crc kubenswrapper[4792]: I0309 10:00:41.341699 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1477887c-abbe-4df3-ac30-71aee75dce39-scripts\") pod \"glance-default-external-api-0\" (UID: \"1477887c-abbe-4df3-ac30-71aee75dce39\") " pod="openstack/glance-default-external-api-0" Mar 09 10:00:41 crc kubenswrapper[4792]: I0309 10:00:41.341787 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1477887c-abbe-4df3-ac30-71aee75dce39-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"1477887c-abbe-4df3-ac30-71aee75dce39\") " pod="openstack/glance-default-external-api-0" Mar 09 10:00:41 crc kubenswrapper[4792]: I0309 10:00:41.341838 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1477887c-abbe-4df3-ac30-71aee75dce39-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"1477887c-abbe-4df3-ac30-71aee75dce39\") " pod="openstack/glance-default-external-api-0" Mar 09 10:00:41 crc kubenswrapper[4792]: I0309 10:00:41.341941 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"1477887c-abbe-4df3-ac30-71aee75dce39\") " pod="openstack/glance-default-external-api-0" Mar 09 10:00:41 crc kubenswrapper[4792]: I0309 10:00:41.341987 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j69hm\" (UniqueName: 
\"kubernetes.io/projected/1477887c-abbe-4df3-ac30-71aee75dce39-kube-api-access-j69hm\") pod \"glance-default-external-api-0\" (UID: \"1477887c-abbe-4df3-ac30-71aee75dce39\") " pod="openstack/glance-default-external-api-0" Mar 09 10:00:41 crc kubenswrapper[4792]: I0309 10:00:41.342005 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1477887c-abbe-4df3-ac30-71aee75dce39-logs\") pod \"glance-default-external-api-0\" (UID: \"1477887c-abbe-4df3-ac30-71aee75dce39\") " pod="openstack/glance-default-external-api-0" Mar 09 10:00:41 crc kubenswrapper[4792]: I0309 10:00:41.342081 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1477887c-abbe-4df3-ac30-71aee75dce39-config-data\") pod \"glance-default-external-api-0\" (UID: \"1477887c-abbe-4df3-ac30-71aee75dce39\") " pod="openstack/glance-default-external-api-0" Mar 09 10:00:41 crc kubenswrapper[4792]: I0309 10:00:41.344279 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1477887c-abbe-4df3-ac30-71aee75dce39-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"1477887c-abbe-4df3-ac30-71aee75dce39\") " pod="openstack/glance-default-external-api-0" Mar 09 10:00:41 crc kubenswrapper[4792]: I0309 10:00:41.344926 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1477887c-abbe-4df3-ac30-71aee75dce39-logs\") pod \"glance-default-external-api-0\" (UID: \"1477887c-abbe-4df3-ac30-71aee75dce39\") " pod="openstack/glance-default-external-api-0" Mar 09 10:00:41 crc kubenswrapper[4792]: I0309 10:00:41.345881 4792 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: 
\"1477887c-abbe-4df3-ac30-71aee75dce39\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/glance-default-external-api-0" Mar 09 10:00:41 crc kubenswrapper[4792]: I0309 10:00:41.355292 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/1477887c-abbe-4df3-ac30-71aee75dce39-ceph\") pod \"glance-default-external-api-0\" (UID: \"1477887c-abbe-4df3-ac30-71aee75dce39\") " pod="openstack/glance-default-external-api-0" Mar 09 10:00:41 crc kubenswrapper[4792]: I0309 10:00:41.359858 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1477887c-abbe-4df3-ac30-71aee75dce39-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"1477887c-abbe-4df3-ac30-71aee75dce39\") " pod="openstack/glance-default-external-api-0" Mar 09 10:00:41 crc kubenswrapper[4792]: I0309 10:00:41.391705 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1477887c-abbe-4df3-ac30-71aee75dce39-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"1477887c-abbe-4df3-ac30-71aee75dce39\") " pod="openstack/glance-default-external-api-0" Mar 09 10:00:41 crc kubenswrapper[4792]: I0309 10:00:41.395213 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1477887c-abbe-4df3-ac30-71aee75dce39-scripts\") pod \"glance-default-external-api-0\" (UID: \"1477887c-abbe-4df3-ac30-71aee75dce39\") " pod="openstack/glance-default-external-api-0" Mar 09 10:00:41 crc kubenswrapper[4792]: I0309 10:00:41.399022 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1477887c-abbe-4df3-ac30-71aee75dce39-config-data\") pod \"glance-default-external-api-0\" (UID: \"1477887c-abbe-4df3-ac30-71aee75dce39\") " 
pod="openstack/glance-default-external-api-0" Mar 09 10:00:41 crc kubenswrapper[4792]: I0309 10:00:41.418564 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j69hm\" (UniqueName: \"kubernetes.io/projected/1477887c-abbe-4df3-ac30-71aee75dce39-kube-api-access-j69hm\") pod \"glance-default-external-api-0\" (UID: \"1477887c-abbe-4df3-ac30-71aee75dce39\") " pod="openstack/glance-default-external-api-0" Mar 09 10:00:41 crc kubenswrapper[4792]: I0309 10:00:41.464997 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"1477887c-abbe-4df3-ac30-71aee75dce39\") " pod="openstack/glance-default-external-api-0" Mar 09 10:00:41 crc kubenswrapper[4792]: I0309 10:00:41.503630 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"2eaee6b3-8397-430c-b799-7628762d1701","Type":"ContainerStarted","Data":"ee8d87a468b24d7976cd9042c30fb553c5ad32e60da47cf9e66326fa6ede9803"} Mar 09 10:00:41 crc kubenswrapper[4792]: I0309 10:00:41.503938 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"2eaee6b3-8397-430c-b799-7628762d1701","Type":"ContainerStarted","Data":"ddd9b185b12f6366a988281824b7a20f40cb62ee11c04ccfc1e88a8778f85521"} Mar 09 10:00:41 crc kubenswrapper[4792]: I0309 10:00:41.508807 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-291a-account-create-update-jcsfx" event={"ID":"6d8a53c4-679b-42ff-81a7-264b779e3486","Type":"ContainerStarted","Data":"fcca30b03fa1385267a0cc8bc61bec74dd50819cddeafafab2c3e05fbf997918"} Mar 09 10:00:41 crc kubenswrapper[4792]: I0309 10:00:41.514924 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 09 10:00:41 crc kubenswrapper[4792]: I0309 10:00:41.516533 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"61d989fe-045d-4c58-b660-f9d0e1a482f9","Type":"ContainerStarted","Data":"e05a1486e6167afa3e88173c1ae265c05849f3005fd2d9b6ceb5ba5b105e1af9"} Mar 09 10:00:41 crc kubenswrapper[4792]: I0309 10:00:41.521061 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-57k9s" event={"ID":"3823d2bd-f551-4107-a220-6e97655832e1","Type":"ContainerStarted","Data":"bbb5c1a656224ced1e1465fcf7fb0e58297f94be6c67b8e9c4f1a76ff4a25bbd"} Mar 09 10:00:41 crc kubenswrapper[4792]: I0309 10:00:41.532737 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-backup-0" podStartSLOduration=3.494138432 podStartE2EDuration="4.532710356s" podCreationTimestamp="2026-03-09 10:00:37 +0000 UTC" firstStartedPulling="2026-03-09 10:00:39.191336797 +0000 UTC m=+3204.221537549" lastFinishedPulling="2026-03-09 10:00:40.229908721 +0000 UTC m=+3205.260109473" observedRunningTime="2026-03-09 10:00:41.525702208 +0000 UTC m=+3206.555902970" watchObservedRunningTime="2026-03-09 10:00:41.532710356 +0000 UTC m=+3206.562911118" Mar 09 10:00:41 crc kubenswrapper[4792]: I0309 10:00:41.575892 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-db-create-57k9s" podStartSLOduration=3.575872673 podStartE2EDuration="3.575872673s" podCreationTimestamp="2026-03-09 10:00:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 10:00:41.572528149 +0000 UTC m=+3206.602728901" watchObservedRunningTime="2026-03-09 10:00:41.575872673 +0000 UTC m=+3206.606073425" Mar 09 10:00:41 crc kubenswrapper[4792]: I0309 10:00:41.583362 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 09 10:00:41 crc kubenswrapper[4792]: I0309 10:00:41.681256 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="187135be-27c8-45e7-8a3c-ecd516507200" path="/var/lib/kubelet/pods/187135be-27c8-45e7-8a3c-ecd516507200/volumes" Mar 09 10:00:41 crc kubenswrapper[4792]: I0309 10:00:41.681931 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aa512688-fab8-4291-9260-52f1812c6b71" path="/var/lib/kubelet/pods/aa512688-fab8-4291-9260-52f1812c6b71/volumes" Mar 09 10:00:42 crc kubenswrapper[4792]: I0309 10:00:42.190570 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-291a-account-create-update-jcsfx" podStartSLOduration=4.190547866 podStartE2EDuration="4.190547866s" podCreationTimestamp="2026-03-09 10:00:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 10:00:41.60091755 +0000 UTC m=+3206.631118302" watchObservedRunningTime="2026-03-09 10:00:42.190547866 +0000 UTC m=+3207.220748618" Mar 09 10:00:42 crc kubenswrapper[4792]: I0309 10:00:42.205373 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6f9cb7fbd9-gcq9k"] Mar 09 10:00:42 crc kubenswrapper[4792]: I0309 10:00:42.278631 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-85fd9d548-2q98q"] Mar 09 10:00:42 crc kubenswrapper[4792]: I0309 10:00:42.280364 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-85fd9d548-2q98q" Mar 09 10:00:42 crc kubenswrapper[4792]: I0309 10:00:42.291599 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-horizon-svc" Mar 09 10:00:42 crc kubenswrapper[4792]: I0309 10:00:42.321559 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 09 10:00:42 crc kubenswrapper[4792]: I0309 10:00:42.371442 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-85fd9d548-2q98q"] Mar 09 10:00:42 crc kubenswrapper[4792]: I0309 10:00:42.387189 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/24426baa-19e0-4ac0-87b6-0f5a824de578-horizon-secret-key\") pod \"horizon-85fd9d548-2q98q\" (UID: \"24426baa-19e0-4ac0-87b6-0f5a824de578\") " pod="openstack/horizon-85fd9d548-2q98q" Mar 09 10:00:42 crc kubenswrapper[4792]: I0309 10:00:42.387399 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/24426baa-19e0-4ac0-87b6-0f5a824de578-config-data\") pod \"horizon-85fd9d548-2q98q\" (UID: \"24426baa-19e0-4ac0-87b6-0f5a824de578\") " pod="openstack/horizon-85fd9d548-2q98q" Mar 09 10:00:42 crc kubenswrapper[4792]: I0309 10:00:42.387503 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9kblf\" (UniqueName: \"kubernetes.io/projected/24426baa-19e0-4ac0-87b6-0f5a824de578-kube-api-access-9kblf\") pod \"horizon-85fd9d548-2q98q\" (UID: \"24426baa-19e0-4ac0-87b6-0f5a824de578\") " pod="openstack/horizon-85fd9d548-2q98q" Mar 09 10:00:42 crc kubenswrapper[4792]: I0309 10:00:42.387576 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/24426baa-19e0-4ac0-87b6-0f5a824de578-combined-ca-bundle\") pod \"horizon-85fd9d548-2q98q\" (UID: \"24426baa-19e0-4ac0-87b6-0f5a824de578\") " pod="openstack/horizon-85fd9d548-2q98q" Mar 09 10:00:42 crc kubenswrapper[4792]: I0309 10:00:42.387624 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/24426baa-19e0-4ac0-87b6-0f5a824de578-horizon-tls-certs\") pod \"horizon-85fd9d548-2q98q\" (UID: \"24426baa-19e0-4ac0-87b6-0f5a824de578\") " pod="openstack/horizon-85fd9d548-2q98q" Mar 09 10:00:42 crc kubenswrapper[4792]: I0309 10:00:42.387648 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/24426baa-19e0-4ac0-87b6-0f5a824de578-logs\") pod \"horizon-85fd9d548-2q98q\" (UID: \"24426baa-19e0-4ac0-87b6-0f5a824de578\") " pod="openstack/horizon-85fd9d548-2q98q" Mar 09 10:00:42 crc kubenswrapper[4792]: I0309 10:00:42.387758 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/24426baa-19e0-4ac0-87b6-0f5a824de578-scripts\") pod \"horizon-85fd9d548-2q98q\" (UID: \"24426baa-19e0-4ac0-87b6-0f5a824de578\") " pod="openstack/horizon-85fd9d548-2q98q" Mar 09 10:00:42 crc kubenswrapper[4792]: I0309 10:00:42.439158 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-796cbc75d5-fwljj"] Mar 09 10:00:42 crc kubenswrapper[4792]: I0309 10:00:42.459453 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-54c85f748d-wxdlf"] Mar 09 10:00:42 crc kubenswrapper[4792]: I0309 10:00:42.465098 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-54c85f748d-wxdlf" Mar 09 10:00:42 crc kubenswrapper[4792]: I0309 10:00:42.481713 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-54c85f748d-wxdlf"] Mar 09 10:00:42 crc kubenswrapper[4792]: I0309 10:00:42.492534 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/d028a70e-dd9d-4b38-bb18-4cd55cd002fe-horizon-tls-certs\") pod \"horizon-54c85f748d-wxdlf\" (UID: \"d028a70e-dd9d-4b38-bb18-4cd55cd002fe\") " pod="openstack/horizon-54c85f748d-wxdlf" Mar 09 10:00:42 crc kubenswrapper[4792]: I0309 10:00:42.492769 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/24426baa-19e0-4ac0-87b6-0f5a824de578-config-data\") pod \"horizon-85fd9d548-2q98q\" (UID: \"24426baa-19e0-4ac0-87b6-0f5a824de578\") " pod="openstack/horizon-85fd9d548-2q98q" Mar 09 10:00:42 crc kubenswrapper[4792]: I0309 10:00:42.492870 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9kblf\" (UniqueName: \"kubernetes.io/projected/24426baa-19e0-4ac0-87b6-0f5a824de578-kube-api-access-9kblf\") pod \"horizon-85fd9d548-2q98q\" (UID: \"24426baa-19e0-4ac0-87b6-0f5a824de578\") " pod="openstack/horizon-85fd9d548-2q98q" Mar 09 10:00:42 crc kubenswrapper[4792]: I0309 10:00:42.492989 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d028a70e-dd9d-4b38-bb18-4cd55cd002fe-scripts\") pod \"horizon-54c85f748d-wxdlf\" (UID: \"d028a70e-dd9d-4b38-bb18-4cd55cd002fe\") " pod="openstack/horizon-54c85f748d-wxdlf" Mar 09 10:00:42 crc kubenswrapper[4792]: I0309 10:00:42.493057 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/24426baa-19e0-4ac0-87b6-0f5a824de578-combined-ca-bundle\") pod \"horizon-85fd9d548-2q98q\" (UID: \"24426baa-19e0-4ac0-87b6-0f5a824de578\") " pod="openstack/horizon-85fd9d548-2q98q" Mar 09 10:00:42 crc kubenswrapper[4792]: I0309 10:00:42.493150 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dwqpm\" (UniqueName: \"kubernetes.io/projected/d028a70e-dd9d-4b38-bb18-4cd55cd002fe-kube-api-access-dwqpm\") pod \"horizon-54c85f748d-wxdlf\" (UID: \"d028a70e-dd9d-4b38-bb18-4cd55cd002fe\") " pod="openstack/horizon-54c85f748d-wxdlf" Mar 09 10:00:42 crc kubenswrapper[4792]: I0309 10:00:42.493233 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/24426baa-19e0-4ac0-87b6-0f5a824de578-horizon-tls-certs\") pod \"horizon-85fd9d548-2q98q\" (UID: \"24426baa-19e0-4ac0-87b6-0f5a824de578\") " pod="openstack/horizon-85fd9d548-2q98q" Mar 09 10:00:42 crc kubenswrapper[4792]: I0309 10:00:42.493301 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d028a70e-dd9d-4b38-bb18-4cd55cd002fe-horizon-secret-key\") pod \"horizon-54c85f748d-wxdlf\" (UID: \"d028a70e-dd9d-4b38-bb18-4cd55cd002fe\") " pod="openstack/horizon-54c85f748d-wxdlf" Mar 09 10:00:42 crc kubenswrapper[4792]: I0309 10:00:42.493414 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/24426baa-19e0-4ac0-87b6-0f5a824de578-logs\") pod \"horizon-85fd9d548-2q98q\" (UID: \"24426baa-19e0-4ac0-87b6-0f5a824de578\") " pod="openstack/horizon-85fd9d548-2q98q" Mar 09 10:00:42 crc kubenswrapper[4792]: I0309 10:00:42.493500 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/d028a70e-dd9d-4b38-bb18-4cd55cd002fe-logs\") pod \"horizon-54c85f748d-wxdlf\" (UID: \"d028a70e-dd9d-4b38-bb18-4cd55cd002fe\") " pod="openstack/horizon-54c85f748d-wxdlf" Mar 09 10:00:42 crc kubenswrapper[4792]: I0309 10:00:42.493600 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/24426baa-19e0-4ac0-87b6-0f5a824de578-scripts\") pod \"horizon-85fd9d548-2q98q\" (UID: \"24426baa-19e0-4ac0-87b6-0f5a824de578\") " pod="openstack/horizon-85fd9d548-2q98q" Mar 09 10:00:42 crc kubenswrapper[4792]: I0309 10:00:42.493698 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d028a70e-dd9d-4b38-bb18-4cd55cd002fe-combined-ca-bundle\") pod \"horizon-54c85f748d-wxdlf\" (UID: \"d028a70e-dd9d-4b38-bb18-4cd55cd002fe\") " pod="openstack/horizon-54c85f748d-wxdlf" Mar 09 10:00:42 crc kubenswrapper[4792]: I0309 10:00:42.493788 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d028a70e-dd9d-4b38-bb18-4cd55cd002fe-config-data\") pod \"horizon-54c85f748d-wxdlf\" (UID: \"d028a70e-dd9d-4b38-bb18-4cd55cd002fe\") " pod="openstack/horizon-54c85f748d-wxdlf" Mar 09 10:00:42 crc kubenswrapper[4792]: I0309 10:00:42.493872 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/24426baa-19e0-4ac0-87b6-0f5a824de578-horizon-secret-key\") pod \"horizon-85fd9d548-2q98q\" (UID: \"24426baa-19e0-4ac0-87b6-0f5a824de578\") " pod="openstack/horizon-85fd9d548-2q98q" Mar 09 10:00:42 crc kubenswrapper[4792]: I0309 10:00:42.494895 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/24426baa-19e0-4ac0-87b6-0f5a824de578-config-data\") pod 
\"horizon-85fd9d548-2q98q\" (UID: \"24426baa-19e0-4ac0-87b6-0f5a824de578\") " pod="openstack/horizon-85fd9d548-2q98q" Mar 09 10:00:42 crc kubenswrapper[4792]: I0309 10:00:42.499536 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/24426baa-19e0-4ac0-87b6-0f5a824de578-scripts\") pod \"horizon-85fd9d548-2q98q\" (UID: \"24426baa-19e0-4ac0-87b6-0f5a824de578\") " pod="openstack/horizon-85fd9d548-2q98q" Mar 09 10:00:42 crc kubenswrapper[4792]: I0309 10:00:42.499766 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/24426baa-19e0-4ac0-87b6-0f5a824de578-logs\") pod \"horizon-85fd9d548-2q98q\" (UID: \"24426baa-19e0-4ac0-87b6-0f5a824de578\") " pod="openstack/horizon-85fd9d548-2q98q" Mar 09 10:00:42 crc kubenswrapper[4792]: I0309 10:00:42.512844 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/24426baa-19e0-4ac0-87b6-0f5a824de578-horizon-tls-certs\") pod \"horizon-85fd9d548-2q98q\" (UID: \"24426baa-19e0-4ac0-87b6-0f5a824de578\") " pod="openstack/horizon-85fd9d548-2q98q" Mar 09 10:00:42 crc kubenswrapper[4792]: I0309 10:00:42.513759 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24426baa-19e0-4ac0-87b6-0f5a824de578-combined-ca-bundle\") pod \"horizon-85fd9d548-2q98q\" (UID: \"24426baa-19e0-4ac0-87b6-0f5a824de578\") " pod="openstack/horizon-85fd9d548-2q98q" Mar 09 10:00:42 crc kubenswrapper[4792]: I0309 10:00:42.518694 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/24426baa-19e0-4ac0-87b6-0f5a824de578-horizon-secret-key\") pod \"horizon-85fd9d548-2q98q\" (UID: \"24426baa-19e0-4ac0-87b6-0f5a824de578\") " pod="openstack/horizon-85fd9d548-2q98q" Mar 09 10:00:42 crc kubenswrapper[4792]: I0309 
10:00:42.588057 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9kblf\" (UniqueName: \"kubernetes.io/projected/24426baa-19e0-4ac0-87b6-0f5a824de578-kube-api-access-9kblf\") pod \"horizon-85fd9d548-2q98q\" (UID: \"24426baa-19e0-4ac0-87b6-0f5a824de578\") " pod="openstack/horizon-85fd9d548-2q98q" Mar 09 10:00:42 crc kubenswrapper[4792]: I0309 10:00:42.588997 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 09 10:00:42 crc kubenswrapper[4792]: I0309 10:00:42.595236 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/d028a70e-dd9d-4b38-bb18-4cd55cd002fe-horizon-tls-certs\") pod \"horizon-54c85f748d-wxdlf\" (UID: \"d028a70e-dd9d-4b38-bb18-4cd55cd002fe\") " pod="openstack/horizon-54c85f748d-wxdlf" Mar 09 10:00:42 crc kubenswrapper[4792]: I0309 10:00:42.595325 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d028a70e-dd9d-4b38-bb18-4cd55cd002fe-scripts\") pod \"horizon-54c85f748d-wxdlf\" (UID: \"d028a70e-dd9d-4b38-bb18-4cd55cd002fe\") " pod="openstack/horizon-54c85f748d-wxdlf" Mar 09 10:00:42 crc kubenswrapper[4792]: I0309 10:00:42.595348 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dwqpm\" (UniqueName: \"kubernetes.io/projected/d028a70e-dd9d-4b38-bb18-4cd55cd002fe-kube-api-access-dwqpm\") pod \"horizon-54c85f748d-wxdlf\" (UID: \"d028a70e-dd9d-4b38-bb18-4cd55cd002fe\") " pod="openstack/horizon-54c85f748d-wxdlf" Mar 09 10:00:42 crc kubenswrapper[4792]: I0309 10:00:42.595376 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d028a70e-dd9d-4b38-bb18-4cd55cd002fe-horizon-secret-key\") pod \"horizon-54c85f748d-wxdlf\" (UID: \"d028a70e-dd9d-4b38-bb18-4cd55cd002fe\") " 
pod="openstack/horizon-54c85f748d-wxdlf" Mar 09 10:00:42 crc kubenswrapper[4792]: I0309 10:00:42.595411 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d028a70e-dd9d-4b38-bb18-4cd55cd002fe-logs\") pod \"horizon-54c85f748d-wxdlf\" (UID: \"d028a70e-dd9d-4b38-bb18-4cd55cd002fe\") " pod="openstack/horizon-54c85f748d-wxdlf" Mar 09 10:00:42 crc kubenswrapper[4792]: I0309 10:00:42.595451 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d028a70e-dd9d-4b38-bb18-4cd55cd002fe-combined-ca-bundle\") pod \"horizon-54c85f748d-wxdlf\" (UID: \"d028a70e-dd9d-4b38-bb18-4cd55cd002fe\") " pod="openstack/horizon-54c85f748d-wxdlf" Mar 09 10:00:42 crc kubenswrapper[4792]: I0309 10:00:42.595491 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d028a70e-dd9d-4b38-bb18-4cd55cd002fe-config-data\") pod \"horizon-54c85f748d-wxdlf\" (UID: \"d028a70e-dd9d-4b38-bb18-4cd55cd002fe\") " pod="openstack/horizon-54c85f748d-wxdlf" Mar 09 10:00:42 crc kubenswrapper[4792]: I0309 10:00:42.601539 4792 generic.go:334] "Generic (PLEG): container finished" podID="3823d2bd-f551-4107-a220-6e97655832e1" containerID="bbb5c1a656224ced1e1465fcf7fb0e58297f94be6c67b8e9c4f1a76ff4a25bbd" exitCode=0 Mar 09 10:00:42 crc kubenswrapper[4792]: I0309 10:00:42.601630 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-57k9s" event={"ID":"3823d2bd-f551-4107-a220-6e97655832e1","Type":"ContainerDied","Data":"bbb5c1a656224ced1e1465fcf7fb0e58297f94be6c67b8e9c4f1a76ff4a25bbd"} Mar 09 10:00:42 crc kubenswrapper[4792]: I0309 10:00:42.602016 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d028a70e-dd9d-4b38-bb18-4cd55cd002fe-config-data\") pod 
\"horizon-54c85f748d-wxdlf\" (UID: \"d028a70e-dd9d-4b38-bb18-4cd55cd002fe\") " pod="openstack/horizon-54c85f748d-wxdlf" Mar 09 10:00:42 crc kubenswrapper[4792]: I0309 10:00:42.602411 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d028a70e-dd9d-4b38-bb18-4cd55cd002fe-logs\") pod \"horizon-54c85f748d-wxdlf\" (UID: \"d028a70e-dd9d-4b38-bb18-4cd55cd002fe\") " pod="openstack/horizon-54c85f748d-wxdlf" Mar 09 10:00:42 crc kubenswrapper[4792]: I0309 10:00:42.602656 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/d028a70e-dd9d-4b38-bb18-4cd55cd002fe-horizon-tls-certs\") pod \"horizon-54c85f748d-wxdlf\" (UID: \"d028a70e-dd9d-4b38-bb18-4cd55cd002fe\") " pod="openstack/horizon-54c85f748d-wxdlf" Mar 09 10:00:42 crc kubenswrapper[4792]: I0309 10:00:42.602959 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d028a70e-dd9d-4b38-bb18-4cd55cd002fe-scripts\") pod \"horizon-54c85f748d-wxdlf\" (UID: \"d028a70e-dd9d-4b38-bb18-4cd55cd002fe\") " pod="openstack/horizon-54c85f748d-wxdlf" Mar 09 10:00:42 crc kubenswrapper[4792]: I0309 10:00:42.607929 4792 generic.go:334] "Generic (PLEG): container finished" podID="6d8a53c4-679b-42ff-81a7-264b779e3486" containerID="fcca30b03fa1385267a0cc8bc61bec74dd50819cddeafafab2c3e05fbf997918" exitCode=0 Mar 09 10:00:42 crc kubenswrapper[4792]: I0309 10:00:42.608500 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-291a-account-create-update-jcsfx" event={"ID":"6d8a53c4-679b-42ff-81a7-264b779e3486","Type":"ContainerDied","Data":"fcca30b03fa1385267a0cc8bc61bec74dd50819cddeafafab2c3e05fbf997918"} Mar 09 10:00:42 crc kubenswrapper[4792]: I0309 10:00:42.612134 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: 
\"kubernetes.io/secret/d028a70e-dd9d-4b38-bb18-4cd55cd002fe-horizon-secret-key\") pod \"horizon-54c85f748d-wxdlf\" (UID: \"d028a70e-dd9d-4b38-bb18-4cd55cd002fe\") " pod="openstack/horizon-54c85f748d-wxdlf" Mar 09 10:00:42 crc kubenswrapper[4792]: I0309 10:00:42.615561 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d028a70e-dd9d-4b38-bb18-4cd55cd002fe-combined-ca-bundle\") pod \"horizon-54c85f748d-wxdlf\" (UID: \"d028a70e-dd9d-4b38-bb18-4cd55cd002fe\") " pod="openstack/horizon-54c85f748d-wxdlf" Mar 09 10:00:42 crc kubenswrapper[4792]: I0309 10:00:42.631832 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"61d989fe-045d-4c58-b660-f9d0e1a482f9","Type":"ContainerStarted","Data":"bf7d86422c4efe350d7e253e91831fd78fc66d6327aa44b286a32db6a2754a1c"} Mar 09 10:00:42 crc kubenswrapper[4792]: I0309 10:00:42.652892 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dwqpm\" (UniqueName: \"kubernetes.io/projected/d028a70e-dd9d-4b38-bb18-4cd55cd002fe-kube-api-access-dwqpm\") pod \"horizon-54c85f748d-wxdlf\" (UID: \"d028a70e-dd9d-4b38-bb18-4cd55cd002fe\") " pod="openstack/horizon-54c85f748d-wxdlf" Mar 09 10:00:42 crc kubenswrapper[4792]: I0309 10:00:42.660864 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-85fd9d548-2q98q" Mar 09 10:00:42 crc kubenswrapper[4792]: I0309 10:00:42.726580 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-volume-volume1-0" podStartSLOduration=4.521122211 podStartE2EDuration="5.726557319s" podCreationTimestamp="2026-03-09 10:00:37 +0000 UTC" firstStartedPulling="2026-03-09 10:00:39.039967847 +0000 UTC m=+3204.070168599" lastFinishedPulling="2026-03-09 10:00:40.245402955 +0000 UTC m=+3205.275603707" observedRunningTime="2026-03-09 10:00:42.708377257 +0000 UTC m=+3207.738578009" watchObservedRunningTime="2026-03-09 10:00:42.726557319 +0000 UTC m=+3207.756758071" Mar 09 10:00:42 crc kubenswrapper[4792]: I0309 10:00:42.743415 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 09 10:00:42 crc kubenswrapper[4792]: I0309 10:00:42.825312 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-54c85f748d-wxdlf" Mar 09 10:00:42 crc kubenswrapper[4792]: I0309 10:00:42.876110 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-volume-volume1-0" Mar 09 10:00:43 crc kubenswrapper[4792]: I0309 10:00:43.028432 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-backup-0" Mar 09 10:00:43 crc kubenswrapper[4792]: I0309 10:00:43.324144 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 09 10:00:43 crc kubenswrapper[4792]: I0309 10:00:43.433182 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-85fd9d548-2q98q"] Mar 09 10:00:43 crc kubenswrapper[4792]: W0309 10:00:43.445417 4792 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod24426baa_19e0_4ac0_87b6_0f5a824de578.slice/crio-7ada0d813d0028ce6cdd2d57ef4866a86ced1f5fcb0cdc777fe1e07c9020b9a1 WatchSource:0}: Error finding container 7ada0d813d0028ce6cdd2d57ef4866a86ced1f5fcb0cdc777fe1e07c9020b9a1: Status 404 returned error can't find the container with id 7ada0d813d0028ce6cdd2d57ef4866a86ced1f5fcb0cdc777fe1e07c9020b9a1 Mar 09 10:00:43 crc kubenswrapper[4792]: I0309 10:00:43.656303 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-54c85f748d-wxdlf"] Mar 09 10:00:43 crc kubenswrapper[4792]: I0309 10:00:43.676615 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-85fd9d548-2q98q" event={"ID":"24426baa-19e0-4ac0-87b6-0f5a824de578","Type":"ContainerStarted","Data":"7ada0d813d0028ce6cdd2d57ef4866a86ced1f5fcb0cdc777fe1e07c9020b9a1"} Mar 09 10:00:43 crc kubenswrapper[4792]: I0309 10:00:43.679201 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"7a06a0b7-4efd-480b-9530-99425153e278","Type":"ContainerStarted","Data":"1f439fc34c2bfe833df0b2cf2e80092d46acfa606c659730461704e2cf851faf"} Mar 09 10:00:43 crc kubenswrapper[4792]: I0309 10:00:43.681955 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"1477887c-abbe-4df3-ac30-71aee75dce39","Type":"ContainerStarted","Data":"19eb4afd04a883cae13511718fb22a1271a83669406b48cd862a31887f270b04"} Mar 09 10:00:43 crc kubenswrapper[4792]: W0309 10:00:43.685481 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd028a70e_dd9d_4b38_bb18_4cd55cd002fe.slice/crio-138cd5adb891bca1b8b98275fa37036994c69724144c048683167ffb5876ed4e WatchSource:0}: Error finding container 138cd5adb891bca1b8b98275fa37036994c69724144c048683167ffb5876ed4e: Status 404 returned error can't find the container with id 
138cd5adb891bca1b8b98275fa37036994c69724144c048683167ffb5876ed4e Mar 09 10:00:44 crc kubenswrapper[4792]: I0309 10:00:44.417484 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-create-57k9s" Mar 09 10:00:44 crc kubenswrapper[4792]: I0309 10:00:44.539831 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-291a-account-create-update-jcsfx" Mar 09 10:00:44 crc kubenswrapper[4792]: I0309 10:00:44.604777 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3823d2bd-f551-4107-a220-6e97655832e1-operator-scripts\") pod \"3823d2bd-f551-4107-a220-6e97655832e1\" (UID: \"3823d2bd-f551-4107-a220-6e97655832e1\") " Mar 09 10:00:44 crc kubenswrapper[4792]: I0309 10:00:44.604961 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rx5zn\" (UniqueName: \"kubernetes.io/projected/3823d2bd-f551-4107-a220-6e97655832e1-kube-api-access-rx5zn\") pod \"3823d2bd-f551-4107-a220-6e97655832e1\" (UID: \"3823d2bd-f551-4107-a220-6e97655832e1\") " Mar 09 10:00:44 crc kubenswrapper[4792]: I0309 10:00:44.607375 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3823d2bd-f551-4107-a220-6e97655832e1-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3823d2bd-f551-4107-a220-6e97655832e1" (UID: "3823d2bd-f551-4107-a220-6e97655832e1"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 10:00:44 crc kubenswrapper[4792]: I0309 10:00:44.639531 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3823d2bd-f551-4107-a220-6e97655832e1-kube-api-access-rx5zn" (OuterVolumeSpecName: "kube-api-access-rx5zn") pod "3823d2bd-f551-4107-a220-6e97655832e1" (UID: "3823d2bd-f551-4107-a220-6e97655832e1"). 
InnerVolumeSpecName "kube-api-access-rx5zn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 10:00:44 crc kubenswrapper[4792]: I0309 10:00:44.708646 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b6hkp\" (UniqueName: \"kubernetes.io/projected/6d8a53c4-679b-42ff-81a7-264b779e3486-kube-api-access-b6hkp\") pod \"6d8a53c4-679b-42ff-81a7-264b779e3486\" (UID: \"6d8a53c4-679b-42ff-81a7-264b779e3486\") " Mar 09 10:00:44 crc kubenswrapper[4792]: I0309 10:00:44.709114 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6d8a53c4-679b-42ff-81a7-264b779e3486-operator-scripts\") pod \"6d8a53c4-679b-42ff-81a7-264b779e3486\" (UID: \"6d8a53c4-679b-42ff-81a7-264b779e3486\") " Mar 09 10:00:44 crc kubenswrapper[4792]: I0309 10:00:44.709550 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rx5zn\" (UniqueName: \"kubernetes.io/projected/3823d2bd-f551-4107-a220-6e97655832e1-kube-api-access-rx5zn\") on node \"crc\" DevicePath \"\"" Mar 09 10:00:44 crc kubenswrapper[4792]: I0309 10:00:44.709563 4792 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3823d2bd-f551-4107-a220-6e97655832e1-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 10:00:44 crc kubenswrapper[4792]: I0309 10:00:44.711491 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6d8a53c4-679b-42ff-81a7-264b779e3486-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6d8a53c4-679b-42ff-81a7-264b779e3486" (UID: "6d8a53c4-679b-42ff-81a7-264b779e3486"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 10:00:44 crc kubenswrapper[4792]: I0309 10:00:44.720308 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d8a53c4-679b-42ff-81a7-264b779e3486-kube-api-access-b6hkp" (OuterVolumeSpecName: "kube-api-access-b6hkp") pod "6d8a53c4-679b-42ff-81a7-264b779e3486" (UID: "6d8a53c4-679b-42ff-81a7-264b779e3486"). InnerVolumeSpecName "kube-api-access-b6hkp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 10:00:44 crc kubenswrapper[4792]: I0309 10:00:44.720574 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-54c85f748d-wxdlf" event={"ID":"d028a70e-dd9d-4b38-bb18-4cd55cd002fe","Type":"ContainerStarted","Data":"138cd5adb891bca1b8b98275fa37036994c69724144c048683167ffb5876ed4e"} Mar 09 10:00:44 crc kubenswrapper[4792]: I0309 10:00:44.755567 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"1477887c-abbe-4df3-ac30-71aee75dce39","Type":"ContainerStarted","Data":"09ff8ceaf581582eb1b59711a2e5f47c2dc5074668437dc94d32c83d61187616"} Mar 09 10:00:44 crc kubenswrapper[4792]: I0309 10:00:44.758501 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-57k9s" event={"ID":"3823d2bd-f551-4107-a220-6e97655832e1","Type":"ContainerDied","Data":"8c21817c12bb5b57cba154bf88e103c44b7bfd9171a8f0bdf4d4bd37aedc6a69"} Mar 09 10:00:44 crc kubenswrapper[4792]: I0309 10:00:44.758536 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8c21817c12bb5b57cba154bf88e103c44b7bfd9171a8f0bdf4d4bd37aedc6a69" Mar 09 10:00:44 crc kubenswrapper[4792]: I0309 10:00:44.758600 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-create-57k9s" Mar 09 10:00:44 crc kubenswrapper[4792]: I0309 10:00:44.777521 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-291a-account-create-update-jcsfx" Mar 09 10:00:44 crc kubenswrapper[4792]: I0309 10:00:44.784296 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-291a-account-create-update-jcsfx" event={"ID":"6d8a53c4-679b-42ff-81a7-264b779e3486","Type":"ContainerDied","Data":"a7f2d01f23736d8da677b6ea38067d44dc317ef8fef08979de7d9c3a8d903bc6"} Mar 09 10:00:44 crc kubenswrapper[4792]: I0309 10:00:44.787932 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a7f2d01f23736d8da677b6ea38067d44dc317ef8fef08979de7d9c3a8d903bc6" Mar 09 10:00:44 crc kubenswrapper[4792]: I0309 10:00:44.811185 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b6hkp\" (UniqueName: \"kubernetes.io/projected/6d8a53c4-679b-42ff-81a7-264b779e3486-kube-api-access-b6hkp\") on node \"crc\" DevicePath \"\"" Mar 09 10:00:44 crc kubenswrapper[4792]: I0309 10:00:44.811221 4792 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6d8a53c4-679b-42ff-81a7-264b779e3486-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 10:00:45 crc kubenswrapper[4792]: I0309 10:00:45.813282 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"7a06a0b7-4efd-480b-9530-99425153e278","Type":"ContainerStarted","Data":"e99b74a3cfa2c3ac75885b676aa9295553717023e113e1490a11968305b44022"} Mar 09 10:00:46 crc kubenswrapper[4792]: I0309 10:00:46.662790 4792 scope.go:117] "RemoveContainer" containerID="d764681645ab8670f7435c3d7eeda989bbb6c0f3c40948420b0a6a2fc3dd7e93" Mar 09 10:00:46 crc kubenswrapper[4792]: E0309 10:00:46.663744 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-97tth_openshift-machine-config-operator(bd11045a-d746-4b42-872c-8b8d1dd2d515)\"" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" Mar 09 10:00:46 crc kubenswrapper[4792]: I0309 10:00:46.851866 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"7a06a0b7-4efd-480b-9530-99425153e278","Type":"ContainerStarted","Data":"2e4d6a0f542a04616ea68fef023f06f23c4c5ae8fee25cf0d8ebcbd25a16d9d9"} Mar 09 10:00:46 crc kubenswrapper[4792]: I0309 10:00:46.852005 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="7a06a0b7-4efd-480b-9530-99425153e278" containerName="glance-log" containerID="cri-o://e99b74a3cfa2c3ac75885b676aa9295553717023e113e1490a11968305b44022" gracePeriod=30 Mar 09 10:00:46 crc kubenswrapper[4792]: I0309 10:00:46.852014 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="7a06a0b7-4efd-480b-9530-99425153e278" containerName="glance-httpd" containerID="cri-o://2e4d6a0f542a04616ea68fef023f06f23c4c5ae8fee25cf0d8ebcbd25a16d9d9" gracePeriod=30 Mar 09 10:00:46 crc kubenswrapper[4792]: I0309 10:00:46.854639 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"1477887c-abbe-4df3-ac30-71aee75dce39","Type":"ContainerStarted","Data":"aa1f7a93aaf30279398684d390a45b8d8123d4808f5f14a77f4fa8cecfb18dbb"} Mar 09 10:00:46 crc kubenswrapper[4792]: I0309 10:00:46.854775 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="1477887c-abbe-4df3-ac30-71aee75dce39" containerName="glance-log" containerID="cri-o://09ff8ceaf581582eb1b59711a2e5f47c2dc5074668437dc94d32c83d61187616" gracePeriod=30 Mar 09 10:00:46 crc kubenswrapper[4792]: I0309 10:00:46.854885 4792 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="1477887c-abbe-4df3-ac30-71aee75dce39" containerName="glance-httpd" containerID="cri-o://aa1f7a93aaf30279398684d390a45b8d8123d4808f5f14a77f4fa8cecfb18dbb" gracePeriod=30 Mar 09 10:00:46 crc kubenswrapper[4792]: I0309 10:00:46.918288 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=6.9182670680000005 podStartE2EDuration="6.918267068s" podCreationTimestamp="2026-03-09 10:00:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 10:00:46.881346629 +0000 UTC m=+3211.911547391" watchObservedRunningTime="2026-03-09 10:00:46.918267068 +0000 UTC m=+3211.948467820" Mar 09 10:00:46 crc kubenswrapper[4792]: I0309 10:00:46.936461 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=6.9364412699999995 podStartE2EDuration="6.93644127s" podCreationTimestamp="2026-03-09 10:00:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 10:00:46.919754486 +0000 UTC m=+3211.949955248" watchObservedRunningTime="2026-03-09 10:00:46.93644127 +0000 UTC m=+3211.966642022" Mar 09 10:00:47 crc kubenswrapper[4792]: I0309 10:00:47.804905 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 09 10:00:47 crc kubenswrapper[4792]: I0309 10:00:47.880412 4792 generic.go:334] "Generic (PLEG): container finished" podID="1477887c-abbe-4df3-ac30-71aee75dce39" containerID="aa1f7a93aaf30279398684d390a45b8d8123d4808f5f14a77f4fa8cecfb18dbb" exitCode=0 Mar 09 10:00:47 crc kubenswrapper[4792]: I0309 10:00:47.880465 4792 generic.go:334] "Generic (PLEG): container finished" podID="1477887c-abbe-4df3-ac30-71aee75dce39" containerID="09ff8ceaf581582eb1b59711a2e5f47c2dc5074668437dc94d32c83d61187616" exitCode=143 Mar 09 10:00:47 crc kubenswrapper[4792]: I0309 10:00:47.880529 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"1477887c-abbe-4df3-ac30-71aee75dce39","Type":"ContainerDied","Data":"aa1f7a93aaf30279398684d390a45b8d8123d4808f5f14a77f4fa8cecfb18dbb"} Mar 09 10:00:47 crc kubenswrapper[4792]: I0309 10:00:47.880561 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"1477887c-abbe-4df3-ac30-71aee75dce39","Type":"ContainerDied","Data":"09ff8ceaf581582eb1b59711a2e5f47c2dc5074668437dc94d32c83d61187616"} Mar 09 10:00:47 crc kubenswrapper[4792]: I0309 10:00:47.880573 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"1477887c-abbe-4df3-ac30-71aee75dce39","Type":"ContainerDied","Data":"19eb4afd04a883cae13511718fb22a1271a83669406b48cd862a31887f270b04"} Mar 09 10:00:47 crc kubenswrapper[4792]: I0309 10:00:47.880591 4792 scope.go:117] "RemoveContainer" containerID="aa1f7a93aaf30279398684d390a45b8d8123d4808f5f14a77f4fa8cecfb18dbb" Mar 09 10:00:47 crc kubenswrapper[4792]: I0309 10:00:47.880758 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 09 10:00:47 crc kubenswrapper[4792]: I0309 10:00:47.895005 4792 generic.go:334] "Generic (PLEG): container finished" podID="7a06a0b7-4efd-480b-9530-99425153e278" containerID="2e4d6a0f542a04616ea68fef023f06f23c4c5ae8fee25cf0d8ebcbd25a16d9d9" exitCode=0 Mar 09 10:00:47 crc kubenswrapper[4792]: I0309 10:00:47.895042 4792 generic.go:334] "Generic (PLEG): container finished" podID="7a06a0b7-4efd-480b-9530-99425153e278" containerID="e99b74a3cfa2c3ac75885b676aa9295553717023e113e1490a11968305b44022" exitCode=143 Mar 09 10:00:47 crc kubenswrapper[4792]: I0309 10:00:47.895085 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"7a06a0b7-4efd-480b-9530-99425153e278","Type":"ContainerDied","Data":"2e4d6a0f542a04616ea68fef023f06f23c4c5ae8fee25cf0d8ebcbd25a16d9d9"} Mar 09 10:00:47 crc kubenswrapper[4792]: I0309 10:00:47.895118 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"7a06a0b7-4efd-480b-9530-99425153e278","Type":"ContainerDied","Data":"e99b74a3cfa2c3ac75885b676aa9295553717023e113e1490a11968305b44022"} Mar 09 10:00:47 crc kubenswrapper[4792]: I0309 10:00:47.900129 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 09 10:00:47 crc kubenswrapper[4792]: I0309 10:00:47.902121 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1477887c-abbe-4df3-ac30-71aee75dce39-scripts\") pod \"1477887c-abbe-4df3-ac30-71aee75dce39\" (UID: \"1477887c-abbe-4df3-ac30-71aee75dce39\") " Mar 09 10:00:47 crc kubenswrapper[4792]: I0309 10:00:47.902207 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1477887c-abbe-4df3-ac30-71aee75dce39-combined-ca-bundle\") pod \"1477887c-abbe-4df3-ac30-71aee75dce39\" (UID: \"1477887c-abbe-4df3-ac30-71aee75dce39\") " Mar 09 10:00:47 crc kubenswrapper[4792]: I0309 10:00:47.902348 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1477887c-abbe-4df3-ac30-71aee75dce39-httpd-run\") pod \"1477887c-abbe-4df3-ac30-71aee75dce39\" (UID: \"1477887c-abbe-4df3-ac30-71aee75dce39\") " Mar 09 10:00:47 crc kubenswrapper[4792]: I0309 10:00:47.902414 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/1477887c-abbe-4df3-ac30-71aee75dce39-ceph\") pod \"1477887c-abbe-4df3-ac30-71aee75dce39\" (UID: \"1477887c-abbe-4df3-ac30-71aee75dce39\") " Mar 09 10:00:47 crc kubenswrapper[4792]: I0309 10:00:47.902436 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"1477887c-abbe-4df3-ac30-71aee75dce39\" (UID: \"1477887c-abbe-4df3-ac30-71aee75dce39\") " Mar 09 10:00:47 crc kubenswrapper[4792]: I0309 10:00:47.902536 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/1477887c-abbe-4df3-ac30-71aee75dce39-public-tls-certs\") pod \"1477887c-abbe-4df3-ac30-71aee75dce39\" (UID: \"1477887c-abbe-4df3-ac30-71aee75dce39\") " Mar 09 10:00:47 crc kubenswrapper[4792]: I0309 10:00:47.902607 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1477887c-abbe-4df3-ac30-71aee75dce39-logs\") pod \"1477887c-abbe-4df3-ac30-71aee75dce39\" (UID: \"1477887c-abbe-4df3-ac30-71aee75dce39\") " Mar 09 10:00:47 crc kubenswrapper[4792]: I0309 10:00:47.902643 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1477887c-abbe-4df3-ac30-71aee75dce39-config-data\") pod \"1477887c-abbe-4df3-ac30-71aee75dce39\" (UID: \"1477887c-abbe-4df3-ac30-71aee75dce39\") " Mar 09 10:00:47 crc kubenswrapper[4792]: I0309 10:00:47.902692 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j69hm\" (UniqueName: \"kubernetes.io/projected/1477887c-abbe-4df3-ac30-71aee75dce39-kube-api-access-j69hm\") pod \"1477887c-abbe-4df3-ac30-71aee75dce39\" (UID: \"1477887c-abbe-4df3-ac30-71aee75dce39\") " Mar 09 10:00:47 crc kubenswrapper[4792]: I0309 10:00:47.903728 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1477887c-abbe-4df3-ac30-71aee75dce39-logs" (OuterVolumeSpecName: "logs") pod "1477887c-abbe-4df3-ac30-71aee75dce39" (UID: "1477887c-abbe-4df3-ac30-71aee75dce39"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 10:00:47 crc kubenswrapper[4792]: I0309 10:00:47.903853 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1477887c-abbe-4df3-ac30-71aee75dce39-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "1477887c-abbe-4df3-ac30-71aee75dce39" (UID: "1477887c-abbe-4df3-ac30-71aee75dce39"). 
InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 10:00:47 crc kubenswrapper[4792]: I0309 10:00:47.913335 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1477887c-abbe-4df3-ac30-71aee75dce39-scripts" (OuterVolumeSpecName: "scripts") pod "1477887c-abbe-4df3-ac30-71aee75dce39" (UID: "1477887c-abbe-4df3-ac30-71aee75dce39"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 10:00:47 crc kubenswrapper[4792]: I0309 10:00:47.915325 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "glance") pod "1477887c-abbe-4df3-ac30-71aee75dce39" (UID: "1477887c-abbe-4df3-ac30-71aee75dce39"). InnerVolumeSpecName "local-storage12-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 09 10:00:47 crc kubenswrapper[4792]: I0309 10:00:47.917709 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1477887c-abbe-4df3-ac30-71aee75dce39-kube-api-access-j69hm" (OuterVolumeSpecName: "kube-api-access-j69hm") pod "1477887c-abbe-4df3-ac30-71aee75dce39" (UID: "1477887c-abbe-4df3-ac30-71aee75dce39"). InnerVolumeSpecName "kube-api-access-j69hm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 10:00:47 crc kubenswrapper[4792]: I0309 10:00:47.939585 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1477887c-abbe-4df3-ac30-71aee75dce39-ceph" (OuterVolumeSpecName: "ceph") pod "1477887c-abbe-4df3-ac30-71aee75dce39" (UID: "1477887c-abbe-4df3-ac30-71aee75dce39"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 10:00:47 crc kubenswrapper[4792]: I0309 10:00:47.966848 4792 scope.go:117] "RemoveContainer" containerID="09ff8ceaf581582eb1b59711a2e5f47c2dc5074668437dc94d32c83d61187616" Mar 09 10:00:47 crc kubenswrapper[4792]: I0309 10:00:47.987743 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1477887c-abbe-4df3-ac30-71aee75dce39-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1477887c-abbe-4df3-ac30-71aee75dce39" (UID: "1477887c-abbe-4df3-ac30-71aee75dce39"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 10:00:48 crc kubenswrapper[4792]: I0309 10:00:48.005445 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"7a06a0b7-4efd-480b-9530-99425153e278\" (UID: \"7a06a0b7-4efd-480b-9530-99425153e278\") " Mar 09 10:00:48 crc kubenswrapper[4792]: I0309 10:00:48.006018 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7a06a0b7-4efd-480b-9530-99425153e278-httpd-run\") pod \"7a06a0b7-4efd-480b-9530-99425153e278\" (UID: \"7a06a0b7-4efd-480b-9530-99425153e278\") " Mar 09 10:00:48 crc kubenswrapper[4792]: I0309 10:00:48.006169 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7a06a0b7-4efd-480b-9530-99425153e278-internal-tls-certs\") pod \"7a06a0b7-4efd-480b-9530-99425153e278\" (UID: \"7a06a0b7-4efd-480b-9530-99425153e278\") " Mar 09 10:00:48 crc kubenswrapper[4792]: I0309 10:00:48.006305 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7a06a0b7-4efd-480b-9530-99425153e278-scripts\") pod \"7a06a0b7-4efd-480b-9530-99425153e278\" (UID: 
\"7a06a0b7-4efd-480b-9530-99425153e278\") " Mar 09 10:00:48 crc kubenswrapper[4792]: I0309 10:00:48.006445 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a06a0b7-4efd-480b-9530-99425153e278-config-data\") pod \"7a06a0b7-4efd-480b-9530-99425153e278\" (UID: \"7a06a0b7-4efd-480b-9530-99425153e278\") " Mar 09 10:00:48 crc kubenswrapper[4792]: I0309 10:00:48.006673 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7a06a0b7-4efd-480b-9530-99425153e278-logs\") pod \"7a06a0b7-4efd-480b-9530-99425153e278\" (UID: \"7a06a0b7-4efd-480b-9530-99425153e278\") " Mar 09 10:00:48 crc kubenswrapper[4792]: I0309 10:00:48.006880 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9hdbr\" (UniqueName: \"kubernetes.io/projected/7a06a0b7-4efd-480b-9530-99425153e278-kube-api-access-9hdbr\") pod \"7a06a0b7-4efd-480b-9530-99425153e278\" (UID: \"7a06a0b7-4efd-480b-9530-99425153e278\") " Mar 09 10:00:48 crc kubenswrapper[4792]: I0309 10:00:48.007613 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/7a06a0b7-4efd-480b-9530-99425153e278-ceph\") pod \"7a06a0b7-4efd-480b-9530-99425153e278\" (UID: \"7a06a0b7-4efd-480b-9530-99425153e278\") " Mar 09 10:00:48 crc kubenswrapper[4792]: I0309 10:00:48.007758 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a06a0b7-4efd-480b-9530-99425153e278-combined-ca-bundle\") pod \"7a06a0b7-4efd-480b-9530-99425153e278\" (UID: \"7a06a0b7-4efd-480b-9530-99425153e278\") " Mar 09 10:00:48 crc kubenswrapper[4792]: I0309 10:00:48.008555 4792 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/1477887c-abbe-4df3-ac30-71aee75dce39-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 10:00:48 crc kubenswrapper[4792]: I0309 10:00:48.008736 4792 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1477887c-abbe-4df3-ac30-71aee75dce39-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 10:00:48 crc kubenswrapper[4792]: I0309 10:00:48.008887 4792 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1477887c-abbe-4df3-ac30-71aee75dce39-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 09 10:00:48 crc kubenswrapper[4792]: I0309 10:00:48.008994 4792 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/1477887c-abbe-4df3-ac30-71aee75dce39-ceph\") on node \"crc\" DevicePath \"\"" Mar 09 10:00:48 crc kubenswrapper[4792]: I0309 10:00:48.009125 4792 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" " Mar 09 10:00:48 crc kubenswrapper[4792]: I0309 10:00:48.009241 4792 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1477887c-abbe-4df3-ac30-71aee75dce39-logs\") on node \"crc\" DevicePath \"\"" Mar 09 10:00:48 crc kubenswrapper[4792]: I0309 10:00:48.009348 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j69hm\" (UniqueName: \"kubernetes.io/projected/1477887c-abbe-4df3-ac30-71aee75dce39-kube-api-access-j69hm\") on node \"crc\" DevicePath \"\"" Mar 09 10:00:48 crc kubenswrapper[4792]: I0309 10:00:48.011778 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7a06a0b7-4efd-480b-9530-99425153e278-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "7a06a0b7-4efd-480b-9530-99425153e278" (UID: 
"7a06a0b7-4efd-480b-9530-99425153e278"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 10:00:48 crc kubenswrapper[4792]: I0309 10:00:48.013622 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "glance") pod "7a06a0b7-4efd-480b-9530-99425153e278" (UID: "7a06a0b7-4efd-480b-9530-99425153e278"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 09 10:00:48 crc kubenswrapper[4792]: I0309 10:00:48.013688 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7a06a0b7-4efd-480b-9530-99425153e278-logs" (OuterVolumeSpecName: "logs") pod "7a06a0b7-4efd-480b-9530-99425153e278" (UID: "7a06a0b7-4efd-480b-9530-99425153e278"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 10:00:48 crc kubenswrapper[4792]: I0309 10:00:48.022103 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a06a0b7-4efd-480b-9530-99425153e278-scripts" (OuterVolumeSpecName: "scripts") pod "7a06a0b7-4efd-480b-9530-99425153e278" (UID: "7a06a0b7-4efd-480b-9530-99425153e278"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 10:00:48 crc kubenswrapper[4792]: I0309 10:00:48.024310 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a06a0b7-4efd-480b-9530-99425153e278-kube-api-access-9hdbr" (OuterVolumeSpecName: "kube-api-access-9hdbr") pod "7a06a0b7-4efd-480b-9530-99425153e278" (UID: "7a06a0b7-4efd-480b-9530-99425153e278"). InnerVolumeSpecName "kube-api-access-9hdbr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 10:00:48 crc kubenswrapper[4792]: I0309 10:00:48.033673 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a06a0b7-4efd-480b-9530-99425153e278-ceph" (OuterVolumeSpecName: "ceph") pod "7a06a0b7-4efd-480b-9530-99425153e278" (UID: "7a06a0b7-4efd-480b-9530-99425153e278"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 10:00:48 crc kubenswrapper[4792]: I0309 10:00:48.061225 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1477887c-abbe-4df3-ac30-71aee75dce39-config-data" (OuterVolumeSpecName: "config-data") pod "1477887c-abbe-4df3-ac30-71aee75dce39" (UID: "1477887c-abbe-4df3-ac30-71aee75dce39"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 10:00:48 crc kubenswrapper[4792]: I0309 10:00:48.078296 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a06a0b7-4efd-480b-9530-99425153e278-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7a06a0b7-4efd-480b-9530-99425153e278" (UID: "7a06a0b7-4efd-480b-9530-99425153e278"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 10:00:48 crc kubenswrapper[4792]: I0309 10:00:48.081446 4792 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc" Mar 09 10:00:48 crc kubenswrapper[4792]: I0309 10:00:48.091744 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a06a0b7-4efd-480b-9530-99425153e278-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "7a06a0b7-4efd-480b-9530-99425153e278" (UID: "7a06a0b7-4efd-480b-9530-99425153e278"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 10:00:48 crc kubenswrapper[4792]: I0309 10:00:48.111577 4792 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7a06a0b7-4efd-480b-9530-99425153e278-logs\") on node \"crc\" DevicePath \"\"" Mar 09 10:00:48 crc kubenswrapper[4792]: I0309 10:00:48.111614 4792 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1477887c-abbe-4df3-ac30-71aee75dce39-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 10:00:48 crc kubenswrapper[4792]: I0309 10:00:48.111623 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9hdbr\" (UniqueName: \"kubernetes.io/projected/7a06a0b7-4efd-480b-9530-99425153e278-kube-api-access-9hdbr\") on node \"crc\" DevicePath \"\"" Mar 09 10:00:48 crc kubenswrapper[4792]: I0309 10:00:48.111636 4792 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/7a06a0b7-4efd-480b-9530-99425153e278-ceph\") on node \"crc\" DevicePath \"\"" Mar 09 10:00:48 crc kubenswrapper[4792]: I0309 10:00:48.111645 4792 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a06a0b7-4efd-480b-9530-99425153e278-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 10:00:48 crc kubenswrapper[4792]: I0309 10:00:48.111665 4792 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Mar 09 10:00:48 crc kubenswrapper[4792]: I0309 10:00:48.111673 4792 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7a06a0b7-4efd-480b-9530-99425153e278-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 09 10:00:48 crc kubenswrapper[4792]: I0309 10:00:48.111683 4792 reconciler_common.go:293] "Volume detached for volume 
\"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7a06a0b7-4efd-480b-9530-99425153e278-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 09 10:00:48 crc kubenswrapper[4792]: I0309 10:00:48.111691 4792 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7a06a0b7-4efd-480b-9530-99425153e278-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 10:00:48 crc kubenswrapper[4792]: I0309 10:00:48.111699 4792 reconciler_common.go:293] "Volume detached for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\"" Mar 09 10:00:48 crc kubenswrapper[4792]: I0309 10:00:48.170578 4792 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Mar 09 10:00:48 crc kubenswrapper[4792]: I0309 10:00:48.186652 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1477887c-abbe-4df3-ac30-71aee75dce39-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "1477887c-abbe-4df3-ac30-71aee75dce39" (UID: "1477887c-abbe-4df3-ac30-71aee75dce39"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 10:00:48 crc kubenswrapper[4792]: I0309 10:00:48.201769 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-volume-volume1-0" Mar 09 10:00:48 crc kubenswrapper[4792]: I0309 10:00:48.215142 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a06a0b7-4efd-480b-9530-99425153e278-config-data" (OuterVolumeSpecName: "config-data") pod "7a06a0b7-4efd-480b-9530-99425153e278" (UID: "7a06a0b7-4efd-480b-9530-99425153e278"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 10:00:48 crc kubenswrapper[4792]: I0309 10:00:48.220973 4792 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Mar 09 10:00:48 crc kubenswrapper[4792]: I0309 10:00:48.221022 4792 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a06a0b7-4efd-480b-9530-99425153e278-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 10:00:48 crc kubenswrapper[4792]: I0309 10:00:48.221035 4792 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1477887c-abbe-4df3-ac30-71aee75dce39-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 09 10:00:48 crc kubenswrapper[4792]: I0309 10:00:48.318292 4792 scope.go:117] "RemoveContainer" containerID="aa1f7a93aaf30279398684d390a45b8d8123d4808f5f14a77f4fa8cecfb18dbb" Mar 09 10:00:48 crc kubenswrapper[4792]: E0309 10:00:48.319136 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa1f7a93aaf30279398684d390a45b8d8123d4808f5f14a77f4fa8cecfb18dbb\": container with ID starting with aa1f7a93aaf30279398684d390a45b8d8123d4808f5f14a77f4fa8cecfb18dbb not found: ID does not exist" containerID="aa1f7a93aaf30279398684d390a45b8d8123d4808f5f14a77f4fa8cecfb18dbb" Mar 09 10:00:48 crc kubenswrapper[4792]: I0309 10:00:48.319168 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa1f7a93aaf30279398684d390a45b8d8123d4808f5f14a77f4fa8cecfb18dbb"} err="failed to get container status \"aa1f7a93aaf30279398684d390a45b8d8123d4808f5f14a77f4fa8cecfb18dbb\": rpc error: code = NotFound desc = could not find container \"aa1f7a93aaf30279398684d390a45b8d8123d4808f5f14a77f4fa8cecfb18dbb\": container with ID starting with 
aa1f7a93aaf30279398684d390a45b8d8123d4808f5f14a77f4fa8cecfb18dbb not found: ID does not exist" Mar 09 10:00:48 crc kubenswrapper[4792]: I0309 10:00:48.319186 4792 scope.go:117] "RemoveContainer" containerID="09ff8ceaf581582eb1b59711a2e5f47c2dc5074668437dc94d32c83d61187616" Mar 09 10:00:48 crc kubenswrapper[4792]: E0309 10:00:48.319483 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"09ff8ceaf581582eb1b59711a2e5f47c2dc5074668437dc94d32c83d61187616\": container with ID starting with 09ff8ceaf581582eb1b59711a2e5f47c2dc5074668437dc94d32c83d61187616 not found: ID does not exist" containerID="09ff8ceaf581582eb1b59711a2e5f47c2dc5074668437dc94d32c83d61187616" Mar 09 10:00:48 crc kubenswrapper[4792]: I0309 10:00:48.319507 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"09ff8ceaf581582eb1b59711a2e5f47c2dc5074668437dc94d32c83d61187616"} err="failed to get container status \"09ff8ceaf581582eb1b59711a2e5f47c2dc5074668437dc94d32c83d61187616\": rpc error: code = NotFound desc = could not find container \"09ff8ceaf581582eb1b59711a2e5f47c2dc5074668437dc94d32c83d61187616\": container with ID starting with 09ff8ceaf581582eb1b59711a2e5f47c2dc5074668437dc94d32c83d61187616 not found: ID does not exist" Mar 09 10:00:48 crc kubenswrapper[4792]: I0309 10:00:48.319524 4792 scope.go:117] "RemoveContainer" containerID="aa1f7a93aaf30279398684d390a45b8d8123d4808f5f14a77f4fa8cecfb18dbb" Mar 09 10:00:48 crc kubenswrapper[4792]: I0309 10:00:48.319756 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa1f7a93aaf30279398684d390a45b8d8123d4808f5f14a77f4fa8cecfb18dbb"} err="failed to get container status \"aa1f7a93aaf30279398684d390a45b8d8123d4808f5f14a77f4fa8cecfb18dbb\": rpc error: code = NotFound desc = could not find container \"aa1f7a93aaf30279398684d390a45b8d8123d4808f5f14a77f4fa8cecfb18dbb\": container with ID 
starting with aa1f7a93aaf30279398684d390a45b8d8123d4808f5f14a77f4fa8cecfb18dbb not found: ID does not exist" Mar 09 10:00:48 crc kubenswrapper[4792]: I0309 10:00:48.319769 4792 scope.go:117] "RemoveContainer" containerID="09ff8ceaf581582eb1b59711a2e5f47c2dc5074668437dc94d32c83d61187616" Mar 09 10:00:48 crc kubenswrapper[4792]: I0309 10:00:48.319959 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"09ff8ceaf581582eb1b59711a2e5f47c2dc5074668437dc94d32c83d61187616"} err="failed to get container status \"09ff8ceaf581582eb1b59711a2e5f47c2dc5074668437dc94d32c83d61187616\": rpc error: code = NotFound desc = could not find container \"09ff8ceaf581582eb1b59711a2e5f47c2dc5074668437dc94d32c83d61187616\": container with ID starting with 09ff8ceaf581582eb1b59711a2e5f47c2dc5074668437dc94d32c83d61187616 not found: ID does not exist" Mar 09 10:00:48 crc kubenswrapper[4792]: I0309 10:00:48.350188 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-backup-0" Mar 09 10:00:48 crc kubenswrapper[4792]: I0309 10:00:48.525922 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 09 10:00:48 crc kubenswrapper[4792]: I0309 10:00:48.554188 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 09 10:00:48 crc kubenswrapper[4792]: I0309 10:00:48.583650 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 09 10:00:48 crc kubenswrapper[4792]: E0309 10:00:48.584097 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a06a0b7-4efd-480b-9530-99425153e278" containerName="glance-httpd" Mar 09 10:00:48 crc kubenswrapper[4792]: I0309 10:00:48.584116 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a06a0b7-4efd-480b-9530-99425153e278" containerName="glance-httpd" Mar 09 10:00:48 crc kubenswrapper[4792]: E0309 10:00:48.584138 
4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1477887c-abbe-4df3-ac30-71aee75dce39" containerName="glance-httpd" Mar 09 10:00:48 crc kubenswrapper[4792]: I0309 10:00:48.584146 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="1477887c-abbe-4df3-ac30-71aee75dce39" containerName="glance-httpd" Mar 09 10:00:48 crc kubenswrapper[4792]: E0309 10:00:48.584172 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d8a53c4-679b-42ff-81a7-264b779e3486" containerName="mariadb-account-create-update" Mar 09 10:00:48 crc kubenswrapper[4792]: I0309 10:00:48.584178 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d8a53c4-679b-42ff-81a7-264b779e3486" containerName="mariadb-account-create-update" Mar 09 10:00:48 crc kubenswrapper[4792]: E0309 10:00:48.584190 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3823d2bd-f551-4107-a220-6e97655832e1" containerName="mariadb-database-create" Mar 09 10:00:48 crc kubenswrapper[4792]: I0309 10:00:48.584195 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="3823d2bd-f551-4107-a220-6e97655832e1" containerName="mariadb-database-create" Mar 09 10:00:48 crc kubenswrapper[4792]: E0309 10:00:48.584208 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a06a0b7-4efd-480b-9530-99425153e278" containerName="glance-log" Mar 09 10:00:48 crc kubenswrapper[4792]: I0309 10:00:48.584215 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a06a0b7-4efd-480b-9530-99425153e278" containerName="glance-log" Mar 09 10:00:48 crc kubenswrapper[4792]: E0309 10:00:48.584229 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1477887c-abbe-4df3-ac30-71aee75dce39" containerName="glance-log" Mar 09 10:00:48 crc kubenswrapper[4792]: I0309 10:00:48.584237 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="1477887c-abbe-4df3-ac30-71aee75dce39" containerName="glance-log" Mar 09 10:00:48 crc kubenswrapper[4792]: I0309 
10:00:48.584432 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a06a0b7-4efd-480b-9530-99425153e278" containerName="glance-log" Mar 09 10:00:48 crc kubenswrapper[4792]: I0309 10:00:48.584443 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d8a53c4-679b-42ff-81a7-264b779e3486" containerName="mariadb-account-create-update" Mar 09 10:00:48 crc kubenswrapper[4792]: I0309 10:00:48.584457 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="1477887c-abbe-4df3-ac30-71aee75dce39" containerName="glance-httpd" Mar 09 10:00:48 crc kubenswrapper[4792]: I0309 10:00:48.584466 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="3823d2bd-f551-4107-a220-6e97655832e1" containerName="mariadb-database-create" Mar 09 10:00:48 crc kubenswrapper[4792]: I0309 10:00:48.584477 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a06a0b7-4efd-480b-9530-99425153e278" containerName="glance-httpd" Mar 09 10:00:48 crc kubenswrapper[4792]: I0309 10:00:48.584487 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="1477887c-abbe-4df3-ac30-71aee75dce39" containerName="glance-log" Mar 09 10:00:48 crc kubenswrapper[4792]: I0309 10:00:48.585517 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 09 10:00:48 crc kubenswrapper[4792]: I0309 10:00:48.588799 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 09 10:00:48 crc kubenswrapper[4792]: I0309 10:00:48.589043 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 09 10:00:48 crc kubenswrapper[4792]: I0309 10:00:48.610041 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 09 10:00:48 crc kubenswrapper[4792]: I0309 10:00:48.647135 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2fc0a824-2dfc-436e-ad6e-c0751afcb61f-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"2fc0a824-2dfc-436e-ad6e-c0751afcb61f\") " pod="openstack/glance-default-external-api-0" Mar 09 10:00:48 crc kubenswrapper[4792]: I0309 10:00:48.647231 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2fc0a824-2dfc-436e-ad6e-c0751afcb61f-config-data\") pod \"glance-default-external-api-0\" (UID: \"2fc0a824-2dfc-436e-ad6e-c0751afcb61f\") " pod="openstack/glance-default-external-api-0" Mar 09 10:00:48 crc kubenswrapper[4792]: I0309 10:00:48.647293 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2fc0a824-2dfc-436e-ad6e-c0751afcb61f-logs\") pod \"glance-default-external-api-0\" (UID: \"2fc0a824-2dfc-436e-ad6e-c0751afcb61f\") " pod="openstack/glance-default-external-api-0" Mar 09 10:00:48 crc kubenswrapper[4792]: I0309 10:00:48.647338 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/2fc0a824-2dfc-436e-ad6e-c0751afcb61f-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"2fc0a824-2dfc-436e-ad6e-c0751afcb61f\") " pod="openstack/glance-default-external-api-0" Mar 09 10:00:48 crc kubenswrapper[4792]: I0309 10:00:48.647379 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/2fc0a824-2dfc-436e-ad6e-c0751afcb61f-ceph\") pod \"glance-default-external-api-0\" (UID: \"2fc0a824-2dfc-436e-ad6e-c0751afcb61f\") " pod="openstack/glance-default-external-api-0" Mar 09 10:00:48 crc kubenswrapper[4792]: I0309 10:00:48.647472 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2fc0a824-2dfc-436e-ad6e-c0751afcb61f-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"2fc0a824-2dfc-436e-ad6e-c0751afcb61f\") " pod="openstack/glance-default-external-api-0" Mar 09 10:00:48 crc kubenswrapper[4792]: I0309 10:00:48.647537 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2fc0a824-2dfc-436e-ad6e-c0751afcb61f-scripts\") pod \"glance-default-external-api-0\" (UID: \"2fc0a824-2dfc-436e-ad6e-c0751afcb61f\") " pod="openstack/glance-default-external-api-0" Mar 09 10:00:48 crc kubenswrapper[4792]: I0309 10:00:48.647575 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"2fc0a824-2dfc-436e-ad6e-c0751afcb61f\") " pod="openstack/glance-default-external-api-0" Mar 09 10:00:48 crc kubenswrapper[4792]: I0309 10:00:48.647601 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z6k4l\" (UniqueName: 
\"kubernetes.io/projected/2fc0a824-2dfc-436e-ad6e-c0751afcb61f-kube-api-access-z6k4l\") pod \"glance-default-external-api-0\" (UID: \"2fc0a824-2dfc-436e-ad6e-c0751afcb61f\") " pod="openstack/glance-default-external-api-0" Mar 09 10:00:48 crc kubenswrapper[4792]: I0309 10:00:48.749300 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2fc0a824-2dfc-436e-ad6e-c0751afcb61f-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"2fc0a824-2dfc-436e-ad6e-c0751afcb61f\") " pod="openstack/glance-default-external-api-0" Mar 09 10:00:48 crc kubenswrapper[4792]: I0309 10:00:48.749353 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2fc0a824-2dfc-436e-ad6e-c0751afcb61f-config-data\") pod \"glance-default-external-api-0\" (UID: \"2fc0a824-2dfc-436e-ad6e-c0751afcb61f\") " pod="openstack/glance-default-external-api-0" Mar 09 10:00:48 crc kubenswrapper[4792]: I0309 10:00:48.749396 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2fc0a824-2dfc-436e-ad6e-c0751afcb61f-logs\") pod \"glance-default-external-api-0\" (UID: \"2fc0a824-2dfc-436e-ad6e-c0751afcb61f\") " pod="openstack/glance-default-external-api-0" Mar 09 10:00:48 crc kubenswrapper[4792]: I0309 10:00:48.749426 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2fc0a824-2dfc-436e-ad6e-c0751afcb61f-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"2fc0a824-2dfc-436e-ad6e-c0751afcb61f\") " pod="openstack/glance-default-external-api-0" Mar 09 10:00:48 crc kubenswrapper[4792]: I0309 10:00:48.749453 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/2fc0a824-2dfc-436e-ad6e-c0751afcb61f-ceph\") pod 
\"glance-default-external-api-0\" (UID: \"2fc0a824-2dfc-436e-ad6e-c0751afcb61f\") " pod="openstack/glance-default-external-api-0" Mar 09 10:00:48 crc kubenswrapper[4792]: I0309 10:00:48.749534 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2fc0a824-2dfc-436e-ad6e-c0751afcb61f-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"2fc0a824-2dfc-436e-ad6e-c0751afcb61f\") " pod="openstack/glance-default-external-api-0" Mar 09 10:00:48 crc kubenswrapper[4792]: I0309 10:00:48.749576 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2fc0a824-2dfc-436e-ad6e-c0751afcb61f-scripts\") pod \"glance-default-external-api-0\" (UID: \"2fc0a824-2dfc-436e-ad6e-c0751afcb61f\") " pod="openstack/glance-default-external-api-0" Mar 09 10:00:48 crc kubenswrapper[4792]: I0309 10:00:48.749598 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"2fc0a824-2dfc-436e-ad6e-c0751afcb61f\") " pod="openstack/glance-default-external-api-0" Mar 09 10:00:48 crc kubenswrapper[4792]: I0309 10:00:48.749614 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z6k4l\" (UniqueName: \"kubernetes.io/projected/2fc0a824-2dfc-436e-ad6e-c0751afcb61f-kube-api-access-z6k4l\") pod \"glance-default-external-api-0\" (UID: \"2fc0a824-2dfc-436e-ad6e-c0751afcb61f\") " pod="openstack/glance-default-external-api-0" Mar 09 10:00:48 crc kubenswrapper[4792]: I0309 10:00:48.752263 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2fc0a824-2dfc-436e-ad6e-c0751afcb61f-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"2fc0a824-2dfc-436e-ad6e-c0751afcb61f\") " 
pod="openstack/glance-default-external-api-0" Mar 09 10:00:48 crc kubenswrapper[4792]: I0309 10:00:48.753164 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2fc0a824-2dfc-436e-ad6e-c0751afcb61f-logs\") pod \"glance-default-external-api-0\" (UID: \"2fc0a824-2dfc-436e-ad6e-c0751afcb61f\") " pod="openstack/glance-default-external-api-0" Mar 09 10:00:48 crc kubenswrapper[4792]: I0309 10:00:48.757640 4792 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"2fc0a824-2dfc-436e-ad6e-c0751afcb61f\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/glance-default-external-api-0" Mar 09 10:00:48 crc kubenswrapper[4792]: I0309 10:00:48.758223 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2fc0a824-2dfc-436e-ad6e-c0751afcb61f-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"2fc0a824-2dfc-436e-ad6e-c0751afcb61f\") " pod="openstack/glance-default-external-api-0" Mar 09 10:00:48 crc kubenswrapper[4792]: I0309 10:00:48.759165 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/2fc0a824-2dfc-436e-ad6e-c0751afcb61f-ceph\") pod \"glance-default-external-api-0\" (UID: \"2fc0a824-2dfc-436e-ad6e-c0751afcb61f\") " pod="openstack/glance-default-external-api-0" Mar 09 10:00:48 crc kubenswrapper[4792]: I0309 10:00:48.759648 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2fc0a824-2dfc-436e-ad6e-c0751afcb61f-config-data\") pod \"glance-default-external-api-0\" (UID: \"2fc0a824-2dfc-436e-ad6e-c0751afcb61f\") " pod="openstack/glance-default-external-api-0" Mar 09 10:00:48 crc kubenswrapper[4792]: I0309 
10:00:48.760842 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2fc0a824-2dfc-436e-ad6e-c0751afcb61f-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"2fc0a824-2dfc-436e-ad6e-c0751afcb61f\") " pod="openstack/glance-default-external-api-0" Mar 09 10:00:48 crc kubenswrapper[4792]: I0309 10:00:48.765587 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2fc0a824-2dfc-436e-ad6e-c0751afcb61f-scripts\") pod \"glance-default-external-api-0\" (UID: \"2fc0a824-2dfc-436e-ad6e-c0751afcb61f\") " pod="openstack/glance-default-external-api-0" Mar 09 10:00:48 crc kubenswrapper[4792]: I0309 10:00:48.768275 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z6k4l\" (UniqueName: \"kubernetes.io/projected/2fc0a824-2dfc-436e-ad6e-c0751afcb61f-kube-api-access-z6k4l\") pod \"glance-default-external-api-0\" (UID: \"2fc0a824-2dfc-436e-ad6e-c0751afcb61f\") " pod="openstack/glance-default-external-api-0" Mar 09 10:00:48 crc kubenswrapper[4792]: I0309 10:00:48.796024 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"2fc0a824-2dfc-436e-ad6e-c0751afcb61f\") " pod="openstack/glance-default-external-api-0" Mar 09 10:00:48 crc kubenswrapper[4792]: I0309 10:00:48.914919 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"7a06a0b7-4efd-480b-9530-99425153e278","Type":"ContainerDied","Data":"1f439fc34c2bfe833df0b2cf2e80092d46acfa606c659730461704e2cf851faf"} Mar 09 10:00:48 crc kubenswrapper[4792]: I0309 10:00:48.914980 4792 scope.go:117] "RemoveContainer" containerID="2e4d6a0f542a04616ea68fef023f06f23c4c5ae8fee25cf0d8ebcbd25a16d9d9" Mar 09 10:00:48 crc kubenswrapper[4792]: I0309 
10:00:48.915274 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 09 10:00:48 crc kubenswrapper[4792]: I0309 10:00:48.929263 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 09 10:00:48 crc kubenswrapper[4792]: I0309 10:00:48.974104 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 09 10:00:48 crc kubenswrapper[4792]: I0309 10:00:48.995456 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 09 10:00:49 crc kubenswrapper[4792]: I0309 10:00:49.020727 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 09 10:00:49 crc kubenswrapper[4792]: I0309 10:00:49.022466 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 09 10:00:49 crc kubenswrapper[4792]: I0309 10:00:49.030915 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 09 10:00:49 crc kubenswrapper[4792]: I0309 10:00:49.033289 4792 scope.go:117] "RemoveContainer" containerID="e99b74a3cfa2c3ac75885b676aa9295553717023e113e1490a11968305b44022" Mar 09 10:00:49 crc kubenswrapper[4792]: I0309 10:00:49.035428 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 09 10:00:49 crc kubenswrapper[4792]: I0309 10:00:49.037856 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 09 10:00:49 crc kubenswrapper[4792]: I0309 10:00:49.066979 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5f4e1727-ddeb-4d56-8fc8-005b7b9c1b3c-internal-tls-certs\") pod 
\"glance-default-internal-api-0\" (UID: \"5f4e1727-ddeb-4d56-8fc8-005b7b9c1b3c\") " pod="openstack/glance-default-internal-api-0" Mar 09 10:00:49 crc kubenswrapper[4792]: I0309 10:00:49.070787 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5f4e1727-ddeb-4d56-8fc8-005b7b9c1b3c-logs\") pod \"glance-default-internal-api-0\" (UID: \"5f4e1727-ddeb-4d56-8fc8-005b7b9c1b3c\") " pod="openstack/glance-default-internal-api-0" Mar 09 10:00:49 crc kubenswrapper[4792]: I0309 10:00:49.071125 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"5f4e1727-ddeb-4d56-8fc8-005b7b9c1b3c\") " pod="openstack/glance-default-internal-api-0" Mar 09 10:00:49 crc kubenswrapper[4792]: I0309 10:00:49.071337 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f4e1727-ddeb-4d56-8fc8-005b7b9c1b3c-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"5f4e1727-ddeb-4d56-8fc8-005b7b9c1b3c\") " pod="openstack/glance-default-internal-api-0" Mar 09 10:00:49 crc kubenswrapper[4792]: I0309 10:00:49.071462 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f4e1727-ddeb-4d56-8fc8-005b7b9c1b3c-config-data\") pod \"glance-default-internal-api-0\" (UID: \"5f4e1727-ddeb-4d56-8fc8-005b7b9c1b3c\") " pod="openstack/glance-default-internal-api-0" Mar 09 10:00:49 crc kubenswrapper[4792]: I0309 10:00:49.072921 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qbcmj\" (UniqueName: 
\"kubernetes.io/projected/5f4e1727-ddeb-4d56-8fc8-005b7b9c1b3c-kube-api-access-qbcmj\") pod \"glance-default-internal-api-0\" (UID: \"5f4e1727-ddeb-4d56-8fc8-005b7b9c1b3c\") " pod="openstack/glance-default-internal-api-0" Mar 09 10:00:49 crc kubenswrapper[4792]: I0309 10:00:49.073176 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5f4e1727-ddeb-4d56-8fc8-005b7b9c1b3c-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"5f4e1727-ddeb-4d56-8fc8-005b7b9c1b3c\") " pod="openstack/glance-default-internal-api-0" Mar 09 10:00:49 crc kubenswrapper[4792]: I0309 10:00:49.073263 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/5f4e1727-ddeb-4d56-8fc8-005b7b9c1b3c-ceph\") pod \"glance-default-internal-api-0\" (UID: \"5f4e1727-ddeb-4d56-8fc8-005b7b9c1b3c\") " pod="openstack/glance-default-internal-api-0" Mar 09 10:00:49 crc kubenswrapper[4792]: I0309 10:00:49.073359 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5f4e1727-ddeb-4d56-8fc8-005b7b9c1b3c-scripts\") pod \"glance-default-internal-api-0\" (UID: \"5f4e1727-ddeb-4d56-8fc8-005b7b9c1b3c\") " pod="openstack/glance-default-internal-api-0" Mar 09 10:00:49 crc kubenswrapper[4792]: I0309 10:00:49.175351 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5f4e1727-ddeb-4d56-8fc8-005b7b9c1b3c-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"5f4e1727-ddeb-4d56-8fc8-005b7b9c1b3c\") " pod="openstack/glance-default-internal-api-0" Mar 09 10:00:49 crc kubenswrapper[4792]: I0309 10:00:49.175407 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/5f4e1727-ddeb-4d56-8fc8-005b7b9c1b3c-logs\") pod \"glance-default-internal-api-0\" (UID: \"5f4e1727-ddeb-4d56-8fc8-005b7b9c1b3c\") " pod="openstack/glance-default-internal-api-0" Mar 09 10:00:49 crc kubenswrapper[4792]: I0309 10:00:49.175437 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"5f4e1727-ddeb-4d56-8fc8-005b7b9c1b3c\") " pod="openstack/glance-default-internal-api-0" Mar 09 10:00:49 crc kubenswrapper[4792]: I0309 10:00:49.175504 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f4e1727-ddeb-4d56-8fc8-005b7b9c1b3c-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"5f4e1727-ddeb-4d56-8fc8-005b7b9c1b3c\") " pod="openstack/glance-default-internal-api-0" Mar 09 10:00:49 crc kubenswrapper[4792]: I0309 10:00:49.175554 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f4e1727-ddeb-4d56-8fc8-005b7b9c1b3c-config-data\") pod \"glance-default-internal-api-0\" (UID: \"5f4e1727-ddeb-4d56-8fc8-005b7b9c1b3c\") " pod="openstack/glance-default-internal-api-0" Mar 09 10:00:49 crc kubenswrapper[4792]: I0309 10:00:49.175728 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qbcmj\" (UniqueName: \"kubernetes.io/projected/5f4e1727-ddeb-4d56-8fc8-005b7b9c1b3c-kube-api-access-qbcmj\") pod \"glance-default-internal-api-0\" (UID: \"5f4e1727-ddeb-4d56-8fc8-005b7b9c1b3c\") " pod="openstack/glance-default-internal-api-0" Mar 09 10:00:49 crc kubenswrapper[4792]: I0309 10:00:49.175908 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/5f4e1727-ddeb-4d56-8fc8-005b7b9c1b3c-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"5f4e1727-ddeb-4d56-8fc8-005b7b9c1b3c\") " pod="openstack/glance-default-internal-api-0" Mar 09 10:00:49 crc kubenswrapper[4792]: I0309 10:00:49.175973 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/5f4e1727-ddeb-4d56-8fc8-005b7b9c1b3c-ceph\") pod \"glance-default-internal-api-0\" (UID: \"5f4e1727-ddeb-4d56-8fc8-005b7b9c1b3c\") " pod="openstack/glance-default-internal-api-0" Mar 09 10:00:49 crc kubenswrapper[4792]: I0309 10:00:49.176035 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5f4e1727-ddeb-4d56-8fc8-005b7b9c1b3c-scripts\") pod \"glance-default-internal-api-0\" (UID: \"5f4e1727-ddeb-4d56-8fc8-005b7b9c1b3c\") " pod="openstack/glance-default-internal-api-0" Mar 09 10:00:49 crc kubenswrapper[4792]: I0309 10:00:49.177340 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5f4e1727-ddeb-4d56-8fc8-005b7b9c1b3c-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"5f4e1727-ddeb-4d56-8fc8-005b7b9c1b3c\") " pod="openstack/glance-default-internal-api-0" Mar 09 10:00:49 crc kubenswrapper[4792]: I0309 10:00:49.178253 4792 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"5f4e1727-ddeb-4d56-8fc8-005b7b9c1b3c\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/glance-default-internal-api-0" Mar 09 10:00:49 crc kubenswrapper[4792]: I0309 10:00:49.178360 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5f4e1727-ddeb-4d56-8fc8-005b7b9c1b3c-logs\") pod 
\"glance-default-internal-api-0\" (UID: \"5f4e1727-ddeb-4d56-8fc8-005b7b9c1b3c\") " pod="openstack/glance-default-internal-api-0" Mar 09 10:00:49 crc kubenswrapper[4792]: I0309 10:00:49.189315 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f4e1727-ddeb-4d56-8fc8-005b7b9c1b3c-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"5f4e1727-ddeb-4d56-8fc8-005b7b9c1b3c\") " pod="openstack/glance-default-internal-api-0" Mar 09 10:00:49 crc kubenswrapper[4792]: I0309 10:00:49.189557 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f4e1727-ddeb-4d56-8fc8-005b7b9c1b3c-config-data\") pod \"glance-default-internal-api-0\" (UID: \"5f4e1727-ddeb-4d56-8fc8-005b7b9c1b3c\") " pod="openstack/glance-default-internal-api-0" Mar 09 10:00:49 crc kubenswrapper[4792]: I0309 10:00:49.196466 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5f4e1727-ddeb-4d56-8fc8-005b7b9c1b3c-scripts\") pod \"glance-default-internal-api-0\" (UID: \"5f4e1727-ddeb-4d56-8fc8-005b7b9c1b3c\") " pod="openstack/glance-default-internal-api-0" Mar 09 10:00:49 crc kubenswrapper[4792]: I0309 10:00:49.201461 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5f4e1727-ddeb-4d56-8fc8-005b7b9c1b3c-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"5f4e1727-ddeb-4d56-8fc8-005b7b9c1b3c\") " pod="openstack/glance-default-internal-api-0" Mar 09 10:00:49 crc kubenswrapper[4792]: I0309 10:00:49.208788 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/5f4e1727-ddeb-4d56-8fc8-005b7b9c1b3c-ceph\") pod \"glance-default-internal-api-0\" (UID: \"5f4e1727-ddeb-4d56-8fc8-005b7b9c1b3c\") " 
pod="openstack/glance-default-internal-api-0" Mar 09 10:00:49 crc kubenswrapper[4792]: I0309 10:00:49.234487 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"5f4e1727-ddeb-4d56-8fc8-005b7b9c1b3c\") " pod="openstack/glance-default-internal-api-0" Mar 09 10:00:49 crc kubenswrapper[4792]: I0309 10:00:49.238324 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qbcmj\" (UniqueName: \"kubernetes.io/projected/5f4e1727-ddeb-4d56-8fc8-005b7b9c1b3c-kube-api-access-qbcmj\") pod \"glance-default-internal-api-0\" (UID: \"5f4e1727-ddeb-4d56-8fc8-005b7b9c1b3c\") " pod="openstack/glance-default-internal-api-0" Mar 09 10:00:49 crc kubenswrapper[4792]: I0309 10:00:49.362874 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-db-sync-vll6l"] Mar 09 10:00:49 crc kubenswrapper[4792]: I0309 10:00:49.365163 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-sync-vll6l" Mar 09 10:00:49 crc kubenswrapper[4792]: I0309 10:00:49.371915 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-manila-dockercfg-zfj99" Mar 09 10:00:49 crc kubenswrapper[4792]: I0309 10:00:49.371939 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-config-data" Mar 09 10:00:49 crc kubenswrapper[4792]: I0309 10:00:49.391369 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-sync-vll6l"] Mar 09 10:00:49 crc kubenswrapper[4792]: I0309 10:00:49.428727 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/8d57ab24-ecdd-4c3c-9f8e-c32b449f43c6-job-config-data\") pod \"manila-db-sync-vll6l\" (UID: \"8d57ab24-ecdd-4c3c-9f8e-c32b449f43c6\") " pod="openstack/manila-db-sync-vll6l" Mar 09 10:00:49 crc kubenswrapper[4792]: I0309 10:00:49.428835 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d57ab24-ecdd-4c3c-9f8e-c32b449f43c6-combined-ca-bundle\") pod \"manila-db-sync-vll6l\" (UID: \"8d57ab24-ecdd-4c3c-9f8e-c32b449f43c6\") " pod="openstack/manila-db-sync-vll6l" Mar 09 10:00:49 crc kubenswrapper[4792]: I0309 10:00:49.428867 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zkfj5\" (UniqueName: \"kubernetes.io/projected/8d57ab24-ecdd-4c3c-9f8e-c32b449f43c6-kube-api-access-zkfj5\") pod \"manila-db-sync-vll6l\" (UID: \"8d57ab24-ecdd-4c3c-9f8e-c32b449f43c6\") " pod="openstack/manila-db-sync-vll6l" Mar 09 10:00:49 crc kubenswrapper[4792]: I0309 10:00:49.428939 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/8d57ab24-ecdd-4c3c-9f8e-c32b449f43c6-config-data\") pod \"manila-db-sync-vll6l\" (UID: \"8d57ab24-ecdd-4c3c-9f8e-c32b449f43c6\") " pod="openstack/manila-db-sync-vll6l" Mar 09 10:00:49 crc kubenswrapper[4792]: I0309 10:00:49.456856 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 09 10:00:49 crc kubenswrapper[4792]: I0309 10:00:49.531480 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/8d57ab24-ecdd-4c3c-9f8e-c32b449f43c6-job-config-data\") pod \"manila-db-sync-vll6l\" (UID: \"8d57ab24-ecdd-4c3c-9f8e-c32b449f43c6\") " pod="openstack/manila-db-sync-vll6l" Mar 09 10:00:49 crc kubenswrapper[4792]: I0309 10:00:49.531606 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d57ab24-ecdd-4c3c-9f8e-c32b449f43c6-combined-ca-bundle\") pod \"manila-db-sync-vll6l\" (UID: \"8d57ab24-ecdd-4c3c-9f8e-c32b449f43c6\") " pod="openstack/manila-db-sync-vll6l" Mar 09 10:00:49 crc kubenswrapper[4792]: I0309 10:00:49.531640 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zkfj5\" (UniqueName: \"kubernetes.io/projected/8d57ab24-ecdd-4c3c-9f8e-c32b449f43c6-kube-api-access-zkfj5\") pod \"manila-db-sync-vll6l\" (UID: \"8d57ab24-ecdd-4c3c-9f8e-c32b449f43c6\") " pod="openstack/manila-db-sync-vll6l" Mar 09 10:00:49 crc kubenswrapper[4792]: I0309 10:00:49.531752 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d57ab24-ecdd-4c3c-9f8e-c32b449f43c6-config-data\") pod \"manila-db-sync-vll6l\" (UID: \"8d57ab24-ecdd-4c3c-9f8e-c32b449f43c6\") " pod="openstack/manila-db-sync-vll6l" Mar 09 10:00:49 crc kubenswrapper[4792]: I0309 10:00:49.538035 4792 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/8d57ab24-ecdd-4c3c-9f8e-c32b449f43c6-job-config-data\") pod \"manila-db-sync-vll6l\" (UID: \"8d57ab24-ecdd-4c3c-9f8e-c32b449f43c6\") " pod="openstack/manila-db-sync-vll6l" Mar 09 10:00:49 crc kubenswrapper[4792]: I0309 10:00:49.538687 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d57ab24-ecdd-4c3c-9f8e-c32b449f43c6-config-data\") pod \"manila-db-sync-vll6l\" (UID: \"8d57ab24-ecdd-4c3c-9f8e-c32b449f43c6\") " pod="openstack/manila-db-sync-vll6l" Mar 09 10:00:49 crc kubenswrapper[4792]: I0309 10:00:49.541815 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d57ab24-ecdd-4c3c-9f8e-c32b449f43c6-combined-ca-bundle\") pod \"manila-db-sync-vll6l\" (UID: \"8d57ab24-ecdd-4c3c-9f8e-c32b449f43c6\") " pod="openstack/manila-db-sync-vll6l" Mar 09 10:00:49 crc kubenswrapper[4792]: I0309 10:00:49.562831 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zkfj5\" (UniqueName: \"kubernetes.io/projected/8d57ab24-ecdd-4c3c-9f8e-c32b449f43c6-kube-api-access-zkfj5\") pod \"manila-db-sync-vll6l\" (UID: \"8d57ab24-ecdd-4c3c-9f8e-c32b449f43c6\") " pod="openstack/manila-db-sync-vll6l" Mar 09 10:00:49 crc kubenswrapper[4792]: I0309 10:00:49.680444 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1477887c-abbe-4df3-ac30-71aee75dce39" path="/var/lib/kubelet/pods/1477887c-abbe-4df3-ac30-71aee75dce39/volumes" Mar 09 10:00:49 crc kubenswrapper[4792]: I0309 10:00:49.687498 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a06a0b7-4efd-480b-9530-99425153e278" path="/var/lib/kubelet/pods/7a06a0b7-4efd-480b-9530-99425153e278/volumes" Mar 09 10:00:49 crc kubenswrapper[4792]: I0309 10:00:49.712782 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-sync-vll6l" Mar 09 10:00:49 crc kubenswrapper[4792]: I0309 10:00:49.810555 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 09 10:00:49 crc kubenswrapper[4792]: I0309 10:00:49.958014 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2fc0a824-2dfc-436e-ad6e-c0751afcb61f","Type":"ContainerStarted","Data":"2343f3417e3f23ee42a21d168b7b92d82eb5583dd54813ad2ef955df60ba29b1"} Mar 09 10:00:50 crc kubenswrapper[4792]: I0309 10:00:50.245488 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 09 10:00:50 crc kubenswrapper[4792]: I0309 10:00:50.474598 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-sync-vll6l"] Mar 09 10:00:50 crc kubenswrapper[4792]: I0309 10:00:50.985496 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2fc0a824-2dfc-436e-ad6e-c0751afcb61f","Type":"ContainerStarted","Data":"59310694da34b3800de6a474e7547c49a6bc93372a1b275e80a0e7ed2b49019d"} Mar 09 10:00:50 crc kubenswrapper[4792]: I0309 10:00:50.987462 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"5f4e1727-ddeb-4d56-8fc8-005b7b9c1b3c","Type":"ContainerStarted","Data":"de9ef8565c351e15ffa3f91e636733ecbf863a077f72d2db12ff4c906c01555f"} Mar 09 10:00:58 crc kubenswrapper[4792]: I0309 10:00:58.083613 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-vll6l" event={"ID":"8d57ab24-ecdd-4c3c-9f8e-c32b449f43c6","Type":"ContainerStarted","Data":"d0763feda1de40455373b07c413f9938eaf9d4c54d46419cb4c3749270c786df"} Mar 09 10:00:58 crc kubenswrapper[4792]: I0309 10:00:58.094358 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6f9cb7fbd9-gcq9k" 
event={"ID":"8646be85-08e4-4f0d-9484-01456e453c36","Type":"ContainerStarted","Data":"0ca9ef2b76bd5b0d803f8bd188ae0bfbbdaf2b8e244dcdad076b8c75202ff0fe"} Mar 09 10:00:58 crc kubenswrapper[4792]: I0309 10:00:58.102172 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-796cbc75d5-fwljj" event={"ID":"2c175458-c03c-4901-a2ec-6250b13a461b","Type":"ContainerStarted","Data":"ad2b61e9dbb0c585ca504a0f7aafd0e8ecf4e3a75a071d7571525e2f98d59727"} Mar 09 10:00:58 crc kubenswrapper[4792]: I0309 10:00:58.104546 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-54c85f748d-wxdlf" event={"ID":"d028a70e-dd9d-4b38-bb18-4cd55cd002fe","Type":"ContainerStarted","Data":"92773ce142d9a7aac81d081f013a79a757b3e3363aef752f5ef6c12fc36ddecd"} Mar 09 10:00:58 crc kubenswrapper[4792]: I0309 10:00:58.105720 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-85fd9d548-2q98q" event={"ID":"24426baa-19e0-4ac0-87b6-0f5a824de578","Type":"ContainerStarted","Data":"ad1a1add70b8923f0b7ada06f9c8f1ad552fcf6e3221b1cb992de05b4d3b0fae"} Mar 09 10:00:58 crc kubenswrapper[4792]: I0309 10:00:58.662095 4792 scope.go:117] "RemoveContainer" containerID="d764681645ab8670f7435c3d7eeda989bbb6c0f3c40948420b0a6a2fc3dd7e93" Mar 09 10:00:58 crc kubenswrapper[4792]: E0309 10:00:58.662637 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-97tth_openshift-machine-config-operator(bd11045a-d746-4b42-872c-8b8d1dd2d515)\"" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" Mar 09 10:00:59 crc kubenswrapper[4792]: I0309 10:00:59.133387 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"2fc0a824-2dfc-436e-ad6e-c0751afcb61f","Type":"ContainerStarted","Data":"aa3e61173e4a8ae4f3ff3813b53fcc6275c1ef63c2492ffbd0ecb5cbc3c7bfc8"} Mar 09 10:00:59 crc kubenswrapper[4792]: I0309 10:00:59.136971 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"5f4e1727-ddeb-4d56-8fc8-005b7b9c1b3c","Type":"ContainerStarted","Data":"40b3d4fdb037883927a2fa6a663e951ccc5e889a96a4e9f26722c39536bac1d5"} Mar 09 10:00:59 crc kubenswrapper[4792]: I0309 10:00:59.139637 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-6f9cb7fbd9-gcq9k" podUID="8646be85-08e4-4f0d-9484-01456e453c36" containerName="horizon-log" containerID="cri-o://0ca9ef2b76bd5b0d803f8bd188ae0bfbbdaf2b8e244dcdad076b8c75202ff0fe" gracePeriod=30 Mar 09 10:00:59 crc kubenswrapper[4792]: I0309 10:00:59.139760 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-6f9cb7fbd9-gcq9k" podUID="8646be85-08e4-4f0d-9484-01456e453c36" containerName="horizon" containerID="cri-o://5f22cae7f12dd03e4d2a8f6ec1d96f1a332812420158828599913fb3ae705421" gracePeriod=30 Mar 09 10:00:59 crc kubenswrapper[4792]: I0309 10:00:59.140184 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6f9cb7fbd9-gcq9k" event={"ID":"8646be85-08e4-4f0d-9484-01456e453c36","Type":"ContainerStarted","Data":"5f22cae7f12dd03e4d2a8f6ec1d96f1a332812420158828599913fb3ae705421"} Mar 09 10:00:59 crc kubenswrapper[4792]: I0309 10:00:59.149173 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-796cbc75d5-fwljj" event={"ID":"2c175458-c03c-4901-a2ec-6250b13a461b","Type":"ContainerStarted","Data":"bbe4f241005a7f3226f3d361b3865d351955100e1cd3bcb114704688ca662a53"} Mar 09 10:00:59 crc kubenswrapper[4792]: I0309 10:00:59.149325 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-796cbc75d5-fwljj" 
podUID="2c175458-c03c-4901-a2ec-6250b13a461b" containerName="horizon-log" containerID="cri-o://ad2b61e9dbb0c585ca504a0f7aafd0e8ecf4e3a75a071d7571525e2f98d59727" gracePeriod=30 Mar 09 10:00:59 crc kubenswrapper[4792]: I0309 10:00:59.149584 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-796cbc75d5-fwljj" podUID="2c175458-c03c-4901-a2ec-6250b13a461b" containerName="horizon" containerID="cri-o://bbe4f241005a7f3226f3d361b3865d351955100e1cd3bcb114704688ca662a53" gracePeriod=30 Mar 09 10:00:59 crc kubenswrapper[4792]: I0309 10:00:59.154814 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-54c85f748d-wxdlf" event={"ID":"d028a70e-dd9d-4b38-bb18-4cd55cd002fe","Type":"ContainerStarted","Data":"15becaf0b415f66349567ab4eff3f659aefbb9ea1b266bac9e335c827f98d485"} Mar 09 10:00:59 crc kubenswrapper[4792]: I0309 10:00:59.175359 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-85fd9d548-2q98q" event={"ID":"24426baa-19e0-4ac0-87b6-0f5a824de578","Type":"ContainerStarted","Data":"8db2115c69bab35530668a1e9f86126f8a7d265d2b553c3753f91f58c2a978ce"} Mar 09 10:00:59 crc kubenswrapper[4792]: I0309 10:00:59.176582 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=11.176555345 podStartE2EDuration="11.176555345s" podCreationTimestamp="2026-03-09 10:00:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 10:00:59.163439822 +0000 UTC m=+3224.193640584" watchObservedRunningTime="2026-03-09 10:00:59.176555345 +0000 UTC m=+3224.206756097" Mar 09 10:00:59 crc kubenswrapper[4792]: I0309 10:00:59.203795 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-6f9cb7fbd9-gcq9k" podStartSLOduration=3.003370884 podStartE2EDuration="21.203773138s" podCreationTimestamp="2026-03-09 
10:00:38 +0000 UTC" firstStartedPulling="2026-03-09 10:00:39.659998687 +0000 UTC m=+3204.690199439" lastFinishedPulling="2026-03-09 10:00:57.860400941 +0000 UTC m=+3222.890601693" observedRunningTime="2026-03-09 10:00:59.187593066 +0000 UTC m=+3224.217793818" watchObservedRunningTime="2026-03-09 10:00:59.203773138 +0000 UTC m=+3224.233973900" Mar 09 10:00:59 crc kubenswrapper[4792]: I0309 10:00:59.272687 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-796cbc75d5-fwljj" podStartSLOduration=3.240882984 podStartE2EDuration="21.27266357s" podCreationTimestamp="2026-03-09 10:00:38 +0000 UTC" firstStartedPulling="2026-03-09 10:00:39.614430238 +0000 UTC m=+3204.644630990" lastFinishedPulling="2026-03-09 10:00:57.646210824 +0000 UTC m=+3222.676411576" observedRunningTime="2026-03-09 10:00:59.243245522 +0000 UTC m=+3224.273446284" watchObservedRunningTime="2026-03-09 10:00:59.27266357 +0000 UTC m=+3224.302864322" Mar 09 10:00:59 crc kubenswrapper[4792]: I0309 10:00:59.311005 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-54c85f748d-wxdlf" podStartSLOduration=3.264146379 podStartE2EDuration="17.310983704s" podCreationTimestamp="2026-03-09 10:00:42 +0000 UTC" firstStartedPulling="2026-03-09 10:00:43.688920396 +0000 UTC m=+3208.719121148" lastFinishedPulling="2026-03-09 10:00:57.735757721 +0000 UTC m=+3222.765958473" observedRunningTime="2026-03-09 10:00:59.266691277 +0000 UTC m=+3224.296892029" watchObservedRunningTime="2026-03-09 10:00:59.310983704 +0000 UTC m=+3224.341184466" Mar 09 10:00:59 crc kubenswrapper[4792]: I0309 10:00:59.335929 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-85fd9d548-2q98q" podStartSLOduration=3.093066238 podStartE2EDuration="17.335904238s" podCreationTimestamp="2026-03-09 10:00:42 +0000 UTC" firstStartedPulling="2026-03-09 10:00:43.46064407 +0000 UTC m=+3208.490844822" lastFinishedPulling="2026-03-09 
10:00:57.70348207 +0000 UTC m=+3222.733682822" observedRunningTime="2026-03-09 10:00:59.297560092 +0000 UTC m=+3224.327760864" watchObservedRunningTime="2026-03-09 10:00:59.335904238 +0000 UTC m=+3224.366104990" Mar 09 10:01:00 crc kubenswrapper[4792]: I0309 10:01:00.162720 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29550841-fkrc7"] Mar 09 10:01:00 crc kubenswrapper[4792]: I0309 10:01:00.164052 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29550841-fkrc7" Mar 09 10:01:00 crc kubenswrapper[4792]: I0309 10:01:00.200157 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"5f4e1727-ddeb-4d56-8fc8-005b7b9c1b3c","Type":"ContainerStarted","Data":"f9442f7c4de7926ce5e30fcd6106d1668fdace75fce90cccd9c573376930e6d4"} Mar 09 10:01:00 crc kubenswrapper[4792]: I0309 10:01:00.206565 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29550841-fkrc7"] Mar 09 10:01:00 crc kubenswrapper[4792]: I0309 10:01:00.233436 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b83ae2a5-e733-497b-a5de-56d3a962dec5-config-data\") pod \"keystone-cron-29550841-fkrc7\" (UID: \"b83ae2a5-e733-497b-a5de-56d3a962dec5\") " pod="openstack/keystone-cron-29550841-fkrc7" Mar 09 10:01:00 crc kubenswrapper[4792]: I0309 10:01:00.233771 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b83ae2a5-e733-497b-a5de-56d3a962dec5-combined-ca-bundle\") pod \"keystone-cron-29550841-fkrc7\" (UID: \"b83ae2a5-e733-497b-a5de-56d3a962dec5\") " pod="openstack/keystone-cron-29550841-fkrc7" Mar 09 10:01:00 crc kubenswrapper[4792]: I0309 10:01:00.233960 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-ltq44\" (UniqueName: \"kubernetes.io/projected/b83ae2a5-e733-497b-a5de-56d3a962dec5-kube-api-access-ltq44\") pod \"keystone-cron-29550841-fkrc7\" (UID: \"b83ae2a5-e733-497b-a5de-56d3a962dec5\") " pod="openstack/keystone-cron-29550841-fkrc7" Mar 09 10:01:00 crc kubenswrapper[4792]: I0309 10:01:00.234146 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b83ae2a5-e733-497b-a5de-56d3a962dec5-fernet-keys\") pod \"keystone-cron-29550841-fkrc7\" (UID: \"b83ae2a5-e733-497b-a5de-56d3a962dec5\") " pod="openstack/keystone-cron-29550841-fkrc7" Mar 09 10:01:00 crc kubenswrapper[4792]: I0309 10:01:00.266557 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=12.266533717 podStartE2EDuration="12.266533717s" podCreationTimestamp="2026-03-09 10:00:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 10:01:00.245500543 +0000 UTC m=+3225.275701305" watchObservedRunningTime="2026-03-09 10:01:00.266533717 +0000 UTC m=+3225.296734469" Mar 09 10:01:00 crc kubenswrapper[4792]: I0309 10:01:00.336855 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b83ae2a5-e733-497b-a5de-56d3a962dec5-combined-ca-bundle\") pod \"keystone-cron-29550841-fkrc7\" (UID: \"b83ae2a5-e733-497b-a5de-56d3a962dec5\") " pod="openstack/keystone-cron-29550841-fkrc7" Mar 09 10:01:00 crc kubenswrapper[4792]: I0309 10:01:00.336932 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ltq44\" (UniqueName: \"kubernetes.io/projected/b83ae2a5-e733-497b-a5de-56d3a962dec5-kube-api-access-ltq44\") pod \"keystone-cron-29550841-fkrc7\" (UID: \"b83ae2a5-e733-497b-a5de-56d3a962dec5\") 
" pod="openstack/keystone-cron-29550841-fkrc7" Mar 09 10:01:00 crc kubenswrapper[4792]: I0309 10:01:00.337132 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b83ae2a5-e733-497b-a5de-56d3a962dec5-fernet-keys\") pod \"keystone-cron-29550841-fkrc7\" (UID: \"b83ae2a5-e733-497b-a5de-56d3a962dec5\") " pod="openstack/keystone-cron-29550841-fkrc7" Mar 09 10:01:00 crc kubenswrapper[4792]: I0309 10:01:00.337483 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b83ae2a5-e733-497b-a5de-56d3a962dec5-config-data\") pod \"keystone-cron-29550841-fkrc7\" (UID: \"b83ae2a5-e733-497b-a5de-56d3a962dec5\") " pod="openstack/keystone-cron-29550841-fkrc7" Mar 09 10:01:00 crc kubenswrapper[4792]: I0309 10:01:00.356196 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b83ae2a5-e733-497b-a5de-56d3a962dec5-fernet-keys\") pod \"keystone-cron-29550841-fkrc7\" (UID: \"b83ae2a5-e733-497b-a5de-56d3a962dec5\") " pod="openstack/keystone-cron-29550841-fkrc7" Mar 09 10:01:00 crc kubenswrapper[4792]: I0309 10:01:00.358885 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ltq44\" (UniqueName: \"kubernetes.io/projected/b83ae2a5-e733-497b-a5de-56d3a962dec5-kube-api-access-ltq44\") pod \"keystone-cron-29550841-fkrc7\" (UID: \"b83ae2a5-e733-497b-a5de-56d3a962dec5\") " pod="openstack/keystone-cron-29550841-fkrc7" Mar 09 10:01:00 crc kubenswrapper[4792]: I0309 10:01:00.362317 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b83ae2a5-e733-497b-a5de-56d3a962dec5-config-data\") pod \"keystone-cron-29550841-fkrc7\" (UID: \"b83ae2a5-e733-497b-a5de-56d3a962dec5\") " pod="openstack/keystone-cron-29550841-fkrc7" Mar 09 10:01:00 crc kubenswrapper[4792]: I0309 
10:01:00.378834 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b83ae2a5-e733-497b-a5de-56d3a962dec5-combined-ca-bundle\") pod \"keystone-cron-29550841-fkrc7\" (UID: \"b83ae2a5-e733-497b-a5de-56d3a962dec5\") " pod="openstack/keystone-cron-29550841-fkrc7" Mar 09 10:01:00 crc kubenswrapper[4792]: I0309 10:01:00.496671 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29550841-fkrc7" Mar 09 10:01:01 crc kubenswrapper[4792]: I0309 10:01:01.072494 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29550841-fkrc7"] Mar 09 10:01:02 crc kubenswrapper[4792]: I0309 10:01:02.662365 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-85fd9d548-2q98q" Mar 09 10:01:02 crc kubenswrapper[4792]: I0309 10:01:02.662764 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-85fd9d548-2q98q" Mar 09 10:01:02 crc kubenswrapper[4792]: I0309 10:01:02.825803 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-54c85f748d-wxdlf" Mar 09 10:01:02 crc kubenswrapper[4792]: I0309 10:01:02.825883 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-54c85f748d-wxdlf" Mar 09 10:01:04 crc kubenswrapper[4792]: W0309 10:01:04.597693 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb83ae2a5_e733_497b_a5de_56d3a962dec5.slice/crio-5b516dc2479cf0b0373c803567654823970c0217bb287b92b04d15c7a46584a4 WatchSource:0}: Error finding container 5b516dc2479cf0b0373c803567654823970c0217bb287b92b04d15c7a46584a4: Status 404 returned error can't find the container with id 5b516dc2479cf0b0373c803567654823970c0217bb287b92b04d15c7a46584a4 Mar 09 10:01:05 crc kubenswrapper[4792]: I0309 10:01:05.247572 4792 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29550841-fkrc7" event={"ID":"b83ae2a5-e733-497b-a5de-56d3a962dec5","Type":"ContainerStarted","Data":"5b516dc2479cf0b0373c803567654823970c0217bb287b92b04d15c7a46584a4"} Mar 09 10:01:06 crc kubenswrapper[4792]: I0309 10:01:06.257860 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29550841-fkrc7" event={"ID":"b83ae2a5-e733-497b-a5de-56d3a962dec5","Type":"ContainerStarted","Data":"382458614bcaef123c26d588dd4ce879b671e5bb3efe6ee99c41a928be441be8"} Mar 09 10:01:06 crc kubenswrapper[4792]: I0309 10:01:06.287601 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29550841-fkrc7" podStartSLOduration=6.287580212 podStartE2EDuration="6.287580212s" podCreationTimestamp="2026-03-09 10:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 10:01:06.281651582 +0000 UTC m=+3231.311852344" watchObservedRunningTime="2026-03-09 10:01:06.287580212 +0000 UTC m=+3231.317780964" Mar 09 10:01:07 crc kubenswrapper[4792]: I0309 10:01:07.274812 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-vll6l" event={"ID":"8d57ab24-ecdd-4c3c-9f8e-c32b449f43c6","Type":"ContainerStarted","Data":"8b10739ffff403a67411cdc03c4f0542e1ef6be44b8b7bd8574b945fc5079308"} Mar 09 10:01:07 crc kubenswrapper[4792]: I0309 10:01:07.301580 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-db-sync-vll6l" podStartSLOduration=10.027765912 podStartE2EDuration="18.301560521s" podCreationTimestamp="2026-03-09 10:00:49 +0000 UTC" firstStartedPulling="2026-03-09 10:00:57.513359995 +0000 UTC m=+3222.543560747" lastFinishedPulling="2026-03-09 10:01:05.787154594 +0000 UTC m=+3230.817355356" observedRunningTime="2026-03-09 10:01:07.289978477 +0000 UTC m=+3232.320179229" 
watchObservedRunningTime="2026-03-09 10:01:07.301560521 +0000 UTC m=+3232.331761273" Mar 09 10:01:08 crc kubenswrapper[4792]: I0309 10:01:08.769125 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-6f9cb7fbd9-gcq9k" Mar 09 10:01:08 crc kubenswrapper[4792]: I0309 10:01:08.930606 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 09 10:01:08 crc kubenswrapper[4792]: I0309 10:01:08.930665 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 09 10:01:08 crc kubenswrapper[4792]: I0309 10:01:08.965895 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 09 10:01:09 crc kubenswrapper[4792]: I0309 10:01:09.008733 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 09 10:01:09 crc kubenswrapper[4792]: I0309 10:01:09.048566 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-796cbc75d5-fwljj" Mar 09 10:01:09 crc kubenswrapper[4792]: I0309 10:01:09.288836 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 09 10:01:09 crc kubenswrapper[4792]: I0309 10:01:09.288885 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 09 10:01:09 crc kubenswrapper[4792]: I0309 10:01:09.457967 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 09 10:01:09 crc kubenswrapper[4792]: I0309 10:01:09.458023 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 09 10:01:09 crc kubenswrapper[4792]: I0309 10:01:09.508589 4792 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 09 10:01:09 crc kubenswrapper[4792]: I0309 10:01:09.512286 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 09 10:01:10 crc kubenswrapper[4792]: I0309 10:01:10.326552 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 09 10:01:10 crc kubenswrapper[4792]: I0309 10:01:10.326952 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 09 10:01:11 crc kubenswrapper[4792]: I0309 10:01:11.332846 4792 generic.go:334] "Generic (PLEG): container finished" podID="b83ae2a5-e733-497b-a5de-56d3a962dec5" containerID="382458614bcaef123c26d588dd4ce879b671e5bb3efe6ee99c41a928be441be8" exitCode=0 Mar 09 10:01:11 crc kubenswrapper[4792]: I0309 10:01:11.334405 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29550841-fkrc7" event={"ID":"b83ae2a5-e733-497b-a5de-56d3a962dec5","Type":"ContainerDied","Data":"382458614bcaef123c26d588dd4ce879b671e5bb3efe6ee99c41a928be441be8"} Mar 09 10:01:11 crc kubenswrapper[4792]: I0309 10:01:11.664864 4792 scope.go:117] "RemoveContainer" containerID="d764681645ab8670f7435c3d7eeda989bbb6c0f3c40948420b0a6a2fc3dd7e93" Mar 09 10:01:11 crc kubenswrapper[4792]: E0309 10:01:11.665100 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-97tth_openshift-machine-config-operator(bd11045a-d746-4b42-872c-8b8d1dd2d515)\"" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" Mar 09 10:01:12 crc kubenswrapper[4792]: I0309 10:01:12.341455 4792 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" 
Mar 09 10:01:12 crc kubenswrapper[4792]: I0309 10:01:12.341489 4792 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 09 10:01:12 crc kubenswrapper[4792]: I0309 10:01:12.666486 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-85fd9d548-2q98q" podUID="24426baa-19e0-4ac0-87b6-0f5a824de578" containerName="horizon" probeResult="failure" output="Get \"https://10.217.1.10:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.10:8443: connect: connection refused" Mar 09 10:01:12 crc kubenswrapper[4792]: I0309 10:01:12.831818 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-54c85f748d-wxdlf" podUID="d028a70e-dd9d-4b38-bb18-4cd55cd002fe" containerName="horizon" probeResult="failure" output="Get \"https://10.217.1.11:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.11:8443: connect: connection refused" Mar 09 10:01:12 crc kubenswrapper[4792]: I0309 10:01:12.833364 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29550841-fkrc7" Mar 09 10:01:12 crc kubenswrapper[4792]: I0309 10:01:12.924231 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ltq44\" (UniqueName: \"kubernetes.io/projected/b83ae2a5-e733-497b-a5de-56d3a962dec5-kube-api-access-ltq44\") pod \"b83ae2a5-e733-497b-a5de-56d3a962dec5\" (UID: \"b83ae2a5-e733-497b-a5de-56d3a962dec5\") " Mar 09 10:01:12 crc kubenswrapper[4792]: I0309 10:01:12.924316 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b83ae2a5-e733-497b-a5de-56d3a962dec5-config-data\") pod \"b83ae2a5-e733-497b-a5de-56d3a962dec5\" (UID: \"b83ae2a5-e733-497b-a5de-56d3a962dec5\") " Mar 09 10:01:12 crc kubenswrapper[4792]: I0309 10:01:12.924342 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b83ae2a5-e733-497b-a5de-56d3a962dec5-combined-ca-bundle\") pod \"b83ae2a5-e733-497b-a5de-56d3a962dec5\" (UID: \"b83ae2a5-e733-497b-a5de-56d3a962dec5\") " Mar 09 10:01:12 crc kubenswrapper[4792]: I0309 10:01:12.924529 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b83ae2a5-e733-497b-a5de-56d3a962dec5-fernet-keys\") pod \"b83ae2a5-e733-497b-a5de-56d3a962dec5\" (UID: \"b83ae2a5-e733-497b-a5de-56d3a962dec5\") " Mar 09 10:01:12 crc kubenswrapper[4792]: I0309 10:01:12.938789 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b83ae2a5-e733-497b-a5de-56d3a962dec5-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "b83ae2a5-e733-497b-a5de-56d3a962dec5" (UID: "b83ae2a5-e733-497b-a5de-56d3a962dec5"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 10:01:12 crc kubenswrapper[4792]: I0309 10:01:12.945019 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b83ae2a5-e733-497b-a5de-56d3a962dec5-kube-api-access-ltq44" (OuterVolumeSpecName: "kube-api-access-ltq44") pod "b83ae2a5-e733-497b-a5de-56d3a962dec5" (UID: "b83ae2a5-e733-497b-a5de-56d3a962dec5"). InnerVolumeSpecName "kube-api-access-ltq44". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 10:01:12 crc kubenswrapper[4792]: I0309 10:01:12.979237 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b83ae2a5-e733-497b-a5de-56d3a962dec5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b83ae2a5-e733-497b-a5de-56d3a962dec5" (UID: "b83ae2a5-e733-497b-a5de-56d3a962dec5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 10:01:13 crc kubenswrapper[4792]: I0309 10:01:13.008743 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b83ae2a5-e733-497b-a5de-56d3a962dec5-config-data" (OuterVolumeSpecName: "config-data") pod "b83ae2a5-e733-497b-a5de-56d3a962dec5" (UID: "b83ae2a5-e733-497b-a5de-56d3a962dec5"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 10:01:13 crc kubenswrapper[4792]: I0309 10:01:13.026900 4792 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b83ae2a5-e733-497b-a5de-56d3a962dec5-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 09 10:01:13 crc kubenswrapper[4792]: I0309 10:01:13.026941 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ltq44\" (UniqueName: \"kubernetes.io/projected/b83ae2a5-e733-497b-a5de-56d3a962dec5-kube-api-access-ltq44\") on node \"crc\" DevicePath \"\"" Mar 09 10:01:13 crc kubenswrapper[4792]: I0309 10:01:13.026950 4792 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b83ae2a5-e733-497b-a5de-56d3a962dec5-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 10:01:13 crc kubenswrapper[4792]: I0309 10:01:13.026960 4792 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b83ae2a5-e733-497b-a5de-56d3a962dec5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 10:01:13 crc kubenswrapper[4792]: I0309 10:01:13.349524 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29550841-fkrc7" event={"ID":"b83ae2a5-e733-497b-a5de-56d3a962dec5","Type":"ContainerDied","Data":"5b516dc2479cf0b0373c803567654823970c0217bb287b92b04d15c7a46584a4"} Mar 09 10:01:13 crc kubenswrapper[4792]: I0309 10:01:13.349562 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29550841-fkrc7" Mar 09 10:01:13 crc kubenswrapper[4792]: I0309 10:01:13.349568 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5b516dc2479cf0b0373c803567654823970c0217bb287b92b04d15c7a46584a4" Mar 09 10:01:13 crc kubenswrapper[4792]: I0309 10:01:13.380569 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 09 10:01:13 crc kubenswrapper[4792]: I0309 10:01:13.380665 4792 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 09 10:01:13 crc kubenswrapper[4792]: I0309 10:01:13.384108 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 09 10:01:13 crc kubenswrapper[4792]: I0309 10:01:13.398402 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 09 10:01:13 crc kubenswrapper[4792]: I0309 10:01:13.398533 4792 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 09 10:01:13 crc kubenswrapper[4792]: I0309 10:01:13.400271 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 09 10:01:22 crc kubenswrapper[4792]: I0309 10:01:22.433086 4792 generic.go:334] "Generic (PLEG): container finished" podID="8d57ab24-ecdd-4c3c-9f8e-c32b449f43c6" containerID="8b10739ffff403a67411cdc03c4f0542e1ef6be44b8b7bd8574b945fc5079308" exitCode=0 Mar 09 10:01:22 crc kubenswrapper[4792]: I0309 10:01:22.433168 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-vll6l" event={"ID":"8d57ab24-ecdd-4c3c-9f8e-c32b449f43c6","Type":"ContainerDied","Data":"8b10739ffff403a67411cdc03c4f0542e1ef6be44b8b7bd8574b945fc5079308"} Mar 09 10:01:22 crc kubenswrapper[4792]: I0309 10:01:22.661841 4792 prober.go:107] "Probe failed" probeType="Startup" 
pod="openstack/horizon-85fd9d548-2q98q" podUID="24426baa-19e0-4ac0-87b6-0f5a824de578" containerName="horizon" probeResult="failure" output="Get \"https://10.217.1.10:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.10:8443: connect: connection refused" Mar 09 10:01:22 crc kubenswrapper[4792]: I0309 10:01:22.826256 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-54c85f748d-wxdlf" podUID="d028a70e-dd9d-4b38-bb18-4cd55cd002fe" containerName="horizon" probeResult="failure" output="Get \"https://10.217.1.11:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.11:8443: connect: connection refused" Mar 09 10:01:24 crc kubenswrapper[4792]: I0309 10:01:24.138961 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-sync-vll6l" Mar 09 10:01:24 crc kubenswrapper[4792]: I0309 10:01:24.279399 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d57ab24-ecdd-4c3c-9f8e-c32b449f43c6-combined-ca-bundle\") pod \"8d57ab24-ecdd-4c3c-9f8e-c32b449f43c6\" (UID: \"8d57ab24-ecdd-4c3c-9f8e-c32b449f43c6\") " Mar 09 10:01:24 crc kubenswrapper[4792]: I0309 10:01:24.279529 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d57ab24-ecdd-4c3c-9f8e-c32b449f43c6-config-data\") pod \"8d57ab24-ecdd-4c3c-9f8e-c32b449f43c6\" (UID: \"8d57ab24-ecdd-4c3c-9f8e-c32b449f43c6\") " Mar 09 10:01:24 crc kubenswrapper[4792]: I0309 10:01:24.279581 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/8d57ab24-ecdd-4c3c-9f8e-c32b449f43c6-job-config-data\") pod \"8d57ab24-ecdd-4c3c-9f8e-c32b449f43c6\" (UID: \"8d57ab24-ecdd-4c3c-9f8e-c32b449f43c6\") " Mar 09 10:01:24 crc kubenswrapper[4792]: I0309 10:01:24.279604 4792 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-zkfj5\" (UniqueName: \"kubernetes.io/projected/8d57ab24-ecdd-4c3c-9f8e-c32b449f43c6-kube-api-access-zkfj5\") pod \"8d57ab24-ecdd-4c3c-9f8e-c32b449f43c6\" (UID: \"8d57ab24-ecdd-4c3c-9f8e-c32b449f43c6\") " Mar 09 10:01:24 crc kubenswrapper[4792]: I0309 10:01:24.324476 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d57ab24-ecdd-4c3c-9f8e-c32b449f43c6-kube-api-access-zkfj5" (OuterVolumeSpecName: "kube-api-access-zkfj5") pod "8d57ab24-ecdd-4c3c-9f8e-c32b449f43c6" (UID: "8d57ab24-ecdd-4c3c-9f8e-c32b449f43c6"). InnerVolumeSpecName "kube-api-access-zkfj5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 10:01:24 crc kubenswrapper[4792]: I0309 10:01:24.328330 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d57ab24-ecdd-4c3c-9f8e-c32b449f43c6-config-data" (OuterVolumeSpecName: "config-data") pod "8d57ab24-ecdd-4c3c-9f8e-c32b449f43c6" (UID: "8d57ab24-ecdd-4c3c-9f8e-c32b449f43c6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 10:01:24 crc kubenswrapper[4792]: I0309 10:01:24.330993 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d57ab24-ecdd-4c3c-9f8e-c32b449f43c6-job-config-data" (OuterVolumeSpecName: "job-config-data") pod "8d57ab24-ecdd-4c3c-9f8e-c32b449f43c6" (UID: "8d57ab24-ecdd-4c3c-9f8e-c32b449f43c6"). InnerVolumeSpecName "job-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 10:01:24 crc kubenswrapper[4792]: I0309 10:01:24.356245 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d57ab24-ecdd-4c3c-9f8e-c32b449f43c6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8d57ab24-ecdd-4c3c-9f8e-c32b449f43c6" (UID: "8d57ab24-ecdd-4c3c-9f8e-c32b449f43c6"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 10:01:24 crc kubenswrapper[4792]: I0309 10:01:24.382289 4792 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d57ab24-ecdd-4c3c-9f8e-c32b449f43c6-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 10:01:24 crc kubenswrapper[4792]: I0309 10:01:24.382532 4792 reconciler_common.go:293] "Volume detached for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/8d57ab24-ecdd-4c3c-9f8e-c32b449f43c6-job-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 10:01:24 crc kubenswrapper[4792]: I0309 10:01:24.382625 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkfj5\" (UniqueName: \"kubernetes.io/projected/8d57ab24-ecdd-4c3c-9f8e-c32b449f43c6-kube-api-access-zkfj5\") on node \"crc\" DevicePath \"\"" Mar 09 10:01:24 crc kubenswrapper[4792]: I0309 10:01:24.382748 4792 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d57ab24-ecdd-4c3c-9f8e-c32b449f43c6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 10:01:24 crc kubenswrapper[4792]: I0309 10:01:24.454359 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-vll6l" event={"ID":"8d57ab24-ecdd-4c3c-9f8e-c32b449f43c6","Type":"ContainerDied","Data":"d0763feda1de40455373b07c413f9938eaf9d4c54d46419cb4c3749270c786df"} Mar 09 10:01:24 crc kubenswrapper[4792]: I0309 10:01:24.454735 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d0763feda1de40455373b07c413f9938eaf9d4c54d46419cb4c3749270c786df" Mar 09 10:01:24 crc kubenswrapper[4792]: I0309 10:01:24.454711 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-sync-vll6l" Mar 09 10:01:24 crc kubenswrapper[4792]: I0309 10:01:24.806397 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-scheduler-0"] Mar 09 10:01:24 crc kubenswrapper[4792]: E0309 10:01:24.807216 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d57ab24-ecdd-4c3c-9f8e-c32b449f43c6" containerName="manila-db-sync" Mar 09 10:01:24 crc kubenswrapper[4792]: I0309 10:01:24.807239 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d57ab24-ecdd-4c3c-9f8e-c32b449f43c6" containerName="manila-db-sync" Mar 09 10:01:24 crc kubenswrapper[4792]: E0309 10:01:24.807264 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b83ae2a5-e733-497b-a5de-56d3a962dec5" containerName="keystone-cron" Mar 09 10:01:24 crc kubenswrapper[4792]: I0309 10:01:24.807273 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="b83ae2a5-e733-497b-a5de-56d3a962dec5" containerName="keystone-cron" Mar 09 10:01:24 crc kubenswrapper[4792]: I0309 10:01:24.807507 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d57ab24-ecdd-4c3c-9f8e-c32b449f43c6" containerName="manila-db-sync" Mar 09 10:01:24 crc kubenswrapper[4792]: I0309 10:01:24.807528 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="b83ae2a5-e733-497b-a5de-56d3a962dec5" containerName="keystone-cron" Mar 09 10:01:24 crc kubenswrapper[4792]: I0309 10:01:24.828479 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-scheduler-0" Mar 09 10:01:24 crc kubenswrapper[4792]: I0309 10:01:24.844788 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-config-data" Mar 09 10:01:24 crc kubenswrapper[4792]: I0309 10:01:24.845030 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-scheduler-config-data" Mar 09 10:01:24 crc kubenswrapper[4792]: I0309 10:01:24.845728 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-scripts" Mar 09 10:01:24 crc kubenswrapper[4792]: I0309 10:01:24.857208 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"] Mar 09 10:01:24 crc kubenswrapper[4792]: I0309 10:01:24.857473 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-manila-dockercfg-zfj99" Mar 09 10:01:24 crc kubenswrapper[4792]: I0309 10:01:24.939613 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-share-share1-0"] Mar 09 10:01:24 crc kubenswrapper[4792]: I0309 10:01:24.955236 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-share-share1-0" Mar 09 10:01:24 crc kubenswrapper[4792]: I0309 10:01:24.960730 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-share-share1-config-data" Mar 09 10:01:25 crc kubenswrapper[4792]: I0309 10:01:25.003952 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9e8c6802-54a3-4aed-9476-3f10a2369eea-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"9e8c6802-54a3-4aed-9476-3f10a2369eea\") " pod="openstack/manila-scheduler-0" Mar 09 10:01:25 crc kubenswrapper[4792]: I0309 10:01:25.004080 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4nr2h\" (UniqueName: \"kubernetes.io/projected/9e8c6802-54a3-4aed-9476-3f10a2369eea-kube-api-access-4nr2h\") pod \"manila-scheduler-0\" (UID: \"9e8c6802-54a3-4aed-9476-3f10a2369eea\") " pod="openstack/manila-scheduler-0" Mar 09 10:01:25 crc kubenswrapper[4792]: I0309 10:01:25.004104 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e8c6802-54a3-4aed-9476-3f10a2369eea-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"9e8c6802-54a3-4aed-9476-3f10a2369eea\") " pod="openstack/manila-scheduler-0" Mar 09 10:01:25 crc kubenswrapper[4792]: I0309 10:01:25.004180 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9e8c6802-54a3-4aed-9476-3f10a2369eea-config-data\") pod \"manila-scheduler-0\" (UID: \"9e8c6802-54a3-4aed-9476-3f10a2369eea\") " pod="openstack/manila-scheduler-0" Mar 09 10:01:25 crc kubenswrapper[4792]: I0309 10:01:25.004209 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/9e8c6802-54a3-4aed-9476-3f10a2369eea-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"9e8c6802-54a3-4aed-9476-3f10a2369eea\") " pod="openstack/manila-scheduler-0" Mar 09 10:01:25 crc kubenswrapper[4792]: I0309 10:01:25.004226 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9e8c6802-54a3-4aed-9476-3f10a2369eea-scripts\") pod \"manila-scheduler-0\" (UID: \"9e8c6802-54a3-4aed-9476-3f10a2369eea\") " pod="openstack/manila-scheduler-0" Mar 09 10:01:25 crc kubenswrapper[4792]: I0309 10:01:25.057514 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-86c6bdcc4c-q4zw6"] Mar 09 10:01:25 crc kubenswrapper[4792]: I0309 10:01:25.059042 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86c6bdcc4c-q4zw6" Mar 09 10:01:25 crc kubenswrapper[4792]: I0309 10:01:25.084800 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86c6bdcc4c-q4zw6"] Mar 09 10:01:25 crc kubenswrapper[4792]: I0309 10:01:25.105443 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/dced9726-c642-4e64-a963-9ce584e76e9c-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"dced9726-c642-4e64-a963-9ce584e76e9c\") " pod="openstack/manila-share-share1-0" Mar 09 10:01:25 crc kubenswrapper[4792]: I0309 10:01:25.105511 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9e8c6802-54a3-4aed-9476-3f10a2369eea-config-data\") pod \"manila-scheduler-0\" (UID: \"9e8c6802-54a3-4aed-9476-3f10a2369eea\") " pod="openstack/manila-scheduler-0" Mar 09 10:01:25 crc kubenswrapper[4792]: I0309 10:01:25.105533 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/dced9726-c642-4e64-a963-9ce584e76e9c-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"dced9726-c642-4e64-a963-9ce584e76e9c\") " pod="openstack/manila-share-share1-0" Mar 09 10:01:25 crc kubenswrapper[4792]: I0309 10:01:25.105561 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9e8c6802-54a3-4aed-9476-3f10a2369eea-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"9e8c6802-54a3-4aed-9476-3f10a2369eea\") " pod="openstack/manila-scheduler-0" Mar 09 10:01:25 crc kubenswrapper[4792]: I0309 10:01:25.105578 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9e8c6802-54a3-4aed-9476-3f10a2369eea-scripts\") pod \"manila-scheduler-0\" (UID: \"9e8c6802-54a3-4aed-9476-3f10a2369eea\") " pod="openstack/manila-scheduler-0" Mar 09 10:01:25 crc kubenswrapper[4792]: I0309 10:01:25.105603 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9e8c6802-54a3-4aed-9476-3f10a2369eea-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"9e8c6802-54a3-4aed-9476-3f10a2369eea\") " pod="openstack/manila-scheduler-0" Mar 09 10:01:25 crc kubenswrapper[4792]: I0309 10:01:25.105651 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dced9726-c642-4e64-a963-9ce584e76e9c-scripts\") pod \"manila-share-share1-0\" (UID: \"dced9726-c642-4e64-a963-9ce584e76e9c\") " pod="openstack/manila-share-share1-0" Mar 09 10:01:25 crc kubenswrapper[4792]: I0309 10:01:25.105676 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dced9726-c642-4e64-a963-9ce584e76e9c-combined-ca-bundle\") pod 
\"manila-share-share1-0\" (UID: \"dced9726-c642-4e64-a963-9ce584e76e9c\") " pod="openstack/manila-share-share1-0" Mar 09 10:01:25 crc kubenswrapper[4792]: I0309 10:01:25.105695 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/dced9726-c642-4e64-a963-9ce584e76e9c-ceph\") pod \"manila-share-share1-0\" (UID: \"dced9726-c642-4e64-a963-9ce584e76e9c\") " pod="openstack/manila-share-share1-0" Mar 09 10:01:25 crc kubenswrapper[4792]: I0309 10:01:25.105710 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dced9726-c642-4e64-a963-9ce584e76e9c-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"dced9726-c642-4e64-a963-9ce584e76e9c\") " pod="openstack/manila-share-share1-0" Mar 09 10:01:25 crc kubenswrapper[4792]: I0309 10:01:25.105730 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dced9726-c642-4e64-a963-9ce584e76e9c-config-data\") pod \"manila-share-share1-0\" (UID: \"dced9726-c642-4e64-a963-9ce584e76e9c\") " pod="openstack/manila-share-share1-0" Mar 09 10:01:25 crc kubenswrapper[4792]: I0309 10:01:25.105756 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tt9hq\" (UniqueName: \"kubernetes.io/projected/dced9726-c642-4e64-a963-9ce584e76e9c-kube-api-access-tt9hq\") pod \"manila-share-share1-0\" (UID: \"dced9726-c642-4e64-a963-9ce584e76e9c\") " pod="openstack/manila-share-share1-0" Mar 09 10:01:25 crc kubenswrapper[4792]: I0309 10:01:25.105779 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4nr2h\" (UniqueName: \"kubernetes.io/projected/9e8c6802-54a3-4aed-9476-3f10a2369eea-kube-api-access-4nr2h\") pod \"manila-scheduler-0\" (UID: 
\"9e8c6802-54a3-4aed-9476-3f10a2369eea\") " pod="openstack/manila-scheduler-0" Mar 09 10:01:25 crc kubenswrapper[4792]: I0309 10:01:25.105795 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e8c6802-54a3-4aed-9476-3f10a2369eea-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"9e8c6802-54a3-4aed-9476-3f10a2369eea\") " pod="openstack/manila-scheduler-0" Mar 09 10:01:25 crc kubenswrapper[4792]: I0309 10:01:25.107557 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9e8c6802-54a3-4aed-9476-3f10a2369eea-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"9e8c6802-54a3-4aed-9476-3f10a2369eea\") " pod="openstack/manila-scheduler-0" Mar 09 10:01:25 crc kubenswrapper[4792]: I0309 10:01:25.116050 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"] Mar 09 10:01:25 crc kubenswrapper[4792]: I0309 10:01:25.127494 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9e8c6802-54a3-4aed-9476-3f10a2369eea-scripts\") pod \"manila-scheduler-0\" (UID: \"9e8c6802-54a3-4aed-9476-3f10a2369eea\") " pod="openstack/manila-scheduler-0" Mar 09 10:01:25 crc kubenswrapper[4792]: I0309 10:01:25.131632 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e8c6802-54a3-4aed-9476-3f10a2369eea-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"9e8c6802-54a3-4aed-9476-3f10a2369eea\") " pod="openstack/manila-scheduler-0" Mar 09 10:01:25 crc kubenswrapper[4792]: I0309 10:01:25.131799 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9e8c6802-54a3-4aed-9476-3f10a2369eea-config-data-custom\") pod \"manila-scheduler-0\" (UID: 
\"9e8c6802-54a3-4aed-9476-3f10a2369eea\") " pod="openstack/manila-scheduler-0" Mar 09 10:01:25 crc kubenswrapper[4792]: I0309 10:01:25.132500 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9e8c6802-54a3-4aed-9476-3f10a2369eea-config-data\") pod \"manila-scheduler-0\" (UID: \"9e8c6802-54a3-4aed-9476-3f10a2369eea\") " pod="openstack/manila-scheduler-0" Mar 09 10:01:25 crc kubenswrapper[4792]: I0309 10:01:25.166456 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4nr2h\" (UniqueName: \"kubernetes.io/projected/9e8c6802-54a3-4aed-9476-3f10a2369eea-kube-api-access-4nr2h\") pod \"manila-scheduler-0\" (UID: \"9e8c6802-54a3-4aed-9476-3f10a2369eea\") " pod="openstack/manila-scheduler-0" Mar 09 10:01:25 crc kubenswrapper[4792]: I0309 10:01:25.187102 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-scheduler-0" Mar 09 10:01:25 crc kubenswrapper[4792]: I0309 10:01:25.212308 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dced9726-c642-4e64-a963-9ce584e76e9c-config-data\") pod \"manila-share-share1-0\" (UID: \"dced9726-c642-4e64-a963-9ce584e76e9c\") " pod="openstack/manila-share-share1-0" Mar 09 10:01:25 crc kubenswrapper[4792]: I0309 10:01:25.212365 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f4a082c9-9e44-4d1b-b361-3fe4af72fbe9-config\") pod \"dnsmasq-dns-86c6bdcc4c-q4zw6\" (UID: \"f4a082c9-9e44-4d1b-b361-3fe4af72fbe9\") " pod="openstack/dnsmasq-dns-86c6bdcc4c-q4zw6" Mar 09 10:01:25 crc kubenswrapper[4792]: I0309 10:01:25.212385 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tt9hq\" (UniqueName: 
\"kubernetes.io/projected/dced9726-c642-4e64-a963-9ce584e76e9c-kube-api-access-tt9hq\") pod \"manila-share-share1-0\" (UID: \"dced9726-c642-4e64-a963-9ce584e76e9c\") " pod="openstack/manila-share-share1-0" Mar 09 10:01:25 crc kubenswrapper[4792]: I0309 10:01:25.212414 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/f4a082c9-9e44-4d1b-b361-3fe4af72fbe9-openstack-edpm-ipam\") pod \"dnsmasq-dns-86c6bdcc4c-q4zw6\" (UID: \"f4a082c9-9e44-4d1b-b361-3fe4af72fbe9\") " pod="openstack/dnsmasq-dns-86c6bdcc4c-q4zw6" Mar 09 10:01:25 crc kubenswrapper[4792]: I0309 10:01:25.212474 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f4a082c9-9e44-4d1b-b361-3fe4af72fbe9-dns-svc\") pod \"dnsmasq-dns-86c6bdcc4c-q4zw6\" (UID: \"f4a082c9-9e44-4d1b-b361-3fe4af72fbe9\") " pod="openstack/dnsmasq-dns-86c6bdcc4c-q4zw6" Mar 09 10:01:25 crc kubenswrapper[4792]: I0309 10:01:25.212502 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/dced9726-c642-4e64-a963-9ce584e76e9c-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"dced9726-c642-4e64-a963-9ce584e76e9c\") " pod="openstack/manila-share-share1-0" Mar 09 10:01:25 crc kubenswrapper[4792]: I0309 10:01:25.212535 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f4a082c9-9e44-4d1b-b361-3fe4af72fbe9-ovsdbserver-sb\") pod \"dnsmasq-dns-86c6bdcc4c-q4zw6\" (UID: \"f4a082c9-9e44-4d1b-b361-3fe4af72fbe9\") " pod="openstack/dnsmasq-dns-86c6bdcc4c-q4zw6" Mar 09 10:01:25 crc kubenswrapper[4792]: I0309 10:01:25.212554 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/f4a082c9-9e44-4d1b-b361-3fe4af72fbe9-ovsdbserver-nb\") pod \"dnsmasq-dns-86c6bdcc4c-q4zw6\" (UID: \"f4a082c9-9e44-4d1b-b361-3fe4af72fbe9\") " pod="openstack/dnsmasq-dns-86c6bdcc4c-q4zw6" Mar 09 10:01:25 crc kubenswrapper[4792]: I0309 10:01:25.212582 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/dced9726-c642-4e64-a963-9ce584e76e9c-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"dced9726-c642-4e64-a963-9ce584e76e9c\") " pod="openstack/manila-share-share1-0" Mar 09 10:01:25 crc kubenswrapper[4792]: I0309 10:01:25.212656 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dced9726-c642-4e64-a963-9ce584e76e9c-scripts\") pod \"manila-share-share1-0\" (UID: \"dced9726-c642-4e64-a963-9ce584e76e9c\") " pod="openstack/manila-share-share1-0" Mar 09 10:01:25 crc kubenswrapper[4792]: I0309 10:01:25.212683 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dced9726-c642-4e64-a963-9ce584e76e9c-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"dced9726-c642-4e64-a963-9ce584e76e9c\") " pod="openstack/manila-share-share1-0" Mar 09 10:01:25 crc kubenswrapper[4792]: I0309 10:01:25.212703 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/dced9726-c642-4e64-a963-9ce584e76e9c-ceph\") pod \"manila-share-share1-0\" (UID: \"dced9726-c642-4e64-a963-9ce584e76e9c\") " pod="openstack/manila-share-share1-0" Mar 09 10:01:25 crc kubenswrapper[4792]: I0309 10:01:25.212718 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dced9726-c642-4e64-a963-9ce584e76e9c-config-data-custom\") pod \"manila-share-share1-0\" (UID: 
\"dced9726-c642-4e64-a963-9ce584e76e9c\") " pod="openstack/manila-share-share1-0" Mar 09 10:01:25 crc kubenswrapper[4792]: I0309 10:01:25.212740 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sjxg9\" (UniqueName: \"kubernetes.io/projected/f4a082c9-9e44-4d1b-b361-3fe4af72fbe9-kube-api-access-sjxg9\") pod \"dnsmasq-dns-86c6bdcc4c-q4zw6\" (UID: \"f4a082c9-9e44-4d1b-b361-3fe4af72fbe9\") " pod="openstack/dnsmasq-dns-86c6bdcc4c-q4zw6" Mar 09 10:01:25 crc kubenswrapper[4792]: I0309 10:01:25.213723 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/dced9726-c642-4e64-a963-9ce584e76e9c-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"dced9726-c642-4e64-a963-9ce584e76e9c\") " pod="openstack/manila-share-share1-0" Mar 09 10:01:25 crc kubenswrapper[4792]: I0309 10:01:25.221707 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dced9726-c642-4e64-a963-9ce584e76e9c-scripts\") pod \"manila-share-share1-0\" (UID: \"dced9726-c642-4e64-a963-9ce584e76e9c\") " pod="openstack/manila-share-share1-0" Mar 09 10:01:25 crc kubenswrapper[4792]: I0309 10:01:25.221895 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/dced9726-c642-4e64-a963-9ce584e76e9c-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"dced9726-c642-4e64-a963-9ce584e76e9c\") " pod="openstack/manila-share-share1-0" Mar 09 10:01:25 crc kubenswrapper[4792]: I0309 10:01:25.226131 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/dced9726-c642-4e64-a963-9ce584e76e9c-ceph\") pod \"manila-share-share1-0\" (UID: \"dced9726-c642-4e64-a963-9ce584e76e9c\") " pod="openstack/manila-share-share1-0" Mar 09 10:01:25 crc kubenswrapper[4792]: I0309 10:01:25.230205 
4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dced9726-c642-4e64-a963-9ce584e76e9c-config-data\") pod \"manila-share-share1-0\" (UID: \"dced9726-c642-4e64-a963-9ce584e76e9c\") " pod="openstack/manila-share-share1-0" Mar 09 10:01:25 crc kubenswrapper[4792]: I0309 10:01:25.230754 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dced9726-c642-4e64-a963-9ce584e76e9c-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"dced9726-c642-4e64-a963-9ce584e76e9c\") " pod="openstack/manila-share-share1-0" Mar 09 10:01:25 crc kubenswrapper[4792]: I0309 10:01:25.253775 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dced9726-c642-4e64-a963-9ce584e76e9c-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"dced9726-c642-4e64-a963-9ce584e76e9c\") " pod="openstack/manila-share-share1-0" Mar 09 10:01:25 crc kubenswrapper[4792]: I0309 10:01:25.268892 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tt9hq\" (UniqueName: \"kubernetes.io/projected/dced9726-c642-4e64-a963-9ce584e76e9c-kube-api-access-tt9hq\") pod \"manila-share-share1-0\" (UID: \"dced9726-c642-4e64-a963-9ce584e76e9c\") " pod="openstack/manila-share-share1-0" Mar 09 10:01:25 crc kubenswrapper[4792]: I0309 10:01:25.282320 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-share-share1-0" Mar 09 10:01:25 crc kubenswrapper[4792]: I0309 10:01:25.302913 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-api-0"] Mar 09 10:01:25 crc kubenswrapper[4792]: I0309 10:01:25.382166 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f4a082c9-9e44-4d1b-b361-3fe4af72fbe9-ovsdbserver-sb\") pod \"dnsmasq-dns-86c6bdcc4c-q4zw6\" (UID: \"f4a082c9-9e44-4d1b-b361-3fe4af72fbe9\") " pod="openstack/dnsmasq-dns-86c6bdcc4c-q4zw6" Mar 09 10:01:25 crc kubenswrapper[4792]: I0309 10:01:25.382274 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f4a082c9-9e44-4d1b-b361-3fe4af72fbe9-ovsdbserver-nb\") pod \"dnsmasq-dns-86c6bdcc4c-q4zw6\" (UID: \"f4a082c9-9e44-4d1b-b361-3fe4af72fbe9\") " pod="openstack/dnsmasq-dns-86c6bdcc4c-q4zw6" Mar 09 10:01:25 crc kubenswrapper[4792]: I0309 10:01:25.385952 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sjxg9\" (UniqueName: \"kubernetes.io/projected/f4a082c9-9e44-4d1b-b361-3fe4af72fbe9-kube-api-access-sjxg9\") pod \"dnsmasq-dns-86c6bdcc4c-q4zw6\" (UID: \"f4a082c9-9e44-4d1b-b361-3fe4af72fbe9\") " pod="openstack/dnsmasq-dns-86c6bdcc4c-q4zw6" Mar 09 10:01:25 crc kubenswrapper[4792]: I0309 10:01:25.386236 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f4a082c9-9e44-4d1b-b361-3fe4af72fbe9-config\") pod \"dnsmasq-dns-86c6bdcc4c-q4zw6\" (UID: \"f4a082c9-9e44-4d1b-b361-3fe4af72fbe9\") " pod="openstack/dnsmasq-dns-86c6bdcc4c-q4zw6" Mar 09 10:01:25 crc kubenswrapper[4792]: I0309 10:01:25.386435 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/configmap/f4a082c9-9e44-4d1b-b361-3fe4af72fbe9-openstack-edpm-ipam\") pod \"dnsmasq-dns-86c6bdcc4c-q4zw6\" (UID: \"f4a082c9-9e44-4d1b-b361-3fe4af72fbe9\") " pod="openstack/dnsmasq-dns-86c6bdcc4c-q4zw6" Mar 09 10:01:25 crc kubenswrapper[4792]: I0309 10:01:25.386563 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f4a082c9-9e44-4d1b-b361-3fe4af72fbe9-dns-svc\") pod \"dnsmasq-dns-86c6bdcc4c-q4zw6\" (UID: \"f4a082c9-9e44-4d1b-b361-3fe4af72fbe9\") " pod="openstack/dnsmasq-dns-86c6bdcc4c-q4zw6" Mar 09 10:01:25 crc kubenswrapper[4792]: I0309 10:01:25.392682 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f4a082c9-9e44-4d1b-b361-3fe4af72fbe9-config\") pod \"dnsmasq-dns-86c6bdcc4c-q4zw6\" (UID: \"f4a082c9-9e44-4d1b-b361-3fe4af72fbe9\") " pod="openstack/dnsmasq-dns-86c6bdcc4c-q4zw6" Mar 09 10:01:25 crc kubenswrapper[4792]: I0309 10:01:25.393009 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f4a082c9-9e44-4d1b-b361-3fe4af72fbe9-ovsdbserver-nb\") pod \"dnsmasq-dns-86c6bdcc4c-q4zw6\" (UID: \"f4a082c9-9e44-4d1b-b361-3fe4af72fbe9\") " pod="openstack/dnsmasq-dns-86c6bdcc4c-q4zw6" Mar 09 10:01:25 crc kubenswrapper[4792]: I0309 10:01:25.393453 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/f4a082c9-9e44-4d1b-b361-3fe4af72fbe9-openstack-edpm-ipam\") pod \"dnsmasq-dns-86c6bdcc4c-q4zw6\" (UID: \"f4a082c9-9e44-4d1b-b361-3fe4af72fbe9\") " pod="openstack/dnsmasq-dns-86c6bdcc4c-q4zw6" Mar 09 10:01:25 crc kubenswrapper[4792]: I0309 10:01:25.393536 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f4a082c9-9e44-4d1b-b361-3fe4af72fbe9-dns-svc\") pod 
\"dnsmasq-dns-86c6bdcc4c-q4zw6\" (UID: \"f4a082c9-9e44-4d1b-b361-3fe4af72fbe9\") " pod="openstack/dnsmasq-dns-86c6bdcc4c-q4zw6" Mar 09 10:01:25 crc kubenswrapper[4792]: I0309 10:01:25.402249 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f4a082c9-9e44-4d1b-b361-3fe4af72fbe9-ovsdbserver-sb\") pod \"dnsmasq-dns-86c6bdcc4c-q4zw6\" (UID: \"f4a082c9-9e44-4d1b-b361-3fe4af72fbe9\") " pod="openstack/dnsmasq-dns-86c6bdcc4c-q4zw6" Mar 09 10:01:25 crc kubenswrapper[4792]: I0309 10:01:25.416928 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-api-0" Mar 09 10:01:25 crc kubenswrapper[4792]: I0309 10:01:25.420566 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-api-config-data" Mar 09 10:01:25 crc kubenswrapper[4792]: I0309 10:01:25.434597 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"] Mar 09 10:01:25 crc kubenswrapper[4792]: I0309 10:01:25.501665 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sjxg9\" (UniqueName: \"kubernetes.io/projected/f4a082c9-9e44-4d1b-b361-3fe4af72fbe9-kube-api-access-sjxg9\") pod \"dnsmasq-dns-86c6bdcc4c-q4zw6\" (UID: \"f4a082c9-9e44-4d1b-b361-3fe4af72fbe9\") " pod="openstack/dnsmasq-dns-86c6bdcc4c-q4zw6" Mar 09 10:01:25 crc kubenswrapper[4792]: I0309 10:01:25.610670 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9e110f7-b4c7-4b9e-b337-7cec15290a73-config-data\") pod \"manila-api-0\" (UID: \"c9e110f7-b4c7-4b9e-b337-7cec15290a73\") " pod="openstack/manila-api-0" Mar 09 10:01:25 crc kubenswrapper[4792]: I0309 10:01:25.611090 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fsqjs\" (UniqueName: 
\"kubernetes.io/projected/c9e110f7-b4c7-4b9e-b337-7cec15290a73-kube-api-access-fsqjs\") pod \"manila-api-0\" (UID: \"c9e110f7-b4c7-4b9e-b337-7cec15290a73\") " pod="openstack/manila-api-0" Mar 09 10:01:25 crc kubenswrapper[4792]: I0309 10:01:25.611188 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c9e110f7-b4c7-4b9e-b337-7cec15290a73-config-data-custom\") pod \"manila-api-0\" (UID: \"c9e110f7-b4c7-4b9e-b337-7cec15290a73\") " pod="openstack/manila-api-0" Mar 09 10:01:25 crc kubenswrapper[4792]: I0309 10:01:25.611205 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c9e110f7-b4c7-4b9e-b337-7cec15290a73-logs\") pod \"manila-api-0\" (UID: \"c9e110f7-b4c7-4b9e-b337-7cec15290a73\") " pod="openstack/manila-api-0" Mar 09 10:01:25 crc kubenswrapper[4792]: I0309 10:01:25.611221 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c9e110f7-b4c7-4b9e-b337-7cec15290a73-scripts\") pod \"manila-api-0\" (UID: \"c9e110f7-b4c7-4b9e-b337-7cec15290a73\") " pod="openstack/manila-api-0" Mar 09 10:01:25 crc kubenswrapper[4792]: I0309 10:01:25.611455 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9e110f7-b4c7-4b9e-b337-7cec15290a73-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"c9e110f7-b4c7-4b9e-b337-7cec15290a73\") " pod="openstack/manila-api-0" Mar 09 10:01:25 crc kubenswrapper[4792]: I0309 10:01:25.611605 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c9e110f7-b4c7-4b9e-b337-7cec15290a73-etc-machine-id\") pod \"manila-api-0\" (UID: 
\"c9e110f7-b4c7-4b9e-b337-7cec15290a73\") " pod="openstack/manila-api-0" Mar 09 10:01:25 crc kubenswrapper[4792]: I0309 10:01:25.688616 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86c6bdcc4c-q4zw6" Mar 09 10:01:25 crc kubenswrapper[4792]: I0309 10:01:25.716251 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9e110f7-b4c7-4b9e-b337-7cec15290a73-config-data\") pod \"manila-api-0\" (UID: \"c9e110f7-b4c7-4b9e-b337-7cec15290a73\") " pod="openstack/manila-api-0" Mar 09 10:01:25 crc kubenswrapper[4792]: I0309 10:01:25.716304 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fsqjs\" (UniqueName: \"kubernetes.io/projected/c9e110f7-b4c7-4b9e-b337-7cec15290a73-kube-api-access-fsqjs\") pod \"manila-api-0\" (UID: \"c9e110f7-b4c7-4b9e-b337-7cec15290a73\") " pod="openstack/manila-api-0" Mar 09 10:01:25 crc kubenswrapper[4792]: I0309 10:01:25.716348 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c9e110f7-b4c7-4b9e-b337-7cec15290a73-config-data-custom\") pod \"manila-api-0\" (UID: \"c9e110f7-b4c7-4b9e-b337-7cec15290a73\") " pod="openstack/manila-api-0" Mar 09 10:01:25 crc kubenswrapper[4792]: I0309 10:01:25.716375 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c9e110f7-b4c7-4b9e-b337-7cec15290a73-logs\") pod \"manila-api-0\" (UID: \"c9e110f7-b4c7-4b9e-b337-7cec15290a73\") " pod="openstack/manila-api-0" Mar 09 10:01:25 crc kubenswrapper[4792]: I0309 10:01:25.716391 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c9e110f7-b4c7-4b9e-b337-7cec15290a73-scripts\") pod \"manila-api-0\" (UID: \"c9e110f7-b4c7-4b9e-b337-7cec15290a73\") " 
pod="openstack/manila-api-0" Mar 09 10:01:25 crc kubenswrapper[4792]: I0309 10:01:25.716433 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9e110f7-b4c7-4b9e-b337-7cec15290a73-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"c9e110f7-b4c7-4b9e-b337-7cec15290a73\") " pod="openstack/manila-api-0" Mar 09 10:01:25 crc kubenswrapper[4792]: I0309 10:01:25.716468 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c9e110f7-b4c7-4b9e-b337-7cec15290a73-etc-machine-id\") pod \"manila-api-0\" (UID: \"c9e110f7-b4c7-4b9e-b337-7cec15290a73\") " pod="openstack/manila-api-0" Mar 09 10:01:25 crc kubenswrapper[4792]: I0309 10:01:25.716603 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c9e110f7-b4c7-4b9e-b337-7cec15290a73-etc-machine-id\") pod \"manila-api-0\" (UID: \"c9e110f7-b4c7-4b9e-b337-7cec15290a73\") " pod="openstack/manila-api-0" Mar 09 10:01:25 crc kubenswrapper[4792]: I0309 10:01:25.719051 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c9e110f7-b4c7-4b9e-b337-7cec15290a73-logs\") pod \"manila-api-0\" (UID: \"c9e110f7-b4c7-4b9e-b337-7cec15290a73\") " pod="openstack/manila-api-0" Mar 09 10:01:25 crc kubenswrapper[4792]: I0309 10:01:25.724394 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c9e110f7-b4c7-4b9e-b337-7cec15290a73-config-data-custom\") pod \"manila-api-0\" (UID: \"c9e110f7-b4c7-4b9e-b337-7cec15290a73\") " pod="openstack/manila-api-0" Mar 09 10:01:25 crc kubenswrapper[4792]: I0309 10:01:25.755130 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/c9e110f7-b4c7-4b9e-b337-7cec15290a73-config-data\") pod \"manila-api-0\" (UID: \"c9e110f7-b4c7-4b9e-b337-7cec15290a73\") " pod="openstack/manila-api-0" Mar 09 10:01:25 crc kubenswrapper[4792]: I0309 10:01:25.756974 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c9e110f7-b4c7-4b9e-b337-7cec15290a73-scripts\") pod \"manila-api-0\" (UID: \"c9e110f7-b4c7-4b9e-b337-7cec15290a73\") " pod="openstack/manila-api-0" Mar 09 10:01:25 crc kubenswrapper[4792]: I0309 10:01:25.768797 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9e110f7-b4c7-4b9e-b337-7cec15290a73-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"c9e110f7-b4c7-4b9e-b337-7cec15290a73\") " pod="openstack/manila-api-0" Mar 09 10:01:25 crc kubenswrapper[4792]: I0309 10:01:25.799896 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fsqjs\" (UniqueName: \"kubernetes.io/projected/c9e110f7-b4c7-4b9e-b337-7cec15290a73-kube-api-access-fsqjs\") pod \"manila-api-0\" (UID: \"c9e110f7-b4c7-4b9e-b337-7cec15290a73\") " pod="openstack/manila-api-0" Mar 09 10:01:26 crc kubenswrapper[4792]: I0309 10:01:26.056234 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-api-0" Mar 09 10:01:26 crc kubenswrapper[4792]: I0309 10:01:26.192509 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"] Mar 09 10:01:26 crc kubenswrapper[4792]: I0309 10:01:26.455055 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"] Mar 09 10:01:26 crc kubenswrapper[4792]: I0309 10:01:26.543709 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86c6bdcc4c-q4zw6"] Mar 09 10:01:26 crc kubenswrapper[4792]: I0309 10:01:26.561631 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"9e8c6802-54a3-4aed-9476-3f10a2369eea","Type":"ContainerStarted","Data":"a4fe9c3d68e2d3b3348c06a6e5ef2b65662d9fd50cbd5ea5fd7ecdfedce66335"} Mar 09 10:01:26 crc kubenswrapper[4792]: I0309 10:01:26.576921 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"dced9726-c642-4e64-a963-9ce584e76e9c","Type":"ContainerStarted","Data":"86e7aff5ad6ce34c2c1683fd7c4aa3574adfda34404d447cc83d6afb3066f023"} Mar 09 10:01:26 crc kubenswrapper[4792]: I0309 10:01:26.667255 4792 scope.go:117] "RemoveContainer" containerID="d764681645ab8670f7435c3d7eeda989bbb6c0f3c40948420b0a6a2fc3dd7e93" Mar 09 10:01:26 crc kubenswrapper[4792]: I0309 10:01:26.935212 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"] Mar 09 10:01:27 crc kubenswrapper[4792]: I0309 10:01:27.648064 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"c9e110f7-b4c7-4b9e-b337-7cec15290a73","Type":"ContainerStarted","Data":"eb2b1449309eff1353a8293244acdecb108d5f61800c7bd90f94bb2cac354b57"} Mar 09 10:01:27 crc kubenswrapper[4792]: I0309 10:01:27.679964 4792 generic.go:334] "Generic (PLEG): container finished" podID="f4a082c9-9e44-4d1b-b361-3fe4af72fbe9" 
containerID="56f475f836a9d5894a8b238e91a0d8e6d8d27bedd6b9001630beb9960aef9a77" exitCode=0 Mar 09 10:01:27 crc kubenswrapper[4792]: I0309 10:01:27.693428 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86c6bdcc4c-q4zw6" event={"ID":"f4a082c9-9e44-4d1b-b361-3fe4af72fbe9","Type":"ContainerDied","Data":"56f475f836a9d5894a8b238e91a0d8e6d8d27bedd6b9001630beb9960aef9a77"} Mar 09 10:01:27 crc kubenswrapper[4792]: I0309 10:01:27.693478 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86c6bdcc4c-q4zw6" event={"ID":"f4a082c9-9e44-4d1b-b361-3fe4af72fbe9","Type":"ContainerStarted","Data":"fde0d6eec578b68199c563e43813ad9147741d5f63555f15688ac00566088ed1"} Mar 09 10:01:27 crc kubenswrapper[4792]: I0309 10:01:27.721345 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-97tth" event={"ID":"bd11045a-d746-4b42-872c-8b8d1dd2d515","Type":"ContainerStarted","Data":"8b8ee3a4ed67871065ec447eb157a9db2536cee10ff7911e7415e58bb9ff5c63"} Mar 09 10:01:28 crc kubenswrapper[4792]: I0309 10:01:28.841098 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-api-0"] Mar 09 10:01:28 crc kubenswrapper[4792]: I0309 10:01:28.876447 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86c6bdcc4c-q4zw6" event={"ID":"f4a082c9-9e44-4d1b-b361-3fe4af72fbe9","Type":"ContainerStarted","Data":"56ce22f1106ae8c7f9b5097dfb73acc35875ccdc8fd21ebff731ebedd5d57904"} Mar 09 10:01:28 crc kubenswrapper[4792]: I0309 10:01:28.880699 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-86c6bdcc4c-q4zw6" Mar 09 10:01:28 crc kubenswrapper[4792]: I0309 10:01:28.915240 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-86c6bdcc4c-q4zw6" podStartSLOduration=3.915221376 podStartE2EDuration="3.915221376s" podCreationTimestamp="2026-03-09 10:01:25 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 10:01:28.910541976 +0000 UTC m=+3253.940742728" watchObservedRunningTime="2026-03-09 10:01:28.915221376 +0000 UTC m=+3253.945422128" Mar 09 10:01:28 crc kubenswrapper[4792]: I0309 10:01:28.918731 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"9e8c6802-54a3-4aed-9476-3f10a2369eea","Type":"ContainerStarted","Data":"c27ff3c013f9905435e54bd8dcc244d2c50448448f6724eedcab33feba520041"} Mar 09 10:01:28 crc kubenswrapper[4792]: I0309 10:01:28.922936 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"c9e110f7-b4c7-4b9e-b337-7cec15290a73","Type":"ContainerStarted","Data":"d70be05e2e8c6ac11285fb80acddb927606ec0e0ab5a98556c24001665691192"} Mar 09 10:01:29 crc kubenswrapper[4792]: I0309 10:01:29.997414 4792 generic.go:334] "Generic (PLEG): container finished" podID="8646be85-08e4-4f0d-9484-01456e453c36" containerID="5f22cae7f12dd03e4d2a8f6ec1d96f1a332812420158828599913fb3ae705421" exitCode=137 Mar 09 10:01:29 crc kubenswrapper[4792]: I0309 10:01:29.998023 4792 generic.go:334] "Generic (PLEG): container finished" podID="8646be85-08e4-4f0d-9484-01456e453c36" containerID="0ca9ef2b76bd5b0d803f8bd188ae0bfbbdaf2b8e244dcdad076b8c75202ff0fe" exitCode=137 Mar 09 10:01:29 crc kubenswrapper[4792]: I0309 10:01:29.997496 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6f9cb7fbd9-gcq9k" event={"ID":"8646be85-08e4-4f0d-9484-01456e453c36","Type":"ContainerDied","Data":"5f22cae7f12dd03e4d2a8f6ec1d96f1a332812420158828599913fb3ae705421"} Mar 09 10:01:29 crc kubenswrapper[4792]: I0309 10:01:29.998173 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6f9cb7fbd9-gcq9k" 
event={"ID":"8646be85-08e4-4f0d-9484-01456e453c36","Type":"ContainerDied","Data":"0ca9ef2b76bd5b0d803f8bd188ae0bfbbdaf2b8e244dcdad076b8c75202ff0fe"} Mar 09 10:01:30 crc kubenswrapper[4792]: I0309 10:01:30.032864 4792 generic.go:334] "Generic (PLEG): container finished" podID="2c175458-c03c-4901-a2ec-6250b13a461b" containerID="bbe4f241005a7f3226f3d361b3865d351955100e1cd3bcb114704688ca662a53" exitCode=137 Mar 09 10:01:30 crc kubenswrapper[4792]: I0309 10:01:30.032921 4792 generic.go:334] "Generic (PLEG): container finished" podID="2c175458-c03c-4901-a2ec-6250b13a461b" containerID="ad2b61e9dbb0c585ca504a0f7aafd0e8ecf4e3a75a071d7571525e2f98d59727" exitCode=137 Mar 09 10:01:30 crc kubenswrapper[4792]: I0309 10:01:30.033019 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-796cbc75d5-fwljj" event={"ID":"2c175458-c03c-4901-a2ec-6250b13a461b","Type":"ContainerDied","Data":"bbe4f241005a7f3226f3d361b3865d351955100e1cd3bcb114704688ca662a53"} Mar 09 10:01:30 crc kubenswrapper[4792]: I0309 10:01:30.033120 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-796cbc75d5-fwljj" event={"ID":"2c175458-c03c-4901-a2ec-6250b13a461b","Type":"ContainerDied","Data":"ad2b61e9dbb0c585ca504a0f7aafd0e8ecf4e3a75a071d7571525e2f98d59727"} Mar 09 10:01:30 crc kubenswrapper[4792]: I0309 10:01:30.073783 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-api-0" podUID="c9e110f7-b4c7-4b9e-b337-7cec15290a73" containerName="manila-api-log" containerID="cri-o://d70be05e2e8c6ac11285fb80acddb927606ec0e0ab5a98556c24001665691192" gracePeriod=30 Mar 09 10:01:30 crc kubenswrapper[4792]: I0309 10:01:30.074463 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"c9e110f7-b4c7-4b9e-b337-7cec15290a73","Type":"ContainerStarted","Data":"295e693638b6f2b58d0a8524a2fe89e8d4ab00743f81738a03b41d6e0799b7c1"} Mar 09 10:01:30 crc kubenswrapper[4792]: I0309 10:01:30.074626 4792 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/manila-api-0" Mar 09 10:01:30 crc kubenswrapper[4792]: I0309 10:01:30.074989 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-api-0" podUID="c9e110f7-b4c7-4b9e-b337-7cec15290a73" containerName="manila-api" containerID="cri-o://295e693638b6f2b58d0a8524a2fe89e8d4ab00743f81738a03b41d6e0799b7c1" gracePeriod=30 Mar 09 10:01:30 crc kubenswrapper[4792]: I0309 10:01:30.098266 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"9e8c6802-54a3-4aed-9476-3f10a2369eea","Type":"ContainerStarted","Data":"31082dbf96fbfee14b67f6cc5e39ad4fc21613b7aa64a75f6a02c8ba914bfcff"} Mar 09 10:01:30 crc kubenswrapper[4792]: I0309 10:01:30.113740 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-api-0" podStartSLOduration=5.113716538 podStartE2EDuration="5.113716538s" podCreationTimestamp="2026-03-09 10:01:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 10:01:30.104282757 +0000 UTC m=+3255.134483529" watchObservedRunningTime="2026-03-09 10:01:30.113716538 +0000 UTC m=+3255.143917300" Mar 09 10:01:30 crc kubenswrapper[4792]: I0309 10:01:30.175704 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-scheduler-0" podStartSLOduration=5.358782047 podStartE2EDuration="6.175685823s" podCreationTimestamp="2026-03-09 10:01:24 +0000 UTC" firstStartedPulling="2026-03-09 10:01:26.221374403 +0000 UTC m=+3251.251575155" lastFinishedPulling="2026-03-09 10:01:27.038278179 +0000 UTC m=+3252.068478931" observedRunningTime="2026-03-09 10:01:30.146935323 +0000 UTC m=+3255.177136095" watchObservedRunningTime="2026-03-09 10:01:30.175685823 +0000 UTC m=+3255.205886585" Mar 09 10:01:30 crc kubenswrapper[4792]: I0309 10:01:30.349512 4792 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-796cbc75d5-fwljj" Mar 09 10:01:30 crc kubenswrapper[4792]: I0309 10:01:30.381782 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2c175458-c03c-4901-a2ec-6250b13a461b-config-data\") pod \"2c175458-c03c-4901-a2ec-6250b13a461b\" (UID: \"2c175458-c03c-4901-a2ec-6250b13a461b\") " Mar 09 10:01:30 crc kubenswrapper[4792]: I0309 10:01:30.381915 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2c175458-c03c-4901-a2ec-6250b13a461b-logs\") pod \"2c175458-c03c-4901-a2ec-6250b13a461b\" (UID: \"2c175458-c03c-4901-a2ec-6250b13a461b\") " Mar 09 10:01:30 crc kubenswrapper[4792]: I0309 10:01:30.382010 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2c175458-c03c-4901-a2ec-6250b13a461b-scripts\") pod \"2c175458-c03c-4901-a2ec-6250b13a461b\" (UID: \"2c175458-c03c-4901-a2ec-6250b13a461b\") " Mar 09 10:01:30 crc kubenswrapper[4792]: I0309 10:01:30.382145 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/2c175458-c03c-4901-a2ec-6250b13a461b-horizon-secret-key\") pod \"2c175458-c03c-4901-a2ec-6250b13a461b\" (UID: \"2c175458-c03c-4901-a2ec-6250b13a461b\") " Mar 09 10:01:30 crc kubenswrapper[4792]: I0309 10:01:30.382170 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4w2xn\" (UniqueName: \"kubernetes.io/projected/2c175458-c03c-4901-a2ec-6250b13a461b-kube-api-access-4w2xn\") pod \"2c175458-c03c-4901-a2ec-6250b13a461b\" (UID: \"2c175458-c03c-4901-a2ec-6250b13a461b\") " Mar 09 10:01:30 crc kubenswrapper[4792]: I0309 10:01:30.382719 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/2c175458-c03c-4901-a2ec-6250b13a461b-logs" (OuterVolumeSpecName: "logs") pod "2c175458-c03c-4901-a2ec-6250b13a461b" (UID: "2c175458-c03c-4901-a2ec-6250b13a461b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 10:01:30 crc kubenswrapper[4792]: I0309 10:01:30.404284 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c175458-c03c-4901-a2ec-6250b13a461b-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "2c175458-c03c-4901-a2ec-6250b13a461b" (UID: "2c175458-c03c-4901-a2ec-6250b13a461b"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 10:01:30 crc kubenswrapper[4792]: I0309 10:01:30.407296 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c175458-c03c-4901-a2ec-6250b13a461b-kube-api-access-4w2xn" (OuterVolumeSpecName: "kube-api-access-4w2xn") pod "2c175458-c03c-4901-a2ec-6250b13a461b" (UID: "2c175458-c03c-4901-a2ec-6250b13a461b"). InnerVolumeSpecName "kube-api-access-4w2xn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 10:01:30 crc kubenswrapper[4792]: I0309 10:01:30.434903 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2c175458-c03c-4901-a2ec-6250b13a461b-config-data" (OuterVolumeSpecName: "config-data") pod "2c175458-c03c-4901-a2ec-6250b13a461b" (UID: "2c175458-c03c-4901-a2ec-6250b13a461b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 10:01:30 crc kubenswrapper[4792]: I0309 10:01:30.490509 4792 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2c175458-c03c-4901-a2ec-6250b13a461b-logs\") on node \"crc\" DevicePath \"\"" Mar 09 10:01:30 crc kubenswrapper[4792]: I0309 10:01:30.490550 4792 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/2c175458-c03c-4901-a2ec-6250b13a461b-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Mar 09 10:01:30 crc kubenswrapper[4792]: I0309 10:01:30.490584 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4w2xn\" (UniqueName: \"kubernetes.io/projected/2c175458-c03c-4901-a2ec-6250b13a461b-kube-api-access-4w2xn\") on node \"crc\" DevicePath \"\"" Mar 09 10:01:30 crc kubenswrapper[4792]: I0309 10:01:30.490593 4792 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2c175458-c03c-4901-a2ec-6250b13a461b-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 10:01:30 crc kubenswrapper[4792]: I0309 10:01:30.495199 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2c175458-c03c-4901-a2ec-6250b13a461b-scripts" (OuterVolumeSpecName: "scripts") pod "2c175458-c03c-4901-a2ec-6250b13a461b" (UID: "2c175458-c03c-4901-a2ec-6250b13a461b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 10:01:30 crc kubenswrapper[4792]: I0309 10:01:30.600169 4792 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2c175458-c03c-4901-a2ec-6250b13a461b-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 10:01:30 crc kubenswrapper[4792]: I0309 10:01:30.739389 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-6f9cb7fbd9-gcq9k" Mar 09 10:01:30 crc kubenswrapper[4792]: I0309 10:01:30.805657 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8646be85-08e4-4f0d-9484-01456e453c36-scripts\") pod \"8646be85-08e4-4f0d-9484-01456e453c36\" (UID: \"8646be85-08e4-4f0d-9484-01456e453c36\") " Mar 09 10:01:30 crc kubenswrapper[4792]: I0309 10:01:30.805800 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8646be85-08e4-4f0d-9484-01456e453c36-config-data\") pod \"8646be85-08e4-4f0d-9484-01456e453c36\" (UID: \"8646be85-08e4-4f0d-9484-01456e453c36\") " Mar 09 10:01:30 crc kubenswrapper[4792]: I0309 10:01:30.805914 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9zcxv\" (UniqueName: \"kubernetes.io/projected/8646be85-08e4-4f0d-9484-01456e453c36-kube-api-access-9zcxv\") pod \"8646be85-08e4-4f0d-9484-01456e453c36\" (UID: \"8646be85-08e4-4f0d-9484-01456e453c36\") " Mar 09 10:01:30 crc kubenswrapper[4792]: I0309 10:01:30.805955 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8646be85-08e4-4f0d-9484-01456e453c36-logs\") pod \"8646be85-08e4-4f0d-9484-01456e453c36\" (UID: \"8646be85-08e4-4f0d-9484-01456e453c36\") " Mar 09 10:01:30 crc kubenswrapper[4792]: I0309 10:01:30.806026 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/8646be85-08e4-4f0d-9484-01456e453c36-horizon-secret-key\") pod \"8646be85-08e4-4f0d-9484-01456e453c36\" (UID: \"8646be85-08e4-4f0d-9484-01456e453c36\") " Mar 09 10:01:30 crc kubenswrapper[4792]: I0309 10:01:30.810997 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/8646be85-08e4-4f0d-9484-01456e453c36-logs" (OuterVolumeSpecName: "logs") pod "8646be85-08e4-4f0d-9484-01456e453c36" (UID: "8646be85-08e4-4f0d-9484-01456e453c36"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 10:01:30 crc kubenswrapper[4792]: I0309 10:01:30.811639 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8646be85-08e4-4f0d-9484-01456e453c36-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "8646be85-08e4-4f0d-9484-01456e453c36" (UID: "8646be85-08e4-4f0d-9484-01456e453c36"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 10:01:30 crc kubenswrapper[4792]: I0309 10:01:30.813807 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8646be85-08e4-4f0d-9484-01456e453c36-kube-api-access-9zcxv" (OuterVolumeSpecName: "kube-api-access-9zcxv") pod "8646be85-08e4-4f0d-9484-01456e453c36" (UID: "8646be85-08e4-4f0d-9484-01456e453c36"). InnerVolumeSpecName "kube-api-access-9zcxv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 10:01:30 crc kubenswrapper[4792]: I0309 10:01:30.887712 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8646be85-08e4-4f0d-9484-01456e453c36-scripts" (OuterVolumeSpecName: "scripts") pod "8646be85-08e4-4f0d-9484-01456e453c36" (UID: "8646be85-08e4-4f0d-9484-01456e453c36"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 10:01:30 crc kubenswrapper[4792]: I0309 10:01:30.888635 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8646be85-08e4-4f0d-9484-01456e453c36-config-data" (OuterVolumeSpecName: "config-data") pod "8646be85-08e4-4f0d-9484-01456e453c36" (UID: "8646be85-08e4-4f0d-9484-01456e453c36"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 10:01:30 crc kubenswrapper[4792]: I0309 10:01:30.909401 4792 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8646be85-08e4-4f0d-9484-01456e453c36-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 10:01:30 crc kubenswrapper[4792]: I0309 10:01:30.909434 4792 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8646be85-08e4-4f0d-9484-01456e453c36-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 10:01:30 crc kubenswrapper[4792]: I0309 10:01:30.909447 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9zcxv\" (UniqueName: \"kubernetes.io/projected/8646be85-08e4-4f0d-9484-01456e453c36-kube-api-access-9zcxv\") on node \"crc\" DevicePath \"\"" Mar 09 10:01:30 crc kubenswrapper[4792]: I0309 10:01:30.909459 4792 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8646be85-08e4-4f0d-9484-01456e453c36-logs\") on node \"crc\" DevicePath \"\"" Mar 09 10:01:30 crc kubenswrapper[4792]: I0309 10:01:30.909468 4792 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/8646be85-08e4-4f0d-9484-01456e453c36-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Mar 09 10:01:30 crc kubenswrapper[4792]: I0309 10:01:30.986847 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-api-0" Mar 09 10:01:31 crc kubenswrapper[4792]: I0309 10:01:31.113247 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-6f9cb7fbd9-gcq9k" Mar 09 10:01:31 crc kubenswrapper[4792]: I0309 10:01:31.113250 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6f9cb7fbd9-gcq9k" event={"ID":"8646be85-08e4-4f0d-9484-01456e453c36","Type":"ContainerDied","Data":"6479a41ee22599d34c84e45233ec2770b8e4a73be9b7a0d768470c49371cc442"} Mar 09 10:01:31 crc kubenswrapper[4792]: I0309 10:01:31.113301 4792 scope.go:117] "RemoveContainer" containerID="5f22cae7f12dd03e4d2a8f6ec1d96f1a332812420158828599913fb3ae705421" Mar 09 10:01:31 crc kubenswrapper[4792]: I0309 10:01:31.116047 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c9e110f7-b4c7-4b9e-b337-7cec15290a73-scripts\") pod \"c9e110f7-b4c7-4b9e-b337-7cec15290a73\" (UID: \"c9e110f7-b4c7-4b9e-b337-7cec15290a73\") " Mar 09 10:01:31 crc kubenswrapper[4792]: I0309 10:01:31.116117 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c9e110f7-b4c7-4b9e-b337-7cec15290a73-etc-machine-id\") pod \"c9e110f7-b4c7-4b9e-b337-7cec15290a73\" (UID: \"c9e110f7-b4c7-4b9e-b337-7cec15290a73\") " Mar 09 10:01:31 crc kubenswrapper[4792]: I0309 10:01:31.116190 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9e110f7-b4c7-4b9e-b337-7cec15290a73-config-data\") pod \"c9e110f7-b4c7-4b9e-b337-7cec15290a73\" (UID: \"c9e110f7-b4c7-4b9e-b337-7cec15290a73\") " Mar 09 10:01:31 crc kubenswrapper[4792]: I0309 10:01:31.116207 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fsqjs\" (UniqueName: \"kubernetes.io/projected/c9e110f7-b4c7-4b9e-b337-7cec15290a73-kube-api-access-fsqjs\") pod \"c9e110f7-b4c7-4b9e-b337-7cec15290a73\" (UID: \"c9e110f7-b4c7-4b9e-b337-7cec15290a73\") " Mar 09 10:01:31 crc kubenswrapper[4792]: 
I0309 10:01:31.116221 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c9e110f7-b4c7-4b9e-b337-7cec15290a73-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "c9e110f7-b4c7-4b9e-b337-7cec15290a73" (UID: "c9e110f7-b4c7-4b9e-b337-7cec15290a73"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 10:01:31 crc kubenswrapper[4792]: I0309 10:01:31.116348 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c9e110f7-b4c7-4b9e-b337-7cec15290a73-logs\") pod \"c9e110f7-b4c7-4b9e-b337-7cec15290a73\" (UID: \"c9e110f7-b4c7-4b9e-b337-7cec15290a73\") " Mar 09 10:01:31 crc kubenswrapper[4792]: I0309 10:01:31.116404 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c9e110f7-b4c7-4b9e-b337-7cec15290a73-config-data-custom\") pod \"c9e110f7-b4c7-4b9e-b337-7cec15290a73\" (UID: \"c9e110f7-b4c7-4b9e-b337-7cec15290a73\") " Mar 09 10:01:31 crc kubenswrapper[4792]: I0309 10:01:31.116436 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9e110f7-b4c7-4b9e-b337-7cec15290a73-combined-ca-bundle\") pod \"c9e110f7-b4c7-4b9e-b337-7cec15290a73\" (UID: \"c9e110f7-b4c7-4b9e-b337-7cec15290a73\") " Mar 09 10:01:31 crc kubenswrapper[4792]: I0309 10:01:31.116832 4792 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c9e110f7-b4c7-4b9e-b337-7cec15290a73-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 09 10:01:31 crc kubenswrapper[4792]: I0309 10:01:31.117289 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c9e110f7-b4c7-4b9e-b337-7cec15290a73-logs" (OuterVolumeSpecName: "logs") pod 
"c9e110f7-b4c7-4b9e-b337-7cec15290a73" (UID: "c9e110f7-b4c7-4b9e-b337-7cec15290a73"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 10:01:31 crc kubenswrapper[4792]: I0309 10:01:31.125522 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9e110f7-b4c7-4b9e-b337-7cec15290a73-scripts" (OuterVolumeSpecName: "scripts") pod "c9e110f7-b4c7-4b9e-b337-7cec15290a73" (UID: "c9e110f7-b4c7-4b9e-b337-7cec15290a73"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 10:01:31 crc kubenswrapper[4792]: I0309 10:01:31.129902 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-796cbc75d5-fwljj" event={"ID":"2c175458-c03c-4901-a2ec-6250b13a461b","Type":"ContainerDied","Data":"3ef1a7af4eccbbad21e71fd0d5613264137a65de7258212f846d7dea9f5a727c"} Mar 09 10:01:31 crc kubenswrapper[4792]: I0309 10:01:31.130016 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-796cbc75d5-fwljj" Mar 09 10:01:31 crc kubenswrapper[4792]: I0309 10:01:31.130572 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9e110f7-b4c7-4b9e-b337-7cec15290a73-kube-api-access-fsqjs" (OuterVolumeSpecName: "kube-api-access-fsqjs") pod "c9e110f7-b4c7-4b9e-b337-7cec15290a73" (UID: "c9e110f7-b4c7-4b9e-b337-7cec15290a73"). InnerVolumeSpecName "kube-api-access-fsqjs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 10:01:31 crc kubenswrapper[4792]: I0309 10:01:31.137230 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9e110f7-b4c7-4b9e-b337-7cec15290a73-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "c9e110f7-b4c7-4b9e-b337-7cec15290a73" (UID: "c9e110f7-b4c7-4b9e-b337-7cec15290a73"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 10:01:31 crc kubenswrapper[4792]: I0309 10:01:31.141164 4792 generic.go:334] "Generic (PLEG): container finished" podID="c9e110f7-b4c7-4b9e-b337-7cec15290a73" containerID="295e693638b6f2b58d0a8524a2fe89e8d4ab00743f81738a03b41d6e0799b7c1" exitCode=143 Mar 09 10:01:31 crc kubenswrapper[4792]: I0309 10:01:31.141360 4792 generic.go:334] "Generic (PLEG): container finished" podID="c9e110f7-b4c7-4b9e-b337-7cec15290a73" containerID="d70be05e2e8c6ac11285fb80acddb927606ec0e0ab5a98556c24001665691192" exitCode=143 Mar 09 10:01:31 crc kubenswrapper[4792]: I0309 10:01:31.142566 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-api-0" Mar 09 10:01:31 crc kubenswrapper[4792]: I0309 10:01:31.142768 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"c9e110f7-b4c7-4b9e-b337-7cec15290a73","Type":"ContainerDied","Data":"295e693638b6f2b58d0a8524a2fe89e8d4ab00743f81738a03b41d6e0799b7c1"} Mar 09 10:01:31 crc kubenswrapper[4792]: I0309 10:01:31.142915 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"c9e110f7-b4c7-4b9e-b337-7cec15290a73","Type":"ContainerDied","Data":"d70be05e2e8c6ac11285fb80acddb927606ec0e0ab5a98556c24001665691192"} Mar 09 10:01:31 crc kubenswrapper[4792]: I0309 10:01:31.143154 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"c9e110f7-b4c7-4b9e-b337-7cec15290a73","Type":"ContainerDied","Data":"eb2b1449309eff1353a8293244acdecb108d5f61800c7bd90f94bb2cac354b57"} Mar 09 10:01:31 crc kubenswrapper[4792]: I0309 10:01:31.219001 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6f9cb7fbd9-gcq9k"] Mar 09 10:01:31 crc kubenswrapper[4792]: I0309 10:01:31.220193 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fsqjs\" (UniqueName: 
\"kubernetes.io/projected/c9e110f7-b4c7-4b9e-b337-7cec15290a73-kube-api-access-fsqjs\") on node \"crc\" DevicePath \"\"" Mar 09 10:01:31 crc kubenswrapper[4792]: I0309 10:01:31.220221 4792 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c9e110f7-b4c7-4b9e-b337-7cec15290a73-logs\") on node \"crc\" DevicePath \"\"" Mar 09 10:01:31 crc kubenswrapper[4792]: I0309 10:01:31.220234 4792 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c9e110f7-b4c7-4b9e-b337-7cec15290a73-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 09 10:01:31 crc kubenswrapper[4792]: I0309 10:01:31.220245 4792 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c9e110f7-b4c7-4b9e-b337-7cec15290a73-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 10:01:31 crc kubenswrapper[4792]: I0309 10:01:31.233682 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9e110f7-b4c7-4b9e-b337-7cec15290a73-config-data" (OuterVolumeSpecName: "config-data") pod "c9e110f7-b4c7-4b9e-b337-7cec15290a73" (UID: "c9e110f7-b4c7-4b9e-b337-7cec15290a73"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 10:01:31 crc kubenswrapper[4792]: I0309 10:01:31.246214 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-6f9cb7fbd9-gcq9k"] Mar 09 10:01:31 crc kubenswrapper[4792]: I0309 10:01:31.274239 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9e110f7-b4c7-4b9e-b337-7cec15290a73-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c9e110f7-b4c7-4b9e-b337-7cec15290a73" (UID: "c9e110f7-b4c7-4b9e-b337-7cec15290a73"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 10:01:31 crc kubenswrapper[4792]: I0309 10:01:31.277332 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-796cbc75d5-fwljj"] Mar 09 10:01:31 crc kubenswrapper[4792]: I0309 10:01:31.310560 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-796cbc75d5-fwljj"] Mar 09 10:01:31 crc kubenswrapper[4792]: I0309 10:01:31.325591 4792 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9e110f7-b4c7-4b9e-b337-7cec15290a73-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 10:01:31 crc kubenswrapper[4792]: I0309 10:01:31.325625 4792 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9e110f7-b4c7-4b9e-b337-7cec15290a73-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 10:01:31 crc kubenswrapper[4792]: I0309 10:01:31.455286 4792 scope.go:117] "RemoveContainer" containerID="0ca9ef2b76bd5b0d803f8bd188ae0bfbbdaf2b8e244dcdad076b8c75202ff0fe" Mar 09 10:01:31 crc kubenswrapper[4792]: I0309 10:01:31.546026 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-api-0"] Mar 09 10:01:31 crc kubenswrapper[4792]: I0309 10:01:31.558999 4792 scope.go:117] "RemoveContainer" containerID="bbe4f241005a7f3226f3d361b3865d351955100e1cd3bcb114704688ca662a53" Mar 09 10:01:31 crc kubenswrapper[4792]: I0309 10:01:31.591617 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-api-0"] Mar 09 10:01:31 crc kubenswrapper[4792]: I0309 10:01:31.650362 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-api-0"] Mar 09 10:01:31 crc kubenswrapper[4792]: E0309 10:01:31.652774 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8646be85-08e4-4f0d-9484-01456e453c36" containerName="horizon" Mar 09 10:01:31 crc kubenswrapper[4792]: I0309 10:01:31.652800 4792 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="8646be85-08e4-4f0d-9484-01456e453c36" containerName="horizon" Mar 09 10:01:31 crc kubenswrapper[4792]: E0309 10:01:31.662719 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c175458-c03c-4901-a2ec-6250b13a461b" containerName="horizon" Mar 09 10:01:31 crc kubenswrapper[4792]: I0309 10:01:31.662872 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c175458-c03c-4901-a2ec-6250b13a461b" containerName="horizon" Mar 09 10:01:31 crc kubenswrapper[4792]: E0309 10:01:31.662943 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8646be85-08e4-4f0d-9484-01456e453c36" containerName="horizon-log" Mar 09 10:01:31 crc kubenswrapper[4792]: I0309 10:01:31.662953 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="8646be85-08e4-4f0d-9484-01456e453c36" containerName="horizon-log" Mar 09 10:01:31 crc kubenswrapper[4792]: E0309 10:01:31.662989 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9e110f7-b4c7-4b9e-b337-7cec15290a73" containerName="manila-api-log" Mar 09 10:01:31 crc kubenswrapper[4792]: I0309 10:01:31.662998 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9e110f7-b4c7-4b9e-b337-7cec15290a73" containerName="manila-api-log" Mar 09 10:01:31 crc kubenswrapper[4792]: E0309 10:01:31.663022 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9e110f7-b4c7-4b9e-b337-7cec15290a73" containerName="manila-api" Mar 09 10:01:31 crc kubenswrapper[4792]: I0309 10:01:31.663038 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9e110f7-b4c7-4b9e-b337-7cec15290a73" containerName="manila-api" Mar 09 10:01:31 crc kubenswrapper[4792]: E0309 10:01:31.663096 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c175458-c03c-4901-a2ec-6250b13a461b" containerName="horizon-log" Mar 09 10:01:31 crc kubenswrapper[4792]: I0309 10:01:31.663106 4792 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="2c175458-c03c-4901-a2ec-6250b13a461b" containerName="horizon-log" Mar 09 10:01:31 crc kubenswrapper[4792]: I0309 10:01:31.663880 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="8646be85-08e4-4f0d-9484-01456e453c36" containerName="horizon-log" Mar 09 10:01:31 crc kubenswrapper[4792]: I0309 10:01:31.663914 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9e110f7-b4c7-4b9e-b337-7cec15290a73" containerName="manila-api" Mar 09 10:01:31 crc kubenswrapper[4792]: I0309 10:01:31.663934 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c175458-c03c-4901-a2ec-6250b13a461b" containerName="horizon" Mar 09 10:01:31 crc kubenswrapper[4792]: I0309 10:01:31.663958 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c175458-c03c-4901-a2ec-6250b13a461b" containerName="horizon-log" Mar 09 10:01:31 crc kubenswrapper[4792]: I0309 10:01:31.663981 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9e110f7-b4c7-4b9e-b337-7cec15290a73" containerName="manila-api-log" Mar 09 10:01:31 crc kubenswrapper[4792]: I0309 10:01:31.664006 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="8646be85-08e4-4f0d-9484-01456e453c36" containerName="horizon" Mar 09 10:01:31 crc kubenswrapper[4792]: I0309 10:01:31.671906 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-api-0" Mar 09 10:01:31 crc kubenswrapper[4792]: I0309 10:01:31.676504 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-manila-public-svc" Mar 09 10:01:31 crc kubenswrapper[4792]: I0309 10:01:31.676871 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-api-config-data" Mar 09 10:01:31 crc kubenswrapper[4792]: I0309 10:01:31.695828 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-manila-internal-svc" Mar 09 10:01:31 crc kubenswrapper[4792]: I0309 10:01:31.748793 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2c175458-c03c-4901-a2ec-6250b13a461b" path="/var/lib/kubelet/pods/2c175458-c03c-4901-a2ec-6250b13a461b/volumes" Mar 09 10:01:31 crc kubenswrapper[4792]: I0309 10:01:31.749959 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8646be85-08e4-4f0d-9484-01456e453c36" path="/var/lib/kubelet/pods/8646be85-08e4-4f0d-9484-01456e453c36/volumes" Mar 09 10:01:31 crc kubenswrapper[4792]: I0309 10:01:31.755340 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/74c28e16-49a6-429a-9a95-ae4a07e9cb8e-config-data-custom\") pod \"manila-api-0\" (UID: \"74c28e16-49a6-429a-9a95-ae4a07e9cb8e\") " pod="openstack/manila-api-0" Mar 09 10:01:31 crc kubenswrapper[4792]: I0309 10:01:31.755537 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zq4f4\" (UniqueName: \"kubernetes.io/projected/74c28e16-49a6-429a-9a95-ae4a07e9cb8e-kube-api-access-zq4f4\") pod \"manila-api-0\" (UID: \"74c28e16-49a6-429a-9a95-ae4a07e9cb8e\") " pod="openstack/manila-api-0" Mar 09 10:01:31 crc kubenswrapper[4792]: I0309 10:01:31.755561 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/74c28e16-49a6-429a-9a95-ae4a07e9cb8e-etc-machine-id\") pod \"manila-api-0\" (UID: \"74c28e16-49a6-429a-9a95-ae4a07e9cb8e\") " pod="openstack/manila-api-0" Mar 09 10:01:31 crc kubenswrapper[4792]: I0309 10:01:31.755603 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74c28e16-49a6-429a-9a95-ae4a07e9cb8e-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"74c28e16-49a6-429a-9a95-ae4a07e9cb8e\") " pod="openstack/manila-api-0" Mar 09 10:01:31 crc kubenswrapper[4792]: I0309 10:01:31.755639 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/74c28e16-49a6-429a-9a95-ae4a07e9cb8e-public-tls-certs\") pod \"manila-api-0\" (UID: \"74c28e16-49a6-429a-9a95-ae4a07e9cb8e\") " pod="openstack/manila-api-0" Mar 09 10:01:31 crc kubenswrapper[4792]: I0309 10:01:31.755679 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/74c28e16-49a6-429a-9a95-ae4a07e9cb8e-logs\") pod \"manila-api-0\" (UID: \"74c28e16-49a6-429a-9a95-ae4a07e9cb8e\") " pod="openstack/manila-api-0" Mar 09 10:01:31 crc kubenswrapper[4792]: I0309 10:01:31.755718 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/74c28e16-49a6-429a-9a95-ae4a07e9cb8e-scripts\") pod \"manila-api-0\" (UID: \"74c28e16-49a6-429a-9a95-ae4a07e9cb8e\") " pod="openstack/manila-api-0" Mar 09 10:01:31 crc kubenswrapper[4792]: I0309 10:01:31.755750 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74c28e16-49a6-429a-9a95-ae4a07e9cb8e-config-data\") pod \"manila-api-0\" (UID: 
\"74c28e16-49a6-429a-9a95-ae4a07e9cb8e\") " pod="openstack/manila-api-0" Mar 09 10:01:31 crc kubenswrapper[4792]: I0309 10:01:31.755809 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/74c28e16-49a6-429a-9a95-ae4a07e9cb8e-internal-tls-certs\") pod \"manila-api-0\" (UID: \"74c28e16-49a6-429a-9a95-ae4a07e9cb8e\") " pod="openstack/manila-api-0" Mar 09 10:01:31 crc kubenswrapper[4792]: I0309 10:01:31.756457 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c9e110f7-b4c7-4b9e-b337-7cec15290a73" path="/var/lib/kubelet/pods/c9e110f7-b4c7-4b9e-b337-7cec15290a73/volumes" Mar 09 10:01:31 crc kubenswrapper[4792]: I0309 10:01:31.757328 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"] Mar 09 10:01:31 crc kubenswrapper[4792]: I0309 10:01:31.862090 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zq4f4\" (UniqueName: \"kubernetes.io/projected/74c28e16-49a6-429a-9a95-ae4a07e9cb8e-kube-api-access-zq4f4\") pod \"manila-api-0\" (UID: \"74c28e16-49a6-429a-9a95-ae4a07e9cb8e\") " pod="openstack/manila-api-0" Mar 09 10:01:31 crc kubenswrapper[4792]: I0309 10:01:31.862194 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/74c28e16-49a6-429a-9a95-ae4a07e9cb8e-etc-machine-id\") pod \"manila-api-0\" (UID: \"74c28e16-49a6-429a-9a95-ae4a07e9cb8e\") " pod="openstack/manila-api-0" Mar 09 10:01:31 crc kubenswrapper[4792]: I0309 10:01:31.862333 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74c28e16-49a6-429a-9a95-ae4a07e9cb8e-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"74c28e16-49a6-429a-9a95-ae4a07e9cb8e\") " pod="openstack/manila-api-0" Mar 09 10:01:31 crc kubenswrapper[4792]: I0309 
10:01:31.862435 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/74c28e16-49a6-429a-9a95-ae4a07e9cb8e-public-tls-certs\") pod \"manila-api-0\" (UID: \"74c28e16-49a6-429a-9a95-ae4a07e9cb8e\") " pod="openstack/manila-api-0" Mar 09 10:01:31 crc kubenswrapper[4792]: I0309 10:01:31.862591 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/74c28e16-49a6-429a-9a95-ae4a07e9cb8e-logs\") pod \"manila-api-0\" (UID: \"74c28e16-49a6-429a-9a95-ae4a07e9cb8e\") " pod="openstack/manila-api-0" Mar 09 10:01:31 crc kubenswrapper[4792]: I0309 10:01:31.862905 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/74c28e16-49a6-429a-9a95-ae4a07e9cb8e-scripts\") pod \"manila-api-0\" (UID: \"74c28e16-49a6-429a-9a95-ae4a07e9cb8e\") " pod="openstack/manila-api-0" Mar 09 10:01:31 crc kubenswrapper[4792]: I0309 10:01:31.863471 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74c28e16-49a6-429a-9a95-ae4a07e9cb8e-config-data\") pod \"manila-api-0\" (UID: \"74c28e16-49a6-429a-9a95-ae4a07e9cb8e\") " pod="openstack/manila-api-0" Mar 09 10:01:31 crc kubenswrapper[4792]: I0309 10:01:31.863645 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/74c28e16-49a6-429a-9a95-ae4a07e9cb8e-internal-tls-certs\") pod \"manila-api-0\" (UID: \"74c28e16-49a6-429a-9a95-ae4a07e9cb8e\") " pod="openstack/manila-api-0" Mar 09 10:01:31 crc kubenswrapper[4792]: I0309 10:01:31.863787 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/74c28e16-49a6-429a-9a95-ae4a07e9cb8e-config-data-custom\") pod \"manila-api-0\" (UID: 
\"74c28e16-49a6-429a-9a95-ae4a07e9cb8e\") " pod="openstack/manila-api-0" Mar 09 10:01:31 crc kubenswrapper[4792]: I0309 10:01:31.865193 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/74c28e16-49a6-429a-9a95-ae4a07e9cb8e-etc-machine-id\") pod \"manila-api-0\" (UID: \"74c28e16-49a6-429a-9a95-ae4a07e9cb8e\") " pod="openstack/manila-api-0" Mar 09 10:01:31 crc kubenswrapper[4792]: I0309 10:01:31.865790 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/74c28e16-49a6-429a-9a95-ae4a07e9cb8e-logs\") pod \"manila-api-0\" (UID: \"74c28e16-49a6-429a-9a95-ae4a07e9cb8e\") " pod="openstack/manila-api-0" Mar 09 10:01:31 crc kubenswrapper[4792]: I0309 10:01:31.872907 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/74c28e16-49a6-429a-9a95-ae4a07e9cb8e-internal-tls-certs\") pod \"manila-api-0\" (UID: \"74c28e16-49a6-429a-9a95-ae4a07e9cb8e\") " pod="openstack/manila-api-0" Mar 09 10:01:31 crc kubenswrapper[4792]: I0309 10:01:31.875321 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/74c28e16-49a6-429a-9a95-ae4a07e9cb8e-scripts\") pod \"manila-api-0\" (UID: \"74c28e16-49a6-429a-9a95-ae4a07e9cb8e\") " pod="openstack/manila-api-0" Mar 09 10:01:31 crc kubenswrapper[4792]: I0309 10:01:31.875768 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/74c28e16-49a6-429a-9a95-ae4a07e9cb8e-public-tls-certs\") pod \"manila-api-0\" (UID: \"74c28e16-49a6-429a-9a95-ae4a07e9cb8e\") " pod="openstack/manila-api-0" Mar 09 10:01:31 crc kubenswrapper[4792]: I0309 10:01:31.876299 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/74c28e16-49a6-429a-9a95-ae4a07e9cb8e-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"74c28e16-49a6-429a-9a95-ae4a07e9cb8e\") " pod="openstack/manila-api-0" Mar 09 10:01:31 crc kubenswrapper[4792]: I0309 10:01:31.876692 4792 scope.go:117] "RemoveContainer" containerID="ad2b61e9dbb0c585ca504a0f7aafd0e8ecf4e3a75a071d7571525e2f98d59727" Mar 09 10:01:31 crc kubenswrapper[4792]: I0309 10:01:31.880869 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/74c28e16-49a6-429a-9a95-ae4a07e9cb8e-config-data-custom\") pod \"manila-api-0\" (UID: \"74c28e16-49a6-429a-9a95-ae4a07e9cb8e\") " pod="openstack/manila-api-0" Mar 09 10:01:31 crc kubenswrapper[4792]: I0309 10:01:31.889251 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74c28e16-49a6-429a-9a95-ae4a07e9cb8e-config-data\") pod \"manila-api-0\" (UID: \"74c28e16-49a6-429a-9a95-ae4a07e9cb8e\") " pod="openstack/manila-api-0" Mar 09 10:01:31 crc kubenswrapper[4792]: I0309 10:01:31.912759 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zq4f4\" (UniqueName: \"kubernetes.io/projected/74c28e16-49a6-429a-9a95-ae4a07e9cb8e-kube-api-access-zq4f4\") pod \"manila-api-0\" (UID: \"74c28e16-49a6-429a-9a95-ae4a07e9cb8e\") " pod="openstack/manila-api-0" Mar 09 10:01:31 crc kubenswrapper[4792]: E0309 10:01:31.957615 4792 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc9e110f7_b4c7_4b9e_b337_7cec15290a73.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc9e110f7_b4c7_4b9e_b337_7cec15290a73.slice/crio-eb2b1449309eff1353a8293244acdecb108d5f61800c7bd90f94bb2cac354b57\": RecentStats: unable to find data in memory cache]" Mar 09 
10:01:31 crc kubenswrapper[4792]: I0309 10:01:31.997346 4792 scope.go:117] "RemoveContainer" containerID="295e693638b6f2b58d0a8524a2fe89e8d4ab00743f81738a03b41d6e0799b7c1" Mar 09 10:01:32 crc kubenswrapper[4792]: I0309 10:01:32.029296 4792 scope.go:117] "RemoveContainer" containerID="d70be05e2e8c6ac11285fb80acddb927606ec0e0ab5a98556c24001665691192" Mar 09 10:01:32 crc kubenswrapper[4792]: I0309 10:01:32.048696 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-api-0" Mar 09 10:01:32 crc kubenswrapper[4792]: I0309 10:01:32.151232 4792 scope.go:117] "RemoveContainer" containerID="295e693638b6f2b58d0a8524a2fe89e8d4ab00743f81738a03b41d6e0799b7c1" Mar 09 10:01:32 crc kubenswrapper[4792]: E0309 10:01:32.151667 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"295e693638b6f2b58d0a8524a2fe89e8d4ab00743f81738a03b41d6e0799b7c1\": container with ID starting with 295e693638b6f2b58d0a8524a2fe89e8d4ab00743f81738a03b41d6e0799b7c1 not found: ID does not exist" containerID="295e693638b6f2b58d0a8524a2fe89e8d4ab00743f81738a03b41d6e0799b7c1" Mar 09 10:01:32 crc kubenswrapper[4792]: I0309 10:01:32.151900 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"295e693638b6f2b58d0a8524a2fe89e8d4ab00743f81738a03b41d6e0799b7c1"} err="failed to get container status \"295e693638b6f2b58d0a8524a2fe89e8d4ab00743f81738a03b41d6e0799b7c1\": rpc error: code = NotFound desc = could not find container \"295e693638b6f2b58d0a8524a2fe89e8d4ab00743f81738a03b41d6e0799b7c1\": container with ID starting with 295e693638b6f2b58d0a8524a2fe89e8d4ab00743f81738a03b41d6e0799b7c1 not found: ID does not exist" Mar 09 10:01:32 crc kubenswrapper[4792]: I0309 10:01:32.151919 4792 scope.go:117] "RemoveContainer" containerID="d70be05e2e8c6ac11285fb80acddb927606ec0e0ab5a98556c24001665691192" Mar 09 10:01:32 crc kubenswrapper[4792]: E0309 10:01:32.152132 
4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d70be05e2e8c6ac11285fb80acddb927606ec0e0ab5a98556c24001665691192\": container with ID starting with d70be05e2e8c6ac11285fb80acddb927606ec0e0ab5a98556c24001665691192 not found: ID does not exist" containerID="d70be05e2e8c6ac11285fb80acddb927606ec0e0ab5a98556c24001665691192" Mar 09 10:01:32 crc kubenswrapper[4792]: I0309 10:01:32.152163 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d70be05e2e8c6ac11285fb80acddb927606ec0e0ab5a98556c24001665691192"} err="failed to get container status \"d70be05e2e8c6ac11285fb80acddb927606ec0e0ab5a98556c24001665691192\": rpc error: code = NotFound desc = could not find container \"d70be05e2e8c6ac11285fb80acddb927606ec0e0ab5a98556c24001665691192\": container with ID starting with d70be05e2e8c6ac11285fb80acddb927606ec0e0ab5a98556c24001665691192 not found: ID does not exist" Mar 09 10:01:32 crc kubenswrapper[4792]: I0309 10:01:32.152175 4792 scope.go:117] "RemoveContainer" containerID="295e693638b6f2b58d0a8524a2fe89e8d4ab00743f81738a03b41d6e0799b7c1" Mar 09 10:01:32 crc kubenswrapper[4792]: I0309 10:01:32.152408 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"295e693638b6f2b58d0a8524a2fe89e8d4ab00743f81738a03b41d6e0799b7c1"} err="failed to get container status \"295e693638b6f2b58d0a8524a2fe89e8d4ab00743f81738a03b41d6e0799b7c1\": rpc error: code = NotFound desc = could not find container \"295e693638b6f2b58d0a8524a2fe89e8d4ab00743f81738a03b41d6e0799b7c1\": container with ID starting with 295e693638b6f2b58d0a8524a2fe89e8d4ab00743f81738a03b41d6e0799b7c1 not found: ID does not exist" Mar 09 10:01:32 crc kubenswrapper[4792]: I0309 10:01:32.152422 4792 scope.go:117] "RemoveContainer" containerID="d70be05e2e8c6ac11285fb80acddb927606ec0e0ab5a98556c24001665691192" Mar 09 10:01:32 crc kubenswrapper[4792]: I0309 
10:01:32.152701 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d70be05e2e8c6ac11285fb80acddb927606ec0e0ab5a98556c24001665691192"} err="failed to get container status \"d70be05e2e8c6ac11285fb80acddb927606ec0e0ab5a98556c24001665691192\": rpc error: code = NotFound desc = could not find container \"d70be05e2e8c6ac11285fb80acddb927606ec0e0ab5a98556c24001665691192\": container with ID starting with d70be05e2e8c6ac11285fb80acddb927606ec0e0ab5a98556c24001665691192 not found: ID does not exist" Mar 09 10:01:32 crc kubenswrapper[4792]: I0309 10:01:32.663439 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-85fd9d548-2q98q" podUID="24426baa-19e0-4ac0-87b6-0f5a824de578" containerName="horizon" probeResult="failure" output="Get \"https://10.217.1.10:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.10:8443: connect: connection refused" Mar 09 10:01:32 crc kubenswrapper[4792]: I0309 10:01:32.663936 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-85fd9d548-2q98q" Mar 09 10:01:32 crc kubenswrapper[4792]: I0309 10:01:32.667131 4792 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="horizon" containerStatusID={"Type":"cri-o","ID":"8db2115c69bab35530668a1e9f86126f8a7d265d2b553c3753f91f58c2a978ce"} pod="openstack/horizon-85fd9d548-2q98q" containerMessage="Container horizon failed startup probe, will be restarted" Mar 09 10:01:32 crc kubenswrapper[4792]: I0309 10:01:32.667219 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-85fd9d548-2q98q" podUID="24426baa-19e0-4ac0-87b6-0f5a824de578" containerName="horizon" containerID="cri-o://8db2115c69bab35530668a1e9f86126f8a7d265d2b553c3753f91f58c2a978ce" gracePeriod=30 Mar 09 10:01:32 crc kubenswrapper[4792]: I0309 10:01:32.817708 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"] Mar 09 
10:01:32 crc kubenswrapper[4792]: W0309 10:01:32.889685 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod74c28e16_49a6_429a_9a95_ae4a07e9cb8e.slice/crio-37d9bd9773e3690a0906e6f2e35474f631a376c1037026e7230ce247a4055798 WatchSource:0}: Error finding container 37d9bd9773e3690a0906e6f2e35474f631a376c1037026e7230ce247a4055798: Status 404 returned error can't find the container with id 37d9bd9773e3690a0906e6f2e35474f631a376c1037026e7230ce247a4055798 Mar 09 10:01:33 crc kubenswrapper[4792]: I0309 10:01:33.225437 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"74c28e16-49a6-429a-9a95-ae4a07e9cb8e","Type":"ContainerStarted","Data":"37d9bd9773e3690a0906e6f2e35474f631a376c1037026e7230ce247a4055798"} Mar 09 10:01:34 crc kubenswrapper[4792]: I0309 10:01:34.253129 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"74c28e16-49a6-429a-9a95-ae4a07e9cb8e","Type":"ContainerStarted","Data":"34e012f282e3c66f7b1fdad5a4681299ef51d19661b89f4a22b5326b1dddc210"} Mar 09 10:01:35 crc kubenswrapper[4792]: I0309 10:01:35.188142 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-scheduler-0" Mar 09 10:01:35 crc kubenswrapper[4792]: I0309 10:01:35.276826 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"74c28e16-49a6-429a-9a95-ae4a07e9cb8e","Type":"ContainerStarted","Data":"17886d3f4c78f2d7f0a7c601234b83a1cac30991252663cfe14ab0eddeefa9a2"} Mar 09 10:01:35 crc kubenswrapper[4792]: I0309 10:01:35.278005 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/manila-api-0" Mar 09 10:01:35 crc kubenswrapper[4792]: I0309 10:01:35.344229 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-api-0" podStartSLOduration=4.344204835 podStartE2EDuration="4.344204835s" 
podCreationTimestamp="2026-03-09 10:01:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 10:01:35.326136316 +0000 UTC m=+3260.356337068" watchObservedRunningTime="2026-03-09 10:01:35.344204835 +0000 UTC m=+3260.374405587" Mar 09 10:01:35 crc kubenswrapper[4792]: I0309 10:01:35.692412 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-86c6bdcc4c-q4zw6" Mar 09 10:01:35 crc kubenswrapper[4792]: I0309 10:01:35.841802 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-cb7494899-5jcjx"] Mar 09 10:01:35 crc kubenswrapper[4792]: I0309 10:01:35.842257 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-cb7494899-5jcjx" podUID="de676155-c813-4600-8042-5434607f00ca" containerName="dnsmasq-dns" containerID="cri-o://8049d4eea6b314205e7cf5baaa8bdb7bb7b5c033661c855a3fdfed6a78772f9b" gracePeriod=10 Mar 09 10:01:36 crc kubenswrapper[4792]: I0309 10:01:36.308811 4792 generic.go:334] "Generic (PLEG): container finished" podID="de676155-c813-4600-8042-5434607f00ca" containerID="8049d4eea6b314205e7cf5baaa8bdb7bb7b5c033661c855a3fdfed6a78772f9b" exitCode=0 Mar 09 10:01:36 crc kubenswrapper[4792]: I0309 10:01:36.312335 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cb7494899-5jcjx" event={"ID":"de676155-c813-4600-8042-5434607f00ca","Type":"ContainerDied","Data":"8049d4eea6b314205e7cf5baaa8bdb7bb7b5c033661c855a3fdfed6a78772f9b"} Mar 09 10:01:36 crc kubenswrapper[4792]: I0309 10:01:36.611267 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-cb7494899-5jcjx" Mar 09 10:01:36 crc kubenswrapper[4792]: I0309 10:01:36.645346 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p26fs\" (UniqueName: \"kubernetes.io/projected/de676155-c813-4600-8042-5434607f00ca-kube-api-access-p26fs\") pod \"de676155-c813-4600-8042-5434607f00ca\" (UID: \"de676155-c813-4600-8042-5434607f00ca\") " Mar 09 10:01:36 crc kubenswrapper[4792]: I0309 10:01:36.645469 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/de676155-c813-4600-8042-5434607f00ca-config\") pod \"de676155-c813-4600-8042-5434607f00ca\" (UID: \"de676155-c813-4600-8042-5434607f00ca\") " Mar 09 10:01:36 crc kubenswrapper[4792]: I0309 10:01:36.645511 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/de676155-c813-4600-8042-5434607f00ca-ovsdbserver-sb\") pod \"de676155-c813-4600-8042-5434607f00ca\" (UID: \"de676155-c813-4600-8042-5434607f00ca\") " Mar 09 10:01:36 crc kubenswrapper[4792]: I0309 10:01:36.645579 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/de676155-c813-4600-8042-5434607f00ca-ovsdbserver-nb\") pod \"de676155-c813-4600-8042-5434607f00ca\" (UID: \"de676155-c813-4600-8042-5434607f00ca\") " Mar 09 10:01:36 crc kubenswrapper[4792]: I0309 10:01:36.645697 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/de676155-c813-4600-8042-5434607f00ca-dns-svc\") pod \"de676155-c813-4600-8042-5434607f00ca\" (UID: \"de676155-c813-4600-8042-5434607f00ca\") " Mar 09 10:01:36 crc kubenswrapper[4792]: I0309 10:01:36.645738 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" 
(UniqueName: \"kubernetes.io/configmap/de676155-c813-4600-8042-5434607f00ca-openstack-edpm-ipam\") pod \"de676155-c813-4600-8042-5434607f00ca\" (UID: \"de676155-c813-4600-8042-5434607f00ca\") " Mar 09 10:01:36 crc kubenswrapper[4792]: I0309 10:01:36.685771 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de676155-c813-4600-8042-5434607f00ca-kube-api-access-p26fs" (OuterVolumeSpecName: "kube-api-access-p26fs") pod "de676155-c813-4600-8042-5434607f00ca" (UID: "de676155-c813-4600-8042-5434607f00ca"). InnerVolumeSpecName "kube-api-access-p26fs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 10:01:36 crc kubenswrapper[4792]: I0309 10:01:36.752000 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p26fs\" (UniqueName: \"kubernetes.io/projected/de676155-c813-4600-8042-5434607f00ca-kube-api-access-p26fs\") on node \"crc\" DevicePath \"\"" Mar 09 10:01:36 crc kubenswrapper[4792]: I0309 10:01:36.783912 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/de676155-c813-4600-8042-5434607f00ca-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "de676155-c813-4600-8042-5434607f00ca" (UID: "de676155-c813-4600-8042-5434607f00ca"). InnerVolumeSpecName "openstack-edpm-ipam". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 10:01:36 crc kubenswrapper[4792]: I0309 10:01:36.796804 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/de676155-c813-4600-8042-5434607f00ca-config" (OuterVolumeSpecName: "config") pod "de676155-c813-4600-8042-5434607f00ca" (UID: "de676155-c813-4600-8042-5434607f00ca"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 10:01:36 crc kubenswrapper[4792]: I0309 10:01:36.814894 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/de676155-c813-4600-8042-5434607f00ca-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "de676155-c813-4600-8042-5434607f00ca" (UID: "de676155-c813-4600-8042-5434607f00ca"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 10:01:36 crc kubenswrapper[4792]: I0309 10:01:36.820282 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/de676155-c813-4600-8042-5434607f00ca-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "de676155-c813-4600-8042-5434607f00ca" (UID: "de676155-c813-4600-8042-5434607f00ca"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 10:01:36 crc kubenswrapper[4792]: I0309 10:01:36.838438 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/de676155-c813-4600-8042-5434607f00ca-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "de676155-c813-4600-8042-5434607f00ca" (UID: "de676155-c813-4600-8042-5434607f00ca"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 10:01:36 crc kubenswrapper[4792]: I0309 10:01:36.858422 4792 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/de676155-c813-4600-8042-5434607f00ca-config\") on node \"crc\" DevicePath \"\"" Mar 09 10:01:36 crc kubenswrapper[4792]: I0309 10:01:36.858456 4792 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/de676155-c813-4600-8042-5434607f00ca-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 09 10:01:36 crc kubenswrapper[4792]: I0309 10:01:36.858469 4792 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/de676155-c813-4600-8042-5434607f00ca-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 09 10:01:36 crc kubenswrapper[4792]: I0309 10:01:36.858481 4792 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/de676155-c813-4600-8042-5434607f00ca-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 09 10:01:36 crc kubenswrapper[4792]: I0309 10:01:36.858492 4792 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/de676155-c813-4600-8042-5434607f00ca-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 09 10:01:37 crc kubenswrapper[4792]: I0309 10:01:37.332085 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cb7494899-5jcjx" event={"ID":"de676155-c813-4600-8042-5434607f00ca","Type":"ContainerDied","Data":"f4d91a22bd6bedddb254e3c826ac6b571669012cbda350d2269756d3746eca46"} Mar 09 10:01:37 crc kubenswrapper[4792]: I0309 10:01:37.332142 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-cb7494899-5jcjx" Mar 09 10:01:37 crc kubenswrapper[4792]: I0309 10:01:37.332415 4792 scope.go:117] "RemoveContainer" containerID="8049d4eea6b314205e7cf5baaa8bdb7bb7b5c033661c855a3fdfed6a78772f9b" Mar 09 10:01:37 crc kubenswrapper[4792]: I0309 10:01:37.381362 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-cb7494899-5jcjx"] Mar 09 10:01:37 crc kubenswrapper[4792]: I0309 10:01:37.381501 4792 scope.go:117] "RemoveContainer" containerID="2f4574d13e7179836e61569ee9d1a7181cb77d1f328d748bdfcf7533c9bd7e7b" Mar 09 10:01:37 crc kubenswrapper[4792]: I0309 10:01:37.393972 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-cb7494899-5jcjx"] Mar 09 10:01:37 crc kubenswrapper[4792]: I0309 10:01:37.677164 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de676155-c813-4600-8042-5434607f00ca" path="/var/lib/kubelet/pods/de676155-c813-4600-8042-5434607f00ca/volumes" Mar 09 10:01:37 crc kubenswrapper[4792]: I0309 10:01:37.832276 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-54c85f748d-wxdlf" podUID="d028a70e-dd9d-4b38-bb18-4cd55cd002fe" containerName="horizon" probeResult="failure" output="Get \"https://10.217.1.11:8443/dashboard/auth/login/?next=/dashboard/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 09 10:01:37 crc kubenswrapper[4792]: I0309 10:01:37.832357 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-54c85f748d-wxdlf" Mar 09 10:01:37 crc kubenswrapper[4792]: I0309 10:01:37.833319 4792 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="horizon" containerStatusID={"Type":"cri-o","ID":"15becaf0b415f66349567ab4eff3f659aefbb9ea1b266bac9e335c827f98d485"} pod="openstack/horizon-54c85f748d-wxdlf" containerMessage="Container horizon failed startup probe, will be restarted" Mar 09 10:01:37 crc 
kubenswrapper[4792]: I0309 10:01:37.833360 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-54c85f748d-wxdlf" podUID="d028a70e-dd9d-4b38-bb18-4cd55cd002fe" containerName="horizon" containerID="cri-o://15becaf0b415f66349567ab4eff3f659aefbb9ea1b266bac9e335c827f98d485" gracePeriod=30 Mar 09 10:01:40 crc kubenswrapper[4792]: I0309 10:01:40.465684 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 09 10:01:40 crc kubenswrapper[4792]: I0309 10:01:40.469534 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="50893660-98fd-4a26-a00a-2fcc6a2c3e51" containerName="ceilometer-central-agent" containerID="cri-o://6e74c5c0aec7e82f060e96a145c34bc8be6b4ab436a35ce663ceb75d398e1ebf" gracePeriod=30 Mar 09 10:01:40 crc kubenswrapper[4792]: I0309 10:01:40.469567 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="50893660-98fd-4a26-a00a-2fcc6a2c3e51" containerName="sg-core" containerID="cri-o://3fcd2b98b5ede8c002bfce6e124d3c62e4a6453f75f4bc72768961e341435762" gracePeriod=30 Mar 09 10:01:40 crc kubenswrapper[4792]: I0309 10:01:40.469636 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="50893660-98fd-4a26-a00a-2fcc6a2c3e51" containerName="proxy-httpd" containerID="cri-o://8763e2c49f7c42e53fd1e133698ee26e3b5d1639228093ea9e69c39d4586bea9" gracePeriod=30 Mar 09 10:01:40 crc kubenswrapper[4792]: I0309 10:01:40.469670 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="50893660-98fd-4a26-a00a-2fcc6a2c3e51" containerName="ceilometer-notification-agent" containerID="cri-o://29cce57973597b1af8bf3d7c6a08ebd6edea7c673c7333774ec96f629fec0da9" gracePeriod=30 Mar 09 10:01:41 crc kubenswrapper[4792]: I0309 10:01:41.408967 4792 generic.go:334] "Generic (PLEG): container finished" 
podID="50893660-98fd-4a26-a00a-2fcc6a2c3e51" containerID="8763e2c49f7c42e53fd1e133698ee26e3b5d1639228093ea9e69c39d4586bea9" exitCode=0 Mar 09 10:01:41 crc kubenswrapper[4792]: I0309 10:01:41.409387 4792 generic.go:334] "Generic (PLEG): container finished" podID="50893660-98fd-4a26-a00a-2fcc6a2c3e51" containerID="3fcd2b98b5ede8c002bfce6e124d3c62e4a6453f75f4bc72768961e341435762" exitCode=2 Mar 09 10:01:41 crc kubenswrapper[4792]: I0309 10:01:41.409402 4792 generic.go:334] "Generic (PLEG): container finished" podID="50893660-98fd-4a26-a00a-2fcc6a2c3e51" containerID="29cce57973597b1af8bf3d7c6a08ebd6edea7c673c7333774ec96f629fec0da9" exitCode=0 Mar 09 10:01:41 crc kubenswrapper[4792]: I0309 10:01:41.409410 4792 generic.go:334] "Generic (PLEG): container finished" podID="50893660-98fd-4a26-a00a-2fcc6a2c3e51" containerID="6e74c5c0aec7e82f060e96a145c34bc8be6b4ab436a35ce663ceb75d398e1ebf" exitCode=0 Mar 09 10:01:41 crc kubenswrapper[4792]: I0309 10:01:41.409433 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"50893660-98fd-4a26-a00a-2fcc6a2c3e51","Type":"ContainerDied","Data":"8763e2c49f7c42e53fd1e133698ee26e3b5d1639228093ea9e69c39d4586bea9"} Mar 09 10:01:41 crc kubenswrapper[4792]: I0309 10:01:41.409463 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"50893660-98fd-4a26-a00a-2fcc6a2c3e51","Type":"ContainerDied","Data":"3fcd2b98b5ede8c002bfce6e124d3c62e4a6453f75f4bc72768961e341435762"} Mar 09 10:01:41 crc kubenswrapper[4792]: I0309 10:01:41.409475 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"50893660-98fd-4a26-a00a-2fcc6a2c3e51","Type":"ContainerDied","Data":"29cce57973597b1af8bf3d7c6a08ebd6edea7c673c7333774ec96f629fec0da9"} Mar 09 10:01:41 crc kubenswrapper[4792]: I0309 10:01:41.409486 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"50893660-98fd-4a26-a00a-2fcc6a2c3e51","Type":"ContainerDied","Data":"6e74c5c0aec7e82f060e96a145c34bc8be6b4ab436a35ce663ceb75d398e1ebf"} Mar 09 10:01:43 crc kubenswrapper[4792]: I0309 10:01:43.124774 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 09 10:01:43 crc kubenswrapper[4792]: I0309 10:01:43.295295 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/50893660-98fd-4a26-a00a-2fcc6a2c3e51-run-httpd\") pod \"50893660-98fd-4a26-a00a-2fcc6a2c3e51\" (UID: \"50893660-98fd-4a26-a00a-2fcc6a2c3e51\") " Mar 09 10:01:43 crc kubenswrapper[4792]: I0309 10:01:43.295394 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50893660-98fd-4a26-a00a-2fcc6a2c3e51-combined-ca-bundle\") pod \"50893660-98fd-4a26-a00a-2fcc6a2c3e51\" (UID: \"50893660-98fd-4a26-a00a-2fcc6a2c3e51\") " Mar 09 10:01:43 crc kubenswrapper[4792]: I0309 10:01:43.295442 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/50893660-98fd-4a26-a00a-2fcc6a2c3e51-log-httpd\") pod \"50893660-98fd-4a26-a00a-2fcc6a2c3e51\" (UID: \"50893660-98fd-4a26-a00a-2fcc6a2c3e51\") " Mar 09 10:01:43 crc kubenswrapper[4792]: I0309 10:01:43.295470 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/50893660-98fd-4a26-a00a-2fcc6a2c3e51-config-data\") pod \"50893660-98fd-4a26-a00a-2fcc6a2c3e51\" (UID: \"50893660-98fd-4a26-a00a-2fcc6a2c3e51\") " Mar 09 10:01:43 crc kubenswrapper[4792]: I0309 10:01:43.295504 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/50893660-98fd-4a26-a00a-2fcc6a2c3e51-sg-core-conf-yaml\") pod 
\"50893660-98fd-4a26-a00a-2fcc6a2c3e51\" (UID: \"50893660-98fd-4a26-a00a-2fcc6a2c3e51\") " Mar 09 10:01:43 crc kubenswrapper[4792]: I0309 10:01:43.295558 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/50893660-98fd-4a26-a00a-2fcc6a2c3e51-scripts\") pod \"50893660-98fd-4a26-a00a-2fcc6a2c3e51\" (UID: \"50893660-98fd-4a26-a00a-2fcc6a2c3e51\") " Mar 09 10:01:43 crc kubenswrapper[4792]: I0309 10:01:43.295609 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gqcqb\" (UniqueName: \"kubernetes.io/projected/50893660-98fd-4a26-a00a-2fcc6a2c3e51-kube-api-access-gqcqb\") pod \"50893660-98fd-4a26-a00a-2fcc6a2c3e51\" (UID: \"50893660-98fd-4a26-a00a-2fcc6a2c3e51\") " Mar 09 10:01:43 crc kubenswrapper[4792]: I0309 10:01:43.295640 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/50893660-98fd-4a26-a00a-2fcc6a2c3e51-ceilometer-tls-certs\") pod \"50893660-98fd-4a26-a00a-2fcc6a2c3e51\" (UID: \"50893660-98fd-4a26-a00a-2fcc6a2c3e51\") " Mar 09 10:01:43 crc kubenswrapper[4792]: I0309 10:01:43.297388 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/50893660-98fd-4a26-a00a-2fcc6a2c3e51-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "50893660-98fd-4a26-a00a-2fcc6a2c3e51" (UID: "50893660-98fd-4a26-a00a-2fcc6a2c3e51"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 10:01:43 crc kubenswrapper[4792]: I0309 10:01:43.297559 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/50893660-98fd-4a26-a00a-2fcc6a2c3e51-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "50893660-98fd-4a26-a00a-2fcc6a2c3e51" (UID: "50893660-98fd-4a26-a00a-2fcc6a2c3e51"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 10:01:43 crc kubenswrapper[4792]: I0309 10:01:43.327214 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50893660-98fd-4a26-a00a-2fcc6a2c3e51-scripts" (OuterVolumeSpecName: "scripts") pod "50893660-98fd-4a26-a00a-2fcc6a2c3e51" (UID: "50893660-98fd-4a26-a00a-2fcc6a2c3e51"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 10:01:43 crc kubenswrapper[4792]: I0309 10:01:43.326432 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/50893660-98fd-4a26-a00a-2fcc6a2c3e51-kube-api-access-gqcqb" (OuterVolumeSpecName: "kube-api-access-gqcqb") pod "50893660-98fd-4a26-a00a-2fcc6a2c3e51" (UID: "50893660-98fd-4a26-a00a-2fcc6a2c3e51"). InnerVolumeSpecName "kube-api-access-gqcqb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 10:01:43 crc kubenswrapper[4792]: I0309 10:01:43.397164 4792 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/50893660-98fd-4a26-a00a-2fcc6a2c3e51-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 10:01:43 crc kubenswrapper[4792]: I0309 10:01:43.397196 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gqcqb\" (UniqueName: \"kubernetes.io/projected/50893660-98fd-4a26-a00a-2fcc6a2c3e51-kube-api-access-gqcqb\") on node \"crc\" DevicePath \"\"" Mar 09 10:01:43 crc kubenswrapper[4792]: I0309 10:01:43.397205 4792 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/50893660-98fd-4a26-a00a-2fcc6a2c3e51-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 09 10:01:43 crc kubenswrapper[4792]: I0309 10:01:43.397227 4792 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/50893660-98fd-4a26-a00a-2fcc6a2c3e51-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 09 10:01:43 
crc kubenswrapper[4792]: I0309 10:01:43.414364 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50893660-98fd-4a26-a00a-2fcc6a2c3e51-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "50893660-98fd-4a26-a00a-2fcc6a2c3e51" (UID: "50893660-98fd-4a26-a00a-2fcc6a2c3e51"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 10:01:43 crc kubenswrapper[4792]: I0309 10:01:43.452858 4792 generic.go:334] "Generic (PLEG): container finished" podID="d028a70e-dd9d-4b38-bb18-4cd55cd002fe" containerID="15becaf0b415f66349567ab4eff3f659aefbb9ea1b266bac9e335c827f98d485" exitCode=0 Mar 09 10:01:43 crc kubenswrapper[4792]: I0309 10:01:43.452937 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-54c85f748d-wxdlf" event={"ID":"d028a70e-dd9d-4b38-bb18-4cd55cd002fe","Type":"ContainerDied","Data":"15becaf0b415f66349567ab4eff3f659aefbb9ea1b266bac9e335c827f98d485"} Mar 09 10:01:43 crc kubenswrapper[4792]: I0309 10:01:43.452983 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-54c85f748d-wxdlf" event={"ID":"d028a70e-dd9d-4b38-bb18-4cd55cd002fe","Type":"ContainerStarted","Data":"6cd84ef972037982eaac597357fb2fb6c9e9e1a3ce3348c65e264eaafe049fd2"} Mar 09 10:01:43 crc kubenswrapper[4792]: I0309 10:01:43.463100 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"50893660-98fd-4a26-a00a-2fcc6a2c3e51","Type":"ContainerDied","Data":"7edefcb855eb5be8f6395680293370f33f797f2898f6b1e906abec4791f90590"} Mar 09 10:01:43 crc kubenswrapper[4792]: I0309 10:01:43.463155 4792 scope.go:117] "RemoveContainer" containerID="8763e2c49f7c42e53fd1e133698ee26e3b5d1639228093ea9e69c39d4586bea9" Mar 09 10:01:43 crc kubenswrapper[4792]: I0309 10:01:43.463319 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 09 10:01:43 crc kubenswrapper[4792]: I0309 10:01:43.463374 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50893660-98fd-4a26-a00a-2fcc6a2c3e51-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "50893660-98fd-4a26-a00a-2fcc6a2c3e51" (UID: "50893660-98fd-4a26-a00a-2fcc6a2c3e51"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 10:01:43 crc kubenswrapper[4792]: I0309 10:01:43.499470 4792 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/50893660-98fd-4a26-a00a-2fcc6a2c3e51-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 09 10:01:43 crc kubenswrapper[4792]: I0309 10:01:43.499795 4792 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/50893660-98fd-4a26-a00a-2fcc6a2c3e51-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 09 10:01:43 crc kubenswrapper[4792]: I0309 10:01:43.507229 4792 scope.go:117] "RemoveContainer" containerID="3fcd2b98b5ede8c002bfce6e124d3c62e4a6453f75f4bc72768961e341435762" Mar 09 10:01:43 crc kubenswrapper[4792]: I0309 10:01:43.523349 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50893660-98fd-4a26-a00a-2fcc6a2c3e51-config-data" (OuterVolumeSpecName: "config-data") pod "50893660-98fd-4a26-a00a-2fcc6a2c3e51" (UID: "50893660-98fd-4a26-a00a-2fcc6a2c3e51"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 10:01:43 crc kubenswrapper[4792]: I0309 10:01:43.526389 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50893660-98fd-4a26-a00a-2fcc6a2c3e51-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "50893660-98fd-4a26-a00a-2fcc6a2c3e51" (UID: "50893660-98fd-4a26-a00a-2fcc6a2c3e51"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 10:01:43 crc kubenswrapper[4792]: I0309 10:01:43.554499 4792 scope.go:117] "RemoveContainer" containerID="29cce57973597b1af8bf3d7c6a08ebd6edea7c673c7333774ec96f629fec0da9" Mar 09 10:01:43 crc kubenswrapper[4792]: I0309 10:01:43.601244 4792 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/50893660-98fd-4a26-a00a-2fcc6a2c3e51-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 10:01:43 crc kubenswrapper[4792]: I0309 10:01:43.601265 4792 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50893660-98fd-4a26-a00a-2fcc6a2c3e51-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 10:01:43 crc kubenswrapper[4792]: I0309 10:01:43.667705 4792 scope.go:117] "RemoveContainer" containerID="6e74c5c0aec7e82f060e96a145c34bc8be6b4ab436a35ce663ceb75d398e1ebf" Mar 09 10:01:43 crc kubenswrapper[4792]: I0309 10:01:43.861990 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 09 10:01:43 crc kubenswrapper[4792]: I0309 10:01:43.879567 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 09 10:01:43 crc kubenswrapper[4792]: I0309 10:01:43.905464 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 09 10:01:43 crc kubenswrapper[4792]: E0309 10:01:43.906153 4792 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="50893660-98fd-4a26-a00a-2fcc6a2c3e51" containerName="ceilometer-central-agent" Mar 09 10:01:43 crc kubenswrapper[4792]: I0309 10:01:43.906174 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="50893660-98fd-4a26-a00a-2fcc6a2c3e51" containerName="ceilometer-central-agent" Mar 09 10:01:43 crc kubenswrapper[4792]: E0309 10:01:43.906187 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50893660-98fd-4a26-a00a-2fcc6a2c3e51" containerName="sg-core" Mar 09 10:01:43 crc kubenswrapper[4792]: I0309 10:01:43.906194 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="50893660-98fd-4a26-a00a-2fcc6a2c3e51" containerName="sg-core" Mar 09 10:01:43 crc kubenswrapper[4792]: E0309 10:01:43.906204 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de676155-c813-4600-8042-5434607f00ca" containerName="init" Mar 09 10:01:43 crc kubenswrapper[4792]: I0309 10:01:43.906211 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="de676155-c813-4600-8042-5434607f00ca" containerName="init" Mar 09 10:01:43 crc kubenswrapper[4792]: E0309 10:01:43.906229 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50893660-98fd-4a26-a00a-2fcc6a2c3e51" containerName="ceilometer-notification-agent" Mar 09 10:01:43 crc kubenswrapper[4792]: I0309 10:01:43.906237 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="50893660-98fd-4a26-a00a-2fcc6a2c3e51" containerName="ceilometer-notification-agent" Mar 09 10:01:43 crc kubenswrapper[4792]: E0309 10:01:43.906250 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de676155-c813-4600-8042-5434607f00ca" containerName="dnsmasq-dns" Mar 09 10:01:43 crc kubenswrapper[4792]: I0309 10:01:43.906256 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="de676155-c813-4600-8042-5434607f00ca" containerName="dnsmasq-dns" Mar 09 10:01:43 crc kubenswrapper[4792]: E0309 10:01:43.906276 4792 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="50893660-98fd-4a26-a00a-2fcc6a2c3e51" containerName="proxy-httpd" Mar 09 10:01:43 crc kubenswrapper[4792]: I0309 10:01:43.906282 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="50893660-98fd-4a26-a00a-2fcc6a2c3e51" containerName="proxy-httpd" Mar 09 10:01:43 crc kubenswrapper[4792]: I0309 10:01:43.906488 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="50893660-98fd-4a26-a00a-2fcc6a2c3e51" containerName="ceilometer-notification-agent" Mar 09 10:01:43 crc kubenswrapper[4792]: I0309 10:01:43.906508 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="50893660-98fd-4a26-a00a-2fcc6a2c3e51" containerName="sg-core" Mar 09 10:01:43 crc kubenswrapper[4792]: I0309 10:01:43.906524 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="50893660-98fd-4a26-a00a-2fcc6a2c3e51" containerName="ceilometer-central-agent" Mar 09 10:01:43 crc kubenswrapper[4792]: I0309 10:01:43.906543 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="de676155-c813-4600-8042-5434607f00ca" containerName="dnsmasq-dns" Mar 09 10:01:43 crc kubenswrapper[4792]: I0309 10:01:43.906558 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="50893660-98fd-4a26-a00a-2fcc6a2c3e51" containerName="proxy-httpd" Mar 09 10:01:43 crc kubenswrapper[4792]: I0309 10:01:43.908475 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 09 10:01:43 crc kubenswrapper[4792]: I0309 10:01:43.912981 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 09 10:01:43 crc kubenswrapper[4792]: I0309 10:01:43.913211 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 09 10:01:43 crc kubenswrapper[4792]: I0309 10:01:43.913322 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 09 10:01:43 crc kubenswrapper[4792]: I0309 10:01:43.929580 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 09 10:01:44 crc kubenswrapper[4792]: I0309 10:01:44.110649 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/743f508e-dd6d-4c3b-b8dd-c12eeb381cd7-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"743f508e-dd6d-4c3b-b8dd-c12eeb381cd7\") " pod="openstack/ceilometer-0" Mar 09 10:01:44 crc kubenswrapper[4792]: I0309 10:01:44.110762 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/743f508e-dd6d-4c3b-b8dd-c12eeb381cd7-log-httpd\") pod \"ceilometer-0\" (UID: \"743f508e-dd6d-4c3b-b8dd-c12eeb381cd7\") " pod="openstack/ceilometer-0" Mar 09 10:01:44 crc kubenswrapper[4792]: I0309 10:01:44.111045 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/743f508e-dd6d-4c3b-b8dd-c12eeb381cd7-run-httpd\") pod \"ceilometer-0\" (UID: \"743f508e-dd6d-4c3b-b8dd-c12eeb381cd7\") " pod="openstack/ceilometer-0" Mar 09 10:01:44 crc kubenswrapper[4792]: I0309 10:01:44.111115 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/743f508e-dd6d-4c3b-b8dd-c12eeb381cd7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"743f508e-dd6d-4c3b-b8dd-c12eeb381cd7\") " pod="openstack/ceilometer-0" Mar 09 10:01:44 crc kubenswrapper[4792]: I0309 10:01:44.111249 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/743f508e-dd6d-4c3b-b8dd-c12eeb381cd7-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"743f508e-dd6d-4c3b-b8dd-c12eeb381cd7\") " pod="openstack/ceilometer-0" Mar 09 10:01:44 crc kubenswrapper[4792]: I0309 10:01:44.111289 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rq5pl\" (UniqueName: \"kubernetes.io/projected/743f508e-dd6d-4c3b-b8dd-c12eeb381cd7-kube-api-access-rq5pl\") pod \"ceilometer-0\" (UID: \"743f508e-dd6d-4c3b-b8dd-c12eeb381cd7\") " pod="openstack/ceilometer-0" Mar 09 10:01:44 crc kubenswrapper[4792]: I0309 10:01:44.111327 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/743f508e-dd6d-4c3b-b8dd-c12eeb381cd7-scripts\") pod \"ceilometer-0\" (UID: \"743f508e-dd6d-4c3b-b8dd-c12eeb381cd7\") " pod="openstack/ceilometer-0" Mar 09 10:01:44 crc kubenswrapper[4792]: I0309 10:01:44.111401 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/743f508e-dd6d-4c3b-b8dd-c12eeb381cd7-config-data\") pod \"ceilometer-0\" (UID: \"743f508e-dd6d-4c3b-b8dd-c12eeb381cd7\") " pod="openstack/ceilometer-0" Mar 09 10:01:44 crc kubenswrapper[4792]: I0309 10:01:44.212926 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/743f508e-dd6d-4c3b-b8dd-c12eeb381cd7-combined-ca-bundle\") pod \"ceilometer-0\" 
(UID: \"743f508e-dd6d-4c3b-b8dd-c12eeb381cd7\") " pod="openstack/ceilometer-0" Mar 09 10:01:44 crc kubenswrapper[4792]: I0309 10:01:44.212988 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rq5pl\" (UniqueName: \"kubernetes.io/projected/743f508e-dd6d-4c3b-b8dd-c12eeb381cd7-kube-api-access-rq5pl\") pod \"ceilometer-0\" (UID: \"743f508e-dd6d-4c3b-b8dd-c12eeb381cd7\") " pod="openstack/ceilometer-0" Mar 09 10:01:44 crc kubenswrapper[4792]: I0309 10:01:44.213019 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/743f508e-dd6d-4c3b-b8dd-c12eeb381cd7-scripts\") pod \"ceilometer-0\" (UID: \"743f508e-dd6d-4c3b-b8dd-c12eeb381cd7\") " pod="openstack/ceilometer-0" Mar 09 10:01:44 crc kubenswrapper[4792]: I0309 10:01:44.213058 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/743f508e-dd6d-4c3b-b8dd-c12eeb381cd7-config-data\") pod \"ceilometer-0\" (UID: \"743f508e-dd6d-4c3b-b8dd-c12eeb381cd7\") " pod="openstack/ceilometer-0" Mar 09 10:01:44 crc kubenswrapper[4792]: I0309 10:01:44.213125 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/743f508e-dd6d-4c3b-b8dd-c12eeb381cd7-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"743f508e-dd6d-4c3b-b8dd-c12eeb381cd7\") " pod="openstack/ceilometer-0" Mar 09 10:01:44 crc kubenswrapper[4792]: I0309 10:01:44.213166 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/743f508e-dd6d-4c3b-b8dd-c12eeb381cd7-log-httpd\") pod \"ceilometer-0\" (UID: \"743f508e-dd6d-4c3b-b8dd-c12eeb381cd7\") " pod="openstack/ceilometer-0" Mar 09 10:01:44 crc kubenswrapper[4792]: I0309 10:01:44.213239 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/743f508e-dd6d-4c3b-b8dd-c12eeb381cd7-run-httpd\") pod \"ceilometer-0\" (UID: \"743f508e-dd6d-4c3b-b8dd-c12eeb381cd7\") " pod="openstack/ceilometer-0" Mar 09 10:01:44 crc kubenswrapper[4792]: I0309 10:01:44.213262 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/743f508e-dd6d-4c3b-b8dd-c12eeb381cd7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"743f508e-dd6d-4c3b-b8dd-c12eeb381cd7\") " pod="openstack/ceilometer-0" Mar 09 10:01:44 crc kubenswrapper[4792]: I0309 10:01:44.214598 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/743f508e-dd6d-4c3b-b8dd-c12eeb381cd7-log-httpd\") pod \"ceilometer-0\" (UID: \"743f508e-dd6d-4c3b-b8dd-c12eeb381cd7\") " pod="openstack/ceilometer-0" Mar 09 10:01:44 crc kubenswrapper[4792]: I0309 10:01:44.214640 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/743f508e-dd6d-4c3b-b8dd-c12eeb381cd7-run-httpd\") pod \"ceilometer-0\" (UID: \"743f508e-dd6d-4c3b-b8dd-c12eeb381cd7\") " pod="openstack/ceilometer-0" Mar 09 10:01:44 crc kubenswrapper[4792]: I0309 10:01:44.219450 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/743f508e-dd6d-4c3b-b8dd-c12eeb381cd7-scripts\") pod \"ceilometer-0\" (UID: \"743f508e-dd6d-4c3b-b8dd-c12eeb381cd7\") " pod="openstack/ceilometer-0" Mar 09 10:01:44 crc kubenswrapper[4792]: I0309 10:01:44.220291 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/743f508e-dd6d-4c3b-b8dd-c12eeb381cd7-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"743f508e-dd6d-4c3b-b8dd-c12eeb381cd7\") " pod="openstack/ceilometer-0" Mar 09 10:01:44 crc kubenswrapper[4792]: I0309 10:01:44.220403 4792 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/743f508e-dd6d-4c3b-b8dd-c12eeb381cd7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"743f508e-dd6d-4c3b-b8dd-c12eeb381cd7\") " pod="openstack/ceilometer-0" Mar 09 10:01:44 crc kubenswrapper[4792]: I0309 10:01:44.221288 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/743f508e-dd6d-4c3b-b8dd-c12eeb381cd7-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"743f508e-dd6d-4c3b-b8dd-c12eeb381cd7\") " pod="openstack/ceilometer-0" Mar 09 10:01:44 crc kubenswrapper[4792]: I0309 10:01:44.229482 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/743f508e-dd6d-4c3b-b8dd-c12eeb381cd7-config-data\") pod \"ceilometer-0\" (UID: \"743f508e-dd6d-4c3b-b8dd-c12eeb381cd7\") " pod="openstack/ceilometer-0" Mar 09 10:01:44 crc kubenswrapper[4792]: I0309 10:01:44.250327 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rq5pl\" (UniqueName: \"kubernetes.io/projected/743f508e-dd6d-4c3b-b8dd-c12eeb381cd7-kube-api-access-rq5pl\") pod \"ceilometer-0\" (UID: \"743f508e-dd6d-4c3b-b8dd-c12eeb381cd7\") " pod="openstack/ceilometer-0" Mar 09 10:01:44 crc kubenswrapper[4792]: I0309 10:01:44.495029 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"dced9726-c642-4e64-a963-9ce584e76e9c","Type":"ContainerStarted","Data":"3ce5e6ca13d5fcba36b4c648c207fb5a0bbdb01ba09c50b76338a7dae2637cdb"} Mar 09 10:01:44 crc kubenswrapper[4792]: I0309 10:01:44.495946 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"dced9726-c642-4e64-a963-9ce584e76e9c","Type":"ContainerStarted","Data":"87d738f5fa709dd48ff4741a81c396d9f1191cf48c387654a3e1403a209d70df"} Mar 09 10:01:44 crc kubenswrapper[4792]: 
I0309 10:01:44.532908 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-share-share1-0" podStartSLOduration=4.278678847 podStartE2EDuration="20.532888183s" podCreationTimestamp="2026-03-09 10:01:24 +0000 UTC" firstStartedPulling="2026-03-09 10:01:26.490358684 +0000 UTC m=+3251.520559436" lastFinishedPulling="2026-03-09 10:01:42.74456802 +0000 UTC m=+3267.774768772" observedRunningTime="2026-03-09 10:01:44.524427788 +0000 UTC m=+3269.554628560" watchObservedRunningTime="2026-03-09 10:01:44.532888183 +0000 UTC m=+3269.563088925" Mar 09 10:01:44 crc kubenswrapper[4792]: I0309 10:01:44.540828 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 09 10:01:45 crc kubenswrapper[4792]: I0309 10:01:45.284346 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-share-share1-0" Mar 09 10:01:45 crc kubenswrapper[4792]: I0309 10:01:45.606855 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 09 10:01:45 crc kubenswrapper[4792]: I0309 10:01:45.675369 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="50893660-98fd-4a26-a00a-2fcc6a2c3e51" path="/var/lib/kubelet/pods/50893660-98fd-4a26-a00a-2fcc6a2c3e51/volumes" Mar 09 10:01:46 crc kubenswrapper[4792]: I0309 10:01:46.518943 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"743f508e-dd6d-4c3b-b8dd-c12eeb381cd7","Type":"ContainerStarted","Data":"05a0fdaab592c166fd65b5c19da42bb340aa5a8efa669fda8c54e85d339b1d7a"} Mar 09 10:01:47 crc kubenswrapper[4792]: I0309 10:01:47.198271 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-scheduler-0" Mar 09 10:01:47 crc kubenswrapper[4792]: I0309 10:01:47.277050 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-scheduler-0"] Mar 09 10:01:47 crc kubenswrapper[4792]: I0309 
10:01:47.535542 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-scheduler-0" podUID="9e8c6802-54a3-4aed-9476-3f10a2369eea" containerName="manila-scheduler" containerID="cri-o://c27ff3c013f9905435e54bd8dcc244d2c50448448f6724eedcab33feba520041" gracePeriod=30 Mar 09 10:01:47 crc kubenswrapper[4792]: I0309 10:01:47.535990 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"743f508e-dd6d-4c3b-b8dd-c12eeb381cd7","Type":"ContainerStarted","Data":"0936287e8f34e119488e35b6379c8d790360818af95d86448e817ca341a9d49d"} Mar 09 10:01:47 crc kubenswrapper[4792]: I0309 10:01:47.536019 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"743f508e-dd6d-4c3b-b8dd-c12eeb381cd7","Type":"ContainerStarted","Data":"77a33a91989c03f8ee4bc9dfe799282562fcdc8a3a12a84e791885928ce477a5"} Mar 09 10:01:47 crc kubenswrapper[4792]: I0309 10:01:47.536480 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-scheduler-0" podUID="9e8c6802-54a3-4aed-9476-3f10a2369eea" containerName="probe" containerID="cri-o://31082dbf96fbfee14b67f6cc5e39ad4fc21613b7aa64a75f6a02c8ba914bfcff" gracePeriod=30 Mar 09 10:01:48 crc kubenswrapper[4792]: I0309 10:01:48.547584 4792 generic.go:334] "Generic (PLEG): container finished" podID="9e8c6802-54a3-4aed-9476-3f10a2369eea" containerID="31082dbf96fbfee14b67f6cc5e39ad4fc21613b7aa64a75f6a02c8ba914bfcff" exitCode=0 Mar 09 10:01:48 crc kubenswrapper[4792]: I0309 10:01:48.547808 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"9e8c6802-54a3-4aed-9476-3f10a2369eea","Type":"ContainerDied","Data":"31082dbf96fbfee14b67f6cc5e39ad4fc21613b7aa64a75f6a02c8ba914bfcff"} Mar 09 10:01:48 crc kubenswrapper[4792]: I0309 10:01:48.554012 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"743f508e-dd6d-4c3b-b8dd-c12eeb381cd7","Type":"ContainerStarted","Data":"dd6f15399c20c6037a01daa7e9674f73b63c6bc03061655e2b7268ac10df27be"} Mar 09 10:01:49 crc kubenswrapper[4792]: I0309 10:01:49.350526 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-scheduler-0" Mar 09 10:01:49 crc kubenswrapper[4792]: I0309 10:01:49.544952 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9e8c6802-54a3-4aed-9476-3f10a2369eea-etc-machine-id\") pod \"9e8c6802-54a3-4aed-9476-3f10a2369eea\" (UID: \"9e8c6802-54a3-4aed-9476-3f10a2369eea\") " Mar 09 10:01:49 crc kubenswrapper[4792]: I0309 10:01:49.545026 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9e8c6802-54a3-4aed-9476-3f10a2369eea-config-data\") pod \"9e8c6802-54a3-4aed-9476-3f10a2369eea\" (UID: \"9e8c6802-54a3-4aed-9476-3f10a2369eea\") " Mar 09 10:01:49 crc kubenswrapper[4792]: I0309 10:01:49.545150 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9e8c6802-54a3-4aed-9476-3f10a2369eea-scripts\") pod \"9e8c6802-54a3-4aed-9476-3f10a2369eea\" (UID: \"9e8c6802-54a3-4aed-9476-3f10a2369eea\") " Mar 09 10:01:49 crc kubenswrapper[4792]: I0309 10:01:49.545236 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e8c6802-54a3-4aed-9476-3f10a2369eea-combined-ca-bundle\") pod \"9e8c6802-54a3-4aed-9476-3f10a2369eea\" (UID: \"9e8c6802-54a3-4aed-9476-3f10a2369eea\") " Mar 09 10:01:49 crc kubenswrapper[4792]: I0309 10:01:49.545342 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4nr2h\" (UniqueName: \"kubernetes.io/projected/9e8c6802-54a3-4aed-9476-3f10a2369eea-kube-api-access-4nr2h\") 
pod \"9e8c6802-54a3-4aed-9476-3f10a2369eea\" (UID: \"9e8c6802-54a3-4aed-9476-3f10a2369eea\") " Mar 09 10:01:49 crc kubenswrapper[4792]: I0309 10:01:49.545425 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9e8c6802-54a3-4aed-9476-3f10a2369eea-config-data-custom\") pod \"9e8c6802-54a3-4aed-9476-3f10a2369eea\" (UID: \"9e8c6802-54a3-4aed-9476-3f10a2369eea\") " Mar 09 10:01:49 crc kubenswrapper[4792]: I0309 10:01:49.547233 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9e8c6802-54a3-4aed-9476-3f10a2369eea-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "9e8c6802-54a3-4aed-9476-3f10a2369eea" (UID: "9e8c6802-54a3-4aed-9476-3f10a2369eea"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 10:01:49 crc kubenswrapper[4792]: I0309 10:01:49.553699 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e8c6802-54a3-4aed-9476-3f10a2369eea-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "9e8c6802-54a3-4aed-9476-3f10a2369eea" (UID: "9e8c6802-54a3-4aed-9476-3f10a2369eea"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 10:01:49 crc kubenswrapper[4792]: I0309 10:01:49.554627 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e8c6802-54a3-4aed-9476-3f10a2369eea-scripts" (OuterVolumeSpecName: "scripts") pod "9e8c6802-54a3-4aed-9476-3f10a2369eea" (UID: "9e8c6802-54a3-4aed-9476-3f10a2369eea"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 10:01:49 crc kubenswrapper[4792]: I0309 10:01:49.564361 4792 generic.go:334] "Generic (PLEG): container finished" podID="9e8c6802-54a3-4aed-9476-3f10a2369eea" containerID="c27ff3c013f9905435e54bd8dcc244d2c50448448f6724eedcab33feba520041" exitCode=0 Mar 09 10:01:49 crc kubenswrapper[4792]: I0309 10:01:49.564415 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"9e8c6802-54a3-4aed-9476-3f10a2369eea","Type":"ContainerDied","Data":"c27ff3c013f9905435e54bd8dcc244d2c50448448f6724eedcab33feba520041"} Mar 09 10:01:49 crc kubenswrapper[4792]: I0309 10:01:49.564430 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-scheduler-0" Mar 09 10:01:49 crc kubenswrapper[4792]: I0309 10:01:49.564462 4792 scope.go:117] "RemoveContainer" containerID="31082dbf96fbfee14b67f6cc5e39ad4fc21613b7aa64a75f6a02c8ba914bfcff" Mar 09 10:01:49 crc kubenswrapper[4792]: I0309 10:01:49.564448 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"9e8c6802-54a3-4aed-9476-3f10a2369eea","Type":"ContainerDied","Data":"a4fe9c3d68e2d3b3348c06a6e5ef2b65662d9fd50cbd5ea5fd7ecdfedce66335"} Mar 09 10:01:49 crc kubenswrapper[4792]: I0309 10:01:49.573423 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e8c6802-54a3-4aed-9476-3f10a2369eea-kube-api-access-4nr2h" (OuterVolumeSpecName: "kube-api-access-4nr2h") pod "9e8c6802-54a3-4aed-9476-3f10a2369eea" (UID: "9e8c6802-54a3-4aed-9476-3f10a2369eea"). InnerVolumeSpecName "kube-api-access-4nr2h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 10:01:49 crc kubenswrapper[4792]: I0309 10:01:49.647855 4792 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9e8c6802-54a3-4aed-9476-3f10a2369eea-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 09 10:01:49 crc kubenswrapper[4792]: I0309 10:01:49.648104 4792 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9e8c6802-54a3-4aed-9476-3f10a2369eea-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 09 10:01:49 crc kubenswrapper[4792]: I0309 10:01:49.648228 4792 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9e8c6802-54a3-4aed-9476-3f10a2369eea-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 10:01:49 crc kubenswrapper[4792]: I0309 10:01:49.648302 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4nr2h\" (UniqueName: \"kubernetes.io/projected/9e8c6802-54a3-4aed-9476-3f10a2369eea-kube-api-access-4nr2h\") on node \"crc\" DevicePath \"\"" Mar 09 10:01:49 crc kubenswrapper[4792]: I0309 10:01:49.694852 4792 scope.go:117] "RemoveContainer" containerID="c27ff3c013f9905435e54bd8dcc244d2c50448448f6724eedcab33feba520041" Mar 09 10:01:49 crc kubenswrapper[4792]: I0309 10:01:49.696761 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e8c6802-54a3-4aed-9476-3f10a2369eea-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9e8c6802-54a3-4aed-9476-3f10a2369eea" (UID: "9e8c6802-54a3-4aed-9476-3f10a2369eea"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 10:01:49 crc kubenswrapper[4792]: I0309 10:01:49.754712 4792 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e8c6802-54a3-4aed-9476-3f10a2369eea-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 10:01:49 crc kubenswrapper[4792]: I0309 10:01:49.805281 4792 scope.go:117] "RemoveContainer" containerID="31082dbf96fbfee14b67f6cc5e39ad4fc21613b7aa64a75f6a02c8ba914bfcff" Mar 09 10:01:49 crc kubenswrapper[4792]: E0309 10:01:49.806355 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"31082dbf96fbfee14b67f6cc5e39ad4fc21613b7aa64a75f6a02c8ba914bfcff\": container with ID starting with 31082dbf96fbfee14b67f6cc5e39ad4fc21613b7aa64a75f6a02c8ba914bfcff not found: ID does not exist" containerID="31082dbf96fbfee14b67f6cc5e39ad4fc21613b7aa64a75f6a02c8ba914bfcff" Mar 09 10:01:49 crc kubenswrapper[4792]: I0309 10:01:49.806389 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"31082dbf96fbfee14b67f6cc5e39ad4fc21613b7aa64a75f6a02c8ba914bfcff"} err="failed to get container status \"31082dbf96fbfee14b67f6cc5e39ad4fc21613b7aa64a75f6a02c8ba914bfcff\": rpc error: code = NotFound desc = could not find container \"31082dbf96fbfee14b67f6cc5e39ad4fc21613b7aa64a75f6a02c8ba914bfcff\": container with ID starting with 31082dbf96fbfee14b67f6cc5e39ad4fc21613b7aa64a75f6a02c8ba914bfcff not found: ID does not exist" Mar 09 10:01:49 crc kubenswrapper[4792]: I0309 10:01:49.806408 4792 scope.go:117] "RemoveContainer" containerID="c27ff3c013f9905435e54bd8dcc244d2c50448448f6724eedcab33feba520041" Mar 09 10:01:49 crc kubenswrapper[4792]: E0309 10:01:49.806621 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"c27ff3c013f9905435e54bd8dcc244d2c50448448f6724eedcab33feba520041\": container with ID starting with c27ff3c013f9905435e54bd8dcc244d2c50448448f6724eedcab33feba520041 not found: ID does not exist" containerID="c27ff3c013f9905435e54bd8dcc244d2c50448448f6724eedcab33feba520041" Mar 09 10:01:49 crc kubenswrapper[4792]: I0309 10:01:49.806648 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c27ff3c013f9905435e54bd8dcc244d2c50448448f6724eedcab33feba520041"} err="failed to get container status \"c27ff3c013f9905435e54bd8dcc244d2c50448448f6724eedcab33feba520041\": rpc error: code = NotFound desc = could not find container \"c27ff3c013f9905435e54bd8dcc244d2c50448448f6724eedcab33feba520041\": container with ID starting with c27ff3c013f9905435e54bd8dcc244d2c50448448f6724eedcab33feba520041 not found: ID does not exist" Mar 09 10:01:49 crc kubenswrapper[4792]: I0309 10:01:49.833097 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e8c6802-54a3-4aed-9476-3f10a2369eea-config-data" (OuterVolumeSpecName: "config-data") pod "9e8c6802-54a3-4aed-9476-3f10a2369eea" (UID: "9e8c6802-54a3-4aed-9476-3f10a2369eea"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 10:01:49 crc kubenswrapper[4792]: I0309 10:01:49.856775 4792 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9e8c6802-54a3-4aed-9476-3f10a2369eea-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 10:01:49 crc kubenswrapper[4792]: I0309 10:01:49.914053 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-scheduler-0"] Mar 09 10:01:49 crc kubenswrapper[4792]: I0309 10:01:49.923492 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-scheduler-0"] Mar 09 10:01:49 crc kubenswrapper[4792]: I0309 10:01:49.949194 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-scheduler-0"] Mar 09 10:01:49 crc kubenswrapper[4792]: E0309 10:01:49.949751 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e8c6802-54a3-4aed-9476-3f10a2369eea" containerName="manila-scheduler" Mar 09 10:01:49 crc kubenswrapper[4792]: I0309 10:01:49.949768 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e8c6802-54a3-4aed-9476-3f10a2369eea" containerName="manila-scheduler" Mar 09 10:01:49 crc kubenswrapper[4792]: E0309 10:01:49.949784 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e8c6802-54a3-4aed-9476-3f10a2369eea" containerName="probe" Mar 09 10:01:49 crc kubenswrapper[4792]: I0309 10:01:49.949794 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e8c6802-54a3-4aed-9476-3f10a2369eea" containerName="probe" Mar 09 10:01:49 crc kubenswrapper[4792]: I0309 10:01:49.950191 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="9e8c6802-54a3-4aed-9476-3f10a2369eea" containerName="manila-scheduler" Mar 09 10:01:49 crc kubenswrapper[4792]: I0309 10:01:49.950218 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="9e8c6802-54a3-4aed-9476-3f10a2369eea" containerName="probe" Mar 09 10:01:49 crc kubenswrapper[4792]: I0309 
10:01:49.951961 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-scheduler-0" Mar 09 10:01:49 crc kubenswrapper[4792]: I0309 10:01:49.958167 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-scheduler-config-data" Mar 09 10:01:49 crc kubenswrapper[4792]: I0309 10:01:49.966987 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87ea7216-a1e4-47b3-8303-d2af1a68f974-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"87ea7216-a1e4-47b3-8303-d2af1a68f974\") " pod="openstack/manila-scheduler-0" Mar 09 10:01:49 crc kubenswrapper[4792]: I0309 10:01:49.967033 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/87ea7216-a1e4-47b3-8303-d2af1a68f974-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"87ea7216-a1e4-47b3-8303-d2af1a68f974\") " pod="openstack/manila-scheduler-0" Mar 09 10:01:49 crc kubenswrapper[4792]: I0309 10:01:49.967088 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sp4vc\" (UniqueName: \"kubernetes.io/projected/87ea7216-a1e4-47b3-8303-d2af1a68f974-kube-api-access-sp4vc\") pod \"manila-scheduler-0\" (UID: \"87ea7216-a1e4-47b3-8303-d2af1a68f974\") " pod="openstack/manila-scheduler-0" Mar 09 10:01:49 crc kubenswrapper[4792]: I0309 10:01:49.967374 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/87ea7216-a1e4-47b3-8303-d2af1a68f974-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"87ea7216-a1e4-47b3-8303-d2af1a68f974\") " pod="openstack/manila-scheduler-0" Mar 09 10:01:49 crc kubenswrapper[4792]: I0309 10:01:49.967539 4792 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/87ea7216-a1e4-47b3-8303-d2af1a68f974-scripts\") pod \"manila-scheduler-0\" (UID: \"87ea7216-a1e4-47b3-8303-d2af1a68f974\") " pod="openstack/manila-scheduler-0" Mar 09 10:01:49 crc kubenswrapper[4792]: I0309 10:01:49.967672 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87ea7216-a1e4-47b3-8303-d2af1a68f974-config-data\") pod \"manila-scheduler-0\" (UID: \"87ea7216-a1e4-47b3-8303-d2af1a68f974\") " pod="openstack/manila-scheduler-0" Mar 09 10:01:49 crc kubenswrapper[4792]: I0309 10:01:49.984244 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"] Mar 09 10:01:50 crc kubenswrapper[4792]: I0309 10:01:50.070016 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/87ea7216-a1e4-47b3-8303-d2af1a68f974-scripts\") pod \"manila-scheduler-0\" (UID: \"87ea7216-a1e4-47b3-8303-d2af1a68f974\") " pod="openstack/manila-scheduler-0" Mar 09 10:01:50 crc kubenswrapper[4792]: I0309 10:01:50.070154 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87ea7216-a1e4-47b3-8303-d2af1a68f974-config-data\") pod \"manila-scheduler-0\" (UID: \"87ea7216-a1e4-47b3-8303-d2af1a68f974\") " pod="openstack/manila-scheduler-0" Mar 09 10:01:50 crc kubenswrapper[4792]: I0309 10:01:50.070204 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87ea7216-a1e4-47b3-8303-d2af1a68f974-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"87ea7216-a1e4-47b3-8303-d2af1a68f974\") " pod="openstack/manila-scheduler-0" Mar 09 10:01:50 crc kubenswrapper[4792]: I0309 10:01:50.070226 4792 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/87ea7216-a1e4-47b3-8303-d2af1a68f974-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"87ea7216-a1e4-47b3-8303-d2af1a68f974\") " pod="openstack/manila-scheduler-0" Mar 09 10:01:50 crc kubenswrapper[4792]: I0309 10:01:50.070253 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sp4vc\" (UniqueName: \"kubernetes.io/projected/87ea7216-a1e4-47b3-8303-d2af1a68f974-kube-api-access-sp4vc\") pod \"manila-scheduler-0\" (UID: \"87ea7216-a1e4-47b3-8303-d2af1a68f974\") " pod="openstack/manila-scheduler-0" Mar 09 10:01:50 crc kubenswrapper[4792]: I0309 10:01:50.070380 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/87ea7216-a1e4-47b3-8303-d2af1a68f974-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"87ea7216-a1e4-47b3-8303-d2af1a68f974\") " pod="openstack/manila-scheduler-0" Mar 09 10:01:50 crc kubenswrapper[4792]: I0309 10:01:50.070518 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/87ea7216-a1e4-47b3-8303-d2af1a68f974-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"87ea7216-a1e4-47b3-8303-d2af1a68f974\") " pod="openstack/manila-scheduler-0" Mar 09 10:01:50 crc kubenswrapper[4792]: I0309 10:01:50.074879 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/87ea7216-a1e4-47b3-8303-d2af1a68f974-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"87ea7216-a1e4-47b3-8303-d2af1a68f974\") " pod="openstack/manila-scheduler-0" Mar 09 10:01:50 crc kubenswrapper[4792]: I0309 10:01:50.076597 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/87ea7216-a1e4-47b3-8303-d2af1a68f974-scripts\") pod 
\"manila-scheduler-0\" (UID: \"87ea7216-a1e4-47b3-8303-d2af1a68f974\") " pod="openstack/manila-scheduler-0" Mar 09 10:01:50 crc kubenswrapper[4792]: I0309 10:01:50.078475 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87ea7216-a1e4-47b3-8303-d2af1a68f974-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"87ea7216-a1e4-47b3-8303-d2af1a68f974\") " pod="openstack/manila-scheduler-0" Mar 09 10:01:50 crc kubenswrapper[4792]: I0309 10:01:50.078713 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87ea7216-a1e4-47b3-8303-d2af1a68f974-config-data\") pod \"manila-scheduler-0\" (UID: \"87ea7216-a1e4-47b3-8303-d2af1a68f974\") " pod="openstack/manila-scheduler-0" Mar 09 10:01:50 crc kubenswrapper[4792]: I0309 10:01:50.088756 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sp4vc\" (UniqueName: \"kubernetes.io/projected/87ea7216-a1e4-47b3-8303-d2af1a68f974-kube-api-access-sp4vc\") pod \"manila-scheduler-0\" (UID: \"87ea7216-a1e4-47b3-8303-d2af1a68f974\") " pod="openstack/manila-scheduler-0" Mar 09 10:01:50 crc kubenswrapper[4792]: I0309 10:01:50.273725 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-scheduler-0" Mar 09 10:01:50 crc kubenswrapper[4792]: I0309 10:01:50.871625 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"] Mar 09 10:01:50 crc kubenswrapper[4792]: W0309 10:01:50.878976 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod87ea7216_a1e4_47b3_8303_d2af1a68f974.slice/crio-fa2ec0efb2cb401ac9be06a1cfca2fa075585c2a05c30575f08e38d9c35062a4 WatchSource:0}: Error finding container fa2ec0efb2cb401ac9be06a1cfca2fa075585c2a05c30575f08e38d9c35062a4: Status 404 returned error can't find the container with id fa2ec0efb2cb401ac9be06a1cfca2fa075585c2a05c30575f08e38d9c35062a4 Mar 09 10:01:51 crc kubenswrapper[4792]: I0309 10:01:51.650518 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"87ea7216-a1e4-47b3-8303-d2af1a68f974","Type":"ContainerStarted","Data":"7b4b9f5a09252c385187140a2ecdcaf54da97b9d39500d7c04bd274f97545bab"} Mar 09 10:01:51 crc kubenswrapper[4792]: I0309 10:01:51.652300 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"87ea7216-a1e4-47b3-8303-d2af1a68f974","Type":"ContainerStarted","Data":"fa2ec0efb2cb401ac9be06a1cfca2fa075585c2a05c30575f08e38d9c35062a4"} Mar 09 10:01:51 crc kubenswrapper[4792]: I0309 10:01:51.681316 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9e8c6802-54a3-4aed-9476-3f10a2369eea" path="/var/lib/kubelet/pods/9e8c6802-54a3-4aed-9476-3f10a2369eea/volumes" Mar 09 10:01:51 crc kubenswrapper[4792]: I0309 10:01:51.692920 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 09 10:01:51 crc kubenswrapper[4792]: I0309 10:01:51.692966 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"743f508e-dd6d-4c3b-b8dd-c12eeb381cd7","Type":"ContainerStarted","Data":"479d1847d3572358a2e5123a44f0f12b6c11f923e2ce1c146533716d83a1efe2"} Mar 09 10:01:51 crc kubenswrapper[4792]: I0309 10:01:51.715491 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=4.052472975 podStartE2EDuration="8.71547222s" podCreationTimestamp="2026-03-09 10:01:43 +0000 UTC" firstStartedPulling="2026-03-09 10:01:45.603682587 +0000 UTC m=+3270.633883339" lastFinishedPulling="2026-03-09 10:01:50.266681832 +0000 UTC m=+3275.296882584" observedRunningTime="2026-03-09 10:01:51.707409145 +0000 UTC m=+3276.737609897" watchObservedRunningTime="2026-03-09 10:01:51.71547222 +0000 UTC m=+3276.745672972" Mar 09 10:01:52 crc kubenswrapper[4792]: I0309 10:01:52.717923 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"87ea7216-a1e4-47b3-8303-d2af1a68f974","Type":"ContainerStarted","Data":"b07dbf406920d2292e0b49149282218bd7ab4aef60341a8e68fa9f9f80d2a34b"} Mar 09 10:01:52 crc kubenswrapper[4792]: I0309 10:01:52.825469 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-54c85f748d-wxdlf" Mar 09 10:01:52 crc kubenswrapper[4792]: I0309 10:01:52.825526 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-54c85f748d-wxdlf" Mar 09 10:01:54 crc kubenswrapper[4792]: I0309 10:01:54.105564 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/manila-api-0" Mar 09 10:01:54 crc kubenswrapper[4792]: I0309 10:01:54.125489 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-scheduler-0" podStartSLOduration=5.125468614 podStartE2EDuration="5.125468614s" podCreationTimestamp="2026-03-09 10:01:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-03-09 10:01:52.758432376 +0000 UTC m=+3277.788633128" watchObservedRunningTime="2026-03-09 10:01:54.125468614 +0000 UTC m=+3279.155669366" Mar 09 10:01:54 crc kubenswrapper[4792]: I0309 10:01:54.583393 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 09 10:01:54 crc kubenswrapper[4792]: I0309 10:01:54.583808 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="743f508e-dd6d-4c3b-b8dd-c12eeb381cd7" containerName="sg-core" containerID="cri-o://dd6f15399c20c6037a01daa7e9674f73b63c6bc03061655e2b7268ac10df27be" gracePeriod=30 Mar 09 10:01:54 crc kubenswrapper[4792]: I0309 10:01:54.583842 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="743f508e-dd6d-4c3b-b8dd-c12eeb381cd7" containerName="ceilometer-notification-agent" containerID="cri-o://0936287e8f34e119488e35b6379c8d790360818af95d86448e817ca341a9d49d" gracePeriod=30 Mar 09 10:01:54 crc kubenswrapper[4792]: I0309 10:01:54.583885 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="743f508e-dd6d-4c3b-b8dd-c12eeb381cd7" containerName="proxy-httpd" containerID="cri-o://479d1847d3572358a2e5123a44f0f12b6c11f923e2ce1c146533716d83a1efe2" gracePeriod=30 Mar 09 10:01:54 crc kubenswrapper[4792]: I0309 10:01:54.583842 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="743f508e-dd6d-4c3b-b8dd-c12eeb381cd7" containerName="ceilometer-central-agent" containerID="cri-o://77a33a91989c03f8ee4bc9dfe799282562fcdc8a3a12a84e791885928ce477a5" gracePeriod=30 Mar 09 10:01:54 crc kubenswrapper[4792]: I0309 10:01:54.764753 4792 generic.go:334] "Generic (PLEG): container finished" podID="743f508e-dd6d-4c3b-b8dd-c12eeb381cd7" containerID="dd6f15399c20c6037a01daa7e9674f73b63c6bc03061655e2b7268ac10df27be" exitCode=2 Mar 09 10:01:54 crc 
kubenswrapper[4792]: I0309 10:01:54.765093 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"743f508e-dd6d-4c3b-b8dd-c12eeb381cd7","Type":"ContainerDied","Data":"dd6f15399c20c6037a01daa7e9674f73b63c6bc03061655e2b7268ac10df27be"} Mar 09 10:01:55 crc kubenswrapper[4792]: I0309 10:01:55.470549 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 09 10:01:55 crc kubenswrapper[4792]: I0309 10:01:55.536740 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/743f508e-dd6d-4c3b-b8dd-c12eeb381cd7-config-data\") pod \"743f508e-dd6d-4c3b-b8dd-c12eeb381cd7\" (UID: \"743f508e-dd6d-4c3b-b8dd-c12eeb381cd7\") " Mar 09 10:01:55 crc kubenswrapper[4792]: I0309 10:01:55.536806 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/743f508e-dd6d-4c3b-b8dd-c12eeb381cd7-combined-ca-bundle\") pod \"743f508e-dd6d-4c3b-b8dd-c12eeb381cd7\" (UID: \"743f508e-dd6d-4c3b-b8dd-c12eeb381cd7\") " Mar 09 10:01:55 crc kubenswrapper[4792]: I0309 10:01:55.536863 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/743f508e-dd6d-4c3b-b8dd-c12eeb381cd7-log-httpd\") pod \"743f508e-dd6d-4c3b-b8dd-c12eeb381cd7\" (UID: \"743f508e-dd6d-4c3b-b8dd-c12eeb381cd7\") " Mar 09 10:01:55 crc kubenswrapper[4792]: I0309 10:01:55.536899 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/743f508e-dd6d-4c3b-b8dd-c12eeb381cd7-ceilometer-tls-certs\") pod \"743f508e-dd6d-4c3b-b8dd-c12eeb381cd7\" (UID: \"743f508e-dd6d-4c3b-b8dd-c12eeb381cd7\") " Mar 09 10:01:55 crc kubenswrapper[4792]: I0309 10:01:55.536920 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/743f508e-dd6d-4c3b-b8dd-c12eeb381cd7-sg-core-conf-yaml\") pod \"743f508e-dd6d-4c3b-b8dd-c12eeb381cd7\" (UID: \"743f508e-dd6d-4c3b-b8dd-c12eeb381cd7\") " Mar 09 10:01:55 crc kubenswrapper[4792]: I0309 10:01:55.537010 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/743f508e-dd6d-4c3b-b8dd-c12eeb381cd7-run-httpd\") pod \"743f508e-dd6d-4c3b-b8dd-c12eeb381cd7\" (UID: \"743f508e-dd6d-4c3b-b8dd-c12eeb381cd7\") " Mar 09 10:01:55 crc kubenswrapper[4792]: I0309 10:01:55.537140 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rq5pl\" (UniqueName: \"kubernetes.io/projected/743f508e-dd6d-4c3b-b8dd-c12eeb381cd7-kube-api-access-rq5pl\") pod \"743f508e-dd6d-4c3b-b8dd-c12eeb381cd7\" (UID: \"743f508e-dd6d-4c3b-b8dd-c12eeb381cd7\") " Mar 09 10:01:55 crc kubenswrapper[4792]: I0309 10:01:55.537236 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/743f508e-dd6d-4c3b-b8dd-c12eeb381cd7-scripts\") pod \"743f508e-dd6d-4c3b-b8dd-c12eeb381cd7\" (UID: \"743f508e-dd6d-4c3b-b8dd-c12eeb381cd7\") " Mar 09 10:01:55 crc kubenswrapper[4792]: I0309 10:01:55.538444 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/743f508e-dd6d-4c3b-b8dd-c12eeb381cd7-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "743f508e-dd6d-4c3b-b8dd-c12eeb381cd7" (UID: "743f508e-dd6d-4c3b-b8dd-c12eeb381cd7"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 10:01:55 crc kubenswrapper[4792]: I0309 10:01:55.539122 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/743f508e-dd6d-4c3b-b8dd-c12eeb381cd7-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "743f508e-dd6d-4c3b-b8dd-c12eeb381cd7" (UID: "743f508e-dd6d-4c3b-b8dd-c12eeb381cd7"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 10:01:55 crc kubenswrapper[4792]: I0309 10:01:55.546338 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/743f508e-dd6d-4c3b-b8dd-c12eeb381cd7-kube-api-access-rq5pl" (OuterVolumeSpecName: "kube-api-access-rq5pl") pod "743f508e-dd6d-4c3b-b8dd-c12eeb381cd7" (UID: "743f508e-dd6d-4c3b-b8dd-c12eeb381cd7"). InnerVolumeSpecName "kube-api-access-rq5pl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 10:01:55 crc kubenswrapper[4792]: I0309 10:01:55.546473 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/743f508e-dd6d-4c3b-b8dd-c12eeb381cd7-scripts" (OuterVolumeSpecName: "scripts") pod "743f508e-dd6d-4c3b-b8dd-c12eeb381cd7" (UID: "743f508e-dd6d-4c3b-b8dd-c12eeb381cd7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 10:01:55 crc kubenswrapper[4792]: I0309 10:01:55.596239 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/743f508e-dd6d-4c3b-b8dd-c12eeb381cd7-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "743f508e-dd6d-4c3b-b8dd-c12eeb381cd7" (UID: "743f508e-dd6d-4c3b-b8dd-c12eeb381cd7"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 10:01:55 crc kubenswrapper[4792]: I0309 10:01:55.641549 4792 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/743f508e-dd6d-4c3b-b8dd-c12eeb381cd7-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 09 10:01:55 crc kubenswrapper[4792]: I0309 10:01:55.641590 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rq5pl\" (UniqueName: \"kubernetes.io/projected/743f508e-dd6d-4c3b-b8dd-c12eeb381cd7-kube-api-access-rq5pl\") on node \"crc\" DevicePath \"\"" Mar 09 10:01:55 crc kubenswrapper[4792]: I0309 10:01:55.641601 4792 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/743f508e-dd6d-4c3b-b8dd-c12eeb381cd7-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 10:01:55 crc kubenswrapper[4792]: I0309 10:01:55.641610 4792 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/743f508e-dd6d-4c3b-b8dd-c12eeb381cd7-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 09 10:01:55 crc kubenswrapper[4792]: I0309 10:01:55.641618 4792 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/743f508e-dd6d-4c3b-b8dd-c12eeb381cd7-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 09 10:01:55 crc kubenswrapper[4792]: I0309 10:01:55.659314 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/743f508e-dd6d-4c3b-b8dd-c12eeb381cd7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "743f508e-dd6d-4c3b-b8dd-c12eeb381cd7" (UID: "743f508e-dd6d-4c3b-b8dd-c12eeb381cd7"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 10:01:55 crc kubenswrapper[4792]: I0309 10:01:55.734259 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/743f508e-dd6d-4c3b-b8dd-c12eeb381cd7-config-data" (OuterVolumeSpecName: "config-data") pod "743f508e-dd6d-4c3b-b8dd-c12eeb381cd7" (UID: "743f508e-dd6d-4c3b-b8dd-c12eeb381cd7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 10:01:55 crc kubenswrapper[4792]: I0309 10:01:55.744116 4792 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/743f508e-dd6d-4c3b-b8dd-c12eeb381cd7-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 10:01:55 crc kubenswrapper[4792]: I0309 10:01:55.744150 4792 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/743f508e-dd6d-4c3b-b8dd-c12eeb381cd7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 10:01:55 crc kubenswrapper[4792]: I0309 10:01:55.767321 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/743f508e-dd6d-4c3b-b8dd-c12eeb381cd7-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "743f508e-dd6d-4c3b-b8dd-c12eeb381cd7" (UID: "743f508e-dd6d-4c3b-b8dd-c12eeb381cd7"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 10:01:55 crc kubenswrapper[4792]: I0309 10:01:55.813345 4792 generic.go:334] "Generic (PLEG): container finished" podID="743f508e-dd6d-4c3b-b8dd-c12eeb381cd7" containerID="479d1847d3572358a2e5123a44f0f12b6c11f923e2ce1c146533716d83a1efe2" exitCode=0 Mar 09 10:01:55 crc kubenswrapper[4792]: I0309 10:01:55.813385 4792 generic.go:334] "Generic (PLEG): container finished" podID="743f508e-dd6d-4c3b-b8dd-c12eeb381cd7" containerID="0936287e8f34e119488e35b6379c8d790360818af95d86448e817ca341a9d49d" exitCode=0 Mar 09 10:01:55 crc kubenswrapper[4792]: I0309 10:01:55.813395 4792 generic.go:334] "Generic (PLEG): container finished" podID="743f508e-dd6d-4c3b-b8dd-c12eeb381cd7" containerID="77a33a91989c03f8ee4bc9dfe799282562fcdc8a3a12a84e791885928ce477a5" exitCode=0 Mar 09 10:01:55 crc kubenswrapper[4792]: I0309 10:01:55.813479 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 09 10:01:55 crc kubenswrapper[4792]: I0309 10:01:55.828503 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"743f508e-dd6d-4c3b-b8dd-c12eeb381cd7","Type":"ContainerDied","Data":"479d1847d3572358a2e5123a44f0f12b6c11f923e2ce1c146533716d83a1efe2"} Mar 09 10:01:55 crc kubenswrapper[4792]: I0309 10:01:55.828843 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"743f508e-dd6d-4c3b-b8dd-c12eeb381cd7","Type":"ContainerDied","Data":"0936287e8f34e119488e35b6379c8d790360818af95d86448e817ca341a9d49d"} Mar 09 10:01:55 crc kubenswrapper[4792]: I0309 10:01:55.828855 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"743f508e-dd6d-4c3b-b8dd-c12eeb381cd7","Type":"ContainerDied","Data":"77a33a91989c03f8ee4bc9dfe799282562fcdc8a3a12a84e791885928ce477a5"} Mar 09 10:01:55 crc kubenswrapper[4792]: I0309 10:01:55.828878 4792 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/ceilometer-0" event={"ID":"743f508e-dd6d-4c3b-b8dd-c12eeb381cd7","Type":"ContainerDied","Data":"05a0fdaab592c166fd65b5c19da42bb340aa5a8efa669fda8c54e85d339b1d7a"} Mar 09 10:01:55 crc kubenswrapper[4792]: I0309 10:01:55.828895 4792 scope.go:117] "RemoveContainer" containerID="479d1847d3572358a2e5123a44f0f12b6c11f923e2ce1c146533716d83a1efe2" Mar 09 10:01:55 crc kubenswrapper[4792]: I0309 10:01:55.845690 4792 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/743f508e-dd6d-4c3b-b8dd-c12eeb381cd7-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 09 10:01:55 crc kubenswrapper[4792]: I0309 10:01:55.854820 4792 scope.go:117] "RemoveContainer" containerID="dd6f15399c20c6037a01daa7e9674f73b63c6bc03061655e2b7268ac10df27be" Mar 09 10:01:55 crc kubenswrapper[4792]: I0309 10:01:55.877989 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 09 10:01:55 crc kubenswrapper[4792]: I0309 10:01:55.895315 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 09 10:01:55 crc kubenswrapper[4792]: I0309 10:01:55.908486 4792 scope.go:117] "RemoveContainer" containerID="0936287e8f34e119488e35b6379c8d790360818af95d86448e817ca341a9d49d" Mar 09 10:01:55 crc kubenswrapper[4792]: I0309 10:01:55.931279 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 09 10:01:55 crc kubenswrapper[4792]: E0309 10:01:55.931699 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="743f508e-dd6d-4c3b-b8dd-c12eeb381cd7" containerName="proxy-httpd" Mar 09 10:01:55 crc kubenswrapper[4792]: I0309 10:01:55.931715 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="743f508e-dd6d-4c3b-b8dd-c12eeb381cd7" containerName="proxy-httpd" Mar 09 10:01:55 crc kubenswrapper[4792]: E0309 10:01:55.931725 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="743f508e-dd6d-4c3b-b8dd-c12eeb381cd7" 
containerName="sg-core" Mar 09 10:01:55 crc kubenswrapper[4792]: I0309 10:01:55.931731 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="743f508e-dd6d-4c3b-b8dd-c12eeb381cd7" containerName="sg-core" Mar 09 10:01:55 crc kubenswrapper[4792]: E0309 10:01:55.931757 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="743f508e-dd6d-4c3b-b8dd-c12eeb381cd7" containerName="ceilometer-central-agent" Mar 09 10:01:55 crc kubenswrapper[4792]: I0309 10:01:55.931764 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="743f508e-dd6d-4c3b-b8dd-c12eeb381cd7" containerName="ceilometer-central-agent" Mar 09 10:01:55 crc kubenswrapper[4792]: E0309 10:01:55.931789 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="743f508e-dd6d-4c3b-b8dd-c12eeb381cd7" containerName="ceilometer-notification-agent" Mar 09 10:01:55 crc kubenswrapper[4792]: I0309 10:01:55.931795 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="743f508e-dd6d-4c3b-b8dd-c12eeb381cd7" containerName="ceilometer-notification-agent" Mar 09 10:01:55 crc kubenswrapper[4792]: I0309 10:01:55.932674 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="743f508e-dd6d-4c3b-b8dd-c12eeb381cd7" containerName="sg-core" Mar 09 10:01:55 crc kubenswrapper[4792]: I0309 10:01:55.932722 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="743f508e-dd6d-4c3b-b8dd-c12eeb381cd7" containerName="ceilometer-notification-agent" Mar 09 10:01:55 crc kubenswrapper[4792]: I0309 10:01:55.932739 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="743f508e-dd6d-4c3b-b8dd-c12eeb381cd7" containerName="ceilometer-central-agent" Mar 09 10:01:55 crc kubenswrapper[4792]: I0309 10:01:55.932756 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="743f508e-dd6d-4c3b-b8dd-c12eeb381cd7" containerName="proxy-httpd" Mar 09 10:01:55 crc kubenswrapper[4792]: I0309 10:01:55.934966 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 09 10:01:55 crc kubenswrapper[4792]: I0309 10:01:55.945429 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 09 10:01:55 crc kubenswrapper[4792]: I0309 10:01:55.945608 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 09 10:01:55 crc kubenswrapper[4792]: I0309 10:01:55.956640 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 09 10:01:55 crc kubenswrapper[4792]: I0309 10:01:55.963743 4792 scope.go:117] "RemoveContainer" containerID="77a33a91989c03f8ee4bc9dfe799282562fcdc8a3a12a84e791885928ce477a5" Mar 09 10:01:55 crc kubenswrapper[4792]: I0309 10:01:55.971687 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 09 10:01:56 crc kubenswrapper[4792]: I0309 10:01:56.000769 4792 scope.go:117] "RemoveContainer" containerID="479d1847d3572358a2e5123a44f0f12b6c11f923e2ce1c146533716d83a1efe2" Mar 09 10:01:56 crc kubenswrapper[4792]: E0309 10:01:56.002564 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"479d1847d3572358a2e5123a44f0f12b6c11f923e2ce1c146533716d83a1efe2\": container with ID starting with 479d1847d3572358a2e5123a44f0f12b6c11f923e2ce1c146533716d83a1efe2 not found: ID does not exist" containerID="479d1847d3572358a2e5123a44f0f12b6c11f923e2ce1c146533716d83a1efe2" Mar 09 10:01:56 crc kubenswrapper[4792]: I0309 10:01:56.002608 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"479d1847d3572358a2e5123a44f0f12b6c11f923e2ce1c146533716d83a1efe2"} err="failed to get container status \"479d1847d3572358a2e5123a44f0f12b6c11f923e2ce1c146533716d83a1efe2\": rpc error: code = NotFound desc = could not find container \"479d1847d3572358a2e5123a44f0f12b6c11f923e2ce1c146533716d83a1efe2\": 
container with ID starting with 479d1847d3572358a2e5123a44f0f12b6c11f923e2ce1c146533716d83a1efe2 not found: ID does not exist" Mar 09 10:01:56 crc kubenswrapper[4792]: I0309 10:01:56.002635 4792 scope.go:117] "RemoveContainer" containerID="dd6f15399c20c6037a01daa7e9674f73b63c6bc03061655e2b7268ac10df27be" Mar 09 10:01:56 crc kubenswrapper[4792]: E0309 10:01:56.003917 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dd6f15399c20c6037a01daa7e9674f73b63c6bc03061655e2b7268ac10df27be\": container with ID starting with dd6f15399c20c6037a01daa7e9674f73b63c6bc03061655e2b7268ac10df27be not found: ID does not exist" containerID="dd6f15399c20c6037a01daa7e9674f73b63c6bc03061655e2b7268ac10df27be" Mar 09 10:01:56 crc kubenswrapper[4792]: I0309 10:01:56.003972 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd6f15399c20c6037a01daa7e9674f73b63c6bc03061655e2b7268ac10df27be"} err="failed to get container status \"dd6f15399c20c6037a01daa7e9674f73b63c6bc03061655e2b7268ac10df27be\": rpc error: code = NotFound desc = could not find container \"dd6f15399c20c6037a01daa7e9674f73b63c6bc03061655e2b7268ac10df27be\": container with ID starting with dd6f15399c20c6037a01daa7e9674f73b63c6bc03061655e2b7268ac10df27be not found: ID does not exist" Mar 09 10:01:56 crc kubenswrapper[4792]: I0309 10:01:56.003992 4792 scope.go:117] "RemoveContainer" containerID="0936287e8f34e119488e35b6379c8d790360818af95d86448e817ca341a9d49d" Mar 09 10:01:56 crc kubenswrapper[4792]: E0309 10:01:56.006311 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0936287e8f34e119488e35b6379c8d790360818af95d86448e817ca341a9d49d\": container with ID starting with 0936287e8f34e119488e35b6379c8d790360818af95d86448e817ca341a9d49d not found: ID does not exist" 
containerID="0936287e8f34e119488e35b6379c8d790360818af95d86448e817ca341a9d49d" Mar 09 10:01:56 crc kubenswrapper[4792]: I0309 10:01:56.006411 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0936287e8f34e119488e35b6379c8d790360818af95d86448e817ca341a9d49d"} err="failed to get container status \"0936287e8f34e119488e35b6379c8d790360818af95d86448e817ca341a9d49d\": rpc error: code = NotFound desc = could not find container \"0936287e8f34e119488e35b6379c8d790360818af95d86448e817ca341a9d49d\": container with ID starting with 0936287e8f34e119488e35b6379c8d790360818af95d86448e817ca341a9d49d not found: ID does not exist" Mar 09 10:01:56 crc kubenswrapper[4792]: I0309 10:01:56.006518 4792 scope.go:117] "RemoveContainer" containerID="77a33a91989c03f8ee4bc9dfe799282562fcdc8a3a12a84e791885928ce477a5" Mar 09 10:01:56 crc kubenswrapper[4792]: E0309 10:01:56.006821 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"77a33a91989c03f8ee4bc9dfe799282562fcdc8a3a12a84e791885928ce477a5\": container with ID starting with 77a33a91989c03f8ee4bc9dfe799282562fcdc8a3a12a84e791885928ce477a5 not found: ID does not exist" containerID="77a33a91989c03f8ee4bc9dfe799282562fcdc8a3a12a84e791885928ce477a5" Mar 09 10:01:56 crc kubenswrapper[4792]: I0309 10:01:56.006953 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"77a33a91989c03f8ee4bc9dfe799282562fcdc8a3a12a84e791885928ce477a5"} err="failed to get container status \"77a33a91989c03f8ee4bc9dfe799282562fcdc8a3a12a84e791885928ce477a5\": rpc error: code = NotFound desc = could not find container \"77a33a91989c03f8ee4bc9dfe799282562fcdc8a3a12a84e791885928ce477a5\": container with ID starting with 77a33a91989c03f8ee4bc9dfe799282562fcdc8a3a12a84e791885928ce477a5 not found: ID does not exist" Mar 09 10:01:56 crc kubenswrapper[4792]: I0309 10:01:56.007092 4792 scope.go:117] 
"RemoveContainer" containerID="479d1847d3572358a2e5123a44f0f12b6c11f923e2ce1c146533716d83a1efe2" Mar 09 10:01:56 crc kubenswrapper[4792]: I0309 10:01:56.007419 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"479d1847d3572358a2e5123a44f0f12b6c11f923e2ce1c146533716d83a1efe2"} err="failed to get container status \"479d1847d3572358a2e5123a44f0f12b6c11f923e2ce1c146533716d83a1efe2\": rpc error: code = NotFound desc = could not find container \"479d1847d3572358a2e5123a44f0f12b6c11f923e2ce1c146533716d83a1efe2\": container with ID starting with 479d1847d3572358a2e5123a44f0f12b6c11f923e2ce1c146533716d83a1efe2 not found: ID does not exist" Mar 09 10:01:56 crc kubenswrapper[4792]: I0309 10:01:56.007527 4792 scope.go:117] "RemoveContainer" containerID="dd6f15399c20c6037a01daa7e9674f73b63c6bc03061655e2b7268ac10df27be" Mar 09 10:01:56 crc kubenswrapper[4792]: I0309 10:01:56.007885 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd6f15399c20c6037a01daa7e9674f73b63c6bc03061655e2b7268ac10df27be"} err="failed to get container status \"dd6f15399c20c6037a01daa7e9674f73b63c6bc03061655e2b7268ac10df27be\": rpc error: code = NotFound desc = could not find container \"dd6f15399c20c6037a01daa7e9674f73b63c6bc03061655e2b7268ac10df27be\": container with ID starting with dd6f15399c20c6037a01daa7e9674f73b63c6bc03061655e2b7268ac10df27be not found: ID does not exist" Mar 09 10:01:56 crc kubenswrapper[4792]: I0309 10:01:56.008033 4792 scope.go:117] "RemoveContainer" containerID="0936287e8f34e119488e35b6379c8d790360818af95d86448e817ca341a9d49d" Mar 09 10:01:56 crc kubenswrapper[4792]: I0309 10:01:56.009267 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0936287e8f34e119488e35b6379c8d790360818af95d86448e817ca341a9d49d"} err="failed to get container status \"0936287e8f34e119488e35b6379c8d790360818af95d86448e817ca341a9d49d\": rpc error: code = 
NotFound desc = could not find container \"0936287e8f34e119488e35b6379c8d790360818af95d86448e817ca341a9d49d\": container with ID starting with 0936287e8f34e119488e35b6379c8d790360818af95d86448e817ca341a9d49d not found: ID does not exist" Mar 09 10:01:56 crc kubenswrapper[4792]: I0309 10:01:56.009409 4792 scope.go:117] "RemoveContainer" containerID="77a33a91989c03f8ee4bc9dfe799282562fcdc8a3a12a84e791885928ce477a5" Mar 09 10:01:56 crc kubenswrapper[4792]: I0309 10:01:56.010963 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"77a33a91989c03f8ee4bc9dfe799282562fcdc8a3a12a84e791885928ce477a5"} err="failed to get container status \"77a33a91989c03f8ee4bc9dfe799282562fcdc8a3a12a84e791885928ce477a5\": rpc error: code = NotFound desc = could not find container \"77a33a91989c03f8ee4bc9dfe799282562fcdc8a3a12a84e791885928ce477a5\": container with ID starting with 77a33a91989c03f8ee4bc9dfe799282562fcdc8a3a12a84e791885928ce477a5 not found: ID does not exist" Mar 09 10:01:56 crc kubenswrapper[4792]: I0309 10:01:56.011109 4792 scope.go:117] "RemoveContainer" containerID="479d1847d3572358a2e5123a44f0f12b6c11f923e2ce1c146533716d83a1efe2" Mar 09 10:01:56 crc kubenswrapper[4792]: I0309 10:01:56.011427 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"479d1847d3572358a2e5123a44f0f12b6c11f923e2ce1c146533716d83a1efe2"} err="failed to get container status \"479d1847d3572358a2e5123a44f0f12b6c11f923e2ce1c146533716d83a1efe2\": rpc error: code = NotFound desc = could not find container \"479d1847d3572358a2e5123a44f0f12b6c11f923e2ce1c146533716d83a1efe2\": container with ID starting with 479d1847d3572358a2e5123a44f0f12b6c11f923e2ce1c146533716d83a1efe2 not found: ID does not exist" Mar 09 10:01:56 crc kubenswrapper[4792]: I0309 10:01:56.011517 4792 scope.go:117] "RemoveContainer" containerID="dd6f15399c20c6037a01daa7e9674f73b63c6bc03061655e2b7268ac10df27be" Mar 09 10:01:56 crc 
kubenswrapper[4792]: I0309 10:01:56.011758 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd6f15399c20c6037a01daa7e9674f73b63c6bc03061655e2b7268ac10df27be"} err="failed to get container status \"dd6f15399c20c6037a01daa7e9674f73b63c6bc03061655e2b7268ac10df27be\": rpc error: code = NotFound desc = could not find container \"dd6f15399c20c6037a01daa7e9674f73b63c6bc03061655e2b7268ac10df27be\": container with ID starting with dd6f15399c20c6037a01daa7e9674f73b63c6bc03061655e2b7268ac10df27be not found: ID does not exist" Mar 09 10:01:56 crc kubenswrapper[4792]: I0309 10:01:56.011852 4792 scope.go:117] "RemoveContainer" containerID="0936287e8f34e119488e35b6379c8d790360818af95d86448e817ca341a9d49d" Mar 09 10:01:56 crc kubenswrapper[4792]: I0309 10:01:56.012399 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0936287e8f34e119488e35b6379c8d790360818af95d86448e817ca341a9d49d"} err="failed to get container status \"0936287e8f34e119488e35b6379c8d790360818af95d86448e817ca341a9d49d\": rpc error: code = NotFound desc = could not find container \"0936287e8f34e119488e35b6379c8d790360818af95d86448e817ca341a9d49d\": container with ID starting with 0936287e8f34e119488e35b6379c8d790360818af95d86448e817ca341a9d49d not found: ID does not exist" Mar 09 10:01:56 crc kubenswrapper[4792]: I0309 10:01:56.012496 4792 scope.go:117] "RemoveContainer" containerID="77a33a91989c03f8ee4bc9dfe799282562fcdc8a3a12a84e791885928ce477a5" Mar 09 10:01:56 crc kubenswrapper[4792]: I0309 10:01:56.013220 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"77a33a91989c03f8ee4bc9dfe799282562fcdc8a3a12a84e791885928ce477a5"} err="failed to get container status \"77a33a91989c03f8ee4bc9dfe799282562fcdc8a3a12a84e791885928ce477a5\": rpc error: code = NotFound desc = could not find container \"77a33a91989c03f8ee4bc9dfe799282562fcdc8a3a12a84e791885928ce477a5\": container 
with ID starting with 77a33a91989c03f8ee4bc9dfe799282562fcdc8a3a12a84e791885928ce477a5 not found: ID does not exist" Mar 09 10:01:56 crc kubenswrapper[4792]: I0309 10:01:56.051489 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a1a37964-5fdf-4a05-bde5-750f454d2987-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a1a37964-5fdf-4a05-bde5-750f454d2987\") " pod="openstack/ceilometer-0" Mar 09 10:01:56 crc kubenswrapper[4792]: I0309 10:01:56.051769 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a1a37964-5fdf-4a05-bde5-750f454d2987-log-httpd\") pod \"ceilometer-0\" (UID: \"a1a37964-5fdf-4a05-bde5-750f454d2987\") " pod="openstack/ceilometer-0" Mar 09 10:01:56 crc kubenswrapper[4792]: I0309 10:01:56.051903 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a1a37964-5fdf-4a05-bde5-750f454d2987-scripts\") pod \"ceilometer-0\" (UID: \"a1a37964-5fdf-4a05-bde5-750f454d2987\") " pod="openstack/ceilometer-0" Mar 09 10:01:56 crc kubenswrapper[4792]: I0309 10:01:56.052109 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-29vxp\" (UniqueName: \"kubernetes.io/projected/a1a37964-5fdf-4a05-bde5-750f454d2987-kube-api-access-29vxp\") pod \"ceilometer-0\" (UID: \"a1a37964-5fdf-4a05-bde5-750f454d2987\") " pod="openstack/ceilometer-0" Mar 09 10:01:56 crc kubenswrapper[4792]: I0309 10:01:56.052218 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a1a37964-5fdf-4a05-bde5-750f454d2987-run-httpd\") pod \"ceilometer-0\" (UID: \"a1a37964-5fdf-4a05-bde5-750f454d2987\") " pod="openstack/ceilometer-0" Mar 09 10:01:56 crc 
kubenswrapper[4792]: I0309 10:01:56.052292 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a1a37964-5fdf-4a05-bde5-750f454d2987-config-data\") pod \"ceilometer-0\" (UID: \"a1a37964-5fdf-4a05-bde5-750f454d2987\") " pod="openstack/ceilometer-0" Mar 09 10:01:56 crc kubenswrapper[4792]: I0309 10:01:56.052371 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1a37964-5fdf-4a05-bde5-750f454d2987-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a1a37964-5fdf-4a05-bde5-750f454d2987\") " pod="openstack/ceilometer-0" Mar 09 10:01:56 crc kubenswrapper[4792]: I0309 10:01:56.052456 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/a1a37964-5fdf-4a05-bde5-750f454d2987-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"a1a37964-5fdf-4a05-bde5-750f454d2987\") " pod="openstack/ceilometer-0" Mar 09 10:01:56 crc kubenswrapper[4792]: I0309 10:01:56.155182 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1a37964-5fdf-4a05-bde5-750f454d2987-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a1a37964-5fdf-4a05-bde5-750f454d2987\") " pod="openstack/ceilometer-0" Mar 09 10:01:56 crc kubenswrapper[4792]: I0309 10:01:56.155544 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/a1a37964-5fdf-4a05-bde5-750f454d2987-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"a1a37964-5fdf-4a05-bde5-750f454d2987\") " pod="openstack/ceilometer-0" Mar 09 10:01:56 crc kubenswrapper[4792]: I0309 10:01:56.155728 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a1a37964-5fdf-4a05-bde5-750f454d2987-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a1a37964-5fdf-4a05-bde5-750f454d2987\") " pod="openstack/ceilometer-0" Mar 09 10:01:56 crc kubenswrapper[4792]: I0309 10:01:56.155936 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a1a37964-5fdf-4a05-bde5-750f454d2987-log-httpd\") pod \"ceilometer-0\" (UID: \"a1a37964-5fdf-4a05-bde5-750f454d2987\") " pod="openstack/ceilometer-0" Mar 09 10:01:56 crc kubenswrapper[4792]: I0309 10:01:56.156154 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a1a37964-5fdf-4a05-bde5-750f454d2987-scripts\") pod \"ceilometer-0\" (UID: \"a1a37964-5fdf-4a05-bde5-750f454d2987\") " pod="openstack/ceilometer-0" Mar 09 10:01:56 crc kubenswrapper[4792]: I0309 10:01:56.156317 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-29vxp\" (UniqueName: \"kubernetes.io/projected/a1a37964-5fdf-4a05-bde5-750f454d2987-kube-api-access-29vxp\") pod \"ceilometer-0\" (UID: \"a1a37964-5fdf-4a05-bde5-750f454d2987\") " pod="openstack/ceilometer-0" Mar 09 10:01:56 crc kubenswrapper[4792]: I0309 10:01:56.156461 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a1a37964-5fdf-4a05-bde5-750f454d2987-run-httpd\") pod \"ceilometer-0\" (UID: \"a1a37964-5fdf-4a05-bde5-750f454d2987\") " pod="openstack/ceilometer-0" Mar 09 10:01:56 crc kubenswrapper[4792]: I0309 10:01:56.156570 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a1a37964-5fdf-4a05-bde5-750f454d2987-config-data\") pod \"ceilometer-0\" (UID: \"a1a37964-5fdf-4a05-bde5-750f454d2987\") " pod="openstack/ceilometer-0" Mar 09 10:01:56 crc 
kubenswrapper[4792]: I0309 10:01:56.156488 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a1a37964-5fdf-4a05-bde5-750f454d2987-log-httpd\") pod \"ceilometer-0\" (UID: \"a1a37964-5fdf-4a05-bde5-750f454d2987\") " pod="openstack/ceilometer-0"
Mar 09 10:01:56 crc kubenswrapper[4792]: I0309 10:01:56.157217 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a1a37964-5fdf-4a05-bde5-750f454d2987-run-httpd\") pod \"ceilometer-0\" (UID: \"a1a37964-5fdf-4a05-bde5-750f454d2987\") " pod="openstack/ceilometer-0"
Mar 09 10:01:56 crc kubenswrapper[4792]: I0309 10:01:56.162479 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a1a37964-5fdf-4a05-bde5-750f454d2987-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a1a37964-5fdf-4a05-bde5-750f454d2987\") " pod="openstack/ceilometer-0"
Mar 09 10:01:56 crc kubenswrapper[4792]: I0309 10:01:56.162653 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1a37964-5fdf-4a05-bde5-750f454d2987-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a1a37964-5fdf-4a05-bde5-750f454d2987\") " pod="openstack/ceilometer-0"
Mar 09 10:01:56 crc kubenswrapper[4792]: I0309 10:01:56.163291 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/a1a37964-5fdf-4a05-bde5-750f454d2987-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"a1a37964-5fdf-4a05-bde5-750f454d2987\") " pod="openstack/ceilometer-0"
Mar 09 10:01:56 crc kubenswrapper[4792]: I0309 10:01:56.165117 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a1a37964-5fdf-4a05-bde5-750f454d2987-scripts\") pod \"ceilometer-0\" (UID: \"a1a37964-5fdf-4a05-bde5-750f454d2987\") " pod="openstack/ceilometer-0"
Mar 09 10:01:56 crc kubenswrapper[4792]: I0309 10:01:56.166389 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a1a37964-5fdf-4a05-bde5-750f454d2987-config-data\") pod \"ceilometer-0\" (UID: \"a1a37964-5fdf-4a05-bde5-750f454d2987\") " pod="openstack/ceilometer-0"
Mar 09 10:01:56 crc kubenswrapper[4792]: I0309 10:01:56.176851 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-29vxp\" (UniqueName: \"kubernetes.io/projected/a1a37964-5fdf-4a05-bde5-750f454d2987-kube-api-access-29vxp\") pod \"ceilometer-0\" (UID: \"a1a37964-5fdf-4a05-bde5-750f454d2987\") " pod="openstack/ceilometer-0"
Mar 09 10:01:56 crc kubenswrapper[4792]: I0309 10:01:56.304281 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 09 10:01:56 crc kubenswrapper[4792]: I0309 10:01:56.783888 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 09 10:01:56 crc kubenswrapper[4792]: I0309 10:01:56.832944 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a1a37964-5fdf-4a05-bde5-750f454d2987","Type":"ContainerStarted","Data":"aa5dc68c8ddf3d3eca13586537ffe56d40cd43c7bb0da4640301f82e76ce461f"}
Mar 09 10:01:57 crc kubenswrapper[4792]: I0309 10:01:57.644311 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-share-share1-0"
Mar 09 10:01:57 crc kubenswrapper[4792]: I0309 10:01:57.689806 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="743f508e-dd6d-4c3b-b8dd-c12eeb381cd7" path="/var/lib/kubelet/pods/743f508e-dd6d-4c3b-b8dd-c12eeb381cd7/volumes"
Mar 09 10:01:57 crc kubenswrapper[4792]: I0309 10:01:57.718022 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-share-share1-0"]
Mar 09 10:01:57 crc kubenswrapper[4792]: I0309 10:01:57.846190 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a1a37964-5fdf-4a05-bde5-750f454d2987","Type":"ContainerStarted","Data":"ccf7d55bb44b0959a72bb1355979e323a48aac4f14d0c8e36fc549d7cee70253"}
Mar 09 10:01:57 crc kubenswrapper[4792]: I0309 10:01:57.846359 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-share-share1-0" podUID="dced9726-c642-4e64-a963-9ce584e76e9c" containerName="manila-share" containerID="cri-o://87d738f5fa709dd48ff4741a81c396d9f1191cf48c387654a3e1403a209d70df" gracePeriod=30
Mar 09 10:01:57 crc kubenswrapper[4792]: I0309 10:01:57.846527 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-share-share1-0" podUID="dced9726-c642-4e64-a963-9ce584e76e9c" containerName="probe" containerID="cri-o://3ce5e6ca13d5fcba36b4c648c207fb5a0bbdb01ba09c50b76338a7dae2637cdb" gracePeriod=30
Mar 09 10:01:58 crc kubenswrapper[4792]: I0309 10:01:58.861060 4792 generic.go:334] "Generic (PLEG): container finished" podID="dced9726-c642-4e64-a963-9ce584e76e9c" containerID="3ce5e6ca13d5fcba36b4c648c207fb5a0bbdb01ba09c50b76338a7dae2637cdb" exitCode=0
Mar 09 10:01:58 crc kubenswrapper[4792]: I0309 10:01:58.861888 4792 generic.go:334] "Generic (PLEG): container finished" podID="dced9726-c642-4e64-a963-9ce584e76e9c" containerID="87d738f5fa709dd48ff4741a81c396d9f1191cf48c387654a3e1403a209d70df" exitCode=1
Mar 09 10:01:58 crc kubenswrapper[4792]: I0309 10:01:58.861335 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"dced9726-c642-4e64-a963-9ce584e76e9c","Type":"ContainerDied","Data":"3ce5e6ca13d5fcba36b4c648c207fb5a0bbdb01ba09c50b76338a7dae2637cdb"}
Mar 09 10:01:58 crc kubenswrapper[4792]: I0309 10:01:58.862038 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"dced9726-c642-4e64-a963-9ce584e76e9c","Type":"ContainerDied","Data":"87d738f5fa709dd48ff4741a81c396d9f1191cf48c387654a3e1403a209d70df"}
Mar 09 10:01:58 crc kubenswrapper[4792]: I0309 10:01:58.869165 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a1a37964-5fdf-4a05-bde5-750f454d2987","Type":"ContainerStarted","Data":"f3cd1c51131d9112bc3044498d713fdd3db61c2ccc821b242094450498ca3aa0"}
Mar 09 10:01:58 crc kubenswrapper[4792]: I0309 10:01:58.985816 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-share-share1-0"
Mar 09 10:01:59 crc kubenswrapper[4792]: I0309 10:01:59.049746 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dced9726-c642-4e64-a963-9ce584e76e9c-config-data-custom\") pod \"dced9726-c642-4e64-a963-9ce584e76e9c\" (UID: \"dced9726-c642-4e64-a963-9ce584e76e9c\") "
Mar 09 10:01:59 crc kubenswrapper[4792]: I0309 10:01:59.049801 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dced9726-c642-4e64-a963-9ce584e76e9c-combined-ca-bundle\") pod \"dced9726-c642-4e64-a963-9ce584e76e9c\" (UID: \"dced9726-c642-4e64-a963-9ce584e76e9c\") "
Mar 09 10:01:59 crc kubenswrapper[4792]: I0309 10:01:59.049851 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tt9hq\" (UniqueName: \"kubernetes.io/projected/dced9726-c642-4e64-a963-9ce584e76e9c-kube-api-access-tt9hq\") pod \"dced9726-c642-4e64-a963-9ce584e76e9c\" (UID: \"dced9726-c642-4e64-a963-9ce584e76e9c\") "
Mar 09 10:01:59 crc kubenswrapper[4792]: I0309 10:01:59.049890 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/dced9726-c642-4e64-a963-9ce584e76e9c-ceph\") pod \"dced9726-c642-4e64-a963-9ce584e76e9c\" (UID: \"dced9726-c642-4e64-a963-9ce584e76e9c\") "
Mar 09 10:01:59 crc kubenswrapper[4792]: I0309 10:01:59.049936 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dced9726-c642-4e64-a963-9ce584e76e9c-scripts\") pod \"dced9726-c642-4e64-a963-9ce584e76e9c\" (UID: \"dced9726-c642-4e64-a963-9ce584e76e9c\") "
Mar 09 10:01:59 crc kubenswrapper[4792]: I0309 10:01:59.050026 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/dced9726-c642-4e64-a963-9ce584e76e9c-etc-machine-id\") pod \"dced9726-c642-4e64-a963-9ce584e76e9c\" (UID: \"dced9726-c642-4e64-a963-9ce584e76e9c\") "
Mar 09 10:01:59 crc kubenswrapper[4792]: I0309 10:01:59.050245 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dced9726-c642-4e64-a963-9ce584e76e9c-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "dced9726-c642-4e64-a963-9ce584e76e9c" (UID: "dced9726-c642-4e64-a963-9ce584e76e9c"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 09 10:01:59 crc kubenswrapper[4792]: I0309 10:01:59.050776 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dced9726-c642-4e64-a963-9ce584e76e9c-config-data\") pod \"dced9726-c642-4e64-a963-9ce584e76e9c\" (UID: \"dced9726-c642-4e64-a963-9ce584e76e9c\") "
Mar 09 10:01:59 crc kubenswrapper[4792]: I0309 10:01:59.051033 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dced9726-c642-4e64-a963-9ce584e76e9c-var-lib-manila" (OuterVolumeSpecName: "var-lib-manila") pod "dced9726-c642-4e64-a963-9ce584e76e9c" (UID: "dced9726-c642-4e64-a963-9ce584e76e9c"). InnerVolumeSpecName "var-lib-manila". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 09 10:01:59 crc kubenswrapper[4792]: I0309 10:01:59.051297 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/dced9726-c642-4e64-a963-9ce584e76e9c-var-lib-manila\") pod \"dced9726-c642-4e64-a963-9ce584e76e9c\" (UID: \"dced9726-c642-4e64-a963-9ce584e76e9c\") "
Mar 09 10:01:59 crc kubenswrapper[4792]: I0309 10:01:59.059467 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dced9726-c642-4e64-a963-9ce584e76e9c-scripts" (OuterVolumeSpecName: "scripts") pod "dced9726-c642-4e64-a963-9ce584e76e9c" (UID: "dced9726-c642-4e64-a963-9ce584e76e9c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 10:01:59 crc kubenswrapper[4792]: I0309 10:01:59.059713 4792 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/dced9726-c642-4e64-a963-9ce584e76e9c-etc-machine-id\") on node \"crc\" DevicePath \"\""
Mar 09 10:01:59 crc kubenswrapper[4792]: I0309 10:01:59.059728 4792 reconciler_common.go:293] "Volume detached for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/dced9726-c642-4e64-a963-9ce584e76e9c-var-lib-manila\") on node \"crc\" DevicePath \"\""
Mar 09 10:01:59 crc kubenswrapper[4792]: I0309 10:01:59.059736 4792 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dced9726-c642-4e64-a963-9ce584e76e9c-scripts\") on node \"crc\" DevicePath \"\""
Mar 09 10:01:59 crc kubenswrapper[4792]: I0309 10:01:59.060352 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dced9726-c642-4e64-a963-9ce584e76e9c-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "dced9726-c642-4e64-a963-9ce584e76e9c" (UID: "dced9726-c642-4e64-a963-9ce584e76e9c"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 10:01:59 crc kubenswrapper[4792]: I0309 10:01:59.062298 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dced9726-c642-4e64-a963-9ce584e76e9c-kube-api-access-tt9hq" (OuterVolumeSpecName: "kube-api-access-tt9hq") pod "dced9726-c642-4e64-a963-9ce584e76e9c" (UID: "dced9726-c642-4e64-a963-9ce584e76e9c"). InnerVolumeSpecName "kube-api-access-tt9hq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 10:01:59 crc kubenswrapper[4792]: I0309 10:01:59.077347 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dced9726-c642-4e64-a963-9ce584e76e9c-ceph" (OuterVolumeSpecName: "ceph") pod "dced9726-c642-4e64-a963-9ce584e76e9c" (UID: "dced9726-c642-4e64-a963-9ce584e76e9c"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 10:01:59 crc kubenswrapper[4792]: I0309 10:01:59.162397 4792 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dced9726-c642-4e64-a963-9ce584e76e9c-config-data-custom\") on node \"crc\" DevicePath \"\""
Mar 09 10:01:59 crc kubenswrapper[4792]: I0309 10:01:59.162826 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tt9hq\" (UniqueName: \"kubernetes.io/projected/dced9726-c642-4e64-a963-9ce584e76e9c-kube-api-access-tt9hq\") on node \"crc\" DevicePath \"\""
Mar 09 10:01:59 crc kubenswrapper[4792]: I0309 10:01:59.162842 4792 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/dced9726-c642-4e64-a963-9ce584e76e9c-ceph\") on node \"crc\" DevicePath \"\""
Mar 09 10:01:59 crc kubenswrapper[4792]: I0309 10:01:59.228348 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dced9726-c642-4e64-a963-9ce584e76e9c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dced9726-c642-4e64-a963-9ce584e76e9c" (UID: "dced9726-c642-4e64-a963-9ce584e76e9c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 10:01:59 crc kubenswrapper[4792]: I0309 10:01:59.264367 4792 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dced9726-c642-4e64-a963-9ce584e76e9c-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 09 10:01:59 crc kubenswrapper[4792]: I0309 10:01:59.339766 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dced9726-c642-4e64-a963-9ce584e76e9c-config-data" (OuterVolumeSpecName: "config-data") pod "dced9726-c642-4e64-a963-9ce584e76e9c" (UID: "dced9726-c642-4e64-a963-9ce584e76e9c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 10:01:59 crc kubenswrapper[4792]: I0309 10:01:59.365921 4792 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dced9726-c642-4e64-a963-9ce584e76e9c-config-data\") on node \"crc\" DevicePath \"\""
Mar 09 10:01:59 crc kubenswrapper[4792]: I0309 10:01:59.882339 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"dced9726-c642-4e64-a963-9ce584e76e9c","Type":"ContainerDied","Data":"86e7aff5ad6ce34c2c1683fd7c4aa3574adfda34404d447cc83d6afb3066f023"}
Mar 09 10:01:59 crc kubenswrapper[4792]: I0309 10:01:59.882601 4792 scope.go:117] "RemoveContainer" containerID="3ce5e6ca13d5fcba36b4c648c207fb5a0bbdb01ba09c50b76338a7dae2637cdb"
Mar 09 10:01:59 crc kubenswrapper[4792]: I0309 10:01:59.882708 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-share-share1-0"
Mar 09 10:01:59 crc kubenswrapper[4792]: I0309 10:01:59.888553 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a1a37964-5fdf-4a05-bde5-750f454d2987","Type":"ContainerStarted","Data":"7fd2f01bb822950dcf0599fafe7a596e52d14247e1eedef226c84d4f69bcea2b"}
Mar 09 10:01:59 crc kubenswrapper[4792]: I0309 10:01:59.917569 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-share-share1-0"]
Mar 09 10:01:59 crc kubenswrapper[4792]: I0309 10:01:59.919269 4792 scope.go:117] "RemoveContainer" containerID="87d738f5fa709dd48ff4741a81c396d9f1191cf48c387654a3e1403a209d70df"
Mar 09 10:01:59 crc kubenswrapper[4792]: I0309 10:01:59.954274 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-share-share1-0"]
Mar 09 10:01:59 crc kubenswrapper[4792]: I0309 10:01:59.966923 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-share-share1-0"]
Mar 09 10:01:59 crc kubenswrapper[4792]: E0309 10:01:59.967932 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dced9726-c642-4e64-a963-9ce584e76e9c" containerName="probe"
Mar 09 10:01:59 crc kubenswrapper[4792]: I0309 10:01:59.967955 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="dced9726-c642-4e64-a963-9ce584e76e9c" containerName="probe"
Mar 09 10:01:59 crc kubenswrapper[4792]: E0309 10:01:59.967988 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dced9726-c642-4e64-a963-9ce584e76e9c" containerName="manila-share"
Mar 09 10:01:59 crc kubenswrapper[4792]: I0309 10:01:59.967998 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="dced9726-c642-4e64-a963-9ce584e76e9c" containerName="manila-share"
Mar 09 10:01:59 crc kubenswrapper[4792]: I0309 10:01:59.968661 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="dced9726-c642-4e64-a963-9ce584e76e9c" containerName="probe"
Mar 09 10:01:59 crc kubenswrapper[4792]: I0309 10:01:59.968684 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="dced9726-c642-4e64-a963-9ce584e76e9c" containerName="manila-share"
Mar 09 10:01:59 crc kubenswrapper[4792]: I0309 10:01:59.970139 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-share-share1-0"
Mar 09 10:01:59 crc kubenswrapper[4792]: I0309 10:01:59.973303 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-share-share1-config-data"
Mar 09 10:01:59 crc kubenswrapper[4792]: I0309 10:01:59.978014 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"]
Mar 09 10:02:00 crc kubenswrapper[4792]: I0309 10:02:00.086654 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/49c2b09f-818f-461b-9ebd-bc43d6e268c6-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"49c2b09f-818f-461b-9ebd-bc43d6e268c6\") " pod="openstack/manila-share-share1-0"
Mar 09 10:02:00 crc kubenswrapper[4792]: I0309 10:02:00.086710 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/49c2b09f-818f-461b-9ebd-bc43d6e268c6-ceph\") pod \"manila-share-share1-0\" (UID: \"49c2b09f-818f-461b-9ebd-bc43d6e268c6\") " pod="openstack/manila-share-share1-0"
Mar 09 10:02:00 crc kubenswrapper[4792]: I0309 10:02:00.086738 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/49c2b09f-818f-461b-9ebd-bc43d6e268c6-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"49c2b09f-818f-461b-9ebd-bc43d6e268c6\") " pod="openstack/manila-share-share1-0"
Mar 09 10:02:00 crc kubenswrapper[4792]: I0309 10:02:00.086774 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/49c2b09f-818f-461b-9ebd-bc43d6e268c6-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"49c2b09f-818f-461b-9ebd-bc43d6e268c6\") " pod="openstack/manila-share-share1-0"
Mar 09 10:02:00 crc kubenswrapper[4792]: I0309 10:02:00.086983 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ts65r\" (UniqueName: \"kubernetes.io/projected/49c2b09f-818f-461b-9ebd-bc43d6e268c6-kube-api-access-ts65r\") pod \"manila-share-share1-0\" (UID: \"49c2b09f-818f-461b-9ebd-bc43d6e268c6\") " pod="openstack/manila-share-share1-0"
Mar 09 10:02:00 crc kubenswrapper[4792]: I0309 10:02:00.087008 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49c2b09f-818f-461b-9ebd-bc43d6e268c6-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"49c2b09f-818f-461b-9ebd-bc43d6e268c6\") " pod="openstack/manila-share-share1-0"
Mar 09 10:02:00 crc kubenswrapper[4792]: I0309 10:02:00.087033 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49c2b09f-818f-461b-9ebd-bc43d6e268c6-config-data\") pod \"manila-share-share1-0\" (UID: \"49c2b09f-818f-461b-9ebd-bc43d6e268c6\") " pod="openstack/manila-share-share1-0"
Mar 09 10:02:00 crc kubenswrapper[4792]: I0309 10:02:00.087059 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/49c2b09f-818f-461b-9ebd-bc43d6e268c6-scripts\") pod \"manila-share-share1-0\" (UID: \"49c2b09f-818f-461b-9ebd-bc43d6e268c6\") " pod="openstack/manila-share-share1-0"
Mar 09 10:02:00 crc kubenswrapper[4792]: I0309 10:02:00.158109 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550842-n6dpq"]
Mar 09 10:02:00 crc kubenswrapper[4792]: I0309 10:02:00.159584 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550842-n6dpq"
Mar 09 10:02:00 crc kubenswrapper[4792]: I0309 10:02:00.164026 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 09 10:02:00 crc kubenswrapper[4792]: I0309 10:02:00.164355 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fwclj"
Mar 09 10:02:00 crc kubenswrapper[4792]: I0309 10:02:00.164613 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 09 10:02:00 crc kubenswrapper[4792]: I0309 10:02:00.178946 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550842-n6dpq"]
Mar 09 10:02:00 crc kubenswrapper[4792]: I0309 10:02:00.189062 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/49c2b09f-818f-461b-9ebd-bc43d6e268c6-scripts\") pod \"manila-share-share1-0\" (UID: \"49c2b09f-818f-461b-9ebd-bc43d6e268c6\") " pod="openstack/manila-share-share1-0"
Mar 09 10:02:00 crc kubenswrapper[4792]: I0309 10:02:00.189155 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/49c2b09f-818f-461b-9ebd-bc43d6e268c6-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"49c2b09f-818f-461b-9ebd-bc43d6e268c6\") " pod="openstack/manila-share-share1-0"
Mar 09 10:02:00 crc kubenswrapper[4792]: I0309 10:02:00.189204 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/49c2b09f-818f-461b-9ebd-bc43d6e268c6-ceph\") pod \"manila-share-share1-0\" (UID: \"49c2b09f-818f-461b-9ebd-bc43d6e268c6\") " pod="openstack/manila-share-share1-0"
Mar 09 10:02:00 crc kubenswrapper[4792]: I0309 10:02:00.189243 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/49c2b09f-818f-461b-9ebd-bc43d6e268c6-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"49c2b09f-818f-461b-9ebd-bc43d6e268c6\") " pod="openstack/manila-share-share1-0"
Mar 09 10:02:00 crc kubenswrapper[4792]: I0309 10:02:00.189293 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/49c2b09f-818f-461b-9ebd-bc43d6e268c6-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"49c2b09f-818f-461b-9ebd-bc43d6e268c6\") " pod="openstack/manila-share-share1-0"
Mar 09 10:02:00 crc kubenswrapper[4792]: I0309 10:02:00.189438 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ts65r\" (UniqueName: \"kubernetes.io/projected/49c2b09f-818f-461b-9ebd-bc43d6e268c6-kube-api-access-ts65r\") pod \"manila-share-share1-0\" (UID: \"49c2b09f-818f-461b-9ebd-bc43d6e268c6\") " pod="openstack/manila-share-share1-0"
Mar 09 10:02:00 crc kubenswrapper[4792]: I0309 10:02:00.189466 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49c2b09f-818f-461b-9ebd-bc43d6e268c6-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"49c2b09f-818f-461b-9ebd-bc43d6e268c6\") " pod="openstack/manila-share-share1-0"
Mar 09 10:02:00 crc kubenswrapper[4792]: I0309 10:02:00.189495 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49c2b09f-818f-461b-9ebd-bc43d6e268c6-config-data\") pod \"manila-share-share1-0\" (UID: \"49c2b09f-818f-461b-9ebd-bc43d6e268c6\") " pod="openstack/manila-share-share1-0"
Mar 09 10:02:00 crc kubenswrapper[4792]: I0309 10:02:00.191147 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/49c2b09f-818f-461b-9ebd-bc43d6e268c6-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"49c2b09f-818f-461b-9ebd-bc43d6e268c6\") " pod="openstack/manila-share-share1-0"
Mar 09 10:02:00 crc kubenswrapper[4792]: I0309 10:02:00.192436 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/49c2b09f-818f-461b-9ebd-bc43d6e268c6-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"49c2b09f-818f-461b-9ebd-bc43d6e268c6\") " pod="openstack/manila-share-share1-0"
Mar 09 10:02:00 crc kubenswrapper[4792]: I0309 10:02:00.194714 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49c2b09f-818f-461b-9ebd-bc43d6e268c6-config-data\") pod \"manila-share-share1-0\" (UID: \"49c2b09f-818f-461b-9ebd-bc43d6e268c6\") " pod="openstack/manila-share-share1-0"
Mar 09 10:02:00 crc kubenswrapper[4792]: I0309 10:02:00.202556 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/49c2b09f-818f-461b-9ebd-bc43d6e268c6-ceph\") pod \"manila-share-share1-0\" (UID: \"49c2b09f-818f-461b-9ebd-bc43d6e268c6\") " pod="openstack/manila-share-share1-0"
Mar 09 10:02:00 crc kubenswrapper[4792]: I0309 10:02:00.208152 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/49c2b09f-818f-461b-9ebd-bc43d6e268c6-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"49c2b09f-818f-461b-9ebd-bc43d6e268c6\") " pod="openstack/manila-share-share1-0"
Mar 09 10:02:00 crc kubenswrapper[4792]: I0309 10:02:00.208452 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/49c2b09f-818f-461b-9ebd-bc43d6e268c6-scripts\") pod \"manila-share-share1-0\" (UID: \"49c2b09f-818f-461b-9ebd-bc43d6e268c6\") " pod="openstack/manila-share-share1-0"
Mar 09 10:02:00 crc kubenswrapper[4792]: I0309 10:02:00.211047 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49c2b09f-818f-461b-9ebd-bc43d6e268c6-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"49c2b09f-818f-461b-9ebd-bc43d6e268c6\") " pod="openstack/manila-share-share1-0"
Mar 09 10:02:00 crc kubenswrapper[4792]: I0309 10:02:00.235022 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ts65r\" (UniqueName: \"kubernetes.io/projected/49c2b09f-818f-461b-9ebd-bc43d6e268c6-kube-api-access-ts65r\") pod \"manila-share-share1-0\" (UID: \"49c2b09f-818f-461b-9ebd-bc43d6e268c6\") " pod="openstack/manila-share-share1-0"
Mar 09 10:02:00 crc kubenswrapper[4792]: I0309 10:02:00.275576 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-scheduler-0"
Mar 09 10:02:00 crc kubenswrapper[4792]: I0309 10:02:00.291502 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bgksd\" (UniqueName: \"kubernetes.io/projected/b7abaead-f598-4f9a-87ab-4e8f02067d1f-kube-api-access-bgksd\") pod \"auto-csr-approver-29550842-n6dpq\" (UID: \"b7abaead-f598-4f9a-87ab-4e8f02067d1f\") " pod="openshift-infra/auto-csr-approver-29550842-n6dpq"
Mar 09 10:02:00 crc kubenswrapper[4792]: I0309 10:02:00.322696 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-share-share1-0"
Mar 09 10:02:00 crc kubenswrapper[4792]: I0309 10:02:00.394563 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bgksd\" (UniqueName: \"kubernetes.io/projected/b7abaead-f598-4f9a-87ab-4e8f02067d1f-kube-api-access-bgksd\") pod \"auto-csr-approver-29550842-n6dpq\" (UID: \"b7abaead-f598-4f9a-87ab-4e8f02067d1f\") " pod="openshift-infra/auto-csr-approver-29550842-n6dpq"
Mar 09 10:02:00 crc kubenswrapper[4792]: I0309 10:02:00.421628 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bgksd\" (UniqueName: \"kubernetes.io/projected/b7abaead-f598-4f9a-87ab-4e8f02067d1f-kube-api-access-bgksd\") pod \"auto-csr-approver-29550842-n6dpq\" (UID: \"b7abaead-f598-4f9a-87ab-4e8f02067d1f\") " pod="openshift-infra/auto-csr-approver-29550842-n6dpq"
Mar 09 10:02:00 crc kubenswrapper[4792]: I0309 10:02:00.478350 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550842-n6dpq"
Mar 09 10:02:00 crc kubenswrapper[4792]: I0309 10:02:00.963194 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550842-n6dpq"]
Mar 09 10:02:01 crc kubenswrapper[4792]: I0309 10:02:01.081819 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"]
Mar 09 10:02:01 crc kubenswrapper[4792]: I0309 10:02:01.675455 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dced9726-c642-4e64-a963-9ce584e76e9c" path="/var/lib/kubelet/pods/dced9726-c642-4e64-a963-9ce584e76e9c/volumes"
Mar 09 10:02:01 crc kubenswrapper[4792]: I0309 10:02:01.923144 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"49c2b09f-818f-461b-9ebd-bc43d6e268c6","Type":"ContainerStarted","Data":"afad6c1924f57c3ff6c30cce8a9a032f386ea2e6b869237712c421cc637183c4"}
Mar 09 10:02:01 crc kubenswrapper[4792]: I0309 10:02:01.923728 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"49c2b09f-818f-461b-9ebd-bc43d6e268c6","Type":"ContainerStarted","Data":"e94994e799c8cb2498b68bbe2d9576c0fbb02b89716a11ea0e52f79b887a46d7"}
Mar 09 10:02:01 crc kubenswrapper[4792]: I0309 10:02:01.931796 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a1a37964-5fdf-4a05-bde5-750f454d2987","Type":"ContainerStarted","Data":"f3e1299c4acc52530acafd0d533e5df3de0d0cc2cd65a5e84713edf2f9c7f791"}
Mar 09 10:02:01 crc kubenswrapper[4792]: I0309 10:02:01.931929 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Mar 09 10:02:01 crc kubenswrapper[4792]: I0309 10:02:01.933530 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550842-n6dpq" event={"ID":"b7abaead-f598-4f9a-87ab-4e8f02067d1f","Type":"ContainerStarted","Data":"dd9c8a984e5f987be36b338d6315ea37e10555d5e1a0e0607d67c697b5132f34"}
Mar 09 10:02:01 crc kubenswrapper[4792]: I0309 10:02:01.971274 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.853191292 podStartE2EDuration="6.971247807s" podCreationTimestamp="2026-03-09 10:01:55 +0000 UTC" firstStartedPulling="2026-03-09 10:01:56.798143958 +0000 UTC m=+3281.828344711" lastFinishedPulling="2026-03-09 10:02:00.916200474 +0000 UTC m=+3285.946401226" observedRunningTime="2026-03-09 10:02:01.959370395 +0000 UTC m=+3286.989571147" watchObservedRunningTime="2026-03-09 10:02:01.971247807 +0000 UTC m=+3287.001448569"
Mar 09 10:02:02 crc kubenswrapper[4792]: I0309 10:02:02.843599 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-54c85f748d-wxdlf" podUID="d028a70e-dd9d-4b38-bb18-4cd55cd002fe" containerName="horizon" probeResult="failure" output="Get \"https://10.217.1.11:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.11:8443: connect: connection refused"
Mar 09 10:02:02 crc kubenswrapper[4792]: I0309 10:02:02.975335 4792 generic.go:334] "Generic (PLEG): container finished" podID="24426baa-19e0-4ac0-87b6-0f5a824de578" containerID="8db2115c69bab35530668a1e9f86126f8a7d265d2b553c3753f91f58c2a978ce" exitCode=137
Mar 09 10:02:02 crc kubenswrapper[4792]: I0309 10:02:02.975427 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-85fd9d548-2q98q" event={"ID":"24426baa-19e0-4ac0-87b6-0f5a824de578","Type":"ContainerDied","Data":"8db2115c69bab35530668a1e9f86126f8a7d265d2b553c3753f91f58c2a978ce"}
Mar 09 10:02:02 crc kubenswrapper[4792]: I0309 10:02:02.982520 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550842-n6dpq" event={"ID":"b7abaead-f598-4f9a-87ab-4e8f02067d1f","Type":"ContainerStarted","Data":"6fcaec029c5c962fe9ba07f2b5dbfbd08998dd028a126b2486a64c30d95603f0"}
Mar 09 10:02:02 crc kubenswrapper[4792]: I0309 10:02:02.989025 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"49c2b09f-818f-461b-9ebd-bc43d6e268c6","Type":"ContainerStarted","Data":"1b84f392eca25a2185966565d6194dead9cb3e4059e05abbba39d174a1f405f8"}
Mar 09 10:02:03 crc kubenswrapper[4792]: I0309 10:02:03.009890 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29550842-n6dpq" podStartSLOduration=2.023392454 podStartE2EDuration="3.009870493s" podCreationTimestamp="2026-03-09 10:02:00 +0000 UTC" firstStartedPulling="2026-03-09 10:02:00.954287243 +0000 UTC m=+3285.984487995" lastFinishedPulling="2026-03-09 10:02:01.940765282 +0000 UTC m=+3286.970966034" observedRunningTime="2026-03-09 10:02:03.001001617 +0000 UTC m=+3288.031202369" watchObservedRunningTime="2026-03-09 10:02:03.009870493 +0000 UTC m=+3288.040071235"
Mar 09 10:02:03 crc kubenswrapper[4792]: E0309 10:02:03.018618 4792 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod24426baa_19e0_4ac0_87b6_0f5a824de578.slice/crio-8db2115c69bab35530668a1e9f86126f8a7d265d2b553c3753f91f58c2a978ce.scope\": RecentStats: unable to find data in memory cache]"
Mar 09 10:02:03 crc kubenswrapper[4792]: I0309 10:02:03.033305 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-share-share1-0" podStartSLOduration=4.033287938 podStartE2EDuration="4.033287938s" podCreationTimestamp="2026-03-09 10:01:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 10:02:03.027649025 +0000 UTC m=+3288.057849787" watchObservedRunningTime="2026-03-09 10:02:03.033287938 +0000 UTC m=+3288.063488680"
Mar 09 10:02:03 crc kubenswrapper[4792]: I0309 10:02:03.999106 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-85fd9d548-2q98q" event={"ID":"24426baa-19e0-4ac0-87b6-0f5a824de578","Type":"ContainerStarted","Data":"9e011312293d1f2560f449acc36e137cc8d2a6a8fe0ed081593a41609c24b5c2"}
Mar 09 10:02:05 crc kubenswrapper[4792]: I0309 10:02:05.017106 4792 generic.go:334] "Generic (PLEG): container finished" podID="b7abaead-f598-4f9a-87ab-4e8f02067d1f" containerID="6fcaec029c5c962fe9ba07f2b5dbfbd08998dd028a126b2486a64c30d95603f0" exitCode=0
Mar 09 10:02:05 crc kubenswrapper[4792]: I0309 10:02:05.020688 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550842-n6dpq" event={"ID":"b7abaead-f598-4f9a-87ab-4e8f02067d1f","Type":"ContainerDied","Data":"6fcaec029c5c962fe9ba07f2b5dbfbd08998dd028a126b2486a64c30d95603f0"}
Mar 09 10:02:06 crc kubenswrapper[4792]: I0309 10:02:06.453353 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550842-n6dpq"
Mar 09 10:02:06 crc kubenswrapper[4792]: I0309 10:02:06.557451 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bgksd\" (UniqueName: \"kubernetes.io/projected/b7abaead-f598-4f9a-87ab-4e8f02067d1f-kube-api-access-bgksd\") pod \"b7abaead-f598-4f9a-87ab-4e8f02067d1f\" (UID: \"b7abaead-f598-4f9a-87ab-4e8f02067d1f\") "
Mar 09 10:02:06 crc kubenswrapper[4792]: I0309 10:02:06.569419 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b7abaead-f598-4f9a-87ab-4e8f02067d1f-kube-api-access-bgksd" (OuterVolumeSpecName: "kube-api-access-bgksd") pod "b7abaead-f598-4f9a-87ab-4e8f02067d1f" (UID: "b7abaead-f598-4f9a-87ab-4e8f02067d1f"). InnerVolumeSpecName "kube-api-access-bgksd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 10:02:06 crc kubenswrapper[4792]: I0309 10:02:06.660348 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bgksd\" (UniqueName: \"kubernetes.io/projected/b7abaead-f598-4f9a-87ab-4e8f02067d1f-kube-api-access-bgksd\") on node \"crc\" DevicePath \"\""
Mar 09 10:02:07 crc kubenswrapper[4792]: I0309 10:02:07.040797 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550842-n6dpq" event={"ID":"b7abaead-f598-4f9a-87ab-4e8f02067d1f","Type":"ContainerDied","Data":"dd9c8a984e5f987be36b338d6315ea37e10555d5e1a0e0607d67c697b5132f34"}
Mar 09 10:02:07 crc kubenswrapper[4792]: I0309 10:02:07.041820 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dd9c8a984e5f987be36b338d6315ea37e10555d5e1a0e0607d67c697b5132f34"
Mar 09 10:02:07 crc kubenswrapper[4792]: I0309 10:02:07.041107 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550842-n6dpq"
Mar 09 10:02:07 crc kubenswrapper[4792]: I0309 10:02:07.150574 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550836-6xrj2"]
Mar 09 10:02:07 crc kubenswrapper[4792]: I0309 10:02:07.158899 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550836-6xrj2"]
Mar 09 10:02:07 crc kubenswrapper[4792]: I0309 10:02:07.675464 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5108dd62-354b-4ee4-98d2-0a48df21d76c" path="/var/lib/kubelet/pods/5108dd62-354b-4ee4-98d2-0a48df21d76c/volumes"
Mar 09 10:02:10 crc kubenswrapper[4792]: I0309 10:02:10.323836 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-share-share1-0"
Mar 09 10:02:12 crc kubenswrapper[4792]: I0309 10:02:12.012217 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-scheduler-0"
Mar 09 10:02:12 crc kubenswrapper[4792]: I0309 10:02:12.662319 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-85fd9d548-2q98q"
Mar 09 10:02:12 crc kubenswrapper[4792]: I0309 10:02:12.663933 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-85fd9d548-2q98q"
Mar 09 10:02:15 crc kubenswrapper[4792]: I0309 10:02:15.485621 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-54c85f748d-wxdlf"
Mar 09 10:02:17 crc kubenswrapper[4792]: I0309 10:02:17.335614 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-54c85f748d-wxdlf"
Mar 09 10:02:17 crc kubenswrapper[4792]: I0309 10:02:17.410714 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-85fd9d548-2q98q"]
Mar 09 10:02:17 crc kubenswrapper[4792]: I0309 10:02:17.412185 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-85fd9d548-2q98q" podUID="24426baa-19e0-4ac0-87b6-0f5a824de578" containerName="horizon-log" containerID="cri-o://ad1a1add70b8923f0b7ada06f9c8f1ad552fcf6e3221b1cb992de05b4d3b0fae" gracePeriod=30
Mar 09 10:02:17 crc kubenswrapper[4792]: I0309 10:02:17.412384 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-85fd9d548-2q98q" podUID="24426baa-19e0-4ac0-87b6-0f5a824de578" containerName="horizon" containerID="cri-o://9e011312293d1f2560f449acc36e137cc8d2a6a8fe0ed081593a41609c24b5c2" gracePeriod=30
Mar 09 10:02:22 crc kubenswrapper[4792]: I0309 10:02:22.525852 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-share-share1-0"
Mar 09 10:02:26 crc kubenswrapper[4792]: I0309 10:02:26.319728 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0"
Mar 09 10:02:33 crc kubenswrapper[4792]: I0309
10:02:33.354988 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-qngvw"] Mar 09 10:02:33 crc kubenswrapper[4792]: E0309 10:02:33.356088 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7abaead-f598-4f9a-87ab-4e8f02067d1f" containerName="oc" Mar 09 10:02:33 crc kubenswrapper[4792]: I0309 10:02:33.356191 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7abaead-f598-4f9a-87ab-4e8f02067d1f" containerName="oc" Mar 09 10:02:33 crc kubenswrapper[4792]: I0309 10:02:33.356406 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7abaead-f598-4f9a-87ab-4e8f02067d1f" containerName="oc" Mar 09 10:02:33 crc kubenswrapper[4792]: I0309 10:02:33.359733 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qngvw" Mar 09 10:02:33 crc kubenswrapper[4792]: I0309 10:02:33.371088 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qngvw"] Mar 09 10:02:33 crc kubenswrapper[4792]: I0309 10:02:33.525463 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7mqgb\" (UniqueName: \"kubernetes.io/projected/09dc4275-266b-4e64-b11a-5ae4a1fd893e-kube-api-access-7mqgb\") pod \"redhat-operators-qngvw\" (UID: \"09dc4275-266b-4e64-b11a-5ae4a1fd893e\") " pod="openshift-marketplace/redhat-operators-qngvw" Mar 09 10:02:33 crc kubenswrapper[4792]: I0309 10:02:33.525515 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/09dc4275-266b-4e64-b11a-5ae4a1fd893e-catalog-content\") pod \"redhat-operators-qngvw\" (UID: \"09dc4275-266b-4e64-b11a-5ae4a1fd893e\") " pod="openshift-marketplace/redhat-operators-qngvw" Mar 09 10:02:33 crc kubenswrapper[4792]: I0309 10:02:33.525613 4792 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/09dc4275-266b-4e64-b11a-5ae4a1fd893e-utilities\") pod \"redhat-operators-qngvw\" (UID: \"09dc4275-266b-4e64-b11a-5ae4a1fd893e\") " pod="openshift-marketplace/redhat-operators-qngvw" Mar 09 10:02:33 crc kubenswrapper[4792]: I0309 10:02:33.626915 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7mqgb\" (UniqueName: \"kubernetes.io/projected/09dc4275-266b-4e64-b11a-5ae4a1fd893e-kube-api-access-7mqgb\") pod \"redhat-operators-qngvw\" (UID: \"09dc4275-266b-4e64-b11a-5ae4a1fd893e\") " pod="openshift-marketplace/redhat-operators-qngvw" Mar 09 10:02:33 crc kubenswrapper[4792]: I0309 10:02:33.626964 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/09dc4275-266b-4e64-b11a-5ae4a1fd893e-catalog-content\") pod \"redhat-operators-qngvw\" (UID: \"09dc4275-266b-4e64-b11a-5ae4a1fd893e\") " pod="openshift-marketplace/redhat-operators-qngvw" Mar 09 10:02:33 crc kubenswrapper[4792]: I0309 10:02:33.627046 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/09dc4275-266b-4e64-b11a-5ae4a1fd893e-utilities\") pod \"redhat-operators-qngvw\" (UID: \"09dc4275-266b-4e64-b11a-5ae4a1fd893e\") " pod="openshift-marketplace/redhat-operators-qngvw" Mar 09 10:02:33 crc kubenswrapper[4792]: I0309 10:02:33.627580 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/09dc4275-266b-4e64-b11a-5ae4a1fd893e-catalog-content\") pod \"redhat-operators-qngvw\" (UID: \"09dc4275-266b-4e64-b11a-5ae4a1fd893e\") " pod="openshift-marketplace/redhat-operators-qngvw" Mar 09 10:02:33 crc kubenswrapper[4792]: I0309 10:02:33.627634 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"utilities\" (UniqueName: \"kubernetes.io/empty-dir/09dc4275-266b-4e64-b11a-5ae4a1fd893e-utilities\") pod \"redhat-operators-qngvw\" (UID: \"09dc4275-266b-4e64-b11a-5ae4a1fd893e\") " pod="openshift-marketplace/redhat-operators-qngvw" Mar 09 10:02:33 crc kubenswrapper[4792]: I0309 10:02:33.650247 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7mqgb\" (UniqueName: \"kubernetes.io/projected/09dc4275-266b-4e64-b11a-5ae4a1fd893e-kube-api-access-7mqgb\") pod \"redhat-operators-qngvw\" (UID: \"09dc4275-266b-4e64-b11a-5ae4a1fd893e\") " pod="openshift-marketplace/redhat-operators-qngvw" Mar 09 10:02:33 crc kubenswrapper[4792]: I0309 10:02:33.683778 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qngvw" Mar 09 10:02:34 crc kubenswrapper[4792]: I0309 10:02:34.169239 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qngvw"] Mar 09 10:02:34 crc kubenswrapper[4792]: I0309 10:02:34.321402 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qngvw" event={"ID":"09dc4275-266b-4e64-b11a-5ae4a1fd893e","Type":"ContainerStarted","Data":"95bc70d9db2ef45ae68ba7d879aa84aba4d3402b6760006801d7478c38dcff7f"} Mar 09 10:02:35 crc kubenswrapper[4792]: I0309 10:02:35.332904 4792 generic.go:334] "Generic (PLEG): container finished" podID="09dc4275-266b-4e64-b11a-5ae4a1fd893e" containerID="ead14a5506709bfeb88a25f31476d4e16feb65b76ad28b110bacc60c328b709c" exitCode=0 Mar 09 10:02:35 crc kubenswrapper[4792]: I0309 10:02:35.332950 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qngvw" event={"ID":"09dc4275-266b-4e64-b11a-5ae4a1fd893e","Type":"ContainerDied","Data":"ead14a5506709bfeb88a25f31476d4e16feb65b76ad28b110bacc60c328b709c"} Mar 09 10:02:37 crc kubenswrapper[4792]: I0309 10:02:37.356043 4792 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-qngvw" event={"ID":"09dc4275-266b-4e64-b11a-5ae4a1fd893e","Type":"ContainerStarted","Data":"50d09aee34bc4d9382ed72ad174b834cbe098f1ec18e09637acb5d7e71a888cb"} Mar 09 10:02:42 crc kubenswrapper[4792]: I0309 10:02:42.919694 4792 scope.go:117] "RemoveContainer" containerID="addaa2d57488cf2486de334fe002b7d168ddce48454fbcd434735a6ef143ac9d" Mar 09 10:02:43 crc kubenswrapper[4792]: I0309 10:02:43.416774 4792 generic.go:334] "Generic (PLEG): container finished" podID="09dc4275-266b-4e64-b11a-5ae4a1fd893e" containerID="50d09aee34bc4d9382ed72ad174b834cbe098f1ec18e09637acb5d7e71a888cb" exitCode=0 Mar 09 10:02:43 crc kubenswrapper[4792]: I0309 10:02:43.416839 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qngvw" event={"ID":"09dc4275-266b-4e64-b11a-5ae4a1fd893e","Type":"ContainerDied","Data":"50d09aee34bc4d9382ed72ad174b834cbe098f1ec18e09637acb5d7e71a888cb"} Mar 09 10:02:44 crc kubenswrapper[4792]: I0309 10:02:44.429132 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qngvw" event={"ID":"09dc4275-266b-4e64-b11a-5ae4a1fd893e","Type":"ContainerStarted","Data":"5cdca57b430f477afefbfccc9944fe297de8edc8be5d33f76c208a23c497e5c5"} Mar 09 10:02:44 crc kubenswrapper[4792]: I0309 10:02:44.458133 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-qngvw" podStartSLOduration=2.93473649 podStartE2EDuration="11.458109007s" podCreationTimestamp="2026-03-09 10:02:33 +0000 UTC" firstStartedPulling="2026-03-09 10:02:35.340142777 +0000 UTC m=+3320.370343529" lastFinishedPulling="2026-03-09 10:02:43.863515294 +0000 UTC m=+3328.893716046" observedRunningTime="2026-03-09 10:02:44.449656881 +0000 UTC m=+3329.479857643" watchObservedRunningTime="2026-03-09 10:02:44.458109007 +0000 UTC m=+3329.488309759" Mar 09 10:02:47 crc kubenswrapper[4792]: I0309 10:02:47.510629 4792 generic.go:334] 
"Generic (PLEG): container finished" podID="24426baa-19e0-4ac0-87b6-0f5a824de578" containerID="9e011312293d1f2560f449acc36e137cc8d2a6a8fe0ed081593a41609c24b5c2" exitCode=137 Mar 09 10:02:47 crc kubenswrapper[4792]: I0309 10:02:47.510809 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-85fd9d548-2q98q" event={"ID":"24426baa-19e0-4ac0-87b6-0f5a824de578","Type":"ContainerDied","Data":"9e011312293d1f2560f449acc36e137cc8d2a6a8fe0ed081593a41609c24b5c2"} Mar 09 10:02:47 crc kubenswrapper[4792]: I0309 10:02:47.511487 4792 scope.go:117] "RemoveContainer" containerID="8db2115c69bab35530668a1e9f86126f8a7d265d2b553c3753f91f58c2a978ce" Mar 09 10:02:48 crc kubenswrapper[4792]: I0309 10:02:48.022528 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-85fd9d548-2q98q" Mar 09 10:02:48 crc kubenswrapper[4792]: I0309 10:02:48.166545 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/24426baa-19e0-4ac0-87b6-0f5a824de578-config-data\") pod \"24426baa-19e0-4ac0-87b6-0f5a824de578\" (UID: \"24426baa-19e0-4ac0-87b6-0f5a824de578\") " Mar 09 10:02:48 crc kubenswrapper[4792]: I0309 10:02:48.166682 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/24426baa-19e0-4ac0-87b6-0f5a824de578-horizon-tls-certs\") pod \"24426baa-19e0-4ac0-87b6-0f5a824de578\" (UID: \"24426baa-19e0-4ac0-87b6-0f5a824de578\") " Mar 09 10:02:48 crc kubenswrapper[4792]: I0309 10:02:48.166715 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/24426baa-19e0-4ac0-87b6-0f5a824de578-logs\") pod \"24426baa-19e0-4ac0-87b6-0f5a824de578\" (UID: \"24426baa-19e0-4ac0-87b6-0f5a824de578\") " Mar 09 10:02:48 crc kubenswrapper[4792]: I0309 10:02:48.166760 4792 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/24426baa-19e0-4ac0-87b6-0f5a824de578-scripts\") pod \"24426baa-19e0-4ac0-87b6-0f5a824de578\" (UID: \"24426baa-19e0-4ac0-87b6-0f5a824de578\") " Mar 09 10:02:48 crc kubenswrapper[4792]: I0309 10:02:48.166811 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9kblf\" (UniqueName: \"kubernetes.io/projected/24426baa-19e0-4ac0-87b6-0f5a824de578-kube-api-access-9kblf\") pod \"24426baa-19e0-4ac0-87b6-0f5a824de578\" (UID: \"24426baa-19e0-4ac0-87b6-0f5a824de578\") " Mar 09 10:02:48 crc kubenswrapper[4792]: I0309 10:02:48.166829 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/24426baa-19e0-4ac0-87b6-0f5a824de578-horizon-secret-key\") pod \"24426baa-19e0-4ac0-87b6-0f5a824de578\" (UID: \"24426baa-19e0-4ac0-87b6-0f5a824de578\") " Mar 09 10:02:48 crc kubenswrapper[4792]: I0309 10:02:48.166918 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24426baa-19e0-4ac0-87b6-0f5a824de578-combined-ca-bundle\") pod \"24426baa-19e0-4ac0-87b6-0f5a824de578\" (UID: \"24426baa-19e0-4ac0-87b6-0f5a824de578\") " Mar 09 10:02:48 crc kubenswrapper[4792]: I0309 10:02:48.167615 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/24426baa-19e0-4ac0-87b6-0f5a824de578-logs" (OuterVolumeSpecName: "logs") pod "24426baa-19e0-4ac0-87b6-0f5a824de578" (UID: "24426baa-19e0-4ac0-87b6-0f5a824de578"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 10:02:48 crc kubenswrapper[4792]: I0309 10:02:48.174873 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24426baa-19e0-4ac0-87b6-0f5a824de578-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "24426baa-19e0-4ac0-87b6-0f5a824de578" (UID: "24426baa-19e0-4ac0-87b6-0f5a824de578"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 10:02:48 crc kubenswrapper[4792]: I0309 10:02:48.183215 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/24426baa-19e0-4ac0-87b6-0f5a824de578-kube-api-access-9kblf" (OuterVolumeSpecName: "kube-api-access-9kblf") pod "24426baa-19e0-4ac0-87b6-0f5a824de578" (UID: "24426baa-19e0-4ac0-87b6-0f5a824de578"). InnerVolumeSpecName "kube-api-access-9kblf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 10:02:48 crc kubenswrapper[4792]: I0309 10:02:48.198539 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/24426baa-19e0-4ac0-87b6-0f5a824de578-config-data" (OuterVolumeSpecName: "config-data") pod "24426baa-19e0-4ac0-87b6-0f5a824de578" (UID: "24426baa-19e0-4ac0-87b6-0f5a824de578"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 10:02:48 crc kubenswrapper[4792]: I0309 10:02:48.202053 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/24426baa-19e0-4ac0-87b6-0f5a824de578-scripts" (OuterVolumeSpecName: "scripts") pod "24426baa-19e0-4ac0-87b6-0f5a824de578" (UID: "24426baa-19e0-4ac0-87b6-0f5a824de578"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 10:02:48 crc kubenswrapper[4792]: I0309 10:02:48.210208 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24426baa-19e0-4ac0-87b6-0f5a824de578-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "24426baa-19e0-4ac0-87b6-0f5a824de578" (UID: "24426baa-19e0-4ac0-87b6-0f5a824de578"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 10:02:48 crc kubenswrapper[4792]: I0309 10:02:48.228782 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24426baa-19e0-4ac0-87b6-0f5a824de578-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "24426baa-19e0-4ac0-87b6-0f5a824de578" (UID: "24426baa-19e0-4ac0-87b6-0f5a824de578"). InnerVolumeSpecName "horizon-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 10:02:48 crc kubenswrapper[4792]: I0309 10:02:48.269279 4792 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24426baa-19e0-4ac0-87b6-0f5a824de578-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 10:02:48 crc kubenswrapper[4792]: I0309 10:02:48.269314 4792 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/24426baa-19e0-4ac0-87b6-0f5a824de578-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 10:02:48 crc kubenswrapper[4792]: I0309 10:02:48.269323 4792 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/24426baa-19e0-4ac0-87b6-0f5a824de578-horizon-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 09 10:02:48 crc kubenswrapper[4792]: I0309 10:02:48.269331 4792 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/24426baa-19e0-4ac0-87b6-0f5a824de578-logs\") on node \"crc\" DevicePath 
\"\"" Mar 09 10:02:48 crc kubenswrapper[4792]: I0309 10:02:48.269340 4792 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/24426baa-19e0-4ac0-87b6-0f5a824de578-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 10:02:48 crc kubenswrapper[4792]: I0309 10:02:48.269348 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9kblf\" (UniqueName: \"kubernetes.io/projected/24426baa-19e0-4ac0-87b6-0f5a824de578-kube-api-access-9kblf\") on node \"crc\" DevicePath \"\"" Mar 09 10:02:48 crc kubenswrapper[4792]: I0309 10:02:48.269359 4792 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/24426baa-19e0-4ac0-87b6-0f5a824de578-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Mar 09 10:02:48 crc kubenswrapper[4792]: I0309 10:02:48.521334 4792 generic.go:334] "Generic (PLEG): container finished" podID="24426baa-19e0-4ac0-87b6-0f5a824de578" containerID="ad1a1add70b8923f0b7ada06f9c8f1ad552fcf6e3221b1cb992de05b4d3b0fae" exitCode=137 Mar 09 10:02:48 crc kubenswrapper[4792]: I0309 10:02:48.521384 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-85fd9d548-2q98q" event={"ID":"24426baa-19e0-4ac0-87b6-0f5a824de578","Type":"ContainerDied","Data":"ad1a1add70b8923f0b7ada06f9c8f1ad552fcf6e3221b1cb992de05b4d3b0fae"} Mar 09 10:02:48 crc kubenswrapper[4792]: I0309 10:02:48.521404 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-85fd9d548-2q98q" event={"ID":"24426baa-19e0-4ac0-87b6-0f5a824de578","Type":"ContainerDied","Data":"7ada0d813d0028ce6cdd2d57ef4866a86ced1f5fcb0cdc777fe1e07c9020b9a1"} Mar 09 10:02:48 crc kubenswrapper[4792]: I0309 10:02:48.521436 4792 scope.go:117] "RemoveContainer" containerID="9e011312293d1f2560f449acc36e137cc8d2a6a8fe0ed081593a41609c24b5c2" Mar 09 10:02:48 crc kubenswrapper[4792]: I0309 10:02:48.522169 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-85fd9d548-2q98q" Mar 09 10:02:48 crc kubenswrapper[4792]: I0309 10:02:48.584214 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-85fd9d548-2q98q"] Mar 09 10:02:48 crc kubenswrapper[4792]: I0309 10:02:48.590312 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-85fd9d548-2q98q"] Mar 09 10:02:48 crc kubenswrapper[4792]: I0309 10:02:48.729420 4792 scope.go:117] "RemoveContainer" containerID="ad1a1add70b8923f0b7ada06f9c8f1ad552fcf6e3221b1cb992de05b4d3b0fae" Mar 09 10:02:48 crc kubenswrapper[4792]: I0309 10:02:48.764644 4792 scope.go:117] "RemoveContainer" containerID="9e011312293d1f2560f449acc36e137cc8d2a6a8fe0ed081593a41609c24b5c2" Mar 09 10:02:48 crc kubenswrapper[4792]: E0309 10:02:48.765330 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9e011312293d1f2560f449acc36e137cc8d2a6a8fe0ed081593a41609c24b5c2\": container with ID starting with 9e011312293d1f2560f449acc36e137cc8d2a6a8fe0ed081593a41609c24b5c2 not found: ID does not exist" containerID="9e011312293d1f2560f449acc36e137cc8d2a6a8fe0ed081593a41609c24b5c2" Mar 09 10:02:48 crc kubenswrapper[4792]: I0309 10:02:48.765415 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e011312293d1f2560f449acc36e137cc8d2a6a8fe0ed081593a41609c24b5c2"} err="failed to get container status \"9e011312293d1f2560f449acc36e137cc8d2a6a8fe0ed081593a41609c24b5c2\": rpc error: code = NotFound desc = could not find container \"9e011312293d1f2560f449acc36e137cc8d2a6a8fe0ed081593a41609c24b5c2\": container with ID starting with 9e011312293d1f2560f449acc36e137cc8d2a6a8fe0ed081593a41609c24b5c2 not found: ID does not exist" Mar 09 10:02:48 crc kubenswrapper[4792]: I0309 10:02:48.765456 4792 scope.go:117] "RemoveContainer" containerID="ad1a1add70b8923f0b7ada06f9c8f1ad552fcf6e3221b1cb992de05b4d3b0fae" Mar 09 10:02:48 crc 
kubenswrapper[4792]: E0309 10:02:48.766040 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ad1a1add70b8923f0b7ada06f9c8f1ad552fcf6e3221b1cb992de05b4d3b0fae\": container with ID starting with ad1a1add70b8923f0b7ada06f9c8f1ad552fcf6e3221b1cb992de05b4d3b0fae not found: ID does not exist" containerID="ad1a1add70b8923f0b7ada06f9c8f1ad552fcf6e3221b1cb992de05b4d3b0fae" Mar 09 10:02:48 crc kubenswrapper[4792]: I0309 10:02:48.766099 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad1a1add70b8923f0b7ada06f9c8f1ad552fcf6e3221b1cb992de05b4d3b0fae"} err="failed to get container status \"ad1a1add70b8923f0b7ada06f9c8f1ad552fcf6e3221b1cb992de05b4d3b0fae\": rpc error: code = NotFound desc = could not find container \"ad1a1add70b8923f0b7ada06f9c8f1ad552fcf6e3221b1cb992de05b4d3b0fae\": container with ID starting with ad1a1add70b8923f0b7ada06f9c8f1ad552fcf6e3221b1cb992de05b4d3b0fae not found: ID does not exist" Mar 09 10:02:49 crc kubenswrapper[4792]: I0309 10:02:49.675698 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="24426baa-19e0-4ac0-87b6-0f5a824de578" path="/var/lib/kubelet/pods/24426baa-19e0-4ac0-87b6-0f5a824de578/volumes" Mar 09 10:02:53 crc kubenswrapper[4792]: I0309 10:02:53.684146 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-qngvw" Mar 09 10:02:53 crc kubenswrapper[4792]: I0309 10:02:53.684654 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-qngvw" Mar 09 10:02:54 crc kubenswrapper[4792]: I0309 10:02:54.731246 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-qngvw" podUID="09dc4275-266b-4e64-b11a-5ae4a1fd893e" containerName="registry-server" probeResult="failure" output=< Mar 09 10:02:54 crc kubenswrapper[4792]: timeout: failed to 
connect service ":50051" within 1s Mar 09 10:02:54 crc kubenswrapper[4792]: > Mar 09 10:03:03 crc kubenswrapper[4792]: I0309 10:03:03.737926 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-qngvw" Mar 09 10:03:03 crc kubenswrapper[4792]: I0309 10:03:03.799492 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-qngvw" Mar 09 10:03:04 crc kubenswrapper[4792]: I0309 10:03:04.574493 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-qngvw"] Mar 09 10:03:05 crc kubenswrapper[4792]: I0309 10:03:05.683961 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-qngvw" podUID="09dc4275-266b-4e64-b11a-5ae4a1fd893e" containerName="registry-server" containerID="cri-o://5cdca57b430f477afefbfccc9944fe297de8edc8be5d33f76c208a23c497e5c5" gracePeriod=2 Mar 09 10:03:06 crc kubenswrapper[4792]: I0309 10:03:06.186134 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-qngvw" Mar 09 10:03:06 crc kubenswrapper[4792]: I0309 10:03:06.351041 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7mqgb\" (UniqueName: \"kubernetes.io/projected/09dc4275-266b-4e64-b11a-5ae4a1fd893e-kube-api-access-7mqgb\") pod \"09dc4275-266b-4e64-b11a-5ae4a1fd893e\" (UID: \"09dc4275-266b-4e64-b11a-5ae4a1fd893e\") " Mar 09 10:03:06 crc kubenswrapper[4792]: I0309 10:03:06.351247 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/09dc4275-266b-4e64-b11a-5ae4a1fd893e-catalog-content\") pod \"09dc4275-266b-4e64-b11a-5ae4a1fd893e\" (UID: \"09dc4275-266b-4e64-b11a-5ae4a1fd893e\") " Mar 09 10:03:06 crc kubenswrapper[4792]: I0309 10:03:06.351297 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/09dc4275-266b-4e64-b11a-5ae4a1fd893e-utilities\") pod \"09dc4275-266b-4e64-b11a-5ae4a1fd893e\" (UID: \"09dc4275-266b-4e64-b11a-5ae4a1fd893e\") " Mar 09 10:03:06 crc kubenswrapper[4792]: I0309 10:03:06.352669 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/09dc4275-266b-4e64-b11a-5ae4a1fd893e-utilities" (OuterVolumeSpecName: "utilities") pod "09dc4275-266b-4e64-b11a-5ae4a1fd893e" (UID: "09dc4275-266b-4e64-b11a-5ae4a1fd893e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 10:03:06 crc kubenswrapper[4792]: I0309 10:03:06.364305 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09dc4275-266b-4e64-b11a-5ae4a1fd893e-kube-api-access-7mqgb" (OuterVolumeSpecName: "kube-api-access-7mqgb") pod "09dc4275-266b-4e64-b11a-5ae4a1fd893e" (UID: "09dc4275-266b-4e64-b11a-5ae4a1fd893e"). InnerVolumeSpecName "kube-api-access-7mqgb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 10:03:06 crc kubenswrapper[4792]: I0309 10:03:06.453673 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7mqgb\" (UniqueName: \"kubernetes.io/projected/09dc4275-266b-4e64-b11a-5ae4a1fd893e-kube-api-access-7mqgb\") on node \"crc\" DevicePath \"\"" Mar 09 10:03:06 crc kubenswrapper[4792]: I0309 10:03:06.453721 4792 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/09dc4275-266b-4e64-b11a-5ae4a1fd893e-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 10:03:06 crc kubenswrapper[4792]: I0309 10:03:06.484837 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/09dc4275-266b-4e64-b11a-5ae4a1fd893e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "09dc4275-266b-4e64-b11a-5ae4a1fd893e" (UID: "09dc4275-266b-4e64-b11a-5ae4a1fd893e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 10:03:06 crc kubenswrapper[4792]: I0309 10:03:06.557053 4792 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/09dc4275-266b-4e64-b11a-5ae4a1fd893e-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 10:03:06 crc kubenswrapper[4792]: I0309 10:03:06.693942 4792 generic.go:334] "Generic (PLEG): container finished" podID="09dc4275-266b-4e64-b11a-5ae4a1fd893e" containerID="5cdca57b430f477afefbfccc9944fe297de8edc8be5d33f76c208a23c497e5c5" exitCode=0 Mar 09 10:03:06 crc kubenswrapper[4792]: I0309 10:03:06.693996 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qngvw" event={"ID":"09dc4275-266b-4e64-b11a-5ae4a1fd893e","Type":"ContainerDied","Data":"5cdca57b430f477afefbfccc9944fe297de8edc8be5d33f76c208a23c497e5c5"} Mar 09 10:03:06 crc kubenswrapper[4792]: I0309 10:03:06.694009 4792 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qngvw" Mar 09 10:03:06 crc kubenswrapper[4792]: I0309 10:03:06.694028 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qngvw" event={"ID":"09dc4275-266b-4e64-b11a-5ae4a1fd893e","Type":"ContainerDied","Data":"95bc70d9db2ef45ae68ba7d879aa84aba4d3402b6760006801d7478c38dcff7f"} Mar 09 10:03:06 crc kubenswrapper[4792]: I0309 10:03:06.694049 4792 scope.go:117] "RemoveContainer" containerID="5cdca57b430f477afefbfccc9944fe297de8edc8be5d33f76c208a23c497e5c5" Mar 09 10:03:06 crc kubenswrapper[4792]: I0309 10:03:06.732724 4792 scope.go:117] "RemoveContainer" containerID="50d09aee34bc4d9382ed72ad174b834cbe098f1ec18e09637acb5d7e71a888cb" Mar 09 10:03:06 crc kubenswrapper[4792]: I0309 10:03:06.738205 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-qngvw"] Mar 09 10:03:06 crc kubenswrapper[4792]: I0309 10:03:06.752291 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-qngvw"] Mar 09 10:03:06 crc kubenswrapper[4792]: I0309 10:03:06.756373 4792 scope.go:117] "RemoveContainer" containerID="ead14a5506709bfeb88a25f31476d4e16feb65b76ad28b110bacc60c328b709c" Mar 09 10:03:06 crc kubenswrapper[4792]: I0309 10:03:06.807643 4792 scope.go:117] "RemoveContainer" containerID="5cdca57b430f477afefbfccc9944fe297de8edc8be5d33f76c208a23c497e5c5" Mar 09 10:03:06 crc kubenswrapper[4792]: E0309 10:03:06.808172 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5cdca57b430f477afefbfccc9944fe297de8edc8be5d33f76c208a23c497e5c5\": container with ID starting with 5cdca57b430f477afefbfccc9944fe297de8edc8be5d33f76c208a23c497e5c5 not found: ID does not exist" containerID="5cdca57b430f477afefbfccc9944fe297de8edc8be5d33f76c208a23c497e5c5" Mar 09 10:03:06 crc kubenswrapper[4792]: I0309 10:03:06.808292 4792 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5cdca57b430f477afefbfccc9944fe297de8edc8be5d33f76c208a23c497e5c5"} err="failed to get container status \"5cdca57b430f477afefbfccc9944fe297de8edc8be5d33f76c208a23c497e5c5\": rpc error: code = NotFound desc = could not find container \"5cdca57b430f477afefbfccc9944fe297de8edc8be5d33f76c208a23c497e5c5\": container with ID starting with 5cdca57b430f477afefbfccc9944fe297de8edc8be5d33f76c208a23c497e5c5 not found: ID does not exist" Mar 09 10:03:06 crc kubenswrapper[4792]: I0309 10:03:06.808378 4792 scope.go:117] "RemoveContainer" containerID="50d09aee34bc4d9382ed72ad174b834cbe098f1ec18e09637acb5d7e71a888cb" Mar 09 10:03:06 crc kubenswrapper[4792]: E0309 10:03:06.809089 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"50d09aee34bc4d9382ed72ad174b834cbe098f1ec18e09637acb5d7e71a888cb\": container with ID starting with 50d09aee34bc4d9382ed72ad174b834cbe098f1ec18e09637acb5d7e71a888cb not found: ID does not exist" containerID="50d09aee34bc4d9382ed72ad174b834cbe098f1ec18e09637acb5d7e71a888cb" Mar 09 10:03:06 crc kubenswrapper[4792]: I0309 10:03:06.809133 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50d09aee34bc4d9382ed72ad174b834cbe098f1ec18e09637acb5d7e71a888cb"} err="failed to get container status \"50d09aee34bc4d9382ed72ad174b834cbe098f1ec18e09637acb5d7e71a888cb\": rpc error: code = NotFound desc = could not find container \"50d09aee34bc4d9382ed72ad174b834cbe098f1ec18e09637acb5d7e71a888cb\": container with ID starting with 50d09aee34bc4d9382ed72ad174b834cbe098f1ec18e09637acb5d7e71a888cb not found: ID does not exist" Mar 09 10:03:06 crc kubenswrapper[4792]: I0309 10:03:06.809163 4792 scope.go:117] "RemoveContainer" containerID="ead14a5506709bfeb88a25f31476d4e16feb65b76ad28b110bacc60c328b709c" Mar 09 10:03:06 crc kubenswrapper[4792]: E0309 
10:03:06.809648 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ead14a5506709bfeb88a25f31476d4e16feb65b76ad28b110bacc60c328b709c\": container with ID starting with ead14a5506709bfeb88a25f31476d4e16feb65b76ad28b110bacc60c328b709c not found: ID does not exist" containerID="ead14a5506709bfeb88a25f31476d4e16feb65b76ad28b110bacc60c328b709c" Mar 09 10:03:06 crc kubenswrapper[4792]: I0309 10:03:06.809809 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ead14a5506709bfeb88a25f31476d4e16feb65b76ad28b110bacc60c328b709c"} err="failed to get container status \"ead14a5506709bfeb88a25f31476d4e16feb65b76ad28b110bacc60c328b709c\": rpc error: code = NotFound desc = could not find container \"ead14a5506709bfeb88a25f31476d4e16feb65b76ad28b110bacc60c328b709c\": container with ID starting with ead14a5506709bfeb88a25f31476d4e16feb65b76ad28b110bacc60c328b709c not found: ID does not exist" Mar 09 10:03:07 crc kubenswrapper[4792]: I0309 10:03:07.676720 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09dc4275-266b-4e64-b11a-5ae4a1fd893e" path="/var/lib/kubelet/pods/09dc4275-266b-4e64-b11a-5ae4a1fd893e/volumes" Mar 09 10:03:16 crc kubenswrapper[4792]: I0309 10:03:16.858515 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-tmxq5"] Mar 09 10:03:16 crc kubenswrapper[4792]: E0309 10:03:16.859547 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09dc4275-266b-4e64-b11a-5ae4a1fd893e" containerName="registry-server" Mar 09 10:03:16 crc kubenswrapper[4792]: I0309 10:03:16.859566 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="09dc4275-266b-4e64-b11a-5ae4a1fd893e" containerName="registry-server" Mar 09 10:03:16 crc kubenswrapper[4792]: E0309 10:03:16.859606 4792 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="24426baa-19e0-4ac0-87b6-0f5a824de578" containerName="horizon-log" Mar 09 10:03:16 crc kubenswrapper[4792]: I0309 10:03:16.859614 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="24426baa-19e0-4ac0-87b6-0f5a824de578" containerName="horizon-log" Mar 09 10:03:16 crc kubenswrapper[4792]: E0309 10:03:16.859627 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09dc4275-266b-4e64-b11a-5ae4a1fd893e" containerName="extract-utilities" Mar 09 10:03:16 crc kubenswrapper[4792]: I0309 10:03:16.859635 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="09dc4275-266b-4e64-b11a-5ae4a1fd893e" containerName="extract-utilities" Mar 09 10:03:16 crc kubenswrapper[4792]: E0309 10:03:16.859645 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09dc4275-266b-4e64-b11a-5ae4a1fd893e" containerName="extract-content" Mar 09 10:03:16 crc kubenswrapper[4792]: I0309 10:03:16.859653 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="09dc4275-266b-4e64-b11a-5ae4a1fd893e" containerName="extract-content" Mar 09 10:03:16 crc kubenswrapper[4792]: E0309 10:03:16.859668 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24426baa-19e0-4ac0-87b6-0f5a824de578" containerName="horizon" Mar 09 10:03:16 crc kubenswrapper[4792]: I0309 10:03:16.859675 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="24426baa-19e0-4ac0-87b6-0f5a824de578" containerName="horizon" Mar 09 10:03:16 crc kubenswrapper[4792]: E0309 10:03:16.859687 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24426baa-19e0-4ac0-87b6-0f5a824de578" containerName="horizon" Mar 09 10:03:16 crc kubenswrapper[4792]: I0309 10:03:16.859694 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="24426baa-19e0-4ac0-87b6-0f5a824de578" containerName="horizon" Mar 09 10:03:16 crc kubenswrapper[4792]: I0309 10:03:16.859904 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="09dc4275-266b-4e64-b11a-5ae4a1fd893e" 
containerName="registry-server" Mar 09 10:03:16 crc kubenswrapper[4792]: I0309 10:03:16.860053 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="24426baa-19e0-4ac0-87b6-0f5a824de578" containerName="horizon-log" Mar 09 10:03:16 crc kubenswrapper[4792]: I0309 10:03:16.860097 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="24426baa-19e0-4ac0-87b6-0f5a824de578" containerName="horizon" Mar 09 10:03:16 crc kubenswrapper[4792]: I0309 10:03:16.860111 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="24426baa-19e0-4ac0-87b6-0f5a824de578" containerName="horizon" Mar 09 10:03:16 crc kubenswrapper[4792]: I0309 10:03:16.861756 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-tmxq5" Mar 09 10:03:16 crc kubenswrapper[4792]: I0309 10:03:16.870042 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-tmxq5"] Mar 09 10:03:16 crc kubenswrapper[4792]: I0309 10:03:16.971048 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/678edd14-d8e9-4d28-a228-f9ba12a219a1-catalog-content\") pod \"certified-operators-tmxq5\" (UID: \"678edd14-d8e9-4d28-a228-f9ba12a219a1\") " pod="openshift-marketplace/certified-operators-tmxq5" Mar 09 10:03:16 crc kubenswrapper[4792]: I0309 10:03:16.973543 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6xcwb\" (UniqueName: \"kubernetes.io/projected/678edd14-d8e9-4d28-a228-f9ba12a219a1-kube-api-access-6xcwb\") pod \"certified-operators-tmxq5\" (UID: \"678edd14-d8e9-4d28-a228-f9ba12a219a1\") " pod="openshift-marketplace/certified-operators-tmxq5" Mar 09 10:03:16 crc kubenswrapper[4792]: I0309 10:03:16.974140 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"utilities\" (UniqueName: \"kubernetes.io/empty-dir/678edd14-d8e9-4d28-a228-f9ba12a219a1-utilities\") pod \"certified-operators-tmxq5\" (UID: \"678edd14-d8e9-4d28-a228-f9ba12a219a1\") " pod="openshift-marketplace/certified-operators-tmxq5" Mar 09 10:03:17 crc kubenswrapper[4792]: I0309 10:03:17.075622 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6xcwb\" (UniqueName: \"kubernetes.io/projected/678edd14-d8e9-4d28-a228-f9ba12a219a1-kube-api-access-6xcwb\") pod \"certified-operators-tmxq5\" (UID: \"678edd14-d8e9-4d28-a228-f9ba12a219a1\") " pod="openshift-marketplace/certified-operators-tmxq5" Mar 09 10:03:17 crc kubenswrapper[4792]: I0309 10:03:17.075775 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/678edd14-d8e9-4d28-a228-f9ba12a219a1-utilities\") pod \"certified-operators-tmxq5\" (UID: \"678edd14-d8e9-4d28-a228-f9ba12a219a1\") " pod="openshift-marketplace/certified-operators-tmxq5" Mar 09 10:03:17 crc kubenswrapper[4792]: I0309 10:03:17.075849 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/678edd14-d8e9-4d28-a228-f9ba12a219a1-catalog-content\") pod \"certified-operators-tmxq5\" (UID: \"678edd14-d8e9-4d28-a228-f9ba12a219a1\") " pod="openshift-marketplace/certified-operators-tmxq5" Mar 09 10:03:17 crc kubenswrapper[4792]: I0309 10:03:17.078589 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/678edd14-d8e9-4d28-a228-f9ba12a219a1-catalog-content\") pod \"certified-operators-tmxq5\" (UID: \"678edd14-d8e9-4d28-a228-f9ba12a219a1\") " pod="openshift-marketplace/certified-operators-tmxq5" Mar 09 10:03:17 crc kubenswrapper[4792]: I0309 10:03:17.079261 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/678edd14-d8e9-4d28-a228-f9ba12a219a1-utilities\") pod \"certified-operators-tmxq5\" (UID: \"678edd14-d8e9-4d28-a228-f9ba12a219a1\") " pod="openshift-marketplace/certified-operators-tmxq5" Mar 09 10:03:17 crc kubenswrapper[4792]: I0309 10:03:17.099050 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6xcwb\" (UniqueName: \"kubernetes.io/projected/678edd14-d8e9-4d28-a228-f9ba12a219a1-kube-api-access-6xcwb\") pod \"certified-operators-tmxq5\" (UID: \"678edd14-d8e9-4d28-a228-f9ba12a219a1\") " pod="openshift-marketplace/certified-operators-tmxq5" Mar 09 10:03:17 crc kubenswrapper[4792]: I0309 10:03:17.206316 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-tmxq5" Mar 09 10:03:17 crc kubenswrapper[4792]: I0309 10:03:17.722330 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-tmxq5"] Mar 09 10:03:17 crc kubenswrapper[4792]: I0309 10:03:17.798311 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tmxq5" event={"ID":"678edd14-d8e9-4d28-a228-f9ba12a219a1","Type":"ContainerStarted","Data":"3f032c5a9c585c0ca12b257b4b97da58c8d645889d6f9ac84ba7d7385fab2374"} Mar 09 10:03:18 crc kubenswrapper[4792]: I0309 10:03:18.811164 4792 generic.go:334] "Generic (PLEG): container finished" podID="678edd14-d8e9-4d28-a228-f9ba12a219a1" containerID="d15d3071c018c490eb8c756b28d797ce270b8c88e1b325c557d113fe046983b8" exitCode=0 Mar 09 10:03:18 crc kubenswrapper[4792]: I0309 10:03:18.811343 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tmxq5" event={"ID":"678edd14-d8e9-4d28-a228-f9ba12a219a1","Type":"ContainerDied","Data":"d15d3071c018c490eb8c756b28d797ce270b8c88e1b325c557d113fe046983b8"} Mar 09 10:03:18 crc kubenswrapper[4792]: I0309 10:03:18.813772 4792 provider.go:102] Refreshing cache for provider: 
*credentialprovider.defaultDockerConfigProvider Mar 09 10:03:19 crc kubenswrapper[4792]: I0309 10:03:19.823221 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tmxq5" event={"ID":"678edd14-d8e9-4d28-a228-f9ba12a219a1","Type":"ContainerStarted","Data":"dec9340e2cdf3c56511c833d7cf5b7c2a63d489ac08683c3b347df6829792b17"} Mar 09 10:03:22 crc kubenswrapper[4792]: I0309 10:03:22.850898 4792 generic.go:334] "Generic (PLEG): container finished" podID="678edd14-d8e9-4d28-a228-f9ba12a219a1" containerID="dec9340e2cdf3c56511c833d7cf5b7c2a63d489ac08683c3b347df6829792b17" exitCode=0 Mar 09 10:03:22 crc kubenswrapper[4792]: I0309 10:03:22.850983 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tmxq5" event={"ID":"678edd14-d8e9-4d28-a228-f9ba12a219a1","Type":"ContainerDied","Data":"dec9340e2cdf3c56511c833d7cf5b7c2a63d489ac08683c3b347df6829792b17"} Mar 09 10:03:23 crc kubenswrapper[4792]: I0309 10:03:23.864702 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tmxq5" event={"ID":"678edd14-d8e9-4d28-a228-f9ba12a219a1","Type":"ContainerStarted","Data":"790dd42f6730b433fae21b35f470dd66256586b7ea60016329b2dd5f3147386e"} Mar 09 10:03:23 crc kubenswrapper[4792]: I0309 10:03:23.886771 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-tmxq5" podStartSLOduration=3.360518279 podStartE2EDuration="7.886754375s" podCreationTimestamp="2026-03-09 10:03:16 +0000 UTC" firstStartedPulling="2026-03-09 10:03:18.813434095 +0000 UTC m=+3363.843634847" lastFinishedPulling="2026-03-09 10:03:23.339670191 +0000 UTC m=+3368.369870943" observedRunningTime="2026-03-09 10:03:23.884155529 +0000 UTC m=+3368.914356301" watchObservedRunningTime="2026-03-09 10:03:23.886754375 +0000 UTC m=+3368.916955127" Mar 09 10:03:27 crc kubenswrapper[4792]: I0309 10:03:27.207187 4792 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-tmxq5" Mar 09 10:03:27 crc kubenswrapper[4792]: I0309 10:03:27.207765 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-tmxq5" Mar 09 10:03:28 crc kubenswrapper[4792]: I0309 10:03:28.249742 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-tmxq5" podUID="678edd14-d8e9-4d28-a228-f9ba12a219a1" containerName="registry-server" probeResult="failure" output=< Mar 09 10:03:28 crc kubenswrapper[4792]: timeout: failed to connect service ":50051" within 1s Mar 09 10:03:28 crc kubenswrapper[4792]: > Mar 09 10:03:35 crc kubenswrapper[4792]: I0309 10:03:35.678361 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"] Mar 09 10:03:35 crc kubenswrapper[4792]: I0309 10:03:35.680172 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Mar 09 10:03:35 crc kubenswrapper[4792]: I0309 10:03:35.683592 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0" Mar 09 10:03:35 crc kubenswrapper[4792]: I0309 10:03:35.683808 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key" Mar 09 10:03:35 crc kubenswrapper[4792]: I0309 10:03:35.683997 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Mar 09 10:03:35 crc kubenswrapper[4792]: I0309 10:03:35.684333 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Mar 09 10:03:35 crc kubenswrapper[4792]: I0309 10:03:35.691738 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-mgw7j" Mar 09 10:03:35 crc kubenswrapper[4792]: I0309 10:03:35.755482 4792 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/152f601f-0625-4503-a057-26316d8504aa-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"152f601f-0625-4503-a057-26316d8504aa\") " pod="openstack/tempest-tests-tempest" Mar 09 10:03:35 crc kubenswrapper[4792]: I0309 10:03:35.755553 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/152f601f-0625-4503-a057-26316d8504aa-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"152f601f-0625-4503-a057-26316d8504aa\") " pod="openstack/tempest-tests-tempest" Mar 09 10:03:35 crc kubenswrapper[4792]: I0309 10:03:35.755594 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/152f601f-0625-4503-a057-26316d8504aa-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"152f601f-0625-4503-a057-26316d8504aa\") " pod="openstack/tempest-tests-tempest" Mar 09 10:03:35 crc kubenswrapper[4792]: I0309 10:03:35.755706 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"tempest-tests-tempest\" (UID: \"152f601f-0625-4503-a057-26316d8504aa\") " pod="openstack/tempest-tests-tempest" Mar 09 10:03:35 crc kubenswrapper[4792]: I0309 10:03:35.756167 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/152f601f-0625-4503-a057-26316d8504aa-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"152f601f-0625-4503-a057-26316d8504aa\") " pod="openstack/tempest-tests-tempest" Mar 09 10:03:35 crc kubenswrapper[4792]: I0309 10:03:35.756222 4792 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/152f601f-0625-4503-a057-26316d8504aa-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"152f601f-0625-4503-a057-26316d8504aa\") " pod="openstack/tempest-tests-tempest" Mar 09 10:03:35 crc kubenswrapper[4792]: I0309 10:03:35.756973 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qfc2t\" (UniqueName: \"kubernetes.io/projected/152f601f-0625-4503-a057-26316d8504aa-kube-api-access-qfc2t\") pod \"tempest-tests-tempest\" (UID: \"152f601f-0625-4503-a057-26316d8504aa\") " pod="openstack/tempest-tests-tempest" Mar 09 10:03:35 crc kubenswrapper[4792]: I0309 10:03:35.758184 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/152f601f-0625-4503-a057-26316d8504aa-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"152f601f-0625-4503-a057-26316d8504aa\") " pod="openstack/tempest-tests-tempest" Mar 09 10:03:35 crc kubenswrapper[4792]: I0309 10:03:35.758244 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/152f601f-0625-4503-a057-26316d8504aa-config-data\") pod \"tempest-tests-tempest\" (UID: \"152f601f-0625-4503-a057-26316d8504aa\") " pod="openstack/tempest-tests-tempest" Mar 09 10:03:35 crc kubenswrapper[4792]: I0309 10:03:35.860358 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/152f601f-0625-4503-a057-26316d8504aa-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"152f601f-0625-4503-a057-26316d8504aa\") " pod="openstack/tempest-tests-tempest" Mar 09 10:03:35 crc kubenswrapper[4792]: I0309 
10:03:35.860883 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/152f601f-0625-4503-a057-26316d8504aa-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"152f601f-0625-4503-a057-26316d8504aa\") " pod="openstack/tempest-tests-tempest" Mar 09 10:03:35 crc kubenswrapper[4792]: I0309 10:03:35.860838 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/152f601f-0625-4503-a057-26316d8504aa-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"152f601f-0625-4503-a057-26316d8504aa\") " pod="openstack/tempest-tests-tempest" Mar 09 10:03:35 crc kubenswrapper[4792]: I0309 10:03:35.860961 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/152f601f-0625-4503-a057-26316d8504aa-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"152f601f-0625-4503-a057-26316d8504aa\") " pod="openstack/tempest-tests-tempest" Mar 09 10:03:35 crc kubenswrapper[4792]: I0309 10:03:35.862184 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/152f601f-0625-4503-a057-26316d8504aa-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"152f601f-0625-4503-a057-26316d8504aa\") " pod="openstack/tempest-tests-tempest" Mar 09 10:03:35 crc kubenswrapper[4792]: I0309 10:03:35.863630 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"tempest-tests-tempest\" (UID: \"152f601f-0625-4503-a057-26316d8504aa\") " pod="openstack/tempest-tests-tempest" Mar 09 10:03:35 crc kubenswrapper[4792]: I0309 10:03:35.863723 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ca-certs\" (UniqueName: \"kubernetes.io/secret/152f601f-0625-4503-a057-26316d8504aa-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"152f601f-0625-4503-a057-26316d8504aa\") " pod="openstack/tempest-tests-tempest" Mar 09 10:03:35 crc kubenswrapper[4792]: I0309 10:03:35.863759 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/152f601f-0625-4503-a057-26316d8504aa-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"152f601f-0625-4503-a057-26316d8504aa\") " pod="openstack/tempest-tests-tempest" Mar 09 10:03:35 crc kubenswrapper[4792]: I0309 10:03:35.863922 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qfc2t\" (UniqueName: \"kubernetes.io/projected/152f601f-0625-4503-a057-26316d8504aa-kube-api-access-qfc2t\") pod \"tempest-tests-tempest\" (UID: \"152f601f-0625-4503-a057-26316d8504aa\") " pod="openstack/tempest-tests-tempest" Mar 09 10:03:35 crc kubenswrapper[4792]: I0309 10:03:35.863986 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/152f601f-0625-4503-a057-26316d8504aa-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"152f601f-0625-4503-a057-26316d8504aa\") " pod="openstack/tempest-tests-tempest" Mar 09 10:03:35 crc kubenswrapper[4792]: I0309 10:03:35.864024 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/152f601f-0625-4503-a057-26316d8504aa-config-data\") pod \"tempest-tests-tempest\" (UID: \"152f601f-0625-4503-a057-26316d8504aa\") " pod="openstack/tempest-tests-tempest" Mar 09 10:03:35 crc kubenswrapper[4792]: I0309 10:03:35.864140 4792 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod 
\"tempest-tests-tempest\" (UID: \"152f601f-0625-4503-a057-26316d8504aa\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/tempest-tests-tempest" Mar 09 10:03:35 crc kubenswrapper[4792]: I0309 10:03:35.865355 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/152f601f-0625-4503-a057-26316d8504aa-config-data\") pod \"tempest-tests-tempest\" (UID: \"152f601f-0625-4503-a057-26316d8504aa\") " pod="openstack/tempest-tests-tempest" Mar 09 10:03:35 crc kubenswrapper[4792]: I0309 10:03:35.866446 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/152f601f-0625-4503-a057-26316d8504aa-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"152f601f-0625-4503-a057-26316d8504aa\") " pod="openstack/tempest-tests-tempest" Mar 09 10:03:35 crc kubenswrapper[4792]: I0309 10:03:35.873200 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/152f601f-0625-4503-a057-26316d8504aa-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"152f601f-0625-4503-a057-26316d8504aa\") " pod="openstack/tempest-tests-tempest" Mar 09 10:03:35 crc kubenswrapper[4792]: I0309 10:03:35.876048 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/152f601f-0625-4503-a057-26316d8504aa-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"152f601f-0625-4503-a057-26316d8504aa\") " pod="openstack/tempest-tests-tempest" Mar 09 10:03:35 crc kubenswrapper[4792]: I0309 10:03:35.885232 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/152f601f-0625-4503-a057-26316d8504aa-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"152f601f-0625-4503-a057-26316d8504aa\") " 
pod="openstack/tempest-tests-tempest" Mar 09 10:03:35 crc kubenswrapper[4792]: I0309 10:03:35.887625 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qfc2t\" (UniqueName: \"kubernetes.io/projected/152f601f-0625-4503-a057-26316d8504aa-kube-api-access-qfc2t\") pod \"tempest-tests-tempest\" (UID: \"152f601f-0625-4503-a057-26316d8504aa\") " pod="openstack/tempest-tests-tempest" Mar 09 10:03:35 crc kubenswrapper[4792]: I0309 10:03:35.907129 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"tempest-tests-tempest\" (UID: \"152f601f-0625-4503-a057-26316d8504aa\") " pod="openstack/tempest-tests-tempest" Mar 09 10:03:36 crc kubenswrapper[4792]: I0309 10:03:36.000013 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Mar 09 10:03:36 crc kubenswrapper[4792]: I0309 10:03:36.464463 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Mar 09 10:03:36 crc kubenswrapper[4792]: I0309 10:03:36.974633 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"152f601f-0625-4503-a057-26316d8504aa","Type":"ContainerStarted","Data":"cc926f31dda71f4dc98e8de554cd3cc9d6f7431a9fd2b8604a45c704922cc63e"} Mar 09 10:03:37 crc kubenswrapper[4792]: I0309 10:03:37.274529 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-tmxq5" Mar 09 10:03:37 crc kubenswrapper[4792]: I0309 10:03:37.335000 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-tmxq5" Mar 09 10:03:37 crc kubenswrapper[4792]: I0309 10:03:37.526574 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-tmxq5"] Mar 09 10:03:39 crc kubenswrapper[4792]: 
I0309 10:03:39.000224 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-tmxq5" podUID="678edd14-d8e9-4d28-a228-f9ba12a219a1" containerName="registry-server" containerID="cri-o://790dd42f6730b433fae21b35f470dd66256586b7ea60016329b2dd5f3147386e" gracePeriod=2 Mar 09 10:03:40 crc kubenswrapper[4792]: I0309 10:03:40.015653 4792 generic.go:334] "Generic (PLEG): container finished" podID="678edd14-d8e9-4d28-a228-f9ba12a219a1" containerID="790dd42f6730b433fae21b35f470dd66256586b7ea60016329b2dd5f3147386e" exitCode=0 Mar 09 10:03:40 crc kubenswrapper[4792]: I0309 10:03:40.015842 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tmxq5" event={"ID":"678edd14-d8e9-4d28-a228-f9ba12a219a1","Type":"ContainerDied","Data":"790dd42f6730b433fae21b35f470dd66256586b7ea60016329b2dd5f3147386e"} Mar 09 10:03:43 crc kubenswrapper[4792]: I0309 10:03:43.214473 4792 patch_prober.go:28] interesting pod/machine-config-daemon-97tth container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 10:03:43 crc kubenswrapper[4792]: I0309 10:03:43.215012 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 10:03:43 crc kubenswrapper[4792]: I0309 10:03:43.350649 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-tmxq5" Mar 09 10:03:43 crc kubenswrapper[4792]: I0309 10:03:43.454156 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6xcwb\" (UniqueName: \"kubernetes.io/projected/678edd14-d8e9-4d28-a228-f9ba12a219a1-kube-api-access-6xcwb\") pod \"678edd14-d8e9-4d28-a228-f9ba12a219a1\" (UID: \"678edd14-d8e9-4d28-a228-f9ba12a219a1\") " Mar 09 10:03:43 crc kubenswrapper[4792]: I0309 10:03:43.454240 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/678edd14-d8e9-4d28-a228-f9ba12a219a1-utilities\") pod \"678edd14-d8e9-4d28-a228-f9ba12a219a1\" (UID: \"678edd14-d8e9-4d28-a228-f9ba12a219a1\") " Mar 09 10:03:43 crc kubenswrapper[4792]: I0309 10:03:43.454357 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/678edd14-d8e9-4d28-a228-f9ba12a219a1-catalog-content\") pod \"678edd14-d8e9-4d28-a228-f9ba12a219a1\" (UID: \"678edd14-d8e9-4d28-a228-f9ba12a219a1\") " Mar 09 10:03:43 crc kubenswrapper[4792]: I0309 10:03:43.455162 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/678edd14-d8e9-4d28-a228-f9ba12a219a1-utilities" (OuterVolumeSpecName: "utilities") pod "678edd14-d8e9-4d28-a228-f9ba12a219a1" (UID: "678edd14-d8e9-4d28-a228-f9ba12a219a1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 10:03:43 crc kubenswrapper[4792]: I0309 10:03:43.466325 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/678edd14-d8e9-4d28-a228-f9ba12a219a1-kube-api-access-6xcwb" (OuterVolumeSpecName: "kube-api-access-6xcwb") pod "678edd14-d8e9-4d28-a228-f9ba12a219a1" (UID: "678edd14-d8e9-4d28-a228-f9ba12a219a1"). InnerVolumeSpecName "kube-api-access-6xcwb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 10:03:43 crc kubenswrapper[4792]: I0309 10:03:43.512206 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/678edd14-d8e9-4d28-a228-f9ba12a219a1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "678edd14-d8e9-4d28-a228-f9ba12a219a1" (UID: "678edd14-d8e9-4d28-a228-f9ba12a219a1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 10:03:43 crc kubenswrapper[4792]: I0309 10:03:43.556720 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6xcwb\" (UniqueName: \"kubernetes.io/projected/678edd14-d8e9-4d28-a228-f9ba12a219a1-kube-api-access-6xcwb\") on node \"crc\" DevicePath \"\"" Mar 09 10:03:43 crc kubenswrapper[4792]: I0309 10:03:43.556767 4792 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/678edd14-d8e9-4d28-a228-f9ba12a219a1-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 10:03:43 crc kubenswrapper[4792]: I0309 10:03:43.556780 4792 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/678edd14-d8e9-4d28-a228-f9ba12a219a1-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 10:03:44 crc kubenswrapper[4792]: I0309 10:03:44.089855 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tmxq5" event={"ID":"678edd14-d8e9-4d28-a228-f9ba12a219a1","Type":"ContainerDied","Data":"3f032c5a9c585c0ca12b257b4b97da58c8d645889d6f9ac84ba7d7385fab2374"} Mar 09 10:03:44 crc kubenswrapper[4792]: I0309 10:03:44.089895 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-tmxq5" Mar 09 10:03:44 crc kubenswrapper[4792]: I0309 10:03:44.090288 4792 scope.go:117] "RemoveContainer" containerID="790dd42f6730b433fae21b35f470dd66256586b7ea60016329b2dd5f3147386e" Mar 09 10:03:44 crc kubenswrapper[4792]: I0309 10:03:44.123922 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-tmxq5"] Mar 09 10:03:44 crc kubenswrapper[4792]: I0309 10:03:44.128911 4792 scope.go:117] "RemoveContainer" containerID="dec9340e2cdf3c56511c833d7cf5b7c2a63d489ac08683c3b347df6829792b17" Mar 09 10:03:44 crc kubenswrapper[4792]: I0309 10:03:44.136907 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-tmxq5"] Mar 09 10:03:44 crc kubenswrapper[4792]: I0309 10:03:44.158948 4792 scope.go:117] "RemoveContainer" containerID="d15d3071c018c490eb8c756b28d797ce270b8c88e1b325c557d113fe046983b8" Mar 09 10:03:45 crc kubenswrapper[4792]: I0309 10:03:45.682186 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="678edd14-d8e9-4d28-a228-f9ba12a219a1" path="/var/lib/kubelet/pods/678edd14-d8e9-4d28-a228-f9ba12a219a1/volumes" Mar 09 10:04:00 crc kubenswrapper[4792]: I0309 10:04:00.185210 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550844-jt4tj"] Mar 09 10:04:00 crc kubenswrapper[4792]: E0309 10:04:00.186308 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="678edd14-d8e9-4d28-a228-f9ba12a219a1" containerName="extract-utilities" Mar 09 10:04:00 crc kubenswrapper[4792]: I0309 10:04:00.186325 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="678edd14-d8e9-4d28-a228-f9ba12a219a1" containerName="extract-utilities" Mar 09 10:04:00 crc kubenswrapper[4792]: E0309 10:04:00.186349 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="678edd14-d8e9-4d28-a228-f9ba12a219a1" containerName="extract-content" Mar 09 
10:04:00 crc kubenswrapper[4792]: I0309 10:04:00.186363 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="678edd14-d8e9-4d28-a228-f9ba12a219a1" containerName="extract-content" Mar 09 10:04:00 crc kubenswrapper[4792]: E0309 10:04:00.186386 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="678edd14-d8e9-4d28-a228-f9ba12a219a1" containerName="registry-server" Mar 09 10:04:00 crc kubenswrapper[4792]: I0309 10:04:00.186392 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="678edd14-d8e9-4d28-a228-f9ba12a219a1" containerName="registry-server" Mar 09 10:04:00 crc kubenswrapper[4792]: I0309 10:04:00.186605 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="678edd14-d8e9-4d28-a228-f9ba12a219a1" containerName="registry-server" Mar 09 10:04:00 crc kubenswrapper[4792]: I0309 10:04:00.187585 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550844-jt4tj" Mar 09 10:04:00 crc kubenswrapper[4792]: I0309 10:04:00.191438 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 10:04:00 crc kubenswrapper[4792]: I0309 10:04:00.191557 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 10:04:00 crc kubenswrapper[4792]: I0309 10:04:00.191823 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fwclj" Mar 09 10:04:00 crc kubenswrapper[4792]: I0309 10:04:00.204466 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550844-jt4tj"] Mar 09 10:04:00 crc kubenswrapper[4792]: I0309 10:04:00.250425 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6x478\" (UniqueName: \"kubernetes.io/projected/6fdf0a54-c4a3-49fa-8556-1b426a7b5252-kube-api-access-6x478\") pod 
\"auto-csr-approver-29550844-jt4tj\" (UID: \"6fdf0a54-c4a3-49fa-8556-1b426a7b5252\") " pod="openshift-infra/auto-csr-approver-29550844-jt4tj" Mar 09 10:04:00 crc kubenswrapper[4792]: I0309 10:04:00.352692 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6x478\" (UniqueName: \"kubernetes.io/projected/6fdf0a54-c4a3-49fa-8556-1b426a7b5252-kube-api-access-6x478\") pod \"auto-csr-approver-29550844-jt4tj\" (UID: \"6fdf0a54-c4a3-49fa-8556-1b426a7b5252\") " pod="openshift-infra/auto-csr-approver-29550844-jt4tj" Mar 09 10:04:00 crc kubenswrapper[4792]: I0309 10:04:00.375719 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6x478\" (UniqueName: \"kubernetes.io/projected/6fdf0a54-c4a3-49fa-8556-1b426a7b5252-kube-api-access-6x478\") pod \"auto-csr-approver-29550844-jt4tj\" (UID: \"6fdf0a54-c4a3-49fa-8556-1b426a7b5252\") " pod="openshift-infra/auto-csr-approver-29550844-jt4tj" Mar 09 10:04:00 crc kubenswrapper[4792]: I0309 10:04:00.515452 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550844-jt4tj" Mar 09 10:04:13 crc kubenswrapper[4792]: I0309 10:04:13.214109 4792 patch_prober.go:28] interesting pod/machine-config-daemon-97tth container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 10:04:13 crc kubenswrapper[4792]: I0309 10:04:13.214665 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 10:04:21 crc kubenswrapper[4792]: E0309 10:04:21.898906 4792 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified" Mar 09 10:04:21 crc kubenswrapper[4792]: E0309 10:04:21.904768 4792 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:tempest-tests-tempest-tests-runner,Image:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/test_operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-workdir,ReadOnly:false,MountPath:/var/lib/tempest,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-temporary,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-logs,ReadOnly:false,MountPath:/var/lib/tempest/external_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/etc/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/var/lib/tempest/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/etc/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ssh-key,ReadOnly:false,MountPath:/var/lib/tempest/id_ecdsa,SubPath:ssh_key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qfc2t,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:n
il,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42480,RunAsNonRoot:*false,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*true,RunAsGroup:*42480,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-custom-data-s0,},Optional:nil,},SecretRef:nil,},EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-env-vars-s0,},Optional:nil,},SecretRef:nil,},},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod tempest-tests-tempest_openstack(152f601f-0625-4503-a057-26316d8504aa): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 09 10:04:21 crc kubenswrapper[4792]: E0309 10:04:21.907621 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/tempest-tests-tempest" podUID="152f601f-0625-4503-a057-26316d8504aa" Mar 09 10:04:22 crc kubenswrapper[4792]: I0309 10:04:22.487940 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550844-jt4tj"] Mar 09 10:04:22 crc kubenswrapper[4792]: I0309 10:04:22.535920 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550844-jt4tj" 
event={"ID":"6fdf0a54-c4a3-49fa-8556-1b426a7b5252","Type":"ContainerStarted","Data":"91c0797d88c487b8246b04e5c55e4e11ad1894c58dd2eca04b76a6654b791dad"} Mar 09 10:04:22 crc kubenswrapper[4792]: E0309 10:04:22.539664 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified\\\"\"" pod="openstack/tempest-tests-tempest" podUID="152f601f-0625-4503-a057-26316d8504aa" Mar 09 10:04:24 crc kubenswrapper[4792]: I0309 10:04:24.559033 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550844-jt4tj" event={"ID":"6fdf0a54-c4a3-49fa-8556-1b426a7b5252","Type":"ContainerStarted","Data":"750bf3fe741383fe544b752ba1ca9bb8340c5d3efa4e0e2206241a750b46a229"} Mar 09 10:04:26 crc kubenswrapper[4792]: I0309 10:04:26.579630 4792 generic.go:334] "Generic (PLEG): container finished" podID="6fdf0a54-c4a3-49fa-8556-1b426a7b5252" containerID="750bf3fe741383fe544b752ba1ca9bb8340c5d3efa4e0e2206241a750b46a229" exitCode=0 Mar 09 10:04:26 crc kubenswrapper[4792]: I0309 10:04:26.579766 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550844-jt4tj" event={"ID":"6fdf0a54-c4a3-49fa-8556-1b426a7b5252","Type":"ContainerDied","Data":"750bf3fe741383fe544b752ba1ca9bb8340c5d3efa4e0e2206241a750b46a229"} Mar 09 10:04:27 crc kubenswrapper[4792]: I0309 10:04:27.985834 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550844-jt4tj" Mar 09 10:04:28 crc kubenswrapper[4792]: I0309 10:04:28.046566 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6x478\" (UniqueName: \"kubernetes.io/projected/6fdf0a54-c4a3-49fa-8556-1b426a7b5252-kube-api-access-6x478\") pod \"6fdf0a54-c4a3-49fa-8556-1b426a7b5252\" (UID: \"6fdf0a54-c4a3-49fa-8556-1b426a7b5252\") " Mar 09 10:04:28 crc kubenswrapper[4792]: I0309 10:04:28.071501 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6fdf0a54-c4a3-49fa-8556-1b426a7b5252-kube-api-access-6x478" (OuterVolumeSpecName: "kube-api-access-6x478") pod "6fdf0a54-c4a3-49fa-8556-1b426a7b5252" (UID: "6fdf0a54-c4a3-49fa-8556-1b426a7b5252"). InnerVolumeSpecName "kube-api-access-6x478". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 10:04:28 crc kubenswrapper[4792]: I0309 10:04:28.148967 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6x478\" (UniqueName: \"kubernetes.io/projected/6fdf0a54-c4a3-49fa-8556-1b426a7b5252-kube-api-access-6x478\") on node \"crc\" DevicePath \"\"" Mar 09 10:04:28 crc kubenswrapper[4792]: I0309 10:04:28.597818 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550844-jt4tj" event={"ID":"6fdf0a54-c4a3-49fa-8556-1b426a7b5252","Type":"ContainerDied","Data":"91c0797d88c487b8246b04e5c55e4e11ad1894c58dd2eca04b76a6654b791dad"} Mar 09 10:04:28 crc kubenswrapper[4792]: I0309 10:04:28.597868 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="91c0797d88c487b8246b04e5c55e4e11ad1894c58dd2eca04b76a6654b791dad" Mar 09 10:04:28 crc kubenswrapper[4792]: I0309 10:04:28.597875 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550844-jt4tj" Mar 09 10:04:28 crc kubenswrapper[4792]: I0309 10:04:28.668039 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550838-p4hmx"] Mar 09 10:04:28 crc kubenswrapper[4792]: I0309 10:04:28.677523 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550838-p4hmx"] Mar 09 10:04:29 crc kubenswrapper[4792]: I0309 10:04:29.677256 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="32655fe9-88fd-433c-bdc9-21077ef6d0e2" path="/var/lib/kubelet/pods/32655fe9-88fd-433c-bdc9-21077ef6d0e2/volumes" Mar 09 10:04:35 crc kubenswrapper[4792]: I0309 10:04:35.173582 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Mar 09 10:04:36 crc kubenswrapper[4792]: I0309 10:04:36.665664 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"152f601f-0625-4503-a057-26316d8504aa","Type":"ContainerStarted","Data":"ccd41eb94e758842d138600a71c55ca2bee3a90aac5a0ed874f5d6a78dc3c9d7"} Mar 09 10:04:36 crc kubenswrapper[4792]: I0309 10:04:36.698816 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=3.99396453 podStartE2EDuration="1m2.698788015s" podCreationTimestamp="2026-03-09 10:03:34 +0000 UTC" firstStartedPulling="2026-03-09 10:03:36.465410081 +0000 UTC m=+3381.495610823" lastFinishedPulling="2026-03-09 10:04:35.170233546 +0000 UTC m=+3440.200434308" observedRunningTime="2026-03-09 10:04:36.692763635 +0000 UTC m=+3441.722964387" watchObservedRunningTime="2026-03-09 10:04:36.698788015 +0000 UTC m=+3441.728988767" Mar 09 10:04:43 crc kubenswrapper[4792]: I0309 10:04:43.214407 4792 patch_prober.go:28] interesting pod/machine-config-daemon-97tth container/machine-config-daemon namespace/openshift-machine-config-operator: 
Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 10:04:43 crc kubenswrapper[4792]: I0309 10:04:43.214920 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 10:04:43 crc kubenswrapper[4792]: I0309 10:04:43.214965 4792 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-97tth" Mar 09 10:04:43 crc kubenswrapper[4792]: I0309 10:04:43.215770 4792 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8b8ee3a4ed67871065ec447eb157a9db2536cee10ff7911e7415e58bb9ff5c63"} pod="openshift-machine-config-operator/machine-config-daemon-97tth" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 09 10:04:43 crc kubenswrapper[4792]: I0309 10:04:43.215837 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" containerName="machine-config-daemon" containerID="cri-o://8b8ee3a4ed67871065ec447eb157a9db2536cee10ff7911e7415e58bb9ff5c63" gracePeriod=600 Mar 09 10:04:43 crc kubenswrapper[4792]: I0309 10:04:43.223165 4792 scope.go:117] "RemoveContainer" containerID="5f4cd8094a9485ff642ff92b857161b5eebd3768a14918b766ee958496460106" Mar 09 10:04:43 crc kubenswrapper[4792]: I0309 10:04:43.733444 4792 generic.go:334] "Generic (PLEG): container finished" podID="bd11045a-d746-4b42-872c-8b8d1dd2d515" containerID="8b8ee3a4ed67871065ec447eb157a9db2536cee10ff7911e7415e58bb9ff5c63" 
exitCode=0 Mar 09 10:04:43 crc kubenswrapper[4792]: I0309 10:04:43.733496 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-97tth" event={"ID":"bd11045a-d746-4b42-872c-8b8d1dd2d515","Type":"ContainerDied","Data":"8b8ee3a4ed67871065ec447eb157a9db2536cee10ff7911e7415e58bb9ff5c63"} Mar 09 10:04:43 crc kubenswrapper[4792]: I0309 10:04:43.733796 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-97tth" event={"ID":"bd11045a-d746-4b42-872c-8b8d1dd2d515","Type":"ContainerStarted","Data":"27ebefbb6ee4b1e51f47c3813f2b791c0d513924dd39fe81cd2d9d65a223b925"} Mar 09 10:04:43 crc kubenswrapper[4792]: I0309 10:04:43.733834 4792 scope.go:117] "RemoveContainer" containerID="d764681645ab8670f7435c3d7eeda989bbb6c0f3c40948420b0a6a2fc3dd7e93" Mar 09 10:05:53 crc kubenswrapper[4792]: I0309 10:05:53.963492 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-84kc9"] Mar 09 10:05:53 crc kubenswrapper[4792]: E0309 10:05:53.964531 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6fdf0a54-c4a3-49fa-8556-1b426a7b5252" containerName="oc" Mar 09 10:05:53 crc kubenswrapper[4792]: I0309 10:05:53.964544 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="6fdf0a54-c4a3-49fa-8556-1b426a7b5252" containerName="oc" Mar 09 10:05:53 crc kubenswrapper[4792]: I0309 10:05:53.964742 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="6fdf0a54-c4a3-49fa-8556-1b426a7b5252" containerName="oc" Mar 09 10:05:53 crc kubenswrapper[4792]: I0309 10:05:53.966007 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-84kc9" Mar 09 10:05:53 crc kubenswrapper[4792]: I0309 10:05:53.987859 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-84kc9"] Mar 09 10:05:54 crc kubenswrapper[4792]: I0309 10:05:54.044531 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6f1aafdd-3ebf-42b1-b286-1be4f89d7cb7-catalog-content\") pod \"community-operators-84kc9\" (UID: \"6f1aafdd-3ebf-42b1-b286-1be4f89d7cb7\") " pod="openshift-marketplace/community-operators-84kc9" Mar 09 10:05:54 crc kubenswrapper[4792]: I0309 10:05:54.044670 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n2smj\" (UniqueName: \"kubernetes.io/projected/6f1aafdd-3ebf-42b1-b286-1be4f89d7cb7-kube-api-access-n2smj\") pod \"community-operators-84kc9\" (UID: \"6f1aafdd-3ebf-42b1-b286-1be4f89d7cb7\") " pod="openshift-marketplace/community-operators-84kc9" Mar 09 10:05:54 crc kubenswrapper[4792]: I0309 10:05:54.044766 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6f1aafdd-3ebf-42b1-b286-1be4f89d7cb7-utilities\") pod \"community-operators-84kc9\" (UID: \"6f1aafdd-3ebf-42b1-b286-1be4f89d7cb7\") " pod="openshift-marketplace/community-operators-84kc9" Mar 09 10:05:54 crc kubenswrapper[4792]: I0309 10:05:54.146515 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6f1aafdd-3ebf-42b1-b286-1be4f89d7cb7-utilities\") pod \"community-operators-84kc9\" (UID: \"6f1aafdd-3ebf-42b1-b286-1be4f89d7cb7\") " pod="openshift-marketplace/community-operators-84kc9" Mar 09 10:05:54 crc kubenswrapper[4792]: I0309 10:05:54.146637 4792 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6f1aafdd-3ebf-42b1-b286-1be4f89d7cb7-catalog-content\") pod \"community-operators-84kc9\" (UID: \"6f1aafdd-3ebf-42b1-b286-1be4f89d7cb7\") " pod="openshift-marketplace/community-operators-84kc9" Mar 09 10:05:54 crc kubenswrapper[4792]: I0309 10:05:54.146746 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n2smj\" (UniqueName: \"kubernetes.io/projected/6f1aafdd-3ebf-42b1-b286-1be4f89d7cb7-kube-api-access-n2smj\") pod \"community-operators-84kc9\" (UID: \"6f1aafdd-3ebf-42b1-b286-1be4f89d7cb7\") " pod="openshift-marketplace/community-operators-84kc9" Mar 09 10:05:54 crc kubenswrapper[4792]: I0309 10:05:54.147382 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6f1aafdd-3ebf-42b1-b286-1be4f89d7cb7-utilities\") pod \"community-operators-84kc9\" (UID: \"6f1aafdd-3ebf-42b1-b286-1be4f89d7cb7\") " pod="openshift-marketplace/community-operators-84kc9" Mar 09 10:05:54 crc kubenswrapper[4792]: I0309 10:05:54.147517 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6f1aafdd-3ebf-42b1-b286-1be4f89d7cb7-catalog-content\") pod \"community-operators-84kc9\" (UID: \"6f1aafdd-3ebf-42b1-b286-1be4f89d7cb7\") " pod="openshift-marketplace/community-operators-84kc9" Mar 09 10:05:54 crc kubenswrapper[4792]: I0309 10:05:54.169022 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n2smj\" (UniqueName: \"kubernetes.io/projected/6f1aafdd-3ebf-42b1-b286-1be4f89d7cb7-kube-api-access-n2smj\") pod \"community-operators-84kc9\" (UID: \"6f1aafdd-3ebf-42b1-b286-1be4f89d7cb7\") " pod="openshift-marketplace/community-operators-84kc9" Mar 09 10:05:54 crc kubenswrapper[4792]: I0309 10:05:54.303009 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-84kc9" Mar 09 10:05:54 crc kubenswrapper[4792]: I0309 10:05:54.897089 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-84kc9"] Mar 09 10:05:55 crc kubenswrapper[4792]: I0309 10:05:55.397939 4792 generic.go:334] "Generic (PLEG): container finished" podID="6f1aafdd-3ebf-42b1-b286-1be4f89d7cb7" containerID="380defa78b4c31695929f21316f05aff9750bd30c41c6976937db335a423f0c6" exitCode=0 Mar 09 10:05:55 crc kubenswrapper[4792]: I0309 10:05:55.397985 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-84kc9" event={"ID":"6f1aafdd-3ebf-42b1-b286-1be4f89d7cb7","Type":"ContainerDied","Data":"380defa78b4c31695929f21316f05aff9750bd30c41c6976937db335a423f0c6"} Mar 09 10:05:55 crc kubenswrapper[4792]: I0309 10:05:55.398035 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-84kc9" event={"ID":"6f1aafdd-3ebf-42b1-b286-1be4f89d7cb7","Type":"ContainerStarted","Data":"332e9e75ea43426a1b5e0ad055916ec467855217bfbe1daa7d56d3d594d29d51"} Mar 09 10:05:56 crc kubenswrapper[4792]: I0309 10:05:56.409764 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-84kc9" event={"ID":"6f1aafdd-3ebf-42b1-b286-1be4f89d7cb7","Type":"ContainerStarted","Data":"8cd9d511c0bfeea57f33dcffb4d977cdcd59f9e161893285d7dd9c98c1835cdb"} Mar 09 10:05:58 crc kubenswrapper[4792]: I0309 10:05:58.436181 4792 generic.go:334] "Generic (PLEG): container finished" podID="6f1aafdd-3ebf-42b1-b286-1be4f89d7cb7" containerID="8cd9d511c0bfeea57f33dcffb4d977cdcd59f9e161893285d7dd9c98c1835cdb" exitCode=0 Mar 09 10:05:58 crc kubenswrapper[4792]: I0309 10:05:58.436277 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-84kc9" 
event={"ID":"6f1aafdd-3ebf-42b1-b286-1be4f89d7cb7","Type":"ContainerDied","Data":"8cd9d511c0bfeea57f33dcffb4d977cdcd59f9e161893285d7dd9c98c1835cdb"} Mar 09 10:05:59 crc kubenswrapper[4792]: I0309 10:05:59.449257 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-84kc9" event={"ID":"6f1aafdd-3ebf-42b1-b286-1be4f89d7cb7","Type":"ContainerStarted","Data":"9b66b2868d1b8d5861c751160b97f144e7879025116fb824bc2bd3a228a17e63"} Mar 09 10:05:59 crc kubenswrapper[4792]: I0309 10:05:59.482596 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-84kc9" podStartSLOduration=3.009642971 podStartE2EDuration="6.482571328s" podCreationTimestamp="2026-03-09 10:05:53 +0000 UTC" firstStartedPulling="2026-03-09 10:05:55.399626071 +0000 UTC m=+3520.429826823" lastFinishedPulling="2026-03-09 10:05:58.872554428 +0000 UTC m=+3523.902755180" observedRunningTime="2026-03-09 10:05:59.473903787 +0000 UTC m=+3524.504104559" watchObservedRunningTime="2026-03-09 10:05:59.482571328 +0000 UTC m=+3524.512772090" Mar 09 10:06:00 crc kubenswrapper[4792]: I0309 10:06:00.147635 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550846-vdj9q"] Mar 09 10:06:00 crc kubenswrapper[4792]: I0309 10:06:00.149355 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550846-vdj9q" Mar 09 10:06:00 crc kubenswrapper[4792]: I0309 10:06:00.177889 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jcs96\" (UniqueName: \"kubernetes.io/projected/6390ec2d-07a7-446b-ad3f-38a397d3a436-kube-api-access-jcs96\") pod \"auto-csr-approver-29550846-vdj9q\" (UID: \"6390ec2d-07a7-446b-ad3f-38a397d3a436\") " pod="openshift-infra/auto-csr-approver-29550846-vdj9q" Mar 09 10:06:00 crc kubenswrapper[4792]: I0309 10:06:00.182555 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 10:06:00 crc kubenswrapper[4792]: I0309 10:06:00.182907 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fwclj" Mar 09 10:06:00 crc kubenswrapper[4792]: I0309 10:06:00.183396 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 10:06:00 crc kubenswrapper[4792]: I0309 10:06:00.187496 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550846-vdj9q"] Mar 09 10:06:00 crc kubenswrapper[4792]: I0309 10:06:00.279532 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jcs96\" (UniqueName: \"kubernetes.io/projected/6390ec2d-07a7-446b-ad3f-38a397d3a436-kube-api-access-jcs96\") pod \"auto-csr-approver-29550846-vdj9q\" (UID: \"6390ec2d-07a7-446b-ad3f-38a397d3a436\") " pod="openshift-infra/auto-csr-approver-29550846-vdj9q" Mar 09 10:06:00 crc kubenswrapper[4792]: I0309 10:06:00.317442 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jcs96\" (UniqueName: \"kubernetes.io/projected/6390ec2d-07a7-446b-ad3f-38a397d3a436-kube-api-access-jcs96\") pod \"auto-csr-approver-29550846-vdj9q\" (UID: \"6390ec2d-07a7-446b-ad3f-38a397d3a436\") " 
pod="openshift-infra/auto-csr-approver-29550846-vdj9q" Mar 09 10:06:00 crc kubenswrapper[4792]: I0309 10:06:00.483310 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550846-vdj9q" Mar 09 10:06:00 crc kubenswrapper[4792]: I0309 10:06:00.985725 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550846-vdj9q"] Mar 09 10:06:01 crc kubenswrapper[4792]: I0309 10:06:01.467681 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550846-vdj9q" event={"ID":"6390ec2d-07a7-446b-ad3f-38a397d3a436","Type":"ContainerStarted","Data":"463735dceecf68cab0b2693ccaeeea3806ab336d74c2c32fa46f2bfca54ae799"} Mar 09 10:06:02 crc kubenswrapper[4792]: I0309 10:06:02.486955 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550846-vdj9q" event={"ID":"6390ec2d-07a7-446b-ad3f-38a397d3a436","Type":"ContainerStarted","Data":"398fe95a0a57611932be7a1180c1f118048ee29c9bbf357fb8488d8b8994830b"} Mar 09 10:06:02 crc kubenswrapper[4792]: I0309 10:06:02.509123 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29550846-vdj9q" podStartSLOduration=1.4196324009999999 podStartE2EDuration="2.509072123s" podCreationTimestamp="2026-03-09 10:06:00 +0000 UTC" firstStartedPulling="2026-03-09 10:06:00.994557776 +0000 UTC m=+3526.024758528" lastFinishedPulling="2026-03-09 10:06:02.083997498 +0000 UTC m=+3527.114198250" observedRunningTime="2026-03-09 10:06:02.499909089 +0000 UTC m=+3527.530109841" watchObservedRunningTime="2026-03-09 10:06:02.509072123 +0000 UTC m=+3527.539272875" Mar 09 10:06:03 crc kubenswrapper[4792]: I0309 10:06:03.496202 4792 generic.go:334] "Generic (PLEG): container finished" podID="6390ec2d-07a7-446b-ad3f-38a397d3a436" containerID="398fe95a0a57611932be7a1180c1f118048ee29c9bbf357fb8488d8b8994830b" exitCode=0 Mar 09 10:06:03 crc 
kubenswrapper[4792]: I0309 10:06:03.496398 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550846-vdj9q" event={"ID":"6390ec2d-07a7-446b-ad3f-38a397d3a436","Type":"ContainerDied","Data":"398fe95a0a57611932be7a1180c1f118048ee29c9bbf357fb8488d8b8994830b"} Mar 09 10:06:04 crc kubenswrapper[4792]: I0309 10:06:04.303580 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-84kc9" Mar 09 10:06:04 crc kubenswrapper[4792]: I0309 10:06:04.303644 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-84kc9" Mar 09 10:06:04 crc kubenswrapper[4792]: I0309 10:06:04.900432 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550846-vdj9q" Mar 09 10:06:04 crc kubenswrapper[4792]: I0309 10:06:04.970265 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jcs96\" (UniqueName: \"kubernetes.io/projected/6390ec2d-07a7-446b-ad3f-38a397d3a436-kube-api-access-jcs96\") pod \"6390ec2d-07a7-446b-ad3f-38a397d3a436\" (UID: \"6390ec2d-07a7-446b-ad3f-38a397d3a436\") " Mar 09 10:06:04 crc kubenswrapper[4792]: I0309 10:06:04.977414 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6390ec2d-07a7-446b-ad3f-38a397d3a436-kube-api-access-jcs96" (OuterVolumeSpecName: "kube-api-access-jcs96") pod "6390ec2d-07a7-446b-ad3f-38a397d3a436" (UID: "6390ec2d-07a7-446b-ad3f-38a397d3a436"). InnerVolumeSpecName "kube-api-access-jcs96". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 10:06:05 crc kubenswrapper[4792]: I0309 10:06:05.071734 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jcs96\" (UniqueName: \"kubernetes.io/projected/6390ec2d-07a7-446b-ad3f-38a397d3a436-kube-api-access-jcs96\") on node \"crc\" DevicePath \"\"" Mar 09 10:06:05 crc kubenswrapper[4792]: I0309 10:06:05.349152 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-84kc9" podUID="6f1aafdd-3ebf-42b1-b286-1be4f89d7cb7" containerName="registry-server" probeResult="failure" output=< Mar 09 10:06:05 crc kubenswrapper[4792]: timeout: failed to connect service ":50051" within 1s Mar 09 10:06:05 crc kubenswrapper[4792]: > Mar 09 10:06:05 crc kubenswrapper[4792]: I0309 10:06:05.515458 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550846-vdj9q" event={"ID":"6390ec2d-07a7-446b-ad3f-38a397d3a436","Type":"ContainerDied","Data":"463735dceecf68cab0b2693ccaeeea3806ab336d74c2c32fa46f2bfca54ae799"} Mar 09 10:06:05 crc kubenswrapper[4792]: I0309 10:06:05.515503 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="463735dceecf68cab0b2693ccaeeea3806ab336d74c2c32fa46f2bfca54ae799" Mar 09 10:06:05 crc kubenswrapper[4792]: I0309 10:06:05.515583 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550846-vdj9q" Mar 09 10:06:05 crc kubenswrapper[4792]: I0309 10:06:05.586467 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550840-7q7wm"] Mar 09 10:06:05 crc kubenswrapper[4792]: I0309 10:06:05.595922 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550840-7q7wm"] Mar 09 10:06:05 crc kubenswrapper[4792]: I0309 10:06:05.673651 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="edc07ac6-aa9a-477f-a990-62d1cbb5b7d1" path="/var/lib/kubelet/pods/edc07ac6-aa9a-477f-a990-62d1cbb5b7d1/volumes" Mar 09 10:06:14 crc kubenswrapper[4792]: I0309 10:06:14.348880 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-84kc9" Mar 09 10:06:14 crc kubenswrapper[4792]: I0309 10:06:14.402456 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-84kc9" Mar 09 10:06:14 crc kubenswrapper[4792]: I0309 10:06:14.584020 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-84kc9"] Mar 09 10:06:15 crc kubenswrapper[4792]: I0309 10:06:15.769937 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-84kc9" podUID="6f1aafdd-3ebf-42b1-b286-1be4f89d7cb7" containerName="registry-server" containerID="cri-o://9b66b2868d1b8d5861c751160b97f144e7879025116fb824bc2bd3a228a17e63" gracePeriod=2 Mar 09 10:06:16 crc kubenswrapper[4792]: I0309 10:06:16.286271 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-84kc9" Mar 09 10:06:16 crc kubenswrapper[4792]: I0309 10:06:16.359630 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6f1aafdd-3ebf-42b1-b286-1be4f89d7cb7-utilities\") pod \"6f1aafdd-3ebf-42b1-b286-1be4f89d7cb7\" (UID: \"6f1aafdd-3ebf-42b1-b286-1be4f89d7cb7\") " Mar 09 10:06:16 crc kubenswrapper[4792]: I0309 10:06:16.360150 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n2smj\" (UniqueName: \"kubernetes.io/projected/6f1aafdd-3ebf-42b1-b286-1be4f89d7cb7-kube-api-access-n2smj\") pod \"6f1aafdd-3ebf-42b1-b286-1be4f89d7cb7\" (UID: \"6f1aafdd-3ebf-42b1-b286-1be4f89d7cb7\") " Mar 09 10:06:16 crc kubenswrapper[4792]: I0309 10:06:16.360430 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6f1aafdd-3ebf-42b1-b286-1be4f89d7cb7-catalog-content\") pod \"6f1aafdd-3ebf-42b1-b286-1be4f89d7cb7\" (UID: \"6f1aafdd-3ebf-42b1-b286-1be4f89d7cb7\") " Mar 09 10:06:16 crc kubenswrapper[4792]: I0309 10:06:16.360856 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6f1aafdd-3ebf-42b1-b286-1be4f89d7cb7-utilities" (OuterVolumeSpecName: "utilities") pod "6f1aafdd-3ebf-42b1-b286-1be4f89d7cb7" (UID: "6f1aafdd-3ebf-42b1-b286-1be4f89d7cb7"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 10:06:16 crc kubenswrapper[4792]: I0309 10:06:16.362046 4792 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6f1aafdd-3ebf-42b1-b286-1be4f89d7cb7-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 10:06:16 crc kubenswrapper[4792]: I0309 10:06:16.403007 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f1aafdd-3ebf-42b1-b286-1be4f89d7cb7-kube-api-access-n2smj" (OuterVolumeSpecName: "kube-api-access-n2smj") pod "6f1aafdd-3ebf-42b1-b286-1be4f89d7cb7" (UID: "6f1aafdd-3ebf-42b1-b286-1be4f89d7cb7"). InnerVolumeSpecName "kube-api-access-n2smj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 10:06:16 crc kubenswrapper[4792]: I0309 10:06:16.438054 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6f1aafdd-3ebf-42b1-b286-1be4f89d7cb7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6f1aafdd-3ebf-42b1-b286-1be4f89d7cb7" (UID: "6f1aafdd-3ebf-42b1-b286-1be4f89d7cb7"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 10:06:16 crc kubenswrapper[4792]: I0309 10:06:16.463312 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n2smj\" (UniqueName: \"kubernetes.io/projected/6f1aafdd-3ebf-42b1-b286-1be4f89d7cb7-kube-api-access-n2smj\") on node \"crc\" DevicePath \"\"" Mar 09 10:06:16 crc kubenswrapper[4792]: I0309 10:06:16.463343 4792 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6f1aafdd-3ebf-42b1-b286-1be4f89d7cb7-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 10:06:16 crc kubenswrapper[4792]: I0309 10:06:16.780542 4792 generic.go:334] "Generic (PLEG): container finished" podID="6f1aafdd-3ebf-42b1-b286-1be4f89d7cb7" containerID="9b66b2868d1b8d5861c751160b97f144e7879025116fb824bc2bd3a228a17e63" exitCode=0 Mar 09 10:06:16 crc kubenswrapper[4792]: I0309 10:06:16.780631 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-84kc9" Mar 09 10:06:16 crc kubenswrapper[4792]: I0309 10:06:16.780675 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-84kc9" event={"ID":"6f1aafdd-3ebf-42b1-b286-1be4f89d7cb7","Type":"ContainerDied","Data":"9b66b2868d1b8d5861c751160b97f144e7879025116fb824bc2bd3a228a17e63"} Mar 09 10:06:16 crc kubenswrapper[4792]: I0309 10:06:16.780736 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-84kc9" event={"ID":"6f1aafdd-3ebf-42b1-b286-1be4f89d7cb7","Type":"ContainerDied","Data":"332e9e75ea43426a1b5e0ad055916ec467855217bfbe1daa7d56d3d594d29d51"} Mar 09 10:06:16 crc kubenswrapper[4792]: I0309 10:06:16.780761 4792 scope.go:117] "RemoveContainer" containerID="9b66b2868d1b8d5861c751160b97f144e7879025116fb824bc2bd3a228a17e63" Mar 09 10:06:16 crc kubenswrapper[4792]: I0309 10:06:16.805319 4792 scope.go:117] "RemoveContainer" 
containerID="8cd9d511c0bfeea57f33dcffb4d977cdcd59f9e161893285d7dd9c98c1835cdb" Mar 09 10:06:16 crc kubenswrapper[4792]: I0309 10:06:16.826300 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-84kc9"] Mar 09 10:06:16 crc kubenswrapper[4792]: I0309 10:06:16.832004 4792 scope.go:117] "RemoveContainer" containerID="380defa78b4c31695929f21316f05aff9750bd30c41c6976937db335a423f0c6" Mar 09 10:06:16 crc kubenswrapper[4792]: I0309 10:06:16.836153 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-84kc9"] Mar 09 10:06:16 crc kubenswrapper[4792]: I0309 10:06:16.881041 4792 scope.go:117] "RemoveContainer" containerID="9b66b2868d1b8d5861c751160b97f144e7879025116fb824bc2bd3a228a17e63" Mar 09 10:06:16 crc kubenswrapper[4792]: E0309 10:06:16.881907 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9b66b2868d1b8d5861c751160b97f144e7879025116fb824bc2bd3a228a17e63\": container with ID starting with 9b66b2868d1b8d5861c751160b97f144e7879025116fb824bc2bd3a228a17e63 not found: ID does not exist" containerID="9b66b2868d1b8d5861c751160b97f144e7879025116fb824bc2bd3a228a17e63" Mar 09 10:06:16 crc kubenswrapper[4792]: I0309 10:06:16.881958 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b66b2868d1b8d5861c751160b97f144e7879025116fb824bc2bd3a228a17e63"} err="failed to get container status \"9b66b2868d1b8d5861c751160b97f144e7879025116fb824bc2bd3a228a17e63\": rpc error: code = NotFound desc = could not find container \"9b66b2868d1b8d5861c751160b97f144e7879025116fb824bc2bd3a228a17e63\": container with ID starting with 9b66b2868d1b8d5861c751160b97f144e7879025116fb824bc2bd3a228a17e63 not found: ID does not exist" Mar 09 10:06:16 crc kubenswrapper[4792]: I0309 10:06:16.881991 4792 scope.go:117] "RemoveContainer" 
containerID="8cd9d511c0bfeea57f33dcffb4d977cdcd59f9e161893285d7dd9c98c1835cdb" Mar 09 10:06:16 crc kubenswrapper[4792]: E0309 10:06:16.882504 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8cd9d511c0bfeea57f33dcffb4d977cdcd59f9e161893285d7dd9c98c1835cdb\": container with ID starting with 8cd9d511c0bfeea57f33dcffb4d977cdcd59f9e161893285d7dd9c98c1835cdb not found: ID does not exist" containerID="8cd9d511c0bfeea57f33dcffb4d977cdcd59f9e161893285d7dd9c98c1835cdb" Mar 09 10:06:16 crc kubenswrapper[4792]: I0309 10:06:16.882542 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8cd9d511c0bfeea57f33dcffb4d977cdcd59f9e161893285d7dd9c98c1835cdb"} err="failed to get container status \"8cd9d511c0bfeea57f33dcffb4d977cdcd59f9e161893285d7dd9c98c1835cdb\": rpc error: code = NotFound desc = could not find container \"8cd9d511c0bfeea57f33dcffb4d977cdcd59f9e161893285d7dd9c98c1835cdb\": container with ID starting with 8cd9d511c0bfeea57f33dcffb4d977cdcd59f9e161893285d7dd9c98c1835cdb not found: ID does not exist" Mar 09 10:06:16 crc kubenswrapper[4792]: I0309 10:06:16.882576 4792 scope.go:117] "RemoveContainer" containerID="380defa78b4c31695929f21316f05aff9750bd30c41c6976937db335a423f0c6" Mar 09 10:06:16 crc kubenswrapper[4792]: E0309 10:06:16.883174 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"380defa78b4c31695929f21316f05aff9750bd30c41c6976937db335a423f0c6\": container with ID starting with 380defa78b4c31695929f21316f05aff9750bd30c41c6976937db335a423f0c6 not found: ID does not exist" containerID="380defa78b4c31695929f21316f05aff9750bd30c41c6976937db335a423f0c6" Mar 09 10:06:16 crc kubenswrapper[4792]: I0309 10:06:16.883203 4792 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"380defa78b4c31695929f21316f05aff9750bd30c41c6976937db335a423f0c6"} err="failed to get container status \"380defa78b4c31695929f21316f05aff9750bd30c41c6976937db335a423f0c6\": rpc error: code = NotFound desc = could not find container \"380defa78b4c31695929f21316f05aff9750bd30c41c6976937db335a423f0c6\": container with ID starting with 380defa78b4c31695929f21316f05aff9750bd30c41c6976937db335a423f0c6 not found: ID does not exist" Mar 09 10:06:17 crc kubenswrapper[4792]: I0309 10:06:17.673803 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6f1aafdd-3ebf-42b1-b286-1be4f89d7cb7" path="/var/lib/kubelet/pods/6f1aafdd-3ebf-42b1-b286-1be4f89d7cb7/volumes" Mar 09 10:06:43 crc kubenswrapper[4792]: I0309 10:06:43.214878 4792 patch_prober.go:28] interesting pod/machine-config-daemon-97tth container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 10:06:43 crc kubenswrapper[4792]: I0309 10:06:43.215529 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 10:06:43 crc kubenswrapper[4792]: I0309 10:06:43.353337 4792 scope.go:117] "RemoveContainer" containerID="ccb893011016c45ea0f02e0e75699909f8e01bae5f90cccb437e81aa97742ee6" Mar 09 10:07:13 crc kubenswrapper[4792]: I0309 10:07:13.214752 4792 patch_prober.go:28] interesting pod/machine-config-daemon-97tth container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 
10:07:13 crc kubenswrapper[4792]: I0309 10:07:13.215309 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 10:07:24 crc kubenswrapper[4792]: I0309 10:07:24.638146 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-frmgj"] Mar 09 10:07:24 crc kubenswrapper[4792]: E0309 10:07:24.640121 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f1aafdd-3ebf-42b1-b286-1be4f89d7cb7" containerName="registry-server" Mar 09 10:07:24 crc kubenswrapper[4792]: I0309 10:07:24.640144 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f1aafdd-3ebf-42b1-b286-1be4f89d7cb7" containerName="registry-server" Mar 09 10:07:24 crc kubenswrapper[4792]: E0309 10:07:24.640170 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f1aafdd-3ebf-42b1-b286-1be4f89d7cb7" containerName="extract-content" Mar 09 10:07:24 crc kubenswrapper[4792]: I0309 10:07:24.640178 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f1aafdd-3ebf-42b1-b286-1be4f89d7cb7" containerName="extract-content" Mar 09 10:07:24 crc kubenswrapper[4792]: E0309 10:07:24.640203 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f1aafdd-3ebf-42b1-b286-1be4f89d7cb7" containerName="extract-utilities" Mar 09 10:07:24 crc kubenswrapper[4792]: I0309 10:07:24.640213 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f1aafdd-3ebf-42b1-b286-1be4f89d7cb7" containerName="extract-utilities" Mar 09 10:07:24 crc kubenswrapper[4792]: E0309 10:07:24.640235 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6390ec2d-07a7-446b-ad3f-38a397d3a436" containerName="oc" Mar 09 10:07:24 crc kubenswrapper[4792]: I0309 
10:07:24.640243 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="6390ec2d-07a7-446b-ad3f-38a397d3a436" containerName="oc" Mar 09 10:07:24 crc kubenswrapper[4792]: I0309 10:07:24.640486 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f1aafdd-3ebf-42b1-b286-1be4f89d7cb7" containerName="registry-server" Mar 09 10:07:24 crc kubenswrapper[4792]: I0309 10:07:24.640511 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="6390ec2d-07a7-446b-ad3f-38a397d3a436" containerName="oc" Mar 09 10:07:24 crc kubenswrapper[4792]: I0309 10:07:24.642203 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-frmgj" Mar 09 10:07:24 crc kubenswrapper[4792]: I0309 10:07:24.651116 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-frmgj"] Mar 09 10:07:24 crc kubenswrapper[4792]: I0309 10:07:24.828957 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a7d0181-3a71-43d0-ba11-dddb2e9e9293-utilities\") pod \"redhat-marketplace-frmgj\" (UID: \"4a7d0181-3a71-43d0-ba11-dddb2e9e9293\") " pod="openshift-marketplace/redhat-marketplace-frmgj" Mar 09 10:07:24 crc kubenswrapper[4792]: I0309 10:07:24.829013 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a7d0181-3a71-43d0-ba11-dddb2e9e9293-catalog-content\") pod \"redhat-marketplace-frmgj\" (UID: \"4a7d0181-3a71-43d0-ba11-dddb2e9e9293\") " pod="openshift-marketplace/redhat-marketplace-frmgj" Mar 09 10:07:24 crc kubenswrapper[4792]: I0309 10:07:24.829086 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tsbbj\" (UniqueName: \"kubernetes.io/projected/4a7d0181-3a71-43d0-ba11-dddb2e9e9293-kube-api-access-tsbbj\") pod 
\"redhat-marketplace-frmgj\" (UID: \"4a7d0181-3a71-43d0-ba11-dddb2e9e9293\") " pod="openshift-marketplace/redhat-marketplace-frmgj" Mar 09 10:07:24 crc kubenswrapper[4792]: I0309 10:07:24.930672 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a7d0181-3a71-43d0-ba11-dddb2e9e9293-utilities\") pod \"redhat-marketplace-frmgj\" (UID: \"4a7d0181-3a71-43d0-ba11-dddb2e9e9293\") " pod="openshift-marketplace/redhat-marketplace-frmgj" Mar 09 10:07:24 crc kubenswrapper[4792]: I0309 10:07:24.931017 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a7d0181-3a71-43d0-ba11-dddb2e9e9293-catalog-content\") pod \"redhat-marketplace-frmgj\" (UID: \"4a7d0181-3a71-43d0-ba11-dddb2e9e9293\") " pod="openshift-marketplace/redhat-marketplace-frmgj" Mar 09 10:07:24 crc kubenswrapper[4792]: I0309 10:07:24.931211 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tsbbj\" (UniqueName: \"kubernetes.io/projected/4a7d0181-3a71-43d0-ba11-dddb2e9e9293-kube-api-access-tsbbj\") pod \"redhat-marketplace-frmgj\" (UID: \"4a7d0181-3a71-43d0-ba11-dddb2e9e9293\") " pod="openshift-marketplace/redhat-marketplace-frmgj" Mar 09 10:07:24 crc kubenswrapper[4792]: I0309 10:07:24.931298 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a7d0181-3a71-43d0-ba11-dddb2e9e9293-utilities\") pod \"redhat-marketplace-frmgj\" (UID: \"4a7d0181-3a71-43d0-ba11-dddb2e9e9293\") " pod="openshift-marketplace/redhat-marketplace-frmgj" Mar 09 10:07:24 crc kubenswrapper[4792]: I0309 10:07:24.931493 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a7d0181-3a71-43d0-ba11-dddb2e9e9293-catalog-content\") pod \"redhat-marketplace-frmgj\" (UID: 
\"4a7d0181-3a71-43d0-ba11-dddb2e9e9293\") " pod="openshift-marketplace/redhat-marketplace-frmgj" Mar 09 10:07:24 crc kubenswrapper[4792]: I0309 10:07:24.966565 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tsbbj\" (UniqueName: \"kubernetes.io/projected/4a7d0181-3a71-43d0-ba11-dddb2e9e9293-kube-api-access-tsbbj\") pod \"redhat-marketplace-frmgj\" (UID: \"4a7d0181-3a71-43d0-ba11-dddb2e9e9293\") " pod="openshift-marketplace/redhat-marketplace-frmgj" Mar 09 10:07:25 crc kubenswrapper[4792]: I0309 10:07:25.263121 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-frmgj" Mar 09 10:07:25 crc kubenswrapper[4792]: I0309 10:07:25.762147 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-frmgj"] Mar 09 10:07:26 crc kubenswrapper[4792]: I0309 10:07:26.184532 4792 generic.go:334] "Generic (PLEG): container finished" podID="4a7d0181-3a71-43d0-ba11-dddb2e9e9293" containerID="ddee3a70d9eb655309e360d54909883cfd9d000c26f99891208c0b6f054976d1" exitCode=0 Mar 09 10:07:26 crc kubenswrapper[4792]: I0309 10:07:26.184604 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-frmgj" event={"ID":"4a7d0181-3a71-43d0-ba11-dddb2e9e9293","Type":"ContainerDied","Data":"ddee3a70d9eb655309e360d54909883cfd9d000c26f99891208c0b6f054976d1"} Mar 09 10:07:26 crc kubenswrapper[4792]: I0309 10:07:26.184971 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-frmgj" event={"ID":"4a7d0181-3a71-43d0-ba11-dddb2e9e9293","Type":"ContainerStarted","Data":"ff909bad01f6d465357f37b57638feb93dfdb4e6fbcdd4857c95ac89281c65db"} Mar 09 10:07:28 crc kubenswrapper[4792]: I0309 10:07:28.208052 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-frmgj" 
event={"ID":"4a7d0181-3a71-43d0-ba11-dddb2e9e9293","Type":"ContainerStarted","Data":"665afc576532c123cb0175e2cc48a982b92ea927d0924539df7614bce7bfda51"} Mar 09 10:07:29 crc kubenswrapper[4792]: I0309 10:07:29.217754 4792 generic.go:334] "Generic (PLEG): container finished" podID="4a7d0181-3a71-43d0-ba11-dddb2e9e9293" containerID="665afc576532c123cb0175e2cc48a982b92ea927d0924539df7614bce7bfda51" exitCode=0 Mar 09 10:07:29 crc kubenswrapper[4792]: I0309 10:07:29.217797 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-frmgj" event={"ID":"4a7d0181-3a71-43d0-ba11-dddb2e9e9293","Type":"ContainerDied","Data":"665afc576532c123cb0175e2cc48a982b92ea927d0924539df7614bce7bfda51"} Mar 09 10:07:30 crc kubenswrapper[4792]: I0309 10:07:30.231891 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-frmgj" event={"ID":"4a7d0181-3a71-43d0-ba11-dddb2e9e9293","Type":"ContainerStarted","Data":"ece63d0c57e5f545e8c051dd9d616a2295480c12bc85895967cb1cc3e0f8040e"} Mar 09 10:07:30 crc kubenswrapper[4792]: I0309 10:07:30.251821 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-frmgj" podStartSLOduration=2.806681903 podStartE2EDuration="6.251801318s" podCreationTimestamp="2026-03-09 10:07:24 +0000 UTC" firstStartedPulling="2026-03-09 10:07:26.186754848 +0000 UTC m=+3611.216955600" lastFinishedPulling="2026-03-09 10:07:29.631874263 +0000 UTC m=+3614.662075015" observedRunningTime="2026-03-09 10:07:30.248616663 +0000 UTC m=+3615.278817425" watchObservedRunningTime="2026-03-09 10:07:30.251801318 +0000 UTC m=+3615.282002070" Mar 09 10:07:35 crc kubenswrapper[4792]: I0309 10:07:35.264301 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-frmgj" Mar 09 10:07:35 crc kubenswrapper[4792]: I0309 10:07:35.264857 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/redhat-marketplace-frmgj" Mar 09 10:07:35 crc kubenswrapper[4792]: I0309 10:07:35.309991 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-frmgj" Mar 09 10:07:36 crc kubenswrapper[4792]: I0309 10:07:36.336734 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-frmgj" Mar 09 10:07:36 crc kubenswrapper[4792]: I0309 10:07:36.397934 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-frmgj"] Mar 09 10:07:38 crc kubenswrapper[4792]: I0309 10:07:38.296350 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-frmgj" podUID="4a7d0181-3a71-43d0-ba11-dddb2e9e9293" containerName="registry-server" containerID="cri-o://ece63d0c57e5f545e8c051dd9d616a2295480c12bc85895967cb1cc3e0f8040e" gracePeriod=2 Mar 09 10:07:38 crc kubenswrapper[4792]: I0309 10:07:38.872469 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-frmgj" Mar 09 10:07:38 crc kubenswrapper[4792]: I0309 10:07:38.933471 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a7d0181-3a71-43d0-ba11-dddb2e9e9293-utilities\") pod \"4a7d0181-3a71-43d0-ba11-dddb2e9e9293\" (UID: \"4a7d0181-3a71-43d0-ba11-dddb2e9e9293\") " Mar 09 10:07:38 crc kubenswrapper[4792]: I0309 10:07:38.933588 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tsbbj\" (UniqueName: \"kubernetes.io/projected/4a7d0181-3a71-43d0-ba11-dddb2e9e9293-kube-api-access-tsbbj\") pod \"4a7d0181-3a71-43d0-ba11-dddb2e9e9293\" (UID: \"4a7d0181-3a71-43d0-ba11-dddb2e9e9293\") " Mar 09 10:07:38 crc kubenswrapper[4792]: I0309 10:07:38.933680 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a7d0181-3a71-43d0-ba11-dddb2e9e9293-catalog-content\") pod \"4a7d0181-3a71-43d0-ba11-dddb2e9e9293\" (UID: \"4a7d0181-3a71-43d0-ba11-dddb2e9e9293\") " Mar 09 10:07:38 crc kubenswrapper[4792]: I0309 10:07:38.934459 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4a7d0181-3a71-43d0-ba11-dddb2e9e9293-utilities" (OuterVolumeSpecName: "utilities") pod "4a7d0181-3a71-43d0-ba11-dddb2e9e9293" (UID: "4a7d0181-3a71-43d0-ba11-dddb2e9e9293"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 10:07:38 crc kubenswrapper[4792]: I0309 10:07:38.935509 4792 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a7d0181-3a71-43d0-ba11-dddb2e9e9293-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 10:07:38 crc kubenswrapper[4792]: I0309 10:07:38.954348 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a7d0181-3a71-43d0-ba11-dddb2e9e9293-kube-api-access-tsbbj" (OuterVolumeSpecName: "kube-api-access-tsbbj") pod "4a7d0181-3a71-43d0-ba11-dddb2e9e9293" (UID: "4a7d0181-3a71-43d0-ba11-dddb2e9e9293"). InnerVolumeSpecName "kube-api-access-tsbbj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 10:07:38 crc kubenswrapper[4792]: I0309 10:07:38.967597 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4a7d0181-3a71-43d0-ba11-dddb2e9e9293-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4a7d0181-3a71-43d0-ba11-dddb2e9e9293" (UID: "4a7d0181-3a71-43d0-ba11-dddb2e9e9293"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 10:07:39 crc kubenswrapper[4792]: I0309 10:07:39.037887 4792 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a7d0181-3a71-43d0-ba11-dddb2e9e9293-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 10:07:39 crc kubenswrapper[4792]: I0309 10:07:39.037940 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tsbbj\" (UniqueName: \"kubernetes.io/projected/4a7d0181-3a71-43d0-ba11-dddb2e9e9293-kube-api-access-tsbbj\") on node \"crc\" DevicePath \"\"" Mar 09 10:07:39 crc kubenswrapper[4792]: I0309 10:07:39.305687 4792 generic.go:334] "Generic (PLEG): container finished" podID="4a7d0181-3a71-43d0-ba11-dddb2e9e9293" containerID="ece63d0c57e5f545e8c051dd9d616a2295480c12bc85895967cb1cc3e0f8040e" exitCode=0 Mar 09 10:07:39 crc kubenswrapper[4792]: I0309 10:07:39.305735 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-frmgj" event={"ID":"4a7d0181-3a71-43d0-ba11-dddb2e9e9293","Type":"ContainerDied","Data":"ece63d0c57e5f545e8c051dd9d616a2295480c12bc85895967cb1cc3e0f8040e"} Mar 09 10:07:39 crc kubenswrapper[4792]: I0309 10:07:39.305764 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-frmgj" event={"ID":"4a7d0181-3a71-43d0-ba11-dddb2e9e9293","Type":"ContainerDied","Data":"ff909bad01f6d465357f37b57638feb93dfdb4e6fbcdd4857c95ac89281c65db"} Mar 09 10:07:39 crc kubenswrapper[4792]: I0309 10:07:39.305768 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-frmgj" Mar 09 10:07:39 crc kubenswrapper[4792]: I0309 10:07:39.305782 4792 scope.go:117] "RemoveContainer" containerID="ece63d0c57e5f545e8c051dd9d616a2295480c12bc85895967cb1cc3e0f8040e" Mar 09 10:07:39 crc kubenswrapper[4792]: I0309 10:07:39.334494 4792 scope.go:117] "RemoveContainer" containerID="665afc576532c123cb0175e2cc48a982b92ea927d0924539df7614bce7bfda51" Mar 09 10:07:39 crc kubenswrapper[4792]: I0309 10:07:39.361421 4792 scope.go:117] "RemoveContainer" containerID="ddee3a70d9eb655309e360d54909883cfd9d000c26f99891208c0b6f054976d1" Mar 09 10:07:39 crc kubenswrapper[4792]: I0309 10:07:39.361726 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-frmgj"] Mar 09 10:07:39 crc kubenswrapper[4792]: I0309 10:07:39.372253 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-frmgj"] Mar 09 10:07:39 crc kubenswrapper[4792]: I0309 10:07:39.415524 4792 scope.go:117] "RemoveContainer" containerID="ece63d0c57e5f545e8c051dd9d616a2295480c12bc85895967cb1cc3e0f8040e" Mar 09 10:07:39 crc kubenswrapper[4792]: E0309 10:07:39.416281 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ece63d0c57e5f545e8c051dd9d616a2295480c12bc85895967cb1cc3e0f8040e\": container with ID starting with ece63d0c57e5f545e8c051dd9d616a2295480c12bc85895967cb1cc3e0f8040e not found: ID does not exist" containerID="ece63d0c57e5f545e8c051dd9d616a2295480c12bc85895967cb1cc3e0f8040e" Mar 09 10:07:39 crc kubenswrapper[4792]: I0309 10:07:39.416314 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ece63d0c57e5f545e8c051dd9d616a2295480c12bc85895967cb1cc3e0f8040e"} err="failed to get container status \"ece63d0c57e5f545e8c051dd9d616a2295480c12bc85895967cb1cc3e0f8040e\": rpc error: code = NotFound desc = could not find container 
\"ece63d0c57e5f545e8c051dd9d616a2295480c12bc85895967cb1cc3e0f8040e\": container with ID starting with ece63d0c57e5f545e8c051dd9d616a2295480c12bc85895967cb1cc3e0f8040e not found: ID does not exist" Mar 09 10:07:39 crc kubenswrapper[4792]: I0309 10:07:39.416340 4792 scope.go:117] "RemoveContainer" containerID="665afc576532c123cb0175e2cc48a982b92ea927d0924539df7614bce7bfda51" Mar 09 10:07:39 crc kubenswrapper[4792]: E0309 10:07:39.416670 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"665afc576532c123cb0175e2cc48a982b92ea927d0924539df7614bce7bfda51\": container with ID starting with 665afc576532c123cb0175e2cc48a982b92ea927d0924539df7614bce7bfda51 not found: ID does not exist" containerID="665afc576532c123cb0175e2cc48a982b92ea927d0924539df7614bce7bfda51" Mar 09 10:07:39 crc kubenswrapper[4792]: I0309 10:07:39.416727 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"665afc576532c123cb0175e2cc48a982b92ea927d0924539df7614bce7bfda51"} err="failed to get container status \"665afc576532c123cb0175e2cc48a982b92ea927d0924539df7614bce7bfda51\": rpc error: code = NotFound desc = could not find container \"665afc576532c123cb0175e2cc48a982b92ea927d0924539df7614bce7bfda51\": container with ID starting with 665afc576532c123cb0175e2cc48a982b92ea927d0924539df7614bce7bfda51 not found: ID does not exist" Mar 09 10:07:39 crc kubenswrapper[4792]: I0309 10:07:39.416754 4792 scope.go:117] "RemoveContainer" containerID="ddee3a70d9eb655309e360d54909883cfd9d000c26f99891208c0b6f054976d1" Mar 09 10:07:39 crc kubenswrapper[4792]: E0309 10:07:39.419161 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ddee3a70d9eb655309e360d54909883cfd9d000c26f99891208c0b6f054976d1\": container with ID starting with ddee3a70d9eb655309e360d54909883cfd9d000c26f99891208c0b6f054976d1 not found: ID does not exist" 
containerID="ddee3a70d9eb655309e360d54909883cfd9d000c26f99891208c0b6f054976d1" Mar 09 10:07:39 crc kubenswrapper[4792]: I0309 10:07:39.419236 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ddee3a70d9eb655309e360d54909883cfd9d000c26f99891208c0b6f054976d1"} err="failed to get container status \"ddee3a70d9eb655309e360d54909883cfd9d000c26f99891208c0b6f054976d1\": rpc error: code = NotFound desc = could not find container \"ddee3a70d9eb655309e360d54909883cfd9d000c26f99891208c0b6f054976d1\": container with ID starting with ddee3a70d9eb655309e360d54909883cfd9d000c26f99891208c0b6f054976d1 not found: ID does not exist" Mar 09 10:07:39 crc kubenswrapper[4792]: I0309 10:07:39.672979 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4a7d0181-3a71-43d0-ba11-dddb2e9e9293" path="/var/lib/kubelet/pods/4a7d0181-3a71-43d0-ba11-dddb2e9e9293/volumes" Mar 09 10:07:43 crc kubenswrapper[4792]: I0309 10:07:43.214462 4792 patch_prober.go:28] interesting pod/machine-config-daemon-97tth container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 10:07:43 crc kubenswrapper[4792]: I0309 10:07:43.215097 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 10:07:43 crc kubenswrapper[4792]: I0309 10:07:43.215147 4792 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-97tth" Mar 09 10:07:43 crc kubenswrapper[4792]: I0309 10:07:43.215913 4792 kuberuntime_manager.go:1027] "Message for Container of 
pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"27ebefbb6ee4b1e51f47c3813f2b791c0d513924dd39fe81cd2d9d65a223b925"} pod="openshift-machine-config-operator/machine-config-daemon-97tth" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 09 10:07:43 crc kubenswrapper[4792]: I0309 10:07:43.215955 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" containerName="machine-config-daemon" containerID="cri-o://27ebefbb6ee4b1e51f47c3813f2b791c0d513924dd39fe81cd2d9d65a223b925" gracePeriod=600 Mar 09 10:07:43 crc kubenswrapper[4792]: E0309 10:07:43.340282 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-97tth_openshift-machine-config-operator(bd11045a-d746-4b42-872c-8b8d1dd2d515)\"" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" Mar 09 10:07:43 crc kubenswrapper[4792]: I0309 10:07:43.344358 4792 generic.go:334] "Generic (PLEG): container finished" podID="bd11045a-d746-4b42-872c-8b8d1dd2d515" containerID="27ebefbb6ee4b1e51f47c3813f2b791c0d513924dd39fe81cd2d9d65a223b925" exitCode=0 Mar 09 10:07:43 crc kubenswrapper[4792]: I0309 10:07:43.344406 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-97tth" event={"ID":"bd11045a-d746-4b42-872c-8b8d1dd2d515","Type":"ContainerDied","Data":"27ebefbb6ee4b1e51f47c3813f2b791c0d513924dd39fe81cd2d9d65a223b925"} Mar 09 10:07:43 crc kubenswrapper[4792]: I0309 10:07:43.344439 4792 scope.go:117] "RemoveContainer" containerID="8b8ee3a4ed67871065ec447eb157a9db2536cee10ff7911e7415e58bb9ff5c63" Mar 09 10:07:43 crc 
kubenswrapper[4792]: I0309 10:07:43.345053 4792 scope.go:117] "RemoveContainer" containerID="27ebefbb6ee4b1e51f47c3813f2b791c0d513924dd39fe81cd2d9d65a223b925" Mar 09 10:07:43 crc kubenswrapper[4792]: E0309 10:07:43.345319 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-97tth_openshift-machine-config-operator(bd11045a-d746-4b42-872c-8b8d1dd2d515)\"" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" Mar 09 10:07:57 crc kubenswrapper[4792]: I0309 10:07:57.662157 4792 scope.go:117] "RemoveContainer" containerID="27ebefbb6ee4b1e51f47c3813f2b791c0d513924dd39fe81cd2d9d65a223b925" Mar 09 10:07:57 crc kubenswrapper[4792]: E0309 10:07:57.662943 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-97tth_openshift-machine-config-operator(bd11045a-d746-4b42-872c-8b8d1dd2d515)\"" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" Mar 09 10:08:00 crc kubenswrapper[4792]: I0309 10:08:00.142042 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550848-nx7bn"] Mar 09 10:08:00 crc kubenswrapper[4792]: E0309 10:08:00.142883 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a7d0181-3a71-43d0-ba11-dddb2e9e9293" containerName="registry-server" Mar 09 10:08:00 crc kubenswrapper[4792]: I0309 10:08:00.142900 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a7d0181-3a71-43d0-ba11-dddb2e9e9293" containerName="registry-server" Mar 09 10:08:00 crc kubenswrapper[4792]: E0309 10:08:00.142929 4792 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="4a7d0181-3a71-43d0-ba11-dddb2e9e9293" containerName="extract-content" Mar 09 10:08:00 crc kubenswrapper[4792]: I0309 10:08:00.142937 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a7d0181-3a71-43d0-ba11-dddb2e9e9293" containerName="extract-content" Mar 09 10:08:00 crc kubenswrapper[4792]: E0309 10:08:00.142970 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a7d0181-3a71-43d0-ba11-dddb2e9e9293" containerName="extract-utilities" Mar 09 10:08:00 crc kubenswrapper[4792]: I0309 10:08:00.142978 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a7d0181-3a71-43d0-ba11-dddb2e9e9293" containerName="extract-utilities" Mar 09 10:08:00 crc kubenswrapper[4792]: I0309 10:08:00.143246 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a7d0181-3a71-43d0-ba11-dddb2e9e9293" containerName="registry-server" Mar 09 10:08:00 crc kubenswrapper[4792]: I0309 10:08:00.144147 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550848-nx7bn" Mar 09 10:08:00 crc kubenswrapper[4792]: I0309 10:08:00.149921 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 10:08:00 crc kubenswrapper[4792]: I0309 10:08:00.150262 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fwclj" Mar 09 10:08:00 crc kubenswrapper[4792]: I0309 10:08:00.150576 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 10:08:00 crc kubenswrapper[4792]: I0309 10:08:00.153577 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550848-nx7bn"] Mar 09 10:08:00 crc kubenswrapper[4792]: I0309 10:08:00.264711 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b9sgf\" (UniqueName: \"kubernetes.io/projected/6da3a985-08b0-42d4-951c-1bd67f84eeba-kube-api-access-b9sgf\") pod \"auto-csr-approver-29550848-nx7bn\" (UID: \"6da3a985-08b0-42d4-951c-1bd67f84eeba\") " pod="openshift-infra/auto-csr-approver-29550848-nx7bn" Mar 09 10:08:00 crc kubenswrapper[4792]: I0309 10:08:00.366729 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b9sgf\" (UniqueName: \"kubernetes.io/projected/6da3a985-08b0-42d4-951c-1bd67f84eeba-kube-api-access-b9sgf\") pod \"auto-csr-approver-29550848-nx7bn\" (UID: \"6da3a985-08b0-42d4-951c-1bd67f84eeba\") " pod="openshift-infra/auto-csr-approver-29550848-nx7bn" Mar 09 10:08:00 crc kubenswrapper[4792]: I0309 10:08:00.391536 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b9sgf\" (UniqueName: \"kubernetes.io/projected/6da3a985-08b0-42d4-951c-1bd67f84eeba-kube-api-access-b9sgf\") pod \"auto-csr-approver-29550848-nx7bn\" (UID: \"6da3a985-08b0-42d4-951c-1bd67f84eeba\") " 
pod="openshift-infra/auto-csr-approver-29550848-nx7bn" Mar 09 10:08:00 crc kubenswrapper[4792]: I0309 10:08:00.467226 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550848-nx7bn" Mar 09 10:08:00 crc kubenswrapper[4792]: I0309 10:08:00.933832 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550848-nx7bn"] Mar 09 10:08:01 crc kubenswrapper[4792]: I0309 10:08:01.486720 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550848-nx7bn" event={"ID":"6da3a985-08b0-42d4-951c-1bd67f84eeba","Type":"ContainerStarted","Data":"a3a16b740bf0a9e75439772d0e055fbaf01bba9d8d4e1e295fd8e1850dcd6f33"} Mar 09 10:08:03 crc kubenswrapper[4792]: I0309 10:08:03.520096 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550848-nx7bn" event={"ID":"6da3a985-08b0-42d4-951c-1bd67f84eeba","Type":"ContainerDied","Data":"4e4639e390b3ed25c6460fbb38e3c3385eae2cf1212d91e0bfd4452f67a90de5"} Mar 09 10:08:03 crc kubenswrapper[4792]: I0309 10:08:03.519982 4792 generic.go:334] "Generic (PLEG): container finished" podID="6da3a985-08b0-42d4-951c-1bd67f84eeba" containerID="4e4639e390b3ed25c6460fbb38e3c3385eae2cf1212d91e0bfd4452f67a90de5" exitCode=0 Mar 09 10:08:04 crc kubenswrapper[4792]: I0309 10:08:04.897749 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550848-nx7bn" Mar 09 10:08:05 crc kubenswrapper[4792]: I0309 10:08:05.069615 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b9sgf\" (UniqueName: \"kubernetes.io/projected/6da3a985-08b0-42d4-951c-1bd67f84eeba-kube-api-access-b9sgf\") pod \"6da3a985-08b0-42d4-951c-1bd67f84eeba\" (UID: \"6da3a985-08b0-42d4-951c-1bd67f84eeba\") " Mar 09 10:08:05 crc kubenswrapper[4792]: I0309 10:08:05.077011 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6da3a985-08b0-42d4-951c-1bd67f84eeba-kube-api-access-b9sgf" (OuterVolumeSpecName: "kube-api-access-b9sgf") pod "6da3a985-08b0-42d4-951c-1bd67f84eeba" (UID: "6da3a985-08b0-42d4-951c-1bd67f84eeba"). InnerVolumeSpecName "kube-api-access-b9sgf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 10:08:05 crc kubenswrapper[4792]: I0309 10:08:05.172594 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b9sgf\" (UniqueName: \"kubernetes.io/projected/6da3a985-08b0-42d4-951c-1bd67f84eeba-kube-api-access-b9sgf\") on node \"crc\" DevicePath \"\"" Mar 09 10:08:05 crc kubenswrapper[4792]: I0309 10:08:05.542237 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550848-nx7bn" event={"ID":"6da3a985-08b0-42d4-951c-1bd67f84eeba","Type":"ContainerDied","Data":"a3a16b740bf0a9e75439772d0e055fbaf01bba9d8d4e1e295fd8e1850dcd6f33"} Mar 09 10:08:05 crc kubenswrapper[4792]: I0309 10:08:05.542282 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a3a16b740bf0a9e75439772d0e055fbaf01bba9d8d4e1e295fd8e1850dcd6f33" Mar 09 10:08:05 crc kubenswrapper[4792]: I0309 10:08:05.542356 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550848-nx7bn" Mar 09 10:08:05 crc kubenswrapper[4792]: I0309 10:08:05.988895 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550842-n6dpq"] Mar 09 10:08:06 crc kubenswrapper[4792]: I0309 10:08:06.004144 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550842-n6dpq"] Mar 09 10:08:07 crc kubenswrapper[4792]: I0309 10:08:07.674597 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b7abaead-f598-4f9a-87ab-4e8f02067d1f" path="/var/lib/kubelet/pods/b7abaead-f598-4f9a-87ab-4e8f02067d1f/volumes" Mar 09 10:08:09 crc kubenswrapper[4792]: I0309 10:08:09.664154 4792 scope.go:117] "RemoveContainer" containerID="27ebefbb6ee4b1e51f47c3813f2b791c0d513924dd39fe81cd2d9d65a223b925" Mar 09 10:08:09 crc kubenswrapper[4792]: E0309 10:08:09.664661 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-97tth_openshift-machine-config-operator(bd11045a-d746-4b42-872c-8b8d1dd2d515)\"" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" Mar 09 10:08:21 crc kubenswrapper[4792]: I0309 10:08:21.662520 4792 scope.go:117] "RemoveContainer" containerID="27ebefbb6ee4b1e51f47c3813f2b791c0d513924dd39fe81cd2d9d65a223b925" Mar 09 10:08:21 crc kubenswrapper[4792]: E0309 10:08:21.663532 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-97tth_openshift-machine-config-operator(bd11045a-d746-4b42-872c-8b8d1dd2d515)\"" pod="openshift-machine-config-operator/machine-config-daemon-97tth" 
podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" Mar 09 10:08:33 crc kubenswrapper[4792]: I0309 10:08:33.662956 4792 scope.go:117] "RemoveContainer" containerID="27ebefbb6ee4b1e51f47c3813f2b791c0d513924dd39fe81cd2d9d65a223b925" Mar 09 10:08:33 crc kubenswrapper[4792]: E0309 10:08:33.663728 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-97tth_openshift-machine-config-operator(bd11045a-d746-4b42-872c-8b8d1dd2d515)\"" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" Mar 09 10:08:43 crc kubenswrapper[4792]: I0309 10:08:43.489701 4792 scope.go:117] "RemoveContainer" containerID="6fcaec029c5c962fe9ba07f2b5dbfbd08998dd028a126b2486a64c30d95603f0" Mar 09 10:08:48 crc kubenswrapper[4792]: I0309 10:08:48.662219 4792 scope.go:117] "RemoveContainer" containerID="27ebefbb6ee4b1e51f47c3813f2b791c0d513924dd39fe81cd2d9d65a223b925" Mar 09 10:08:48 crc kubenswrapper[4792]: E0309 10:08:48.663011 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-97tth_openshift-machine-config-operator(bd11045a-d746-4b42-872c-8b8d1dd2d515)\"" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" Mar 09 10:09:00 crc kubenswrapper[4792]: I0309 10:09:00.663668 4792 scope.go:117] "RemoveContainer" containerID="27ebefbb6ee4b1e51f47c3813f2b791c0d513924dd39fe81cd2d9d65a223b925" Mar 09 10:09:00 crc kubenswrapper[4792]: E0309 10:09:00.664596 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-97tth_openshift-machine-config-operator(bd11045a-d746-4b42-872c-8b8d1dd2d515)\"" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" Mar 09 10:09:11 crc kubenswrapper[4792]: I0309 10:09:11.662228 4792 scope.go:117] "RemoveContainer" containerID="27ebefbb6ee4b1e51f47c3813f2b791c0d513924dd39fe81cd2d9d65a223b925" Mar 09 10:09:11 crc kubenswrapper[4792]: E0309 10:09:11.663019 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-97tth_openshift-machine-config-operator(bd11045a-d746-4b42-872c-8b8d1dd2d515)\"" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" Mar 09 10:09:26 crc kubenswrapper[4792]: I0309 10:09:26.662832 4792 scope.go:117] "RemoveContainer" containerID="27ebefbb6ee4b1e51f47c3813f2b791c0d513924dd39fe81cd2d9d65a223b925" Mar 09 10:09:26 crc kubenswrapper[4792]: E0309 10:09:26.663599 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-97tth_openshift-machine-config-operator(bd11045a-d746-4b42-872c-8b8d1dd2d515)\"" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" Mar 09 10:09:39 crc kubenswrapper[4792]: I0309 10:09:39.662337 4792 scope.go:117] "RemoveContainer" containerID="27ebefbb6ee4b1e51f47c3813f2b791c0d513924dd39fe81cd2d9d65a223b925" Mar 09 10:09:39 crc kubenswrapper[4792]: E0309 10:09:39.663102 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-97tth_openshift-machine-config-operator(bd11045a-d746-4b42-872c-8b8d1dd2d515)\"" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" Mar 09 10:09:54 crc kubenswrapper[4792]: I0309 10:09:54.662684 4792 scope.go:117] "RemoveContainer" containerID="27ebefbb6ee4b1e51f47c3813f2b791c0d513924dd39fe81cd2d9d65a223b925" Mar 09 10:09:54 crc kubenswrapper[4792]: E0309 10:09:54.663413 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-97tth_openshift-machine-config-operator(bd11045a-d746-4b42-872c-8b8d1dd2d515)\"" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" Mar 09 10:10:00 crc kubenswrapper[4792]: I0309 10:10:00.147454 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550850-bbk2v"] Mar 09 10:10:00 crc kubenswrapper[4792]: E0309 10:10:00.148142 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6da3a985-08b0-42d4-951c-1bd67f84eeba" containerName="oc" Mar 09 10:10:00 crc kubenswrapper[4792]: I0309 10:10:00.148155 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="6da3a985-08b0-42d4-951c-1bd67f84eeba" containerName="oc" Mar 09 10:10:00 crc kubenswrapper[4792]: I0309 10:10:00.148362 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="6da3a985-08b0-42d4-951c-1bd67f84eeba" containerName="oc" Mar 09 10:10:00 crc kubenswrapper[4792]: I0309 10:10:00.148956 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550850-bbk2v" Mar 09 10:10:00 crc kubenswrapper[4792]: I0309 10:10:00.151842 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 10:10:00 crc kubenswrapper[4792]: I0309 10:10:00.151941 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fwclj" Mar 09 10:10:00 crc kubenswrapper[4792]: I0309 10:10:00.152198 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 10:10:00 crc kubenswrapper[4792]: I0309 10:10:00.201205 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550850-bbk2v"] Mar 09 10:10:00 crc kubenswrapper[4792]: I0309 10:10:00.280838 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6h55w\" (UniqueName: \"kubernetes.io/projected/f7e43865-caa1-4394-8217-253ef89ddbf1-kube-api-access-6h55w\") pod \"auto-csr-approver-29550850-bbk2v\" (UID: \"f7e43865-caa1-4394-8217-253ef89ddbf1\") " pod="openshift-infra/auto-csr-approver-29550850-bbk2v" Mar 09 10:10:00 crc kubenswrapper[4792]: I0309 10:10:00.382900 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6h55w\" (UniqueName: \"kubernetes.io/projected/f7e43865-caa1-4394-8217-253ef89ddbf1-kube-api-access-6h55w\") pod \"auto-csr-approver-29550850-bbk2v\" (UID: \"f7e43865-caa1-4394-8217-253ef89ddbf1\") " pod="openshift-infra/auto-csr-approver-29550850-bbk2v" Mar 09 10:10:00 crc kubenswrapper[4792]: I0309 10:10:00.411957 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6h55w\" (UniqueName: \"kubernetes.io/projected/f7e43865-caa1-4394-8217-253ef89ddbf1-kube-api-access-6h55w\") pod \"auto-csr-approver-29550850-bbk2v\" (UID: \"f7e43865-caa1-4394-8217-253ef89ddbf1\") " 
pod="openshift-infra/auto-csr-approver-29550850-bbk2v" Mar 09 10:10:00 crc kubenswrapper[4792]: I0309 10:10:00.470879 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550850-bbk2v" Mar 09 10:10:00 crc kubenswrapper[4792]: W0309 10:10:00.936840 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf7e43865_caa1_4394_8217_253ef89ddbf1.slice/crio-e6d727c368c252a394ab039816d54a07b4c0e31d9f07437fea04ae8e72adea53 WatchSource:0}: Error finding container e6d727c368c252a394ab039816d54a07b4c0e31d9f07437fea04ae8e72adea53: Status 404 returned error can't find the container with id e6d727c368c252a394ab039816d54a07b4c0e31d9f07437fea04ae8e72adea53 Mar 09 10:10:00 crc kubenswrapper[4792]: I0309 10:10:00.940250 4792 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 09 10:10:00 crc kubenswrapper[4792]: I0309 10:10:00.946569 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550850-bbk2v"] Mar 09 10:10:01 crc kubenswrapper[4792]: I0309 10:10:01.551875 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550850-bbk2v" event={"ID":"f7e43865-caa1-4394-8217-253ef89ddbf1","Type":"ContainerStarted","Data":"e6d727c368c252a394ab039816d54a07b4c0e31d9f07437fea04ae8e72adea53"} Mar 09 10:10:02 crc kubenswrapper[4792]: I0309 10:10:02.566494 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550850-bbk2v" event={"ID":"f7e43865-caa1-4394-8217-253ef89ddbf1","Type":"ContainerStarted","Data":"fc2863f2419a08d2abee982090a7895b41c17e6a5ec3de0f2808326f4aeedcd5"} Mar 09 10:10:02 crc kubenswrapper[4792]: I0309 10:10:02.589812 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29550850-bbk2v" 
podStartSLOduration=1.370360087 podStartE2EDuration="2.589794844s" podCreationTimestamp="2026-03-09 10:10:00 +0000 UTC" firstStartedPulling="2026-03-09 10:10:00.939943267 +0000 UTC m=+3765.970144019" lastFinishedPulling="2026-03-09 10:10:02.159378024 +0000 UTC m=+3767.189578776" observedRunningTime="2026-03-09 10:10:02.581352258 +0000 UTC m=+3767.611553020" watchObservedRunningTime="2026-03-09 10:10:02.589794844 +0000 UTC m=+3767.619995596" Mar 09 10:10:03 crc kubenswrapper[4792]: I0309 10:10:03.576725 4792 generic.go:334] "Generic (PLEG): container finished" podID="f7e43865-caa1-4394-8217-253ef89ddbf1" containerID="fc2863f2419a08d2abee982090a7895b41c17e6a5ec3de0f2808326f4aeedcd5" exitCode=0 Mar 09 10:10:03 crc kubenswrapper[4792]: I0309 10:10:03.576777 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550850-bbk2v" event={"ID":"f7e43865-caa1-4394-8217-253ef89ddbf1","Type":"ContainerDied","Data":"fc2863f2419a08d2abee982090a7895b41c17e6a5ec3de0f2808326f4aeedcd5"} Mar 09 10:10:04 crc kubenswrapper[4792]: I0309 10:10:04.970623 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550850-bbk2v" Mar 09 10:10:05 crc kubenswrapper[4792]: I0309 10:10:05.095478 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6h55w\" (UniqueName: \"kubernetes.io/projected/f7e43865-caa1-4394-8217-253ef89ddbf1-kube-api-access-6h55w\") pod \"f7e43865-caa1-4394-8217-253ef89ddbf1\" (UID: \"f7e43865-caa1-4394-8217-253ef89ddbf1\") " Mar 09 10:10:05 crc kubenswrapper[4792]: I0309 10:10:05.107455 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f7e43865-caa1-4394-8217-253ef89ddbf1-kube-api-access-6h55w" (OuterVolumeSpecName: "kube-api-access-6h55w") pod "f7e43865-caa1-4394-8217-253ef89ddbf1" (UID: "f7e43865-caa1-4394-8217-253ef89ddbf1"). InnerVolumeSpecName "kube-api-access-6h55w". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 10:10:05 crc kubenswrapper[4792]: I0309 10:10:05.198532 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6h55w\" (UniqueName: \"kubernetes.io/projected/f7e43865-caa1-4394-8217-253ef89ddbf1-kube-api-access-6h55w\") on node \"crc\" DevicePath \"\"" Mar 09 10:10:05 crc kubenswrapper[4792]: I0309 10:10:05.594854 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550850-bbk2v" event={"ID":"f7e43865-caa1-4394-8217-253ef89ddbf1","Type":"ContainerDied","Data":"e6d727c368c252a394ab039816d54a07b4c0e31d9f07437fea04ae8e72adea53"} Mar 09 10:10:05 crc kubenswrapper[4792]: I0309 10:10:05.595137 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e6d727c368c252a394ab039816d54a07b4c0e31d9f07437fea04ae8e72adea53" Mar 09 10:10:05 crc kubenswrapper[4792]: I0309 10:10:05.595229 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550850-bbk2v" Mar 09 10:10:05 crc kubenswrapper[4792]: I0309 10:10:05.654540 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550844-jt4tj"] Mar 09 10:10:05 crc kubenswrapper[4792]: I0309 10:10:05.672418 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550844-jt4tj"] Mar 09 10:10:07 crc kubenswrapper[4792]: I0309 10:10:07.662688 4792 scope.go:117] "RemoveContainer" containerID="27ebefbb6ee4b1e51f47c3813f2b791c0d513924dd39fe81cd2d9d65a223b925" Mar 09 10:10:07 crc kubenswrapper[4792]: E0309 10:10:07.664717 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-97tth_openshift-machine-config-operator(bd11045a-d746-4b42-872c-8b8d1dd2d515)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" Mar 09 10:10:07 crc kubenswrapper[4792]: I0309 10:10:07.673584 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6fdf0a54-c4a3-49fa-8556-1b426a7b5252" path="/var/lib/kubelet/pods/6fdf0a54-c4a3-49fa-8556-1b426a7b5252/volumes" Mar 09 10:10:22 crc kubenswrapper[4792]: I0309 10:10:22.662026 4792 scope.go:117] "RemoveContainer" containerID="27ebefbb6ee4b1e51f47c3813f2b791c0d513924dd39fe81cd2d9d65a223b925" Mar 09 10:10:22 crc kubenswrapper[4792]: E0309 10:10:22.662938 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-97tth_openshift-machine-config-operator(bd11045a-d746-4b42-872c-8b8d1dd2d515)\"" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" Mar 09 10:10:34 crc kubenswrapper[4792]: I0309 10:10:34.661941 4792 scope.go:117] "RemoveContainer" containerID="27ebefbb6ee4b1e51f47c3813f2b791c0d513924dd39fe81cd2d9d65a223b925" Mar 09 10:10:34 crc kubenswrapper[4792]: E0309 10:10:34.662769 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-97tth_openshift-machine-config-operator(bd11045a-d746-4b42-872c-8b8d1dd2d515)\"" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" Mar 09 10:10:43 crc kubenswrapper[4792]: I0309 10:10:43.609836 4792 scope.go:117] "RemoveContainer" containerID="750bf3fe741383fe544b752ba1ca9bb8340c5d3efa4e0e2206241a750b46a229" Mar 09 10:10:44 crc kubenswrapper[4792]: I0309 10:10:44.046712 4792 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openstack/manila-db-create-57k9s"] Mar 09 10:10:44 crc kubenswrapper[4792]: I0309 10:10:44.054686 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-db-create-57k9s"] Mar 09 10:10:44 crc kubenswrapper[4792]: I0309 10:10:44.064163 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-291a-account-create-update-jcsfx"] Mar 09 10:10:44 crc kubenswrapper[4792]: I0309 10:10:44.072789 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-291a-account-create-update-jcsfx"] Mar 09 10:10:45 crc kubenswrapper[4792]: I0309 10:10:45.673462 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3823d2bd-f551-4107-a220-6e97655832e1" path="/var/lib/kubelet/pods/3823d2bd-f551-4107-a220-6e97655832e1/volumes" Mar 09 10:10:45 crc kubenswrapper[4792]: I0309 10:10:45.674666 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6d8a53c4-679b-42ff-81a7-264b779e3486" path="/var/lib/kubelet/pods/6d8a53c4-679b-42ff-81a7-264b779e3486/volumes" Mar 09 10:10:47 crc kubenswrapper[4792]: I0309 10:10:47.663173 4792 scope.go:117] "RemoveContainer" containerID="27ebefbb6ee4b1e51f47c3813f2b791c0d513924dd39fe81cd2d9d65a223b925" Mar 09 10:10:47 crc kubenswrapper[4792]: E0309 10:10:47.664022 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-97tth_openshift-machine-config-operator(bd11045a-d746-4b42-872c-8b8d1dd2d515)\"" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" Mar 09 10:11:02 crc kubenswrapper[4792]: I0309 10:11:02.662133 4792 scope.go:117] "RemoveContainer" containerID="27ebefbb6ee4b1e51f47c3813f2b791c0d513924dd39fe81cd2d9d65a223b925" Mar 09 10:11:02 crc kubenswrapper[4792]: E0309 10:11:02.662904 4792 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-97tth_openshift-machine-config-operator(bd11045a-d746-4b42-872c-8b8d1dd2d515)\"" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" Mar 09 10:11:17 crc kubenswrapper[4792]: I0309 10:11:17.662986 4792 scope.go:117] "RemoveContainer" containerID="27ebefbb6ee4b1e51f47c3813f2b791c0d513924dd39fe81cd2d9d65a223b925" Mar 09 10:11:17 crc kubenswrapper[4792]: E0309 10:11:17.663865 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-97tth_openshift-machine-config-operator(bd11045a-d746-4b42-872c-8b8d1dd2d515)\"" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" Mar 09 10:11:24 crc kubenswrapper[4792]: I0309 10:11:24.040158 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-db-sync-vll6l"] Mar 09 10:11:24 crc kubenswrapper[4792]: I0309 10:11:24.049100 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-db-sync-vll6l"] Mar 09 10:11:25 crc kubenswrapper[4792]: I0309 10:11:25.676828 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8d57ab24-ecdd-4c3c-9f8e-c32b449f43c6" path="/var/lib/kubelet/pods/8d57ab24-ecdd-4c3c-9f8e-c32b449f43c6/volumes" Mar 09 10:11:29 crc kubenswrapper[4792]: I0309 10:11:29.662459 4792 scope.go:117] "RemoveContainer" containerID="27ebefbb6ee4b1e51f47c3813f2b791c0d513924dd39fe81cd2d9d65a223b925" Mar 09 10:11:29 crc kubenswrapper[4792]: E0309 10:11:29.663304 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-97tth_openshift-machine-config-operator(bd11045a-d746-4b42-872c-8b8d1dd2d515)\"" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" Mar 09 10:11:40 crc kubenswrapper[4792]: I0309 10:11:40.663411 4792 scope.go:117] "RemoveContainer" containerID="27ebefbb6ee4b1e51f47c3813f2b791c0d513924dd39fe81cd2d9d65a223b925" Mar 09 10:11:40 crc kubenswrapper[4792]: E0309 10:11:40.664304 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-97tth_openshift-machine-config-operator(bd11045a-d746-4b42-872c-8b8d1dd2d515)\"" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" Mar 09 10:11:43 crc kubenswrapper[4792]: I0309 10:11:43.709334 4792 scope.go:117] "RemoveContainer" containerID="fcca30b03fa1385267a0cc8bc61bec74dd50819cddeafafab2c3e05fbf997918" Mar 09 10:11:43 crc kubenswrapper[4792]: I0309 10:11:43.735512 4792 scope.go:117] "RemoveContainer" containerID="8b10739ffff403a67411cdc03c4f0542e1ef6be44b8b7bd8574b945fc5079308" Mar 09 10:11:43 crc kubenswrapper[4792]: I0309 10:11:43.784668 4792 scope.go:117] "RemoveContainer" containerID="bbb5c1a656224ced1e1465fcf7fb0e58297f94be6c67b8e9c4f1a76ff4a25bbd" Mar 09 10:11:51 crc kubenswrapper[4792]: I0309 10:11:51.662671 4792 scope.go:117] "RemoveContainer" containerID="27ebefbb6ee4b1e51f47c3813f2b791c0d513924dd39fe81cd2d9d65a223b925" Mar 09 10:11:51 crc kubenswrapper[4792]: E0309 10:11:51.664308 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-97tth_openshift-machine-config-operator(bd11045a-d746-4b42-872c-8b8d1dd2d515)\"" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" Mar 09 10:12:00 crc kubenswrapper[4792]: I0309 10:12:00.147012 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550852-2vflv"] Mar 09 10:12:00 crc kubenswrapper[4792]: E0309 10:12:00.149213 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7e43865-caa1-4394-8217-253ef89ddbf1" containerName="oc" Mar 09 10:12:00 crc kubenswrapper[4792]: I0309 10:12:00.149343 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7e43865-caa1-4394-8217-253ef89ddbf1" containerName="oc" Mar 09 10:12:00 crc kubenswrapper[4792]: I0309 10:12:00.149711 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7e43865-caa1-4394-8217-253ef89ddbf1" containerName="oc" Mar 09 10:12:00 crc kubenswrapper[4792]: I0309 10:12:00.150618 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550852-2vflv" Mar 09 10:12:00 crc kubenswrapper[4792]: I0309 10:12:00.154142 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 10:12:00 crc kubenswrapper[4792]: I0309 10:12:00.154443 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fwclj" Mar 09 10:12:00 crc kubenswrapper[4792]: I0309 10:12:00.155986 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 10:12:00 crc kubenswrapper[4792]: I0309 10:12:00.164423 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550852-2vflv"] Mar 09 10:12:00 crc kubenswrapper[4792]: I0309 10:12:00.282108 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7kf4k\" (UniqueName: \"kubernetes.io/projected/7afc0cfb-62db-43e9-b046-d712ef91a4fd-kube-api-access-7kf4k\") pod \"auto-csr-approver-29550852-2vflv\" (UID: \"7afc0cfb-62db-43e9-b046-d712ef91a4fd\") " pod="openshift-infra/auto-csr-approver-29550852-2vflv" Mar 09 10:12:00 crc kubenswrapper[4792]: I0309 10:12:00.384235 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7kf4k\" (UniqueName: \"kubernetes.io/projected/7afc0cfb-62db-43e9-b046-d712ef91a4fd-kube-api-access-7kf4k\") pod \"auto-csr-approver-29550852-2vflv\" (UID: \"7afc0cfb-62db-43e9-b046-d712ef91a4fd\") " pod="openshift-infra/auto-csr-approver-29550852-2vflv" Mar 09 10:12:00 crc kubenswrapper[4792]: I0309 10:12:00.413589 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7kf4k\" (UniqueName: \"kubernetes.io/projected/7afc0cfb-62db-43e9-b046-d712ef91a4fd-kube-api-access-7kf4k\") pod \"auto-csr-approver-29550852-2vflv\" (UID: \"7afc0cfb-62db-43e9-b046-d712ef91a4fd\") " 
pod="openshift-infra/auto-csr-approver-29550852-2vflv" Mar 09 10:12:00 crc kubenswrapper[4792]: I0309 10:12:00.474230 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550852-2vflv" Mar 09 10:12:00 crc kubenswrapper[4792]: I0309 10:12:00.966372 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550852-2vflv"] Mar 09 10:12:01 crc kubenswrapper[4792]: I0309 10:12:01.037513 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550852-2vflv" event={"ID":"7afc0cfb-62db-43e9-b046-d712ef91a4fd","Type":"ContainerStarted","Data":"fcf5e03a5055d6b27dce9aee11b3063dbfcde42d5f10cfd1bd2b99e3266e8a6e"} Mar 09 10:12:03 crc kubenswrapper[4792]: I0309 10:12:03.064570 4792 generic.go:334] "Generic (PLEG): container finished" podID="7afc0cfb-62db-43e9-b046-d712ef91a4fd" containerID="f09c17d8d7d1d24f13768107f8ab8c0aecfc3d7b9451ed7de38cf73353c33fe1" exitCode=0 Mar 09 10:12:03 crc kubenswrapper[4792]: I0309 10:12:03.064664 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550852-2vflv" event={"ID":"7afc0cfb-62db-43e9-b046-d712ef91a4fd","Type":"ContainerDied","Data":"f09c17d8d7d1d24f13768107f8ab8c0aecfc3d7b9451ed7de38cf73353c33fe1"} Mar 09 10:12:03 crc kubenswrapper[4792]: I0309 10:12:03.664926 4792 scope.go:117] "RemoveContainer" containerID="27ebefbb6ee4b1e51f47c3813f2b791c0d513924dd39fe81cd2d9d65a223b925" Mar 09 10:12:03 crc kubenswrapper[4792]: E0309 10:12:03.665181 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-97tth_openshift-machine-config-operator(bd11045a-d746-4b42-872c-8b8d1dd2d515)\"" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" 
Mar 09 10:12:04 crc kubenswrapper[4792]: I0309 10:12:04.541681 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550852-2vflv" Mar 09 10:12:04 crc kubenswrapper[4792]: I0309 10:12:04.679283 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7kf4k\" (UniqueName: \"kubernetes.io/projected/7afc0cfb-62db-43e9-b046-d712ef91a4fd-kube-api-access-7kf4k\") pod \"7afc0cfb-62db-43e9-b046-d712ef91a4fd\" (UID: \"7afc0cfb-62db-43e9-b046-d712ef91a4fd\") " Mar 09 10:12:04 crc kubenswrapper[4792]: I0309 10:12:04.688138 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7afc0cfb-62db-43e9-b046-d712ef91a4fd-kube-api-access-7kf4k" (OuterVolumeSpecName: "kube-api-access-7kf4k") pod "7afc0cfb-62db-43e9-b046-d712ef91a4fd" (UID: "7afc0cfb-62db-43e9-b046-d712ef91a4fd"). InnerVolumeSpecName "kube-api-access-7kf4k". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 10:12:04 crc kubenswrapper[4792]: I0309 10:12:04.782332 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7kf4k\" (UniqueName: \"kubernetes.io/projected/7afc0cfb-62db-43e9-b046-d712ef91a4fd-kube-api-access-7kf4k\") on node \"crc\" DevicePath \"\"" Mar 09 10:12:05 crc kubenswrapper[4792]: I0309 10:12:05.082686 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550852-2vflv" event={"ID":"7afc0cfb-62db-43e9-b046-d712ef91a4fd","Type":"ContainerDied","Data":"fcf5e03a5055d6b27dce9aee11b3063dbfcde42d5f10cfd1bd2b99e3266e8a6e"} Mar 09 10:12:05 crc kubenswrapper[4792]: I0309 10:12:05.082723 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550852-2vflv" Mar 09 10:12:05 crc kubenswrapper[4792]: I0309 10:12:05.082727 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fcf5e03a5055d6b27dce9aee11b3063dbfcde42d5f10cfd1bd2b99e3266e8a6e" Mar 09 10:12:05 crc kubenswrapper[4792]: I0309 10:12:05.644056 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550846-vdj9q"] Mar 09 10:12:05 crc kubenswrapper[4792]: I0309 10:12:05.660545 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550846-vdj9q"] Mar 09 10:12:05 crc kubenswrapper[4792]: I0309 10:12:05.674223 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6390ec2d-07a7-446b-ad3f-38a397d3a436" path="/var/lib/kubelet/pods/6390ec2d-07a7-446b-ad3f-38a397d3a436/volumes" Mar 09 10:12:17 crc kubenswrapper[4792]: I0309 10:12:17.666087 4792 scope.go:117] "RemoveContainer" containerID="27ebefbb6ee4b1e51f47c3813f2b791c0d513924dd39fe81cd2d9d65a223b925" Mar 09 10:12:17 crc kubenswrapper[4792]: E0309 10:12:17.666934 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-97tth_openshift-machine-config-operator(bd11045a-d746-4b42-872c-8b8d1dd2d515)\"" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" Mar 09 10:12:31 crc kubenswrapper[4792]: I0309 10:12:31.662730 4792 scope.go:117] "RemoveContainer" containerID="27ebefbb6ee4b1e51f47c3813f2b791c0d513924dd39fe81cd2d9d65a223b925" Mar 09 10:12:31 crc kubenswrapper[4792]: E0309 10:12:31.663695 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-97tth_openshift-machine-config-operator(bd11045a-d746-4b42-872c-8b8d1dd2d515)\"" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" Mar 09 10:12:42 crc kubenswrapper[4792]: I0309 10:12:42.663110 4792 scope.go:117] "RemoveContainer" containerID="27ebefbb6ee4b1e51f47c3813f2b791c0d513924dd39fe81cd2d9d65a223b925" Mar 09 10:12:42 crc kubenswrapper[4792]: E0309 10:12:42.663907 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-97tth_openshift-machine-config-operator(bd11045a-d746-4b42-872c-8b8d1dd2d515)\"" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" Mar 09 10:12:43 crc kubenswrapper[4792]: I0309 10:12:43.892494 4792 scope.go:117] "RemoveContainer" containerID="398fe95a0a57611932be7a1180c1f118048ee29c9bbf357fb8488d8b8994830b" Mar 09 10:12:57 crc kubenswrapper[4792]: I0309 10:12:57.664572 4792 scope.go:117] "RemoveContainer" containerID="27ebefbb6ee4b1e51f47c3813f2b791c0d513924dd39fe81cd2d9d65a223b925" Mar 09 10:12:58 crc kubenswrapper[4792]: I0309 10:12:58.548045 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-97tth" event={"ID":"bd11045a-d746-4b42-872c-8b8d1dd2d515","Type":"ContainerStarted","Data":"61823a1cefc0652bd0396fa1996ffc2e59e5ea4df20bbaa1b20c0c3f986988b7"} Mar 09 10:13:22 crc kubenswrapper[4792]: I0309 10:13:22.785919 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-df5b5"] Mar 09 10:13:22 crc kubenswrapper[4792]: E0309 10:13:22.787542 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7afc0cfb-62db-43e9-b046-d712ef91a4fd" containerName="oc" Mar 09 10:13:22 crc 
kubenswrapper[4792]: I0309 10:13:22.787559 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="7afc0cfb-62db-43e9-b046-d712ef91a4fd" containerName="oc" Mar 09 10:13:22 crc kubenswrapper[4792]: I0309 10:13:22.787778 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="7afc0cfb-62db-43e9-b046-d712ef91a4fd" containerName="oc" Mar 09 10:13:22 crc kubenswrapper[4792]: I0309 10:13:22.789432 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-df5b5" Mar 09 10:13:22 crc kubenswrapper[4792]: I0309 10:13:22.809939 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-df5b5"] Mar 09 10:13:22 crc kubenswrapper[4792]: I0309 10:13:22.894132 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/32b6d3e7-35bb-4197-9268-18b4a4e39098-utilities\") pod \"redhat-operators-df5b5\" (UID: \"32b6d3e7-35bb-4197-9268-18b4a4e39098\") " pod="openshift-marketplace/redhat-operators-df5b5" Mar 09 10:13:22 crc kubenswrapper[4792]: I0309 10:13:22.894468 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f2rb6\" (UniqueName: \"kubernetes.io/projected/32b6d3e7-35bb-4197-9268-18b4a4e39098-kube-api-access-f2rb6\") pod \"redhat-operators-df5b5\" (UID: \"32b6d3e7-35bb-4197-9268-18b4a4e39098\") " pod="openshift-marketplace/redhat-operators-df5b5" Mar 09 10:13:22 crc kubenswrapper[4792]: I0309 10:13:22.894735 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/32b6d3e7-35bb-4197-9268-18b4a4e39098-catalog-content\") pod \"redhat-operators-df5b5\" (UID: \"32b6d3e7-35bb-4197-9268-18b4a4e39098\") " pod="openshift-marketplace/redhat-operators-df5b5" Mar 09 10:13:22 crc kubenswrapper[4792]: I0309 
10:13:22.996622 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/32b6d3e7-35bb-4197-9268-18b4a4e39098-utilities\") pod \"redhat-operators-df5b5\" (UID: \"32b6d3e7-35bb-4197-9268-18b4a4e39098\") " pod="openshift-marketplace/redhat-operators-df5b5" Mar 09 10:13:22 crc kubenswrapper[4792]: I0309 10:13:22.996736 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f2rb6\" (UniqueName: \"kubernetes.io/projected/32b6d3e7-35bb-4197-9268-18b4a4e39098-kube-api-access-f2rb6\") pod \"redhat-operators-df5b5\" (UID: \"32b6d3e7-35bb-4197-9268-18b4a4e39098\") " pod="openshift-marketplace/redhat-operators-df5b5" Mar 09 10:13:22 crc kubenswrapper[4792]: I0309 10:13:22.996785 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/32b6d3e7-35bb-4197-9268-18b4a4e39098-catalog-content\") pod \"redhat-operators-df5b5\" (UID: \"32b6d3e7-35bb-4197-9268-18b4a4e39098\") " pod="openshift-marketplace/redhat-operators-df5b5" Mar 09 10:13:22 crc kubenswrapper[4792]: I0309 10:13:22.997390 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/32b6d3e7-35bb-4197-9268-18b4a4e39098-catalog-content\") pod \"redhat-operators-df5b5\" (UID: \"32b6d3e7-35bb-4197-9268-18b4a4e39098\") " pod="openshift-marketplace/redhat-operators-df5b5" Mar 09 10:13:22 crc kubenswrapper[4792]: I0309 10:13:22.997573 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/32b6d3e7-35bb-4197-9268-18b4a4e39098-utilities\") pod \"redhat-operators-df5b5\" (UID: \"32b6d3e7-35bb-4197-9268-18b4a4e39098\") " pod="openshift-marketplace/redhat-operators-df5b5" Mar 09 10:13:23 crc kubenswrapper[4792]: I0309 10:13:23.019918 4792 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-f2rb6\" (UniqueName: \"kubernetes.io/projected/32b6d3e7-35bb-4197-9268-18b4a4e39098-kube-api-access-f2rb6\") pod \"redhat-operators-df5b5\" (UID: \"32b6d3e7-35bb-4197-9268-18b4a4e39098\") " pod="openshift-marketplace/redhat-operators-df5b5" Mar 09 10:13:23 crc kubenswrapper[4792]: I0309 10:13:23.115153 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-df5b5" Mar 09 10:13:23 crc kubenswrapper[4792]: I0309 10:13:23.616512 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-df5b5"] Mar 09 10:13:24 crc kubenswrapper[4792]: I0309 10:13:24.804017 4792 generic.go:334] "Generic (PLEG): container finished" podID="32b6d3e7-35bb-4197-9268-18b4a4e39098" containerID="77f0c55787fe37ae86ecc07cf7d08d7f76c8b63a7ac345ca10c039a47746cb0f" exitCode=0 Mar 09 10:13:24 crc kubenswrapper[4792]: I0309 10:13:24.804319 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-df5b5" event={"ID":"32b6d3e7-35bb-4197-9268-18b4a4e39098","Type":"ContainerDied","Data":"77f0c55787fe37ae86ecc07cf7d08d7f76c8b63a7ac345ca10c039a47746cb0f"} Mar 09 10:13:24 crc kubenswrapper[4792]: I0309 10:13:24.804579 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-df5b5" event={"ID":"32b6d3e7-35bb-4197-9268-18b4a4e39098","Type":"ContainerStarted","Data":"3e78b670e2e76e8b62244ae9abd8c7a7b9d78702ff09df1af3fee05a7b8fe684"} Mar 09 10:13:26 crc kubenswrapper[4792]: I0309 10:13:26.829659 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-df5b5" event={"ID":"32b6d3e7-35bb-4197-9268-18b4a4e39098","Type":"ContainerStarted","Data":"73b9b1fada2a32e955d55560c71b05efac4fea1fdc7c73b71120015a19364000"} Mar 09 10:13:32 crc kubenswrapper[4792]: I0309 10:13:32.877663 4792 generic.go:334] "Generic (PLEG): container finished" 
podID="32b6d3e7-35bb-4197-9268-18b4a4e39098" containerID="73b9b1fada2a32e955d55560c71b05efac4fea1fdc7c73b71120015a19364000" exitCode=0 Mar 09 10:13:32 crc kubenswrapper[4792]: I0309 10:13:32.877819 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-df5b5" event={"ID":"32b6d3e7-35bb-4197-9268-18b4a4e39098","Type":"ContainerDied","Data":"73b9b1fada2a32e955d55560c71b05efac4fea1fdc7c73b71120015a19364000"} Mar 09 10:13:34 crc kubenswrapper[4792]: I0309 10:13:34.896760 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-df5b5" event={"ID":"32b6d3e7-35bb-4197-9268-18b4a4e39098","Type":"ContainerStarted","Data":"b8a4bdcc89a9ab6f51a0843b0304171857903f26523c6dcebb4b62ea9ce4e489"} Mar 09 10:13:34 crc kubenswrapper[4792]: I0309 10:13:34.922344 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-df5b5" podStartSLOduration=4.217702789 podStartE2EDuration="12.922322523s" podCreationTimestamp="2026-03-09 10:13:22 +0000 UTC" firstStartedPulling="2026-03-09 10:13:24.806002001 +0000 UTC m=+3969.836202753" lastFinishedPulling="2026-03-09 10:13:33.510621735 +0000 UTC m=+3978.540822487" observedRunningTime="2026-03-09 10:13:34.912239413 +0000 UTC m=+3979.942440175" watchObservedRunningTime="2026-03-09 10:13:34.922322523 +0000 UTC m=+3979.952523275" Mar 09 10:13:43 crc kubenswrapper[4792]: I0309 10:13:43.115758 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-df5b5" Mar 09 10:13:43 crc kubenswrapper[4792]: I0309 10:13:43.116414 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-df5b5" Mar 09 10:13:44 crc kubenswrapper[4792]: I0309 10:13:44.160179 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-df5b5" podUID="32b6d3e7-35bb-4197-9268-18b4a4e39098" 
containerName="registry-server" probeResult="failure" output=< Mar 09 10:13:44 crc kubenswrapper[4792]: timeout: failed to connect service ":50051" within 1s Mar 09 10:13:44 crc kubenswrapper[4792]: > Mar 09 10:13:54 crc kubenswrapper[4792]: I0309 10:13:54.163636 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-df5b5" podUID="32b6d3e7-35bb-4197-9268-18b4a4e39098" containerName="registry-server" probeResult="failure" output=< Mar 09 10:13:54 crc kubenswrapper[4792]: timeout: failed to connect service ":50051" within 1s Mar 09 10:13:54 crc kubenswrapper[4792]: > Mar 09 10:14:00 crc kubenswrapper[4792]: I0309 10:14:00.212298 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550854-4h7rq"] Mar 09 10:14:00 crc kubenswrapper[4792]: I0309 10:14:00.214423 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550854-4h7rq" Mar 09 10:14:00 crc kubenswrapper[4792]: I0309 10:14:00.216902 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 10:14:00 crc kubenswrapper[4792]: I0309 10:14:00.217383 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 10:14:00 crc kubenswrapper[4792]: I0309 10:14:00.217482 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fwclj" Mar 09 10:14:00 crc kubenswrapper[4792]: I0309 10:14:00.237396 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550854-4h7rq"] Mar 09 10:14:00 crc kubenswrapper[4792]: I0309 10:14:00.237833 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bh96n\" (UniqueName: \"kubernetes.io/projected/919cdc0a-fbb2-4e1e-9849-501d2191f074-kube-api-access-bh96n\") pod 
\"auto-csr-approver-29550854-4h7rq\" (UID: \"919cdc0a-fbb2-4e1e-9849-501d2191f074\") " pod="openshift-infra/auto-csr-approver-29550854-4h7rq" Mar 09 10:14:00 crc kubenswrapper[4792]: I0309 10:14:00.339848 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bh96n\" (UniqueName: \"kubernetes.io/projected/919cdc0a-fbb2-4e1e-9849-501d2191f074-kube-api-access-bh96n\") pod \"auto-csr-approver-29550854-4h7rq\" (UID: \"919cdc0a-fbb2-4e1e-9849-501d2191f074\") " pod="openshift-infra/auto-csr-approver-29550854-4h7rq" Mar 09 10:14:00 crc kubenswrapper[4792]: I0309 10:14:00.368546 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bh96n\" (UniqueName: \"kubernetes.io/projected/919cdc0a-fbb2-4e1e-9849-501d2191f074-kube-api-access-bh96n\") pod \"auto-csr-approver-29550854-4h7rq\" (UID: \"919cdc0a-fbb2-4e1e-9849-501d2191f074\") " pod="openshift-infra/auto-csr-approver-29550854-4h7rq" Mar 09 10:14:00 crc kubenswrapper[4792]: I0309 10:14:00.533147 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550854-4h7rq" Mar 09 10:14:01 crc kubenswrapper[4792]: I0309 10:14:01.511141 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550854-4h7rq"] Mar 09 10:14:02 crc kubenswrapper[4792]: I0309 10:14:02.186275 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550854-4h7rq" event={"ID":"919cdc0a-fbb2-4e1e-9849-501d2191f074","Type":"ContainerStarted","Data":"6214b6e641aa6cdb6f966a5ec68b3659b893743e0a522815c6f5e67b6135f2f4"} Mar 09 10:14:03 crc kubenswrapper[4792]: I0309 10:14:03.200936 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550854-4h7rq" event={"ID":"919cdc0a-fbb2-4e1e-9849-501d2191f074","Type":"ContainerStarted","Data":"d95914f38a9b5989724b30912d9d74e9e6221c985a6bd54510911176f25f7bf3"} Mar 09 10:14:03 crc kubenswrapper[4792]: I0309 10:14:03.222945 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29550854-4h7rq" podStartSLOduration=2.185108312 podStartE2EDuration="3.222925613s" podCreationTimestamp="2026-03-09 10:14:00 +0000 UTC" firstStartedPulling="2026-03-09 10:14:01.512383544 +0000 UTC m=+4006.542584296" lastFinishedPulling="2026-03-09 10:14:02.550200845 +0000 UTC m=+4007.580401597" observedRunningTime="2026-03-09 10:14:03.220818576 +0000 UTC m=+4008.251019328" watchObservedRunningTime="2026-03-09 10:14:03.222925613 +0000 UTC m=+4008.253126365" Mar 09 10:14:03 crc kubenswrapper[4792]: I0309 10:14:03.539345 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-js5pp"] Mar 09 10:14:03 crc kubenswrapper[4792]: I0309 10:14:03.553182 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-js5pp" Mar 09 10:14:03 crc kubenswrapper[4792]: I0309 10:14:03.560623 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-js5pp"] Mar 09 10:14:03 crc kubenswrapper[4792]: I0309 10:14:03.625400 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bb9d2\" (UniqueName: \"kubernetes.io/projected/f1f94226-84d1-40a6-b197-ba54d857d72a-kube-api-access-bb9d2\") pod \"certified-operators-js5pp\" (UID: \"f1f94226-84d1-40a6-b197-ba54d857d72a\") " pod="openshift-marketplace/certified-operators-js5pp" Mar 09 10:14:03 crc kubenswrapper[4792]: I0309 10:14:03.625460 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f1f94226-84d1-40a6-b197-ba54d857d72a-utilities\") pod \"certified-operators-js5pp\" (UID: \"f1f94226-84d1-40a6-b197-ba54d857d72a\") " pod="openshift-marketplace/certified-operators-js5pp" Mar 09 10:14:03 crc kubenswrapper[4792]: I0309 10:14:03.625679 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f1f94226-84d1-40a6-b197-ba54d857d72a-catalog-content\") pod \"certified-operators-js5pp\" (UID: \"f1f94226-84d1-40a6-b197-ba54d857d72a\") " pod="openshift-marketplace/certified-operators-js5pp" Mar 09 10:14:03 crc kubenswrapper[4792]: I0309 10:14:03.728118 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f1f94226-84d1-40a6-b197-ba54d857d72a-catalog-content\") pod \"certified-operators-js5pp\" (UID: \"f1f94226-84d1-40a6-b197-ba54d857d72a\") " pod="openshift-marketplace/certified-operators-js5pp" Mar 09 10:14:03 crc kubenswrapper[4792]: I0309 10:14:03.728269 4792 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-bb9d2\" (UniqueName: \"kubernetes.io/projected/f1f94226-84d1-40a6-b197-ba54d857d72a-kube-api-access-bb9d2\") pod \"certified-operators-js5pp\" (UID: \"f1f94226-84d1-40a6-b197-ba54d857d72a\") " pod="openshift-marketplace/certified-operators-js5pp" Mar 09 10:14:03 crc kubenswrapper[4792]: I0309 10:14:03.728293 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f1f94226-84d1-40a6-b197-ba54d857d72a-utilities\") pod \"certified-operators-js5pp\" (UID: \"f1f94226-84d1-40a6-b197-ba54d857d72a\") " pod="openshift-marketplace/certified-operators-js5pp" Mar 09 10:14:03 crc kubenswrapper[4792]: I0309 10:14:03.728928 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f1f94226-84d1-40a6-b197-ba54d857d72a-catalog-content\") pod \"certified-operators-js5pp\" (UID: \"f1f94226-84d1-40a6-b197-ba54d857d72a\") " pod="openshift-marketplace/certified-operators-js5pp" Mar 09 10:14:03 crc kubenswrapper[4792]: I0309 10:14:03.729596 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f1f94226-84d1-40a6-b197-ba54d857d72a-utilities\") pod \"certified-operators-js5pp\" (UID: \"f1f94226-84d1-40a6-b197-ba54d857d72a\") " pod="openshift-marketplace/certified-operators-js5pp" Mar 09 10:14:03 crc kubenswrapper[4792]: I0309 10:14:03.751937 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bb9d2\" (UniqueName: \"kubernetes.io/projected/f1f94226-84d1-40a6-b197-ba54d857d72a-kube-api-access-bb9d2\") pod \"certified-operators-js5pp\" (UID: \"f1f94226-84d1-40a6-b197-ba54d857d72a\") " pod="openshift-marketplace/certified-operators-js5pp" Mar 09 10:14:03 crc kubenswrapper[4792]: I0309 10:14:03.896954 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-js5pp"
Mar 09 10:14:04 crc kubenswrapper[4792]: I0309 10:14:04.181455 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-df5b5" podUID="32b6d3e7-35bb-4197-9268-18b4a4e39098" containerName="registry-server" probeResult="failure" output=<
Mar 09 10:14:04 crc kubenswrapper[4792]: timeout: failed to connect service ":50051" within 1s
Mar 09 10:14:04 crc kubenswrapper[4792]: >
Mar 09 10:14:04 crc kubenswrapper[4792]: I0309 10:14:04.426212 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-js5pp"]
Mar 09 10:14:04 crc kubenswrapper[4792]: W0309 10:14:04.429424 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf1f94226_84d1_40a6_b197_ba54d857d72a.slice/crio-18d049be75da1fb3e2d5cea88e354b28af0ec97557f7f54e8bfa2ac5a242421f WatchSource:0}: Error finding container 18d049be75da1fb3e2d5cea88e354b28af0ec97557f7f54e8bfa2ac5a242421f: Status 404 returned error can't find the container with id 18d049be75da1fb3e2d5cea88e354b28af0ec97557f7f54e8bfa2ac5a242421f
Mar 09 10:14:05 crc kubenswrapper[4792]: I0309 10:14:05.221813 4792 generic.go:334] "Generic (PLEG): container finished" podID="f1f94226-84d1-40a6-b197-ba54d857d72a" containerID="00d88216b67350b8919b90d4ef3dc5007f1c042418fb8903438275216d34d7e5" exitCode=0
Mar 09 10:14:05 crc kubenswrapper[4792]: I0309 10:14:05.221922 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-js5pp" event={"ID":"f1f94226-84d1-40a6-b197-ba54d857d72a","Type":"ContainerDied","Data":"00d88216b67350b8919b90d4ef3dc5007f1c042418fb8903438275216d34d7e5"}
Mar 09 10:14:05 crc kubenswrapper[4792]: I0309 10:14:05.222210 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-js5pp" event={"ID":"f1f94226-84d1-40a6-b197-ba54d857d72a","Type":"ContainerStarted","Data":"18d049be75da1fb3e2d5cea88e354b28af0ec97557f7f54e8bfa2ac5a242421f"}
Mar 09 10:14:05 crc kubenswrapper[4792]: I0309 10:14:05.225535 4792 generic.go:334] "Generic (PLEG): container finished" podID="919cdc0a-fbb2-4e1e-9849-501d2191f074" containerID="d95914f38a9b5989724b30912d9d74e9e6221c985a6bd54510911176f25f7bf3" exitCode=0
Mar 09 10:14:05 crc kubenswrapper[4792]: I0309 10:14:05.225585 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550854-4h7rq" event={"ID":"919cdc0a-fbb2-4e1e-9849-501d2191f074","Type":"ContainerDied","Data":"d95914f38a9b5989724b30912d9d74e9e6221c985a6bd54510911176f25f7bf3"}
Mar 09 10:14:06 crc kubenswrapper[4792]: I0309 10:14:06.238162 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-js5pp" event={"ID":"f1f94226-84d1-40a6-b197-ba54d857d72a","Type":"ContainerStarted","Data":"2f028cbed739d0b004f136a4501f8e8b4bdeffae74b4b53338ddc2b46c34079f"}
Mar 09 10:14:06 crc kubenswrapper[4792]: I0309 10:14:06.677800 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550854-4h7rq"
Mar 09 10:14:06 crc kubenswrapper[4792]: I0309 10:14:06.837578 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bh96n\" (UniqueName: \"kubernetes.io/projected/919cdc0a-fbb2-4e1e-9849-501d2191f074-kube-api-access-bh96n\") pod \"919cdc0a-fbb2-4e1e-9849-501d2191f074\" (UID: \"919cdc0a-fbb2-4e1e-9849-501d2191f074\") "
Mar 09 10:14:06 crc kubenswrapper[4792]: I0309 10:14:06.843748 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/919cdc0a-fbb2-4e1e-9849-501d2191f074-kube-api-access-bh96n" (OuterVolumeSpecName: "kube-api-access-bh96n") pod "919cdc0a-fbb2-4e1e-9849-501d2191f074" (UID: "919cdc0a-fbb2-4e1e-9849-501d2191f074"). InnerVolumeSpecName "kube-api-access-bh96n". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 10:14:06 crc kubenswrapper[4792]: I0309 10:14:06.940693 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bh96n\" (UniqueName: \"kubernetes.io/projected/919cdc0a-fbb2-4e1e-9849-501d2191f074-kube-api-access-bh96n\") on node \"crc\" DevicePath \"\""
Mar 09 10:14:07 crc kubenswrapper[4792]: I0309 10:14:07.253547 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550854-4h7rq"
Mar 09 10:14:07 crc kubenswrapper[4792]: I0309 10:14:07.253543 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550854-4h7rq" event={"ID":"919cdc0a-fbb2-4e1e-9849-501d2191f074","Type":"ContainerDied","Data":"6214b6e641aa6cdb6f966a5ec68b3659b893743e0a522815c6f5e67b6135f2f4"}
Mar 09 10:14:07 crc kubenswrapper[4792]: I0309 10:14:07.257728 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6214b6e641aa6cdb6f966a5ec68b3659b893743e0a522815c6f5e67b6135f2f4"
Mar 09 10:14:07 crc kubenswrapper[4792]: I0309 10:14:07.369665 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550848-nx7bn"]
Mar 09 10:14:07 crc kubenswrapper[4792]: I0309 10:14:07.378295 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550848-nx7bn"]
Mar 09 10:14:07 crc kubenswrapper[4792]: I0309 10:14:07.673886 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6da3a985-08b0-42d4-951c-1bd67f84eeba" path="/var/lib/kubelet/pods/6da3a985-08b0-42d4-951c-1bd67f84eeba/volumes"
Mar 09 10:14:09 crc kubenswrapper[4792]: I0309 10:14:09.274500 4792 generic.go:334] "Generic (PLEG): container finished" podID="f1f94226-84d1-40a6-b197-ba54d857d72a" containerID="2f028cbed739d0b004f136a4501f8e8b4bdeffae74b4b53338ddc2b46c34079f" exitCode=0
Mar 09 10:14:09 crc kubenswrapper[4792]: I0309 10:14:09.274585 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-js5pp" event={"ID":"f1f94226-84d1-40a6-b197-ba54d857d72a","Type":"ContainerDied","Data":"2f028cbed739d0b004f136a4501f8e8b4bdeffae74b4b53338ddc2b46c34079f"}
Mar 09 10:14:10 crc kubenswrapper[4792]: I0309 10:14:10.286296 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-js5pp" event={"ID":"f1f94226-84d1-40a6-b197-ba54d857d72a","Type":"ContainerStarted","Data":"85eff8b3fad9c032f49d0c3097c11d04e542042de18f9627b8a9dbed22e5a6e1"}
Mar 09 10:14:10 crc kubenswrapper[4792]: I0309 10:14:10.313725 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-js5pp" podStartSLOduration=2.849788378 podStartE2EDuration="7.313703837s" podCreationTimestamp="2026-03-09 10:14:03 +0000 UTC" firstStartedPulling="2026-03-09 10:14:05.224962743 +0000 UTC m=+4010.255163495" lastFinishedPulling="2026-03-09 10:14:09.688878202 +0000 UTC m=+4014.719078954" observedRunningTime="2026-03-09 10:14:10.303519835 +0000 UTC m=+4015.333720597" watchObservedRunningTime="2026-03-09 10:14:10.313703837 +0000 UTC m=+4015.343904589"
Mar 09 10:14:13 crc kubenswrapper[4792]: I0309 10:14:13.897817 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-js5pp"
Mar 09 10:14:13 crc kubenswrapper[4792]: I0309 10:14:13.899248 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-js5pp"
Mar 09 10:14:14 crc kubenswrapper[4792]: I0309 10:14:14.157631 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-df5b5" podUID="32b6d3e7-35bb-4197-9268-18b4a4e39098" containerName="registry-server" probeResult="failure" output=<
Mar 09 10:14:14 crc kubenswrapper[4792]: timeout: failed to connect service ":50051" within 1s
Mar 09 10:14:14 crc kubenswrapper[4792]: >
Mar 09 10:14:14 crc kubenswrapper[4792]: I0309 10:14:14.948896 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-js5pp" podUID="f1f94226-84d1-40a6-b197-ba54d857d72a" containerName="registry-server" probeResult="failure" output=<
Mar 09 10:14:14 crc kubenswrapper[4792]: timeout: failed to connect service ":50051" within 1s
Mar 09 10:14:14 crc kubenswrapper[4792]: >
Mar 09 10:14:24 crc kubenswrapper[4792]: I0309 10:14:24.169211 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-df5b5" podUID="32b6d3e7-35bb-4197-9268-18b4a4e39098" containerName="registry-server" probeResult="failure" output=<
Mar 09 10:14:24 crc kubenswrapper[4792]: timeout: failed to connect service ":50051" within 1s
Mar 09 10:14:24 crc kubenswrapper[4792]: >
Mar 09 10:14:24 crc kubenswrapper[4792]: I0309 10:14:24.952132 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-js5pp" podUID="f1f94226-84d1-40a6-b197-ba54d857d72a" containerName="registry-server" probeResult="failure" output=<
Mar 09 10:14:24 crc kubenswrapper[4792]: timeout: failed to connect service ":50051" within 1s
Mar 09 10:14:24 crc kubenswrapper[4792]: >
Mar 09 10:14:33 crc kubenswrapper[4792]: I0309 10:14:33.946035 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-js5pp"
Mar 09 10:14:33 crc kubenswrapper[4792]: I0309 10:14:33.996974 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-js5pp"
Mar 09 10:14:34 crc kubenswrapper[4792]: I0309 10:14:34.166091 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-df5b5" podUID="32b6d3e7-35bb-4197-9268-18b4a4e39098" containerName="registry-server" probeResult="failure" output=<
Mar 09 10:14:34 crc kubenswrapper[4792]: timeout: failed to connect service ":50051" within 1s
Mar 09 10:14:34 crc kubenswrapper[4792]: >
Mar 09 10:14:34 crc kubenswrapper[4792]: I0309 10:14:34.744746 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-js5pp"]
Mar 09 10:14:35 crc kubenswrapper[4792]: I0309 10:14:35.516758 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-js5pp" podUID="f1f94226-84d1-40a6-b197-ba54d857d72a" containerName="registry-server" containerID="cri-o://85eff8b3fad9c032f49d0c3097c11d04e542042de18f9627b8a9dbed22e5a6e1" gracePeriod=2
Mar 09 10:14:36 crc kubenswrapper[4792]: I0309 10:14:36.290408 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-js5pp"
Mar 09 10:14:36 crc kubenswrapper[4792]: I0309 10:14:36.393603 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bb9d2\" (UniqueName: \"kubernetes.io/projected/f1f94226-84d1-40a6-b197-ba54d857d72a-kube-api-access-bb9d2\") pod \"f1f94226-84d1-40a6-b197-ba54d857d72a\" (UID: \"f1f94226-84d1-40a6-b197-ba54d857d72a\") "
Mar 09 10:14:36 crc kubenswrapper[4792]: I0309 10:14:36.393999 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f1f94226-84d1-40a6-b197-ba54d857d72a-catalog-content\") pod \"f1f94226-84d1-40a6-b197-ba54d857d72a\" (UID: \"f1f94226-84d1-40a6-b197-ba54d857d72a\") "
Mar 09 10:14:36 crc kubenswrapper[4792]: I0309 10:14:36.394042 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f1f94226-84d1-40a6-b197-ba54d857d72a-utilities\") pod \"f1f94226-84d1-40a6-b197-ba54d857d72a\" (UID: \"f1f94226-84d1-40a6-b197-ba54d857d72a\") "
Mar 09 10:14:36 crc kubenswrapper[4792]: I0309 10:14:36.394586 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f1f94226-84d1-40a6-b197-ba54d857d72a-utilities" (OuterVolumeSpecName: "utilities") pod "f1f94226-84d1-40a6-b197-ba54d857d72a" (UID: "f1f94226-84d1-40a6-b197-ba54d857d72a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 10:14:36 crc kubenswrapper[4792]: I0309 10:14:36.395515 4792 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f1f94226-84d1-40a6-b197-ba54d857d72a-utilities\") on node \"crc\" DevicePath \"\""
Mar 09 10:14:36 crc kubenswrapper[4792]: I0309 10:14:36.415200 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f1f94226-84d1-40a6-b197-ba54d857d72a-kube-api-access-bb9d2" (OuterVolumeSpecName: "kube-api-access-bb9d2") pod "f1f94226-84d1-40a6-b197-ba54d857d72a" (UID: "f1f94226-84d1-40a6-b197-ba54d857d72a"). InnerVolumeSpecName "kube-api-access-bb9d2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 10:14:36 crc kubenswrapper[4792]: I0309 10:14:36.474364 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f1f94226-84d1-40a6-b197-ba54d857d72a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f1f94226-84d1-40a6-b197-ba54d857d72a" (UID: "f1f94226-84d1-40a6-b197-ba54d857d72a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 10:14:36 crc kubenswrapper[4792]: I0309 10:14:36.499318 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bb9d2\" (UniqueName: \"kubernetes.io/projected/f1f94226-84d1-40a6-b197-ba54d857d72a-kube-api-access-bb9d2\") on node \"crc\" DevicePath \"\""
Mar 09 10:14:36 crc kubenswrapper[4792]: I0309 10:14:36.499373 4792 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f1f94226-84d1-40a6-b197-ba54d857d72a-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 09 10:14:36 crc kubenswrapper[4792]: I0309 10:14:36.534966 4792 generic.go:334] "Generic (PLEG): container finished" podID="f1f94226-84d1-40a6-b197-ba54d857d72a" containerID="85eff8b3fad9c032f49d0c3097c11d04e542042de18f9627b8a9dbed22e5a6e1" exitCode=0
Mar 09 10:14:36 crc kubenswrapper[4792]: I0309 10:14:36.535019 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-js5pp" event={"ID":"f1f94226-84d1-40a6-b197-ba54d857d72a","Type":"ContainerDied","Data":"85eff8b3fad9c032f49d0c3097c11d04e542042de18f9627b8a9dbed22e5a6e1"}
Mar 09 10:14:36 crc kubenswrapper[4792]: I0309 10:14:36.535050 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-js5pp" event={"ID":"f1f94226-84d1-40a6-b197-ba54d857d72a","Type":"ContainerDied","Data":"18d049be75da1fb3e2d5cea88e354b28af0ec97557f7f54e8bfa2ac5a242421f"}
Mar 09 10:14:36 crc kubenswrapper[4792]: I0309 10:14:36.535081 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-js5pp"
Mar 09 10:14:36 crc kubenswrapper[4792]: I0309 10:14:36.535128 4792 scope.go:117] "RemoveContainer" containerID="85eff8b3fad9c032f49d0c3097c11d04e542042de18f9627b8a9dbed22e5a6e1"
Mar 09 10:14:36 crc kubenswrapper[4792]: I0309 10:14:36.594334 4792 scope.go:117] "RemoveContainer" containerID="2f028cbed739d0b004f136a4501f8e8b4bdeffae74b4b53338ddc2b46c34079f"
Mar 09 10:14:36 crc kubenswrapper[4792]: I0309 10:14:36.596847 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-js5pp"]
Mar 09 10:14:36 crc kubenswrapper[4792]: I0309 10:14:36.606807 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-js5pp"]
Mar 09 10:14:36 crc kubenswrapper[4792]: I0309 10:14:36.632294 4792 scope.go:117] "RemoveContainer" containerID="00d88216b67350b8919b90d4ef3dc5007f1c042418fb8903438275216d34d7e5"
Mar 09 10:14:36 crc kubenswrapper[4792]: I0309 10:14:36.680054 4792 scope.go:117] "RemoveContainer" containerID="85eff8b3fad9c032f49d0c3097c11d04e542042de18f9627b8a9dbed22e5a6e1"
Mar 09 10:14:36 crc kubenswrapper[4792]: E0309 10:14:36.680701 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"85eff8b3fad9c032f49d0c3097c11d04e542042de18f9627b8a9dbed22e5a6e1\": container with ID starting with 85eff8b3fad9c032f49d0c3097c11d04e542042de18f9627b8a9dbed22e5a6e1 not found: ID does not exist" containerID="85eff8b3fad9c032f49d0c3097c11d04e542042de18f9627b8a9dbed22e5a6e1"
Mar 09 10:14:36 crc kubenswrapper[4792]: I0309 10:14:36.680741 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85eff8b3fad9c032f49d0c3097c11d04e542042de18f9627b8a9dbed22e5a6e1"} err="failed to get container status \"85eff8b3fad9c032f49d0c3097c11d04e542042de18f9627b8a9dbed22e5a6e1\": rpc error: code = NotFound desc = could not find container \"85eff8b3fad9c032f49d0c3097c11d04e542042de18f9627b8a9dbed22e5a6e1\": container with ID starting with 85eff8b3fad9c032f49d0c3097c11d04e542042de18f9627b8a9dbed22e5a6e1 not found: ID does not exist"
Mar 09 10:14:36 crc kubenswrapper[4792]: I0309 10:14:36.680796 4792 scope.go:117] "RemoveContainer" containerID="2f028cbed739d0b004f136a4501f8e8b4bdeffae74b4b53338ddc2b46c34079f"
Mar 09 10:14:36 crc kubenswrapper[4792]: E0309 10:14:36.681113 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2f028cbed739d0b004f136a4501f8e8b4bdeffae74b4b53338ddc2b46c34079f\": container with ID starting with 2f028cbed739d0b004f136a4501f8e8b4bdeffae74b4b53338ddc2b46c34079f not found: ID does not exist" containerID="2f028cbed739d0b004f136a4501f8e8b4bdeffae74b4b53338ddc2b46c34079f"
Mar 09 10:14:36 crc kubenswrapper[4792]: I0309 10:14:36.681147 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f028cbed739d0b004f136a4501f8e8b4bdeffae74b4b53338ddc2b46c34079f"} err="failed to get container status \"2f028cbed739d0b004f136a4501f8e8b4bdeffae74b4b53338ddc2b46c34079f\": rpc error: code = NotFound desc = could not find container \"2f028cbed739d0b004f136a4501f8e8b4bdeffae74b4b53338ddc2b46c34079f\": container with ID starting with 2f028cbed739d0b004f136a4501f8e8b4bdeffae74b4b53338ddc2b46c34079f not found: ID does not exist"
Mar 09 10:14:36 crc kubenswrapper[4792]: I0309 10:14:36.681166 4792 scope.go:117] "RemoveContainer" containerID="00d88216b67350b8919b90d4ef3dc5007f1c042418fb8903438275216d34d7e5"
Mar 09 10:14:36 crc kubenswrapper[4792]: E0309 10:14:36.681425 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"00d88216b67350b8919b90d4ef3dc5007f1c042418fb8903438275216d34d7e5\": container with ID starting with 00d88216b67350b8919b90d4ef3dc5007f1c042418fb8903438275216d34d7e5 not found: ID does not exist" containerID="00d88216b67350b8919b90d4ef3dc5007f1c042418fb8903438275216d34d7e5"
Mar 09 10:14:36 crc kubenswrapper[4792]: I0309 10:14:36.681451 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"00d88216b67350b8919b90d4ef3dc5007f1c042418fb8903438275216d34d7e5"} err="failed to get container status \"00d88216b67350b8919b90d4ef3dc5007f1c042418fb8903438275216d34d7e5\": rpc error: code = NotFound desc = could not find container \"00d88216b67350b8919b90d4ef3dc5007f1c042418fb8903438275216d34d7e5\": container with ID starting with 00d88216b67350b8919b90d4ef3dc5007f1c042418fb8903438275216d34d7e5 not found: ID does not exist"
Mar 09 10:14:37 crc kubenswrapper[4792]: I0309 10:14:37.676936 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f1f94226-84d1-40a6-b197-ba54d857d72a" path="/var/lib/kubelet/pods/f1f94226-84d1-40a6-b197-ba54d857d72a/volumes"
Mar 09 10:14:43 crc kubenswrapper[4792]: I0309 10:14:43.170263 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-df5b5"
Mar 09 10:14:43 crc kubenswrapper[4792]: I0309 10:14:43.219626 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-df5b5"
Mar 09 10:14:43 crc kubenswrapper[4792]: I0309 10:14:43.415345 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-df5b5"]
Mar 09 10:14:44 crc kubenswrapper[4792]: I0309 10:14:44.011242 4792 scope.go:117] "RemoveContainer" containerID="4e4639e390b3ed25c6460fbb38e3c3385eae2cf1212d91e0bfd4452f67a90de5"
Mar 09 10:14:44 crc kubenswrapper[4792]: I0309 10:14:44.614497 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-df5b5" podUID="32b6d3e7-35bb-4197-9268-18b4a4e39098" containerName="registry-server" containerID="cri-o://b8a4bdcc89a9ab6f51a0843b0304171857903f26523c6dcebb4b62ea9ce4e489" gracePeriod=2
Mar 09 10:14:45 crc kubenswrapper[4792]: I0309 10:14:45.190723 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-df5b5"
Mar 09 10:14:45 crc kubenswrapper[4792]: I0309 10:14:45.284930 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/32b6d3e7-35bb-4197-9268-18b4a4e39098-utilities\") pod \"32b6d3e7-35bb-4197-9268-18b4a4e39098\" (UID: \"32b6d3e7-35bb-4197-9268-18b4a4e39098\") "
Mar 09 10:14:45 crc kubenswrapper[4792]: I0309 10:14:45.285248 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f2rb6\" (UniqueName: \"kubernetes.io/projected/32b6d3e7-35bb-4197-9268-18b4a4e39098-kube-api-access-f2rb6\") pod \"32b6d3e7-35bb-4197-9268-18b4a4e39098\" (UID: \"32b6d3e7-35bb-4197-9268-18b4a4e39098\") "
Mar 09 10:14:45 crc kubenswrapper[4792]: I0309 10:14:45.285324 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/32b6d3e7-35bb-4197-9268-18b4a4e39098-catalog-content\") pod \"32b6d3e7-35bb-4197-9268-18b4a4e39098\" (UID: \"32b6d3e7-35bb-4197-9268-18b4a4e39098\") "
Mar 09 10:14:45 crc kubenswrapper[4792]: I0309 10:14:45.285612 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/32b6d3e7-35bb-4197-9268-18b4a4e39098-utilities" (OuterVolumeSpecName: "utilities") pod "32b6d3e7-35bb-4197-9268-18b4a4e39098" (UID: "32b6d3e7-35bb-4197-9268-18b4a4e39098"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 10:14:45 crc kubenswrapper[4792]: I0309 10:14:45.285913 4792 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/32b6d3e7-35bb-4197-9268-18b4a4e39098-utilities\") on node \"crc\" DevicePath \"\""
Mar 09 10:14:45 crc kubenswrapper[4792]: I0309 10:14:45.291359 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32b6d3e7-35bb-4197-9268-18b4a4e39098-kube-api-access-f2rb6" (OuterVolumeSpecName: "kube-api-access-f2rb6") pod "32b6d3e7-35bb-4197-9268-18b4a4e39098" (UID: "32b6d3e7-35bb-4197-9268-18b4a4e39098"). InnerVolumeSpecName "kube-api-access-f2rb6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 10:14:45 crc kubenswrapper[4792]: I0309 10:14:45.388762 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f2rb6\" (UniqueName: \"kubernetes.io/projected/32b6d3e7-35bb-4197-9268-18b4a4e39098-kube-api-access-f2rb6\") on node \"crc\" DevicePath \"\""
Mar 09 10:14:45 crc kubenswrapper[4792]: I0309 10:14:45.425783 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/32b6d3e7-35bb-4197-9268-18b4a4e39098-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "32b6d3e7-35bb-4197-9268-18b4a4e39098" (UID: "32b6d3e7-35bb-4197-9268-18b4a4e39098"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 10:14:45 crc kubenswrapper[4792]: I0309 10:14:45.489836 4792 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/32b6d3e7-35bb-4197-9268-18b4a4e39098-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 09 10:14:45 crc kubenswrapper[4792]: I0309 10:14:45.625037 4792 generic.go:334] "Generic (PLEG): container finished" podID="32b6d3e7-35bb-4197-9268-18b4a4e39098" containerID="b8a4bdcc89a9ab6f51a0843b0304171857903f26523c6dcebb4b62ea9ce4e489" exitCode=0
Mar 09 10:14:45 crc kubenswrapper[4792]: I0309 10:14:45.625124 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-df5b5" event={"ID":"32b6d3e7-35bb-4197-9268-18b4a4e39098","Type":"ContainerDied","Data":"b8a4bdcc89a9ab6f51a0843b0304171857903f26523c6dcebb4b62ea9ce4e489"}
Mar 09 10:14:45 crc kubenswrapper[4792]: I0309 10:14:45.626168 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-df5b5" event={"ID":"32b6d3e7-35bb-4197-9268-18b4a4e39098","Type":"ContainerDied","Data":"3e78b670e2e76e8b62244ae9abd8c7a7b9d78702ff09df1af3fee05a7b8fe684"}
Mar 09 10:14:45 crc kubenswrapper[4792]: I0309 10:14:45.626196 4792 scope.go:117] "RemoveContainer" containerID="b8a4bdcc89a9ab6f51a0843b0304171857903f26523c6dcebb4b62ea9ce4e489"
Mar 09 10:14:45 crc kubenswrapper[4792]: I0309 10:14:45.625170 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-df5b5"
Mar 09 10:14:45 crc kubenswrapper[4792]: I0309 10:14:45.657339 4792 scope.go:117] "RemoveContainer" containerID="73b9b1fada2a32e955d55560c71b05efac4fea1fdc7c73b71120015a19364000"
Mar 09 10:14:45 crc kubenswrapper[4792]: I0309 10:14:45.661277 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-df5b5"]
Mar 09 10:14:45 crc kubenswrapper[4792]: I0309 10:14:45.673503 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-df5b5"]
Mar 09 10:14:45 crc kubenswrapper[4792]: I0309 10:14:45.684043 4792 scope.go:117] "RemoveContainer" containerID="77f0c55787fe37ae86ecc07cf7d08d7f76c8b63a7ac345ca10c039a47746cb0f"
Mar 09 10:14:45 crc kubenswrapper[4792]: I0309 10:14:45.727669 4792 scope.go:117] "RemoveContainer" containerID="b8a4bdcc89a9ab6f51a0843b0304171857903f26523c6dcebb4b62ea9ce4e489"
Mar 09 10:14:45 crc kubenswrapper[4792]: E0309 10:14:45.728226 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b8a4bdcc89a9ab6f51a0843b0304171857903f26523c6dcebb4b62ea9ce4e489\": container with ID starting with b8a4bdcc89a9ab6f51a0843b0304171857903f26523c6dcebb4b62ea9ce4e489 not found: ID does not exist" containerID="b8a4bdcc89a9ab6f51a0843b0304171857903f26523c6dcebb4b62ea9ce4e489"
Mar 09 10:14:45 crc kubenswrapper[4792]: I0309 10:14:45.728281 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b8a4bdcc89a9ab6f51a0843b0304171857903f26523c6dcebb4b62ea9ce4e489"} err="failed to get container status \"b8a4bdcc89a9ab6f51a0843b0304171857903f26523c6dcebb4b62ea9ce4e489\": rpc error: code = NotFound desc = could not find container \"b8a4bdcc89a9ab6f51a0843b0304171857903f26523c6dcebb4b62ea9ce4e489\": container with ID starting with b8a4bdcc89a9ab6f51a0843b0304171857903f26523c6dcebb4b62ea9ce4e489 not found: ID does not exist"
Mar 09 10:14:45 crc kubenswrapper[4792]: I0309 10:14:45.728310 4792 scope.go:117] "RemoveContainer" containerID="73b9b1fada2a32e955d55560c71b05efac4fea1fdc7c73b71120015a19364000"
Mar 09 10:14:45 crc kubenswrapper[4792]: E0309 10:14:45.728688 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"73b9b1fada2a32e955d55560c71b05efac4fea1fdc7c73b71120015a19364000\": container with ID starting with 73b9b1fada2a32e955d55560c71b05efac4fea1fdc7c73b71120015a19364000 not found: ID does not exist" containerID="73b9b1fada2a32e955d55560c71b05efac4fea1fdc7c73b71120015a19364000"
Mar 09 10:14:45 crc kubenswrapper[4792]: I0309 10:14:45.728735 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"73b9b1fada2a32e955d55560c71b05efac4fea1fdc7c73b71120015a19364000"} err="failed to get container status \"73b9b1fada2a32e955d55560c71b05efac4fea1fdc7c73b71120015a19364000\": rpc error: code = NotFound desc = could not find container \"73b9b1fada2a32e955d55560c71b05efac4fea1fdc7c73b71120015a19364000\": container with ID starting with 73b9b1fada2a32e955d55560c71b05efac4fea1fdc7c73b71120015a19364000 not found: ID does not exist"
Mar 09 10:14:45 crc kubenswrapper[4792]: I0309 10:14:45.728753 4792 scope.go:117] "RemoveContainer" containerID="77f0c55787fe37ae86ecc07cf7d08d7f76c8b63a7ac345ca10c039a47746cb0f"
Mar 09 10:14:45 crc kubenswrapper[4792]: E0309 10:14:45.729133 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"77f0c55787fe37ae86ecc07cf7d08d7f76c8b63a7ac345ca10c039a47746cb0f\": container with ID starting with 77f0c55787fe37ae86ecc07cf7d08d7f76c8b63a7ac345ca10c039a47746cb0f not found: ID does not exist" containerID="77f0c55787fe37ae86ecc07cf7d08d7f76c8b63a7ac345ca10c039a47746cb0f"
Mar 09 10:14:45 crc kubenswrapper[4792]: I0309 10:14:45.729171 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"77f0c55787fe37ae86ecc07cf7d08d7f76c8b63a7ac345ca10c039a47746cb0f"} err="failed to get container status \"77f0c55787fe37ae86ecc07cf7d08d7f76c8b63a7ac345ca10c039a47746cb0f\": rpc error: code = NotFound desc = could not find container \"77f0c55787fe37ae86ecc07cf7d08d7f76c8b63a7ac345ca10c039a47746cb0f\": container with ID starting with 77f0c55787fe37ae86ecc07cf7d08d7f76c8b63a7ac345ca10c039a47746cb0f not found: ID does not exist"
Mar 09 10:14:47 crc kubenswrapper[4792]: I0309 10:14:47.673480 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="32b6d3e7-35bb-4197-9268-18b4a4e39098" path="/var/lib/kubelet/pods/32b6d3e7-35bb-4197-9268-18b4a4e39098/volumes"
Mar 09 10:15:00 crc kubenswrapper[4792]: I0309 10:15:00.158051 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29550855-jc2j9"]
Mar 09 10:15:00 crc kubenswrapper[4792]: E0309 10:15:00.160423 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32b6d3e7-35bb-4197-9268-18b4a4e39098" containerName="extract-utilities"
Mar 09 10:15:00 crc kubenswrapper[4792]: I0309 10:15:00.160545 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="32b6d3e7-35bb-4197-9268-18b4a4e39098" containerName="extract-utilities"
Mar 09 10:15:00 crc kubenswrapper[4792]: E0309 10:15:00.160628 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="919cdc0a-fbb2-4e1e-9849-501d2191f074" containerName="oc"
Mar 09 10:15:00 crc kubenswrapper[4792]: I0309 10:15:00.160699 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="919cdc0a-fbb2-4e1e-9849-501d2191f074" containerName="oc"
Mar 09 10:15:00 crc kubenswrapper[4792]: E0309 10:15:00.160792 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32b6d3e7-35bb-4197-9268-18b4a4e39098" containerName="registry-server"
Mar 09 10:15:00 crc kubenswrapper[4792]: I0309 10:15:00.160870 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="32b6d3e7-35bb-4197-9268-18b4a4e39098" containerName="registry-server"
Mar 09 10:15:00 crc kubenswrapper[4792]: E0309 10:15:00.160956 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1f94226-84d1-40a6-b197-ba54d857d72a" containerName="extract-utilities"
Mar 09 10:15:00 crc kubenswrapper[4792]: I0309 10:15:00.161023 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1f94226-84d1-40a6-b197-ba54d857d72a" containerName="extract-utilities"
Mar 09 10:15:00 crc kubenswrapper[4792]: E0309 10:15:00.161255 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1f94226-84d1-40a6-b197-ba54d857d72a" containerName="extract-content"
Mar 09 10:15:00 crc kubenswrapper[4792]: I0309 10:15:00.161337 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1f94226-84d1-40a6-b197-ba54d857d72a" containerName="extract-content"
Mar 09 10:15:00 crc kubenswrapper[4792]: E0309 10:15:00.161407 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1f94226-84d1-40a6-b197-ba54d857d72a" containerName="registry-server"
Mar 09 10:15:00 crc kubenswrapper[4792]: I0309 10:15:00.161462 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1f94226-84d1-40a6-b197-ba54d857d72a" containerName="registry-server"
Mar 09 10:15:00 crc kubenswrapper[4792]: E0309 10:15:00.161528 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32b6d3e7-35bb-4197-9268-18b4a4e39098" containerName="extract-content"
Mar 09 10:15:00 crc kubenswrapper[4792]: I0309 10:15:00.161582 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="32b6d3e7-35bb-4197-9268-18b4a4e39098" containerName="extract-content"
Mar 09 10:15:00 crc kubenswrapper[4792]: I0309 10:15:00.161863 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1f94226-84d1-40a6-b197-ba54d857d72a" containerName="registry-server"
Mar 09 10:15:00 crc kubenswrapper[4792]: I0309 10:15:00.162010 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="32b6d3e7-35bb-4197-9268-18b4a4e39098" containerName="registry-server"
Mar 09 10:15:00 crc kubenswrapper[4792]: I0309 10:15:00.162113 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="919cdc0a-fbb2-4e1e-9849-501d2191f074" containerName="oc"
Mar 09 10:15:00 crc kubenswrapper[4792]: I0309 10:15:00.163676 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29550855-jc2j9"
Mar 09 10:15:00 crc kubenswrapper[4792]: I0309 10:15:00.167244 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Mar 09 10:15:00 crc kubenswrapper[4792]: I0309 10:15:00.168036 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Mar 09 10:15:00 crc kubenswrapper[4792]: I0309 10:15:00.176169 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29550855-jc2j9"]
Mar 09 10:15:00 crc kubenswrapper[4792]: I0309 10:15:00.196033 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2c3d6218-d702-407f-bcb4-15f064a58aa2-config-volume\") pod \"collect-profiles-29550855-jc2j9\" (UID: \"2c3d6218-d702-407f-bcb4-15f064a58aa2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550855-jc2j9"
Mar 09 10:15:00 crc kubenswrapper[4792]: I0309 10:15:00.196276 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dl96s\" (UniqueName: \"kubernetes.io/projected/2c3d6218-d702-407f-bcb4-15f064a58aa2-kube-api-access-dl96s\") pod \"collect-profiles-29550855-jc2j9\" (UID: \"2c3d6218-d702-407f-bcb4-15f064a58aa2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550855-jc2j9"
Mar 09 10:15:00 crc kubenswrapper[4792]: I0309 10:15:00.196318 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2c3d6218-d702-407f-bcb4-15f064a58aa2-secret-volume\") pod \"collect-profiles-29550855-jc2j9\" (UID: \"2c3d6218-d702-407f-bcb4-15f064a58aa2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550855-jc2j9"
Mar 09 10:15:00 crc kubenswrapper[4792]: I0309 10:15:00.298240 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dl96s\" (UniqueName: \"kubernetes.io/projected/2c3d6218-d702-407f-bcb4-15f064a58aa2-kube-api-access-dl96s\") pod \"collect-profiles-29550855-jc2j9\" (UID: \"2c3d6218-d702-407f-bcb4-15f064a58aa2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550855-jc2j9"
Mar 09 10:15:00 crc kubenswrapper[4792]: I0309 10:15:00.298331 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2c3d6218-d702-407f-bcb4-15f064a58aa2-secret-volume\") pod \"collect-profiles-29550855-jc2j9\" (UID: \"2c3d6218-d702-407f-bcb4-15f064a58aa2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550855-jc2j9"
Mar 09 10:15:00 crc kubenswrapper[4792]: I0309 10:15:00.298398 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2c3d6218-d702-407f-bcb4-15f064a58aa2-config-volume\") pod \"collect-profiles-29550855-jc2j9\" (UID: \"2c3d6218-d702-407f-bcb4-15f064a58aa2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550855-jc2j9"
Mar 09 10:15:00 crc kubenswrapper[4792]: I0309 10:15:00.299584 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2c3d6218-d702-407f-bcb4-15f064a58aa2-config-volume\") pod \"collect-profiles-29550855-jc2j9\" (UID: \"2c3d6218-d702-407f-bcb4-15f064a58aa2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550855-jc2j9"
Mar 09 10:15:00 crc kubenswrapper[4792]: I0309 10:15:00.313640 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2c3d6218-d702-407f-bcb4-15f064a58aa2-secret-volume\") pod \"collect-profiles-29550855-jc2j9\" (UID: \"2c3d6218-d702-407f-bcb4-15f064a58aa2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550855-jc2j9"
Mar 09 10:15:00 crc kubenswrapper[4792]: I0309 10:15:00.318210 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dl96s\" (UniqueName: \"kubernetes.io/projected/2c3d6218-d702-407f-bcb4-15f064a58aa2-kube-api-access-dl96s\") pod \"collect-profiles-29550855-jc2j9\" (UID: \"2c3d6218-d702-407f-bcb4-15f064a58aa2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550855-jc2j9"
Mar 09 10:15:00 crc kubenswrapper[4792]: I0309 10:15:00.496608 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29550855-jc2j9"
Mar 09 10:15:00 crc kubenswrapper[4792]: I0309 10:15:00.994539 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29550855-jc2j9"]
Mar 09 10:15:01 crc kubenswrapper[4792]: I0309 10:15:01.778294 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29550855-jc2j9" event={"ID":"2c3d6218-d702-407f-bcb4-15f064a58aa2","Type":"ContainerStarted","Data":"94df4364afeca509207d6a04cee55390e0731d2a5077b9bab1abfa57872746ff"}
Mar 09 10:15:01 crc kubenswrapper[4792]: I0309 10:15:01.779837 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29550855-jc2j9" event={"ID":"2c3d6218-d702-407f-bcb4-15f064a58aa2","Type":"ContainerStarted","Data":"ef707d16e518b7ea329b81dc014cc614980cafb9f5feb9cfa9108c7be19a12b6"}
Mar 09 10:15:02 crc kubenswrapper[4792]: I0309 10:15:02.794223 4792 generic.go:334] "Generic (PLEG): container finished" podID="2c3d6218-d702-407f-bcb4-15f064a58aa2" containerID="94df4364afeca509207d6a04cee55390e0731d2a5077b9bab1abfa57872746ff" exitCode=0
Mar 09 10:15:02 crc kubenswrapper[4792]: I0309 10:15:02.794495 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29550855-jc2j9" event={"ID":"2c3d6218-d702-407f-bcb4-15f064a58aa2","Type":"ContainerDied","Data":"94df4364afeca509207d6a04cee55390e0731d2a5077b9bab1abfa57872746ff"}
Mar 09 10:15:04 crc kubenswrapper[4792]: I0309 10:15:04.266900 4792 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29550855-jc2j9" Mar 09 10:15:04 crc kubenswrapper[4792]: I0309 10:15:04.377010 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dl96s\" (UniqueName: \"kubernetes.io/projected/2c3d6218-d702-407f-bcb4-15f064a58aa2-kube-api-access-dl96s\") pod \"2c3d6218-d702-407f-bcb4-15f064a58aa2\" (UID: \"2c3d6218-d702-407f-bcb4-15f064a58aa2\") " Mar 09 10:15:04 crc kubenswrapper[4792]: I0309 10:15:04.377380 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2c3d6218-d702-407f-bcb4-15f064a58aa2-secret-volume\") pod \"2c3d6218-d702-407f-bcb4-15f064a58aa2\" (UID: \"2c3d6218-d702-407f-bcb4-15f064a58aa2\") " Mar 09 10:15:04 crc kubenswrapper[4792]: I0309 10:15:04.377425 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2c3d6218-d702-407f-bcb4-15f064a58aa2-config-volume\") pod \"2c3d6218-d702-407f-bcb4-15f064a58aa2\" (UID: \"2c3d6218-d702-407f-bcb4-15f064a58aa2\") " Mar 09 10:15:04 crc kubenswrapper[4792]: I0309 10:15:04.378307 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2c3d6218-d702-407f-bcb4-15f064a58aa2-config-volume" (OuterVolumeSpecName: "config-volume") pod "2c3d6218-d702-407f-bcb4-15f064a58aa2" (UID: "2c3d6218-d702-407f-bcb4-15f064a58aa2"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 10:15:04 crc kubenswrapper[4792]: I0309 10:15:04.383318 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c3d6218-d702-407f-bcb4-15f064a58aa2-kube-api-access-dl96s" (OuterVolumeSpecName: "kube-api-access-dl96s") pod "2c3d6218-d702-407f-bcb4-15f064a58aa2" (UID: "2c3d6218-d702-407f-bcb4-15f064a58aa2"). 
InnerVolumeSpecName "kube-api-access-dl96s". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 10:15:04 crc kubenswrapper[4792]: I0309 10:15:04.384001 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c3d6218-d702-407f-bcb4-15f064a58aa2-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "2c3d6218-d702-407f-bcb4-15f064a58aa2" (UID: "2c3d6218-d702-407f-bcb4-15f064a58aa2"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 10:15:04 crc kubenswrapper[4792]: I0309 10:15:04.480338 4792 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2c3d6218-d702-407f-bcb4-15f064a58aa2-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 09 10:15:04 crc kubenswrapper[4792]: I0309 10:15:04.480377 4792 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2c3d6218-d702-407f-bcb4-15f064a58aa2-config-volume\") on node \"crc\" DevicePath \"\"" Mar 09 10:15:04 crc kubenswrapper[4792]: I0309 10:15:04.480393 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dl96s\" (UniqueName: \"kubernetes.io/projected/2c3d6218-d702-407f-bcb4-15f064a58aa2-kube-api-access-dl96s\") on node \"crc\" DevicePath \"\"" Mar 09 10:15:04 crc kubenswrapper[4792]: I0309 10:15:04.815360 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29550855-jc2j9" event={"ID":"2c3d6218-d702-407f-bcb4-15f064a58aa2","Type":"ContainerDied","Data":"ef707d16e518b7ea329b81dc014cc614980cafb9f5feb9cfa9108c7be19a12b6"} Mar 09 10:15:04 crc kubenswrapper[4792]: I0309 10:15:04.815988 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ef707d16e518b7ea329b81dc014cc614980cafb9f5feb9cfa9108c7be19a12b6" Mar 09 10:15:04 crc kubenswrapper[4792]: I0309 10:15:04.815651 4792 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29550855-jc2j9" Mar 09 10:15:04 crc kubenswrapper[4792]: I0309 10:15:04.889714 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29550810-2mq75"] Mar 09 10:15:04 crc kubenswrapper[4792]: I0309 10:15:04.898874 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29550810-2mq75"] Mar 09 10:15:05 crc kubenswrapper[4792]: I0309 10:15:05.684827 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9388ba35-9a13-455c-b8d3-1e19d6ed7c94" path="/var/lib/kubelet/pods/9388ba35-9a13-455c-b8d3-1e19d6ed7c94/volumes" Mar 09 10:15:13 crc kubenswrapper[4792]: I0309 10:15:13.214765 4792 patch_prober.go:28] interesting pod/machine-config-daemon-97tth container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 10:15:13 crc kubenswrapper[4792]: I0309 10:15:13.215437 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 10:15:43 crc kubenswrapper[4792]: I0309 10:15:43.214425 4792 patch_prober.go:28] interesting pod/machine-config-daemon-97tth container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 10:15:43 crc kubenswrapper[4792]: I0309 10:15:43.214897 4792 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 10:15:44 crc kubenswrapper[4792]: I0309 10:15:44.126436 4792 scope.go:117] "RemoveContainer" containerID="1e3679387ae30ae01d8933c638bb854d979e89f76fd38bd91dec3c98af56717a" Mar 09 10:16:00 crc kubenswrapper[4792]: I0309 10:16:00.155314 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550856-gt46k"] Mar 09 10:16:00 crc kubenswrapper[4792]: E0309 10:16:00.156235 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c3d6218-d702-407f-bcb4-15f064a58aa2" containerName="collect-profiles" Mar 09 10:16:00 crc kubenswrapper[4792]: I0309 10:16:00.156247 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c3d6218-d702-407f-bcb4-15f064a58aa2" containerName="collect-profiles" Mar 09 10:16:00 crc kubenswrapper[4792]: I0309 10:16:00.156436 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c3d6218-d702-407f-bcb4-15f064a58aa2" containerName="collect-profiles" Mar 09 10:16:00 crc kubenswrapper[4792]: I0309 10:16:00.157124 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550856-gt46k" Mar 09 10:16:00 crc kubenswrapper[4792]: I0309 10:16:00.158959 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fwclj" Mar 09 10:16:00 crc kubenswrapper[4792]: I0309 10:16:00.159272 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 10:16:00 crc kubenswrapper[4792]: I0309 10:16:00.161617 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 10:16:00 crc kubenswrapper[4792]: I0309 10:16:00.169267 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550856-gt46k"] Mar 09 10:16:00 crc kubenswrapper[4792]: I0309 10:16:00.329297 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m9wmb\" (UniqueName: \"kubernetes.io/projected/6d6544d3-58e5-499b-bd88-e4a761d166a9-kube-api-access-m9wmb\") pod \"auto-csr-approver-29550856-gt46k\" (UID: \"6d6544d3-58e5-499b-bd88-e4a761d166a9\") " pod="openshift-infra/auto-csr-approver-29550856-gt46k" Mar 09 10:16:00 crc kubenswrapper[4792]: I0309 10:16:00.431473 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m9wmb\" (UniqueName: \"kubernetes.io/projected/6d6544d3-58e5-499b-bd88-e4a761d166a9-kube-api-access-m9wmb\") pod \"auto-csr-approver-29550856-gt46k\" (UID: \"6d6544d3-58e5-499b-bd88-e4a761d166a9\") " pod="openshift-infra/auto-csr-approver-29550856-gt46k" Mar 09 10:16:00 crc kubenswrapper[4792]: I0309 10:16:00.460246 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m9wmb\" (UniqueName: \"kubernetes.io/projected/6d6544d3-58e5-499b-bd88-e4a761d166a9-kube-api-access-m9wmb\") pod \"auto-csr-approver-29550856-gt46k\" (UID: \"6d6544d3-58e5-499b-bd88-e4a761d166a9\") " 
pod="openshift-infra/auto-csr-approver-29550856-gt46k" Mar 09 10:16:00 crc kubenswrapper[4792]: I0309 10:16:00.476945 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550856-gt46k" Mar 09 10:16:00 crc kubenswrapper[4792]: I0309 10:16:00.985105 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550856-gt46k"] Mar 09 10:16:00 crc kubenswrapper[4792]: I0309 10:16:00.988025 4792 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 09 10:16:01 crc kubenswrapper[4792]: I0309 10:16:01.288254 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550856-gt46k" event={"ID":"6d6544d3-58e5-499b-bd88-e4a761d166a9","Type":"ContainerStarted","Data":"6057d8582be3f48af58750ef66ded6824db135757cab5d0b106e6dfa239721ad"} Mar 09 10:16:02 crc kubenswrapper[4792]: I0309 10:16:02.311261 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550856-gt46k" event={"ID":"6d6544d3-58e5-499b-bd88-e4a761d166a9","Type":"ContainerStarted","Data":"1076fd99612e0f315ccb8d8229e73fe18c5f4c323e9c2b4fa31c11b62715a7b8"} Mar 09 10:16:03 crc kubenswrapper[4792]: I0309 10:16:03.321801 4792 generic.go:334] "Generic (PLEG): container finished" podID="6d6544d3-58e5-499b-bd88-e4a761d166a9" containerID="1076fd99612e0f315ccb8d8229e73fe18c5f4c323e9c2b4fa31c11b62715a7b8" exitCode=0 Mar 09 10:16:03 crc kubenswrapper[4792]: I0309 10:16:03.322045 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550856-gt46k" event={"ID":"6d6544d3-58e5-499b-bd88-e4a761d166a9","Type":"ContainerDied","Data":"1076fd99612e0f315ccb8d8229e73fe18c5f4c323e9c2b4fa31c11b62715a7b8"} Mar 09 10:16:04 crc kubenswrapper[4792]: I0309 10:16:04.752888 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550856-gt46k" Mar 09 10:16:04 crc kubenswrapper[4792]: I0309 10:16:04.931330 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m9wmb\" (UniqueName: \"kubernetes.io/projected/6d6544d3-58e5-499b-bd88-e4a761d166a9-kube-api-access-m9wmb\") pod \"6d6544d3-58e5-499b-bd88-e4a761d166a9\" (UID: \"6d6544d3-58e5-499b-bd88-e4a761d166a9\") " Mar 09 10:16:04 crc kubenswrapper[4792]: I0309 10:16:04.942962 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d6544d3-58e5-499b-bd88-e4a761d166a9-kube-api-access-m9wmb" (OuterVolumeSpecName: "kube-api-access-m9wmb") pod "6d6544d3-58e5-499b-bd88-e4a761d166a9" (UID: "6d6544d3-58e5-499b-bd88-e4a761d166a9"). InnerVolumeSpecName "kube-api-access-m9wmb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 10:16:05 crc kubenswrapper[4792]: I0309 10:16:05.034536 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m9wmb\" (UniqueName: \"kubernetes.io/projected/6d6544d3-58e5-499b-bd88-e4a761d166a9-kube-api-access-m9wmb\") on node \"crc\" DevicePath \"\"" Mar 09 10:16:05 crc kubenswrapper[4792]: I0309 10:16:05.343275 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550856-gt46k" event={"ID":"6d6544d3-58e5-499b-bd88-e4a761d166a9","Type":"ContainerDied","Data":"6057d8582be3f48af58750ef66ded6824db135757cab5d0b106e6dfa239721ad"} Mar 09 10:16:05 crc kubenswrapper[4792]: I0309 10:16:05.343313 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6057d8582be3f48af58750ef66ded6824db135757cab5d0b106e6dfa239721ad" Mar 09 10:16:05 crc kubenswrapper[4792]: I0309 10:16:05.343366 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550856-gt46k" Mar 09 10:16:05 crc kubenswrapper[4792]: I0309 10:16:05.433321 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550850-bbk2v"] Mar 09 10:16:05 crc kubenswrapper[4792]: I0309 10:16:05.453648 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550850-bbk2v"] Mar 09 10:16:05 crc kubenswrapper[4792]: I0309 10:16:05.673166 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f7e43865-caa1-4394-8217-253ef89ddbf1" path="/var/lib/kubelet/pods/f7e43865-caa1-4394-8217-253ef89ddbf1/volumes" Mar 09 10:16:13 crc kubenswrapper[4792]: I0309 10:16:13.214567 4792 patch_prober.go:28] interesting pod/machine-config-daemon-97tth container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 10:16:13 crc kubenswrapper[4792]: I0309 10:16:13.216015 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 10:16:13 crc kubenswrapper[4792]: I0309 10:16:13.216154 4792 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-97tth" Mar 09 10:16:13 crc kubenswrapper[4792]: I0309 10:16:13.216826 4792 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"61823a1cefc0652bd0396fa1996ffc2e59e5ea4df20bbaa1b20c0c3f986988b7"} pod="openshift-machine-config-operator/machine-config-daemon-97tth" 
containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 09 10:16:13 crc kubenswrapper[4792]: I0309 10:16:13.216961 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" containerName="machine-config-daemon" containerID="cri-o://61823a1cefc0652bd0396fa1996ffc2e59e5ea4df20bbaa1b20c0c3f986988b7" gracePeriod=600 Mar 09 10:16:13 crc kubenswrapper[4792]: I0309 10:16:13.408246 4792 generic.go:334] "Generic (PLEG): container finished" podID="bd11045a-d746-4b42-872c-8b8d1dd2d515" containerID="61823a1cefc0652bd0396fa1996ffc2e59e5ea4df20bbaa1b20c0c3f986988b7" exitCode=0 Mar 09 10:16:13 crc kubenswrapper[4792]: I0309 10:16:13.408337 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-97tth" event={"ID":"bd11045a-d746-4b42-872c-8b8d1dd2d515","Type":"ContainerDied","Data":"61823a1cefc0652bd0396fa1996ffc2e59e5ea4df20bbaa1b20c0c3f986988b7"} Mar 09 10:16:13 crc kubenswrapper[4792]: I0309 10:16:13.408620 4792 scope.go:117] "RemoveContainer" containerID="27ebefbb6ee4b1e51f47c3813f2b791c0d513924dd39fe81cd2d9d65a223b925" Mar 09 10:16:14 crc kubenswrapper[4792]: I0309 10:16:14.420217 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-97tth" event={"ID":"bd11045a-d746-4b42-872c-8b8d1dd2d515","Type":"ContainerStarted","Data":"22fb615a80b561bd4fd46d065538c5d6038e92646216db0bd28365b1dcaceca2"} Mar 09 10:16:21 crc kubenswrapper[4792]: I0309 10:16:21.789058 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-ks9xs"] Mar 09 10:16:21 crc kubenswrapper[4792]: E0309 10:16:21.789980 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d6544d3-58e5-499b-bd88-e4a761d166a9" containerName="oc" Mar 09 10:16:21 crc kubenswrapper[4792]: 
I0309 10:16:21.789998 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d6544d3-58e5-499b-bd88-e4a761d166a9" containerName="oc" Mar 09 10:16:21 crc kubenswrapper[4792]: I0309 10:16:21.790308 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d6544d3-58e5-499b-bd88-e4a761d166a9" containerName="oc" Mar 09 10:16:21 crc kubenswrapper[4792]: I0309 10:16:21.791629 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ks9xs" Mar 09 10:16:21 crc kubenswrapper[4792]: I0309 10:16:21.814156 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ks9xs"] Mar 09 10:16:21 crc kubenswrapper[4792]: I0309 10:16:21.897482 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4f866ac7-cc92-4520-9d5e-0147cac097f2-catalog-content\") pod \"community-operators-ks9xs\" (UID: \"4f866ac7-cc92-4520-9d5e-0147cac097f2\") " pod="openshift-marketplace/community-operators-ks9xs" Mar 09 10:16:21 crc kubenswrapper[4792]: I0309 10:16:21.897660 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4f866ac7-cc92-4520-9d5e-0147cac097f2-utilities\") pod \"community-operators-ks9xs\" (UID: \"4f866ac7-cc92-4520-9d5e-0147cac097f2\") " pod="openshift-marketplace/community-operators-ks9xs" Mar 09 10:16:21 crc kubenswrapper[4792]: I0309 10:16:21.897697 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dvbjz\" (UniqueName: \"kubernetes.io/projected/4f866ac7-cc92-4520-9d5e-0147cac097f2-kube-api-access-dvbjz\") pod \"community-operators-ks9xs\" (UID: \"4f866ac7-cc92-4520-9d5e-0147cac097f2\") " pod="openshift-marketplace/community-operators-ks9xs" Mar 09 10:16:22 crc kubenswrapper[4792]: I0309 
10:16:21.999515 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4f866ac7-cc92-4520-9d5e-0147cac097f2-utilities\") pod \"community-operators-ks9xs\" (UID: \"4f866ac7-cc92-4520-9d5e-0147cac097f2\") " pod="openshift-marketplace/community-operators-ks9xs" Mar 09 10:16:22 crc kubenswrapper[4792]: I0309 10:16:22.000100 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dvbjz\" (UniqueName: \"kubernetes.io/projected/4f866ac7-cc92-4520-9d5e-0147cac097f2-kube-api-access-dvbjz\") pod \"community-operators-ks9xs\" (UID: \"4f866ac7-cc92-4520-9d5e-0147cac097f2\") " pod="openshift-marketplace/community-operators-ks9xs" Mar 09 10:16:22 crc kubenswrapper[4792]: I0309 10:16:22.000143 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4f866ac7-cc92-4520-9d5e-0147cac097f2-utilities\") pod \"community-operators-ks9xs\" (UID: \"4f866ac7-cc92-4520-9d5e-0147cac097f2\") " pod="openshift-marketplace/community-operators-ks9xs" Mar 09 10:16:22 crc kubenswrapper[4792]: I0309 10:16:22.000196 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4f866ac7-cc92-4520-9d5e-0147cac097f2-catalog-content\") pod \"community-operators-ks9xs\" (UID: \"4f866ac7-cc92-4520-9d5e-0147cac097f2\") " pod="openshift-marketplace/community-operators-ks9xs" Mar 09 10:16:22 crc kubenswrapper[4792]: I0309 10:16:22.000555 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4f866ac7-cc92-4520-9d5e-0147cac097f2-catalog-content\") pod \"community-operators-ks9xs\" (UID: \"4f866ac7-cc92-4520-9d5e-0147cac097f2\") " pod="openshift-marketplace/community-operators-ks9xs" Mar 09 10:16:22 crc kubenswrapper[4792]: I0309 10:16:22.019595 4792 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dvbjz\" (UniqueName: \"kubernetes.io/projected/4f866ac7-cc92-4520-9d5e-0147cac097f2-kube-api-access-dvbjz\") pod \"community-operators-ks9xs\" (UID: \"4f866ac7-cc92-4520-9d5e-0147cac097f2\") " pod="openshift-marketplace/community-operators-ks9xs" Mar 09 10:16:22 crc kubenswrapper[4792]: I0309 10:16:22.112640 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ks9xs" Mar 09 10:16:22 crc kubenswrapper[4792]: I0309 10:16:22.490102 4792 generic.go:334] "Generic (PLEG): container finished" podID="152f601f-0625-4503-a057-26316d8504aa" containerID="ccd41eb94e758842d138600a71c55ca2bee3a90aac5a0ed874f5d6a78dc3c9d7" exitCode=0 Mar 09 10:16:22 crc kubenswrapper[4792]: I0309 10:16:22.490183 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"152f601f-0625-4503-a057-26316d8504aa","Type":"ContainerDied","Data":"ccd41eb94e758842d138600a71c55ca2bee3a90aac5a0ed874f5d6a78dc3c9d7"} Mar 09 10:16:22 crc kubenswrapper[4792]: I0309 10:16:22.599186 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ks9xs"] Mar 09 10:16:23 crc kubenswrapper[4792]: I0309 10:16:23.499458 4792 generic.go:334] "Generic (PLEG): container finished" podID="4f866ac7-cc92-4520-9d5e-0147cac097f2" containerID="379b7b70dd3e8b9940b928dbf56c25ecf5d01dc80e765091f2440daf4ff88685" exitCode=0 Mar 09 10:16:23 crc kubenswrapper[4792]: I0309 10:16:23.499511 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ks9xs" event={"ID":"4f866ac7-cc92-4520-9d5e-0147cac097f2","Type":"ContainerDied","Data":"379b7b70dd3e8b9940b928dbf56c25ecf5d01dc80e765091f2440daf4ff88685"} Mar 09 10:16:23 crc kubenswrapper[4792]: I0309 10:16:23.500152 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-ks9xs" event={"ID":"4f866ac7-cc92-4520-9d5e-0147cac097f2","Type":"ContainerStarted","Data":"2d6092fc1f7fe2a73cd5f1fe8414b4055dcba2b7d758f602adb2c871840befcf"} Mar 09 10:16:23 crc kubenswrapper[4792]: I0309 10:16:23.951500 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Mar 09 10:16:24 crc kubenswrapper[4792]: I0309 10:16:24.050595 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/152f601f-0625-4503-a057-26316d8504aa-ca-certs\") pod \"152f601f-0625-4503-a057-26316d8504aa\" (UID: \"152f601f-0625-4503-a057-26316d8504aa\") " Mar 09 10:16:24 crc kubenswrapper[4792]: I0309 10:16:24.050664 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/152f601f-0625-4503-a057-26316d8504aa-openstack-config-secret\") pod \"152f601f-0625-4503-a057-26316d8504aa\" (UID: \"152f601f-0625-4503-a057-26316d8504aa\") " Mar 09 10:16:24 crc kubenswrapper[4792]: I0309 10:16:24.050694 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qfc2t\" (UniqueName: \"kubernetes.io/projected/152f601f-0625-4503-a057-26316d8504aa-kube-api-access-qfc2t\") pod \"152f601f-0625-4503-a057-26316d8504aa\" (UID: \"152f601f-0625-4503-a057-26316d8504aa\") " Mar 09 10:16:24 crc kubenswrapper[4792]: I0309 10:16:24.050719 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/152f601f-0625-4503-a057-26316d8504aa-ssh-key\") pod \"152f601f-0625-4503-a057-26316d8504aa\" (UID: \"152f601f-0625-4503-a057-26316d8504aa\") " Mar 09 10:16:24 crc kubenswrapper[4792]: I0309 10:16:24.050752 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: 
\"kubernetes.io/configmap/152f601f-0625-4503-a057-26316d8504aa-openstack-config\") pod \"152f601f-0625-4503-a057-26316d8504aa\" (UID: \"152f601f-0625-4503-a057-26316d8504aa\") " Mar 09 10:16:24 crc kubenswrapper[4792]: I0309 10:16:24.050782 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/152f601f-0625-4503-a057-26316d8504aa-test-operator-ephemeral-temporary\") pod \"152f601f-0625-4503-a057-26316d8504aa\" (UID: \"152f601f-0625-4503-a057-26316d8504aa\") " Mar 09 10:16:24 crc kubenswrapper[4792]: I0309 10:16:24.050824 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/152f601f-0625-4503-a057-26316d8504aa-test-operator-ephemeral-workdir\") pod \"152f601f-0625-4503-a057-26316d8504aa\" (UID: \"152f601f-0625-4503-a057-26316d8504aa\") " Mar 09 10:16:24 crc kubenswrapper[4792]: I0309 10:16:24.050874 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/152f601f-0625-4503-a057-26316d8504aa-config-data\") pod \"152f601f-0625-4503-a057-26316d8504aa\" (UID: \"152f601f-0625-4503-a057-26316d8504aa\") " Mar 09 10:16:24 crc kubenswrapper[4792]: I0309 10:16:24.050948 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"152f601f-0625-4503-a057-26316d8504aa\" (UID: \"152f601f-0625-4503-a057-26316d8504aa\") " Mar 09 10:16:24 crc kubenswrapper[4792]: I0309 10:16:24.053838 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/152f601f-0625-4503-a057-26316d8504aa-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "152f601f-0625-4503-a057-26316d8504aa" (UID: 
"152f601f-0625-4503-a057-26316d8504aa"). InnerVolumeSpecName "test-operator-ephemeral-temporary". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 10:16:24 crc kubenswrapper[4792]: I0309 10:16:24.054636 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/152f601f-0625-4503-a057-26316d8504aa-config-data" (OuterVolumeSpecName: "config-data") pod "152f601f-0625-4503-a057-26316d8504aa" (UID: "152f601f-0625-4503-a057-26316d8504aa"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 10:16:24 crc kubenswrapper[4792]: I0309 10:16:24.056900 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/152f601f-0625-4503-a057-26316d8504aa-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "152f601f-0625-4503-a057-26316d8504aa" (UID: "152f601f-0625-4503-a057-26316d8504aa"). InnerVolumeSpecName "test-operator-ephemeral-workdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 10:16:24 crc kubenswrapper[4792]: I0309 10:16:24.058064 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/152f601f-0625-4503-a057-26316d8504aa-kube-api-access-qfc2t" (OuterVolumeSpecName: "kube-api-access-qfc2t") pod "152f601f-0625-4503-a057-26316d8504aa" (UID: "152f601f-0625-4503-a057-26316d8504aa"). InnerVolumeSpecName "kube-api-access-qfc2t". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 10:16:24 crc kubenswrapper[4792]: I0309 10:16:24.061696 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "test-operator-logs") pod "152f601f-0625-4503-a057-26316d8504aa" (UID: "152f601f-0625-4503-a057-26316d8504aa"). InnerVolumeSpecName "local-storage03-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 09 10:16:24 crc kubenswrapper[4792]: I0309 10:16:24.085219 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/152f601f-0625-4503-a057-26316d8504aa-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "152f601f-0625-4503-a057-26316d8504aa" (UID: "152f601f-0625-4503-a057-26316d8504aa"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 10:16:24 crc kubenswrapper[4792]: I0309 10:16:24.087374 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/152f601f-0625-4503-a057-26316d8504aa-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "152f601f-0625-4503-a057-26316d8504aa" (UID: "152f601f-0625-4503-a057-26316d8504aa"). InnerVolumeSpecName "ca-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 10:16:24 crc kubenswrapper[4792]: I0309 10:16:24.110917 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/152f601f-0625-4503-a057-26316d8504aa-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "152f601f-0625-4503-a057-26316d8504aa" (UID: "152f601f-0625-4503-a057-26316d8504aa"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 10:16:24 crc kubenswrapper[4792]: I0309 10:16:24.114820 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/152f601f-0625-4503-a057-26316d8504aa-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "152f601f-0625-4503-a057-26316d8504aa" (UID: "152f601f-0625-4503-a057-26316d8504aa"). InnerVolumeSpecName "openstack-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 10:16:24 crc kubenswrapper[4792]: I0309 10:16:24.152830 4792 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Mar 09 10:16:24 crc kubenswrapper[4792]: I0309 10:16:24.152870 4792 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/152f601f-0625-4503-a057-26316d8504aa-ca-certs\") on node \"crc\" DevicePath \"\"" Mar 09 10:16:24 crc kubenswrapper[4792]: I0309 10:16:24.152897 4792 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/152f601f-0625-4503-a057-26316d8504aa-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Mar 09 10:16:24 crc kubenswrapper[4792]: I0309 10:16:24.152909 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qfc2t\" (UniqueName: \"kubernetes.io/projected/152f601f-0625-4503-a057-26316d8504aa-kube-api-access-qfc2t\") on node \"crc\" DevicePath \"\"" Mar 09 10:16:24 crc kubenswrapper[4792]: I0309 10:16:24.152922 4792 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/152f601f-0625-4503-a057-26316d8504aa-ssh-key\") on node \"crc\" DevicePath \"\"" Mar 09 10:16:24 crc kubenswrapper[4792]: I0309 10:16:24.152933 4792 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/152f601f-0625-4503-a057-26316d8504aa-openstack-config\") on node \"crc\" DevicePath \"\"" Mar 09 10:16:24 crc kubenswrapper[4792]: I0309 10:16:24.152944 4792 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/152f601f-0625-4503-a057-26316d8504aa-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Mar 09 10:16:24 crc kubenswrapper[4792]: 
I0309 10:16:24.152955 4792 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/152f601f-0625-4503-a057-26316d8504aa-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Mar 09 10:16:24 crc kubenswrapper[4792]: I0309 10:16:24.152967 4792 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/152f601f-0625-4503-a057-26316d8504aa-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 10:16:24 crc kubenswrapper[4792]: I0309 10:16:24.189974 4792 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Mar 09 10:16:24 crc kubenswrapper[4792]: I0309 10:16:24.255216 4792 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Mar 09 10:16:24 crc kubenswrapper[4792]: I0309 10:16:24.510029 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"152f601f-0625-4503-a057-26316d8504aa","Type":"ContainerDied","Data":"cc926f31dda71f4dc98e8de554cd3cc9d6f7431a9fd2b8604a45c704922cc63e"} Mar 09 10:16:24 crc kubenswrapper[4792]: I0309 10:16:24.510125 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cc926f31dda71f4dc98e8de554cd3cc9d6f7431a9fd2b8604a45c704922cc63e" Mar 09 10:16:24 crc kubenswrapper[4792]: I0309 10:16:24.510207 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Mar 09 10:16:28 crc kubenswrapper[4792]: I0309 10:16:28.553367 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ks9xs" event={"ID":"4f866ac7-cc92-4520-9d5e-0147cac097f2","Type":"ContainerStarted","Data":"66ab339c1e14f119961042950f99c28e72ddcd1977587cea97d4c8ad1e9ba402"} Mar 09 10:16:29 crc kubenswrapper[4792]: I0309 10:16:29.564903 4792 generic.go:334] "Generic (PLEG): container finished" podID="4f866ac7-cc92-4520-9d5e-0147cac097f2" containerID="66ab339c1e14f119961042950f99c28e72ddcd1977587cea97d4c8ad1e9ba402" exitCode=0 Mar 09 10:16:29 crc kubenswrapper[4792]: I0309 10:16:29.564951 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ks9xs" event={"ID":"4f866ac7-cc92-4520-9d5e-0147cac097f2","Type":"ContainerDied","Data":"66ab339c1e14f119961042950f99c28e72ddcd1977587cea97d4c8ad1e9ba402"} Mar 09 10:16:30 crc kubenswrapper[4792]: I0309 10:16:30.270884 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Mar 09 10:16:30 crc kubenswrapper[4792]: E0309 10:16:30.271875 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="152f601f-0625-4503-a057-26316d8504aa" containerName="tempest-tests-tempest-tests-runner" Mar 09 10:16:30 crc kubenswrapper[4792]: I0309 10:16:30.271903 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="152f601f-0625-4503-a057-26316d8504aa" containerName="tempest-tests-tempest-tests-runner" Mar 09 10:16:30 crc kubenswrapper[4792]: I0309 10:16:30.272238 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="152f601f-0625-4503-a057-26316d8504aa" containerName="tempest-tests-tempest-tests-runner" Mar 09 10:16:30 crc kubenswrapper[4792]: I0309 10:16:30.273097 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 09 10:16:30 crc kubenswrapper[4792]: I0309 10:16:30.275541 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-mgw7j" Mar 09 10:16:30 crc kubenswrapper[4792]: I0309 10:16:30.284789 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Mar 09 10:16:30 crc kubenswrapper[4792]: I0309 10:16:30.386813 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"69889849-285a-4c47-a955-3e681990d59e\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 09 10:16:30 crc kubenswrapper[4792]: I0309 10:16:30.386926 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8mcpx\" (UniqueName: \"kubernetes.io/projected/69889849-285a-4c47-a955-3e681990d59e-kube-api-access-8mcpx\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"69889849-285a-4c47-a955-3e681990d59e\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 09 10:16:30 crc kubenswrapper[4792]: I0309 10:16:30.489394 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"69889849-285a-4c47-a955-3e681990d59e\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 09 10:16:30 crc kubenswrapper[4792]: I0309 10:16:30.489560 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8mcpx\" (UniqueName: 
\"kubernetes.io/projected/69889849-285a-4c47-a955-3e681990d59e-kube-api-access-8mcpx\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"69889849-285a-4c47-a955-3e681990d59e\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 09 10:16:30 crc kubenswrapper[4792]: I0309 10:16:30.490443 4792 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"69889849-285a-4c47-a955-3e681990d59e\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 09 10:16:30 crc kubenswrapper[4792]: I0309 10:16:30.516994 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8mcpx\" (UniqueName: \"kubernetes.io/projected/69889849-285a-4c47-a955-3e681990d59e-kube-api-access-8mcpx\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"69889849-285a-4c47-a955-3e681990d59e\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 09 10:16:30 crc kubenswrapper[4792]: I0309 10:16:30.517703 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"69889849-285a-4c47-a955-3e681990d59e\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 09 10:16:30 crc kubenswrapper[4792]: I0309 10:16:30.575058 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ks9xs" event={"ID":"4f866ac7-cc92-4520-9d5e-0147cac097f2","Type":"ContainerStarted","Data":"5d6b732933016983da2a15184632dac910630e906a40dc18dd4f9d8f9ada4bc4"} Mar 09 10:16:30 crc kubenswrapper[4792]: I0309 10:16:30.607247 4792 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 09 10:16:31 crc kubenswrapper[4792]: I0309 10:16:31.075263 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-ks9xs" podStartSLOduration=3.578669652 podStartE2EDuration="10.075242481s" podCreationTimestamp="2026-03-09 10:16:21 +0000 UTC" firstStartedPulling="2026-03-09 10:16:23.501134488 +0000 UTC m=+4148.531335240" lastFinishedPulling="2026-03-09 10:16:29.997707317 +0000 UTC m=+4155.027908069" observedRunningTime="2026-03-09 10:16:30.607102429 +0000 UTC m=+4155.637303201" watchObservedRunningTime="2026-03-09 10:16:31.075242481 +0000 UTC m=+4156.105443233" Mar 09 10:16:31 crc kubenswrapper[4792]: I0309 10:16:31.093855 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Mar 09 10:16:31 crc kubenswrapper[4792]: I0309 10:16:31.594349 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"69889849-285a-4c47-a955-3e681990d59e","Type":"ContainerStarted","Data":"19681d69ad65d45aaf743df6ac6f105f540632b58a832e70a888b3b1f4c871fd"} Mar 09 10:16:32 crc kubenswrapper[4792]: I0309 10:16:32.113043 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-ks9xs" Mar 09 10:16:32 crc kubenswrapper[4792]: I0309 10:16:32.113951 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-ks9xs" Mar 09 10:16:32 crc kubenswrapper[4792]: I0309 10:16:32.170551 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-ks9xs" Mar 09 10:16:32 crc kubenswrapper[4792]: I0309 10:16:32.603273 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"69889849-285a-4c47-a955-3e681990d59e","Type":"ContainerStarted","Data":"a2740f22f6fa355bf74b0f1cbba4372aad26fd51f9efb933295976f7f7936405"} Mar 09 10:16:32 crc kubenswrapper[4792]: I0309 10:16:32.618656 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=1.763391202 podStartE2EDuration="2.618641585s" podCreationTimestamp="2026-03-09 10:16:30 +0000 UTC" firstStartedPulling="2026-03-09 10:16:31.08455665 +0000 UTC m=+4156.114757402" lastFinishedPulling="2026-03-09 10:16:31.939807023 +0000 UTC m=+4156.970007785" observedRunningTime="2026-03-09 10:16:32.616517247 +0000 UTC m=+4157.646717989" watchObservedRunningTime="2026-03-09 10:16:32.618641585 +0000 UTC m=+4157.648842337" Mar 09 10:16:42 crc kubenswrapper[4792]: I0309 10:16:42.158360 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-ks9xs" Mar 09 10:16:42 crc kubenswrapper[4792]: I0309 10:16:42.247990 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ks9xs"] Mar 09 10:16:42 crc kubenswrapper[4792]: I0309 10:16:42.293652 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-qbwwg"] Mar 09 10:16:42 crc kubenswrapper[4792]: I0309 10:16:42.294253 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-qbwwg" podUID="b34ef7c3-9371-4b4d-b917-e0ade6699991" containerName="registry-server" containerID="cri-o://a420b5f6f3275b3be694adc2d6a8ac0e48b9a2cc8571f912d2c7ab4722ffb6eb" gracePeriod=2 Mar 09 10:16:42 crc kubenswrapper[4792]: I0309 10:16:42.711912 4792 generic.go:334] "Generic (PLEG): container finished" podID="b34ef7c3-9371-4b4d-b917-e0ade6699991" 
containerID="a420b5f6f3275b3be694adc2d6a8ac0e48b9a2cc8571f912d2c7ab4722ffb6eb" exitCode=0 Mar 09 10:16:42 crc kubenswrapper[4792]: I0309 10:16:42.712398 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qbwwg" event={"ID":"b34ef7c3-9371-4b4d-b917-e0ade6699991","Type":"ContainerDied","Data":"a420b5f6f3275b3be694adc2d6a8ac0e48b9a2cc8571f912d2c7ab4722ffb6eb"} Mar 09 10:16:42 crc kubenswrapper[4792]: I0309 10:16:42.896739 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qbwwg" Mar 09 10:16:43 crc kubenswrapper[4792]: I0309 10:16:43.051867 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b34ef7c3-9371-4b4d-b917-e0ade6699991-utilities\") pod \"b34ef7c3-9371-4b4d-b917-e0ade6699991\" (UID: \"b34ef7c3-9371-4b4d-b917-e0ade6699991\") " Mar 09 10:16:43 crc kubenswrapper[4792]: I0309 10:16:43.051978 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jbrzx\" (UniqueName: \"kubernetes.io/projected/b34ef7c3-9371-4b4d-b917-e0ade6699991-kube-api-access-jbrzx\") pod \"b34ef7c3-9371-4b4d-b917-e0ade6699991\" (UID: \"b34ef7c3-9371-4b4d-b917-e0ade6699991\") " Mar 09 10:16:43 crc kubenswrapper[4792]: I0309 10:16:43.052020 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b34ef7c3-9371-4b4d-b917-e0ade6699991-catalog-content\") pod \"b34ef7c3-9371-4b4d-b917-e0ade6699991\" (UID: \"b34ef7c3-9371-4b4d-b917-e0ade6699991\") " Mar 09 10:16:43 crc kubenswrapper[4792]: I0309 10:16:43.055029 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b34ef7c3-9371-4b4d-b917-e0ade6699991-utilities" (OuterVolumeSpecName: "utilities") pod "b34ef7c3-9371-4b4d-b917-e0ade6699991" (UID: 
"b34ef7c3-9371-4b4d-b917-e0ade6699991"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 10:16:43 crc kubenswrapper[4792]: I0309 10:16:43.071463 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b34ef7c3-9371-4b4d-b917-e0ade6699991-kube-api-access-jbrzx" (OuterVolumeSpecName: "kube-api-access-jbrzx") pod "b34ef7c3-9371-4b4d-b917-e0ade6699991" (UID: "b34ef7c3-9371-4b4d-b917-e0ade6699991"). InnerVolumeSpecName "kube-api-access-jbrzx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 10:16:43 crc kubenswrapper[4792]: I0309 10:16:43.154092 4792 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b34ef7c3-9371-4b4d-b917-e0ade6699991-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 10:16:43 crc kubenswrapper[4792]: I0309 10:16:43.154124 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jbrzx\" (UniqueName: \"kubernetes.io/projected/b34ef7c3-9371-4b4d-b917-e0ade6699991-kube-api-access-jbrzx\") on node \"crc\" DevicePath \"\"" Mar 09 10:16:43 crc kubenswrapper[4792]: I0309 10:16:43.267086 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b34ef7c3-9371-4b4d-b917-e0ade6699991-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b34ef7c3-9371-4b4d-b917-e0ade6699991" (UID: "b34ef7c3-9371-4b4d-b917-e0ade6699991"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 10:16:43 crc kubenswrapper[4792]: I0309 10:16:43.357925 4792 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b34ef7c3-9371-4b4d-b917-e0ade6699991-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 10:16:43 crc kubenswrapper[4792]: I0309 10:16:43.723391 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qbwwg" event={"ID":"b34ef7c3-9371-4b4d-b917-e0ade6699991","Type":"ContainerDied","Data":"f0b3c2fc883811420927b33e6e1fed4a2f76024c0acb52995e054ff1a1cb7189"} Mar 09 10:16:43 crc kubenswrapper[4792]: I0309 10:16:43.723449 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qbwwg" Mar 09 10:16:43 crc kubenswrapper[4792]: I0309 10:16:43.723478 4792 scope.go:117] "RemoveContainer" containerID="a420b5f6f3275b3be694adc2d6a8ac0e48b9a2cc8571f912d2c7ab4722ffb6eb" Mar 09 10:16:43 crc kubenswrapper[4792]: I0309 10:16:43.760030 4792 scope.go:117] "RemoveContainer" containerID="8e999fc597de5a5745e11b7c879956b243e91499e5210f9d4f13a46fd16a745c" Mar 09 10:16:43 crc kubenswrapper[4792]: I0309 10:16:43.765366 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-qbwwg"] Mar 09 10:16:43 crc kubenswrapper[4792]: I0309 10:16:43.774061 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-qbwwg"] Mar 09 10:16:43 crc kubenswrapper[4792]: I0309 10:16:43.787466 4792 scope.go:117] "RemoveContainer" containerID="be77829491867b5031a50f2721e96dd685d83b1fa711aa7d494f24a1848f0e4c" Mar 09 10:16:44 crc kubenswrapper[4792]: I0309 10:16:44.198914 4792 scope.go:117] "RemoveContainer" containerID="fc2863f2419a08d2abee982090a7895b41c17e6a5ec3de0f2808326f4aeedcd5" Mar 09 10:16:45 crc kubenswrapper[4792]: I0309 10:16:45.685802 4792 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="b34ef7c3-9371-4b4d-b917-e0ade6699991" path="/var/lib/kubelet/pods/b34ef7c3-9371-4b4d-b917-e0ade6699991/volumes" Mar 09 10:16:56 crc kubenswrapper[4792]: I0309 10:16:56.024928 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-qmm66/must-gather-z6zwb"] Mar 09 10:16:56 crc kubenswrapper[4792]: E0309 10:16:56.025715 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b34ef7c3-9371-4b4d-b917-e0ade6699991" containerName="extract-utilities" Mar 09 10:16:56 crc kubenswrapper[4792]: I0309 10:16:56.025726 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="b34ef7c3-9371-4b4d-b917-e0ade6699991" containerName="extract-utilities" Mar 09 10:16:56 crc kubenswrapper[4792]: E0309 10:16:56.025747 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b34ef7c3-9371-4b4d-b917-e0ade6699991" containerName="extract-content" Mar 09 10:16:56 crc kubenswrapper[4792]: I0309 10:16:56.025753 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="b34ef7c3-9371-4b4d-b917-e0ade6699991" containerName="extract-content" Mar 09 10:16:56 crc kubenswrapper[4792]: E0309 10:16:56.025763 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b34ef7c3-9371-4b4d-b917-e0ade6699991" containerName="registry-server" Mar 09 10:16:56 crc kubenswrapper[4792]: I0309 10:16:56.025769 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="b34ef7c3-9371-4b4d-b917-e0ade6699991" containerName="registry-server" Mar 09 10:16:56 crc kubenswrapper[4792]: I0309 10:16:56.025954 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="b34ef7c3-9371-4b4d-b917-e0ade6699991" containerName="registry-server" Mar 09 10:16:56 crc kubenswrapper[4792]: I0309 10:16:56.026874 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-qmm66/must-gather-z6zwb" Mar 09 10:16:56 crc kubenswrapper[4792]: I0309 10:16:56.028455 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/0b3707b4-8ae3-481c-9a67-45f24402244f-must-gather-output\") pod \"must-gather-z6zwb\" (UID: \"0b3707b4-8ae3-481c-9a67-45f24402244f\") " pod="openshift-must-gather-qmm66/must-gather-z6zwb" Mar 09 10:16:56 crc kubenswrapper[4792]: I0309 10:16:56.028573 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4knq7\" (UniqueName: \"kubernetes.io/projected/0b3707b4-8ae3-481c-9a67-45f24402244f-kube-api-access-4knq7\") pod \"must-gather-z6zwb\" (UID: \"0b3707b4-8ae3-481c-9a67-45f24402244f\") " pod="openshift-must-gather-qmm66/must-gather-z6zwb" Mar 09 10:16:56 crc kubenswrapper[4792]: W0309 10:16:56.034158 4792 reflector.go:561] object-"openshift-must-gather-qmm66"/"default-dockercfg-wjnhf": failed to list *v1.Secret: secrets "default-dockercfg-wjnhf" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-must-gather-qmm66": no relationship found between node 'crc' and this object Mar 09 10:16:56 crc kubenswrapper[4792]: E0309 10:16:56.034357 4792 reflector.go:158] "Unhandled Error" err="object-\"openshift-must-gather-qmm66\"/\"default-dockercfg-wjnhf\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"default-dockercfg-wjnhf\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-must-gather-qmm66\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 09 10:16:56 crc kubenswrapper[4792]: W0309 10:16:56.034169 4792 reflector.go:561] object-"openshift-must-gather-qmm66"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" 
is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-must-gather-qmm66": no relationship found between node 'crc' and this object Mar 09 10:16:56 crc kubenswrapper[4792]: E0309 10:16:56.034520 4792 reflector.go:158] "Unhandled Error" err="object-\"openshift-must-gather-qmm66\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-must-gather-qmm66\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 09 10:16:56 crc kubenswrapper[4792]: W0309 10:16:56.035807 4792 reflector.go:561] object-"openshift-must-gather-qmm66"/"openshift-service-ca.crt": failed to list *v1.ConfigMap: configmaps "openshift-service-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-must-gather-qmm66": no relationship found between node 'crc' and this object Mar 09 10:16:56 crc kubenswrapper[4792]: E0309 10:16:56.035841 4792 reflector.go:158] "Unhandled Error" err="object-\"openshift-must-gather-qmm66\"/\"openshift-service-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"openshift-service-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-must-gather-qmm66\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 09 10:16:56 crc kubenswrapper[4792]: I0309 10:16:56.077091 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-qmm66/must-gather-z6zwb"] Mar 09 10:16:56 crc kubenswrapper[4792]: I0309 10:16:56.130160 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: 
\"kubernetes.io/empty-dir/0b3707b4-8ae3-481c-9a67-45f24402244f-must-gather-output\") pod \"must-gather-z6zwb\" (UID: \"0b3707b4-8ae3-481c-9a67-45f24402244f\") " pod="openshift-must-gather-qmm66/must-gather-z6zwb" Mar 09 10:16:56 crc kubenswrapper[4792]: I0309 10:16:56.130249 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4knq7\" (UniqueName: \"kubernetes.io/projected/0b3707b4-8ae3-481c-9a67-45f24402244f-kube-api-access-4knq7\") pod \"must-gather-z6zwb\" (UID: \"0b3707b4-8ae3-481c-9a67-45f24402244f\") " pod="openshift-must-gather-qmm66/must-gather-z6zwb" Mar 09 10:16:56 crc kubenswrapper[4792]: I0309 10:16:56.130875 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/0b3707b4-8ae3-481c-9a67-45f24402244f-must-gather-output\") pod \"must-gather-z6zwb\" (UID: \"0b3707b4-8ae3-481c-9a67-45f24402244f\") " pod="openshift-must-gather-qmm66/must-gather-z6zwb" Mar 09 10:16:57 crc kubenswrapper[4792]: I0309 10:16:57.152840 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-qmm66"/"kube-root-ca.crt" Mar 09 10:16:57 crc kubenswrapper[4792]: I0309 10:16:57.264755 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-qmm66"/"default-dockercfg-wjnhf" Mar 09 10:16:57 crc kubenswrapper[4792]: I0309 10:16:57.531112 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-qmm66"/"openshift-service-ca.crt" Mar 09 10:16:57 crc kubenswrapper[4792]: I0309 10:16:57.549482 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4knq7\" (UniqueName: \"kubernetes.io/projected/0b3707b4-8ae3-481c-9a67-45f24402244f-kube-api-access-4knq7\") pod \"must-gather-z6zwb\" (UID: \"0b3707b4-8ae3-481c-9a67-45f24402244f\") " pod="openshift-must-gather-qmm66/must-gather-z6zwb" Mar 09 10:16:57 crc kubenswrapper[4792]: I0309 
10:16:57.845263 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-qmm66/must-gather-z6zwb" Mar 09 10:16:58 crc kubenswrapper[4792]: I0309 10:16:58.334889 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-qmm66/must-gather-z6zwb"] Mar 09 10:16:58 crc kubenswrapper[4792]: I0309 10:16:58.844347 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qmm66/must-gather-z6zwb" event={"ID":"0b3707b4-8ae3-481c-9a67-45f24402244f","Type":"ContainerStarted","Data":"4c751057b430ed11b7187aa7d97df1ddd3f4acc207ecec8df8a4a6a61d56113b"} Mar 09 10:17:07 crc kubenswrapper[4792]: I0309 10:17:07.930771 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qmm66/must-gather-z6zwb" event={"ID":"0b3707b4-8ae3-481c-9a67-45f24402244f","Type":"ContainerStarted","Data":"e37b7e52b94d24c0021aafab46dbee8905bfea81d15b7477b29b149aaacc12a0"} Mar 09 10:17:08 crc kubenswrapper[4792]: I0309 10:17:08.941103 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qmm66/must-gather-z6zwb" event={"ID":"0b3707b4-8ae3-481c-9a67-45f24402244f","Type":"ContainerStarted","Data":"779a897a2032f14fbe8de77778c56e342d4972365fce59925db9305dc238f0ac"} Mar 09 10:17:09 crc kubenswrapper[4792]: I0309 10:17:09.001755 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-qmm66/must-gather-z6zwb" podStartSLOduration=4.977980307 podStartE2EDuration="14.001736895s" podCreationTimestamp="2026-03-09 10:16:55 +0000 UTC" firstStartedPulling="2026-03-09 10:16:58.35187574 +0000 UTC m=+4183.382076492" lastFinishedPulling="2026-03-09 10:17:07.375632338 +0000 UTC m=+4192.405833080" observedRunningTime="2026-03-09 10:17:08.995184201 +0000 UTC m=+4194.025384973" watchObservedRunningTime="2026-03-09 10:17:09.001736895 +0000 UTC m=+4194.031937647" Mar 09 10:17:14 crc kubenswrapper[4792]: I0309 10:17:14.887420 4792 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openshift-must-gather-qmm66/crc-debug-vsjl9"]
Mar 09 10:17:14 crc kubenswrapper[4792]: I0309 10:17:14.889315 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-qmm66/crc-debug-vsjl9"
Mar 09 10:17:14 crc kubenswrapper[4792]: I0309 10:17:14.940339 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/18d88320-a8f4-4d44-a2a7-8c192276c581-host\") pod \"crc-debug-vsjl9\" (UID: \"18d88320-a8f4-4d44-a2a7-8c192276c581\") " pod="openshift-must-gather-qmm66/crc-debug-vsjl9"
Mar 09 10:17:14 crc kubenswrapper[4792]: I0309 10:17:14.940423 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9pqwr\" (UniqueName: \"kubernetes.io/projected/18d88320-a8f4-4d44-a2a7-8c192276c581-kube-api-access-9pqwr\") pod \"crc-debug-vsjl9\" (UID: \"18d88320-a8f4-4d44-a2a7-8c192276c581\") " pod="openshift-must-gather-qmm66/crc-debug-vsjl9"
Mar 09 10:17:15 crc kubenswrapper[4792]: I0309 10:17:15.042417 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/18d88320-a8f4-4d44-a2a7-8c192276c581-host\") pod \"crc-debug-vsjl9\" (UID: \"18d88320-a8f4-4d44-a2a7-8c192276c581\") " pod="openshift-must-gather-qmm66/crc-debug-vsjl9"
Mar 09 10:17:15 crc kubenswrapper[4792]: I0309 10:17:15.042532 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9pqwr\" (UniqueName: \"kubernetes.io/projected/18d88320-a8f4-4d44-a2a7-8c192276c581-kube-api-access-9pqwr\") pod \"crc-debug-vsjl9\" (UID: \"18d88320-a8f4-4d44-a2a7-8c192276c581\") " pod="openshift-must-gather-qmm66/crc-debug-vsjl9"
Mar 09 10:17:15 crc kubenswrapper[4792]: I0309 10:17:15.043719 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/18d88320-a8f4-4d44-a2a7-8c192276c581-host\") pod \"crc-debug-vsjl9\" (UID: \"18d88320-a8f4-4d44-a2a7-8c192276c581\") " pod="openshift-must-gather-qmm66/crc-debug-vsjl9"
Mar 09 10:17:15 crc kubenswrapper[4792]: I0309 10:17:15.069790 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9pqwr\" (UniqueName: \"kubernetes.io/projected/18d88320-a8f4-4d44-a2a7-8c192276c581-kube-api-access-9pqwr\") pod \"crc-debug-vsjl9\" (UID: \"18d88320-a8f4-4d44-a2a7-8c192276c581\") " pod="openshift-must-gather-qmm66/crc-debug-vsjl9"
Mar 09 10:17:15 crc kubenswrapper[4792]: I0309 10:17:15.211415 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-qmm66/crc-debug-vsjl9"
Mar 09 10:17:16 crc kubenswrapper[4792]: I0309 10:17:16.000207 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qmm66/crc-debug-vsjl9" event={"ID":"18d88320-a8f4-4d44-a2a7-8c192276c581","Type":"ContainerStarted","Data":"a42b87c0c9b44f42356024166d834435690e880e0f5367440778b07a12c3d6a8"}
Mar 09 10:17:28 crc kubenswrapper[4792]: I0309 10:17:28.144321 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qmm66/crc-debug-vsjl9" event={"ID":"18d88320-a8f4-4d44-a2a7-8c192276c581","Type":"ContainerStarted","Data":"d13f7cebd9156705f54933def6682923998bc769753aff63fccbd36852f9fa63"}
Mar 09 10:17:28 crc kubenswrapper[4792]: I0309 10:17:28.169994 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-qmm66/crc-debug-vsjl9" podStartSLOduration=1.836531191 podStartE2EDuration="14.169975154s" podCreationTimestamp="2026-03-09 10:17:14 +0000 UTC" firstStartedPulling="2026-03-09 10:17:15.268000761 +0000 UTC m=+4200.298201513" lastFinishedPulling="2026-03-09 10:17:27.601444724 +0000 UTC m=+4212.631645476" observedRunningTime="2026-03-09 10:17:28.16603642 +0000 UTC m=+4213.196237172" watchObservedRunningTime="2026-03-09 10:17:28.169975154 +0000 UTC m=+4213.200175906"
Mar 09 10:18:00 crc kubenswrapper[4792]: I0309 10:18:00.156352 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550858-zpmfc"]
Mar 09 10:18:00 crc kubenswrapper[4792]: I0309 10:18:00.158219 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550858-zpmfc"
Mar 09 10:18:00 crc kubenswrapper[4792]: I0309 10:18:00.160460 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fwclj"
Mar 09 10:18:00 crc kubenswrapper[4792]: I0309 10:18:00.164045 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 09 10:18:00 crc kubenswrapper[4792]: I0309 10:18:00.173634 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550858-zpmfc"]
Mar 09 10:18:00 crc kubenswrapper[4792]: I0309 10:18:00.176298 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 09 10:18:00 crc kubenswrapper[4792]: I0309 10:18:00.214594 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vslj6\" (UniqueName: \"kubernetes.io/projected/9c764a5a-2f88-47ce-8b71-6d9d64d3938e-kube-api-access-vslj6\") pod \"auto-csr-approver-29550858-zpmfc\" (UID: \"9c764a5a-2f88-47ce-8b71-6d9d64d3938e\") " pod="openshift-infra/auto-csr-approver-29550858-zpmfc"
Mar 09 10:18:00 crc kubenswrapper[4792]: I0309 10:18:00.316960 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vslj6\" (UniqueName: \"kubernetes.io/projected/9c764a5a-2f88-47ce-8b71-6d9d64d3938e-kube-api-access-vslj6\") pod \"auto-csr-approver-29550858-zpmfc\" (UID: \"9c764a5a-2f88-47ce-8b71-6d9d64d3938e\") " pod="openshift-infra/auto-csr-approver-29550858-zpmfc"
Mar 09 10:18:00 crc kubenswrapper[4792]: I0309 10:18:00.350199 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vslj6\" (UniqueName: \"kubernetes.io/projected/9c764a5a-2f88-47ce-8b71-6d9d64d3938e-kube-api-access-vslj6\") pod \"auto-csr-approver-29550858-zpmfc\" (UID: \"9c764a5a-2f88-47ce-8b71-6d9d64d3938e\") " pod="openshift-infra/auto-csr-approver-29550858-zpmfc"
Mar 09 10:18:00 crc kubenswrapper[4792]: I0309 10:18:00.479655 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550858-zpmfc"
Mar 09 10:18:01 crc kubenswrapper[4792]: I0309 10:18:01.383295 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550858-zpmfc"]
Mar 09 10:18:01 crc kubenswrapper[4792]: I0309 10:18:01.448722 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550858-zpmfc" event={"ID":"9c764a5a-2f88-47ce-8b71-6d9d64d3938e","Type":"ContainerStarted","Data":"786b844bcc556bb091fe397bc940f925c070d56d1f171336998233abc5f4bec7"}
Mar 09 10:18:03 crc kubenswrapper[4792]: I0309 10:18:03.467830 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550858-zpmfc" event={"ID":"9c764a5a-2f88-47ce-8b71-6d9d64d3938e","Type":"ContainerStarted","Data":"6b8a0fe162f772738010c7ea4c42f74a04db402148be803f3e5302e300eb15d3"}
Mar 09 10:18:03 crc kubenswrapper[4792]: I0309 10:18:03.490597 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29550858-zpmfc" podStartSLOduration=2.608946542 podStartE2EDuration="3.490574909s" podCreationTimestamp="2026-03-09 10:18:00 +0000 UTC" firstStartedPulling="2026-03-09 10:18:01.388011908 +0000 UTC m=+4246.418212650" lastFinishedPulling="2026-03-09 10:18:02.269640265 +0000 UTC m=+4247.299841017" observedRunningTime="2026-03-09 10:18:03.488417421 +0000 UTC m=+4248.518618183" watchObservedRunningTime="2026-03-09 10:18:03.490574909 +0000 UTC m=+4248.520775661"
Mar 09 10:18:05 crc kubenswrapper[4792]: I0309 10:18:05.486632 4792 generic.go:334] "Generic (PLEG): container finished" podID="9c764a5a-2f88-47ce-8b71-6d9d64d3938e" containerID="6b8a0fe162f772738010c7ea4c42f74a04db402148be803f3e5302e300eb15d3" exitCode=0
Mar 09 10:18:05 crc kubenswrapper[4792]: I0309 10:18:05.486722 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550858-zpmfc" event={"ID":"9c764a5a-2f88-47ce-8b71-6d9d64d3938e","Type":"ContainerDied","Data":"6b8a0fe162f772738010c7ea4c42f74a04db402148be803f3e5302e300eb15d3"}
Mar 09 10:18:06 crc kubenswrapper[4792]: I0309 10:18:06.952865 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550858-zpmfc"
Mar 09 10:18:07 crc kubenswrapper[4792]: I0309 10:18:07.062936 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vslj6\" (UniqueName: \"kubernetes.io/projected/9c764a5a-2f88-47ce-8b71-6d9d64d3938e-kube-api-access-vslj6\") pod \"9c764a5a-2f88-47ce-8b71-6d9d64d3938e\" (UID: \"9c764a5a-2f88-47ce-8b71-6d9d64d3938e\") "
Mar 09 10:18:07 crc kubenswrapper[4792]: I0309 10:18:07.070984 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c764a5a-2f88-47ce-8b71-6d9d64d3938e-kube-api-access-vslj6" (OuterVolumeSpecName: "kube-api-access-vslj6") pod "9c764a5a-2f88-47ce-8b71-6d9d64d3938e" (UID: "9c764a5a-2f88-47ce-8b71-6d9d64d3938e"). InnerVolumeSpecName "kube-api-access-vslj6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 10:18:07 crc kubenswrapper[4792]: I0309 10:18:07.165399 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vslj6\" (UniqueName: \"kubernetes.io/projected/9c764a5a-2f88-47ce-8b71-6d9d64d3938e-kube-api-access-vslj6\") on node \"crc\" DevicePath \"\""
Mar 09 10:18:07 crc kubenswrapper[4792]: I0309 10:18:07.520421 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550858-zpmfc" event={"ID":"9c764a5a-2f88-47ce-8b71-6d9d64d3938e","Type":"ContainerDied","Data":"786b844bcc556bb091fe397bc940f925c070d56d1f171336998233abc5f4bec7"}
Mar 09 10:18:07 crc kubenswrapper[4792]: I0309 10:18:07.520458 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="786b844bcc556bb091fe397bc940f925c070d56d1f171336998233abc5f4bec7"
Mar 09 10:18:07 crc kubenswrapper[4792]: I0309 10:18:07.520516 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550858-zpmfc"
Mar 09 10:18:07 crc kubenswrapper[4792]: I0309 10:18:07.579018 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550852-2vflv"]
Mar 09 10:18:07 crc kubenswrapper[4792]: I0309 10:18:07.587874 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550852-2vflv"]
Mar 09 10:18:07 crc kubenswrapper[4792]: I0309 10:18:07.678814 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7afc0cfb-62db-43e9-b046-d712ef91a4fd" path="/var/lib/kubelet/pods/7afc0cfb-62db-43e9-b046-d712ef91a4fd/volumes"
Mar 09 10:18:08 crc kubenswrapper[4792]: I0309 10:18:08.098449 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-6w28d"]
Mar 09 10:18:08 crc kubenswrapper[4792]: E0309 10:18:08.099136 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c764a5a-2f88-47ce-8b71-6d9d64d3938e" containerName="oc"
Mar 09 10:18:08 crc kubenswrapper[4792]: I0309 10:18:08.099153 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c764a5a-2f88-47ce-8b71-6d9d64d3938e" containerName="oc"
Mar 09 10:18:08 crc kubenswrapper[4792]: I0309 10:18:08.099385 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c764a5a-2f88-47ce-8b71-6d9d64d3938e" containerName="oc"
Mar 09 10:18:08 crc kubenswrapper[4792]: I0309 10:18:08.101182 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6w28d"
Mar 09 10:18:08 crc kubenswrapper[4792]: I0309 10:18:08.110685 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6w28d"]
Mar 09 10:18:08 crc kubenswrapper[4792]: I0309 10:18:08.189565 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9r589\" (UniqueName: \"kubernetes.io/projected/9afa966f-2aa0-4dc4-94f4-b33884c64c65-kube-api-access-9r589\") pod \"redhat-marketplace-6w28d\" (UID: \"9afa966f-2aa0-4dc4-94f4-b33884c64c65\") " pod="openshift-marketplace/redhat-marketplace-6w28d"
Mar 09 10:18:08 crc kubenswrapper[4792]: I0309 10:18:08.189640 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9afa966f-2aa0-4dc4-94f4-b33884c64c65-catalog-content\") pod \"redhat-marketplace-6w28d\" (UID: \"9afa966f-2aa0-4dc4-94f4-b33884c64c65\") " pod="openshift-marketplace/redhat-marketplace-6w28d"
Mar 09 10:18:08 crc kubenswrapper[4792]: I0309 10:18:08.189661 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9afa966f-2aa0-4dc4-94f4-b33884c64c65-utilities\") pod \"redhat-marketplace-6w28d\" (UID: \"9afa966f-2aa0-4dc4-94f4-b33884c64c65\") " pod="openshift-marketplace/redhat-marketplace-6w28d"
Mar 09 10:18:08 crc kubenswrapper[4792]: I0309 10:18:08.291636 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9r589\" (UniqueName: \"kubernetes.io/projected/9afa966f-2aa0-4dc4-94f4-b33884c64c65-kube-api-access-9r589\") pod \"redhat-marketplace-6w28d\" (UID: \"9afa966f-2aa0-4dc4-94f4-b33884c64c65\") " pod="openshift-marketplace/redhat-marketplace-6w28d"
Mar 09 10:18:08 crc kubenswrapper[4792]: I0309 10:18:08.291722 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9afa966f-2aa0-4dc4-94f4-b33884c64c65-catalog-content\") pod \"redhat-marketplace-6w28d\" (UID: \"9afa966f-2aa0-4dc4-94f4-b33884c64c65\") " pod="openshift-marketplace/redhat-marketplace-6w28d"
Mar 09 10:18:08 crc kubenswrapper[4792]: I0309 10:18:08.291745 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9afa966f-2aa0-4dc4-94f4-b33884c64c65-utilities\") pod \"redhat-marketplace-6w28d\" (UID: \"9afa966f-2aa0-4dc4-94f4-b33884c64c65\") " pod="openshift-marketplace/redhat-marketplace-6w28d"
Mar 09 10:18:08 crc kubenswrapper[4792]: I0309 10:18:08.292418 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9afa966f-2aa0-4dc4-94f4-b33884c64c65-utilities\") pod \"redhat-marketplace-6w28d\" (UID: \"9afa966f-2aa0-4dc4-94f4-b33884c64c65\") " pod="openshift-marketplace/redhat-marketplace-6w28d"
Mar 09 10:18:08 crc kubenswrapper[4792]: I0309 10:18:08.292515 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9afa966f-2aa0-4dc4-94f4-b33884c64c65-catalog-content\") pod \"redhat-marketplace-6w28d\" (UID: \"9afa966f-2aa0-4dc4-94f4-b33884c64c65\") " pod="openshift-marketplace/redhat-marketplace-6w28d"
Mar 09 10:18:08 crc kubenswrapper[4792]: I0309 10:18:08.318974 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9r589\" (UniqueName: \"kubernetes.io/projected/9afa966f-2aa0-4dc4-94f4-b33884c64c65-kube-api-access-9r589\") pod \"redhat-marketplace-6w28d\" (UID: \"9afa966f-2aa0-4dc4-94f4-b33884c64c65\") " pod="openshift-marketplace/redhat-marketplace-6w28d"
Mar 09 10:18:08 crc kubenswrapper[4792]: I0309 10:18:08.420751 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6w28d"
Mar 09 10:18:09 crc kubenswrapper[4792]: I0309 10:18:09.068719 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6w28d"]
Mar 09 10:18:09 crc kubenswrapper[4792]: I0309 10:18:09.539624 4792 generic.go:334] "Generic (PLEG): container finished" podID="9afa966f-2aa0-4dc4-94f4-b33884c64c65" containerID="890c663ef81654e0c8835e88e65dda1315ef91e1db1e851434c844580c426e57" exitCode=0
Mar 09 10:18:09 crc kubenswrapper[4792]: I0309 10:18:09.539672 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6w28d" event={"ID":"9afa966f-2aa0-4dc4-94f4-b33884c64c65","Type":"ContainerDied","Data":"890c663ef81654e0c8835e88e65dda1315ef91e1db1e851434c844580c426e57"}
Mar 09 10:18:09 crc kubenswrapper[4792]: I0309 10:18:09.539701 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6w28d" event={"ID":"9afa966f-2aa0-4dc4-94f4-b33884c64c65","Type":"ContainerStarted","Data":"7553ccb6465a3c9d261dcdb3dcc120a4cac6f38a0da12363604eca5534239729"}
Mar 09 10:18:10 crc kubenswrapper[4792]: I0309 10:18:10.555295 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6w28d" event={"ID":"9afa966f-2aa0-4dc4-94f4-b33884c64c65","Type":"ContainerStarted","Data":"2b9b6c854be137dee13e21cb59d34a27d871d68423fefb1abd2e211d4cb6b143"}
Mar 09 10:18:12 crc kubenswrapper[4792]: I0309 10:18:12.574211 4792 generic.go:334] "Generic (PLEG): container finished" podID="9afa966f-2aa0-4dc4-94f4-b33884c64c65" containerID="2b9b6c854be137dee13e21cb59d34a27d871d68423fefb1abd2e211d4cb6b143" exitCode=0
Mar 09 10:18:12 crc kubenswrapper[4792]: I0309 10:18:12.574366 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6w28d" event={"ID":"9afa966f-2aa0-4dc4-94f4-b33884c64c65","Type":"ContainerDied","Data":"2b9b6c854be137dee13e21cb59d34a27d871d68423fefb1abd2e211d4cb6b143"}
Mar 09 10:18:13 crc kubenswrapper[4792]: I0309 10:18:13.213769 4792 patch_prober.go:28] interesting pod/machine-config-daemon-97tth container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 09 10:18:13 crc kubenswrapper[4792]: I0309 10:18:13.214085 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 09 10:18:13 crc kubenswrapper[4792]: I0309 10:18:13.587695 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6w28d" event={"ID":"9afa966f-2aa0-4dc4-94f4-b33884c64c65","Type":"ContainerStarted","Data":"1f4f1fdabb4c3fd711c2474338c019b13b05c2198250caf2190b0f7ccc148e10"}
Mar 09 10:18:13 crc kubenswrapper[4792]: I0309 10:18:13.611842 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-6w28d" podStartSLOduration=2.105291429 podStartE2EDuration="5.611825222s" podCreationTimestamp="2026-03-09 10:18:08 +0000 UTC" firstStartedPulling="2026-03-09 10:18:09.543596151 +0000 UTC m=+4254.573796903" lastFinishedPulling="2026-03-09 10:18:13.050129944 +0000 UTC m=+4258.080330696" observedRunningTime="2026-03-09 10:18:13.609964902 +0000 UTC m=+4258.640165664" watchObservedRunningTime="2026-03-09 10:18:13.611825222 +0000 UTC m=+4258.642025974"
Mar 09 10:18:15 crc kubenswrapper[4792]: I0309 10:18:15.606962 4792 generic.go:334] "Generic (PLEG): container finished" podID="18d88320-a8f4-4d44-a2a7-8c192276c581" containerID="d13f7cebd9156705f54933def6682923998bc769753aff63fccbd36852f9fa63" exitCode=0
Mar 09 10:18:15 crc kubenswrapper[4792]: I0309 10:18:15.607045 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qmm66/crc-debug-vsjl9" event={"ID":"18d88320-a8f4-4d44-a2a7-8c192276c581","Type":"ContainerDied","Data":"d13f7cebd9156705f54933def6682923998bc769753aff63fccbd36852f9fa63"}
Mar 09 10:18:16 crc kubenswrapper[4792]: I0309 10:18:16.727913 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-qmm66/crc-debug-vsjl9"
Mar 09 10:18:16 crc kubenswrapper[4792]: I0309 10:18:16.768450 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-qmm66/crc-debug-vsjl9"]
Mar 09 10:18:16 crc kubenswrapper[4792]: I0309 10:18:16.778476 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-qmm66/crc-debug-vsjl9"]
Mar 09 10:18:16 crc kubenswrapper[4792]: I0309 10:18:16.914240 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/18d88320-a8f4-4d44-a2a7-8c192276c581-host\") pod \"18d88320-a8f4-4d44-a2a7-8c192276c581\" (UID: \"18d88320-a8f4-4d44-a2a7-8c192276c581\") "
Mar 09 10:18:16 crc kubenswrapper[4792]: I0309 10:18:16.914372 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/18d88320-a8f4-4d44-a2a7-8c192276c581-host" (OuterVolumeSpecName: "host") pod "18d88320-a8f4-4d44-a2a7-8c192276c581" (UID: "18d88320-a8f4-4d44-a2a7-8c192276c581"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 09 10:18:16 crc kubenswrapper[4792]: I0309 10:18:16.914466 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9pqwr\" (UniqueName: \"kubernetes.io/projected/18d88320-a8f4-4d44-a2a7-8c192276c581-kube-api-access-9pqwr\") pod \"18d88320-a8f4-4d44-a2a7-8c192276c581\" (UID: \"18d88320-a8f4-4d44-a2a7-8c192276c581\") "
Mar 09 10:18:16 crc kubenswrapper[4792]: I0309 10:18:16.915001 4792 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/18d88320-a8f4-4d44-a2a7-8c192276c581-host\") on node \"crc\" DevicePath \"\""
Mar 09 10:18:16 crc kubenswrapper[4792]: I0309 10:18:16.921356 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18d88320-a8f4-4d44-a2a7-8c192276c581-kube-api-access-9pqwr" (OuterVolumeSpecName: "kube-api-access-9pqwr") pod "18d88320-a8f4-4d44-a2a7-8c192276c581" (UID: "18d88320-a8f4-4d44-a2a7-8c192276c581"). InnerVolumeSpecName "kube-api-access-9pqwr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 10:18:17 crc kubenswrapper[4792]: I0309 10:18:17.016927 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9pqwr\" (UniqueName: \"kubernetes.io/projected/18d88320-a8f4-4d44-a2a7-8c192276c581-kube-api-access-9pqwr\") on node \"crc\" DevicePath \"\""
Mar 09 10:18:17 crc kubenswrapper[4792]: I0309 10:18:17.635686 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a42b87c0c9b44f42356024166d834435690e880e0f5367440778b07a12c3d6a8"
Mar 09 10:18:17 crc kubenswrapper[4792]: I0309 10:18:17.637244 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-qmm66/crc-debug-vsjl9"
Mar 09 10:18:17 crc kubenswrapper[4792]: I0309 10:18:17.673812 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="18d88320-a8f4-4d44-a2a7-8c192276c581" path="/var/lib/kubelet/pods/18d88320-a8f4-4d44-a2a7-8c192276c581/volumes"
Mar 09 10:18:17 crc kubenswrapper[4792]: E0309 10:18:17.826647 4792 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod18d88320_a8f4_4d44_a2a7_8c192276c581.slice/crio-a42b87c0c9b44f42356024166d834435690e880e0f5367440778b07a12c3d6a8\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod18d88320_a8f4_4d44_a2a7_8c192276c581.slice\": RecentStats: unable to find data in memory cache]"
Mar 09 10:18:17 crc kubenswrapper[4792]: I0309 10:18:17.968820 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-qmm66/crc-debug-d2lb6"]
Mar 09 10:18:17 crc kubenswrapper[4792]: E0309 10:18:17.970640 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18d88320-a8f4-4d44-a2a7-8c192276c581" containerName="container-00"
Mar 09 10:18:17 crc kubenswrapper[4792]: I0309 10:18:17.970665 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="18d88320-a8f4-4d44-a2a7-8c192276c581" containerName="container-00"
Mar 09 10:18:17 crc kubenswrapper[4792]: I0309 10:18:17.970880 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="18d88320-a8f4-4d44-a2a7-8c192276c581" containerName="container-00"
Mar 09 10:18:17 crc kubenswrapper[4792]: I0309 10:18:17.971469 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-qmm66/crc-debug-d2lb6"
Mar 09 10:18:18 crc kubenswrapper[4792]: I0309 10:18:18.136752 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/feb3ae84-bed7-4c7a-b5fd-d86b6d6ad4e3-host\") pod \"crc-debug-d2lb6\" (UID: \"feb3ae84-bed7-4c7a-b5fd-d86b6d6ad4e3\") " pod="openshift-must-gather-qmm66/crc-debug-d2lb6"
Mar 09 10:18:18 crc kubenswrapper[4792]: I0309 10:18:18.136911 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rpvjf\" (UniqueName: \"kubernetes.io/projected/feb3ae84-bed7-4c7a-b5fd-d86b6d6ad4e3-kube-api-access-rpvjf\") pod \"crc-debug-d2lb6\" (UID: \"feb3ae84-bed7-4c7a-b5fd-d86b6d6ad4e3\") " pod="openshift-must-gather-qmm66/crc-debug-d2lb6"
Mar 09 10:18:18 crc kubenswrapper[4792]: I0309 10:18:18.238983 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/feb3ae84-bed7-4c7a-b5fd-d86b6d6ad4e3-host\") pod \"crc-debug-d2lb6\" (UID: \"feb3ae84-bed7-4c7a-b5fd-d86b6d6ad4e3\") " pod="openshift-must-gather-qmm66/crc-debug-d2lb6"
Mar 09 10:18:18 crc kubenswrapper[4792]: I0309 10:18:18.239142 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rpvjf\" (UniqueName: \"kubernetes.io/projected/feb3ae84-bed7-4c7a-b5fd-d86b6d6ad4e3-kube-api-access-rpvjf\") pod \"crc-debug-d2lb6\" (UID: \"feb3ae84-bed7-4c7a-b5fd-d86b6d6ad4e3\") " pod="openshift-must-gather-qmm66/crc-debug-d2lb6"
Mar 09 10:18:18 crc kubenswrapper[4792]: I0309 10:18:18.239536 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/feb3ae84-bed7-4c7a-b5fd-d86b6d6ad4e3-host\") pod \"crc-debug-d2lb6\" (UID: \"feb3ae84-bed7-4c7a-b5fd-d86b6d6ad4e3\") " pod="openshift-must-gather-qmm66/crc-debug-d2lb6"
Mar 09 10:18:18 crc kubenswrapper[4792]: I0309 10:18:18.264919 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rpvjf\" (UniqueName: \"kubernetes.io/projected/feb3ae84-bed7-4c7a-b5fd-d86b6d6ad4e3-kube-api-access-rpvjf\") pod \"crc-debug-d2lb6\" (UID: \"feb3ae84-bed7-4c7a-b5fd-d86b6d6ad4e3\") " pod="openshift-must-gather-qmm66/crc-debug-d2lb6"
Mar 09 10:18:18 crc kubenswrapper[4792]: I0309 10:18:18.287529 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-qmm66/crc-debug-d2lb6"
Mar 09 10:18:18 crc kubenswrapper[4792]: W0309 10:18:18.324730 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfeb3ae84_bed7_4c7a_b5fd_d86b6d6ad4e3.slice/crio-951af4e673c3fbc1d01e353e27b600f57bef370136e2b5bd33098253c64b7d58 WatchSource:0}: Error finding container 951af4e673c3fbc1d01e353e27b600f57bef370136e2b5bd33098253c64b7d58: Status 404 returned error can't find the container with id 951af4e673c3fbc1d01e353e27b600f57bef370136e2b5bd33098253c64b7d58
Mar 09 10:18:18 crc kubenswrapper[4792]: I0309 10:18:18.422180 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-6w28d"
Mar 09 10:18:18 crc kubenswrapper[4792]: I0309 10:18:18.422271 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-6w28d"
Mar 09 10:18:18 crc kubenswrapper[4792]: I0309 10:18:18.478261 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-6w28d"
Mar 09 10:18:18 crc kubenswrapper[4792]: I0309 10:18:18.647833 4792 generic.go:334] "Generic (PLEG): container finished" podID="feb3ae84-bed7-4c7a-b5fd-d86b6d6ad4e3" containerID="9273db4bc0db0626776b58e0891384eb7e49cad1442320d0413c757b8cf40e6a" exitCode=0
Mar 09 10:18:18 crc kubenswrapper[4792]: I0309 10:18:18.647917 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qmm66/crc-debug-d2lb6" event={"ID":"feb3ae84-bed7-4c7a-b5fd-d86b6d6ad4e3","Type":"ContainerDied","Data":"9273db4bc0db0626776b58e0891384eb7e49cad1442320d0413c757b8cf40e6a"}
Mar 09 10:18:18 crc kubenswrapper[4792]: I0309 10:18:18.647962 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qmm66/crc-debug-d2lb6" event={"ID":"feb3ae84-bed7-4c7a-b5fd-d86b6d6ad4e3","Type":"ContainerStarted","Data":"951af4e673c3fbc1d01e353e27b600f57bef370136e2b5bd33098253c64b7d58"}
Mar 09 10:18:18 crc kubenswrapper[4792]: I0309 10:18:18.712794 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-6w28d"
Mar 09 10:18:18 crc kubenswrapper[4792]: I0309 10:18:18.765893 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6w28d"]
Mar 09 10:18:18 crc kubenswrapper[4792]: I0309 10:18:18.949930 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-qmm66/crc-debug-d2lb6"]
Mar 09 10:18:18 crc kubenswrapper[4792]: I0309 10:18:18.961889 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-qmm66/crc-debug-d2lb6"]
Mar 09 10:18:19 crc kubenswrapper[4792]: I0309 10:18:19.782728 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-qmm66/crc-debug-d2lb6"
Mar 09 10:18:19 crc kubenswrapper[4792]: I0309 10:18:19.881672 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rpvjf\" (UniqueName: \"kubernetes.io/projected/feb3ae84-bed7-4c7a-b5fd-d86b6d6ad4e3-kube-api-access-rpvjf\") pod \"feb3ae84-bed7-4c7a-b5fd-d86b6d6ad4e3\" (UID: \"feb3ae84-bed7-4c7a-b5fd-d86b6d6ad4e3\") "
Mar 09 10:18:19 crc kubenswrapper[4792]: I0309 10:18:19.881807 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/feb3ae84-bed7-4c7a-b5fd-d86b6d6ad4e3-host\") pod \"feb3ae84-bed7-4c7a-b5fd-d86b6d6ad4e3\" (UID: \"feb3ae84-bed7-4c7a-b5fd-d86b6d6ad4e3\") "
Mar 09 10:18:19 crc kubenswrapper[4792]: I0309 10:18:19.882408 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/feb3ae84-bed7-4c7a-b5fd-d86b6d6ad4e3-host" (OuterVolumeSpecName: "host") pod "feb3ae84-bed7-4c7a-b5fd-d86b6d6ad4e3" (UID: "feb3ae84-bed7-4c7a-b5fd-d86b6d6ad4e3"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 09 10:18:19 crc kubenswrapper[4792]: I0309 10:18:19.888217 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/feb3ae84-bed7-4c7a-b5fd-d86b6d6ad4e3-kube-api-access-rpvjf" (OuterVolumeSpecName: "kube-api-access-rpvjf") pod "feb3ae84-bed7-4c7a-b5fd-d86b6d6ad4e3" (UID: "feb3ae84-bed7-4c7a-b5fd-d86b6d6ad4e3"). InnerVolumeSpecName "kube-api-access-rpvjf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 10:18:19 crc kubenswrapper[4792]: I0309 10:18:19.984792 4792 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/feb3ae84-bed7-4c7a-b5fd-d86b6d6ad4e3-host\") on node \"crc\" DevicePath \"\""
Mar 09 10:18:19 crc kubenswrapper[4792]: I0309 10:18:19.984834 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rpvjf\" (UniqueName: \"kubernetes.io/projected/feb3ae84-bed7-4c7a-b5fd-d86b6d6ad4e3-kube-api-access-rpvjf\") on node \"crc\" DevicePath \"\""
Mar 09 10:18:20 crc kubenswrapper[4792]: I0309 10:18:20.184981 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-qmm66/crc-debug-dfmsr"]
Mar 09 10:18:20 crc kubenswrapper[4792]: E0309 10:18:20.185739 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="feb3ae84-bed7-4c7a-b5fd-d86b6d6ad4e3" containerName="container-00"
Mar 09 10:18:20 crc kubenswrapper[4792]: I0309 10:18:20.185834 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="feb3ae84-bed7-4c7a-b5fd-d86b6d6ad4e3" containerName="container-00"
Mar 09 10:18:20 crc kubenswrapper[4792]: I0309 10:18:20.186119 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="feb3ae84-bed7-4c7a-b5fd-d86b6d6ad4e3" containerName="container-00"
Mar 09 10:18:20 crc kubenswrapper[4792]: I0309 10:18:20.187927 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-qmm66/crc-debug-dfmsr"
Mar 09 10:18:20 crc kubenswrapper[4792]: I0309 10:18:20.289385 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8c35c7e5-6be1-4351-b3f4-032311a012ce-host\") pod \"crc-debug-dfmsr\" (UID: \"8c35c7e5-6be1-4351-b3f4-032311a012ce\") " pod="openshift-must-gather-qmm66/crc-debug-dfmsr"
Mar 09 10:18:20 crc kubenswrapper[4792]: I0309 10:18:20.289443 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8526d\" (UniqueName: \"kubernetes.io/projected/8c35c7e5-6be1-4351-b3f4-032311a012ce-kube-api-access-8526d\") pod \"crc-debug-dfmsr\" (UID: \"8c35c7e5-6be1-4351-b3f4-032311a012ce\") " pod="openshift-must-gather-qmm66/crc-debug-dfmsr"
Mar 09 10:18:20 crc kubenswrapper[4792]: I0309 10:18:20.390768 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8c35c7e5-6be1-4351-b3f4-032311a012ce-host\") pod \"crc-debug-dfmsr\" (UID: \"8c35c7e5-6be1-4351-b3f4-032311a012ce\") " pod="openshift-must-gather-qmm66/crc-debug-dfmsr"
Mar 09 10:18:20 crc kubenswrapper[4792]: I0309 10:18:20.390821 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8526d\" (UniqueName: \"kubernetes.io/projected/8c35c7e5-6be1-4351-b3f4-032311a012ce-kube-api-access-8526d\") pod \"crc-debug-dfmsr\" (UID: \"8c35c7e5-6be1-4351-b3f4-032311a012ce\") " pod="openshift-must-gather-qmm66/crc-debug-dfmsr"
Mar 09 10:18:20 crc kubenswrapper[4792]: I0309 10:18:20.391367 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8c35c7e5-6be1-4351-b3f4-032311a012ce-host\") pod \"crc-debug-dfmsr\" (UID: \"8c35c7e5-6be1-4351-b3f4-032311a012ce\") " pod="openshift-must-gather-qmm66/crc-debug-dfmsr"
Mar 09 10:18:20 crc kubenswrapper[4792]: I0309 10:18:20.408396 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8526d\" (UniqueName: \"kubernetes.io/projected/8c35c7e5-6be1-4351-b3f4-032311a012ce-kube-api-access-8526d\") pod \"crc-debug-dfmsr\" (UID: \"8c35c7e5-6be1-4351-b3f4-032311a012ce\") " pod="openshift-must-gather-qmm66/crc-debug-dfmsr"
Mar 09 10:18:20 crc kubenswrapper[4792]: I0309 10:18:20.506368 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-qmm66/crc-debug-dfmsr"
Mar 09 10:18:20 crc kubenswrapper[4792]: W0309 10:18:20.563992 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8c35c7e5_6be1_4351_b3f4_032311a012ce.slice/crio-3669c2bd090fad3a9c064ef274c81a5c36a1a10da9012970f7de6e7628ce35dd WatchSource:0}: Error finding container 3669c2bd090fad3a9c064ef274c81a5c36a1a10da9012970f7de6e7628ce35dd: Status 404 returned error can't find the container with id 3669c2bd090fad3a9c064ef274c81a5c36a1a10da9012970f7de6e7628ce35dd
Mar 09 10:18:20 crc kubenswrapper[4792]: I0309 10:18:20.666159 4792 scope.go:117] "RemoveContainer" containerID="9273db4bc0db0626776b58e0891384eb7e49cad1442320d0413c757b8cf40e6a"
Mar 09 10:18:20 crc kubenswrapper[4792]: I0309 10:18:20.666172 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-qmm66/crc-debug-d2lb6"
Mar 09 10:18:20 crc kubenswrapper[4792]: I0309 10:18:20.668244 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-6w28d" podUID="9afa966f-2aa0-4dc4-94f4-b33884c64c65" containerName="registry-server" containerID="cri-o://1f4f1fdabb4c3fd711c2474338c019b13b05c2198250caf2190b0f7ccc148e10" gracePeriod=2
Mar 09 10:18:20 crc kubenswrapper[4792]: I0309 10:18:20.668520 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qmm66/crc-debug-dfmsr" event={"ID":"8c35c7e5-6be1-4351-b3f4-032311a012ce","Type":"ContainerStarted","Data":"3669c2bd090fad3a9c064ef274c81a5c36a1a10da9012970f7de6e7628ce35dd"}
Mar 09 10:18:21 crc kubenswrapper[4792]: I0309 10:18:21.187120 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6w28d"
Mar 09 10:18:21 crc kubenswrapper[4792]: I0309 10:18:21.310866 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9r589\" (UniqueName: \"kubernetes.io/projected/9afa966f-2aa0-4dc4-94f4-b33884c64c65-kube-api-access-9r589\") pod \"9afa966f-2aa0-4dc4-94f4-b33884c64c65\" (UID: \"9afa966f-2aa0-4dc4-94f4-b33884c64c65\") "
Mar 09 10:18:21 crc kubenswrapper[4792]: I0309 10:18:21.310967 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9afa966f-2aa0-4dc4-94f4-b33884c64c65-utilities\") pod \"9afa966f-2aa0-4dc4-94f4-b33884c64c65\" (UID: \"9afa966f-2aa0-4dc4-94f4-b33884c64c65\") "
Mar 09 10:18:21 crc kubenswrapper[4792]: I0309 10:18:21.311235 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9afa966f-2aa0-4dc4-94f4-b33884c64c65-catalog-content\") pod \"9afa966f-2aa0-4dc4-94f4-b33884c64c65\" (UID: \"9afa966f-2aa0-4dc4-94f4-b33884c64c65\") "
Mar 09 10:18:21 crc kubenswrapper[4792]: I0309 10:18:21.314850 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9afa966f-2aa0-4dc4-94f4-b33884c64c65-utilities" (OuterVolumeSpecName: "utilities") pod "9afa966f-2aa0-4dc4-94f4-b33884c64c65" (UID: "9afa966f-2aa0-4dc4-94f4-b33884c64c65"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 10:18:21 crc kubenswrapper[4792]: I0309 10:18:21.353513 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9afa966f-2aa0-4dc4-94f4-b33884c64c65-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9afa966f-2aa0-4dc4-94f4-b33884c64c65" (UID: "9afa966f-2aa0-4dc4-94f4-b33884c64c65"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 10:18:21 crc kubenswrapper[4792]: I0309 10:18:21.413890 4792 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9afa966f-2aa0-4dc4-94f4-b33884c64c65-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 09 10:18:21 crc kubenswrapper[4792]: I0309 10:18:21.413922 4792 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9afa966f-2aa0-4dc4-94f4-b33884c64c65-utilities\") on node \"crc\" DevicePath \"\""
Mar 09 10:18:21 crc kubenswrapper[4792]: I0309 10:18:21.675346 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="feb3ae84-bed7-4c7a-b5fd-d86b6d6ad4e3" path="/var/lib/kubelet/pods/feb3ae84-bed7-4c7a-b5fd-d86b6d6ad4e3/volumes"
Mar 09 10:18:21 crc kubenswrapper[4792]: I0309 10:18:21.677208 4792 generic.go:334] "Generic (PLEG): container finished" podID="8c35c7e5-6be1-4351-b3f4-032311a012ce" containerID="f1cab2ae304e3d85093a845385d8f05b5eefe763ab956c4534c4f306dba3b2cb" exitCode=0
Mar 09 10:18:21 crc kubenswrapper[4792]: I0309
10:18:21.680358 4792 generic.go:334] "Generic (PLEG): container finished" podID="9afa966f-2aa0-4dc4-94f4-b33884c64c65" containerID="1f4f1fdabb4c3fd711c2474338c019b13b05c2198250caf2190b0f7ccc148e10" exitCode=0 Mar 09 10:18:21 crc kubenswrapper[4792]: I0309 10:18:21.680457 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6w28d" Mar 09 10:18:21 crc kubenswrapper[4792]: I0309 10:18:21.681178 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qmm66/crc-debug-dfmsr" event={"ID":"8c35c7e5-6be1-4351-b3f4-032311a012ce","Type":"ContainerDied","Data":"f1cab2ae304e3d85093a845385d8f05b5eefe763ab956c4534c4f306dba3b2cb"} Mar 09 10:18:21 crc kubenswrapper[4792]: I0309 10:18:21.681293 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6w28d" event={"ID":"9afa966f-2aa0-4dc4-94f4-b33884c64c65","Type":"ContainerDied","Data":"1f4f1fdabb4c3fd711c2474338c019b13b05c2198250caf2190b0f7ccc148e10"} Mar 09 10:18:21 crc kubenswrapper[4792]: I0309 10:18:21.681378 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6w28d" event={"ID":"9afa966f-2aa0-4dc4-94f4-b33884c64c65","Type":"ContainerDied","Data":"7553ccb6465a3c9d261dcdb3dcc120a4cac6f38a0da12363604eca5534239729"} Mar 09 10:18:21 crc kubenswrapper[4792]: I0309 10:18:21.681448 4792 scope.go:117] "RemoveContainer" containerID="1f4f1fdabb4c3fd711c2474338c019b13b05c2198250caf2190b0f7ccc148e10" Mar 09 10:18:21 crc kubenswrapper[4792]: I0309 10:18:21.704816 4792 scope.go:117] "RemoveContainer" containerID="2b9b6c854be137dee13e21cb59d34a27d871d68423fefb1abd2e211d4cb6b143" Mar 09 10:18:21 crc kubenswrapper[4792]: I0309 10:18:21.727517 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-qmm66/crc-debug-dfmsr"] Mar 09 10:18:21 crc kubenswrapper[4792]: I0309 10:18:21.736457 4792 kubelet.go:2431] "SyncLoop REMOVE" 
source="api" pods=["openshift-must-gather-qmm66/crc-debug-dfmsr"] Mar 09 10:18:21 crc kubenswrapper[4792]: I0309 10:18:21.942347 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9afa966f-2aa0-4dc4-94f4-b33884c64c65-kube-api-access-9r589" (OuterVolumeSpecName: "kube-api-access-9r589") pod "9afa966f-2aa0-4dc4-94f4-b33884c64c65" (UID: "9afa966f-2aa0-4dc4-94f4-b33884c64c65"). InnerVolumeSpecName "kube-api-access-9r589". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 10:18:21 crc kubenswrapper[4792]: I0309 10:18:21.955460 4792 scope.go:117] "RemoveContainer" containerID="890c663ef81654e0c8835e88e65dda1315ef91e1db1e851434c844580c426e57" Mar 09 10:18:22 crc kubenswrapper[4792]: I0309 10:18:22.026085 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9r589\" (UniqueName: \"kubernetes.io/projected/9afa966f-2aa0-4dc4-94f4-b33884c64c65-kube-api-access-9r589\") on node \"crc\" DevicePath \"\"" Mar 09 10:18:22 crc kubenswrapper[4792]: I0309 10:18:22.053642 4792 scope.go:117] "RemoveContainer" containerID="1f4f1fdabb4c3fd711c2474338c019b13b05c2198250caf2190b0f7ccc148e10" Mar 09 10:18:22 crc kubenswrapper[4792]: E0309 10:18:22.054527 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1f4f1fdabb4c3fd711c2474338c019b13b05c2198250caf2190b0f7ccc148e10\": container with ID starting with 1f4f1fdabb4c3fd711c2474338c019b13b05c2198250caf2190b0f7ccc148e10 not found: ID does not exist" containerID="1f4f1fdabb4c3fd711c2474338c019b13b05c2198250caf2190b0f7ccc148e10" Mar 09 10:18:22 crc kubenswrapper[4792]: I0309 10:18:22.054558 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f4f1fdabb4c3fd711c2474338c019b13b05c2198250caf2190b0f7ccc148e10"} err="failed to get container status \"1f4f1fdabb4c3fd711c2474338c019b13b05c2198250caf2190b0f7ccc148e10\": rpc error: code = NotFound 
desc = could not find container \"1f4f1fdabb4c3fd711c2474338c019b13b05c2198250caf2190b0f7ccc148e10\": container with ID starting with 1f4f1fdabb4c3fd711c2474338c019b13b05c2198250caf2190b0f7ccc148e10 not found: ID does not exist" Mar 09 10:18:22 crc kubenswrapper[4792]: I0309 10:18:22.054578 4792 scope.go:117] "RemoveContainer" containerID="2b9b6c854be137dee13e21cb59d34a27d871d68423fefb1abd2e211d4cb6b143" Mar 09 10:18:22 crc kubenswrapper[4792]: E0309 10:18:22.054766 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2b9b6c854be137dee13e21cb59d34a27d871d68423fefb1abd2e211d4cb6b143\": container with ID starting with 2b9b6c854be137dee13e21cb59d34a27d871d68423fefb1abd2e211d4cb6b143 not found: ID does not exist" containerID="2b9b6c854be137dee13e21cb59d34a27d871d68423fefb1abd2e211d4cb6b143" Mar 09 10:18:22 crc kubenswrapper[4792]: I0309 10:18:22.054787 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b9b6c854be137dee13e21cb59d34a27d871d68423fefb1abd2e211d4cb6b143"} err="failed to get container status \"2b9b6c854be137dee13e21cb59d34a27d871d68423fefb1abd2e211d4cb6b143\": rpc error: code = NotFound desc = could not find container \"2b9b6c854be137dee13e21cb59d34a27d871d68423fefb1abd2e211d4cb6b143\": container with ID starting with 2b9b6c854be137dee13e21cb59d34a27d871d68423fefb1abd2e211d4cb6b143 not found: ID does not exist" Mar 09 10:18:22 crc kubenswrapper[4792]: I0309 10:18:22.054800 4792 scope.go:117] "RemoveContainer" containerID="890c663ef81654e0c8835e88e65dda1315ef91e1db1e851434c844580c426e57" Mar 09 10:18:22 crc kubenswrapper[4792]: E0309 10:18:22.054976 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"890c663ef81654e0c8835e88e65dda1315ef91e1db1e851434c844580c426e57\": container with ID starting with 
890c663ef81654e0c8835e88e65dda1315ef91e1db1e851434c844580c426e57 not found: ID does not exist" containerID="890c663ef81654e0c8835e88e65dda1315ef91e1db1e851434c844580c426e57" Mar 09 10:18:22 crc kubenswrapper[4792]: I0309 10:18:22.054997 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"890c663ef81654e0c8835e88e65dda1315ef91e1db1e851434c844580c426e57"} err="failed to get container status \"890c663ef81654e0c8835e88e65dda1315ef91e1db1e851434c844580c426e57\": rpc error: code = NotFound desc = could not find container \"890c663ef81654e0c8835e88e65dda1315ef91e1db1e851434c844580c426e57\": container with ID starting with 890c663ef81654e0c8835e88e65dda1315ef91e1db1e851434c844580c426e57 not found: ID does not exist" Mar 09 10:18:22 crc kubenswrapper[4792]: I0309 10:18:22.205807 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6w28d"] Mar 09 10:18:22 crc kubenswrapper[4792]: I0309 10:18:22.222217 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-6w28d"] Mar 09 10:18:22 crc kubenswrapper[4792]: I0309 10:18:22.826557 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-qmm66/crc-debug-dfmsr" Mar 09 10:18:22 crc kubenswrapper[4792]: I0309 10:18:22.940926 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8526d\" (UniqueName: \"kubernetes.io/projected/8c35c7e5-6be1-4351-b3f4-032311a012ce-kube-api-access-8526d\") pod \"8c35c7e5-6be1-4351-b3f4-032311a012ce\" (UID: \"8c35c7e5-6be1-4351-b3f4-032311a012ce\") " Mar 09 10:18:22 crc kubenswrapper[4792]: I0309 10:18:22.941003 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8c35c7e5-6be1-4351-b3f4-032311a012ce-host\") pod \"8c35c7e5-6be1-4351-b3f4-032311a012ce\" (UID: \"8c35c7e5-6be1-4351-b3f4-032311a012ce\") " Mar 09 10:18:22 crc kubenswrapper[4792]: I0309 10:18:22.941766 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8c35c7e5-6be1-4351-b3f4-032311a012ce-host" (OuterVolumeSpecName: "host") pod "8c35c7e5-6be1-4351-b3f4-032311a012ce" (UID: "8c35c7e5-6be1-4351-b3f4-032311a012ce"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 10:18:22 crc kubenswrapper[4792]: I0309 10:18:22.947555 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c35c7e5-6be1-4351-b3f4-032311a012ce-kube-api-access-8526d" (OuterVolumeSpecName: "kube-api-access-8526d") pod "8c35c7e5-6be1-4351-b3f4-032311a012ce" (UID: "8c35c7e5-6be1-4351-b3f4-032311a012ce"). InnerVolumeSpecName "kube-api-access-8526d". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 10:18:23 crc kubenswrapper[4792]: I0309 10:18:23.044615 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8526d\" (UniqueName: \"kubernetes.io/projected/8c35c7e5-6be1-4351-b3f4-032311a012ce-kube-api-access-8526d\") on node \"crc\" DevicePath \"\"" Mar 09 10:18:23 crc kubenswrapper[4792]: I0309 10:18:23.044649 4792 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8c35c7e5-6be1-4351-b3f4-032311a012ce-host\") on node \"crc\" DevicePath \"\"" Mar 09 10:18:23 crc kubenswrapper[4792]: I0309 10:18:23.676147 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8c35c7e5-6be1-4351-b3f4-032311a012ce" path="/var/lib/kubelet/pods/8c35c7e5-6be1-4351-b3f4-032311a012ce/volumes" Mar 09 10:18:23 crc kubenswrapper[4792]: I0309 10:18:23.677091 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9afa966f-2aa0-4dc4-94f4-b33884c64c65" path="/var/lib/kubelet/pods/9afa966f-2aa0-4dc4-94f4-b33884c64c65/volumes" Mar 09 10:18:23 crc kubenswrapper[4792]: I0309 10:18:23.719444 4792 scope.go:117] "RemoveContainer" containerID="f1cab2ae304e3d85093a845385d8f05b5eefe763ab956c4534c4f306dba3b2cb" Mar 09 10:18:23 crc kubenswrapper[4792]: I0309 10:18:23.719683 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-qmm66/crc-debug-dfmsr" Mar 09 10:18:43 crc kubenswrapper[4792]: I0309 10:18:43.214116 4792 patch_prober.go:28] interesting pod/machine-config-daemon-97tth container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 10:18:43 crc kubenswrapper[4792]: I0309 10:18:43.215610 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 10:18:44 crc kubenswrapper[4792]: I0309 10:18:44.323226 4792 scope.go:117] "RemoveContainer" containerID="f09c17d8d7d1d24f13768107f8ab8c0aecfc3d7b9451ed7de38cf73353c33fe1" Mar 09 10:19:13 crc kubenswrapper[4792]: I0309 10:19:13.213924 4792 patch_prober.go:28] interesting pod/machine-config-daemon-97tth container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 10:19:13 crc kubenswrapper[4792]: I0309 10:19:13.215827 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 10:19:13 crc kubenswrapper[4792]: I0309 10:19:13.215939 4792 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-97tth" Mar 09 10:19:13 crc kubenswrapper[4792]: 
I0309 10:19:13.216741 4792 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"22fb615a80b561bd4fd46d065538c5d6038e92646216db0bd28365b1dcaceca2"} pod="openshift-machine-config-operator/machine-config-daemon-97tth" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 09 10:19:13 crc kubenswrapper[4792]: I0309 10:19:13.216912 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" containerName="machine-config-daemon" containerID="cri-o://22fb615a80b561bd4fd46d065538c5d6038e92646216db0bd28365b1dcaceca2" gracePeriod=600 Mar 09 10:19:13 crc kubenswrapper[4792]: E0309 10:19:13.353551 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-97tth_openshift-machine-config-operator(bd11045a-d746-4b42-872c-8b8d1dd2d515)\"" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" Mar 09 10:19:14 crc kubenswrapper[4792]: I0309 10:19:14.195138 4792 generic.go:334] "Generic (PLEG): container finished" podID="bd11045a-d746-4b42-872c-8b8d1dd2d515" containerID="22fb615a80b561bd4fd46d065538c5d6038e92646216db0bd28365b1dcaceca2" exitCode=0 Mar 09 10:19:14 crc kubenswrapper[4792]: I0309 10:19:14.195346 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-97tth" event={"ID":"bd11045a-d746-4b42-872c-8b8d1dd2d515","Type":"ContainerDied","Data":"22fb615a80b561bd4fd46d065538c5d6038e92646216db0bd28365b1dcaceca2"} Mar 09 10:19:14 crc kubenswrapper[4792]: I0309 10:19:14.195481 4792 scope.go:117] "RemoveContainer" 
containerID="61823a1cefc0652bd0396fa1996ffc2e59e5ea4df20bbaa1b20c0c3f986988b7" Mar 09 10:19:14 crc kubenswrapper[4792]: I0309 10:19:14.196220 4792 scope.go:117] "RemoveContainer" containerID="22fb615a80b561bd4fd46d065538c5d6038e92646216db0bd28365b1dcaceca2" Mar 09 10:19:14 crc kubenswrapper[4792]: E0309 10:19:14.196466 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-97tth_openshift-machine-config-operator(bd11045a-d746-4b42-872c-8b8d1dd2d515)\"" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" Mar 09 10:19:25 crc kubenswrapper[4792]: I0309 10:19:25.035291 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-78f7c77b76-nw94r_2b4c9d79-a45e-457d-be41-ea8535f122c6/barbican-api/0.log" Mar 09 10:19:25 crc kubenswrapper[4792]: I0309 10:19:25.269547 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-78f7c77b76-nw94r_2b4c9d79-a45e-457d-be41-ea8535f122c6/barbican-api-log/0.log" Mar 09 10:19:25 crc kubenswrapper[4792]: I0309 10:19:25.378937 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-6f45884c58-b4trg_f7122066-5687-409a-9d80-f39f2d96ad84/barbican-keystone-listener/0.log" Mar 09 10:19:25 crc kubenswrapper[4792]: I0309 10:19:25.443892 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-6f45884c58-b4trg_f7122066-5687-409a-9d80-f39f2d96ad84/barbican-keystone-listener-log/0.log" Mar 09 10:19:25 crc kubenswrapper[4792]: I0309 10:19:25.615881 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-68bdcc9765-czvxc_7997db4c-9ed8-438f-86b3-558a6ed2be44/barbican-worker/0.log" Mar 09 10:19:25 crc kubenswrapper[4792]: I0309 
10:19:25.668664 4792 scope.go:117] "RemoveContainer" containerID="22fb615a80b561bd4fd46d065538c5d6038e92646216db0bd28365b1dcaceca2" Mar 09 10:19:25 crc kubenswrapper[4792]: E0309 10:19:25.669018 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-97tth_openshift-machine-config-operator(bd11045a-d746-4b42-872c-8b8d1dd2d515)\"" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" Mar 09 10:19:25 crc kubenswrapper[4792]: I0309 10:19:25.683872 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-68bdcc9765-czvxc_7997db4c-9ed8-438f-86b3-558a6ed2be44/barbican-worker-log/0.log" Mar 09 10:19:25 crc kubenswrapper[4792]: I0309 10:19:25.874888 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-vndqm_90585516-71d1-4289-8f0d-43884caee227/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Mar 09 10:19:25 crc kubenswrapper[4792]: I0309 10:19:25.932743 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_a1a37964-5fdf-4a05-bde5-750f454d2987/ceilometer-central-agent/0.log" Mar 09 10:19:25 crc kubenswrapper[4792]: I0309 10:19:25.991286 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_a1a37964-5fdf-4a05-bde5-750f454d2987/ceilometer-notification-agent/0.log" Mar 09 10:19:26 crc kubenswrapper[4792]: I0309 10:19:26.174359 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_a1a37964-5fdf-4a05-bde5-750f454d2987/sg-core/0.log" Mar 09 10:19:26 crc kubenswrapper[4792]: I0309 10:19:26.198352 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_a1a37964-5fdf-4a05-bde5-750f454d2987/proxy-httpd/0.log" Mar 09 
10:19:26 crc kubenswrapper[4792]: I0309 10:19:26.278829 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceph-client-edpm-deployment-openstack-edpm-ipam-x9rrm_e8ade593-bf65-47e0-8be9-76c8fedc40a1/ceph-client-edpm-deployment-openstack-edpm-ipam/0.log" Mar 09 10:19:26 crc kubenswrapper[4792]: I0309 10:19:26.469019 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-ckfdh_ff236fbb-03e6-4227-b10c-9dfeac266de8/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam/0.log" Mar 09 10:19:26 crc kubenswrapper[4792]: I0309 10:19:26.669701 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_d0b2c9c1-464f-4058-aa55-ce041668d8a2/cinder-api/0.log" Mar 09 10:19:26 crc kubenswrapper[4792]: I0309 10:19:26.746336 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_d0b2c9c1-464f-4058-aa55-ce041668d8a2/cinder-api-log/0.log" Mar 09 10:19:26 crc kubenswrapper[4792]: I0309 10:19:26.997126 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_2eaee6b3-8397-430c-b799-7628762d1701/cinder-backup/0.log" Mar 09 10:19:27 crc kubenswrapper[4792]: I0309 10:19:27.037953 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_2eaee6b3-8397-430c-b799-7628762d1701/probe/0.log" Mar 09 10:19:27 crc kubenswrapper[4792]: I0309 10:19:27.126609 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_c8c68c2d-fe77-41af-b4f4-8f83079bf316/cinder-scheduler/0.log" Mar 09 10:19:27 crc kubenswrapper[4792]: I0309 10:19:27.317339 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_c8c68c2d-fe77-41af-b4f4-8f83079bf316/probe/0.log" Mar 09 10:19:27 crc kubenswrapper[4792]: I0309 10:19:27.397554 4792 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_cinder-volume-volume1-0_61d989fe-045d-4c58-b660-f9d0e1a482f9/cinder-volume/0.log" Mar 09 10:19:27 crc kubenswrapper[4792]: I0309 10:19:27.439571 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_61d989fe-045d-4c58-b660-f9d0e1a482f9/probe/0.log" Mar 09 10:19:27 crc kubenswrapper[4792]: I0309 10:19:27.626885 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-nhjgt_1bedab89-65bc-478a-abf1-3e3429951e71/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Mar 09 10:19:27 crc kubenswrapper[4792]: I0309 10:19:27.703873 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-7r8kk_1bb0a43c-0d2a-4d09-8f65-4a9e9aa048c2/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 09 10:19:27 crc kubenswrapper[4792]: I0309 10:19:27.879933 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-86c6bdcc4c-q4zw6_f4a082c9-9e44-4d1b-b361-3fe4af72fbe9/init/0.log" Mar 09 10:19:28 crc kubenswrapper[4792]: I0309 10:19:28.419384 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-86c6bdcc4c-q4zw6_f4a082c9-9e44-4d1b-b361-3fe4af72fbe9/init/0.log" Mar 09 10:19:28 crc kubenswrapper[4792]: I0309 10:19:28.442543 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-86c6bdcc4c-q4zw6_f4a082c9-9e44-4d1b-b361-3fe4af72fbe9/dnsmasq-dns/0.log" Mar 09 10:19:28 crc kubenswrapper[4792]: I0309 10:19:28.457777 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_2fc0a824-2dfc-436e-ad6e-c0751afcb61f/glance-httpd/0.log" Mar 09 10:19:28 crc kubenswrapper[4792]: I0309 10:19:28.655678 4792 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_glance-default-external-api-0_2fc0a824-2dfc-436e-ad6e-c0751afcb61f/glance-log/0.log" Mar 09 10:19:28 crc kubenswrapper[4792]: I0309 10:19:28.698264 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_5f4e1727-ddeb-4d56-8fc8-005b7b9c1b3c/glance-log/0.log" Mar 09 10:19:28 crc kubenswrapper[4792]: I0309 10:19:28.745771 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_5f4e1727-ddeb-4d56-8fc8-005b7b9c1b3c/glance-httpd/0.log" Mar 09 10:19:29 crc kubenswrapper[4792]: I0309 10:19:29.126846 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-54c85f748d-wxdlf_d028a70e-dd9d-4b38-bb18-4cd55cd002fe/horizon/1.log" Mar 09 10:19:29 crc kubenswrapper[4792]: I0309 10:19:29.234391 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-54c85f748d-wxdlf_d028a70e-dd9d-4b38-bb18-4cd55cd002fe/horizon-log/0.log" Mar 09 10:19:29 crc kubenswrapper[4792]: I0309 10:19:29.256885 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-54c85f748d-wxdlf_d028a70e-dd9d-4b38-bb18-4cd55cd002fe/horizon/0.log" Mar 09 10:19:29 crc kubenswrapper[4792]: I0309 10:19:29.436522 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-7qdcq_9ff0f9ff-023a-4679-a084-1d4ae368e02d/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Mar 09 10:19:29 crc kubenswrapper[4792]: I0309 10:19:29.556223 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-xwl2x_58614884-08cd-4ea5-b45e-45a6157f16aa/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 09 10:19:29 crc kubenswrapper[4792]: I0309 10:19:29.885125 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-5b488b889c-ks9th_063f2c66-7712-4aff-a002-fccc2821c91a/keystone-api/0.log" Mar 
09 10:19:29 crc kubenswrapper[4792]: I0309 10:19:29.932178 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29550841-fkrc7_b83ae2a5-e733-497b-a5de-56d3a962dec5/keystone-cron/0.log" Mar 09 10:19:30 crc kubenswrapper[4792]: I0309 10:19:30.098927 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_da658112-b6e4-4493-a46a-0add09e299f6/kube-state-metrics/0.log" Mar 09 10:19:30 crc kubenswrapper[4792]: I0309 10:19:30.208260 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-4sx54_047ab0a5-633d-4731-a534-fd2db3b65b43/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Mar 09 10:19:30 crc kubenswrapper[4792]: I0309 10:19:30.441558 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-api-0_74c28e16-49a6-429a-9a95-ae4a07e9cb8e/manila-api-log/0.log" Mar 09 10:19:30 crc kubenswrapper[4792]: I0309 10:19:30.526275 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-api-0_74c28e16-49a6-429a-9a95-ae4a07e9cb8e/manila-api/0.log" Mar 09 10:19:30 crc kubenswrapper[4792]: I0309 10:19:30.739750 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-scheduler-0_87ea7216-a1e4-47b3-8303-d2af1a68f974/probe/0.log" Mar 09 10:19:30 crc kubenswrapper[4792]: I0309 10:19:30.838017 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-scheduler-0_87ea7216-a1e4-47b3-8303-d2af1a68f974/manila-scheduler/0.log" Mar 09 10:19:30 crc kubenswrapper[4792]: I0309 10:19:30.979032 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-share-share1-0_49c2b09f-818f-461b-9ebd-bc43d6e268c6/manila-share/0.log" Mar 09 10:19:31 crc kubenswrapper[4792]: I0309 10:19:31.575268 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-share-share1-0_49c2b09f-818f-461b-9ebd-bc43d6e268c6/probe/0.log" Mar 09 10:19:31 crc 
kubenswrapper[4792]: I0309 10:19:31.944321 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-6cbbcf5c8f-spnsr_4ad68345-e440-498d-a525-014a7db81ea6/neutron-api/0.log" Mar 09 10:19:32 crc kubenswrapper[4792]: I0309 10:19:32.174018 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-fb88h_2e835834-d2ca-414a-b567-8364c4b208e5/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Mar 09 10:19:32 crc kubenswrapper[4792]: I0309 10:19:32.330597 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-6cbbcf5c8f-spnsr_4ad68345-e440-498d-a525-014a7db81ea6/neutron-httpd/0.log" Mar 09 10:19:32 crc kubenswrapper[4792]: I0309 10:19:32.866871 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_33f67afd-db61-4209-b505-8ec8edcabfc1/nova-cell0-conductor-conductor/0.log" Mar 09 10:19:32 crc kubenswrapper[4792]: I0309 10:19:32.923423 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_882a66ed-9e4e-4501-90ab-a600db85728a/nova-api-log/0.log" Mar 09 10:19:33 crc kubenswrapper[4792]: I0309 10:19:33.194656 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_882a66ed-9e4e-4501-90ab-a600db85728a/nova-api-api/0.log" Mar 09 10:19:33 crc kubenswrapper[4792]: I0309 10:19:33.667514 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_8adff3b6-586a-445f-adee-c3f412b874c0/nova-cell1-novncproxy-novncproxy/0.log" Mar 09 10:19:33 crc kubenswrapper[4792]: I0309 10:19:33.722199 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_65141e32-9490-4e93-9338-c9878770172e/nova-cell1-conductor-conductor/0.log" Mar 09 10:19:34 crc kubenswrapper[4792]: I0309 10:19:34.022198 4792 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-fh9gr_50f74681-04e5-49c7-9d32-1e8841867bcb/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam/0.log" Mar 09 10:19:34 crc kubenswrapper[4792]: I0309 10:19:34.195537 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_ba905c80-a1c9-4e8b-9d19-965d91ffb934/nova-metadata-log/0.log" Mar 09 10:19:34 crc kubenswrapper[4792]: I0309 10:19:34.656912 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_de087a24-d54a-442c-8cbe-2cbe653c4343/nova-scheduler-scheduler/0.log" Mar 09 10:19:34 crc kubenswrapper[4792]: I0309 10:19:34.739492 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_1731fe55-4bf2-4410-85f9-58124ed652c9/mysql-bootstrap/0.log" Mar 09 10:19:35 crc kubenswrapper[4792]: I0309 10:19:35.001926 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_1731fe55-4bf2-4410-85f9-58124ed652c9/mysql-bootstrap/0.log" Mar 09 10:19:35 crc kubenswrapper[4792]: I0309 10:19:35.069812 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_1731fe55-4bf2-4410-85f9-58124ed652c9/galera/0.log" Mar 09 10:19:35 crc kubenswrapper[4792]: I0309 10:19:35.257355 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_7dd0ce66-42bf-4c00-8e99-3c58defcc87f/mysql-bootstrap/0.log" Mar 09 10:19:35 crc kubenswrapper[4792]: I0309 10:19:35.652631 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_7dd0ce66-42bf-4c00-8e99-3c58defcc87f/galera/0.log" Mar 09 10:19:35 crc kubenswrapper[4792]: I0309 10:19:35.692207 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_7dd0ce66-42bf-4c00-8e99-3c58defcc87f/mysql-bootstrap/0.log" Mar 09 10:19:35 crc kubenswrapper[4792]: I0309 10:19:35.819429 4792 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_ba905c80-a1c9-4e8b-9d19-965d91ffb934/nova-metadata-metadata/0.log" Mar 09 10:19:35 crc kubenswrapper[4792]: I0309 10:19:35.975042 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_09fc64e5-4201-410d-a764-789e1dc85ac0/openstackclient/0.log" Mar 09 10:19:35 crc kubenswrapper[4792]: I0309 10:19:35.992995 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-kj9d8_438d928b-7565-4fe1-a005-2c6402835edf/ovn-controller/0.log" Mar 09 10:19:36 crc kubenswrapper[4792]: I0309 10:19:36.339704 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-gw65t_2fd40118-2613-4e01-a557-f7fc5f24e07c/ovsdb-server-init/0.log" Mar 09 10:19:36 crc kubenswrapper[4792]: I0309 10:19:36.457663 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-mxwdc_9b94bbb1-5f6b-40c1-96b1-66a228166d91/openstack-network-exporter/0.log" Mar 09 10:19:36 crc kubenswrapper[4792]: I0309 10:19:36.704324 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-gw65t_2fd40118-2613-4e01-a557-f7fc5f24e07c/ovs-vswitchd/0.log" Mar 09 10:19:36 crc kubenswrapper[4792]: I0309 10:19:36.746150 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-gw65t_2fd40118-2613-4e01-a557-f7fc5f24e07c/ovsdb-server-init/0.log" Mar 09 10:19:36 crc kubenswrapper[4792]: I0309 10:19:36.857912 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-gw65t_2fd40118-2613-4e01-a557-f7fc5f24e07c/ovsdb-server/0.log" Mar 09 10:19:37 crc kubenswrapper[4792]: I0309 10:19:37.144541 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-7c2rh_c5a32778-1a93-440b-9f56-d0bded50a725/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Mar 09 10:19:37 crc 
kubenswrapper[4792]: I0309 10:19:37.147864 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_58b87887-c8d6-4658-9f0e-3d94f414c14c/openstack-network-exporter/0.log" Mar 09 10:19:37 crc kubenswrapper[4792]: I0309 10:19:37.248630 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_58b87887-c8d6-4658-9f0e-3d94f414c14c/ovn-northd/0.log" Mar 09 10:19:37 crc kubenswrapper[4792]: I0309 10:19:37.428808 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_02a198ef-826d-49cf-a6c5-134da45ad28b/openstack-network-exporter/0.log" Mar 09 10:19:37 crc kubenswrapper[4792]: I0309 10:19:37.457371 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_02a198ef-826d-49cf-a6c5-134da45ad28b/ovsdbserver-nb/0.log" Mar 09 10:19:37 crc kubenswrapper[4792]: I0309 10:19:37.661944 4792 scope.go:117] "RemoveContainer" containerID="22fb615a80b561bd4fd46d065538c5d6038e92646216db0bd28365b1dcaceca2" Mar 09 10:19:37 crc kubenswrapper[4792]: E0309 10:19:37.662501 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-97tth_openshift-machine-config-operator(bd11045a-d746-4b42-872c-8b8d1dd2d515)\"" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" Mar 09 10:19:37 crc kubenswrapper[4792]: I0309 10:19:37.802416 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_b99fdd60-0b01-4b3e-ad0b-0f32f7427f48/openstack-network-exporter/0.log" Mar 09 10:19:37 crc kubenswrapper[4792]: I0309 10:19:37.815407 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_b99fdd60-0b01-4b3e-ad0b-0f32f7427f48/ovsdbserver-sb/0.log" Mar 09 10:19:38 crc kubenswrapper[4792]: 
I0309 10:19:38.133875 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-6d6ff87dd6-6wzmx_52f7c11a-3099-487b-9126-fd90d1db1aaa/placement-api/0.log" Mar 09 10:19:38 crc kubenswrapper[4792]: I0309 10:19:38.293967 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_a20da79f-1b2b-4d52-bf44-4c6a9bf0f210/setup-container/0.log" Mar 09 10:19:38 crc kubenswrapper[4792]: I0309 10:19:38.333499 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-6d6ff87dd6-6wzmx_52f7c11a-3099-487b-9126-fd90d1db1aaa/placement-log/0.log" Mar 09 10:19:38 crc kubenswrapper[4792]: I0309 10:19:38.560096 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_a20da79f-1b2b-4d52-bf44-4c6a9bf0f210/rabbitmq/0.log" Mar 09 10:19:38 crc kubenswrapper[4792]: I0309 10:19:38.575427 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_a20da79f-1b2b-4d52-bf44-4c6a9bf0f210/setup-container/0.log" Mar 09 10:19:38 crc kubenswrapper[4792]: I0309 10:19:38.694327 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_6a994be4-9a88-4ee6-8e24-a6d62898f593/setup-container/0.log" Mar 09 10:19:38 crc kubenswrapper[4792]: I0309 10:19:38.870341 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_6a994be4-9a88-4ee6-8e24-a6d62898f593/setup-container/0.log" Mar 09 10:19:38 crc kubenswrapper[4792]: I0309 10:19:38.966431 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_6a994be4-9a88-4ee6-8e24-a6d62898f593/rabbitmq/0.log" Mar 09 10:19:39 crc kubenswrapper[4792]: I0309 10:19:39.043784 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-9b8w5_f84f1271-7155-48fc-a6f0-d1777cb75ac5/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 09 10:19:39 crc 
kubenswrapper[4792]: I0309 10:19:39.335430 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-fnr4l_bdb1fadb-a5cc-48c4-b3e4-77f0af224e2a/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Mar 09 10:19:39 crc kubenswrapper[4792]: I0309 10:19:39.427601 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-67gl6_afd25149-8416-4a5c-a84a-b63961a5e1f9/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 09 10:19:39 crc kubenswrapper[4792]: I0309 10:19:39.707685 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-jzwbl_48ec7142-4c09-4a5f-8202-aaf16bb97b26/ssh-known-hosts-edpm-deployment/0.log" Mar 09 10:19:39 crc kubenswrapper[4792]: I0309 10:19:39.791063 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_152f601f-0625-4503-a057-26316d8504aa/tempest-tests-tempest-tests-runner/0.log" Mar 09 10:19:40 crc kubenswrapper[4792]: I0309 10:19:40.017200 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_69889849-285a-4c47-a955-3e681990d59e/test-operator-logs-container/0.log" Mar 09 10:19:40 crc kubenswrapper[4792]: I0309 10:19:40.184088 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-vwq86_c83b88c6-39ae-4077-84b9-e10f71a53d6e/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Mar 09 10:19:48 crc kubenswrapper[4792]: I0309 10:19:48.663348 4792 scope.go:117] "RemoveContainer" containerID="22fb615a80b561bd4fd46d065538c5d6038e92646216db0bd28365b1dcaceca2" Mar 09 10:19:48 crc kubenswrapper[4792]: E0309 10:19:48.664120 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-97tth_openshift-machine-config-operator(bd11045a-d746-4b42-872c-8b8d1dd2d515)\"" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" Mar 09 10:19:51 crc kubenswrapper[4792]: I0309 10:19:51.958926 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_22afdfd4-ea58-4efb-b316-bcb40c906952/memcached/0.log" Mar 09 10:20:00 crc kubenswrapper[4792]: I0309 10:20:00.140365 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550860-mf978"] Mar 09 10:20:00 crc kubenswrapper[4792]: E0309 10:20:00.142292 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c35c7e5-6be1-4351-b3f4-032311a012ce" containerName="container-00" Mar 09 10:20:00 crc kubenswrapper[4792]: I0309 10:20:00.142407 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c35c7e5-6be1-4351-b3f4-032311a012ce" containerName="container-00" Mar 09 10:20:00 crc kubenswrapper[4792]: E0309 10:20:00.142517 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9afa966f-2aa0-4dc4-94f4-b33884c64c65" containerName="extract-content" Mar 09 10:20:00 crc kubenswrapper[4792]: I0309 10:20:00.142597 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="9afa966f-2aa0-4dc4-94f4-b33884c64c65" containerName="extract-content" Mar 09 10:20:00 crc kubenswrapper[4792]: E0309 10:20:00.142670 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9afa966f-2aa0-4dc4-94f4-b33884c64c65" containerName="extract-utilities" Mar 09 10:20:00 crc kubenswrapper[4792]: I0309 10:20:00.142733 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="9afa966f-2aa0-4dc4-94f4-b33884c64c65" containerName="extract-utilities" Mar 09 10:20:00 crc kubenswrapper[4792]: E0309 10:20:00.142798 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9afa966f-2aa0-4dc4-94f4-b33884c64c65" 
containerName="registry-server" Mar 09 10:20:00 crc kubenswrapper[4792]: I0309 10:20:00.142854 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="9afa966f-2aa0-4dc4-94f4-b33884c64c65" containerName="registry-server" Mar 09 10:20:00 crc kubenswrapper[4792]: I0309 10:20:00.143122 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c35c7e5-6be1-4351-b3f4-032311a012ce" containerName="container-00" Mar 09 10:20:00 crc kubenswrapper[4792]: I0309 10:20:00.143258 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="9afa966f-2aa0-4dc4-94f4-b33884c64c65" containerName="registry-server" Mar 09 10:20:00 crc kubenswrapper[4792]: I0309 10:20:00.144907 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550860-mf978" Mar 09 10:20:00 crc kubenswrapper[4792]: I0309 10:20:00.147681 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 10:20:00 crc kubenswrapper[4792]: I0309 10:20:00.147693 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fwclj" Mar 09 10:20:00 crc kubenswrapper[4792]: I0309 10:20:00.154231 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550860-mf978"] Mar 09 10:20:00 crc kubenswrapper[4792]: I0309 10:20:00.156712 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 10:20:00 crc kubenswrapper[4792]: I0309 10:20:00.251692 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6zdsw\" (UniqueName: \"kubernetes.io/projected/6306ab16-18db-41a2-97f5-4d56712f91c2-kube-api-access-6zdsw\") pod \"auto-csr-approver-29550860-mf978\" (UID: \"6306ab16-18db-41a2-97f5-4d56712f91c2\") " pod="openshift-infra/auto-csr-approver-29550860-mf978" Mar 09 10:20:00 crc 
kubenswrapper[4792]: I0309 10:20:00.353896 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6zdsw\" (UniqueName: \"kubernetes.io/projected/6306ab16-18db-41a2-97f5-4d56712f91c2-kube-api-access-6zdsw\") pod \"auto-csr-approver-29550860-mf978\" (UID: \"6306ab16-18db-41a2-97f5-4d56712f91c2\") " pod="openshift-infra/auto-csr-approver-29550860-mf978" Mar 09 10:20:00 crc kubenswrapper[4792]: I0309 10:20:00.374530 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6zdsw\" (UniqueName: \"kubernetes.io/projected/6306ab16-18db-41a2-97f5-4d56712f91c2-kube-api-access-6zdsw\") pod \"auto-csr-approver-29550860-mf978\" (UID: \"6306ab16-18db-41a2-97f5-4d56712f91c2\") " pod="openshift-infra/auto-csr-approver-29550860-mf978" Mar 09 10:20:00 crc kubenswrapper[4792]: I0309 10:20:00.474882 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550860-mf978" Mar 09 10:20:00 crc kubenswrapper[4792]: I0309 10:20:00.980143 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550860-mf978"] Mar 09 10:20:01 crc kubenswrapper[4792]: I0309 10:20:01.682761 4792 scope.go:117] "RemoveContainer" containerID="22fb615a80b561bd4fd46d065538c5d6038e92646216db0bd28365b1dcaceca2" Mar 09 10:20:01 crc kubenswrapper[4792]: E0309 10:20:01.685244 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-97tth_openshift-machine-config-operator(bd11045a-d746-4b42-872c-8b8d1dd2d515)\"" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" Mar 09 10:20:01 crc kubenswrapper[4792]: I0309 10:20:01.686749 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-infra/auto-csr-approver-29550860-mf978" event={"ID":"6306ab16-18db-41a2-97f5-4d56712f91c2","Type":"ContainerStarted","Data":"0d7ef3205fe76c38a056f196bac9a6ab47e5004bf4357493c582318cb26cb403"} Mar 09 10:20:03 crc kubenswrapper[4792]: I0309 10:20:03.691878 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550860-mf978" event={"ID":"6306ab16-18db-41a2-97f5-4d56712f91c2","Type":"ContainerStarted","Data":"adb11d98d9f9d769b445e4926cbc174e35edbb3cf94a6cdc17240ec02b4807f7"} Mar 09 10:20:03 crc kubenswrapper[4792]: I0309 10:20:03.711342 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29550860-mf978" podStartSLOduration=1.785976755 podStartE2EDuration="3.711322555s" podCreationTimestamp="2026-03-09 10:20:00 +0000 UTC" firstStartedPulling="2026-03-09 10:20:01.365893456 +0000 UTC m=+4366.396094208" lastFinishedPulling="2026-03-09 10:20:03.291239256 +0000 UTC m=+4368.321440008" observedRunningTime="2026-03-09 10:20:03.707475572 +0000 UTC m=+4368.737676354" watchObservedRunningTime="2026-03-09 10:20:03.711322555 +0000 UTC m=+4368.741523307" Mar 09 10:20:04 crc kubenswrapper[4792]: I0309 10:20:04.704667 4792 generic.go:334] "Generic (PLEG): container finished" podID="6306ab16-18db-41a2-97f5-4d56712f91c2" containerID="adb11d98d9f9d769b445e4926cbc174e35edbb3cf94a6cdc17240ec02b4807f7" exitCode=0 Mar 09 10:20:04 crc kubenswrapper[4792]: I0309 10:20:04.704746 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550860-mf978" event={"ID":"6306ab16-18db-41a2-97f5-4d56712f91c2","Type":"ContainerDied","Data":"adb11d98d9f9d769b445e4926cbc174e35edbb3cf94a6cdc17240ec02b4807f7"} Mar 09 10:20:06 crc kubenswrapper[4792]: I0309 10:20:06.040956 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550860-mf978" Mar 09 10:20:06 crc kubenswrapper[4792]: I0309 10:20:06.176116 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6zdsw\" (UniqueName: \"kubernetes.io/projected/6306ab16-18db-41a2-97f5-4d56712f91c2-kube-api-access-6zdsw\") pod \"6306ab16-18db-41a2-97f5-4d56712f91c2\" (UID: \"6306ab16-18db-41a2-97f5-4d56712f91c2\") " Mar 09 10:20:06 crc kubenswrapper[4792]: I0309 10:20:06.185682 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6306ab16-18db-41a2-97f5-4d56712f91c2-kube-api-access-6zdsw" (OuterVolumeSpecName: "kube-api-access-6zdsw") pod "6306ab16-18db-41a2-97f5-4d56712f91c2" (UID: "6306ab16-18db-41a2-97f5-4d56712f91c2"). InnerVolumeSpecName "kube-api-access-6zdsw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 10:20:06 crc kubenswrapper[4792]: I0309 10:20:06.281618 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6zdsw\" (UniqueName: \"kubernetes.io/projected/6306ab16-18db-41a2-97f5-4d56712f91c2-kube-api-access-6zdsw\") on node \"crc\" DevicePath \"\"" Mar 09 10:20:06 crc kubenswrapper[4792]: I0309 10:20:06.723804 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550860-mf978" event={"ID":"6306ab16-18db-41a2-97f5-4d56712f91c2","Type":"ContainerDied","Data":"0d7ef3205fe76c38a056f196bac9a6ab47e5004bf4357493c582318cb26cb403"} Mar 09 10:20:06 crc kubenswrapper[4792]: I0309 10:20:06.723840 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0d7ef3205fe76c38a056f196bac9a6ab47e5004bf4357493c582318cb26cb403" Mar 09 10:20:06 crc kubenswrapper[4792]: I0309 10:20:06.724201 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550860-mf978" Mar 09 10:20:06 crc kubenswrapper[4792]: I0309 10:20:06.789784 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550854-4h7rq"] Mar 09 10:20:06 crc kubenswrapper[4792]: I0309 10:20:06.799335 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550854-4h7rq"] Mar 09 10:20:07 crc kubenswrapper[4792]: I0309 10:20:07.677104 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="919cdc0a-fbb2-4e1e-9849-501d2191f074" path="/var/lib/kubelet/pods/919cdc0a-fbb2-4e1e-9849-501d2191f074/volumes" Mar 09 10:20:13 crc kubenswrapper[4792]: I0309 10:20:13.665417 4792 scope.go:117] "RemoveContainer" containerID="22fb615a80b561bd4fd46d065538c5d6038e92646216db0bd28365b1dcaceca2" Mar 09 10:20:13 crc kubenswrapper[4792]: E0309 10:20:13.666265 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-97tth_openshift-machine-config-operator(bd11045a-d746-4b42-872c-8b8d1dd2d515)\"" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" Mar 09 10:20:14 crc kubenswrapper[4792]: I0309 10:20:14.205494 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-5d87c9d997-r5d6p_20b2fb83-c944-4553-b506-9ff3c9c199f5/manager/0.log" Mar 09 10:20:14 crc kubenswrapper[4792]: I0309 10:20:14.533384 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ff3fd74506e7f9e2c6e8ba7181b4867be2b110d904ba1aef106c523cfadtbcp_3412edec-dc99-4713-b6bf-cebdace9f6a6/util/0.log" Mar 09 10:20:14 crc kubenswrapper[4792]: I0309 10:20:14.921058 4792 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_ff3fd74506e7f9e2c6e8ba7181b4867be2b110d904ba1aef106c523cfadtbcp_3412edec-dc99-4713-b6bf-cebdace9f6a6/util/0.log" Mar 09 10:20:14 crc kubenswrapper[4792]: I0309 10:20:14.939940 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ff3fd74506e7f9e2c6e8ba7181b4867be2b110d904ba1aef106c523cfadtbcp_3412edec-dc99-4713-b6bf-cebdace9f6a6/pull/0.log" Mar 09 10:20:15 crc kubenswrapper[4792]: I0309 10:20:15.257800 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ff3fd74506e7f9e2c6e8ba7181b4867be2b110d904ba1aef106c523cfadtbcp_3412edec-dc99-4713-b6bf-cebdace9f6a6/pull/0.log" Mar 09 10:20:15 crc kubenswrapper[4792]: I0309 10:20:15.489462 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ff3fd74506e7f9e2c6e8ba7181b4867be2b110d904ba1aef106c523cfadtbcp_3412edec-dc99-4713-b6bf-cebdace9f6a6/util/0.log" Mar 09 10:20:15 crc kubenswrapper[4792]: I0309 10:20:15.506439 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ff3fd74506e7f9e2c6e8ba7181b4867be2b110d904ba1aef106c523cfadtbcp_3412edec-dc99-4713-b6bf-cebdace9f6a6/pull/0.log" Mar 09 10:20:15 crc kubenswrapper[4792]: I0309 10:20:15.797387 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ff3fd74506e7f9e2c6e8ba7181b4867be2b110d904ba1aef106c523cfadtbcp_3412edec-dc99-4713-b6bf-cebdace9f6a6/extract/0.log" Mar 09 10:20:16 crc kubenswrapper[4792]: I0309 10:20:16.173721 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-64db6967f8-tfz6b_b74999f3-cb46-4b35-a70f-71977b54d944/manager/0.log" Mar 09 10:20:16 crc kubenswrapper[4792]: I0309 10:20:16.347287 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-cf99c678f-lw2kp_b1140422-6cf3-4e92-95e2-6ea31179de28/manager/0.log" Mar 09 10:20:16 crc kubenswrapper[4792]: 
I0309 10:20:16.361987 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-55d77d7b5c-7fdsl_89b0f1f9-11f1-4d01-a2b8-ca2f1fae3bb2/manager/0.log" Mar 09 10:20:16 crc kubenswrapper[4792]: I0309 10:20:16.688198 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-78bc7f9bd9-jhmcx_98ba9a2a-30d6-45f2-af47-2994c292fe05/manager/0.log" Mar 09 10:20:17 crc kubenswrapper[4792]: I0309 10:20:17.123215 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-f7fcc58b9-swtlr_fe547e1c-cb50-4541-b867-5154dae69ec3/manager/0.log" Mar 09 10:20:17 crc kubenswrapper[4792]: I0309 10:20:17.128745 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-545456dc4-x7n9b_ac60ffe8-71d2-4ea1-bbc5-d377fc70d940/manager/0.log" Mar 09 10:20:17 crc kubenswrapper[4792]: I0309 10:20:17.570021 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7c789f89c6-dktrj_55f715a3-ef6e-40d8-9f9b-3100b2847b8d/manager/0.log" Mar 09 10:20:17 crc kubenswrapper[4792]: I0309 10:20:17.677248 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-67d996989d-4775c_2c678a62-a744-4384-8403-618b566ed91e/manager/0.log" Mar 09 10:20:17 crc kubenswrapper[4792]: I0309 10:20:17.928545 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-7b6bfb6475-ckrbc_e27b7b35-b064-4e02-99e6-cb34af5ff0e9/manager/0.log" Mar 09 10:20:18 crc kubenswrapper[4792]: I0309 10:20:18.247676 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-54688575f-5k4db_8fd39edc-ff27-4feb-b138-ee11a440c0ca/manager/0.log" Mar 09 10:20:18 crc 
kubenswrapper[4792]: I0309 10:20:18.519619 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-74b6b5dc96-dpvjg_9063ee68-9840-4f35-8d4d-44ab947477d5/manager/0.log" Mar 09 10:20:18 crc kubenswrapper[4792]: I0309 10:20:18.632842 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-5d86c7ddb7-r44dt_c28488b2-919b-4307-9a70-b2f5f1280e2a/manager/0.log" Mar 09 10:20:18 crc kubenswrapper[4792]: I0309 10:20:18.891010 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-6f64bd8c755s6r4_9ca7aa92-3367-4c2e-a86e-33ba41fe81cb/manager/0.log" Mar 09 10:20:19 crc kubenswrapper[4792]: I0309 10:20:19.204501 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-568b7cf6db-hz254_03eb7926-dd55-4d02-a695-5abcb5a02cdc/operator/0.log" Mar 09 10:20:19 crc kubenswrapper[4792]: I0309 10:20:19.783294 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-rjx9b_93e20f26-20b1-409a-8663-61cd1a7a71d3/registry-server/0.log" Mar 09 10:20:19 crc kubenswrapper[4792]: I0309 10:20:19.857788 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-75684d597f-8vt8s_82689eba-1f75-4e2e-8c27-a5b90e2805af/manager/0.log" Mar 09 10:20:20 crc kubenswrapper[4792]: I0309 10:20:20.142047 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-648564c9fc-z5tts_d53acf43-fee2-4bdf-9cdb-883641a56d48/manager/0.log" Mar 09 10:20:20 crc kubenswrapper[4792]: I0309 10:20:20.196795 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-kjtwp_92a6c902-5189-421e-b1a1-ed3e64e7bca4/operator/0.log" Mar 09 
10:20:21 crc kubenswrapper[4792]: I0309 10:20:21.085279 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-9b9ff9f4d-vhf7p_533287c3-78f0-46ea-baa9-fafb1ce7615b/manager/0.log" Mar 09 10:20:21 crc kubenswrapper[4792]: I0309 10:20:21.541982 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-55b5ff4dbb-z4lgh_e56405f7-7121-4d52-b276-3feeddabd667/manager/0.log" Mar 09 10:20:21 crc kubenswrapper[4792]: I0309 10:20:21.645138 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-5fdb694969-mzpqx_d4313901-b530-42e8-a975-d21aefbc0506/manager/0.log" Mar 09 10:20:21 crc kubenswrapper[4792]: I0309 10:20:21.803393 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-bccc79885-vj8ds_41f3c31e-77a7-4912-a933-04b32c0db0dc/manager/0.log" Mar 09 10:20:21 crc kubenswrapper[4792]: I0309 10:20:21.833782 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-59b6c9788f-qh4rf_e42c0d5f-7c0c-420f-a14b-59316b524101/manager/0.log" Mar 09 10:20:24 crc kubenswrapper[4792]: I0309 10:20:24.735887 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-6db6876945-9jfp7_d9dc8da2-0584-4db0-ad3a-f1c59c2f6028/manager/0.log" Mar 09 10:20:25 crc kubenswrapper[4792]: I0309 10:20:25.669147 4792 scope.go:117] "RemoveContainer" containerID="22fb615a80b561bd4fd46d065538c5d6038e92646216db0bd28365b1dcaceca2" Mar 09 10:20:25 crc kubenswrapper[4792]: E0309 10:20:25.669503 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-97tth_openshift-machine-config-operator(bd11045a-d746-4b42-872c-8b8d1dd2d515)\"" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" Mar 09 10:20:38 crc kubenswrapper[4792]: I0309 10:20:38.663379 4792 scope.go:117] "RemoveContainer" containerID="22fb615a80b561bd4fd46d065538c5d6038e92646216db0bd28365b1dcaceca2" Mar 09 10:20:38 crc kubenswrapper[4792]: E0309 10:20:38.664766 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-97tth_openshift-machine-config-operator(bd11045a-d746-4b42-872c-8b8d1dd2d515)\"" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" Mar 09 10:20:44 crc kubenswrapper[4792]: I0309 10:20:44.467234 4792 scope.go:117] "RemoveContainer" containerID="d95914f38a9b5989724b30912d9d74e9e6221c985a6bd54510911176f25f7bf3" Mar 09 10:20:45 crc kubenswrapper[4792]: I0309 10:20:45.197757 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-fzxrs_f24bba0a-6535-4ad8-8aa7-86a71a268334/control-plane-machine-set-operator/0.log" Mar 09 10:20:45 crc kubenswrapper[4792]: I0309 10:20:45.397517 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-56b7z_fc2b2079-7189-4ca3-b398-2a1146b9c70f/kube-rbac-proxy/0.log" Mar 09 10:20:45 crc kubenswrapper[4792]: I0309 10:20:45.451203 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-56b7z_fc2b2079-7189-4ca3-b398-2a1146b9c70f/machine-api-operator/0.log" Mar 09 10:20:53 crc kubenswrapper[4792]: I0309 10:20:53.664505 4792 scope.go:117] "RemoveContainer" 
containerID="22fb615a80b561bd4fd46d065538c5d6038e92646216db0bd28365b1dcaceca2" Mar 09 10:20:53 crc kubenswrapper[4792]: E0309 10:20:53.665565 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-97tth_openshift-machine-config-operator(bd11045a-d746-4b42-872c-8b8d1dd2d515)\"" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" Mar 09 10:20:58 crc kubenswrapper[4792]: I0309 10:20:58.979523 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-858654f9db-72tj7_a6ad9459-0185-47b2-aebd-5a5a40554946/cert-manager-controller/0.log" Mar 09 10:20:59 crc kubenswrapper[4792]: I0309 10:20:59.118480 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-xfgbm_8c00ef29-8d91-4045-982c-8b4a6e98576b/cert-manager-cainjector/0.log" Mar 09 10:20:59 crc kubenswrapper[4792]: I0309 10:20:59.220687 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-7cft4_68e071be-fad9-4996-a83f-cd58058fe0f3/cert-manager-webhook/0.log" Mar 09 10:21:05 crc kubenswrapper[4792]: I0309 10:21:05.669546 4792 scope.go:117] "RemoveContainer" containerID="22fb615a80b561bd4fd46d065538c5d6038e92646216db0bd28365b1dcaceca2" Mar 09 10:21:05 crc kubenswrapper[4792]: E0309 10:21:05.670335 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-97tth_openshift-machine-config-operator(bd11045a-d746-4b42-872c-8b8d1dd2d515)\"" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" Mar 09 10:21:14 crc 
kubenswrapper[4792]: I0309 10:21:14.408105 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-5dcbbd79cf-5hgb5_dffb3a22-ee53-4b05-921e-bf92456a5518/nmstate-console-plugin/0.log" Mar 09 10:21:14 crc kubenswrapper[4792]: I0309 10:21:14.440961 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-5bwpq_7370c580-bd4f-4659-8fe6-79d9f8b31c05/nmstate-handler/0.log" Mar 09 10:21:14 crc kubenswrapper[4792]: I0309 10:21:14.601284 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-69594cc75-rwpn2_0681a6fd-5531-4a3c-b2d8-59dfecd186c2/kube-rbac-proxy/0.log" Mar 09 10:21:14 crc kubenswrapper[4792]: I0309 10:21:14.640869 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-69594cc75-rwpn2_0681a6fd-5531-4a3c-b2d8-59dfecd186c2/nmstate-metrics/0.log" Mar 09 10:21:14 crc kubenswrapper[4792]: I0309 10:21:14.777066 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-75c5dccd6c-rqfwt_c08f74f8-f8d6-48e8-bde0-9369c92969b0/nmstate-operator/0.log" Mar 09 10:21:14 crc kubenswrapper[4792]: I0309 10:21:14.921532 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-786f45cff4-bsxt5_a8fdeb8b-8024-4916-b835-83a6da0b4ced/nmstate-webhook/0.log" Mar 09 10:21:19 crc kubenswrapper[4792]: I0309 10:21:19.664388 4792 scope.go:117] "RemoveContainer" containerID="22fb615a80b561bd4fd46d065538c5d6038e92646216db0bd28365b1dcaceca2" Mar 09 10:21:19 crc kubenswrapper[4792]: E0309 10:21:19.665184 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-97tth_openshift-machine-config-operator(bd11045a-d746-4b42-872c-8b8d1dd2d515)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" Mar 09 10:21:30 crc kubenswrapper[4792]: I0309 10:21:30.662363 4792 scope.go:117] "RemoveContainer" containerID="22fb615a80b561bd4fd46d065538c5d6038e92646216db0bd28365b1dcaceca2" Mar 09 10:21:30 crc kubenswrapper[4792]: E0309 10:21:30.663109 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-97tth_openshift-machine-config-operator(bd11045a-d746-4b42-872c-8b8d1dd2d515)\"" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" Mar 09 10:21:42 crc kubenswrapper[4792]: I0309 10:21:42.662620 4792 scope.go:117] "RemoveContainer" containerID="22fb615a80b561bd4fd46d065538c5d6038e92646216db0bd28365b1dcaceca2" Mar 09 10:21:42 crc kubenswrapper[4792]: E0309 10:21:42.663518 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-97tth_openshift-machine-config-operator(bd11045a-d746-4b42-872c-8b8d1dd2d515)\"" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" Mar 09 10:21:45 crc kubenswrapper[4792]: I0309 10:21:45.107331 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-86ddb6bd46-2lfdv_98a5f3b6-5d33-4542-9382-ea1d94e5f59f/kube-rbac-proxy/0.log" Mar 09 10:21:45 crc kubenswrapper[4792]: I0309 10:21:45.266418 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-86ddb6bd46-2lfdv_98a5f3b6-5d33-4542-9382-ea1d94e5f59f/controller/0.log" Mar 09 10:21:45 crc kubenswrapper[4792]: I0309 10:21:45.639578 4792 log.go:25] 
"Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rll29_10a41b58-f88e-4dad-960f-cd70b006c3e7/cp-frr-files/0.log" Mar 09 10:21:45 crc kubenswrapper[4792]: I0309 10:21:45.915734 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rll29_10a41b58-f88e-4dad-960f-cd70b006c3e7/cp-reloader/0.log" Mar 09 10:21:45 crc kubenswrapper[4792]: I0309 10:21:45.916694 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rll29_10a41b58-f88e-4dad-960f-cd70b006c3e7/cp-frr-files/0.log" Mar 09 10:21:45 crc kubenswrapper[4792]: I0309 10:21:45.978962 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rll29_10a41b58-f88e-4dad-960f-cd70b006c3e7/cp-metrics/0.log" Mar 09 10:21:46 crc kubenswrapper[4792]: I0309 10:21:46.012288 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rll29_10a41b58-f88e-4dad-960f-cd70b006c3e7/cp-reloader/0.log" Mar 09 10:21:46 crc kubenswrapper[4792]: I0309 10:21:46.175714 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rll29_10a41b58-f88e-4dad-960f-cd70b006c3e7/cp-frr-files/0.log" Mar 09 10:21:46 crc kubenswrapper[4792]: I0309 10:21:46.196995 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rll29_10a41b58-f88e-4dad-960f-cd70b006c3e7/cp-reloader/0.log" Mar 09 10:21:46 crc kubenswrapper[4792]: I0309 10:21:46.316590 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rll29_10a41b58-f88e-4dad-960f-cd70b006c3e7/cp-metrics/0.log" Mar 09 10:21:46 crc kubenswrapper[4792]: I0309 10:21:46.343227 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rll29_10a41b58-f88e-4dad-960f-cd70b006c3e7/cp-metrics/0.log" Mar 09 10:21:46 crc kubenswrapper[4792]: I0309 10:21:46.542851 4792 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-rll29_10a41b58-f88e-4dad-960f-cd70b006c3e7/cp-frr-files/0.log" Mar 09 10:21:46 crc kubenswrapper[4792]: I0309 10:21:46.549982 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rll29_10a41b58-f88e-4dad-960f-cd70b006c3e7/cp-reloader/0.log" Mar 09 10:21:46 crc kubenswrapper[4792]: I0309 10:21:46.573801 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rll29_10a41b58-f88e-4dad-960f-cd70b006c3e7/cp-metrics/0.log" Mar 09 10:21:46 crc kubenswrapper[4792]: I0309 10:21:46.639437 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rll29_10a41b58-f88e-4dad-960f-cd70b006c3e7/controller/0.log" Mar 09 10:21:46 crc kubenswrapper[4792]: I0309 10:21:46.764303 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rll29_10a41b58-f88e-4dad-960f-cd70b006c3e7/frr-metrics/0.log" Mar 09 10:21:46 crc kubenswrapper[4792]: I0309 10:21:46.906686 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rll29_10a41b58-f88e-4dad-960f-cd70b006c3e7/kube-rbac-proxy-frr/0.log" Mar 09 10:21:46 crc kubenswrapper[4792]: I0309 10:21:46.912386 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rll29_10a41b58-f88e-4dad-960f-cd70b006c3e7/kube-rbac-proxy/0.log" Mar 09 10:21:47 crc kubenswrapper[4792]: I0309 10:21:47.055959 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rll29_10a41b58-f88e-4dad-960f-cd70b006c3e7/reloader/0.log" Mar 09 10:21:47 crc kubenswrapper[4792]: I0309 10:21:47.280966 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7f989f654f-gm7zm_7ac24411-eccd-496a-8a49-d9b552a92691/frr-k8s-webhook-server/0.log" Mar 09 10:21:47 crc kubenswrapper[4792]: I0309 10:21:47.631798 4792 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_metallb-operator-controller-manager-5d5f56c665-gxjds_5cb925f9-fcd8-47a5-8959-76bfdbbc2979/manager/0.log" Mar 09 10:21:47 crc kubenswrapper[4792]: I0309 10:21:47.979171 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-6d4cf89d46-x6c57_491ea032-e688-454c-a67d-09966007bb7f/webhook-server/0.log" Mar 09 10:21:48 crc kubenswrapper[4792]: I0309 10:21:48.354451 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-9fj2f_6ed0a6a3-dfa2-49c2-bbb2-96a6f3cfc4f9/kube-rbac-proxy/0.log" Mar 09 10:21:48 crc kubenswrapper[4792]: I0309 10:21:48.494251 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rll29_10a41b58-f88e-4dad-960f-cd70b006c3e7/frr/0.log" Mar 09 10:21:48 crc kubenswrapper[4792]: I0309 10:21:48.736762 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-9fj2f_6ed0a6a3-dfa2-49c2-bbb2-96a6f3cfc4f9/speaker/0.log" Mar 09 10:21:57 crc kubenswrapper[4792]: I0309 10:21:57.662209 4792 scope.go:117] "RemoveContainer" containerID="22fb615a80b561bd4fd46d065538c5d6038e92646216db0bd28365b1dcaceca2" Mar 09 10:21:57 crc kubenswrapper[4792]: E0309 10:21:57.662923 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-97tth_openshift-machine-config-operator(bd11045a-d746-4b42-872c-8b8d1dd2d515)\"" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" Mar 09 10:22:00 crc kubenswrapper[4792]: I0309 10:22:00.149633 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550862-9rhtt"] Mar 09 10:22:00 crc kubenswrapper[4792]: E0309 10:22:00.150479 4792 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="6306ab16-18db-41a2-97f5-4d56712f91c2" containerName="oc" Mar 09 10:22:00 crc kubenswrapper[4792]: I0309 10:22:00.150494 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="6306ab16-18db-41a2-97f5-4d56712f91c2" containerName="oc" Mar 09 10:22:00 crc kubenswrapper[4792]: I0309 10:22:00.150748 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="6306ab16-18db-41a2-97f5-4d56712f91c2" containerName="oc" Mar 09 10:22:00 crc kubenswrapper[4792]: I0309 10:22:00.151526 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550862-9rhtt" Mar 09 10:22:00 crc kubenswrapper[4792]: I0309 10:22:00.154412 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 10:22:00 crc kubenswrapper[4792]: I0309 10:22:00.154792 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 10:22:00 crc kubenswrapper[4792]: I0309 10:22:00.157428 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fwclj" Mar 09 10:22:00 crc kubenswrapper[4792]: I0309 10:22:00.226314 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550862-9rhtt"] Mar 09 10:22:00 crc kubenswrapper[4792]: I0309 10:22:00.280436 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r9hnr\" (UniqueName: \"kubernetes.io/projected/cd4d06a7-d4c8-4ac1-ba1b-805062fe1835-kube-api-access-r9hnr\") pod \"auto-csr-approver-29550862-9rhtt\" (UID: \"cd4d06a7-d4c8-4ac1-ba1b-805062fe1835\") " pod="openshift-infra/auto-csr-approver-29550862-9rhtt" Mar 09 10:22:00 crc kubenswrapper[4792]: I0309 10:22:00.382659 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r9hnr\" (UniqueName: 
\"kubernetes.io/projected/cd4d06a7-d4c8-4ac1-ba1b-805062fe1835-kube-api-access-r9hnr\") pod \"auto-csr-approver-29550862-9rhtt\" (UID: \"cd4d06a7-d4c8-4ac1-ba1b-805062fe1835\") " pod="openshift-infra/auto-csr-approver-29550862-9rhtt" Mar 09 10:22:00 crc kubenswrapper[4792]: I0309 10:22:00.420225 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r9hnr\" (UniqueName: \"kubernetes.io/projected/cd4d06a7-d4c8-4ac1-ba1b-805062fe1835-kube-api-access-r9hnr\") pod \"auto-csr-approver-29550862-9rhtt\" (UID: \"cd4d06a7-d4c8-4ac1-ba1b-805062fe1835\") " pod="openshift-infra/auto-csr-approver-29550862-9rhtt" Mar 09 10:22:00 crc kubenswrapper[4792]: I0309 10:22:00.478517 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550862-9rhtt" Mar 09 10:22:01 crc kubenswrapper[4792]: I0309 10:22:01.065677 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550862-9rhtt"] Mar 09 10:22:01 crc kubenswrapper[4792]: I0309 10:22:01.072893 4792 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 09 10:22:01 crc kubenswrapper[4792]: I0309 10:22:01.680238 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550862-9rhtt" event={"ID":"cd4d06a7-d4c8-4ac1-ba1b-805062fe1835","Type":"ContainerStarted","Data":"2d2071c251670419ec0955c454d75c229035163a8639c390e912ef7c0b834194"} Mar 09 10:22:02 crc kubenswrapper[4792]: I0309 10:22:02.690457 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550862-9rhtt" event={"ID":"cd4d06a7-d4c8-4ac1-ba1b-805062fe1835","Type":"ContainerStarted","Data":"fdf3f73924507040d48037e097e8806e94be759a110c81dc5bb9e6b7923f7b80"} Mar 09 10:22:02 crc kubenswrapper[4792]: I0309 10:22:02.712487 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-infra/auto-csr-approver-29550862-9rhtt" podStartSLOduration=1.749177744 podStartE2EDuration="2.712465255s" podCreationTimestamp="2026-03-09 10:22:00 +0000 UTC" firstStartedPulling="2026-03-09 10:22:01.072703141 +0000 UTC m=+4486.102903893" lastFinishedPulling="2026-03-09 10:22:02.035990652 +0000 UTC m=+4487.066191404" observedRunningTime="2026-03-09 10:22:02.703764511 +0000 UTC m=+4487.733965283" watchObservedRunningTime="2026-03-09 10:22:02.712465255 +0000 UTC m=+4487.742666007" Mar 09 10:22:03 crc kubenswrapper[4792]: I0309 10:22:03.708561 4792 generic.go:334] "Generic (PLEG): container finished" podID="cd4d06a7-d4c8-4ac1-ba1b-805062fe1835" containerID="fdf3f73924507040d48037e097e8806e94be759a110c81dc5bb9e6b7923f7b80" exitCode=0 Mar 09 10:22:03 crc kubenswrapper[4792]: I0309 10:22:03.708611 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550862-9rhtt" event={"ID":"cd4d06a7-d4c8-4ac1-ba1b-805062fe1835","Type":"ContainerDied","Data":"fdf3f73924507040d48037e097e8806e94be759a110c81dc5bb9e6b7923f7b80"} Mar 09 10:22:03 crc kubenswrapper[4792]: I0309 10:22:03.916732 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82dsz92_95e345ef-d076-4754-b2e9-db935995c8c0/util/0.log" Mar 09 10:22:04 crc kubenswrapper[4792]: I0309 10:22:04.156233 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82dsz92_95e345ef-d076-4754-b2e9-db935995c8c0/util/0.log" Mar 09 10:22:04 crc kubenswrapper[4792]: I0309 10:22:04.199389 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82dsz92_95e345ef-d076-4754-b2e9-db935995c8c0/pull/0.log" Mar 09 10:22:04 crc kubenswrapper[4792]: I0309 10:22:04.238709 4792 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82dsz92_95e345ef-d076-4754-b2e9-db935995c8c0/pull/0.log" Mar 09 10:22:04 crc kubenswrapper[4792]: I0309 10:22:04.332934 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82dsz92_95e345ef-d076-4754-b2e9-db935995c8c0/util/0.log" Mar 09 10:22:04 crc kubenswrapper[4792]: I0309 10:22:04.397248 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82dsz92_95e345ef-d076-4754-b2e9-db935995c8c0/pull/0.log" Mar 09 10:22:04 crc kubenswrapper[4792]: I0309 10:22:04.555174 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82dsz92_95e345ef-d076-4754-b2e9-db935995c8c0/extract/0.log" Mar 09 10:22:04 crc kubenswrapper[4792]: I0309 10:22:04.671213 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-mkwxh_2bce5f9c-c863-4962-a276-2b5a3a69def9/extract-utilities/0.log" Mar 09 10:22:05 crc kubenswrapper[4792]: I0309 10:22:05.253013 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550862-9rhtt" Mar 09 10:22:05 crc kubenswrapper[4792]: I0309 10:22:05.384060 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r9hnr\" (UniqueName: \"kubernetes.io/projected/cd4d06a7-d4c8-4ac1-ba1b-805062fe1835-kube-api-access-r9hnr\") pod \"cd4d06a7-d4c8-4ac1-ba1b-805062fe1835\" (UID: \"cd4d06a7-d4c8-4ac1-ba1b-805062fe1835\") " Mar 09 10:22:05 crc kubenswrapper[4792]: I0309 10:22:05.397668 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd4d06a7-d4c8-4ac1-ba1b-805062fe1835-kube-api-access-r9hnr" (OuterVolumeSpecName: "kube-api-access-r9hnr") pod "cd4d06a7-d4c8-4ac1-ba1b-805062fe1835" (UID: "cd4d06a7-d4c8-4ac1-ba1b-805062fe1835"). InnerVolumeSpecName "kube-api-access-r9hnr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 10:22:05 crc kubenswrapper[4792]: I0309 10:22:05.404783 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-mkwxh_2bce5f9c-c863-4962-a276-2b5a3a69def9/extract-content/0.log" Mar 09 10:22:05 crc kubenswrapper[4792]: I0309 10:22:05.453893 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-mkwxh_2bce5f9c-c863-4962-a276-2b5a3a69def9/extract-utilities/0.log" Mar 09 10:22:05 crc kubenswrapper[4792]: I0309 10:22:05.488511 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r9hnr\" (UniqueName: \"kubernetes.io/projected/cd4d06a7-d4c8-4ac1-ba1b-805062fe1835-kube-api-access-r9hnr\") on node \"crc\" DevicePath \"\"" Mar 09 10:22:05 crc kubenswrapper[4792]: I0309 10:22:05.505487 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-mkwxh_2bce5f9c-c863-4962-a276-2b5a3a69def9/extract-content/0.log" Mar 09 10:22:05 crc kubenswrapper[4792]: I0309 10:22:05.724393 4792 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openshift-marketplace_certified-operators-mkwxh_2bce5f9c-c863-4962-a276-2b5a3a69def9/extract-content/0.log" Mar 09 10:22:05 crc kubenswrapper[4792]: I0309 10:22:05.725168 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550862-9rhtt" event={"ID":"cd4d06a7-d4c8-4ac1-ba1b-805062fe1835","Type":"ContainerDied","Data":"2d2071c251670419ec0955c454d75c229035163a8639c390e912ef7c0b834194"} Mar 09 10:22:05 crc kubenswrapper[4792]: I0309 10:22:05.725198 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2d2071c251670419ec0955c454d75c229035163a8639c390e912ef7c0b834194" Mar 09 10:22:05 crc kubenswrapper[4792]: I0309 10:22:05.725208 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550862-9rhtt" Mar 09 10:22:05 crc kubenswrapper[4792]: I0309 10:22:05.783601 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550856-gt46k"] Mar 09 10:22:05 crc kubenswrapper[4792]: I0309 10:22:05.792050 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550856-gt46k"] Mar 09 10:22:05 crc kubenswrapper[4792]: I0309 10:22:05.818266 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-mkwxh_2bce5f9c-c863-4962-a276-2b5a3a69def9/extract-utilities/0.log" Mar 09 10:22:06 crc kubenswrapper[4792]: I0309 10:22:06.044922 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-ks9xs_4f866ac7-cc92-4520-9d5e-0147cac097f2/extract-utilities/0.log" Mar 09 10:22:06 crc kubenswrapper[4792]: I0309 10:22:06.359515 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-ks9xs_4f866ac7-cc92-4520-9d5e-0147cac097f2/extract-content/0.log" Mar 09 10:22:06 crc kubenswrapper[4792]: I0309 10:22:06.427701 
4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-ks9xs_4f866ac7-cc92-4520-9d5e-0147cac097f2/extract-utilities/0.log" Mar 09 10:22:06 crc kubenswrapper[4792]: I0309 10:22:06.490254 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-ks9xs_4f866ac7-cc92-4520-9d5e-0147cac097f2/extract-content/0.log" Mar 09 10:22:06 crc kubenswrapper[4792]: I0309 10:22:06.671494 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-mkwxh_2bce5f9c-c863-4962-a276-2b5a3a69def9/registry-server/0.log" Mar 09 10:22:06 crc kubenswrapper[4792]: I0309 10:22:06.703614 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-ks9xs_4f866ac7-cc92-4520-9d5e-0147cac097f2/extract-content/0.log" Mar 09 10:22:06 crc kubenswrapper[4792]: I0309 10:22:06.711078 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-ks9xs_4f866ac7-cc92-4520-9d5e-0147cac097f2/extract-utilities/0.log" Mar 09 10:22:06 crc kubenswrapper[4792]: I0309 10:22:06.954650 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-ks9xs_4f866ac7-cc92-4520-9d5e-0147cac097f2/registry-server/0.log" Mar 09 10:22:07 crc kubenswrapper[4792]: I0309 10:22:07.073308 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4h5lz6_3419b911-375b-44c5-8be3-074ce9531ac5/util/0.log" Mar 09 10:22:07 crc kubenswrapper[4792]: I0309 10:22:07.243932 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4h5lz6_3419b911-375b-44c5-8be3-074ce9531ac5/util/0.log" Mar 09 10:22:07 crc kubenswrapper[4792]: I0309 10:22:07.274946 4792 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4h5lz6_3419b911-375b-44c5-8be3-074ce9531ac5/pull/0.log" Mar 09 10:22:07 crc kubenswrapper[4792]: I0309 10:22:07.275204 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4h5lz6_3419b911-375b-44c5-8be3-074ce9531ac5/pull/0.log" Mar 09 10:22:07 crc kubenswrapper[4792]: I0309 10:22:07.464796 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4h5lz6_3419b911-375b-44c5-8be3-074ce9531ac5/extract/0.log" Mar 09 10:22:07 crc kubenswrapper[4792]: I0309 10:22:07.470733 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4h5lz6_3419b911-375b-44c5-8be3-074ce9531ac5/util/0.log" Mar 09 10:22:07 crc kubenswrapper[4792]: I0309 10:22:07.516757 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4h5lz6_3419b911-375b-44c5-8be3-074ce9531ac5/pull/0.log" Mar 09 10:22:07 crc kubenswrapper[4792]: I0309 10:22:07.679364 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6d6544d3-58e5-499b-bd88-e4a761d166a9" path="/var/lib/kubelet/pods/6d6544d3-58e5-499b-bd88-e4a761d166a9/volumes" Mar 09 10:22:07 crc kubenswrapper[4792]: I0309 10:22:07.745716 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-m64ct_0589998d-961b-4184-9884-0ad5eee48348/marketplace-operator/0.log" Mar 09 10:22:07 crc kubenswrapper[4792]: I0309 10:22:07.801772 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-2rnrf_6ac8fc64-583a-420b-b356-cfa0491d9b6f/extract-utilities/0.log" Mar 09 10:22:07 crc kubenswrapper[4792]: I0309 
10:22:07.989577 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-2rnrf_6ac8fc64-583a-420b-b356-cfa0491d9b6f/extract-content/0.log" Mar 09 10:22:07 crc kubenswrapper[4792]: I0309 10:22:07.998226 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-2rnrf_6ac8fc64-583a-420b-b356-cfa0491d9b6f/extract-content/0.log" Mar 09 10:22:08 crc kubenswrapper[4792]: I0309 10:22:08.044023 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-2rnrf_6ac8fc64-583a-420b-b356-cfa0491d9b6f/extract-utilities/0.log" Mar 09 10:22:08 crc kubenswrapper[4792]: I0309 10:22:08.222353 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-2rnrf_6ac8fc64-583a-420b-b356-cfa0491d9b6f/extract-content/0.log" Mar 09 10:22:08 crc kubenswrapper[4792]: I0309 10:22:08.296024 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-2rnrf_6ac8fc64-583a-420b-b356-cfa0491d9b6f/extract-utilities/0.log" Mar 09 10:22:08 crc kubenswrapper[4792]: I0309 10:22:08.362540 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-2rnrf_6ac8fc64-583a-420b-b356-cfa0491d9b6f/registry-server/0.log" Mar 09 10:22:08 crc kubenswrapper[4792]: I0309 10:22:08.502522 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-ms9zz_a807ba61-7b14-443b-a870-6220b51d2bd6/extract-utilities/0.log" Mar 09 10:22:08 crc kubenswrapper[4792]: I0309 10:22:08.734973 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-ms9zz_a807ba61-7b14-443b-a870-6220b51d2bd6/extract-utilities/0.log" Mar 09 10:22:08 crc kubenswrapper[4792]: I0309 10:22:08.741656 4792 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-operators-ms9zz_a807ba61-7b14-443b-a870-6220b51d2bd6/extract-content/0.log" Mar 09 10:22:08 crc kubenswrapper[4792]: I0309 10:22:08.747006 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-ms9zz_a807ba61-7b14-443b-a870-6220b51d2bd6/extract-content/0.log" Mar 09 10:22:08 crc kubenswrapper[4792]: I0309 10:22:08.876119 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-ms9zz_a807ba61-7b14-443b-a870-6220b51d2bd6/extract-content/0.log" Mar 09 10:22:08 crc kubenswrapper[4792]: I0309 10:22:08.912205 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-ms9zz_a807ba61-7b14-443b-a870-6220b51d2bd6/extract-utilities/0.log" Mar 09 10:22:09 crc kubenswrapper[4792]: I0309 10:22:09.446052 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-ms9zz_a807ba61-7b14-443b-a870-6220b51d2bd6/registry-server/0.log" Mar 09 10:22:09 crc kubenswrapper[4792]: I0309 10:22:09.663204 4792 scope.go:117] "RemoveContainer" containerID="22fb615a80b561bd4fd46d065538c5d6038e92646216db0bd28365b1dcaceca2" Mar 09 10:22:09 crc kubenswrapper[4792]: E0309 10:22:09.663904 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-97tth_openshift-machine-config-operator(bd11045a-d746-4b42-872c-8b8d1dd2d515)\"" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" Mar 09 10:22:23 crc kubenswrapper[4792]: I0309 10:22:23.662183 4792 scope.go:117] "RemoveContainer" containerID="22fb615a80b561bd4fd46d065538c5d6038e92646216db0bd28365b1dcaceca2" Mar 09 10:22:23 crc kubenswrapper[4792]: E0309 10:22:23.663013 4792 pod_workers.go:1301] 
"Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-97tth_openshift-machine-config-operator(bd11045a-d746-4b42-872c-8b8d1dd2d515)\"" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" Mar 09 10:22:36 crc kubenswrapper[4792]: I0309 10:22:36.668843 4792 scope.go:117] "RemoveContainer" containerID="22fb615a80b561bd4fd46d065538c5d6038e92646216db0bd28365b1dcaceca2" Mar 09 10:22:36 crc kubenswrapper[4792]: E0309 10:22:36.669448 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-97tth_openshift-machine-config-operator(bd11045a-d746-4b42-872c-8b8d1dd2d515)\"" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" Mar 09 10:22:44 crc kubenswrapper[4792]: I0309 10:22:44.580701 4792 scope.go:117] "RemoveContainer" containerID="1076fd99612e0f315ccb8d8229e73fe18c5f4c323e9c2b4fa31c11b62715a7b8" Mar 09 10:22:51 crc kubenswrapper[4792]: I0309 10:22:51.668003 4792 scope.go:117] "RemoveContainer" containerID="22fb615a80b561bd4fd46d065538c5d6038e92646216db0bd28365b1dcaceca2" Mar 09 10:22:51 crc kubenswrapper[4792]: E0309 10:22:51.669600 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-97tth_openshift-machine-config-operator(bd11045a-d746-4b42-872c-8b8d1dd2d515)\"" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" Mar 09 10:23:04 crc kubenswrapper[4792]: I0309 10:23:04.662633 4792 
scope.go:117] "RemoveContainer" containerID="22fb615a80b561bd4fd46d065538c5d6038e92646216db0bd28365b1dcaceca2" Mar 09 10:23:04 crc kubenswrapper[4792]: E0309 10:23:04.663467 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-97tth_openshift-machine-config-operator(bd11045a-d746-4b42-872c-8b8d1dd2d515)\"" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" Mar 09 10:23:19 crc kubenswrapper[4792]: I0309 10:23:19.667641 4792 scope.go:117] "RemoveContainer" containerID="22fb615a80b561bd4fd46d065538c5d6038e92646216db0bd28365b1dcaceca2" Mar 09 10:23:19 crc kubenswrapper[4792]: E0309 10:23:19.668455 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-97tth_openshift-machine-config-operator(bd11045a-d746-4b42-872c-8b8d1dd2d515)\"" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" Mar 09 10:23:23 crc kubenswrapper[4792]: I0309 10:23:23.993801 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-q4mwj"] Mar 09 10:23:23 crc kubenswrapper[4792]: E0309 10:23:23.994673 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd4d06a7-d4c8-4ac1-ba1b-805062fe1835" containerName="oc" Mar 09 10:23:23 crc kubenswrapper[4792]: I0309 10:23:23.994689 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd4d06a7-d4c8-4ac1-ba1b-805062fe1835" containerName="oc" Mar 09 10:23:23 crc kubenswrapper[4792]: I0309 10:23:23.994890 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd4d06a7-d4c8-4ac1-ba1b-805062fe1835" 
containerName="oc" Mar 09 10:23:23 crc kubenswrapper[4792]: I0309 10:23:23.997294 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-q4mwj" Mar 09 10:23:24 crc kubenswrapper[4792]: I0309 10:23:24.017845 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-q4mwj"] Mar 09 10:23:24 crc kubenswrapper[4792]: I0309 10:23:24.035972 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c382d4cd-8ad0-4fab-8f15-523e0e36638e-utilities\") pod \"redhat-operators-q4mwj\" (UID: \"c382d4cd-8ad0-4fab-8f15-523e0e36638e\") " pod="openshift-marketplace/redhat-operators-q4mwj" Mar 09 10:23:24 crc kubenswrapper[4792]: I0309 10:23:24.039281 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c382d4cd-8ad0-4fab-8f15-523e0e36638e-catalog-content\") pod \"redhat-operators-q4mwj\" (UID: \"c382d4cd-8ad0-4fab-8f15-523e0e36638e\") " pod="openshift-marketplace/redhat-operators-q4mwj" Mar 09 10:23:24 crc kubenswrapper[4792]: I0309 10:23:24.039690 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qlhs7\" (UniqueName: \"kubernetes.io/projected/c382d4cd-8ad0-4fab-8f15-523e0e36638e-kube-api-access-qlhs7\") pod \"redhat-operators-q4mwj\" (UID: \"c382d4cd-8ad0-4fab-8f15-523e0e36638e\") " pod="openshift-marketplace/redhat-operators-q4mwj" Mar 09 10:23:24 crc kubenswrapper[4792]: I0309 10:23:24.141103 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c382d4cd-8ad0-4fab-8f15-523e0e36638e-utilities\") pod \"redhat-operators-q4mwj\" (UID: \"c382d4cd-8ad0-4fab-8f15-523e0e36638e\") " pod="openshift-marketplace/redhat-operators-q4mwj" Mar 09 
10:23:24 crc kubenswrapper[4792]: I0309 10:23:24.141187 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c382d4cd-8ad0-4fab-8f15-523e0e36638e-catalog-content\") pod \"redhat-operators-q4mwj\" (UID: \"c382d4cd-8ad0-4fab-8f15-523e0e36638e\") " pod="openshift-marketplace/redhat-operators-q4mwj" Mar 09 10:23:24 crc kubenswrapper[4792]: I0309 10:23:24.141319 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qlhs7\" (UniqueName: \"kubernetes.io/projected/c382d4cd-8ad0-4fab-8f15-523e0e36638e-kube-api-access-qlhs7\") pod \"redhat-operators-q4mwj\" (UID: \"c382d4cd-8ad0-4fab-8f15-523e0e36638e\") " pod="openshift-marketplace/redhat-operators-q4mwj" Mar 09 10:23:24 crc kubenswrapper[4792]: I0309 10:23:24.141617 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c382d4cd-8ad0-4fab-8f15-523e0e36638e-utilities\") pod \"redhat-operators-q4mwj\" (UID: \"c382d4cd-8ad0-4fab-8f15-523e0e36638e\") " pod="openshift-marketplace/redhat-operators-q4mwj" Mar 09 10:23:24 crc kubenswrapper[4792]: I0309 10:23:24.141673 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c382d4cd-8ad0-4fab-8f15-523e0e36638e-catalog-content\") pod \"redhat-operators-q4mwj\" (UID: \"c382d4cd-8ad0-4fab-8f15-523e0e36638e\") " pod="openshift-marketplace/redhat-operators-q4mwj" Mar 09 10:23:24 crc kubenswrapper[4792]: I0309 10:23:24.742755 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qlhs7\" (UniqueName: \"kubernetes.io/projected/c382d4cd-8ad0-4fab-8f15-523e0e36638e-kube-api-access-qlhs7\") pod \"redhat-operators-q4mwj\" (UID: \"c382d4cd-8ad0-4fab-8f15-523e0e36638e\") " pod="openshift-marketplace/redhat-operators-q4mwj" Mar 09 10:23:24 crc kubenswrapper[4792]: I0309 10:23:24.923925 
4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-q4mwj" Mar 09 10:23:25 crc kubenswrapper[4792]: I0309 10:23:25.609545 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-q4mwj"] Mar 09 10:23:25 crc kubenswrapper[4792]: W0309 10:23:25.628772 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc382d4cd_8ad0_4fab_8f15_523e0e36638e.slice/crio-78e3abb3351bfb19a8573ddab0b4c9281e622d0484e7c4976d6cc933fac31a05 WatchSource:0}: Error finding container 78e3abb3351bfb19a8573ddab0b4c9281e622d0484e7c4976d6cc933fac31a05: Status 404 returned error can't find the container with id 78e3abb3351bfb19a8573ddab0b4c9281e622d0484e7c4976d6cc933fac31a05 Mar 09 10:23:26 crc kubenswrapper[4792]: I0309 10:23:26.423625 4792 generic.go:334] "Generic (PLEG): container finished" podID="c382d4cd-8ad0-4fab-8f15-523e0e36638e" containerID="6f4fff100e0326d1866b65d53d4c6dc3576650f41ee70b022fda57cb92e648a4" exitCode=0 Mar 09 10:23:26 crc kubenswrapper[4792]: I0309 10:23:26.423686 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q4mwj" event={"ID":"c382d4cd-8ad0-4fab-8f15-523e0e36638e","Type":"ContainerDied","Data":"6f4fff100e0326d1866b65d53d4c6dc3576650f41ee70b022fda57cb92e648a4"} Mar 09 10:23:26 crc kubenswrapper[4792]: I0309 10:23:26.424278 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q4mwj" event={"ID":"c382d4cd-8ad0-4fab-8f15-523e0e36638e","Type":"ContainerStarted","Data":"78e3abb3351bfb19a8573ddab0b4c9281e622d0484e7c4976d6cc933fac31a05"} Mar 09 10:23:28 crc kubenswrapper[4792]: I0309 10:23:28.442864 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q4mwj" 
event={"ID":"c382d4cd-8ad0-4fab-8f15-523e0e36638e","Type":"ContainerStarted","Data":"cab6ec438469cd30ce0380dc957ce5077bd4df86c1fa3362c390ea11b27595b8"} Mar 09 10:23:34 crc kubenswrapper[4792]: I0309 10:23:34.662686 4792 scope.go:117] "RemoveContainer" containerID="22fb615a80b561bd4fd46d065538c5d6038e92646216db0bd28365b1dcaceca2" Mar 09 10:23:34 crc kubenswrapper[4792]: E0309 10:23:34.663590 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-97tth_openshift-machine-config-operator(bd11045a-d746-4b42-872c-8b8d1dd2d515)\"" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" Mar 09 10:23:35 crc kubenswrapper[4792]: I0309 10:23:35.510176 4792 generic.go:334] "Generic (PLEG): container finished" podID="c382d4cd-8ad0-4fab-8f15-523e0e36638e" containerID="cab6ec438469cd30ce0380dc957ce5077bd4df86c1fa3362c390ea11b27595b8" exitCode=0 Mar 09 10:23:35 crc kubenswrapper[4792]: I0309 10:23:35.510204 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q4mwj" event={"ID":"c382d4cd-8ad0-4fab-8f15-523e0e36638e","Type":"ContainerDied","Data":"cab6ec438469cd30ce0380dc957ce5077bd4df86c1fa3362c390ea11b27595b8"} Mar 09 10:23:36 crc kubenswrapper[4792]: I0309 10:23:36.520881 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q4mwj" event={"ID":"c382d4cd-8ad0-4fab-8f15-523e0e36638e","Type":"ContainerStarted","Data":"016957a3ecf7a205a943c3bee84974d1cb037083575e8db87cdfce4b1ea74155"} Mar 09 10:23:36 crc kubenswrapper[4792]: I0309 10:23:36.551058 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-q4mwj" podStartSLOduration=4.011572569 podStartE2EDuration="13.551037061s" 
podCreationTimestamp="2026-03-09 10:23:23 +0000 UTC" firstStartedPulling="2026-03-09 10:23:26.426663459 +0000 UTC m=+4571.456864211" lastFinishedPulling="2026-03-09 10:23:35.966127951 +0000 UTC m=+4580.996328703" observedRunningTime="2026-03-09 10:23:36.54212689 +0000 UTC m=+4581.572327652" watchObservedRunningTime="2026-03-09 10:23:36.551037061 +0000 UTC m=+4581.581237813" Mar 09 10:23:44 crc kubenswrapper[4792]: I0309 10:23:44.739794 4792 scope.go:117] "RemoveContainer" containerID="d13f7cebd9156705f54933def6682923998bc769753aff63fccbd36852f9fa63" Mar 09 10:23:44 crc kubenswrapper[4792]: I0309 10:23:44.924695 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-q4mwj" Mar 09 10:23:44 crc kubenswrapper[4792]: I0309 10:23:44.924732 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-q4mwj" Mar 09 10:23:46 crc kubenswrapper[4792]: I0309 10:23:46.373610 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-q4mwj" podUID="c382d4cd-8ad0-4fab-8f15-523e0e36638e" containerName="registry-server" probeResult="failure" output=< Mar 09 10:23:46 crc kubenswrapper[4792]: timeout: failed to connect service ":50051" within 1s Mar 09 10:23:46 crc kubenswrapper[4792]: > Mar 09 10:23:47 crc kubenswrapper[4792]: I0309 10:23:47.663014 4792 scope.go:117] "RemoveContainer" containerID="22fb615a80b561bd4fd46d065538c5d6038e92646216db0bd28365b1dcaceca2" Mar 09 10:23:47 crc kubenswrapper[4792]: E0309 10:23:47.663556 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-97tth_openshift-machine-config-operator(bd11045a-d746-4b42-872c-8b8d1dd2d515)\"" pod="openshift-machine-config-operator/machine-config-daemon-97tth" 
podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" Mar 09 10:23:55 crc kubenswrapper[4792]: I0309 10:23:55.991271 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-q4mwj" podUID="c382d4cd-8ad0-4fab-8f15-523e0e36638e" containerName="registry-server" probeResult="failure" output=< Mar 09 10:23:55 crc kubenswrapper[4792]: timeout: failed to connect service ":50051" within 1s Mar 09 10:23:55 crc kubenswrapper[4792]: > Mar 09 10:24:00 crc kubenswrapper[4792]: I0309 10:24:00.156484 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550864-m5bjn"] Mar 09 10:24:00 crc kubenswrapper[4792]: I0309 10:24:00.158690 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550864-m5bjn" Mar 09 10:24:00 crc kubenswrapper[4792]: I0309 10:24:00.160303 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fwclj" Mar 09 10:24:00 crc kubenswrapper[4792]: I0309 10:24:00.162184 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 10:24:00 crc kubenswrapper[4792]: I0309 10:24:00.162634 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 10:24:00 crc kubenswrapper[4792]: I0309 10:24:00.189659 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550864-m5bjn"] Mar 09 10:24:00 crc kubenswrapper[4792]: I0309 10:24:00.323563 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pj5w8\" (UniqueName: \"kubernetes.io/projected/4ef44af8-2b43-4cf4-8cf6-ce645992e33d-kube-api-access-pj5w8\") pod \"auto-csr-approver-29550864-m5bjn\" (UID: \"4ef44af8-2b43-4cf4-8cf6-ce645992e33d\") " pod="openshift-infra/auto-csr-approver-29550864-m5bjn" Mar 09 10:24:00 crc 
kubenswrapper[4792]: I0309 10:24:00.425166 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pj5w8\" (UniqueName: \"kubernetes.io/projected/4ef44af8-2b43-4cf4-8cf6-ce645992e33d-kube-api-access-pj5w8\") pod \"auto-csr-approver-29550864-m5bjn\" (UID: \"4ef44af8-2b43-4cf4-8cf6-ce645992e33d\") " pod="openshift-infra/auto-csr-approver-29550864-m5bjn" Mar 09 10:24:00 crc kubenswrapper[4792]: I0309 10:24:00.447414 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pj5w8\" (UniqueName: \"kubernetes.io/projected/4ef44af8-2b43-4cf4-8cf6-ce645992e33d-kube-api-access-pj5w8\") pod \"auto-csr-approver-29550864-m5bjn\" (UID: \"4ef44af8-2b43-4cf4-8cf6-ce645992e33d\") " pod="openshift-infra/auto-csr-approver-29550864-m5bjn" Mar 09 10:24:00 crc kubenswrapper[4792]: I0309 10:24:00.495519 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550864-m5bjn" Mar 09 10:24:01 crc kubenswrapper[4792]: I0309 10:24:01.055159 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550864-m5bjn"] Mar 09 10:24:01 crc kubenswrapper[4792]: I0309 10:24:01.779634 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550864-m5bjn" event={"ID":"4ef44af8-2b43-4cf4-8cf6-ce645992e33d","Type":"ContainerStarted","Data":"d9d7f330fd69a642c05caeff7d1e9b6689e40e99fded1d24eab9f7c7c398a3bb"} Mar 09 10:24:02 crc kubenswrapper[4792]: I0309 10:24:02.662942 4792 scope.go:117] "RemoveContainer" containerID="22fb615a80b561bd4fd46d065538c5d6038e92646216db0bd28365b1dcaceca2" Mar 09 10:24:02 crc kubenswrapper[4792]: E0309 10:24:02.663988 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-97tth_openshift-machine-config-operator(bd11045a-d746-4b42-872c-8b8d1dd2d515)\"" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" Mar 09 10:24:02 crc kubenswrapper[4792]: I0309 10:24:02.794856 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550864-m5bjn" event={"ID":"4ef44af8-2b43-4cf4-8cf6-ce645992e33d","Type":"ContainerStarted","Data":"f6c982ddc371d3eabee8ddf084483a37a75db0598d6f15ca51a1770e356e7f7d"} Mar 09 10:24:02 crc kubenswrapper[4792]: I0309 10:24:02.811518 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29550864-m5bjn" podStartSLOduration=1.9658818569999998 podStartE2EDuration="2.81149429s" podCreationTimestamp="2026-03-09 10:24:00 +0000 UTC" firstStartedPulling="2026-03-09 10:24:01.051960184 +0000 UTC m=+4606.082160936" lastFinishedPulling="2026-03-09 10:24:01.897572607 +0000 UTC m=+4606.927773369" observedRunningTime="2026-03-09 10:24:02.809134236 +0000 UTC m=+4607.839334998" watchObservedRunningTime="2026-03-09 10:24:02.81149429 +0000 UTC m=+4607.841695042" Mar 09 10:24:03 crc kubenswrapper[4792]: I0309 10:24:03.805256 4792 generic.go:334] "Generic (PLEG): container finished" podID="4ef44af8-2b43-4cf4-8cf6-ce645992e33d" containerID="f6c982ddc371d3eabee8ddf084483a37a75db0598d6f15ca51a1770e356e7f7d" exitCode=0 Mar 09 10:24:03 crc kubenswrapper[4792]: I0309 10:24:03.805610 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550864-m5bjn" event={"ID":"4ef44af8-2b43-4cf4-8cf6-ce645992e33d","Type":"ContainerDied","Data":"f6c982ddc371d3eabee8ddf084483a37a75db0598d6f15ca51a1770e356e7f7d"} Mar 09 10:24:05 crc kubenswrapper[4792]: I0309 10:24:05.414993 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-q4mwj" Mar 09 10:24:05 crc kubenswrapper[4792]: I0309 
10:24:05.462974 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550864-m5bjn" Mar 09 10:24:05 crc kubenswrapper[4792]: I0309 10:24:05.527591 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-q4mwj" Mar 09 10:24:05 crc kubenswrapper[4792]: I0309 10:24:05.531212 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj5w8\" (UniqueName: \"kubernetes.io/projected/4ef44af8-2b43-4cf4-8cf6-ce645992e33d-kube-api-access-pj5w8\") pod \"4ef44af8-2b43-4cf4-8cf6-ce645992e33d\" (UID: \"4ef44af8-2b43-4cf4-8cf6-ce645992e33d\") " Mar 09 10:24:05 crc kubenswrapper[4792]: I0309 10:24:05.571357 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ef44af8-2b43-4cf4-8cf6-ce645992e33d-kube-api-access-pj5w8" (OuterVolumeSpecName: "kube-api-access-pj5w8") pod "4ef44af8-2b43-4cf4-8cf6-ce645992e33d" (UID: "4ef44af8-2b43-4cf4-8cf6-ce645992e33d"). InnerVolumeSpecName "kube-api-access-pj5w8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 10:24:05 crc kubenswrapper[4792]: I0309 10:24:05.633754 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj5w8\" (UniqueName: \"kubernetes.io/projected/4ef44af8-2b43-4cf4-8cf6-ce645992e33d-kube-api-access-pj5w8\") on node \"crc\" DevicePath \"\"" Mar 09 10:24:05 crc kubenswrapper[4792]: I0309 10:24:05.690375 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-q4mwj"] Mar 09 10:24:05 crc kubenswrapper[4792]: I0309 10:24:05.825159 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550864-m5bjn" event={"ID":"4ef44af8-2b43-4cf4-8cf6-ce645992e33d","Type":"ContainerDied","Data":"d9d7f330fd69a642c05caeff7d1e9b6689e40e99fded1d24eab9f7c7c398a3bb"} Mar 09 10:24:05 crc kubenswrapper[4792]: I0309 10:24:05.825207 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d9d7f330fd69a642c05caeff7d1e9b6689e40e99fded1d24eab9f7c7c398a3bb" Mar 09 10:24:05 crc kubenswrapper[4792]: I0309 10:24:05.825267 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550864-m5bjn" Mar 09 10:24:05 crc kubenswrapper[4792]: I0309 10:24:05.963125 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550858-zpmfc"] Mar 09 10:24:05 crc kubenswrapper[4792]: I0309 10:24:05.976408 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550858-zpmfc"] Mar 09 10:24:06 crc kubenswrapper[4792]: I0309 10:24:06.832970 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-q4mwj" podUID="c382d4cd-8ad0-4fab-8f15-523e0e36638e" containerName="registry-server" containerID="cri-o://016957a3ecf7a205a943c3bee84974d1cb037083575e8db87cdfce4b1ea74155" gracePeriod=2 Mar 09 10:24:07 crc kubenswrapper[4792]: I0309 10:24:07.390244 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-q4mwj" Mar 09 10:24:07 crc kubenswrapper[4792]: I0309 10:24:07.466310 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c382d4cd-8ad0-4fab-8f15-523e0e36638e-catalog-content\") pod \"c382d4cd-8ad0-4fab-8f15-523e0e36638e\" (UID: \"c382d4cd-8ad0-4fab-8f15-523e0e36638e\") " Mar 09 10:24:07 crc kubenswrapper[4792]: I0309 10:24:07.466366 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qlhs7\" (UniqueName: \"kubernetes.io/projected/c382d4cd-8ad0-4fab-8f15-523e0e36638e-kube-api-access-qlhs7\") pod \"c382d4cd-8ad0-4fab-8f15-523e0e36638e\" (UID: \"c382d4cd-8ad0-4fab-8f15-523e0e36638e\") " Mar 09 10:24:07 crc kubenswrapper[4792]: I0309 10:24:07.466481 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c382d4cd-8ad0-4fab-8f15-523e0e36638e-utilities\") pod 
\"c382d4cd-8ad0-4fab-8f15-523e0e36638e\" (UID: \"c382d4cd-8ad0-4fab-8f15-523e0e36638e\") " Mar 09 10:24:07 crc kubenswrapper[4792]: I0309 10:24:07.467754 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c382d4cd-8ad0-4fab-8f15-523e0e36638e-utilities" (OuterVolumeSpecName: "utilities") pod "c382d4cd-8ad0-4fab-8f15-523e0e36638e" (UID: "c382d4cd-8ad0-4fab-8f15-523e0e36638e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 10:24:07 crc kubenswrapper[4792]: I0309 10:24:07.473809 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c382d4cd-8ad0-4fab-8f15-523e0e36638e-kube-api-access-qlhs7" (OuterVolumeSpecName: "kube-api-access-qlhs7") pod "c382d4cd-8ad0-4fab-8f15-523e0e36638e" (UID: "c382d4cd-8ad0-4fab-8f15-523e0e36638e"). InnerVolumeSpecName "kube-api-access-qlhs7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 10:24:07 crc kubenswrapper[4792]: I0309 10:24:07.568789 4792 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c382d4cd-8ad0-4fab-8f15-523e0e36638e-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 10:24:07 crc kubenswrapper[4792]: I0309 10:24:07.568833 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qlhs7\" (UniqueName: \"kubernetes.io/projected/c382d4cd-8ad0-4fab-8f15-523e0e36638e-kube-api-access-qlhs7\") on node \"crc\" DevicePath \"\"" Mar 09 10:24:07 crc kubenswrapper[4792]: I0309 10:24:07.619872 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c382d4cd-8ad0-4fab-8f15-523e0e36638e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c382d4cd-8ad0-4fab-8f15-523e0e36638e" (UID: "c382d4cd-8ad0-4fab-8f15-523e0e36638e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 10:24:07 crc kubenswrapper[4792]: I0309 10:24:07.671740 4792 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c382d4cd-8ad0-4fab-8f15-523e0e36638e-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 10:24:07 crc kubenswrapper[4792]: I0309 10:24:07.676743 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9c764a5a-2f88-47ce-8b71-6d9d64d3938e" path="/var/lib/kubelet/pods/9c764a5a-2f88-47ce-8b71-6d9d64d3938e/volumes" Mar 09 10:24:07 crc kubenswrapper[4792]: I0309 10:24:07.846029 4792 generic.go:334] "Generic (PLEG): container finished" podID="c382d4cd-8ad0-4fab-8f15-523e0e36638e" containerID="016957a3ecf7a205a943c3bee84974d1cb037083575e8db87cdfce4b1ea74155" exitCode=0 Mar 09 10:24:07 crc kubenswrapper[4792]: I0309 10:24:07.846082 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q4mwj" event={"ID":"c382d4cd-8ad0-4fab-8f15-523e0e36638e","Type":"ContainerDied","Data":"016957a3ecf7a205a943c3bee84974d1cb037083575e8db87cdfce4b1ea74155"} Mar 09 10:24:07 crc kubenswrapper[4792]: I0309 10:24:07.846113 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q4mwj" event={"ID":"c382d4cd-8ad0-4fab-8f15-523e0e36638e","Type":"ContainerDied","Data":"78e3abb3351bfb19a8573ddab0b4c9281e622d0484e7c4976d6cc933fac31a05"} Mar 09 10:24:07 crc kubenswrapper[4792]: I0309 10:24:07.846136 4792 scope.go:117] "RemoveContainer" containerID="016957a3ecf7a205a943c3bee84974d1cb037083575e8db87cdfce4b1ea74155" Mar 09 10:24:07 crc kubenswrapper[4792]: I0309 10:24:07.846159 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-q4mwj" Mar 09 10:24:07 crc kubenswrapper[4792]: I0309 10:24:07.877939 4792 scope.go:117] "RemoveContainer" containerID="cab6ec438469cd30ce0380dc957ce5077bd4df86c1fa3362c390ea11b27595b8" Mar 09 10:24:07 crc kubenswrapper[4792]: I0309 10:24:07.882507 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-q4mwj"] Mar 09 10:24:07 crc kubenswrapper[4792]: I0309 10:24:07.892233 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-q4mwj"] Mar 09 10:24:07 crc kubenswrapper[4792]: I0309 10:24:07.904046 4792 scope.go:117] "RemoveContainer" containerID="6f4fff100e0326d1866b65d53d4c6dc3576650f41ee70b022fda57cb92e648a4" Mar 09 10:24:07 crc kubenswrapper[4792]: I0309 10:24:07.954220 4792 scope.go:117] "RemoveContainer" containerID="016957a3ecf7a205a943c3bee84974d1cb037083575e8db87cdfce4b1ea74155" Mar 09 10:24:07 crc kubenswrapper[4792]: E0309 10:24:07.959509 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"016957a3ecf7a205a943c3bee84974d1cb037083575e8db87cdfce4b1ea74155\": container with ID starting with 016957a3ecf7a205a943c3bee84974d1cb037083575e8db87cdfce4b1ea74155 not found: ID does not exist" containerID="016957a3ecf7a205a943c3bee84974d1cb037083575e8db87cdfce4b1ea74155" Mar 09 10:24:07 crc kubenswrapper[4792]: I0309 10:24:07.959569 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"016957a3ecf7a205a943c3bee84974d1cb037083575e8db87cdfce4b1ea74155"} err="failed to get container status \"016957a3ecf7a205a943c3bee84974d1cb037083575e8db87cdfce4b1ea74155\": rpc error: code = NotFound desc = could not find container \"016957a3ecf7a205a943c3bee84974d1cb037083575e8db87cdfce4b1ea74155\": container with ID starting with 016957a3ecf7a205a943c3bee84974d1cb037083575e8db87cdfce4b1ea74155 not found: ID does 
not exist" Mar 09 10:24:07 crc kubenswrapper[4792]: I0309 10:24:07.959593 4792 scope.go:117] "RemoveContainer" containerID="cab6ec438469cd30ce0380dc957ce5077bd4df86c1fa3362c390ea11b27595b8" Mar 09 10:24:07 crc kubenswrapper[4792]: E0309 10:24:07.959871 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cab6ec438469cd30ce0380dc957ce5077bd4df86c1fa3362c390ea11b27595b8\": container with ID starting with cab6ec438469cd30ce0380dc957ce5077bd4df86c1fa3362c390ea11b27595b8 not found: ID does not exist" containerID="cab6ec438469cd30ce0380dc957ce5077bd4df86c1fa3362c390ea11b27595b8" Mar 09 10:24:07 crc kubenswrapper[4792]: I0309 10:24:07.959895 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cab6ec438469cd30ce0380dc957ce5077bd4df86c1fa3362c390ea11b27595b8"} err="failed to get container status \"cab6ec438469cd30ce0380dc957ce5077bd4df86c1fa3362c390ea11b27595b8\": rpc error: code = NotFound desc = could not find container \"cab6ec438469cd30ce0380dc957ce5077bd4df86c1fa3362c390ea11b27595b8\": container with ID starting with cab6ec438469cd30ce0380dc957ce5077bd4df86c1fa3362c390ea11b27595b8 not found: ID does not exist" Mar 09 10:24:07 crc kubenswrapper[4792]: I0309 10:24:07.959928 4792 scope.go:117] "RemoveContainer" containerID="6f4fff100e0326d1866b65d53d4c6dc3576650f41ee70b022fda57cb92e648a4" Mar 09 10:24:07 crc kubenswrapper[4792]: E0309 10:24:07.960240 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6f4fff100e0326d1866b65d53d4c6dc3576650f41ee70b022fda57cb92e648a4\": container with ID starting with 6f4fff100e0326d1866b65d53d4c6dc3576650f41ee70b022fda57cb92e648a4 not found: ID does not exist" containerID="6f4fff100e0326d1866b65d53d4c6dc3576650f41ee70b022fda57cb92e648a4" Mar 09 10:24:07 crc kubenswrapper[4792]: I0309 10:24:07.960356 4792 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6f4fff100e0326d1866b65d53d4c6dc3576650f41ee70b022fda57cb92e648a4"} err="failed to get container status \"6f4fff100e0326d1866b65d53d4c6dc3576650f41ee70b022fda57cb92e648a4\": rpc error: code = NotFound desc = could not find container \"6f4fff100e0326d1866b65d53d4c6dc3576650f41ee70b022fda57cb92e648a4\": container with ID starting with 6f4fff100e0326d1866b65d53d4c6dc3576650f41ee70b022fda57cb92e648a4 not found: ID does not exist" Mar 09 10:24:09 crc kubenswrapper[4792]: I0309 10:24:09.673697 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c382d4cd-8ad0-4fab-8f15-523e0e36638e" path="/var/lib/kubelet/pods/c382d4cd-8ad0-4fab-8f15-523e0e36638e/volumes" Mar 09 10:24:14 crc kubenswrapper[4792]: I0309 10:24:14.663467 4792 scope.go:117] "RemoveContainer" containerID="22fb615a80b561bd4fd46d065538c5d6038e92646216db0bd28365b1dcaceca2" Mar 09 10:24:14 crc kubenswrapper[4792]: I0309 10:24:14.900619 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-97tth" event={"ID":"bd11045a-d746-4b42-872c-8b8d1dd2d515","Type":"ContainerStarted","Data":"6cb5e0b9a6c34e2e3e5787fefd658600bfed5b18ede479ce2bf2ac9c9b2e5218"} Mar 09 10:24:32 crc kubenswrapper[4792]: I0309 10:24:32.547523 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-dxwb7"] Mar 09 10:24:32 crc kubenswrapper[4792]: E0309 10:24:32.548584 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c382d4cd-8ad0-4fab-8f15-523e0e36638e" containerName="extract-content" Mar 09 10:24:32 crc kubenswrapper[4792]: I0309 10:24:32.548601 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="c382d4cd-8ad0-4fab-8f15-523e0e36638e" containerName="extract-content" Mar 09 10:24:32 crc kubenswrapper[4792]: E0309 10:24:32.548619 4792 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="4ef44af8-2b43-4cf4-8cf6-ce645992e33d" containerName="oc" Mar 09 10:24:32 crc kubenswrapper[4792]: I0309 10:24:32.548627 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ef44af8-2b43-4cf4-8cf6-ce645992e33d" containerName="oc" Mar 09 10:24:32 crc kubenswrapper[4792]: E0309 10:24:32.548661 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c382d4cd-8ad0-4fab-8f15-523e0e36638e" containerName="registry-server" Mar 09 10:24:32 crc kubenswrapper[4792]: I0309 10:24:32.548671 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="c382d4cd-8ad0-4fab-8f15-523e0e36638e" containerName="registry-server" Mar 09 10:24:32 crc kubenswrapper[4792]: E0309 10:24:32.548688 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c382d4cd-8ad0-4fab-8f15-523e0e36638e" containerName="extract-utilities" Mar 09 10:24:32 crc kubenswrapper[4792]: I0309 10:24:32.548695 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="c382d4cd-8ad0-4fab-8f15-523e0e36638e" containerName="extract-utilities" Mar 09 10:24:32 crc kubenswrapper[4792]: I0309 10:24:32.548895 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="c382d4cd-8ad0-4fab-8f15-523e0e36638e" containerName="registry-server" Mar 09 10:24:32 crc kubenswrapper[4792]: I0309 10:24:32.548912 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ef44af8-2b43-4cf4-8cf6-ce645992e33d" containerName="oc" Mar 09 10:24:32 crc kubenswrapper[4792]: I0309 10:24:32.550640 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-dxwb7" Mar 09 10:24:32 crc kubenswrapper[4792]: I0309 10:24:32.569698 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dxwb7"] Mar 09 10:24:32 crc kubenswrapper[4792]: I0309 10:24:32.661747 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/05c60ace-1c28-4c14-b3d5-4064b84d0671-utilities\") pod \"certified-operators-dxwb7\" (UID: \"05c60ace-1c28-4c14-b3d5-4064b84d0671\") " pod="openshift-marketplace/certified-operators-dxwb7" Mar 09 10:24:32 crc kubenswrapper[4792]: I0309 10:24:32.661821 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/05c60ace-1c28-4c14-b3d5-4064b84d0671-catalog-content\") pod \"certified-operators-dxwb7\" (UID: \"05c60ace-1c28-4c14-b3d5-4064b84d0671\") " pod="openshift-marketplace/certified-operators-dxwb7" Mar 09 10:24:32 crc kubenswrapper[4792]: I0309 10:24:32.661933 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-24ndv\" (UniqueName: \"kubernetes.io/projected/05c60ace-1c28-4c14-b3d5-4064b84d0671-kube-api-access-24ndv\") pod \"certified-operators-dxwb7\" (UID: \"05c60ace-1c28-4c14-b3d5-4064b84d0671\") " pod="openshift-marketplace/certified-operators-dxwb7" Mar 09 10:24:32 crc kubenswrapper[4792]: I0309 10:24:32.763793 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/05c60ace-1c28-4c14-b3d5-4064b84d0671-utilities\") pod \"certified-operators-dxwb7\" (UID: \"05c60ace-1c28-4c14-b3d5-4064b84d0671\") " pod="openshift-marketplace/certified-operators-dxwb7" Mar 09 10:24:32 crc kubenswrapper[4792]: I0309 10:24:32.763903 4792 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/05c60ace-1c28-4c14-b3d5-4064b84d0671-catalog-content\") pod \"certified-operators-dxwb7\" (UID: \"05c60ace-1c28-4c14-b3d5-4064b84d0671\") " pod="openshift-marketplace/certified-operators-dxwb7" Mar 09 10:24:32 crc kubenswrapper[4792]: I0309 10:24:32.764001 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-24ndv\" (UniqueName: \"kubernetes.io/projected/05c60ace-1c28-4c14-b3d5-4064b84d0671-kube-api-access-24ndv\") pod \"certified-operators-dxwb7\" (UID: \"05c60ace-1c28-4c14-b3d5-4064b84d0671\") " pod="openshift-marketplace/certified-operators-dxwb7" Mar 09 10:24:32 crc kubenswrapper[4792]: I0309 10:24:32.764850 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/05c60ace-1c28-4c14-b3d5-4064b84d0671-utilities\") pod \"certified-operators-dxwb7\" (UID: \"05c60ace-1c28-4c14-b3d5-4064b84d0671\") " pod="openshift-marketplace/certified-operators-dxwb7" Mar 09 10:24:32 crc kubenswrapper[4792]: I0309 10:24:32.766382 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/05c60ace-1c28-4c14-b3d5-4064b84d0671-catalog-content\") pod \"certified-operators-dxwb7\" (UID: \"05c60ace-1c28-4c14-b3d5-4064b84d0671\") " pod="openshift-marketplace/certified-operators-dxwb7" Mar 09 10:24:32 crc kubenswrapper[4792]: I0309 10:24:32.786911 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-24ndv\" (UniqueName: \"kubernetes.io/projected/05c60ace-1c28-4c14-b3d5-4064b84d0671-kube-api-access-24ndv\") pod \"certified-operators-dxwb7\" (UID: \"05c60ace-1c28-4c14-b3d5-4064b84d0671\") " pod="openshift-marketplace/certified-operators-dxwb7" Mar 09 10:24:32 crc kubenswrapper[4792]: I0309 10:24:32.873162 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-dxwb7" Mar 09 10:24:33 crc kubenswrapper[4792]: I0309 10:24:33.444738 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dxwb7"] Mar 09 10:24:34 crc kubenswrapper[4792]: I0309 10:24:34.105206 4792 generic.go:334] "Generic (PLEG): container finished" podID="05c60ace-1c28-4c14-b3d5-4064b84d0671" containerID="7b10946cdc9f6fc06c931d916c4ee882dbd44a8289ce0e782c9bcac39054fa0f" exitCode=0 Mar 09 10:24:34 crc kubenswrapper[4792]: I0309 10:24:34.105390 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dxwb7" event={"ID":"05c60ace-1c28-4c14-b3d5-4064b84d0671","Type":"ContainerDied","Data":"7b10946cdc9f6fc06c931d916c4ee882dbd44a8289ce0e782c9bcac39054fa0f"} Mar 09 10:24:34 crc kubenswrapper[4792]: I0309 10:24:34.105528 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dxwb7" event={"ID":"05c60ace-1c28-4c14-b3d5-4064b84d0671","Type":"ContainerStarted","Data":"10005bddd9bb8069292f7823d2ce865b96849fce4f02672525cd6eafe9b85362"} Mar 09 10:24:36 crc kubenswrapper[4792]: I0309 10:24:36.123738 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dxwb7" event={"ID":"05c60ace-1c28-4c14-b3d5-4064b84d0671","Type":"ContainerStarted","Data":"1fc085a2b7f5ac7cfc0db2765f627d86e5b457339a9b9089af86a35ca43388df"} Mar 09 10:24:38 crc kubenswrapper[4792]: I0309 10:24:38.143890 4792 generic.go:334] "Generic (PLEG): container finished" podID="05c60ace-1c28-4c14-b3d5-4064b84d0671" containerID="1fc085a2b7f5ac7cfc0db2765f627d86e5b457339a9b9089af86a35ca43388df" exitCode=0 Mar 09 10:24:38 crc kubenswrapper[4792]: I0309 10:24:38.143962 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dxwb7" 
event={"ID":"05c60ace-1c28-4c14-b3d5-4064b84d0671","Type":"ContainerDied","Data":"1fc085a2b7f5ac7cfc0db2765f627d86e5b457339a9b9089af86a35ca43388df"} Mar 09 10:24:39 crc kubenswrapper[4792]: I0309 10:24:39.154311 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dxwb7" event={"ID":"05c60ace-1c28-4c14-b3d5-4064b84d0671","Type":"ContainerStarted","Data":"90055d547e0d30aa69d432081396ebdbb18dfd491d258313c8d755870e4a7386"} Mar 09 10:24:39 crc kubenswrapper[4792]: I0309 10:24:39.176399 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-dxwb7" podStartSLOduration=2.507249394 podStartE2EDuration="7.176378103s" podCreationTimestamp="2026-03-09 10:24:32 +0000 UTC" firstStartedPulling="2026-03-09 10:24:34.107368624 +0000 UTC m=+4639.137569376" lastFinishedPulling="2026-03-09 10:24:38.776497333 +0000 UTC m=+4643.806698085" observedRunningTime="2026-03-09 10:24:39.168732278 +0000 UTC m=+4644.198933020" watchObservedRunningTime="2026-03-09 10:24:39.176378103 +0000 UTC m=+4644.206578855" Mar 09 10:24:42 crc kubenswrapper[4792]: I0309 10:24:42.873473 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-dxwb7" Mar 09 10:24:42 crc kubenswrapper[4792]: I0309 10:24:42.875164 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-dxwb7" Mar 09 10:24:42 crc kubenswrapper[4792]: I0309 10:24:42.922297 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-dxwb7" Mar 09 10:24:44 crc kubenswrapper[4792]: I0309 10:24:44.246314 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-dxwb7" Mar 09 10:24:44 crc kubenswrapper[4792]: I0309 10:24:44.297265 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/certified-operators-dxwb7"] Mar 09 10:24:44 crc kubenswrapper[4792]: I0309 10:24:44.806608 4792 scope.go:117] "RemoveContainer" containerID="6b8a0fe162f772738010c7ea4c42f74a04db402148be803f3e5302e300eb15d3" Mar 09 10:24:46 crc kubenswrapper[4792]: I0309 10:24:46.222510 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-dxwb7" podUID="05c60ace-1c28-4c14-b3d5-4064b84d0671" containerName="registry-server" containerID="cri-o://90055d547e0d30aa69d432081396ebdbb18dfd491d258313c8d755870e4a7386" gracePeriod=2 Mar 09 10:24:46 crc kubenswrapper[4792]: I0309 10:24:46.773964 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dxwb7" Mar 09 10:24:46 crc kubenswrapper[4792]: I0309 10:24:46.892059 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/05c60ace-1c28-4c14-b3d5-4064b84d0671-catalog-content\") pod \"05c60ace-1c28-4c14-b3d5-4064b84d0671\" (UID: \"05c60ace-1c28-4c14-b3d5-4064b84d0671\") " Mar 09 10:24:46 crc kubenswrapper[4792]: I0309 10:24:46.892353 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-24ndv\" (UniqueName: \"kubernetes.io/projected/05c60ace-1c28-4c14-b3d5-4064b84d0671-kube-api-access-24ndv\") pod \"05c60ace-1c28-4c14-b3d5-4064b84d0671\" (UID: \"05c60ace-1c28-4c14-b3d5-4064b84d0671\") " Mar 09 10:24:46 crc kubenswrapper[4792]: I0309 10:24:46.892410 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/05c60ace-1c28-4c14-b3d5-4064b84d0671-utilities\") pod \"05c60ace-1c28-4c14-b3d5-4064b84d0671\" (UID: \"05c60ace-1c28-4c14-b3d5-4064b84d0671\") " Mar 09 10:24:46 crc kubenswrapper[4792]: I0309 10:24:46.893272 4792 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/empty-dir/05c60ace-1c28-4c14-b3d5-4064b84d0671-utilities" (OuterVolumeSpecName: "utilities") pod "05c60ace-1c28-4c14-b3d5-4064b84d0671" (UID: "05c60ace-1c28-4c14-b3d5-4064b84d0671"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 10:24:46 crc kubenswrapper[4792]: I0309 10:24:46.894109 4792 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/05c60ace-1c28-4c14-b3d5-4064b84d0671-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 10:24:46 crc kubenswrapper[4792]: I0309 10:24:46.901233 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/05c60ace-1c28-4c14-b3d5-4064b84d0671-kube-api-access-24ndv" (OuterVolumeSpecName: "kube-api-access-24ndv") pod "05c60ace-1c28-4c14-b3d5-4064b84d0671" (UID: "05c60ace-1c28-4c14-b3d5-4064b84d0671"). InnerVolumeSpecName "kube-api-access-24ndv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 10:24:46 crc kubenswrapper[4792]: I0309 10:24:46.964084 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/05c60ace-1c28-4c14-b3d5-4064b84d0671-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "05c60ace-1c28-4c14-b3d5-4064b84d0671" (UID: "05c60ace-1c28-4c14-b3d5-4064b84d0671"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 10:24:46 crc kubenswrapper[4792]: I0309 10:24:46.996461 4792 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/05c60ace-1c28-4c14-b3d5-4064b84d0671-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 10:24:46 crc kubenswrapper[4792]: I0309 10:24:46.996502 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-24ndv\" (UniqueName: \"kubernetes.io/projected/05c60ace-1c28-4c14-b3d5-4064b84d0671-kube-api-access-24ndv\") on node \"crc\" DevicePath \"\"" Mar 09 10:24:47 crc kubenswrapper[4792]: I0309 10:24:47.236101 4792 generic.go:334] "Generic (PLEG): container finished" podID="05c60ace-1c28-4c14-b3d5-4064b84d0671" containerID="90055d547e0d30aa69d432081396ebdbb18dfd491d258313c8d755870e4a7386" exitCode=0 Mar 09 10:24:47 crc kubenswrapper[4792]: I0309 10:24:47.236265 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dxwb7" event={"ID":"05c60ace-1c28-4c14-b3d5-4064b84d0671","Type":"ContainerDied","Data":"90055d547e0d30aa69d432081396ebdbb18dfd491d258313c8d755870e4a7386"} Mar 09 10:24:47 crc kubenswrapper[4792]: I0309 10:24:47.236468 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dxwb7" event={"ID":"05c60ace-1c28-4c14-b3d5-4064b84d0671","Type":"ContainerDied","Data":"10005bddd9bb8069292f7823d2ce865b96849fce4f02672525cd6eafe9b85362"} Mar 09 10:24:47 crc kubenswrapper[4792]: I0309 10:24:47.236492 4792 scope.go:117] "RemoveContainer" containerID="90055d547e0d30aa69d432081396ebdbb18dfd491d258313c8d755870e4a7386" Mar 09 10:24:47 crc kubenswrapper[4792]: I0309 10:24:47.236364 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-dxwb7" Mar 09 10:24:47 crc kubenswrapper[4792]: I0309 10:24:47.277705 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dxwb7"] Mar 09 10:24:47 crc kubenswrapper[4792]: I0309 10:24:47.280704 4792 scope.go:117] "RemoveContainer" containerID="1fc085a2b7f5ac7cfc0db2765f627d86e5b457339a9b9089af86a35ca43388df" Mar 09 10:24:47 crc kubenswrapper[4792]: I0309 10:24:47.286773 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-dxwb7"] Mar 09 10:24:47 crc kubenswrapper[4792]: I0309 10:24:47.303603 4792 scope.go:117] "RemoveContainer" containerID="7b10946cdc9f6fc06c931d916c4ee882dbd44a8289ce0e782c9bcac39054fa0f" Mar 09 10:24:47 crc kubenswrapper[4792]: I0309 10:24:47.357286 4792 scope.go:117] "RemoveContainer" containerID="90055d547e0d30aa69d432081396ebdbb18dfd491d258313c8d755870e4a7386" Mar 09 10:24:47 crc kubenswrapper[4792]: E0309 10:24:47.358779 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"90055d547e0d30aa69d432081396ebdbb18dfd491d258313c8d755870e4a7386\": container with ID starting with 90055d547e0d30aa69d432081396ebdbb18dfd491d258313c8d755870e4a7386 not found: ID does not exist" containerID="90055d547e0d30aa69d432081396ebdbb18dfd491d258313c8d755870e4a7386" Mar 09 10:24:47 crc kubenswrapper[4792]: I0309 10:24:47.358815 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"90055d547e0d30aa69d432081396ebdbb18dfd491d258313c8d755870e4a7386"} err="failed to get container status \"90055d547e0d30aa69d432081396ebdbb18dfd491d258313c8d755870e4a7386\": rpc error: code = NotFound desc = could not find container \"90055d547e0d30aa69d432081396ebdbb18dfd491d258313c8d755870e4a7386\": container with ID starting with 90055d547e0d30aa69d432081396ebdbb18dfd491d258313c8d755870e4a7386 not 
found: ID does not exist" Mar 09 10:24:47 crc kubenswrapper[4792]: I0309 10:24:47.358841 4792 scope.go:117] "RemoveContainer" containerID="1fc085a2b7f5ac7cfc0db2765f627d86e5b457339a9b9089af86a35ca43388df" Mar 09 10:24:47 crc kubenswrapper[4792]: E0309 10:24:47.359346 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1fc085a2b7f5ac7cfc0db2765f627d86e5b457339a9b9089af86a35ca43388df\": container with ID starting with 1fc085a2b7f5ac7cfc0db2765f627d86e5b457339a9b9089af86a35ca43388df not found: ID does not exist" containerID="1fc085a2b7f5ac7cfc0db2765f627d86e5b457339a9b9089af86a35ca43388df" Mar 09 10:24:47 crc kubenswrapper[4792]: I0309 10:24:47.359369 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1fc085a2b7f5ac7cfc0db2765f627d86e5b457339a9b9089af86a35ca43388df"} err="failed to get container status \"1fc085a2b7f5ac7cfc0db2765f627d86e5b457339a9b9089af86a35ca43388df\": rpc error: code = NotFound desc = could not find container \"1fc085a2b7f5ac7cfc0db2765f627d86e5b457339a9b9089af86a35ca43388df\": container with ID starting with 1fc085a2b7f5ac7cfc0db2765f627d86e5b457339a9b9089af86a35ca43388df not found: ID does not exist" Mar 09 10:24:47 crc kubenswrapper[4792]: I0309 10:24:47.359386 4792 scope.go:117] "RemoveContainer" containerID="7b10946cdc9f6fc06c931d916c4ee882dbd44a8289ce0e782c9bcac39054fa0f" Mar 09 10:24:47 crc kubenswrapper[4792]: E0309 10:24:47.359672 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7b10946cdc9f6fc06c931d916c4ee882dbd44a8289ce0e782c9bcac39054fa0f\": container with ID starting with 7b10946cdc9f6fc06c931d916c4ee882dbd44a8289ce0e782c9bcac39054fa0f not found: ID does not exist" containerID="7b10946cdc9f6fc06c931d916c4ee882dbd44a8289ce0e782c9bcac39054fa0f" Mar 09 10:24:47 crc kubenswrapper[4792]: I0309 10:24:47.359697 4792 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b10946cdc9f6fc06c931d916c4ee882dbd44a8289ce0e782c9bcac39054fa0f"} err="failed to get container status \"7b10946cdc9f6fc06c931d916c4ee882dbd44a8289ce0e782c9bcac39054fa0f\": rpc error: code = NotFound desc = could not find container \"7b10946cdc9f6fc06c931d916c4ee882dbd44a8289ce0e782c9bcac39054fa0f\": container with ID starting with 7b10946cdc9f6fc06c931d916c4ee882dbd44a8289ce0e782c9bcac39054fa0f not found: ID does not exist" Mar 09 10:24:47 crc kubenswrapper[4792]: I0309 10:24:47.675359 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="05c60ace-1c28-4c14-b3d5-4064b84d0671" path="/var/lib/kubelet/pods/05c60ace-1c28-4c14-b3d5-4064b84d0671/volumes" Mar 09 10:24:58 crc kubenswrapper[4792]: I0309 10:24:58.335995 4792 generic.go:334] "Generic (PLEG): container finished" podID="0b3707b4-8ae3-481c-9a67-45f24402244f" containerID="e37b7e52b94d24c0021aafab46dbee8905bfea81d15b7477b29b149aaacc12a0" exitCode=0 Mar 09 10:24:58 crc kubenswrapper[4792]: I0309 10:24:58.336061 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qmm66/must-gather-z6zwb" event={"ID":"0b3707b4-8ae3-481c-9a67-45f24402244f","Type":"ContainerDied","Data":"e37b7e52b94d24c0021aafab46dbee8905bfea81d15b7477b29b149aaacc12a0"} Mar 09 10:24:58 crc kubenswrapper[4792]: I0309 10:24:58.336855 4792 scope.go:117] "RemoveContainer" containerID="e37b7e52b94d24c0021aafab46dbee8905bfea81d15b7477b29b149aaacc12a0" Mar 09 10:24:58 crc kubenswrapper[4792]: I0309 10:24:58.955018 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-qmm66_must-gather-z6zwb_0b3707b4-8ae3-481c-9a67-45f24402244f/gather/0.log" Mar 09 10:25:08 crc kubenswrapper[4792]: I0309 10:25:08.933095 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-qmm66/must-gather-z6zwb"] Mar 09 10:25:08 crc kubenswrapper[4792]: I0309 10:25:08.933850 4792 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-qmm66/must-gather-z6zwb" podUID="0b3707b4-8ae3-481c-9a67-45f24402244f" containerName="copy" containerID="cri-o://779a897a2032f14fbe8de77778c56e342d4972365fce59925db9305dc238f0ac" gracePeriod=2 Mar 09 10:25:08 crc kubenswrapper[4792]: I0309 10:25:08.951778 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-qmm66/must-gather-z6zwb"] Mar 09 10:25:09 crc kubenswrapper[4792]: E0309 10:25:09.080911 4792 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0b3707b4_8ae3_481c_9a67_45f24402244f.slice/crio-conmon-779a897a2032f14fbe8de77778c56e342d4972365fce59925db9305dc238f0ac.scope\": RecentStats: unable to find data in memory cache]" Mar 09 10:25:09 crc kubenswrapper[4792]: I0309 10:25:09.407251 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-qmm66_must-gather-z6zwb_0b3707b4-8ae3-481c-9a67-45f24402244f/copy/0.log" Mar 09 10:25:09 crc kubenswrapper[4792]: I0309 10:25:09.407900 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-qmm66/must-gather-z6zwb" Mar 09 10:25:09 crc kubenswrapper[4792]: I0309 10:25:09.577206 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/0b3707b4-8ae3-481c-9a67-45f24402244f-must-gather-output\") pod \"0b3707b4-8ae3-481c-9a67-45f24402244f\" (UID: \"0b3707b4-8ae3-481c-9a67-45f24402244f\") " Mar 09 10:25:09 crc kubenswrapper[4792]: I0309 10:25:09.577434 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4knq7\" (UniqueName: \"kubernetes.io/projected/0b3707b4-8ae3-481c-9a67-45f24402244f-kube-api-access-4knq7\") pod \"0b3707b4-8ae3-481c-9a67-45f24402244f\" (UID: \"0b3707b4-8ae3-481c-9a67-45f24402244f\") " Mar 09 10:25:09 crc kubenswrapper[4792]: I0309 10:25:09.584001 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b3707b4-8ae3-481c-9a67-45f24402244f-kube-api-access-4knq7" (OuterVolumeSpecName: "kube-api-access-4knq7") pod "0b3707b4-8ae3-481c-9a67-45f24402244f" (UID: "0b3707b4-8ae3-481c-9a67-45f24402244f"). InnerVolumeSpecName "kube-api-access-4knq7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 10:25:09 crc kubenswrapper[4792]: I0309 10:25:09.680593 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4knq7\" (UniqueName: \"kubernetes.io/projected/0b3707b4-8ae3-481c-9a67-45f24402244f-kube-api-access-4knq7\") on node \"crc\" DevicePath \"\"" Mar 09 10:25:09 crc kubenswrapper[4792]: I0309 10:25:09.756695 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0b3707b4-8ae3-481c-9a67-45f24402244f-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "0b3707b4-8ae3-481c-9a67-45f24402244f" (UID: "0b3707b4-8ae3-481c-9a67-45f24402244f"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 10:25:09 crc kubenswrapper[4792]: I0309 10:25:09.782386 4792 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/0b3707b4-8ae3-481c-9a67-45f24402244f-must-gather-output\") on node \"crc\" DevicePath \"\"" Mar 09 10:25:09 crc kubenswrapper[4792]: I0309 10:25:09.819125 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-qmm66_must-gather-z6zwb_0b3707b4-8ae3-481c-9a67-45f24402244f/copy/0.log" Mar 09 10:25:09 crc kubenswrapper[4792]: I0309 10:25:09.819487 4792 generic.go:334] "Generic (PLEG): container finished" podID="0b3707b4-8ae3-481c-9a67-45f24402244f" containerID="779a897a2032f14fbe8de77778c56e342d4972365fce59925db9305dc238f0ac" exitCode=143 Mar 09 10:25:09 crc kubenswrapper[4792]: I0309 10:25:09.819524 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-qmm66/must-gather-z6zwb" Mar 09 10:25:09 crc kubenswrapper[4792]: I0309 10:25:09.819570 4792 scope.go:117] "RemoveContainer" containerID="779a897a2032f14fbe8de77778c56e342d4972365fce59925db9305dc238f0ac" Mar 09 10:25:09 crc kubenswrapper[4792]: I0309 10:25:09.846580 4792 scope.go:117] "RemoveContainer" containerID="e37b7e52b94d24c0021aafab46dbee8905bfea81d15b7477b29b149aaacc12a0" Mar 09 10:25:09 crc kubenswrapper[4792]: I0309 10:25:09.888803 4792 scope.go:117] "RemoveContainer" containerID="779a897a2032f14fbe8de77778c56e342d4972365fce59925db9305dc238f0ac" Mar 09 10:25:09 crc kubenswrapper[4792]: E0309 10:25:09.889518 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"779a897a2032f14fbe8de77778c56e342d4972365fce59925db9305dc238f0ac\": container with ID starting with 779a897a2032f14fbe8de77778c56e342d4972365fce59925db9305dc238f0ac not found: ID does not exist" 
containerID="779a897a2032f14fbe8de77778c56e342d4972365fce59925db9305dc238f0ac" Mar 09 10:25:09 crc kubenswrapper[4792]: I0309 10:25:09.889563 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"779a897a2032f14fbe8de77778c56e342d4972365fce59925db9305dc238f0ac"} err="failed to get container status \"779a897a2032f14fbe8de77778c56e342d4972365fce59925db9305dc238f0ac\": rpc error: code = NotFound desc = could not find container \"779a897a2032f14fbe8de77778c56e342d4972365fce59925db9305dc238f0ac\": container with ID starting with 779a897a2032f14fbe8de77778c56e342d4972365fce59925db9305dc238f0ac not found: ID does not exist" Mar 09 10:25:09 crc kubenswrapper[4792]: I0309 10:25:09.889582 4792 scope.go:117] "RemoveContainer" containerID="e37b7e52b94d24c0021aafab46dbee8905bfea81d15b7477b29b149aaacc12a0" Mar 09 10:25:09 crc kubenswrapper[4792]: E0309 10:25:09.889990 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e37b7e52b94d24c0021aafab46dbee8905bfea81d15b7477b29b149aaacc12a0\": container with ID starting with e37b7e52b94d24c0021aafab46dbee8905bfea81d15b7477b29b149aaacc12a0 not found: ID does not exist" containerID="e37b7e52b94d24c0021aafab46dbee8905bfea81d15b7477b29b149aaacc12a0" Mar 09 10:25:09 crc kubenswrapper[4792]: I0309 10:25:09.890032 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e37b7e52b94d24c0021aafab46dbee8905bfea81d15b7477b29b149aaacc12a0"} err="failed to get container status \"e37b7e52b94d24c0021aafab46dbee8905bfea81d15b7477b29b149aaacc12a0\": rpc error: code = NotFound desc = could not find container \"e37b7e52b94d24c0021aafab46dbee8905bfea81d15b7477b29b149aaacc12a0\": container with ID starting with e37b7e52b94d24c0021aafab46dbee8905bfea81d15b7477b29b149aaacc12a0 not found: ID does not exist" Mar 09 10:25:11 crc kubenswrapper[4792]: I0309 10:25:11.675533 4792 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b3707b4-8ae3-481c-9a67-45f24402244f" path="/var/lib/kubelet/pods/0b3707b4-8ae3-481c-9a67-45f24402244f/volumes" Mar 09 10:26:00 crc kubenswrapper[4792]: I0309 10:26:00.146312 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550866-xvz8b"] Mar 09 10:26:00 crc kubenswrapper[4792]: E0309 10:26:00.147186 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05c60ace-1c28-4c14-b3d5-4064b84d0671" containerName="extract-content" Mar 09 10:26:00 crc kubenswrapper[4792]: I0309 10:26:00.147202 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="05c60ace-1c28-4c14-b3d5-4064b84d0671" containerName="extract-content" Mar 09 10:26:00 crc kubenswrapper[4792]: E0309 10:26:00.147225 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05c60ace-1c28-4c14-b3d5-4064b84d0671" containerName="registry-server" Mar 09 10:26:00 crc kubenswrapper[4792]: I0309 10:26:00.147234 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="05c60ace-1c28-4c14-b3d5-4064b84d0671" containerName="registry-server" Mar 09 10:26:00 crc kubenswrapper[4792]: E0309 10:26:00.147255 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b3707b4-8ae3-481c-9a67-45f24402244f" containerName="gather" Mar 09 10:26:00 crc kubenswrapper[4792]: I0309 10:26:00.147275 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b3707b4-8ae3-481c-9a67-45f24402244f" containerName="gather" Mar 09 10:26:00 crc kubenswrapper[4792]: E0309 10:26:00.147298 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05c60ace-1c28-4c14-b3d5-4064b84d0671" containerName="extract-utilities" Mar 09 10:26:00 crc kubenswrapper[4792]: I0309 10:26:00.147305 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="05c60ace-1c28-4c14-b3d5-4064b84d0671" containerName="extract-utilities" Mar 09 10:26:00 crc kubenswrapper[4792]: E0309 10:26:00.147320 4792 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b3707b4-8ae3-481c-9a67-45f24402244f" containerName="copy" Mar 09 10:26:00 crc kubenswrapper[4792]: I0309 10:26:00.147327 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b3707b4-8ae3-481c-9a67-45f24402244f" containerName="copy" Mar 09 10:26:00 crc kubenswrapper[4792]: I0309 10:26:00.147530 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b3707b4-8ae3-481c-9a67-45f24402244f" containerName="copy" Mar 09 10:26:00 crc kubenswrapper[4792]: I0309 10:26:00.147559 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b3707b4-8ae3-481c-9a67-45f24402244f" containerName="gather" Mar 09 10:26:00 crc kubenswrapper[4792]: I0309 10:26:00.147572 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="05c60ace-1c28-4c14-b3d5-4064b84d0671" containerName="registry-server" Mar 09 10:26:00 crc kubenswrapper[4792]: I0309 10:26:00.148347 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550866-xvz8b" Mar 09 10:26:00 crc kubenswrapper[4792]: I0309 10:26:00.155060 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 10:26:00 crc kubenswrapper[4792]: I0309 10:26:00.155594 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fwclj" Mar 09 10:26:00 crc kubenswrapper[4792]: I0309 10:26:00.155824 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 10:26:00 crc kubenswrapper[4792]: I0309 10:26:00.156125 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550866-xvz8b"] Mar 09 10:26:00 crc kubenswrapper[4792]: I0309 10:26:00.156616 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8mc2t\" (UniqueName: \"kubernetes.io/projected/ecee766e-be60-4c51-a24e-88dccfa0c460-kube-api-access-8mc2t\") pod \"auto-csr-approver-29550866-xvz8b\" (UID: \"ecee766e-be60-4c51-a24e-88dccfa0c460\") " pod="openshift-infra/auto-csr-approver-29550866-xvz8b" Mar 09 10:26:00 crc kubenswrapper[4792]: I0309 10:26:00.258172 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8mc2t\" (UniqueName: \"kubernetes.io/projected/ecee766e-be60-4c51-a24e-88dccfa0c460-kube-api-access-8mc2t\") pod \"auto-csr-approver-29550866-xvz8b\" (UID: \"ecee766e-be60-4c51-a24e-88dccfa0c460\") " pod="openshift-infra/auto-csr-approver-29550866-xvz8b" Mar 09 10:26:00 crc kubenswrapper[4792]: I0309 10:26:00.294412 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8mc2t\" (UniqueName: \"kubernetes.io/projected/ecee766e-be60-4c51-a24e-88dccfa0c460-kube-api-access-8mc2t\") pod \"auto-csr-approver-29550866-xvz8b\" (UID: \"ecee766e-be60-4c51-a24e-88dccfa0c460\") " 
pod="openshift-infra/auto-csr-approver-29550866-xvz8b" Mar 09 10:26:00 crc kubenswrapper[4792]: I0309 10:26:00.468883 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550866-xvz8b" Mar 09 10:26:00 crc kubenswrapper[4792]: I0309 10:26:00.956162 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550866-xvz8b"] Mar 09 10:26:01 crc kubenswrapper[4792]: I0309 10:26:01.246489 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550866-xvz8b" event={"ID":"ecee766e-be60-4c51-a24e-88dccfa0c460","Type":"ContainerStarted","Data":"c3f49426ef81108fba2dcdac41923c93ab032191d857e41d52520becb7ca3fa7"} Mar 09 10:26:03 crc kubenswrapper[4792]: I0309 10:26:03.264001 4792 generic.go:334] "Generic (PLEG): container finished" podID="ecee766e-be60-4c51-a24e-88dccfa0c460" containerID="e925f019ff7525740362ab513ef045e02b3ccdf1c69383fd9402e922585067d5" exitCode=0 Mar 09 10:26:03 crc kubenswrapper[4792]: I0309 10:26:03.264114 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550866-xvz8b" event={"ID":"ecee766e-be60-4c51-a24e-88dccfa0c460","Type":"ContainerDied","Data":"e925f019ff7525740362ab513ef045e02b3ccdf1c69383fd9402e922585067d5"} Mar 09 10:26:04 crc kubenswrapper[4792]: I0309 10:26:04.587608 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550866-xvz8b" Mar 09 10:26:04 crc kubenswrapper[4792]: I0309 10:26:04.680270 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8mc2t\" (UniqueName: \"kubernetes.io/projected/ecee766e-be60-4c51-a24e-88dccfa0c460-kube-api-access-8mc2t\") pod \"ecee766e-be60-4c51-a24e-88dccfa0c460\" (UID: \"ecee766e-be60-4c51-a24e-88dccfa0c460\") " Mar 09 10:26:04 crc kubenswrapper[4792]: I0309 10:26:04.701263 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ecee766e-be60-4c51-a24e-88dccfa0c460-kube-api-access-8mc2t" (OuterVolumeSpecName: "kube-api-access-8mc2t") pod "ecee766e-be60-4c51-a24e-88dccfa0c460" (UID: "ecee766e-be60-4c51-a24e-88dccfa0c460"). InnerVolumeSpecName "kube-api-access-8mc2t". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 10:26:04 crc kubenswrapper[4792]: I0309 10:26:04.782655 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8mc2t\" (UniqueName: \"kubernetes.io/projected/ecee766e-be60-4c51-a24e-88dccfa0c460-kube-api-access-8mc2t\") on node \"crc\" DevicePath \"\"" Mar 09 10:26:05 crc kubenswrapper[4792]: I0309 10:26:05.295493 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550866-xvz8b" event={"ID":"ecee766e-be60-4c51-a24e-88dccfa0c460","Type":"ContainerDied","Data":"c3f49426ef81108fba2dcdac41923c93ab032191d857e41d52520becb7ca3fa7"} Mar 09 10:26:05 crc kubenswrapper[4792]: I0309 10:26:05.296044 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c3f49426ef81108fba2dcdac41923c93ab032191d857e41d52520becb7ca3fa7" Mar 09 10:26:05 crc kubenswrapper[4792]: I0309 10:26:05.295559 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550866-xvz8b" Mar 09 10:26:05 crc kubenswrapper[4792]: I0309 10:26:05.658994 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550860-mf978"] Mar 09 10:26:05 crc kubenswrapper[4792]: I0309 10:26:05.674515 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550860-mf978"] Mar 09 10:26:07 crc kubenswrapper[4792]: I0309 10:26:07.674292 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6306ab16-18db-41a2-97f5-4d56712f91c2" path="/var/lib/kubelet/pods/6306ab16-18db-41a2-97f5-4d56712f91c2/volumes" Mar 09 10:26:43 crc kubenswrapper[4792]: I0309 10:26:43.213909 4792 patch_prober.go:28] interesting pod/machine-config-daemon-97tth container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 10:26:43 crc kubenswrapper[4792]: I0309 10:26:43.214355 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 10:26:45 crc kubenswrapper[4792]: I0309 10:26:45.281766 4792 scope.go:117] "RemoveContainer" containerID="adb11d98d9f9d769b445e4926cbc174e35edbb3cf94a6cdc17240ec02b4807f7" Mar 09 10:27:07 crc kubenswrapper[4792]: I0309 10:27:07.828117 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-mjlrt"] Mar 09 10:27:07 crc kubenswrapper[4792]: E0309 10:27:07.829047 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ecee766e-be60-4c51-a24e-88dccfa0c460" containerName="oc" Mar 09 10:27:07 crc 
kubenswrapper[4792]: I0309 10:27:07.829060 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="ecee766e-be60-4c51-a24e-88dccfa0c460" containerName="oc" Mar 09 10:27:07 crc kubenswrapper[4792]: I0309 10:27:07.829281 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="ecee766e-be60-4c51-a24e-88dccfa0c460" containerName="oc" Mar 09 10:27:07 crc kubenswrapper[4792]: I0309 10:27:07.830676 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mjlrt" Mar 09 10:27:07 crc kubenswrapper[4792]: I0309 10:27:07.896844 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mjlrt"] Mar 09 10:27:07 crc kubenswrapper[4792]: I0309 10:27:07.931604 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z652q\" (UniqueName: \"kubernetes.io/projected/397353c7-7ce1-49b6-abbe-5b8bd63419c9-kube-api-access-z652q\") pod \"community-operators-mjlrt\" (UID: \"397353c7-7ce1-49b6-abbe-5b8bd63419c9\") " pod="openshift-marketplace/community-operators-mjlrt" Mar 09 10:27:07 crc kubenswrapper[4792]: I0309 10:27:07.931729 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/397353c7-7ce1-49b6-abbe-5b8bd63419c9-catalog-content\") pod \"community-operators-mjlrt\" (UID: \"397353c7-7ce1-49b6-abbe-5b8bd63419c9\") " pod="openshift-marketplace/community-operators-mjlrt" Mar 09 10:27:07 crc kubenswrapper[4792]: I0309 10:27:07.931877 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/397353c7-7ce1-49b6-abbe-5b8bd63419c9-utilities\") pod \"community-operators-mjlrt\" (UID: \"397353c7-7ce1-49b6-abbe-5b8bd63419c9\") " pod="openshift-marketplace/community-operators-mjlrt" Mar 09 10:27:08 crc 
kubenswrapper[4792]: I0309 10:27:08.033915 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z652q\" (UniqueName: \"kubernetes.io/projected/397353c7-7ce1-49b6-abbe-5b8bd63419c9-kube-api-access-z652q\") pod \"community-operators-mjlrt\" (UID: \"397353c7-7ce1-49b6-abbe-5b8bd63419c9\") " pod="openshift-marketplace/community-operators-mjlrt" Mar 09 10:27:08 crc kubenswrapper[4792]: I0309 10:27:08.034020 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/397353c7-7ce1-49b6-abbe-5b8bd63419c9-catalog-content\") pod \"community-operators-mjlrt\" (UID: \"397353c7-7ce1-49b6-abbe-5b8bd63419c9\") " pod="openshift-marketplace/community-operators-mjlrt" Mar 09 10:27:08 crc kubenswrapper[4792]: I0309 10:27:08.034222 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/397353c7-7ce1-49b6-abbe-5b8bd63419c9-utilities\") pod \"community-operators-mjlrt\" (UID: \"397353c7-7ce1-49b6-abbe-5b8bd63419c9\") " pod="openshift-marketplace/community-operators-mjlrt" Mar 09 10:27:08 crc kubenswrapper[4792]: I0309 10:27:08.034661 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/397353c7-7ce1-49b6-abbe-5b8bd63419c9-catalog-content\") pod \"community-operators-mjlrt\" (UID: \"397353c7-7ce1-49b6-abbe-5b8bd63419c9\") " pod="openshift-marketplace/community-operators-mjlrt" Mar 09 10:27:08 crc kubenswrapper[4792]: I0309 10:27:08.034692 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/397353c7-7ce1-49b6-abbe-5b8bd63419c9-utilities\") pod \"community-operators-mjlrt\" (UID: \"397353c7-7ce1-49b6-abbe-5b8bd63419c9\") " pod="openshift-marketplace/community-operators-mjlrt" Mar 09 10:27:08 crc kubenswrapper[4792]: I0309 10:27:08.052988 
4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z652q\" (UniqueName: \"kubernetes.io/projected/397353c7-7ce1-49b6-abbe-5b8bd63419c9-kube-api-access-z652q\") pod \"community-operators-mjlrt\" (UID: \"397353c7-7ce1-49b6-abbe-5b8bd63419c9\") " pod="openshift-marketplace/community-operators-mjlrt" Mar 09 10:27:08 crc kubenswrapper[4792]: I0309 10:27:08.152497 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mjlrt" Mar 09 10:27:08 crc kubenswrapper[4792]: I0309 10:27:08.783852 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mjlrt"] Mar 09 10:27:08 crc kubenswrapper[4792]: I0309 10:27:08.803722 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mjlrt" event={"ID":"397353c7-7ce1-49b6-abbe-5b8bd63419c9","Type":"ContainerStarted","Data":"b754becc7755a072bb078d621a90853eee696f9f88e88ac40e5b86fa51530bf9"} Mar 09 10:27:09 crc kubenswrapper[4792]: I0309 10:27:09.813539 4792 generic.go:334] "Generic (PLEG): container finished" podID="397353c7-7ce1-49b6-abbe-5b8bd63419c9" containerID="7e9edc311f6e4a22a4a138820d8b1628b877240231bedef683389434d99e8198" exitCode=0 Mar 09 10:27:09 crc kubenswrapper[4792]: I0309 10:27:09.813602 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mjlrt" event={"ID":"397353c7-7ce1-49b6-abbe-5b8bd63419c9","Type":"ContainerDied","Data":"7e9edc311f6e4a22a4a138820d8b1628b877240231bedef683389434d99e8198"} Mar 09 10:27:09 crc kubenswrapper[4792]: I0309 10:27:09.816692 4792 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 09 10:27:10 crc kubenswrapper[4792]: I0309 10:27:10.825954 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mjlrt" 
event={"ID":"397353c7-7ce1-49b6-abbe-5b8bd63419c9","Type":"ContainerStarted","Data":"3236a38d0d884f80f0280be7ee6c395ec44ce7f4c736657b4820f6432ac8e218"} Mar 09 10:27:11 crc kubenswrapper[4792]: I0309 10:27:11.834863 4792 generic.go:334] "Generic (PLEG): container finished" podID="397353c7-7ce1-49b6-abbe-5b8bd63419c9" containerID="3236a38d0d884f80f0280be7ee6c395ec44ce7f4c736657b4820f6432ac8e218" exitCode=0 Mar 09 10:27:11 crc kubenswrapper[4792]: I0309 10:27:11.834916 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mjlrt" event={"ID":"397353c7-7ce1-49b6-abbe-5b8bd63419c9","Type":"ContainerDied","Data":"3236a38d0d884f80f0280be7ee6c395ec44ce7f4c736657b4820f6432ac8e218"} Mar 09 10:27:12 crc kubenswrapper[4792]: I0309 10:27:12.849365 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mjlrt" event={"ID":"397353c7-7ce1-49b6-abbe-5b8bd63419c9","Type":"ContainerStarted","Data":"e0f2d479181b723fdda18675cf3fda2c82d22d9504a54e54b5be4f0f973c65a0"} Mar 09 10:27:12 crc kubenswrapper[4792]: I0309 10:27:12.877199 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-mjlrt" podStartSLOduration=3.451032634 podStartE2EDuration="5.877183846s" podCreationTimestamp="2026-03-09 10:27:07 +0000 UTC" firstStartedPulling="2026-03-09 10:27:09.816426753 +0000 UTC m=+4794.846627525" lastFinishedPulling="2026-03-09 10:27:12.242577985 +0000 UTC m=+4797.272778737" observedRunningTime="2026-03-09 10:27:12.875040358 +0000 UTC m=+4797.905241120" watchObservedRunningTime="2026-03-09 10:27:12.877183846 +0000 UTC m=+4797.907384598" Mar 09 10:27:13 crc kubenswrapper[4792]: I0309 10:27:13.214721 4792 patch_prober.go:28] interesting pod/machine-config-daemon-97tth container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: 
connect: connection refused" start-of-body= Mar 09 10:27:13 crc kubenswrapper[4792]: I0309 10:27:13.214779 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 10:27:18 crc kubenswrapper[4792]: I0309 10:27:18.153478 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-mjlrt" Mar 09 10:27:18 crc kubenswrapper[4792]: I0309 10:27:18.155095 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-mjlrt" Mar 09 10:27:18 crc kubenswrapper[4792]: I0309 10:27:18.199800 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-mjlrt" Mar 09 10:27:18 crc kubenswrapper[4792]: I0309 10:27:18.960492 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-mjlrt" Mar 09 10:27:19 crc kubenswrapper[4792]: I0309 10:27:19.010112 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-mjlrt"] Mar 09 10:27:20 crc kubenswrapper[4792]: I0309 10:27:20.923341 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-mjlrt" podUID="397353c7-7ce1-49b6-abbe-5b8bd63419c9" containerName="registry-server" containerID="cri-o://e0f2d479181b723fdda18675cf3fda2c82d22d9504a54e54b5be4f0f973c65a0" gracePeriod=2 Mar 09 10:27:21 crc kubenswrapper[4792]: I0309 10:27:21.922447 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-mjlrt" Mar 09 10:27:21 crc kubenswrapper[4792]: I0309 10:27:21.932730 4792 generic.go:334] "Generic (PLEG): container finished" podID="397353c7-7ce1-49b6-abbe-5b8bd63419c9" containerID="e0f2d479181b723fdda18675cf3fda2c82d22d9504a54e54b5be4f0f973c65a0" exitCode=0 Mar 09 10:27:21 crc kubenswrapper[4792]: I0309 10:27:21.932778 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mjlrt" event={"ID":"397353c7-7ce1-49b6-abbe-5b8bd63419c9","Type":"ContainerDied","Data":"e0f2d479181b723fdda18675cf3fda2c82d22d9504a54e54b5be4f0f973c65a0"} Mar 09 10:27:21 crc kubenswrapper[4792]: I0309 10:27:21.932805 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mjlrt" event={"ID":"397353c7-7ce1-49b6-abbe-5b8bd63419c9","Type":"ContainerDied","Data":"b754becc7755a072bb078d621a90853eee696f9f88e88ac40e5b86fa51530bf9"} Mar 09 10:27:21 crc kubenswrapper[4792]: I0309 10:27:21.932825 4792 scope.go:117] "RemoveContainer" containerID="e0f2d479181b723fdda18675cf3fda2c82d22d9504a54e54b5be4f0f973c65a0" Mar 09 10:27:21 crc kubenswrapper[4792]: I0309 10:27:21.932850 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-mjlrt" Mar 09 10:27:21 crc kubenswrapper[4792]: I0309 10:27:21.965252 4792 scope.go:117] "RemoveContainer" containerID="3236a38d0d884f80f0280be7ee6c395ec44ce7f4c736657b4820f6432ac8e218" Mar 09 10:27:21 crc kubenswrapper[4792]: I0309 10:27:21.965740 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/397353c7-7ce1-49b6-abbe-5b8bd63419c9-catalog-content\") pod \"397353c7-7ce1-49b6-abbe-5b8bd63419c9\" (UID: \"397353c7-7ce1-49b6-abbe-5b8bd63419c9\") " Mar 09 10:27:21 crc kubenswrapper[4792]: I0309 10:27:21.965868 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z652q\" (UniqueName: \"kubernetes.io/projected/397353c7-7ce1-49b6-abbe-5b8bd63419c9-kube-api-access-z652q\") pod \"397353c7-7ce1-49b6-abbe-5b8bd63419c9\" (UID: \"397353c7-7ce1-49b6-abbe-5b8bd63419c9\") " Mar 09 10:27:21 crc kubenswrapper[4792]: I0309 10:27:21.966011 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/397353c7-7ce1-49b6-abbe-5b8bd63419c9-utilities\") pod \"397353c7-7ce1-49b6-abbe-5b8bd63419c9\" (UID: \"397353c7-7ce1-49b6-abbe-5b8bd63419c9\") " Mar 09 10:27:21 crc kubenswrapper[4792]: I0309 10:27:21.967917 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/397353c7-7ce1-49b6-abbe-5b8bd63419c9-utilities" (OuterVolumeSpecName: "utilities") pod "397353c7-7ce1-49b6-abbe-5b8bd63419c9" (UID: "397353c7-7ce1-49b6-abbe-5b8bd63419c9"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 10:27:21 crc kubenswrapper[4792]: I0309 10:27:21.985158 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/397353c7-7ce1-49b6-abbe-5b8bd63419c9-kube-api-access-z652q" (OuterVolumeSpecName: "kube-api-access-z652q") pod "397353c7-7ce1-49b6-abbe-5b8bd63419c9" (UID: "397353c7-7ce1-49b6-abbe-5b8bd63419c9"). InnerVolumeSpecName "kube-api-access-z652q". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 10:27:22 crc kubenswrapper[4792]: I0309 10:27:22.042272 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/397353c7-7ce1-49b6-abbe-5b8bd63419c9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "397353c7-7ce1-49b6-abbe-5b8bd63419c9" (UID: "397353c7-7ce1-49b6-abbe-5b8bd63419c9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 10:27:22 crc kubenswrapper[4792]: I0309 10:27:22.047213 4792 scope.go:117] "RemoveContainer" containerID="7e9edc311f6e4a22a4a138820d8b1628b877240231bedef683389434d99e8198" Mar 09 10:27:22 crc kubenswrapper[4792]: I0309 10:27:22.068369 4792 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/397353c7-7ce1-49b6-abbe-5b8bd63419c9-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 10:27:22 crc kubenswrapper[4792]: I0309 10:27:22.068397 4792 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/397353c7-7ce1-49b6-abbe-5b8bd63419c9-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 10:27:22 crc kubenswrapper[4792]: I0309 10:27:22.068409 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z652q\" (UniqueName: \"kubernetes.io/projected/397353c7-7ce1-49b6-abbe-5b8bd63419c9-kube-api-access-z652q\") on node \"crc\" DevicePath \"\"" Mar 09 10:27:22 crc kubenswrapper[4792]: I0309 
10:27:22.081584 4792 scope.go:117] "RemoveContainer" containerID="e0f2d479181b723fdda18675cf3fda2c82d22d9504a54e54b5be4f0f973c65a0" Mar 09 10:27:22 crc kubenswrapper[4792]: E0309 10:27:22.081979 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e0f2d479181b723fdda18675cf3fda2c82d22d9504a54e54b5be4f0f973c65a0\": container with ID starting with e0f2d479181b723fdda18675cf3fda2c82d22d9504a54e54b5be4f0f973c65a0 not found: ID does not exist" containerID="e0f2d479181b723fdda18675cf3fda2c82d22d9504a54e54b5be4f0f973c65a0" Mar 09 10:27:22 crc kubenswrapper[4792]: I0309 10:27:22.082015 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e0f2d479181b723fdda18675cf3fda2c82d22d9504a54e54b5be4f0f973c65a0"} err="failed to get container status \"e0f2d479181b723fdda18675cf3fda2c82d22d9504a54e54b5be4f0f973c65a0\": rpc error: code = NotFound desc = could not find container \"e0f2d479181b723fdda18675cf3fda2c82d22d9504a54e54b5be4f0f973c65a0\": container with ID starting with e0f2d479181b723fdda18675cf3fda2c82d22d9504a54e54b5be4f0f973c65a0 not found: ID does not exist" Mar 09 10:27:22 crc kubenswrapper[4792]: I0309 10:27:22.082040 4792 scope.go:117] "RemoveContainer" containerID="3236a38d0d884f80f0280be7ee6c395ec44ce7f4c736657b4820f6432ac8e218" Mar 09 10:27:22 crc kubenswrapper[4792]: E0309 10:27:22.082313 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3236a38d0d884f80f0280be7ee6c395ec44ce7f4c736657b4820f6432ac8e218\": container with ID starting with 3236a38d0d884f80f0280be7ee6c395ec44ce7f4c736657b4820f6432ac8e218 not found: ID does not exist" containerID="3236a38d0d884f80f0280be7ee6c395ec44ce7f4c736657b4820f6432ac8e218" Mar 09 10:27:22 crc kubenswrapper[4792]: I0309 10:27:22.082340 4792 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"3236a38d0d884f80f0280be7ee6c395ec44ce7f4c736657b4820f6432ac8e218"} err="failed to get container status \"3236a38d0d884f80f0280be7ee6c395ec44ce7f4c736657b4820f6432ac8e218\": rpc error: code = NotFound desc = could not find container \"3236a38d0d884f80f0280be7ee6c395ec44ce7f4c736657b4820f6432ac8e218\": container with ID starting with 3236a38d0d884f80f0280be7ee6c395ec44ce7f4c736657b4820f6432ac8e218 not found: ID does not exist" Mar 09 10:27:22 crc kubenswrapper[4792]: I0309 10:27:22.082357 4792 scope.go:117] "RemoveContainer" containerID="7e9edc311f6e4a22a4a138820d8b1628b877240231bedef683389434d99e8198" Mar 09 10:27:22 crc kubenswrapper[4792]: E0309 10:27:22.082963 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7e9edc311f6e4a22a4a138820d8b1628b877240231bedef683389434d99e8198\": container with ID starting with 7e9edc311f6e4a22a4a138820d8b1628b877240231bedef683389434d99e8198 not found: ID does not exist" containerID="7e9edc311f6e4a22a4a138820d8b1628b877240231bedef683389434d99e8198" Mar 09 10:27:22 crc kubenswrapper[4792]: I0309 10:27:22.082985 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e9edc311f6e4a22a4a138820d8b1628b877240231bedef683389434d99e8198"} err="failed to get container status \"7e9edc311f6e4a22a4a138820d8b1628b877240231bedef683389434d99e8198\": rpc error: code = NotFound desc = could not find container \"7e9edc311f6e4a22a4a138820d8b1628b877240231bedef683389434d99e8198\": container with ID starting with 7e9edc311f6e4a22a4a138820d8b1628b877240231bedef683389434d99e8198 not found: ID does not exist" Mar 09 10:27:22 crc kubenswrapper[4792]: I0309 10:27:22.285745 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-mjlrt"] Mar 09 10:27:22 crc kubenswrapper[4792]: I0309 10:27:22.293503 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-marketplace/community-operators-mjlrt"] Mar 09 10:27:23 crc kubenswrapper[4792]: I0309 10:27:23.674540 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="397353c7-7ce1-49b6-abbe-5b8bd63419c9" path="/var/lib/kubelet/pods/397353c7-7ce1-49b6-abbe-5b8bd63419c9/volumes" Mar 09 10:27:43 crc kubenswrapper[4792]: I0309 10:27:43.214700 4792 patch_prober.go:28] interesting pod/machine-config-daemon-97tth container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 10:27:43 crc kubenswrapper[4792]: I0309 10:27:43.215282 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 10:27:43 crc kubenswrapper[4792]: I0309 10:27:43.215340 4792 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-97tth" Mar 09 10:27:43 crc kubenswrapper[4792]: I0309 10:27:43.215920 4792 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6cb5e0b9a6c34e2e3e5787fefd658600bfed5b18ede479ce2bf2ac9c9b2e5218"} pod="openshift-machine-config-operator/machine-config-daemon-97tth" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 09 10:27:43 crc kubenswrapper[4792]: I0309 10:27:43.215967 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" containerName="machine-config-daemon" 
containerID="cri-o://6cb5e0b9a6c34e2e3e5787fefd658600bfed5b18ede479ce2bf2ac9c9b2e5218" gracePeriod=600 Mar 09 10:27:44 crc kubenswrapper[4792]: I0309 10:27:44.147268 4792 generic.go:334] "Generic (PLEG): container finished" podID="bd11045a-d746-4b42-872c-8b8d1dd2d515" containerID="6cb5e0b9a6c34e2e3e5787fefd658600bfed5b18ede479ce2bf2ac9c9b2e5218" exitCode=0 Mar 09 10:27:44 crc kubenswrapper[4792]: I0309 10:27:44.147326 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-97tth" event={"ID":"bd11045a-d746-4b42-872c-8b8d1dd2d515","Type":"ContainerDied","Data":"6cb5e0b9a6c34e2e3e5787fefd658600bfed5b18ede479ce2bf2ac9c9b2e5218"} Mar 09 10:27:44 crc kubenswrapper[4792]: I0309 10:27:44.147583 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-97tth" event={"ID":"bd11045a-d746-4b42-872c-8b8d1dd2d515","Type":"ContainerStarted","Data":"e37eff5330f6637c46a22bed33527979089264c39088746b21ae17216f78512e"} Mar 09 10:27:44 crc kubenswrapper[4792]: I0309 10:27:44.147629 4792 scope.go:117] "RemoveContainer" containerID="22fb615a80b561bd4fd46d065538c5d6038e92646216db0bd28365b1dcaceca2" Mar 09 10:28:00 crc kubenswrapper[4792]: I0309 10:28:00.167795 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550868-wzzr4"] Mar 09 10:28:00 crc kubenswrapper[4792]: E0309 10:28:00.169513 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="397353c7-7ce1-49b6-abbe-5b8bd63419c9" containerName="extract-utilities" Mar 09 10:28:00 crc kubenswrapper[4792]: I0309 10:28:00.169540 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="397353c7-7ce1-49b6-abbe-5b8bd63419c9" containerName="extract-utilities" Mar 09 10:28:00 crc kubenswrapper[4792]: E0309 10:28:00.169581 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="397353c7-7ce1-49b6-abbe-5b8bd63419c9" containerName="registry-server" Mar 09 
10:28:00 crc kubenswrapper[4792]: I0309 10:28:00.169591 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="397353c7-7ce1-49b6-abbe-5b8bd63419c9" containerName="registry-server" Mar 09 10:28:00 crc kubenswrapper[4792]: E0309 10:28:00.169622 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="397353c7-7ce1-49b6-abbe-5b8bd63419c9" containerName="extract-content" Mar 09 10:28:00 crc kubenswrapper[4792]: I0309 10:28:00.169636 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="397353c7-7ce1-49b6-abbe-5b8bd63419c9" containerName="extract-content" Mar 09 10:28:00 crc kubenswrapper[4792]: I0309 10:28:00.169906 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="397353c7-7ce1-49b6-abbe-5b8bd63419c9" containerName="registry-server" Mar 09 10:28:00 crc kubenswrapper[4792]: I0309 10:28:00.171018 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550868-wzzr4" Mar 09 10:28:00 crc kubenswrapper[4792]: I0309 10:28:00.178866 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550868-wzzr4"] Mar 09 10:28:00 crc kubenswrapper[4792]: I0309 10:28:00.179886 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fwclj" Mar 09 10:28:00 crc kubenswrapper[4792]: I0309 10:28:00.180057 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 10:28:00 crc kubenswrapper[4792]: I0309 10:28:00.180375 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 10:28:00 crc kubenswrapper[4792]: I0309 10:28:00.299214 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m4skj\" (UniqueName: \"kubernetes.io/projected/d3bb20bd-9a4f-4227-9feb-ed8d6954fbda-kube-api-access-m4skj\") pod 
\"auto-csr-approver-29550868-wzzr4\" (UID: \"d3bb20bd-9a4f-4227-9feb-ed8d6954fbda\") " pod="openshift-infra/auto-csr-approver-29550868-wzzr4" Mar 09 10:28:00 crc kubenswrapper[4792]: I0309 10:28:00.402112 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m4skj\" (UniqueName: \"kubernetes.io/projected/d3bb20bd-9a4f-4227-9feb-ed8d6954fbda-kube-api-access-m4skj\") pod \"auto-csr-approver-29550868-wzzr4\" (UID: \"d3bb20bd-9a4f-4227-9feb-ed8d6954fbda\") " pod="openshift-infra/auto-csr-approver-29550868-wzzr4" Mar 09 10:28:00 crc kubenswrapper[4792]: I0309 10:28:00.424294 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m4skj\" (UniqueName: \"kubernetes.io/projected/d3bb20bd-9a4f-4227-9feb-ed8d6954fbda-kube-api-access-m4skj\") pod \"auto-csr-approver-29550868-wzzr4\" (UID: \"d3bb20bd-9a4f-4227-9feb-ed8d6954fbda\") " pod="openshift-infra/auto-csr-approver-29550868-wzzr4" Mar 09 10:28:00 crc kubenswrapper[4792]: I0309 10:28:00.493465 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550868-wzzr4" Mar 09 10:28:01 crc kubenswrapper[4792]: I0309 10:28:01.108596 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550868-wzzr4"] Mar 09 10:28:01 crc kubenswrapper[4792]: I0309 10:28:01.285308 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550868-wzzr4" event={"ID":"d3bb20bd-9a4f-4227-9feb-ed8d6954fbda","Type":"ContainerStarted","Data":"f5bb8d72fbd163aeea23de888b8c83bdaa79b0c6b4278bc6460cb74642ef9c8d"} Mar 09 10:28:03 crc kubenswrapper[4792]: I0309 10:28:03.317163 4792 generic.go:334] "Generic (PLEG): container finished" podID="d3bb20bd-9a4f-4227-9feb-ed8d6954fbda" containerID="03c3857788da04320940b5453806946c3cb75df463a6cad9aa4982c36525bed1" exitCode=0 Mar 09 10:28:03 crc kubenswrapper[4792]: I0309 10:28:03.317264 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550868-wzzr4" event={"ID":"d3bb20bd-9a4f-4227-9feb-ed8d6954fbda","Type":"ContainerDied","Data":"03c3857788da04320940b5453806946c3cb75df463a6cad9aa4982c36525bed1"} Mar 09 10:28:04 crc kubenswrapper[4792]: I0309 10:28:04.677780 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550868-wzzr4" Mar 09 10:28:04 crc kubenswrapper[4792]: I0309 10:28:04.798443 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m4skj\" (UniqueName: \"kubernetes.io/projected/d3bb20bd-9a4f-4227-9feb-ed8d6954fbda-kube-api-access-m4skj\") pod \"d3bb20bd-9a4f-4227-9feb-ed8d6954fbda\" (UID: \"d3bb20bd-9a4f-4227-9feb-ed8d6954fbda\") " Mar 09 10:28:04 crc kubenswrapper[4792]: I0309 10:28:04.815336 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3bb20bd-9a4f-4227-9feb-ed8d6954fbda-kube-api-access-m4skj" (OuterVolumeSpecName: "kube-api-access-m4skj") pod "d3bb20bd-9a4f-4227-9feb-ed8d6954fbda" (UID: "d3bb20bd-9a4f-4227-9feb-ed8d6954fbda"). InnerVolumeSpecName "kube-api-access-m4skj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 10:28:04 crc kubenswrapper[4792]: I0309 10:28:04.901464 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m4skj\" (UniqueName: \"kubernetes.io/projected/d3bb20bd-9a4f-4227-9feb-ed8d6954fbda-kube-api-access-m4skj\") on node \"crc\" DevicePath \"\"" Mar 09 10:28:05 crc kubenswrapper[4792]: I0309 10:28:05.354768 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550868-wzzr4" event={"ID":"d3bb20bd-9a4f-4227-9feb-ed8d6954fbda","Type":"ContainerDied","Data":"f5bb8d72fbd163aeea23de888b8c83bdaa79b0c6b4278bc6460cb74642ef9c8d"} Mar 09 10:28:05 crc kubenswrapper[4792]: I0309 10:28:05.354815 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f5bb8d72fbd163aeea23de888b8c83bdaa79b0c6b4278bc6460cb74642ef9c8d" Mar 09 10:28:05 crc kubenswrapper[4792]: I0309 10:28:05.354831 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550868-wzzr4" Mar 09 10:28:05 crc kubenswrapper[4792]: I0309 10:28:05.756194 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550862-9rhtt"] Mar 09 10:28:05 crc kubenswrapper[4792]: I0309 10:28:05.765196 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550862-9rhtt"] Mar 09 10:28:07 crc kubenswrapper[4792]: I0309 10:28:07.676868 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd4d06a7-d4c8-4ac1-ba1b-805062fe1835" path="/var/lib/kubelet/pods/cd4d06a7-d4c8-4ac1-ba1b-805062fe1835/volumes" Mar 09 10:28:17 crc kubenswrapper[4792]: I0309 10:28:17.628549 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-8lmr2/must-gather-2ndc7"] Mar 09 10:28:17 crc kubenswrapper[4792]: E0309 10:28:17.636139 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3bb20bd-9a4f-4227-9feb-ed8d6954fbda" containerName="oc" Mar 09 10:28:17 crc kubenswrapper[4792]: I0309 10:28:17.640389 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3bb20bd-9a4f-4227-9feb-ed8d6954fbda" containerName="oc" Mar 09 10:28:17 crc kubenswrapper[4792]: I0309 10:28:17.641022 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3bb20bd-9a4f-4227-9feb-ed8d6954fbda" containerName="oc" Mar 09 10:28:17 crc kubenswrapper[4792]: I0309 10:28:17.642998 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-8lmr2/must-gather-2ndc7" Mar 09 10:28:17 crc kubenswrapper[4792]: I0309 10:28:17.644023 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-8lmr2/must-gather-2ndc7"] Mar 09 10:28:17 crc kubenswrapper[4792]: I0309 10:28:17.648145 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-8lmr2"/"openshift-service-ca.crt" Mar 09 10:28:17 crc kubenswrapper[4792]: I0309 10:28:17.648312 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-8lmr2"/"kube-root-ca.crt" Mar 09 10:28:17 crc kubenswrapper[4792]: I0309 10:28:17.650025 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-8lmr2"/"default-dockercfg-zbmr2" Mar 09 10:28:17 crc kubenswrapper[4792]: I0309 10:28:17.784764 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j6lbl\" (UniqueName: \"kubernetes.io/projected/c41c4e17-e600-497f-a883-b33a517f0b95-kube-api-access-j6lbl\") pod \"must-gather-2ndc7\" (UID: \"c41c4e17-e600-497f-a883-b33a517f0b95\") " pod="openshift-must-gather-8lmr2/must-gather-2ndc7" Mar 09 10:28:17 crc kubenswrapper[4792]: I0309 10:28:17.784825 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/c41c4e17-e600-497f-a883-b33a517f0b95-must-gather-output\") pod \"must-gather-2ndc7\" (UID: \"c41c4e17-e600-497f-a883-b33a517f0b95\") " pod="openshift-must-gather-8lmr2/must-gather-2ndc7" Mar 09 10:28:17 crc kubenswrapper[4792]: I0309 10:28:17.886179 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/c41c4e17-e600-497f-a883-b33a517f0b95-must-gather-output\") pod \"must-gather-2ndc7\" (UID: \"c41c4e17-e600-497f-a883-b33a517f0b95\") " 
pod="openshift-must-gather-8lmr2/must-gather-2ndc7" Mar 09 10:28:17 crc kubenswrapper[4792]: I0309 10:28:17.886375 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j6lbl\" (UniqueName: \"kubernetes.io/projected/c41c4e17-e600-497f-a883-b33a517f0b95-kube-api-access-j6lbl\") pod \"must-gather-2ndc7\" (UID: \"c41c4e17-e600-497f-a883-b33a517f0b95\") " pod="openshift-must-gather-8lmr2/must-gather-2ndc7" Mar 09 10:28:17 crc kubenswrapper[4792]: I0309 10:28:17.886966 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/c41c4e17-e600-497f-a883-b33a517f0b95-must-gather-output\") pod \"must-gather-2ndc7\" (UID: \"c41c4e17-e600-497f-a883-b33a517f0b95\") " pod="openshift-must-gather-8lmr2/must-gather-2ndc7" Mar 09 10:28:17 crc kubenswrapper[4792]: I0309 10:28:17.906649 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j6lbl\" (UniqueName: \"kubernetes.io/projected/c41c4e17-e600-497f-a883-b33a517f0b95-kube-api-access-j6lbl\") pod \"must-gather-2ndc7\" (UID: \"c41c4e17-e600-497f-a883-b33a517f0b95\") " pod="openshift-must-gather-8lmr2/must-gather-2ndc7" Mar 09 10:28:17 crc kubenswrapper[4792]: I0309 10:28:17.966819 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-8lmr2/must-gather-2ndc7" Mar 09 10:28:18 crc kubenswrapper[4792]: I0309 10:28:18.462727 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-8lmr2/must-gather-2ndc7"] Mar 09 10:28:19 crc kubenswrapper[4792]: I0309 10:28:19.481786 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8lmr2/must-gather-2ndc7" event={"ID":"c41c4e17-e600-497f-a883-b33a517f0b95","Type":"ContainerStarted","Data":"4fadced1f0075f8723094196c94d83ee39ace19176d43e5f3fb89dcbaa0284b7"} Mar 09 10:28:19 crc kubenswrapper[4792]: I0309 10:28:19.482382 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8lmr2/must-gather-2ndc7" event={"ID":"c41c4e17-e600-497f-a883-b33a517f0b95","Type":"ContainerStarted","Data":"1e119ba893c108cc4b20ebb99f5dd976c7ef756cea0dbf47975c73c7e8024ca6"} Mar 09 10:28:19 crc kubenswrapper[4792]: I0309 10:28:19.482400 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8lmr2/must-gather-2ndc7" event={"ID":"c41c4e17-e600-497f-a883-b33a517f0b95","Type":"ContainerStarted","Data":"f88d9c4092fbc5dee4a4ca7c3deb95f31761405d426e643aee6e3efa8f9eb6d1"} Mar 09 10:28:19 crc kubenswrapper[4792]: I0309 10:28:19.505621 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-8lmr2/must-gather-2ndc7" podStartSLOduration=2.505604675 podStartE2EDuration="2.505604675s" podCreationTimestamp="2026-03-09 10:28:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 10:28:19.505175654 +0000 UTC m=+4864.535376416" watchObservedRunningTime="2026-03-09 10:28:19.505604675 +0000 UTC m=+4864.535805447" Mar 09 10:28:24 crc kubenswrapper[4792]: I0309 10:28:24.447015 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-8lmr2/crc-debug-cq852"] Mar 09 10:28:24 crc kubenswrapper[4792]: 
I0309 10:28:24.452200 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-8lmr2/crc-debug-cq852" Mar 09 10:28:24 crc kubenswrapper[4792]: I0309 10:28:24.538152 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f96sz\" (UniqueName: \"kubernetes.io/projected/da60eeb2-86d6-40b8-bdde-407c5b7b5d8f-kube-api-access-f96sz\") pod \"crc-debug-cq852\" (UID: \"da60eeb2-86d6-40b8-bdde-407c5b7b5d8f\") " pod="openshift-must-gather-8lmr2/crc-debug-cq852" Mar 09 10:28:24 crc kubenswrapper[4792]: I0309 10:28:24.538289 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/da60eeb2-86d6-40b8-bdde-407c5b7b5d8f-host\") pod \"crc-debug-cq852\" (UID: \"da60eeb2-86d6-40b8-bdde-407c5b7b5d8f\") " pod="openshift-must-gather-8lmr2/crc-debug-cq852" Mar 09 10:28:24 crc kubenswrapper[4792]: I0309 10:28:24.640930 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f96sz\" (UniqueName: \"kubernetes.io/projected/da60eeb2-86d6-40b8-bdde-407c5b7b5d8f-kube-api-access-f96sz\") pod \"crc-debug-cq852\" (UID: \"da60eeb2-86d6-40b8-bdde-407c5b7b5d8f\") " pod="openshift-must-gather-8lmr2/crc-debug-cq852" Mar 09 10:28:24 crc kubenswrapper[4792]: I0309 10:28:24.641439 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/da60eeb2-86d6-40b8-bdde-407c5b7b5d8f-host\") pod \"crc-debug-cq852\" (UID: \"da60eeb2-86d6-40b8-bdde-407c5b7b5d8f\") " pod="openshift-must-gather-8lmr2/crc-debug-cq852" Mar 09 10:28:24 crc kubenswrapper[4792]: I0309 10:28:24.641554 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/da60eeb2-86d6-40b8-bdde-407c5b7b5d8f-host\") pod \"crc-debug-cq852\" (UID: \"da60eeb2-86d6-40b8-bdde-407c5b7b5d8f\") 
" pod="openshift-must-gather-8lmr2/crc-debug-cq852" Mar 09 10:28:24 crc kubenswrapper[4792]: I0309 10:28:24.660296 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f96sz\" (UniqueName: \"kubernetes.io/projected/da60eeb2-86d6-40b8-bdde-407c5b7b5d8f-kube-api-access-f96sz\") pod \"crc-debug-cq852\" (UID: \"da60eeb2-86d6-40b8-bdde-407c5b7b5d8f\") " pod="openshift-must-gather-8lmr2/crc-debug-cq852" Mar 09 10:28:24 crc kubenswrapper[4792]: I0309 10:28:24.770260 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-8lmr2/crc-debug-cq852" Mar 09 10:28:25 crc kubenswrapper[4792]: I0309 10:28:25.548352 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8lmr2/crc-debug-cq852" event={"ID":"da60eeb2-86d6-40b8-bdde-407c5b7b5d8f","Type":"ContainerStarted","Data":"d806f8a0c3197a462efcb87b33492d9a467b40c29546fe2ebf0339decc37de09"} Mar 09 10:28:25 crc kubenswrapper[4792]: I0309 10:28:25.549353 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8lmr2/crc-debug-cq852" event={"ID":"da60eeb2-86d6-40b8-bdde-407c5b7b5d8f","Type":"ContainerStarted","Data":"2dde458fea9e1b9d4345d298c9f0a342fc72e4b0297128bb1faea4c4c52cb1c4"} Mar 09 10:28:25 crc kubenswrapper[4792]: I0309 10:28:25.574995 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-8lmr2/crc-debug-cq852" podStartSLOduration=1.5749802750000002 podStartE2EDuration="1.574980275s" podCreationTimestamp="2026-03-09 10:28:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 10:28:25.57295946 +0000 UTC m=+4870.603160212" watchObservedRunningTime="2026-03-09 10:28:25.574980275 +0000 UTC m=+4870.605181027" Mar 09 10:28:45 crc kubenswrapper[4792]: I0309 10:28:45.386942 4792 scope.go:117] "RemoveContainer" 
containerID="fdf3f73924507040d48037e097e8806e94be759a110c81dc5bb9e6b7923f7b80" Mar 09 10:29:05 crc kubenswrapper[4792]: I0309 10:29:05.896617 4792 generic.go:334] "Generic (PLEG): container finished" podID="da60eeb2-86d6-40b8-bdde-407c5b7b5d8f" containerID="d806f8a0c3197a462efcb87b33492d9a467b40c29546fe2ebf0339decc37de09" exitCode=0 Mar 09 10:29:05 crc kubenswrapper[4792]: I0309 10:29:05.896705 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8lmr2/crc-debug-cq852" event={"ID":"da60eeb2-86d6-40b8-bdde-407c5b7b5d8f","Type":"ContainerDied","Data":"d806f8a0c3197a462efcb87b33492d9a467b40c29546fe2ebf0339decc37de09"} Mar 09 10:29:07 crc kubenswrapper[4792]: I0309 10:29:07.029809 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-8lmr2/crc-debug-cq852" Mar 09 10:29:07 crc kubenswrapper[4792]: I0309 10:29:07.091349 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-8lmr2/crc-debug-cq852"] Mar 09 10:29:07 crc kubenswrapper[4792]: I0309 10:29:07.100595 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-8lmr2/crc-debug-cq852"] Mar 09 10:29:07 crc kubenswrapper[4792]: I0309 10:29:07.116293 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f96sz\" (UniqueName: \"kubernetes.io/projected/da60eeb2-86d6-40b8-bdde-407c5b7b5d8f-kube-api-access-f96sz\") pod \"da60eeb2-86d6-40b8-bdde-407c5b7b5d8f\" (UID: \"da60eeb2-86d6-40b8-bdde-407c5b7b5d8f\") " Mar 09 10:29:07 crc kubenswrapper[4792]: I0309 10:29:07.116606 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/da60eeb2-86d6-40b8-bdde-407c5b7b5d8f-host\") pod \"da60eeb2-86d6-40b8-bdde-407c5b7b5d8f\" (UID: \"da60eeb2-86d6-40b8-bdde-407c5b7b5d8f\") " Mar 09 10:29:07 crc kubenswrapper[4792]: I0309 10:29:07.116714 4792 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/da60eeb2-86d6-40b8-bdde-407c5b7b5d8f-host" (OuterVolumeSpecName: "host") pod "da60eeb2-86d6-40b8-bdde-407c5b7b5d8f" (UID: "da60eeb2-86d6-40b8-bdde-407c5b7b5d8f"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 10:29:07 crc kubenswrapper[4792]: I0309 10:29:07.116990 4792 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/da60eeb2-86d6-40b8-bdde-407c5b7b5d8f-host\") on node \"crc\" DevicePath \"\"" Mar 09 10:29:07 crc kubenswrapper[4792]: I0309 10:29:07.126374 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da60eeb2-86d6-40b8-bdde-407c5b7b5d8f-kube-api-access-f96sz" (OuterVolumeSpecName: "kube-api-access-f96sz") pod "da60eeb2-86d6-40b8-bdde-407c5b7b5d8f" (UID: "da60eeb2-86d6-40b8-bdde-407c5b7b5d8f"). InnerVolumeSpecName "kube-api-access-f96sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 10:29:07 crc kubenswrapper[4792]: I0309 10:29:07.218654 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f96sz\" (UniqueName: \"kubernetes.io/projected/da60eeb2-86d6-40b8-bdde-407c5b7b5d8f-kube-api-access-f96sz\") on node \"crc\" DevicePath \"\"" Mar 09 10:29:07 crc kubenswrapper[4792]: I0309 10:29:07.674213 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da60eeb2-86d6-40b8-bdde-407c5b7b5d8f" path="/var/lib/kubelet/pods/da60eeb2-86d6-40b8-bdde-407c5b7b5d8f/volumes" Mar 09 10:29:07 crc kubenswrapper[4792]: I0309 10:29:07.915146 4792 scope.go:117] "RemoveContainer" containerID="d806f8a0c3197a462efcb87b33492d9a467b40c29546fe2ebf0339decc37de09" Mar 09 10:29:07 crc kubenswrapper[4792]: I0309 10:29:07.915198 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-8lmr2/crc-debug-cq852" Mar 09 10:29:08 crc kubenswrapper[4792]: I0309 10:29:08.350623 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-8lmr2/crc-debug-jvb2m"] Mar 09 10:29:08 crc kubenswrapper[4792]: E0309 10:29:08.351141 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da60eeb2-86d6-40b8-bdde-407c5b7b5d8f" containerName="container-00" Mar 09 10:29:08 crc kubenswrapper[4792]: I0309 10:29:08.351158 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="da60eeb2-86d6-40b8-bdde-407c5b7b5d8f" containerName="container-00" Mar 09 10:29:08 crc kubenswrapper[4792]: I0309 10:29:08.351349 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="da60eeb2-86d6-40b8-bdde-407c5b7b5d8f" containerName="container-00" Mar 09 10:29:08 crc kubenswrapper[4792]: I0309 10:29:08.352066 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-8lmr2/crc-debug-jvb2m" Mar 09 10:29:08 crc kubenswrapper[4792]: I0309 10:29:08.458294 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7ffbc89a-cf31-4064-b3da-cae8b757fabc-host\") pod \"crc-debug-jvb2m\" (UID: \"7ffbc89a-cf31-4064-b3da-cae8b757fabc\") " pod="openshift-must-gather-8lmr2/crc-debug-jvb2m" Mar 09 10:29:08 crc kubenswrapper[4792]: I0309 10:29:08.458537 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cftr5\" (UniqueName: \"kubernetes.io/projected/7ffbc89a-cf31-4064-b3da-cae8b757fabc-kube-api-access-cftr5\") pod \"crc-debug-jvb2m\" (UID: \"7ffbc89a-cf31-4064-b3da-cae8b757fabc\") " pod="openshift-must-gather-8lmr2/crc-debug-jvb2m" Mar 09 10:29:08 crc kubenswrapper[4792]: I0309 10:29:08.561007 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/7ffbc89a-cf31-4064-b3da-cae8b757fabc-host\") pod \"crc-debug-jvb2m\" (UID: \"7ffbc89a-cf31-4064-b3da-cae8b757fabc\") " pod="openshift-must-gather-8lmr2/crc-debug-jvb2m" Mar 09 10:29:08 crc kubenswrapper[4792]: I0309 10:29:08.561445 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cftr5\" (UniqueName: \"kubernetes.io/projected/7ffbc89a-cf31-4064-b3da-cae8b757fabc-kube-api-access-cftr5\") pod \"crc-debug-jvb2m\" (UID: \"7ffbc89a-cf31-4064-b3da-cae8b757fabc\") " pod="openshift-must-gather-8lmr2/crc-debug-jvb2m" Mar 09 10:29:08 crc kubenswrapper[4792]: I0309 10:29:08.561182 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7ffbc89a-cf31-4064-b3da-cae8b757fabc-host\") pod \"crc-debug-jvb2m\" (UID: \"7ffbc89a-cf31-4064-b3da-cae8b757fabc\") " pod="openshift-must-gather-8lmr2/crc-debug-jvb2m" Mar 09 10:29:08 crc kubenswrapper[4792]: I0309 10:29:08.581919 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cftr5\" (UniqueName: \"kubernetes.io/projected/7ffbc89a-cf31-4064-b3da-cae8b757fabc-kube-api-access-cftr5\") pod \"crc-debug-jvb2m\" (UID: \"7ffbc89a-cf31-4064-b3da-cae8b757fabc\") " pod="openshift-must-gather-8lmr2/crc-debug-jvb2m" Mar 09 10:29:08 crc kubenswrapper[4792]: I0309 10:29:08.671720 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-8lmr2/crc-debug-jvb2m" Mar 09 10:29:08 crc kubenswrapper[4792]: I0309 10:29:08.924905 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8lmr2/crc-debug-jvb2m" event={"ID":"7ffbc89a-cf31-4064-b3da-cae8b757fabc","Type":"ContainerStarted","Data":"6b8471d78b441c30e88bd013aebaf29d8deeb90a490a03200158525a37cab31a"} Mar 09 10:29:09 crc kubenswrapper[4792]: I0309 10:29:09.935276 4792 generic.go:334] "Generic (PLEG): container finished" podID="7ffbc89a-cf31-4064-b3da-cae8b757fabc" containerID="3b3442ac40c5a252a9c44d109ccd570d57cfd37b40b4947d92f9440c358dac54" exitCode=0 Mar 09 10:29:09 crc kubenswrapper[4792]: I0309 10:29:09.935341 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8lmr2/crc-debug-jvb2m" event={"ID":"7ffbc89a-cf31-4064-b3da-cae8b757fabc","Type":"ContainerDied","Data":"3b3442ac40c5a252a9c44d109ccd570d57cfd37b40b4947d92f9440c358dac54"} Mar 09 10:29:10 crc kubenswrapper[4792]: I0309 10:29:10.361163 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-8lmr2/crc-debug-jvb2m"] Mar 09 10:29:10 crc kubenswrapper[4792]: I0309 10:29:10.375033 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-8lmr2/crc-debug-jvb2m"] Mar 09 10:29:11 crc kubenswrapper[4792]: I0309 10:29:11.041814 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-8lmr2/crc-debug-jvb2m" Mar 09 10:29:11 crc kubenswrapper[4792]: I0309 10:29:11.109103 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cftr5\" (UniqueName: \"kubernetes.io/projected/7ffbc89a-cf31-4064-b3da-cae8b757fabc-kube-api-access-cftr5\") pod \"7ffbc89a-cf31-4064-b3da-cae8b757fabc\" (UID: \"7ffbc89a-cf31-4064-b3da-cae8b757fabc\") " Mar 09 10:29:11 crc kubenswrapper[4792]: I0309 10:29:11.109357 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7ffbc89a-cf31-4064-b3da-cae8b757fabc-host\") pod \"7ffbc89a-cf31-4064-b3da-cae8b757fabc\" (UID: \"7ffbc89a-cf31-4064-b3da-cae8b757fabc\") " Mar 09 10:29:11 crc kubenswrapper[4792]: I0309 10:29:11.109394 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7ffbc89a-cf31-4064-b3da-cae8b757fabc-host" (OuterVolumeSpecName: "host") pod "7ffbc89a-cf31-4064-b3da-cae8b757fabc" (UID: "7ffbc89a-cf31-4064-b3da-cae8b757fabc"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 10:29:11 crc kubenswrapper[4792]: I0309 10:29:11.109954 4792 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7ffbc89a-cf31-4064-b3da-cae8b757fabc-host\") on node \"crc\" DevicePath \"\"" Mar 09 10:29:11 crc kubenswrapper[4792]: I0309 10:29:11.120390 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ffbc89a-cf31-4064-b3da-cae8b757fabc-kube-api-access-cftr5" (OuterVolumeSpecName: "kube-api-access-cftr5") pod "7ffbc89a-cf31-4064-b3da-cae8b757fabc" (UID: "7ffbc89a-cf31-4064-b3da-cae8b757fabc"). InnerVolumeSpecName "kube-api-access-cftr5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 10:29:11 crc kubenswrapper[4792]: I0309 10:29:11.212084 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cftr5\" (UniqueName: \"kubernetes.io/projected/7ffbc89a-cf31-4064-b3da-cae8b757fabc-kube-api-access-cftr5\") on node \"crc\" DevicePath \"\"" Mar 09 10:29:11 crc kubenswrapper[4792]: I0309 10:29:11.563628 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-8lmr2/crc-debug-qfsbx"] Mar 09 10:29:11 crc kubenswrapper[4792]: E0309 10:29:11.564383 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ffbc89a-cf31-4064-b3da-cae8b757fabc" containerName="container-00" Mar 09 10:29:11 crc kubenswrapper[4792]: I0309 10:29:11.564403 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ffbc89a-cf31-4064-b3da-cae8b757fabc" containerName="container-00" Mar 09 10:29:11 crc kubenswrapper[4792]: I0309 10:29:11.564641 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ffbc89a-cf31-4064-b3da-cae8b757fabc" containerName="container-00" Mar 09 10:29:11 crc kubenswrapper[4792]: I0309 10:29:11.565378 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-8lmr2/crc-debug-qfsbx" Mar 09 10:29:11 crc kubenswrapper[4792]: I0309 10:29:11.620238 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rhdsd\" (UniqueName: \"kubernetes.io/projected/7ba67791-b72d-4bae-9466-fed354845385-kube-api-access-rhdsd\") pod \"crc-debug-qfsbx\" (UID: \"7ba67791-b72d-4bae-9466-fed354845385\") " pod="openshift-must-gather-8lmr2/crc-debug-qfsbx" Mar 09 10:29:11 crc kubenswrapper[4792]: I0309 10:29:11.620386 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7ba67791-b72d-4bae-9466-fed354845385-host\") pod \"crc-debug-qfsbx\" (UID: \"7ba67791-b72d-4bae-9466-fed354845385\") " pod="openshift-must-gather-8lmr2/crc-debug-qfsbx" Mar 09 10:29:11 crc kubenswrapper[4792]: I0309 10:29:11.689793 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7ffbc89a-cf31-4064-b3da-cae8b757fabc" path="/var/lib/kubelet/pods/7ffbc89a-cf31-4064-b3da-cae8b757fabc/volumes" Mar 09 10:29:11 crc kubenswrapper[4792]: I0309 10:29:11.723024 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7ba67791-b72d-4bae-9466-fed354845385-host\") pod \"crc-debug-qfsbx\" (UID: \"7ba67791-b72d-4bae-9466-fed354845385\") " pod="openshift-must-gather-8lmr2/crc-debug-qfsbx" Mar 09 10:29:11 crc kubenswrapper[4792]: I0309 10:29:11.723290 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rhdsd\" (UniqueName: \"kubernetes.io/projected/7ba67791-b72d-4bae-9466-fed354845385-kube-api-access-rhdsd\") pod \"crc-debug-qfsbx\" (UID: \"7ba67791-b72d-4bae-9466-fed354845385\") " pod="openshift-must-gather-8lmr2/crc-debug-qfsbx" Mar 09 10:29:11 crc kubenswrapper[4792]: I0309 10:29:11.723916 4792 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7ba67791-b72d-4bae-9466-fed354845385-host\") pod \"crc-debug-qfsbx\" (UID: \"7ba67791-b72d-4bae-9466-fed354845385\") " pod="openshift-must-gather-8lmr2/crc-debug-qfsbx" Mar 09 10:29:11 crc kubenswrapper[4792]: I0309 10:29:11.740393 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rhdsd\" (UniqueName: \"kubernetes.io/projected/7ba67791-b72d-4bae-9466-fed354845385-kube-api-access-rhdsd\") pod \"crc-debug-qfsbx\" (UID: \"7ba67791-b72d-4bae-9466-fed354845385\") " pod="openshift-must-gather-8lmr2/crc-debug-qfsbx" Mar 09 10:29:11 crc kubenswrapper[4792]: I0309 10:29:11.879884 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-8lmr2/crc-debug-qfsbx" Mar 09 10:29:11 crc kubenswrapper[4792]: I0309 10:29:11.959185 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8lmr2/crc-debug-qfsbx" event={"ID":"7ba67791-b72d-4bae-9466-fed354845385","Type":"ContainerStarted","Data":"bdd03180b72410291afbce3cd1e3a4c8ddb6030f69c903a71b26c2dcc6787f54"} Mar 09 10:29:11 crc kubenswrapper[4792]: I0309 10:29:11.961006 4792 scope.go:117] "RemoveContainer" containerID="3b3442ac40c5a252a9c44d109ccd570d57cfd37b40b4947d92f9440c358dac54" Mar 09 10:29:11 crc kubenswrapper[4792]: I0309 10:29:11.961186 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-8lmr2/crc-debug-jvb2m" Mar 09 10:29:12 crc kubenswrapper[4792]: I0309 10:29:12.971163 4792 generic.go:334] "Generic (PLEG): container finished" podID="7ba67791-b72d-4bae-9466-fed354845385" containerID="8a5a701d55d07d3078a08d61ddf895d3082cad44ebe7f5d0bf9149cd551364bc" exitCode=0 Mar 09 10:29:12 crc kubenswrapper[4792]: I0309 10:29:12.971260 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8lmr2/crc-debug-qfsbx" event={"ID":"7ba67791-b72d-4bae-9466-fed354845385","Type":"ContainerDied","Data":"8a5a701d55d07d3078a08d61ddf895d3082cad44ebe7f5d0bf9149cd551364bc"} Mar 09 10:29:13 crc kubenswrapper[4792]: I0309 10:29:13.009086 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-8lmr2/crc-debug-qfsbx"] Mar 09 10:29:13 crc kubenswrapper[4792]: I0309 10:29:13.016673 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-8lmr2/crc-debug-qfsbx"] Mar 09 10:29:14 crc kubenswrapper[4792]: I0309 10:29:14.091871 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-8lmr2/crc-debug-qfsbx" Mar 09 10:29:14 crc kubenswrapper[4792]: I0309 10:29:14.174356 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rhdsd\" (UniqueName: \"kubernetes.io/projected/7ba67791-b72d-4bae-9466-fed354845385-kube-api-access-rhdsd\") pod \"7ba67791-b72d-4bae-9466-fed354845385\" (UID: \"7ba67791-b72d-4bae-9466-fed354845385\") " Mar 09 10:29:14 crc kubenswrapper[4792]: I0309 10:29:14.174561 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7ba67791-b72d-4bae-9466-fed354845385-host\") pod \"7ba67791-b72d-4bae-9466-fed354845385\" (UID: \"7ba67791-b72d-4bae-9466-fed354845385\") " Mar 09 10:29:14 crc kubenswrapper[4792]: I0309 10:29:14.174773 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7ba67791-b72d-4bae-9466-fed354845385-host" (OuterVolumeSpecName: "host") pod "7ba67791-b72d-4bae-9466-fed354845385" (UID: "7ba67791-b72d-4bae-9466-fed354845385"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 10:29:14 crc kubenswrapper[4792]: I0309 10:29:14.175507 4792 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7ba67791-b72d-4bae-9466-fed354845385-host\") on node \"crc\" DevicePath \"\"" Mar 09 10:29:14 crc kubenswrapper[4792]: I0309 10:29:14.180316 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ba67791-b72d-4bae-9466-fed354845385-kube-api-access-rhdsd" (OuterVolumeSpecName: "kube-api-access-rhdsd") pod "7ba67791-b72d-4bae-9466-fed354845385" (UID: "7ba67791-b72d-4bae-9466-fed354845385"). InnerVolumeSpecName "kube-api-access-rhdsd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 10:29:14 crc kubenswrapper[4792]: I0309 10:29:14.277769 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rhdsd\" (UniqueName: \"kubernetes.io/projected/7ba67791-b72d-4bae-9466-fed354845385-kube-api-access-rhdsd\") on node \"crc\" DevicePath \"\"" Mar 09 10:29:15 crc kubenswrapper[4792]: I0309 10:29:15.001810 4792 scope.go:117] "RemoveContainer" containerID="8a5a701d55d07d3078a08d61ddf895d3082cad44ebe7f5d0bf9149cd551364bc" Mar 09 10:29:15 crc kubenswrapper[4792]: I0309 10:29:15.001891 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-8lmr2/crc-debug-qfsbx" Mar 09 10:29:15 crc kubenswrapper[4792]: E0309 10:29:15.234030 4792 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7ba67791_b72d_4bae_9466_fed354845385.slice/crio-bdd03180b72410291afbce3cd1e3a4c8ddb6030f69c903a71b26c2dcc6787f54\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7ba67791_b72d_4bae_9466_fed354845385.slice\": RecentStats: unable to find data in memory cache]" Mar 09 10:29:15 crc kubenswrapper[4792]: I0309 10:29:15.708063 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7ba67791-b72d-4bae-9466-fed354845385" path="/var/lib/kubelet/pods/7ba67791-b72d-4bae-9466-fed354845385/volumes" Mar 09 10:29:43 crc kubenswrapper[4792]: I0309 10:29:43.214370 4792 patch_prober.go:28] interesting pod/machine-config-daemon-97tth container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 10:29:43 crc kubenswrapper[4792]: I0309 10:29:43.215368 4792 prober.go:107] "Probe failed" 
probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 10:30:00 crc kubenswrapper[4792]: I0309 10:30:00.163277 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550870-45r5t"] Mar 09 10:30:00 crc kubenswrapper[4792]: E0309 10:30:00.164217 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ba67791-b72d-4bae-9466-fed354845385" containerName="container-00" Mar 09 10:30:00 crc kubenswrapper[4792]: I0309 10:30:00.164230 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ba67791-b72d-4bae-9466-fed354845385" containerName="container-00" Mar 09 10:30:00 crc kubenswrapper[4792]: I0309 10:30:00.164420 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ba67791-b72d-4bae-9466-fed354845385" containerName="container-00" Mar 09 10:30:00 crc kubenswrapper[4792]: I0309 10:30:00.165061 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29550870-s5k6b"] Mar 09 10:30:00 crc kubenswrapper[4792]: I0309 10:30:00.165305 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550870-45r5t" Mar 09 10:30:00 crc kubenswrapper[4792]: I0309 10:30:00.167244 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29550870-s5k6b" Mar 09 10:30:00 crc kubenswrapper[4792]: I0309 10:30:00.167835 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fwclj" Mar 09 10:30:00 crc kubenswrapper[4792]: I0309 10:30:00.169503 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 10:30:00 crc kubenswrapper[4792]: I0309 10:30:00.169845 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 10:30:00 crc kubenswrapper[4792]: I0309 10:30:00.170097 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 09 10:30:00 crc kubenswrapper[4792]: I0309 10:30:00.173187 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550870-45r5t"] Mar 09 10:30:00 crc kubenswrapper[4792]: I0309 10:30:00.175875 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 09 10:30:00 crc kubenswrapper[4792]: I0309 10:30:00.191640 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/741ae386-433b-4991-a5af-03faa3c1a8cd-config-volume\") pod \"collect-profiles-29550870-s5k6b\" (UID: \"741ae386-433b-4991-a5af-03faa3c1a8cd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550870-s5k6b" Mar 09 10:30:00 crc kubenswrapper[4792]: I0309 10:30:00.191770 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fffnd\" (UniqueName: \"kubernetes.io/projected/741ae386-433b-4991-a5af-03faa3c1a8cd-kube-api-access-fffnd\") pod \"collect-profiles-29550870-s5k6b\" (UID: 
\"741ae386-433b-4991-a5af-03faa3c1a8cd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550870-s5k6b" Mar 09 10:30:00 crc kubenswrapper[4792]: I0309 10:30:00.192057 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29550870-s5k6b"] Mar 09 10:30:00 crc kubenswrapper[4792]: I0309 10:30:00.192370 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7mj4l\" (UniqueName: \"kubernetes.io/projected/c9a92e2f-33fd-44a9-89f2-2717e41e9d2a-kube-api-access-7mj4l\") pod \"auto-csr-approver-29550870-45r5t\" (UID: \"c9a92e2f-33fd-44a9-89f2-2717e41e9d2a\") " pod="openshift-infra/auto-csr-approver-29550870-45r5t" Mar 09 10:30:00 crc kubenswrapper[4792]: I0309 10:30:00.192526 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/741ae386-433b-4991-a5af-03faa3c1a8cd-secret-volume\") pod \"collect-profiles-29550870-s5k6b\" (UID: \"741ae386-433b-4991-a5af-03faa3c1a8cd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550870-s5k6b" Mar 09 10:30:00 crc kubenswrapper[4792]: I0309 10:30:00.293934 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/741ae386-433b-4991-a5af-03faa3c1a8cd-secret-volume\") pod \"collect-profiles-29550870-s5k6b\" (UID: \"741ae386-433b-4991-a5af-03faa3c1a8cd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550870-s5k6b" Mar 09 10:30:00 crc kubenswrapper[4792]: I0309 10:30:00.294237 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/741ae386-433b-4991-a5af-03faa3c1a8cd-config-volume\") pod \"collect-profiles-29550870-s5k6b\" (UID: \"741ae386-433b-4991-a5af-03faa3c1a8cd\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29550870-s5k6b" Mar 09 10:30:00 crc kubenswrapper[4792]: I0309 10:30:00.294354 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fffnd\" (UniqueName: \"kubernetes.io/projected/741ae386-433b-4991-a5af-03faa3c1a8cd-kube-api-access-fffnd\") pod \"collect-profiles-29550870-s5k6b\" (UID: \"741ae386-433b-4991-a5af-03faa3c1a8cd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550870-s5k6b" Mar 09 10:30:00 crc kubenswrapper[4792]: I0309 10:30:00.294599 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7mj4l\" (UniqueName: \"kubernetes.io/projected/c9a92e2f-33fd-44a9-89f2-2717e41e9d2a-kube-api-access-7mj4l\") pod \"auto-csr-approver-29550870-45r5t\" (UID: \"c9a92e2f-33fd-44a9-89f2-2717e41e9d2a\") " pod="openshift-infra/auto-csr-approver-29550870-45r5t" Mar 09 10:30:00 crc kubenswrapper[4792]: I0309 10:30:00.295451 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/741ae386-433b-4991-a5af-03faa3c1a8cd-config-volume\") pod \"collect-profiles-29550870-s5k6b\" (UID: \"741ae386-433b-4991-a5af-03faa3c1a8cd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550870-s5k6b" Mar 09 10:30:00 crc kubenswrapper[4792]: I0309 10:30:00.303823 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/741ae386-433b-4991-a5af-03faa3c1a8cd-secret-volume\") pod \"collect-profiles-29550870-s5k6b\" (UID: \"741ae386-433b-4991-a5af-03faa3c1a8cd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550870-s5k6b" Mar 09 10:30:00 crc kubenswrapper[4792]: I0309 10:30:00.313968 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fffnd\" (UniqueName: 
\"kubernetes.io/projected/741ae386-433b-4991-a5af-03faa3c1a8cd-kube-api-access-fffnd\") pod \"collect-profiles-29550870-s5k6b\" (UID: \"741ae386-433b-4991-a5af-03faa3c1a8cd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550870-s5k6b" Mar 09 10:30:00 crc kubenswrapper[4792]: I0309 10:30:00.315469 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7mj4l\" (UniqueName: \"kubernetes.io/projected/c9a92e2f-33fd-44a9-89f2-2717e41e9d2a-kube-api-access-7mj4l\") pod \"auto-csr-approver-29550870-45r5t\" (UID: \"c9a92e2f-33fd-44a9-89f2-2717e41e9d2a\") " pod="openshift-infra/auto-csr-approver-29550870-45r5t" Mar 09 10:30:00 crc kubenswrapper[4792]: I0309 10:30:00.495530 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550870-45r5t" Mar 09 10:30:00 crc kubenswrapper[4792]: I0309 10:30:00.504360 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29550870-s5k6b" Mar 09 10:30:01 crc kubenswrapper[4792]: I0309 10:30:01.087347 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29550870-s5k6b"] Mar 09 10:30:01 crc kubenswrapper[4792]: I0309 10:30:01.179898 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550870-45r5t"] Mar 09 10:30:01 crc kubenswrapper[4792]: W0309 10:30:01.193666 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc9a92e2f_33fd_44a9_89f2_2717e41e9d2a.slice/crio-a42c34a5c11d10240a5918d42f592cea75705994d2c93b63fd6e1d17ab71202d WatchSource:0}: Error finding container a42c34a5c11d10240a5918d42f592cea75705994d2c93b63fd6e1d17ab71202d: Status 404 returned error can't find the container with id a42c34a5c11d10240a5918d42f592cea75705994d2c93b63fd6e1d17ab71202d Mar 09 10:30:01 crc 
kubenswrapper[4792]: I0309 10:30:01.425928 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29550870-s5k6b" event={"ID":"741ae386-433b-4991-a5af-03faa3c1a8cd","Type":"ContainerStarted","Data":"296ede013d6c3ba51fbb615329c1d2fa1616fd0b5cf6d48ab3849c0e3b701d02"} Mar 09 10:30:01 crc kubenswrapper[4792]: I0309 10:30:01.427695 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29550870-s5k6b" event={"ID":"741ae386-433b-4991-a5af-03faa3c1a8cd","Type":"ContainerStarted","Data":"245aaee1e5888ee0471a6ce29a54a0050e98a5cd36991ff4556cbd0873b682c7"} Mar 09 10:30:01 crc kubenswrapper[4792]: I0309 10:30:01.429881 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550870-45r5t" event={"ID":"c9a92e2f-33fd-44a9-89f2-2717e41e9d2a","Type":"ContainerStarted","Data":"a42c34a5c11d10240a5918d42f592cea75705994d2c93b63fd6e1d17ab71202d"} Mar 09 10:30:01 crc kubenswrapper[4792]: I0309 10:30:01.449519 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29550870-s5k6b" podStartSLOduration=1.449500847 podStartE2EDuration="1.449500847s" podCreationTimestamp="2026-03-09 10:30:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 10:30:01.443820243 +0000 UTC m=+4966.474021005" watchObservedRunningTime="2026-03-09 10:30:01.449500847 +0000 UTC m=+4966.479701599" Mar 09 10:30:02 crc kubenswrapper[4792]: I0309 10:30:02.446390 4792 generic.go:334] "Generic (PLEG): container finished" podID="741ae386-433b-4991-a5af-03faa3c1a8cd" containerID="296ede013d6c3ba51fbb615329c1d2fa1616fd0b5cf6d48ab3849c0e3b701d02" exitCode=0 Mar 09 10:30:02 crc kubenswrapper[4792]: I0309 10:30:02.446732 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29550870-s5k6b" event={"ID":"741ae386-433b-4991-a5af-03faa3c1a8cd","Type":"ContainerDied","Data":"296ede013d6c3ba51fbb615329c1d2fa1616fd0b5cf6d48ab3849c0e3b701d02"} Mar 09 10:30:03 crc kubenswrapper[4792]: I0309 10:30:03.821579 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29550870-s5k6b" Mar 09 10:30:03 crc kubenswrapper[4792]: I0309 10:30:03.975811 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/741ae386-433b-4991-a5af-03faa3c1a8cd-config-volume\") pod \"741ae386-433b-4991-a5af-03faa3c1a8cd\" (UID: \"741ae386-433b-4991-a5af-03faa3c1a8cd\") " Mar 09 10:30:03 crc kubenswrapper[4792]: I0309 10:30:03.975989 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fffnd\" (UniqueName: \"kubernetes.io/projected/741ae386-433b-4991-a5af-03faa3c1a8cd-kube-api-access-fffnd\") pod \"741ae386-433b-4991-a5af-03faa3c1a8cd\" (UID: \"741ae386-433b-4991-a5af-03faa3c1a8cd\") " Mar 09 10:30:03 crc kubenswrapper[4792]: I0309 10:30:03.976031 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/741ae386-433b-4991-a5af-03faa3c1a8cd-secret-volume\") pod \"741ae386-433b-4991-a5af-03faa3c1a8cd\" (UID: \"741ae386-433b-4991-a5af-03faa3c1a8cd\") " Mar 09 10:30:03 crc kubenswrapper[4792]: I0309 10:30:03.976839 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/741ae386-433b-4991-a5af-03faa3c1a8cd-config-volume" (OuterVolumeSpecName: "config-volume") pod "741ae386-433b-4991-a5af-03faa3c1a8cd" (UID: "741ae386-433b-4991-a5af-03faa3c1a8cd"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 10:30:03 crc kubenswrapper[4792]: I0309 10:30:03.988499 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/741ae386-433b-4991-a5af-03faa3c1a8cd-kube-api-access-fffnd" (OuterVolumeSpecName: "kube-api-access-fffnd") pod "741ae386-433b-4991-a5af-03faa3c1a8cd" (UID: "741ae386-433b-4991-a5af-03faa3c1a8cd"). InnerVolumeSpecName "kube-api-access-fffnd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 10:30:03 crc kubenswrapper[4792]: I0309 10:30:03.991334 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/741ae386-433b-4991-a5af-03faa3c1a8cd-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "741ae386-433b-4991-a5af-03faa3c1a8cd" (UID: "741ae386-433b-4991-a5af-03faa3c1a8cd"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 10:30:04 crc kubenswrapper[4792]: I0309 10:30:04.078988 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fffnd\" (UniqueName: \"kubernetes.io/projected/741ae386-433b-4991-a5af-03faa3c1a8cd-kube-api-access-fffnd\") on node \"crc\" DevicePath \"\"" Mar 09 10:30:04 crc kubenswrapper[4792]: I0309 10:30:04.079034 4792 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/741ae386-433b-4991-a5af-03faa3c1a8cd-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 09 10:30:04 crc kubenswrapper[4792]: I0309 10:30:04.079047 4792 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/741ae386-433b-4991-a5af-03faa3c1a8cd-config-volume\") on node \"crc\" DevicePath \"\"" Mar 09 10:30:04 crc kubenswrapper[4792]: I0309 10:30:04.466585 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29550870-s5k6b" 
event={"ID":"741ae386-433b-4991-a5af-03faa3c1a8cd","Type":"ContainerDied","Data":"245aaee1e5888ee0471a6ce29a54a0050e98a5cd36991ff4556cbd0873b682c7"} Mar 09 10:30:04 crc kubenswrapper[4792]: I0309 10:30:04.466651 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="245aaee1e5888ee0471a6ce29a54a0050e98a5cd36991ff4556cbd0873b682c7" Mar 09 10:30:04 crc kubenswrapper[4792]: I0309 10:30:04.467758 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29550870-s5k6b" Mar 09 10:30:04 crc kubenswrapper[4792]: I0309 10:30:04.469785 4792 generic.go:334] "Generic (PLEG): container finished" podID="c9a92e2f-33fd-44a9-89f2-2717e41e9d2a" containerID="a33ead7ebad13d3916124551f99a1f03902f6afa464fbe1200389b288814f98c" exitCode=0 Mar 09 10:30:04 crc kubenswrapper[4792]: I0309 10:30:04.469838 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550870-45r5t" event={"ID":"c9a92e2f-33fd-44a9-89f2-2717e41e9d2a","Type":"ContainerDied","Data":"a33ead7ebad13d3916124551f99a1f03902f6afa464fbe1200389b288814f98c"} Mar 09 10:30:04 crc kubenswrapper[4792]: I0309 10:30:04.534575 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29550825-c687k"] Mar 09 10:30:04 crc kubenswrapper[4792]: I0309 10:30:04.545327 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29550825-c687k"] Mar 09 10:30:05 crc kubenswrapper[4792]: I0309 10:30:05.679131 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="850e1ee7-b846-45b3-97ff-b33b9a1c6c93" path="/var/lib/kubelet/pods/850e1ee7-b846-45b3-97ff-b33b9a1c6c93/volumes" Mar 09 10:30:05 crc kubenswrapper[4792]: I0309 10:30:05.788076 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550870-45r5t" Mar 09 10:30:05 crc kubenswrapper[4792]: I0309 10:30:05.812021 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7mj4l\" (UniqueName: \"kubernetes.io/projected/c9a92e2f-33fd-44a9-89f2-2717e41e9d2a-kube-api-access-7mj4l\") pod \"c9a92e2f-33fd-44a9-89f2-2717e41e9d2a\" (UID: \"c9a92e2f-33fd-44a9-89f2-2717e41e9d2a\") " Mar 09 10:30:05 crc kubenswrapper[4792]: I0309 10:30:05.851968 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9a92e2f-33fd-44a9-89f2-2717e41e9d2a-kube-api-access-7mj4l" (OuterVolumeSpecName: "kube-api-access-7mj4l") pod "c9a92e2f-33fd-44a9-89f2-2717e41e9d2a" (UID: "c9a92e2f-33fd-44a9-89f2-2717e41e9d2a"). InnerVolumeSpecName "kube-api-access-7mj4l". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 10:30:05 crc kubenswrapper[4792]: I0309 10:30:05.914272 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7mj4l\" (UniqueName: \"kubernetes.io/projected/c9a92e2f-33fd-44a9-89f2-2717e41e9d2a-kube-api-access-7mj4l\") on node \"crc\" DevicePath \"\"" Mar 09 10:30:06 crc kubenswrapper[4792]: I0309 10:30:06.490345 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550870-45r5t" event={"ID":"c9a92e2f-33fd-44a9-89f2-2717e41e9d2a","Type":"ContainerDied","Data":"a42c34a5c11d10240a5918d42f592cea75705994d2c93b63fd6e1d17ab71202d"} Mar 09 10:30:06 crc kubenswrapper[4792]: I0309 10:30:06.490674 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a42c34a5c11d10240a5918d42f592cea75705994d2c93b63fd6e1d17ab71202d" Mar 09 10:30:06 crc kubenswrapper[4792]: I0309 10:30:06.490420 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550870-45r5t" Mar 09 10:30:06 crc kubenswrapper[4792]: I0309 10:30:06.843693 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550864-m5bjn"] Mar 09 10:30:06 crc kubenswrapper[4792]: I0309 10:30:06.852715 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550864-m5bjn"] Mar 09 10:30:07 crc kubenswrapper[4792]: I0309 10:30:07.674804 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4ef44af8-2b43-4cf4-8cf6-ce645992e33d" path="/var/lib/kubelet/pods/4ef44af8-2b43-4cf4-8cf6-ce645992e33d/volumes" Mar 09 10:30:13 crc kubenswrapper[4792]: I0309 10:30:13.214038 4792 patch_prober.go:28] interesting pod/machine-config-daemon-97tth container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 10:30:13 crc kubenswrapper[4792]: I0309 10:30:13.214422 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 10:30:26 crc kubenswrapper[4792]: I0309 10:30:26.836062 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-78f7c77b76-nw94r_2b4c9d79-a45e-457d-be41-ea8535f122c6/barbican-api/0.log" Mar 09 10:30:27 crc kubenswrapper[4792]: I0309 10:30:27.201461 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-78f7c77b76-nw94r_2b4c9d79-a45e-457d-be41-ea8535f122c6/barbican-api-log/0.log" Mar 09 10:30:27 crc kubenswrapper[4792]: I0309 10:30:27.261545 4792 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_barbican-keystone-listener-6f45884c58-b4trg_f7122066-5687-409a-9d80-f39f2d96ad84/barbican-keystone-listener/0.log" Mar 09 10:30:27 crc kubenswrapper[4792]: I0309 10:30:27.356586 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-6f45884c58-b4trg_f7122066-5687-409a-9d80-f39f2d96ad84/barbican-keystone-listener-log/0.log" Mar 09 10:30:27 crc kubenswrapper[4792]: I0309 10:30:27.524692 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-68bdcc9765-czvxc_7997db4c-9ed8-438f-86b3-558a6ed2be44/barbican-worker/0.log" Mar 09 10:30:27 crc kubenswrapper[4792]: I0309 10:30:27.641613 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-68bdcc9765-czvxc_7997db4c-9ed8-438f-86b3-558a6ed2be44/barbican-worker-log/0.log" Mar 09 10:30:27 crc kubenswrapper[4792]: I0309 10:30:27.761992 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-vndqm_90585516-71d1-4289-8f0d-43884caee227/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Mar 09 10:30:27 crc kubenswrapper[4792]: I0309 10:30:27.952576 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_a1a37964-5fdf-4a05-bde5-750f454d2987/ceilometer-central-agent/0.log" Mar 09 10:30:27 crc kubenswrapper[4792]: I0309 10:30:27.993794 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_a1a37964-5fdf-4a05-bde5-750f454d2987/ceilometer-notification-agent/0.log" Mar 09 10:30:28 crc kubenswrapper[4792]: I0309 10:30:28.045118 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_a1a37964-5fdf-4a05-bde5-750f454d2987/proxy-httpd/0.log" Mar 09 10:30:28 crc kubenswrapper[4792]: I0309 10:30:28.162403 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_a1a37964-5fdf-4a05-bde5-750f454d2987/sg-core/0.log" Mar 
09 10:30:28 crc kubenswrapper[4792]: I0309 10:30:28.287383 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceph-client-edpm-deployment-openstack-edpm-ipam-x9rrm_e8ade593-bf65-47e0-8be9-76c8fedc40a1/ceph-client-edpm-deployment-openstack-edpm-ipam/0.log" Mar 09 10:30:28 crc kubenswrapper[4792]: I0309 10:30:28.521684 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-ckfdh_ff236fbb-03e6-4227-b10c-9dfeac266de8/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam/0.log" Mar 09 10:30:28 crc kubenswrapper[4792]: I0309 10:30:28.669372 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_d0b2c9c1-464f-4058-aa55-ce041668d8a2/cinder-api/0.log" Mar 09 10:30:29 crc kubenswrapper[4792]: I0309 10:30:29.076856 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_d0b2c9c1-464f-4058-aa55-ce041668d8a2/cinder-api-log/0.log" Mar 09 10:30:29 crc kubenswrapper[4792]: I0309 10:30:29.261131 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_2eaee6b3-8397-430c-b799-7628762d1701/probe/0.log" Mar 09 10:30:29 crc kubenswrapper[4792]: I0309 10:30:29.454528 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_2eaee6b3-8397-430c-b799-7628762d1701/cinder-backup/0.log" Mar 09 10:30:29 crc kubenswrapper[4792]: I0309 10:30:29.492003 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_c8c68c2d-fe77-41af-b4f4-8f83079bf316/cinder-scheduler/0.log" Mar 09 10:30:29 crc kubenswrapper[4792]: I0309 10:30:29.638009 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_c8c68c2d-fe77-41af-b4f4-8f83079bf316/probe/0.log" Mar 09 10:30:29 crc kubenswrapper[4792]: I0309 10:30:29.904105 4792 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_cinder-volume-volume1-0_61d989fe-045d-4c58-b660-f9d0e1a482f9/probe/0.log" Mar 09 10:30:29 crc kubenswrapper[4792]: I0309 10:30:29.934777 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_61d989fe-045d-4c58-b660-f9d0e1a482f9/cinder-volume/0.log" Mar 09 10:30:30 crc kubenswrapper[4792]: I0309 10:30:30.121469 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-nhjgt_1bedab89-65bc-478a-abf1-3e3429951e71/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Mar 09 10:30:30 crc kubenswrapper[4792]: I0309 10:30:30.234501 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-7r8kk_1bb0a43c-0d2a-4d09-8f65-4a9e9aa048c2/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 09 10:30:30 crc kubenswrapper[4792]: I0309 10:30:30.419590 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-86c6bdcc4c-q4zw6_f4a082c9-9e44-4d1b-b361-3fe4af72fbe9/init/0.log" Mar 09 10:30:30 crc kubenswrapper[4792]: I0309 10:30:30.657766 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_2fc0a824-2dfc-436e-ad6e-c0751afcb61f/glance-httpd/0.log" Mar 09 10:30:30 crc kubenswrapper[4792]: I0309 10:30:30.732391 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-86c6bdcc4c-q4zw6_f4a082c9-9e44-4d1b-b361-3fe4af72fbe9/init/0.log" Mar 09 10:30:30 crc kubenswrapper[4792]: I0309 10:30:30.826249 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-86c6bdcc4c-q4zw6_f4a082c9-9e44-4d1b-b361-3fe4af72fbe9/dnsmasq-dns/0.log" Mar 09 10:30:31 crc kubenswrapper[4792]: I0309 10:30:31.064610 4792 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_glance-default-external-api-0_2fc0a824-2dfc-436e-ad6e-c0751afcb61f/glance-log/0.log" Mar 09 10:30:31 crc kubenswrapper[4792]: I0309 10:30:31.252882 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_5f4e1727-ddeb-4d56-8fc8-005b7b9c1b3c/glance-log/0.log" Mar 09 10:30:31 crc kubenswrapper[4792]: I0309 10:30:31.293189 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_5f4e1727-ddeb-4d56-8fc8-005b7b9c1b3c/glance-httpd/0.log" Mar 09 10:30:31 crc kubenswrapper[4792]: I0309 10:30:31.517684 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-54c85f748d-wxdlf_d028a70e-dd9d-4b38-bb18-4cd55cd002fe/horizon/1.log" Mar 09 10:30:31 crc kubenswrapper[4792]: I0309 10:30:31.781723 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-54c85f748d-wxdlf_d028a70e-dd9d-4b38-bb18-4cd55cd002fe/horizon-log/0.log" Mar 09 10:30:31 crc kubenswrapper[4792]: I0309 10:30:31.826049 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-54c85f748d-wxdlf_d028a70e-dd9d-4b38-bb18-4cd55cd002fe/horizon/0.log" Mar 09 10:30:31 crc kubenswrapper[4792]: I0309 10:30:31.902420 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-7qdcq_9ff0f9ff-023a-4679-a084-1d4ae368e02d/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Mar 09 10:30:32 crc kubenswrapper[4792]: I0309 10:30:32.243798 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-xwl2x_58614884-08cd-4ea5-b45e-45a6157f16aa/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 09 10:30:32 crc kubenswrapper[4792]: I0309 10:30:32.654855 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-5b488b889c-ks9th_063f2c66-7712-4aff-a002-fccc2821c91a/keystone-api/0.log" Mar 
09 10:30:32 crc kubenswrapper[4792]: I0309 10:30:32.704244 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29550841-fkrc7_b83ae2a5-e733-497b-a5de-56d3a962dec5/keystone-cron/0.log" Mar 09 10:30:33 crc kubenswrapper[4792]: I0309 10:30:33.332495 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_da658112-b6e4-4493-a46a-0add09e299f6/kube-state-metrics/0.log" Mar 09 10:30:33 crc kubenswrapper[4792]: I0309 10:30:33.379675 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-4sx54_047ab0a5-633d-4731-a534-fd2db3b65b43/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Mar 09 10:30:33 crc kubenswrapper[4792]: I0309 10:30:33.607052 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-api-0_74c28e16-49a6-429a-9a95-ae4a07e9cb8e/manila-api-log/0.log" Mar 09 10:30:33 crc kubenswrapper[4792]: I0309 10:30:33.792310 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-api-0_74c28e16-49a6-429a-9a95-ae4a07e9cb8e/manila-api/0.log" Mar 09 10:30:33 crc kubenswrapper[4792]: I0309 10:30:33.839819 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-scheduler-0_87ea7216-a1e4-47b3-8303-d2af1a68f974/probe/0.log" Mar 09 10:30:33 crc kubenswrapper[4792]: I0309 10:30:33.855442 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-scheduler-0_87ea7216-a1e4-47b3-8303-d2af1a68f974/manila-scheduler/0.log" Mar 09 10:30:34 crc kubenswrapper[4792]: I0309 10:30:34.022655 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-share-share1-0_49c2b09f-818f-461b-9ebd-bc43d6e268c6/probe/0.log" Mar 09 10:30:34 crc kubenswrapper[4792]: I0309 10:30:34.097523 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-share-share1-0_49c2b09f-818f-461b-9ebd-bc43d6e268c6/manila-share/0.log" Mar 09 10:30:34 crc 
kubenswrapper[4792]: I0309 10:30:34.506878 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-6cbbcf5c8f-spnsr_4ad68345-e440-498d-a525-014a7db81ea6/neutron-api/0.log" Mar 09 10:30:34 crc kubenswrapper[4792]: I0309 10:30:34.519885 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-6cbbcf5c8f-spnsr_4ad68345-e440-498d-a525-014a7db81ea6/neutron-httpd/0.log" Mar 09 10:30:34 crc kubenswrapper[4792]: I0309 10:30:34.589627 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-fb88h_2e835834-d2ca-414a-b567-8364c4b208e5/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Mar 09 10:30:35 crc kubenswrapper[4792]: I0309 10:30:35.260656 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_882a66ed-9e4e-4501-90ab-a600db85728a/nova-api-log/0.log" Mar 09 10:30:35 crc kubenswrapper[4792]: I0309 10:30:35.472601 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_33f67afd-db61-4209-b505-8ec8edcabfc1/nova-cell0-conductor-conductor/0.log" Mar 09 10:30:35 crc kubenswrapper[4792]: I0309 10:30:35.703595 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_65141e32-9490-4e93-9338-c9878770172e/nova-cell1-conductor-conductor/0.log" Mar 09 10:30:35 crc kubenswrapper[4792]: I0309 10:30:35.987858 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_8adff3b6-586a-445f-adee-c3f412b874c0/nova-cell1-novncproxy-novncproxy/0.log" Mar 09 10:30:36 crc kubenswrapper[4792]: I0309 10:30:36.051267 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_882a66ed-9e4e-4501-90ab-a600db85728a/nova-api-api/0.log" Mar 09 10:30:36 crc kubenswrapper[4792]: I0309 10:30:36.080407 4792 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-fh9gr_50f74681-04e5-49c7-9d32-1e8841867bcb/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam/0.log" Mar 09 10:30:36 crc kubenswrapper[4792]: I0309 10:30:36.397446 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_ba905c80-a1c9-4e8b-9d19-965d91ffb934/nova-metadata-log/0.log" Mar 09 10:30:36 crc kubenswrapper[4792]: I0309 10:30:36.911928 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_1731fe55-4bf2-4410-85f9-58124ed652c9/mysql-bootstrap/0.log" Mar 09 10:30:36 crc kubenswrapper[4792]: I0309 10:30:36.985633 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_de087a24-d54a-442c-8cbe-2cbe653c4343/nova-scheduler-scheduler/0.log" Mar 09 10:30:37 crc kubenswrapper[4792]: I0309 10:30:37.262438 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_1731fe55-4bf2-4410-85f9-58124ed652c9/mysql-bootstrap/0.log" Mar 09 10:30:37 crc kubenswrapper[4792]: I0309 10:30:37.272777 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_1731fe55-4bf2-4410-85f9-58124ed652c9/galera/0.log" Mar 09 10:30:37 crc kubenswrapper[4792]: I0309 10:30:37.538452 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_7dd0ce66-42bf-4c00-8e99-3c58defcc87f/mysql-bootstrap/0.log" Mar 09 10:30:37 crc kubenswrapper[4792]: I0309 10:30:37.831859 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_7dd0ce66-42bf-4c00-8e99-3c58defcc87f/galera/0.log" Mar 09 10:30:37 crc kubenswrapper[4792]: I0309 10:30:37.861350 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_7dd0ce66-42bf-4c00-8e99-3c58defcc87f/mysql-bootstrap/0.log" Mar 09 10:30:38 crc kubenswrapper[4792]: I0309 10:30:38.138150 4792 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_openstackclient_09fc64e5-4201-410d-a764-789e1dc85ac0/openstackclient/0.log" Mar 09 10:30:38 crc kubenswrapper[4792]: I0309 10:30:38.298031 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-kj9d8_438d928b-7565-4fe1-a005-2c6402835edf/ovn-controller/0.log" Mar 09 10:30:38 crc kubenswrapper[4792]: I0309 10:30:38.480376 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-mxwdc_9b94bbb1-5f6b-40c1-96b1-66a228166d91/openstack-network-exporter/0.log" Mar 09 10:30:38 crc kubenswrapper[4792]: I0309 10:30:38.548225 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_ba905c80-a1c9-4e8b-9d19-965d91ffb934/nova-metadata-metadata/0.log" Mar 09 10:30:38 crc kubenswrapper[4792]: I0309 10:30:38.781553 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-gw65t_2fd40118-2613-4e01-a557-f7fc5f24e07c/ovsdb-server-init/0.log" Mar 09 10:30:38 crc kubenswrapper[4792]: I0309 10:30:38.993345 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-gw65t_2fd40118-2613-4e01-a557-f7fc5f24e07c/ovsdb-server/0.log" Mar 09 10:30:39 crc kubenswrapper[4792]: I0309 10:30:39.069897 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-gw65t_2fd40118-2613-4e01-a557-f7fc5f24e07c/ovsdb-server-init/0.log" Mar 09 10:30:39 crc kubenswrapper[4792]: I0309 10:30:39.111870 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-gw65t_2fd40118-2613-4e01-a557-f7fc5f24e07c/ovs-vswitchd/0.log" Mar 09 10:30:39 crc kubenswrapper[4792]: I0309 10:30:39.321914 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_58b87887-c8d6-4658-9f0e-3d94f414c14c/openstack-network-exporter/0.log" Mar 09 10:30:39 crc kubenswrapper[4792]: I0309 10:30:39.359886 4792 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-7c2rh_c5a32778-1a93-440b-9f56-d0bded50a725/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Mar 09 10:30:39 crc kubenswrapper[4792]: I0309 10:30:39.503674 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_58b87887-c8d6-4658-9f0e-3d94f414c14c/ovn-northd/0.log" Mar 09 10:30:39 crc kubenswrapper[4792]: I0309 10:30:39.683301 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_02a198ef-826d-49cf-a6c5-134da45ad28b/openstack-network-exporter/0.log" Mar 09 10:30:39 crc kubenswrapper[4792]: I0309 10:30:39.727729 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_02a198ef-826d-49cf-a6c5-134da45ad28b/ovsdbserver-nb/0.log" Mar 09 10:30:39 crc kubenswrapper[4792]: I0309 10:30:39.927921 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_b99fdd60-0b01-4b3e-ad0b-0f32f7427f48/openstack-network-exporter/0.log" Mar 09 10:30:39 crc kubenswrapper[4792]: I0309 10:30:39.962136 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_b99fdd60-0b01-4b3e-ad0b-0f32f7427f48/ovsdbserver-sb/0.log" Mar 09 10:30:40 crc kubenswrapper[4792]: I0309 10:30:40.374823 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-6d6ff87dd6-6wzmx_52f7c11a-3099-487b-9126-fd90d1db1aaa/placement-api/0.log" Mar 09 10:30:40 crc kubenswrapper[4792]: I0309 10:30:40.401631 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-6d6ff87dd6-6wzmx_52f7c11a-3099-487b-9126-fd90d1db1aaa/placement-log/0.log" Mar 09 10:30:40 crc kubenswrapper[4792]: I0309 10:30:40.424492 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_a20da79f-1b2b-4d52-bf44-4c6a9bf0f210/setup-container/0.log" Mar 09 10:30:40 crc kubenswrapper[4792]: I0309 10:30:40.768868 
4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_a20da79f-1b2b-4d52-bf44-4c6a9bf0f210/rabbitmq/0.log" Mar 09 10:30:40 crc kubenswrapper[4792]: I0309 10:30:40.810783 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_a20da79f-1b2b-4d52-bf44-4c6a9bf0f210/setup-container/0.log" Mar 09 10:30:40 crc kubenswrapper[4792]: I0309 10:30:40.823189 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_6a994be4-9a88-4ee6-8e24-a6d62898f593/setup-container/0.log" Mar 09 10:30:41 crc kubenswrapper[4792]: I0309 10:30:41.224496 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_6a994be4-9a88-4ee6-8e24-a6d62898f593/rabbitmq/0.log" Mar 09 10:30:41 crc kubenswrapper[4792]: I0309 10:30:41.234655 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_6a994be4-9a88-4ee6-8e24-a6d62898f593/setup-container/0.log" Mar 09 10:30:41 crc kubenswrapper[4792]: I0309 10:30:41.363352 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-9b8w5_f84f1271-7155-48fc-a6f0-d1777cb75ac5/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 09 10:30:41 crc kubenswrapper[4792]: I0309 10:30:41.728824 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-fnr4l_bdb1fadb-a5cc-48c4-b3e4-77f0af224e2a/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Mar 09 10:30:41 crc kubenswrapper[4792]: I0309 10:30:41.742044 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-67gl6_afd25149-8416-4a5c-a84a-b63961a5e1f9/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 09 10:30:42 crc kubenswrapper[4792]: I0309 10:30:42.619181 4792 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-jzwbl_48ec7142-4c09-4a5f-8202-aaf16bb97b26/ssh-known-hosts-edpm-deployment/0.log" Mar 09 10:30:42 crc kubenswrapper[4792]: I0309 10:30:42.793390 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_152f601f-0625-4503-a057-26316d8504aa/tempest-tests-tempest-tests-runner/0.log" Mar 09 10:30:42 crc kubenswrapper[4792]: I0309 10:30:42.925953 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_69889849-285a-4c47-a955-3e681990d59e/test-operator-logs-container/0.log" Mar 09 10:30:43 crc kubenswrapper[4792]: I0309 10:30:43.116698 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-vwq86_c83b88c6-39ae-4077-84b9-e10f71a53d6e/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Mar 09 10:30:43 crc kubenswrapper[4792]: I0309 10:30:43.213874 4792 patch_prober.go:28] interesting pod/machine-config-daemon-97tth container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 10:30:43 crc kubenswrapper[4792]: I0309 10:30:43.214197 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 10:30:43 crc kubenswrapper[4792]: I0309 10:30:43.214297 4792 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-97tth" Mar 09 10:30:43 crc kubenswrapper[4792]: I0309 10:30:43.215114 4792 kuberuntime_manager.go:1027] "Message 
for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e37eff5330f6637c46a22bed33527979089264c39088746b21ae17216f78512e"} pod="openshift-machine-config-operator/machine-config-daemon-97tth" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 09 10:30:43 crc kubenswrapper[4792]: I0309 10:30:43.215241 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" containerName="machine-config-daemon" containerID="cri-o://e37eff5330f6637c46a22bed33527979089264c39088746b21ae17216f78512e" gracePeriod=600 Mar 09 10:30:43 crc kubenswrapper[4792]: E0309 10:30:43.355946 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-97tth_openshift-machine-config-operator(bd11045a-d746-4b42-872c-8b8d1dd2d515)\"" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" Mar 09 10:30:43 crc kubenswrapper[4792]: I0309 10:30:43.834696 4792 generic.go:334] "Generic (PLEG): container finished" podID="bd11045a-d746-4b42-872c-8b8d1dd2d515" containerID="e37eff5330f6637c46a22bed33527979089264c39088746b21ae17216f78512e" exitCode=0 Mar 09 10:30:43 crc kubenswrapper[4792]: I0309 10:30:43.834753 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-97tth" event={"ID":"bd11045a-d746-4b42-872c-8b8d1dd2d515","Type":"ContainerDied","Data":"e37eff5330f6637c46a22bed33527979089264c39088746b21ae17216f78512e"} Mar 09 10:30:43 crc kubenswrapper[4792]: I0309 10:30:43.836252 4792 scope.go:117] "RemoveContainer" containerID="6cb5e0b9a6c34e2e3e5787fefd658600bfed5b18ede479ce2bf2ac9c9b2e5218" Mar 09 10:30:43 
crc kubenswrapper[4792]: I0309 10:30:43.836709 4792 scope.go:117] "RemoveContainer" containerID="e37eff5330f6637c46a22bed33527979089264c39088746b21ae17216f78512e" Mar 09 10:30:43 crc kubenswrapper[4792]: E0309 10:30:43.836949 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-97tth_openshift-machine-config-operator(bd11045a-d746-4b42-872c-8b8d1dd2d515)\"" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" Mar 09 10:30:45 crc kubenswrapper[4792]: I0309 10:30:45.557376 4792 scope.go:117] "RemoveContainer" containerID="cbc19e876a72df0f2d6103262413afd747151063a3bf2ac2163c9a2e003ee718" Mar 09 10:30:45 crc kubenswrapper[4792]: I0309 10:30:45.611420 4792 scope.go:117] "RemoveContainer" containerID="f6c982ddc371d3eabee8ddf084483a37a75db0598d6f15ca51a1770e356e7f7d" Mar 09 10:30:55 crc kubenswrapper[4792]: I0309 10:30:55.954687 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_22afdfd4-ea58-4efb-b316-bcb40c906952/memcached/0.log" Mar 09 10:30:56 crc kubenswrapper[4792]: I0309 10:30:56.663659 4792 scope.go:117] "RemoveContainer" containerID="e37eff5330f6637c46a22bed33527979089264c39088746b21ae17216f78512e" Mar 09 10:30:56 crc kubenswrapper[4792]: E0309 10:30:56.663972 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-97tth_openshift-machine-config-operator(bd11045a-d746-4b42-872c-8b8d1dd2d515)\"" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" Mar 09 10:31:10 crc kubenswrapper[4792]: I0309 10:31:10.662059 4792 scope.go:117] "RemoveContainer" 
containerID="e37eff5330f6637c46a22bed33527979089264c39088746b21ae17216f78512e" Mar 09 10:31:10 crc kubenswrapper[4792]: E0309 10:31:10.663105 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-97tth_openshift-machine-config-operator(bd11045a-d746-4b42-872c-8b8d1dd2d515)\"" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" Mar 09 10:31:17 crc kubenswrapper[4792]: I0309 10:31:17.325892 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-5d87c9d997-r5d6p_20b2fb83-c944-4553-b506-9ff3c9c199f5/manager/0.log" Mar 09 10:31:17 crc kubenswrapper[4792]: I0309 10:31:17.593130 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ff3fd74506e7f9e2c6e8ba7181b4867be2b110d904ba1aef106c523cfadtbcp_3412edec-dc99-4713-b6bf-cebdace9f6a6/util/0.log" Mar 09 10:31:18 crc kubenswrapper[4792]: I0309 10:31:18.308127 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ff3fd74506e7f9e2c6e8ba7181b4867be2b110d904ba1aef106c523cfadtbcp_3412edec-dc99-4713-b6bf-cebdace9f6a6/pull/0.log" Mar 09 10:31:18 crc kubenswrapper[4792]: I0309 10:31:18.343434 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ff3fd74506e7f9e2c6e8ba7181b4867be2b110d904ba1aef106c523cfadtbcp_3412edec-dc99-4713-b6bf-cebdace9f6a6/util/0.log" Mar 09 10:31:18 crc kubenswrapper[4792]: I0309 10:31:18.561460 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ff3fd74506e7f9e2c6e8ba7181b4867be2b110d904ba1aef106c523cfadtbcp_3412edec-dc99-4713-b6bf-cebdace9f6a6/pull/0.log" Mar 09 10:31:18 crc kubenswrapper[4792]: I0309 10:31:18.842479 4792 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_ff3fd74506e7f9e2c6e8ba7181b4867be2b110d904ba1aef106c523cfadtbcp_3412edec-dc99-4713-b6bf-cebdace9f6a6/pull/0.log" Mar 09 10:31:18 crc kubenswrapper[4792]: I0309 10:31:18.867661 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ff3fd74506e7f9e2c6e8ba7181b4867be2b110d904ba1aef106c523cfadtbcp_3412edec-dc99-4713-b6bf-cebdace9f6a6/util/0.log" Mar 09 10:31:19 crc kubenswrapper[4792]: I0309 10:31:19.083707 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ff3fd74506e7f9e2c6e8ba7181b4867be2b110d904ba1aef106c523cfadtbcp_3412edec-dc99-4713-b6bf-cebdace9f6a6/extract/0.log" Mar 09 10:31:19 crc kubenswrapper[4792]: I0309 10:31:19.460728 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-64db6967f8-tfz6b_b74999f3-cb46-4b35-a70f-71977b54d944/manager/0.log" Mar 09 10:31:19 crc kubenswrapper[4792]: I0309 10:31:19.505906 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-55d77d7b5c-7fdsl_89b0f1f9-11f1-4d01-a2b8-ca2f1fae3bb2/manager/0.log" Mar 09 10:31:19 crc kubenswrapper[4792]: I0309 10:31:19.726424 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-cf99c678f-lw2kp_b1140422-6cf3-4e92-95e2-6ea31179de28/manager/0.log" Mar 09 10:31:19 crc kubenswrapper[4792]: I0309 10:31:19.888127 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-78bc7f9bd9-jhmcx_98ba9a2a-30d6-45f2-af47-2994c292fe05/manager/0.log" Mar 09 10:31:20 crc kubenswrapper[4792]: I0309 10:31:20.308218 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-545456dc4-x7n9b_ac60ffe8-71d2-4ea1-bbc5-d377fc70d940/manager/0.log" Mar 09 10:31:20 crc kubenswrapper[4792]: I0309 10:31:20.668307 4792 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-f7fcc58b9-swtlr_fe547e1c-cb50-4541-b867-5154dae69ec3/manager/0.log" Mar 09 10:31:20 crc kubenswrapper[4792]: I0309 10:31:20.826855 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7c789f89c6-dktrj_55f715a3-ef6e-40d8-9f9b-3100b2847b8d/manager/0.log" Mar 09 10:31:21 crc kubenswrapper[4792]: I0309 10:31:21.078549 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-67d996989d-4775c_2c678a62-a744-4384-8403-618b566ed91e/manager/0.log" Mar 09 10:31:21 crc kubenswrapper[4792]: I0309 10:31:21.225445 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-7b6bfb6475-ckrbc_e27b7b35-b064-4e02-99e6-cb34af5ff0e9/manager/0.log" Mar 09 10:31:21 crc kubenswrapper[4792]: I0309 10:31:21.590110 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-54688575f-5k4db_8fd39edc-ff27-4feb-b138-ee11a440c0ca/manager/0.log" Mar 09 10:31:21 crc kubenswrapper[4792]: I0309 10:31:21.915517 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-74b6b5dc96-dpvjg_9063ee68-9840-4f35-8d4d-44ab947477d5/manager/0.log" Mar 09 10:31:21 crc kubenswrapper[4792]: I0309 10:31:21.960788 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-5d86c7ddb7-r44dt_c28488b2-919b-4307-9a70-b2f5f1280e2a/manager/0.log" Mar 09 10:31:22 crc kubenswrapper[4792]: I0309 10:31:22.316031 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-6f64bd8c755s6r4_9ca7aa92-3367-4c2e-a86e-33ba41fe81cb/manager/0.log" Mar 09 10:31:22 crc kubenswrapper[4792]: I0309 
10:31:22.469575 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-568b7cf6db-hz254_03eb7926-dd55-4d02-a695-5abcb5a02cdc/operator/0.log" Mar 09 10:31:22 crc kubenswrapper[4792]: I0309 10:31:22.911777 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-rjx9b_93e20f26-20b1-409a-8663-61cd1a7a71d3/registry-server/0.log" Mar 09 10:31:23 crc kubenswrapper[4792]: I0309 10:31:23.183353 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-75684d597f-8vt8s_82689eba-1f75-4e2e-8c27-a5b90e2805af/manager/0.log" Mar 09 10:31:23 crc kubenswrapper[4792]: I0309 10:31:23.311679 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-648564c9fc-z5tts_d53acf43-fee2-4bdf-9cdb-883641a56d48/manager/0.log" Mar 09 10:31:23 crc kubenswrapper[4792]: I0309 10:31:23.568511 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-kjtwp_92a6c902-5189-421e-b1a1-ed3e64e7bca4/operator/0.log" Mar 09 10:31:23 crc kubenswrapper[4792]: I0309 10:31:23.833413 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-9b9ff9f4d-vhf7p_533287c3-78f0-46ea-baa9-fafb1ce7615b/manager/0.log" Mar 09 10:31:23 crc kubenswrapper[4792]: I0309 10:31:23.918933 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-5fdb694969-mzpqx_d4313901-b530-42e8-a975-d21aefbc0506/manager/0.log" Mar 09 10:31:24 crc kubenswrapper[4792]: I0309 10:31:24.117735 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-55b5ff4dbb-z4lgh_e56405f7-7121-4d52-b276-3feeddabd667/manager/0.log" Mar 09 10:31:24 crc kubenswrapper[4792]: I0309 
10:31:24.329966 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-bccc79885-vj8ds_41f3c31e-77a7-4912-a933-04b32c0db0dc/manager/0.log" Mar 09 10:31:24 crc kubenswrapper[4792]: I0309 10:31:24.653698 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-59b6c9788f-qh4rf_e42c0d5f-7c0c-420f-a14b-59316b524101/manager/0.log" Mar 09 10:31:24 crc kubenswrapper[4792]: I0309 10:31:24.684821 4792 scope.go:117] "RemoveContainer" containerID="e37eff5330f6637c46a22bed33527979089264c39088746b21ae17216f78512e" Mar 09 10:31:24 crc kubenswrapper[4792]: E0309 10:31:24.697398 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-97tth_openshift-machine-config-operator(bd11045a-d746-4b42-872c-8b8d1dd2d515)\"" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" Mar 09 10:31:27 crc kubenswrapper[4792]: I0309 10:31:27.946522 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-6db6876945-9jfp7_d9dc8da2-0584-4db0-ad3a-f1c59c2f6028/manager/0.log" Mar 09 10:31:39 crc kubenswrapper[4792]: I0309 10:31:39.662502 4792 scope.go:117] "RemoveContainer" containerID="e37eff5330f6637c46a22bed33527979089264c39088746b21ae17216f78512e" Mar 09 10:31:39 crc kubenswrapper[4792]: E0309 10:31:39.664523 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-97tth_openshift-machine-config-operator(bd11045a-d746-4b42-872c-8b8d1dd2d515)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" Mar 09 10:31:46 crc kubenswrapper[4792]: I0309 10:31:46.827660 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-fzxrs_f24bba0a-6535-4ad8-8aa7-86a71a268334/control-plane-machine-set-operator/0.log" Mar 09 10:31:47 crc kubenswrapper[4792]: I0309 10:31:47.434645 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-56b7z_fc2b2079-7189-4ca3-b398-2a1146b9c70f/machine-api-operator/0.log" Mar 09 10:31:47 crc kubenswrapper[4792]: I0309 10:31:47.477329 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-56b7z_fc2b2079-7189-4ca3-b398-2a1146b9c70f/kube-rbac-proxy/0.log" Mar 09 10:31:52 crc kubenswrapper[4792]: I0309 10:31:52.662103 4792 scope.go:117] "RemoveContainer" containerID="e37eff5330f6637c46a22bed33527979089264c39088746b21ae17216f78512e" Mar 09 10:31:52 crc kubenswrapper[4792]: E0309 10:31:52.662933 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-97tth_openshift-machine-config-operator(bd11045a-d746-4b42-872c-8b8d1dd2d515)\"" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" Mar 09 10:31:56 crc kubenswrapper[4792]: I0309 10:31:56.255565 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-lksrt"] Mar 09 10:31:56 crc kubenswrapper[4792]: E0309 10:31:56.257944 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9a92e2f-33fd-44a9-89f2-2717e41e9d2a" containerName="oc" Mar 09 10:31:56 crc kubenswrapper[4792]: I0309 10:31:56.257962 4792 
state_mem.go:107] "Deleted CPUSet assignment" podUID="c9a92e2f-33fd-44a9-89f2-2717e41e9d2a" containerName="oc" Mar 09 10:31:56 crc kubenswrapper[4792]: E0309 10:31:56.257973 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="741ae386-433b-4991-a5af-03faa3c1a8cd" containerName="collect-profiles" Mar 09 10:31:56 crc kubenswrapper[4792]: I0309 10:31:56.257979 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="741ae386-433b-4991-a5af-03faa3c1a8cd" containerName="collect-profiles" Mar 09 10:31:56 crc kubenswrapper[4792]: I0309 10:31:56.258175 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9a92e2f-33fd-44a9-89f2-2717e41e9d2a" containerName="oc" Mar 09 10:31:56 crc kubenswrapper[4792]: I0309 10:31:56.258201 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="741ae386-433b-4991-a5af-03faa3c1a8cd" containerName="collect-profiles" Mar 09 10:31:56 crc kubenswrapper[4792]: I0309 10:31:56.259475 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lksrt" Mar 09 10:31:56 crc kubenswrapper[4792]: I0309 10:31:56.292863 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-lksrt"] Mar 09 10:31:56 crc kubenswrapper[4792]: I0309 10:31:56.384113 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wqclf\" (UniqueName: \"kubernetes.io/projected/743e780f-c75f-4407-a7ed-55fbdaeddb6a-kube-api-access-wqclf\") pod \"redhat-marketplace-lksrt\" (UID: \"743e780f-c75f-4407-a7ed-55fbdaeddb6a\") " pod="openshift-marketplace/redhat-marketplace-lksrt" Mar 09 10:31:56 crc kubenswrapper[4792]: I0309 10:31:56.384510 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/743e780f-c75f-4407-a7ed-55fbdaeddb6a-catalog-content\") pod \"redhat-marketplace-lksrt\" (UID: \"743e780f-c75f-4407-a7ed-55fbdaeddb6a\") " pod="openshift-marketplace/redhat-marketplace-lksrt" Mar 09 10:31:56 crc kubenswrapper[4792]: I0309 10:31:56.384607 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/743e780f-c75f-4407-a7ed-55fbdaeddb6a-utilities\") pod \"redhat-marketplace-lksrt\" (UID: \"743e780f-c75f-4407-a7ed-55fbdaeddb6a\") " pod="openshift-marketplace/redhat-marketplace-lksrt" Mar 09 10:31:56 crc kubenswrapper[4792]: I0309 10:31:56.486264 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wqclf\" (UniqueName: \"kubernetes.io/projected/743e780f-c75f-4407-a7ed-55fbdaeddb6a-kube-api-access-wqclf\") pod \"redhat-marketplace-lksrt\" (UID: \"743e780f-c75f-4407-a7ed-55fbdaeddb6a\") " pod="openshift-marketplace/redhat-marketplace-lksrt" Mar 09 10:31:56 crc kubenswrapper[4792]: I0309 10:31:56.486461 4792 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/743e780f-c75f-4407-a7ed-55fbdaeddb6a-catalog-content\") pod \"redhat-marketplace-lksrt\" (UID: \"743e780f-c75f-4407-a7ed-55fbdaeddb6a\") " pod="openshift-marketplace/redhat-marketplace-lksrt" Mar 09 10:31:56 crc kubenswrapper[4792]: I0309 10:31:56.486578 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/743e780f-c75f-4407-a7ed-55fbdaeddb6a-utilities\") pod \"redhat-marketplace-lksrt\" (UID: \"743e780f-c75f-4407-a7ed-55fbdaeddb6a\") " pod="openshift-marketplace/redhat-marketplace-lksrt" Mar 09 10:31:56 crc kubenswrapper[4792]: I0309 10:31:56.487130 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/743e780f-c75f-4407-a7ed-55fbdaeddb6a-utilities\") pod \"redhat-marketplace-lksrt\" (UID: \"743e780f-c75f-4407-a7ed-55fbdaeddb6a\") " pod="openshift-marketplace/redhat-marketplace-lksrt" Mar 09 10:31:56 crc kubenswrapper[4792]: I0309 10:31:56.487133 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/743e780f-c75f-4407-a7ed-55fbdaeddb6a-catalog-content\") pod \"redhat-marketplace-lksrt\" (UID: \"743e780f-c75f-4407-a7ed-55fbdaeddb6a\") " pod="openshift-marketplace/redhat-marketplace-lksrt" Mar 09 10:31:56 crc kubenswrapper[4792]: I0309 10:31:56.504535 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wqclf\" (UniqueName: \"kubernetes.io/projected/743e780f-c75f-4407-a7ed-55fbdaeddb6a-kube-api-access-wqclf\") pod \"redhat-marketplace-lksrt\" (UID: \"743e780f-c75f-4407-a7ed-55fbdaeddb6a\") " pod="openshift-marketplace/redhat-marketplace-lksrt" Mar 09 10:31:56 crc kubenswrapper[4792]: I0309 10:31:56.583291 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lksrt" Mar 09 10:31:57 crc kubenswrapper[4792]: I0309 10:31:57.113568 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-lksrt"] Mar 09 10:31:57 crc kubenswrapper[4792]: I0309 10:31:57.463166 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lksrt" event={"ID":"743e780f-c75f-4407-a7ed-55fbdaeddb6a","Type":"ContainerStarted","Data":"1449155cc902ebacbdb9ec67a766c8598d9c29204073c0ae534b4f80ef63bd1e"} Mar 09 10:31:58 crc kubenswrapper[4792]: I0309 10:31:58.473182 4792 generic.go:334] "Generic (PLEG): container finished" podID="743e780f-c75f-4407-a7ed-55fbdaeddb6a" containerID="f5412654f50d22186ce77e0808b5521b6e382eaa74bf57cd88c079e3f4d869f0" exitCode=0 Mar 09 10:31:58 crc kubenswrapper[4792]: I0309 10:31:58.473275 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lksrt" event={"ID":"743e780f-c75f-4407-a7ed-55fbdaeddb6a","Type":"ContainerDied","Data":"f5412654f50d22186ce77e0808b5521b6e382eaa74bf57cd88c079e3f4d869f0"} Mar 09 10:32:00 crc kubenswrapper[4792]: I0309 10:32:00.146469 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550872-jsdc5"] Mar 09 10:32:00 crc kubenswrapper[4792]: I0309 10:32:00.148605 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550872-jsdc5" Mar 09 10:32:00 crc kubenswrapper[4792]: I0309 10:32:00.153145 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 10:32:00 crc kubenswrapper[4792]: I0309 10:32:00.153490 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 10:32:00 crc kubenswrapper[4792]: I0309 10:32:00.153662 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fwclj" Mar 09 10:32:00 crc kubenswrapper[4792]: I0309 10:32:00.197804 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550872-jsdc5"] Mar 09 10:32:00 crc kubenswrapper[4792]: I0309 10:32:00.266850 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kktst\" (UniqueName: \"kubernetes.io/projected/a4ee8f8b-c971-47a8-ba9f-796809c0b326-kube-api-access-kktst\") pod \"auto-csr-approver-29550872-jsdc5\" (UID: \"a4ee8f8b-c971-47a8-ba9f-796809c0b326\") " pod="openshift-infra/auto-csr-approver-29550872-jsdc5" Mar 09 10:32:00 crc kubenswrapper[4792]: I0309 10:32:00.368710 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kktst\" (UniqueName: \"kubernetes.io/projected/a4ee8f8b-c971-47a8-ba9f-796809c0b326-kube-api-access-kktst\") pod \"auto-csr-approver-29550872-jsdc5\" (UID: \"a4ee8f8b-c971-47a8-ba9f-796809c0b326\") " pod="openshift-infra/auto-csr-approver-29550872-jsdc5" Mar 09 10:32:00 crc kubenswrapper[4792]: I0309 10:32:00.400989 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kktst\" (UniqueName: \"kubernetes.io/projected/a4ee8f8b-c971-47a8-ba9f-796809c0b326-kube-api-access-kktst\") pod \"auto-csr-approver-29550872-jsdc5\" (UID: \"a4ee8f8b-c971-47a8-ba9f-796809c0b326\") " 
pod="openshift-infra/auto-csr-approver-29550872-jsdc5" Mar 09 10:32:00 crc kubenswrapper[4792]: I0309 10:32:00.467773 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550872-jsdc5" Mar 09 10:32:00 crc kubenswrapper[4792]: I0309 10:32:00.490813 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lksrt" event={"ID":"743e780f-c75f-4407-a7ed-55fbdaeddb6a","Type":"ContainerStarted","Data":"6bfeccb49b55c2b8b80ef879fc5490ec69cc988d500f853c2af28dfcb39ba0be"} Mar 09 10:32:01 crc kubenswrapper[4792]: I0309 10:32:01.022642 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550872-jsdc5"] Mar 09 10:32:01 crc kubenswrapper[4792]: W0309 10:32:01.037357 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda4ee8f8b_c971_47a8_ba9f_796809c0b326.slice/crio-8dd6b7a3de9aac591476547f337ce3cb27df2820f17e9634846c0c39598feafe WatchSource:0}: Error finding container 8dd6b7a3de9aac591476547f337ce3cb27df2820f17e9634846c0c39598feafe: Status 404 returned error can't find the container with id 8dd6b7a3de9aac591476547f337ce3cb27df2820f17e9634846c0c39598feafe Mar 09 10:32:01 crc kubenswrapper[4792]: I0309 10:32:01.112583 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-858654f9db-72tj7_a6ad9459-0185-47b2-aebd-5a5a40554946/cert-manager-controller/0.log" Mar 09 10:32:01 crc kubenswrapper[4792]: I0309 10:32:01.308596 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-xfgbm_8c00ef29-8d91-4045-982c-8b4a6e98576b/cert-manager-cainjector/0.log" Mar 09 10:32:01 crc kubenswrapper[4792]: I0309 10:32:01.344525 4792 log.go:25] "Finished parsing log file" 
path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-7cft4_68e071be-fad9-4996-a83f-cd58058fe0f3/cert-manager-webhook/0.log" Mar 09 10:32:01 crc kubenswrapper[4792]: I0309 10:32:01.499881 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550872-jsdc5" event={"ID":"a4ee8f8b-c971-47a8-ba9f-796809c0b326","Type":"ContainerStarted","Data":"8dd6b7a3de9aac591476547f337ce3cb27df2820f17e9634846c0c39598feafe"} Mar 09 10:32:01 crc kubenswrapper[4792]: I0309 10:32:01.502143 4792 generic.go:334] "Generic (PLEG): container finished" podID="743e780f-c75f-4407-a7ed-55fbdaeddb6a" containerID="6bfeccb49b55c2b8b80ef879fc5490ec69cc988d500f853c2af28dfcb39ba0be" exitCode=0 Mar 09 10:32:01 crc kubenswrapper[4792]: I0309 10:32:01.502195 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lksrt" event={"ID":"743e780f-c75f-4407-a7ed-55fbdaeddb6a","Type":"ContainerDied","Data":"6bfeccb49b55c2b8b80ef879fc5490ec69cc988d500f853c2af28dfcb39ba0be"} Mar 09 10:32:02 crc kubenswrapper[4792]: I0309 10:32:02.518399 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550872-jsdc5" event={"ID":"a4ee8f8b-c971-47a8-ba9f-796809c0b326","Type":"ContainerStarted","Data":"936ce09d6ba3bc159fa27d63b41d82b1eb122725dd3932855fd75aa26ca13c86"} Mar 09 10:32:02 crc kubenswrapper[4792]: I0309 10:32:02.524897 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lksrt" event={"ID":"743e780f-c75f-4407-a7ed-55fbdaeddb6a","Type":"ContainerStarted","Data":"658d1044d1ef49cdc022dfa491b2199daa292a68d9419a93f47e834f078bfa91"} Mar 09 10:32:02 crc kubenswrapper[4792]: I0309 10:32:02.538443 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29550872-jsdc5" podStartSLOduration=1.683116809 podStartE2EDuration="2.538419145s" podCreationTimestamp="2026-03-09 10:32:00 +0000 UTC" 
firstStartedPulling="2026-03-09 10:32:01.03974231 +0000 UTC m=+5086.069943062" lastFinishedPulling="2026-03-09 10:32:01.895044646 +0000 UTC m=+5086.925245398" observedRunningTime="2026-03-09 10:32:02.535336468 +0000 UTC m=+5087.565537220" watchObservedRunningTime="2026-03-09 10:32:02.538419145 +0000 UTC m=+5087.568619917" Mar 09 10:32:02 crc kubenswrapper[4792]: I0309 10:32:02.563537 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-lksrt" podStartSLOduration=2.885512867 podStartE2EDuration="6.563518158s" podCreationTimestamp="2026-03-09 10:31:56 +0000 UTC" firstStartedPulling="2026-03-09 10:31:58.475006524 +0000 UTC m=+5083.505207276" lastFinishedPulling="2026-03-09 10:32:02.153011815 +0000 UTC m=+5087.183212567" observedRunningTime="2026-03-09 10:32:02.561396278 +0000 UTC m=+5087.591597050" watchObservedRunningTime="2026-03-09 10:32:02.563518158 +0000 UTC m=+5087.593718910" Mar 09 10:32:03 crc kubenswrapper[4792]: I0309 10:32:03.535336 4792 generic.go:334] "Generic (PLEG): container finished" podID="a4ee8f8b-c971-47a8-ba9f-796809c0b326" containerID="936ce09d6ba3bc159fa27d63b41d82b1eb122725dd3932855fd75aa26ca13c86" exitCode=0 Mar 09 10:32:03 crc kubenswrapper[4792]: I0309 10:32:03.537199 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550872-jsdc5" event={"ID":"a4ee8f8b-c971-47a8-ba9f-796809c0b326","Type":"ContainerDied","Data":"936ce09d6ba3bc159fa27d63b41d82b1eb122725dd3932855fd75aa26ca13c86"} Mar 09 10:32:03 crc kubenswrapper[4792]: I0309 10:32:03.662954 4792 scope.go:117] "RemoveContainer" containerID="e37eff5330f6637c46a22bed33527979089264c39088746b21ae17216f78512e" Mar 09 10:32:03 crc kubenswrapper[4792]: E0309 10:32:03.663211 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-97tth_openshift-machine-config-operator(bd11045a-d746-4b42-872c-8b8d1dd2d515)\"" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" Mar 09 10:32:04 crc kubenswrapper[4792]: I0309 10:32:04.944913 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550872-jsdc5" Mar 09 10:32:05 crc kubenswrapper[4792]: I0309 10:32:05.066442 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kktst\" (UniqueName: \"kubernetes.io/projected/a4ee8f8b-c971-47a8-ba9f-796809c0b326-kube-api-access-kktst\") pod \"a4ee8f8b-c971-47a8-ba9f-796809c0b326\" (UID: \"a4ee8f8b-c971-47a8-ba9f-796809c0b326\") " Mar 09 10:32:05 crc kubenswrapper[4792]: I0309 10:32:05.072570 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4ee8f8b-c971-47a8-ba9f-796809c0b326-kube-api-access-kktst" (OuterVolumeSpecName: "kube-api-access-kktst") pod "a4ee8f8b-c971-47a8-ba9f-796809c0b326" (UID: "a4ee8f8b-c971-47a8-ba9f-796809c0b326"). InnerVolumeSpecName "kube-api-access-kktst". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 10:32:05 crc kubenswrapper[4792]: I0309 10:32:05.169516 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kktst\" (UniqueName: \"kubernetes.io/projected/a4ee8f8b-c971-47a8-ba9f-796809c0b326-kube-api-access-kktst\") on node \"crc\" DevicePath \"\"" Mar 09 10:32:05 crc kubenswrapper[4792]: I0309 10:32:05.560590 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550872-jsdc5" event={"ID":"a4ee8f8b-c971-47a8-ba9f-796809c0b326","Type":"ContainerDied","Data":"8dd6b7a3de9aac591476547f337ce3cb27df2820f17e9634846c0c39598feafe"} Mar 09 10:32:05 crc kubenswrapper[4792]: I0309 10:32:05.560662 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8dd6b7a3de9aac591476547f337ce3cb27df2820f17e9634846c0c39598feafe" Mar 09 10:32:05 crc kubenswrapper[4792]: I0309 10:32:05.560677 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550872-jsdc5" Mar 09 10:32:05 crc kubenswrapper[4792]: I0309 10:32:05.622030 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550866-xvz8b"] Mar 09 10:32:05 crc kubenswrapper[4792]: I0309 10:32:05.638359 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550866-xvz8b"] Mar 09 10:32:05 crc kubenswrapper[4792]: I0309 10:32:05.685755 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ecee766e-be60-4c51-a24e-88dccfa0c460" path="/var/lib/kubelet/pods/ecee766e-be60-4c51-a24e-88dccfa0c460/volumes" Mar 09 10:32:06 crc kubenswrapper[4792]: I0309 10:32:06.583503 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-lksrt" Mar 09 10:32:06 crc kubenswrapper[4792]: I0309 10:32:06.583909 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/redhat-marketplace-lksrt" Mar 09 10:32:06 crc kubenswrapper[4792]: I0309 10:32:06.633411 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-lksrt" Mar 09 10:32:08 crc kubenswrapper[4792]: I0309 10:32:08.094322 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-lksrt" Mar 09 10:32:08 crc kubenswrapper[4792]: I0309 10:32:08.145441 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-lksrt"] Mar 09 10:32:09 crc kubenswrapper[4792]: I0309 10:32:09.597941 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-lksrt" podUID="743e780f-c75f-4407-a7ed-55fbdaeddb6a" containerName="registry-server" containerID="cri-o://658d1044d1ef49cdc022dfa491b2199daa292a68d9419a93f47e834f078bfa91" gracePeriod=2 Mar 09 10:32:09 crc kubenswrapper[4792]: E0309 10:32:09.699513 4792 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod743e780f_c75f_4407_a7ed_55fbdaeddb6a.slice/crio-conmon-658d1044d1ef49cdc022dfa491b2199daa292a68d9419a93f47e834f078bfa91.scope\": RecentStats: unable to find data in memory cache]" Mar 09 10:32:10 crc kubenswrapper[4792]: I0309 10:32:10.058486 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lksrt" Mar 09 10:32:10 crc kubenswrapper[4792]: I0309 10:32:10.129966 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/743e780f-c75f-4407-a7ed-55fbdaeddb6a-utilities\") pod \"743e780f-c75f-4407-a7ed-55fbdaeddb6a\" (UID: \"743e780f-c75f-4407-a7ed-55fbdaeddb6a\") " Mar 09 10:32:10 crc kubenswrapper[4792]: I0309 10:32:10.130033 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wqclf\" (UniqueName: \"kubernetes.io/projected/743e780f-c75f-4407-a7ed-55fbdaeddb6a-kube-api-access-wqclf\") pod \"743e780f-c75f-4407-a7ed-55fbdaeddb6a\" (UID: \"743e780f-c75f-4407-a7ed-55fbdaeddb6a\") " Mar 09 10:32:10 crc kubenswrapper[4792]: I0309 10:32:10.130100 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/743e780f-c75f-4407-a7ed-55fbdaeddb6a-catalog-content\") pod \"743e780f-c75f-4407-a7ed-55fbdaeddb6a\" (UID: \"743e780f-c75f-4407-a7ed-55fbdaeddb6a\") " Mar 09 10:32:10 crc kubenswrapper[4792]: I0309 10:32:10.135972 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/743e780f-c75f-4407-a7ed-55fbdaeddb6a-kube-api-access-wqclf" (OuterVolumeSpecName: "kube-api-access-wqclf") pod "743e780f-c75f-4407-a7ed-55fbdaeddb6a" (UID: "743e780f-c75f-4407-a7ed-55fbdaeddb6a"). InnerVolumeSpecName "kube-api-access-wqclf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 10:32:10 crc kubenswrapper[4792]: I0309 10:32:10.147058 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/743e780f-c75f-4407-a7ed-55fbdaeddb6a-utilities" (OuterVolumeSpecName: "utilities") pod "743e780f-c75f-4407-a7ed-55fbdaeddb6a" (UID: "743e780f-c75f-4407-a7ed-55fbdaeddb6a"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 10:32:10 crc kubenswrapper[4792]: I0309 10:32:10.154336 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/743e780f-c75f-4407-a7ed-55fbdaeddb6a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "743e780f-c75f-4407-a7ed-55fbdaeddb6a" (UID: "743e780f-c75f-4407-a7ed-55fbdaeddb6a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 10:32:10 crc kubenswrapper[4792]: I0309 10:32:10.234021 4792 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/743e780f-c75f-4407-a7ed-55fbdaeddb6a-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 10:32:10 crc kubenswrapper[4792]: I0309 10:32:10.234548 4792 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/743e780f-c75f-4407-a7ed-55fbdaeddb6a-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 10:32:10 crc kubenswrapper[4792]: I0309 10:32:10.234688 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wqclf\" (UniqueName: \"kubernetes.io/projected/743e780f-c75f-4407-a7ed-55fbdaeddb6a-kube-api-access-wqclf\") on node \"crc\" DevicePath \"\"" Mar 09 10:32:10 crc kubenswrapper[4792]: I0309 10:32:10.607663 4792 generic.go:334] "Generic (PLEG): container finished" podID="743e780f-c75f-4407-a7ed-55fbdaeddb6a" containerID="658d1044d1ef49cdc022dfa491b2199daa292a68d9419a93f47e834f078bfa91" exitCode=0 Mar 09 10:32:10 crc kubenswrapper[4792]: I0309 10:32:10.607712 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lksrt" event={"ID":"743e780f-c75f-4407-a7ed-55fbdaeddb6a","Type":"ContainerDied","Data":"658d1044d1ef49cdc022dfa491b2199daa292a68d9419a93f47e834f078bfa91"} Mar 09 10:32:10 crc kubenswrapper[4792]: I0309 10:32:10.607741 4792 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-lksrt" event={"ID":"743e780f-c75f-4407-a7ed-55fbdaeddb6a","Type":"ContainerDied","Data":"1449155cc902ebacbdb9ec67a766c8598d9c29204073c0ae534b4f80ef63bd1e"} Mar 09 10:32:10 crc kubenswrapper[4792]: I0309 10:32:10.607760 4792 scope.go:117] "RemoveContainer" containerID="658d1044d1ef49cdc022dfa491b2199daa292a68d9419a93f47e834f078bfa91" Mar 09 10:32:10 crc kubenswrapper[4792]: I0309 10:32:10.607768 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lksrt" Mar 09 10:32:10 crc kubenswrapper[4792]: I0309 10:32:10.634218 4792 scope.go:117] "RemoveContainer" containerID="6bfeccb49b55c2b8b80ef879fc5490ec69cc988d500f853c2af28dfcb39ba0be" Mar 09 10:32:10 crc kubenswrapper[4792]: I0309 10:32:10.652293 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-lksrt"] Mar 09 10:32:10 crc kubenswrapper[4792]: I0309 10:32:10.667010 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-lksrt"] Mar 09 10:32:10 crc kubenswrapper[4792]: I0309 10:32:10.674041 4792 scope.go:117] "RemoveContainer" containerID="f5412654f50d22186ce77e0808b5521b6e382eaa74bf57cd88c079e3f4d869f0" Mar 09 10:32:10 crc kubenswrapper[4792]: I0309 10:32:10.713656 4792 scope.go:117] "RemoveContainer" containerID="658d1044d1ef49cdc022dfa491b2199daa292a68d9419a93f47e834f078bfa91" Mar 09 10:32:10 crc kubenswrapper[4792]: E0309 10:32:10.714181 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"658d1044d1ef49cdc022dfa491b2199daa292a68d9419a93f47e834f078bfa91\": container with ID starting with 658d1044d1ef49cdc022dfa491b2199daa292a68d9419a93f47e834f078bfa91 not found: ID does not exist" containerID="658d1044d1ef49cdc022dfa491b2199daa292a68d9419a93f47e834f078bfa91" Mar 09 10:32:10 crc kubenswrapper[4792]: I0309 10:32:10.714208 4792 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"658d1044d1ef49cdc022dfa491b2199daa292a68d9419a93f47e834f078bfa91"} err="failed to get container status \"658d1044d1ef49cdc022dfa491b2199daa292a68d9419a93f47e834f078bfa91\": rpc error: code = NotFound desc = could not find container \"658d1044d1ef49cdc022dfa491b2199daa292a68d9419a93f47e834f078bfa91\": container with ID starting with 658d1044d1ef49cdc022dfa491b2199daa292a68d9419a93f47e834f078bfa91 not found: ID does not exist" Mar 09 10:32:10 crc kubenswrapper[4792]: I0309 10:32:10.714243 4792 scope.go:117] "RemoveContainer" containerID="6bfeccb49b55c2b8b80ef879fc5490ec69cc988d500f853c2af28dfcb39ba0be" Mar 09 10:32:10 crc kubenswrapper[4792]: E0309 10:32:10.714598 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6bfeccb49b55c2b8b80ef879fc5490ec69cc988d500f853c2af28dfcb39ba0be\": container with ID starting with 6bfeccb49b55c2b8b80ef879fc5490ec69cc988d500f853c2af28dfcb39ba0be not found: ID does not exist" containerID="6bfeccb49b55c2b8b80ef879fc5490ec69cc988d500f853c2af28dfcb39ba0be" Mar 09 10:32:10 crc kubenswrapper[4792]: I0309 10:32:10.714618 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6bfeccb49b55c2b8b80ef879fc5490ec69cc988d500f853c2af28dfcb39ba0be"} err="failed to get container status \"6bfeccb49b55c2b8b80ef879fc5490ec69cc988d500f853c2af28dfcb39ba0be\": rpc error: code = NotFound desc = could not find container \"6bfeccb49b55c2b8b80ef879fc5490ec69cc988d500f853c2af28dfcb39ba0be\": container with ID starting with 6bfeccb49b55c2b8b80ef879fc5490ec69cc988d500f853c2af28dfcb39ba0be not found: ID does not exist" Mar 09 10:32:10 crc kubenswrapper[4792]: I0309 10:32:10.714630 4792 scope.go:117] "RemoveContainer" containerID="f5412654f50d22186ce77e0808b5521b6e382eaa74bf57cd88c079e3f4d869f0" Mar 09 10:32:10 crc kubenswrapper[4792]: E0309 
10:32:10.714995 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f5412654f50d22186ce77e0808b5521b6e382eaa74bf57cd88c079e3f4d869f0\": container with ID starting with f5412654f50d22186ce77e0808b5521b6e382eaa74bf57cd88c079e3f4d869f0 not found: ID does not exist" containerID="f5412654f50d22186ce77e0808b5521b6e382eaa74bf57cd88c079e3f4d869f0" Mar 09 10:32:10 crc kubenswrapper[4792]: I0309 10:32:10.715048 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f5412654f50d22186ce77e0808b5521b6e382eaa74bf57cd88c079e3f4d869f0"} err="failed to get container status \"f5412654f50d22186ce77e0808b5521b6e382eaa74bf57cd88c079e3f4d869f0\": rpc error: code = NotFound desc = could not find container \"f5412654f50d22186ce77e0808b5521b6e382eaa74bf57cd88c079e3f4d869f0\": container with ID starting with f5412654f50d22186ce77e0808b5521b6e382eaa74bf57cd88c079e3f4d869f0 not found: ID does not exist" Mar 09 10:32:11 crc kubenswrapper[4792]: I0309 10:32:11.672052 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="743e780f-c75f-4407-a7ed-55fbdaeddb6a" path="/var/lib/kubelet/pods/743e780f-c75f-4407-a7ed-55fbdaeddb6a/volumes" Mar 09 10:32:16 crc kubenswrapper[4792]: I0309 10:32:16.883483 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-5dcbbd79cf-5hgb5_dffb3a22-ee53-4b05-921e-bf92456a5518/nmstate-console-plugin/0.log" Mar 09 10:32:17 crc kubenswrapper[4792]: I0309 10:32:17.079679 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-5bwpq_7370c580-bd4f-4659-8fe6-79d9f8b31c05/nmstate-handler/0.log" Mar 09 10:32:17 crc kubenswrapper[4792]: I0309 10:32:17.164414 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-69594cc75-rwpn2_0681a6fd-5531-4a3c-b2d8-59dfecd186c2/nmstate-metrics/0.log" Mar 09 10:32:17 crc 
kubenswrapper[4792]: I0309 10:32:17.184797 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-69594cc75-rwpn2_0681a6fd-5531-4a3c-b2d8-59dfecd186c2/kube-rbac-proxy/0.log" Mar 09 10:32:17 crc kubenswrapper[4792]: I0309 10:32:17.329898 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-75c5dccd6c-rqfwt_c08f74f8-f8d6-48e8-bde0-9369c92969b0/nmstate-operator/0.log" Mar 09 10:32:17 crc kubenswrapper[4792]: I0309 10:32:17.452032 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-786f45cff4-bsxt5_a8fdeb8b-8024-4916-b835-83a6da0b4ced/nmstate-webhook/0.log" Mar 09 10:32:17 crc kubenswrapper[4792]: I0309 10:32:17.662849 4792 scope.go:117] "RemoveContainer" containerID="e37eff5330f6637c46a22bed33527979089264c39088746b21ae17216f78512e" Mar 09 10:32:17 crc kubenswrapper[4792]: E0309 10:32:17.663165 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-97tth_openshift-machine-config-operator(bd11045a-d746-4b42-872c-8b8d1dd2d515)\"" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" Mar 09 10:32:30 crc kubenswrapper[4792]: I0309 10:32:30.662553 4792 scope.go:117] "RemoveContainer" containerID="e37eff5330f6637c46a22bed33527979089264c39088746b21ae17216f78512e" Mar 09 10:32:30 crc kubenswrapper[4792]: E0309 10:32:30.664266 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-97tth_openshift-machine-config-operator(bd11045a-d746-4b42-872c-8b8d1dd2d515)\"" pod="openshift-machine-config-operator/machine-config-daemon-97tth" 
podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" Mar 09 10:32:42 crc kubenswrapper[4792]: I0309 10:32:42.662409 4792 scope.go:117] "RemoveContainer" containerID="e37eff5330f6637c46a22bed33527979089264c39088746b21ae17216f78512e" Mar 09 10:32:42 crc kubenswrapper[4792]: E0309 10:32:42.663201 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-97tth_openshift-machine-config-operator(bd11045a-d746-4b42-872c-8b8d1dd2d515)\"" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" Mar 09 10:32:45 crc kubenswrapper[4792]: I0309 10:32:45.646145 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-86ddb6bd46-2lfdv_98a5f3b6-5d33-4542-9382-ea1d94e5f59f/kube-rbac-proxy/0.log" Mar 09 10:32:45 crc kubenswrapper[4792]: I0309 10:32:45.753013 4792 scope.go:117] "RemoveContainer" containerID="e925f019ff7525740362ab513ef045e02b3ccdf1c69383fd9402e922585067d5" Mar 09 10:32:45 crc kubenswrapper[4792]: I0309 10:32:45.829436 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-86ddb6bd46-2lfdv_98a5f3b6-5d33-4542-9382-ea1d94e5f59f/controller/0.log" Mar 09 10:32:46 crc kubenswrapper[4792]: I0309 10:32:46.277879 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rll29_10a41b58-f88e-4dad-960f-cd70b006c3e7/cp-frr-files/0.log" Mar 09 10:32:46 crc kubenswrapper[4792]: I0309 10:32:46.574279 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rll29_10a41b58-f88e-4dad-960f-cd70b006c3e7/cp-frr-files/0.log" Mar 09 10:32:46 crc kubenswrapper[4792]: I0309 10:32:46.604735 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rll29_10a41b58-f88e-4dad-960f-cd70b006c3e7/cp-metrics/0.log" Mar 09 
10:32:46 crc kubenswrapper[4792]: I0309 10:32:46.649673 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rll29_10a41b58-f88e-4dad-960f-cd70b006c3e7/cp-reloader/0.log" Mar 09 10:32:46 crc kubenswrapper[4792]: I0309 10:32:46.679722 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rll29_10a41b58-f88e-4dad-960f-cd70b006c3e7/cp-reloader/0.log" Mar 09 10:32:46 crc kubenswrapper[4792]: I0309 10:32:46.862434 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rll29_10a41b58-f88e-4dad-960f-cd70b006c3e7/cp-reloader/0.log" Mar 09 10:32:46 crc kubenswrapper[4792]: I0309 10:32:46.866156 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rll29_10a41b58-f88e-4dad-960f-cd70b006c3e7/cp-frr-files/0.log" Mar 09 10:32:46 crc kubenswrapper[4792]: I0309 10:32:46.899036 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rll29_10a41b58-f88e-4dad-960f-cd70b006c3e7/cp-metrics/0.log" Mar 09 10:32:46 crc kubenswrapper[4792]: I0309 10:32:46.907004 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rll29_10a41b58-f88e-4dad-960f-cd70b006c3e7/cp-metrics/0.log" Mar 09 10:32:47 crc kubenswrapper[4792]: I0309 10:32:47.179946 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rll29_10a41b58-f88e-4dad-960f-cd70b006c3e7/controller/0.log" Mar 09 10:32:47 crc kubenswrapper[4792]: I0309 10:32:47.183016 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rll29_10a41b58-f88e-4dad-960f-cd70b006c3e7/cp-metrics/0.log" Mar 09 10:32:47 crc kubenswrapper[4792]: I0309 10:32:47.191543 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rll29_10a41b58-f88e-4dad-960f-cd70b006c3e7/cp-frr-files/0.log" Mar 09 10:32:47 crc kubenswrapper[4792]: I0309 10:32:47.237037 4792 log.go:25] "Finished parsing log 
file" path="/var/log/pods/metallb-system_frr-k8s-rll29_10a41b58-f88e-4dad-960f-cd70b006c3e7/cp-reloader/0.log" Mar 09 10:32:47 crc kubenswrapper[4792]: I0309 10:32:47.399160 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rll29_10a41b58-f88e-4dad-960f-cd70b006c3e7/kube-rbac-proxy/0.log" Mar 09 10:32:47 crc kubenswrapper[4792]: I0309 10:32:47.419922 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rll29_10a41b58-f88e-4dad-960f-cd70b006c3e7/frr-metrics/0.log" Mar 09 10:32:47 crc kubenswrapper[4792]: I0309 10:32:47.469414 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rll29_10a41b58-f88e-4dad-960f-cd70b006c3e7/kube-rbac-proxy-frr/0.log" Mar 09 10:32:47 crc kubenswrapper[4792]: I0309 10:32:47.750817 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7f989f654f-gm7zm_7ac24411-eccd-496a-8a49-d9b552a92691/frr-k8s-webhook-server/0.log" Mar 09 10:32:47 crc kubenswrapper[4792]: I0309 10:32:47.767469 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rll29_10a41b58-f88e-4dad-960f-cd70b006c3e7/reloader/0.log" Mar 09 10:32:48 crc kubenswrapper[4792]: I0309 10:32:48.093054 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-5d5f56c665-gxjds_5cb925f9-fcd8-47a5-8959-76bfdbbc2979/manager/0.log" Mar 09 10:32:48 crc kubenswrapper[4792]: I0309 10:32:48.312655 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-6d4cf89d46-x6c57_491ea032-e688-454c-a67d-09966007bb7f/webhook-server/0.log" Mar 09 10:32:48 crc kubenswrapper[4792]: I0309 10:32:48.441738 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-9fj2f_6ed0a6a3-dfa2-49c2-bbb2-96a6f3cfc4f9/kube-rbac-proxy/0.log" Mar 09 10:32:49 crc kubenswrapper[4792]: I0309 10:32:49.171351 4792 
log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-9fj2f_6ed0a6a3-dfa2-49c2-bbb2-96a6f3cfc4f9/speaker/0.log" Mar 09 10:32:49 crc kubenswrapper[4792]: I0309 10:32:49.219976 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rll29_10a41b58-f88e-4dad-960f-cd70b006c3e7/frr/0.log" Mar 09 10:32:55 crc kubenswrapper[4792]: I0309 10:32:55.676904 4792 scope.go:117] "RemoveContainer" containerID="e37eff5330f6637c46a22bed33527979089264c39088746b21ae17216f78512e" Mar 09 10:32:55 crc kubenswrapper[4792]: E0309 10:32:55.678041 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-97tth_openshift-machine-config-operator(bd11045a-d746-4b42-872c-8b8d1dd2d515)\"" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" Mar 09 10:33:04 crc kubenswrapper[4792]: I0309 10:33:04.611919 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82dsz92_95e345ef-d076-4754-b2e9-db935995c8c0/util/0.log" Mar 09 10:33:04 crc kubenswrapper[4792]: I0309 10:33:04.920089 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82dsz92_95e345ef-d076-4754-b2e9-db935995c8c0/util/0.log" Mar 09 10:33:04 crc kubenswrapper[4792]: I0309 10:33:04.921893 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82dsz92_95e345ef-d076-4754-b2e9-db935995c8c0/pull/0.log" Mar 09 10:33:04 crc kubenswrapper[4792]: I0309 10:33:04.973754 4792 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82dsz92_95e345ef-d076-4754-b2e9-db935995c8c0/pull/0.log" Mar 09 10:33:05 crc kubenswrapper[4792]: I0309 10:33:05.182652 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82dsz92_95e345ef-d076-4754-b2e9-db935995c8c0/util/0.log" Mar 09 10:33:05 crc kubenswrapper[4792]: I0309 10:33:05.215159 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82dsz92_95e345ef-d076-4754-b2e9-db935995c8c0/extract/0.log" Mar 09 10:33:05 crc kubenswrapper[4792]: I0309 10:33:05.225536 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82dsz92_95e345ef-d076-4754-b2e9-db935995c8c0/pull/0.log" Mar 09 10:33:05 crc kubenswrapper[4792]: I0309 10:33:05.434215 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-mkwxh_2bce5f9c-c863-4962-a276-2b5a3a69def9/extract-utilities/0.log" Mar 09 10:33:05 crc kubenswrapper[4792]: I0309 10:33:05.688917 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-mkwxh_2bce5f9c-c863-4962-a276-2b5a3a69def9/extract-content/0.log" Mar 09 10:33:05 crc kubenswrapper[4792]: I0309 10:33:05.703861 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-mkwxh_2bce5f9c-c863-4962-a276-2b5a3a69def9/extract-content/0.log" Mar 09 10:33:05 crc kubenswrapper[4792]: I0309 10:33:05.712851 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-mkwxh_2bce5f9c-c863-4962-a276-2b5a3a69def9/extract-utilities/0.log" Mar 09 10:33:06 crc kubenswrapper[4792]: I0309 10:33:06.255985 4792 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_certified-operators-mkwxh_2bce5f9c-c863-4962-a276-2b5a3a69def9/extract-utilities/0.log" Mar 09 10:33:06 crc kubenswrapper[4792]: I0309 10:33:06.439763 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-mkwxh_2bce5f9c-c863-4962-a276-2b5a3a69def9/extract-content/0.log" Mar 09 10:33:06 crc kubenswrapper[4792]: I0309 10:33:06.581670 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-ks9xs_4f866ac7-cc92-4520-9d5e-0147cac097f2/extract-utilities/0.log" Mar 09 10:33:06 crc kubenswrapper[4792]: I0309 10:33:06.915630 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-ks9xs_4f866ac7-cc92-4520-9d5e-0147cac097f2/extract-content/0.log" Mar 09 10:33:06 crc kubenswrapper[4792]: I0309 10:33:06.980012 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-ks9xs_4f866ac7-cc92-4520-9d5e-0147cac097f2/extract-utilities/0.log" Mar 09 10:33:06 crc kubenswrapper[4792]: I0309 10:33:06.990673 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-ks9xs_4f866ac7-cc92-4520-9d5e-0147cac097f2/extract-content/0.log" Mar 09 10:33:07 crc kubenswrapper[4792]: I0309 10:33:07.405558 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-ks9xs_4f866ac7-cc92-4520-9d5e-0147cac097f2/extract-content/0.log" Mar 09 10:33:07 crc kubenswrapper[4792]: I0309 10:33:07.456784 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-ks9xs_4f866ac7-cc92-4520-9d5e-0147cac097f2/extract-utilities/0.log" Mar 09 10:33:07 crc kubenswrapper[4792]: I0309 10:33:07.656138 4792 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_certified-operators-mkwxh_2bce5f9c-c863-4962-a276-2b5a3a69def9/registry-server/0.log" Mar 09 10:33:07 crc kubenswrapper[4792]: I0309 10:33:07.788223 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-ks9xs_4f866ac7-cc92-4520-9d5e-0147cac097f2/registry-server/0.log" Mar 09 10:33:07 crc kubenswrapper[4792]: I0309 10:33:07.835062 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4h5lz6_3419b911-375b-44c5-8be3-074ce9531ac5/util/0.log" Mar 09 10:33:08 crc kubenswrapper[4792]: I0309 10:33:08.103679 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4h5lz6_3419b911-375b-44c5-8be3-074ce9531ac5/pull/0.log" Mar 09 10:33:08 crc kubenswrapper[4792]: I0309 10:33:08.143941 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4h5lz6_3419b911-375b-44c5-8be3-074ce9531ac5/util/0.log" Mar 09 10:33:08 crc kubenswrapper[4792]: I0309 10:33:08.155302 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4h5lz6_3419b911-375b-44c5-8be3-074ce9531ac5/pull/0.log" Mar 09 10:33:08 crc kubenswrapper[4792]: I0309 10:33:08.358932 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4h5lz6_3419b911-375b-44c5-8be3-074ce9531ac5/util/0.log" Mar 09 10:33:08 crc kubenswrapper[4792]: I0309 10:33:08.440833 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4h5lz6_3419b911-375b-44c5-8be3-074ce9531ac5/pull/0.log" Mar 09 10:33:08 crc kubenswrapper[4792]: I0309 10:33:08.543050 
4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4h5lz6_3419b911-375b-44c5-8be3-074ce9531ac5/extract/0.log" Mar 09 10:33:08 crc kubenswrapper[4792]: I0309 10:33:08.652345 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-m64ct_0589998d-961b-4184-9884-0ad5eee48348/marketplace-operator/0.log" Mar 09 10:33:08 crc kubenswrapper[4792]: I0309 10:33:08.662534 4792 scope.go:117] "RemoveContainer" containerID="e37eff5330f6637c46a22bed33527979089264c39088746b21ae17216f78512e" Mar 09 10:33:08 crc kubenswrapper[4792]: E0309 10:33:08.662816 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-97tth_openshift-machine-config-operator(bd11045a-d746-4b42-872c-8b8d1dd2d515)\"" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" Mar 09 10:33:08 crc kubenswrapper[4792]: I0309 10:33:08.778167 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-2rnrf_6ac8fc64-583a-420b-b356-cfa0491d9b6f/extract-utilities/0.log" Mar 09 10:33:09 crc kubenswrapper[4792]: I0309 10:33:09.003662 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-2rnrf_6ac8fc64-583a-420b-b356-cfa0491d9b6f/extract-content/0.log" Mar 09 10:33:09 crc kubenswrapper[4792]: I0309 10:33:09.457703 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-2rnrf_6ac8fc64-583a-420b-b356-cfa0491d9b6f/extract-content/0.log" Mar 09 10:33:09 crc kubenswrapper[4792]: I0309 10:33:09.472716 4792 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-2rnrf_6ac8fc64-583a-420b-b356-cfa0491d9b6f/extract-utilities/0.log" Mar 09 10:33:09 crc kubenswrapper[4792]: I0309 10:33:09.685998 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-2rnrf_6ac8fc64-583a-420b-b356-cfa0491d9b6f/extract-content/0.log" Mar 09 10:33:09 crc kubenswrapper[4792]: I0309 10:33:09.831910 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-2rnrf_6ac8fc64-583a-420b-b356-cfa0491d9b6f/extract-utilities/0.log" Mar 09 10:33:10 crc kubenswrapper[4792]: I0309 10:33:10.026352 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-ms9zz_a807ba61-7b14-443b-a870-6220b51d2bd6/extract-utilities/0.log" Mar 09 10:33:10 crc kubenswrapper[4792]: I0309 10:33:10.291438 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-ms9zz_a807ba61-7b14-443b-a870-6220b51d2bd6/extract-utilities/0.log" Mar 09 10:33:10 crc kubenswrapper[4792]: I0309 10:33:10.309309 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-ms9zz_a807ba61-7b14-443b-a870-6220b51d2bd6/extract-content/0.log" Mar 09 10:33:10 crc kubenswrapper[4792]: I0309 10:33:10.326301 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-ms9zz_a807ba61-7b14-443b-a870-6220b51d2bd6/extract-content/0.log" Mar 09 10:33:10 crc kubenswrapper[4792]: I0309 10:33:10.370251 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-2rnrf_6ac8fc64-583a-420b-b356-cfa0491d9b6f/registry-server/0.log" Mar 09 10:33:10 crc kubenswrapper[4792]: I0309 10:33:10.580257 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-ms9zz_a807ba61-7b14-443b-a870-6220b51d2bd6/extract-utilities/0.log" 
Mar 09 10:33:10 crc kubenswrapper[4792]: I0309 10:33:10.612731 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-ms9zz_a807ba61-7b14-443b-a870-6220b51d2bd6/extract-content/0.log" Mar 09 10:33:11 crc kubenswrapper[4792]: I0309 10:33:11.190555 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-ms9zz_a807ba61-7b14-443b-a870-6220b51d2bd6/registry-server/0.log" Mar 09 10:33:22 crc kubenswrapper[4792]: I0309 10:33:22.663813 4792 scope.go:117] "RemoveContainer" containerID="e37eff5330f6637c46a22bed33527979089264c39088746b21ae17216f78512e" Mar 09 10:33:22 crc kubenswrapper[4792]: E0309 10:33:22.665526 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-97tth_openshift-machine-config-operator(bd11045a-d746-4b42-872c-8b8d1dd2d515)\"" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" Mar 09 10:33:37 crc kubenswrapper[4792]: I0309 10:33:37.662456 4792 scope.go:117] "RemoveContainer" containerID="e37eff5330f6637c46a22bed33527979089264c39088746b21ae17216f78512e" Mar 09 10:33:37 crc kubenswrapper[4792]: E0309 10:33:37.663018 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-97tth_openshift-machine-config-operator(bd11045a-d746-4b42-872c-8b8d1dd2d515)\"" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" Mar 09 10:33:51 crc kubenswrapper[4792]: I0309 10:33:51.662392 4792 scope.go:117] "RemoveContainer" containerID="e37eff5330f6637c46a22bed33527979089264c39088746b21ae17216f78512e" Mar 09 
10:33:51 crc kubenswrapper[4792]: E0309 10:33:51.663225 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-97tth_openshift-machine-config-operator(bd11045a-d746-4b42-872c-8b8d1dd2d515)\"" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" Mar 09 10:33:58 crc kubenswrapper[4792]: I0309 10:33:58.391545 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-s5xlr"] Mar 09 10:33:58 crc kubenswrapper[4792]: E0309 10:33:58.393440 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="743e780f-c75f-4407-a7ed-55fbdaeddb6a" containerName="extract-utilities" Mar 09 10:33:58 crc kubenswrapper[4792]: I0309 10:33:58.393519 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="743e780f-c75f-4407-a7ed-55fbdaeddb6a" containerName="extract-utilities" Mar 09 10:33:58 crc kubenswrapper[4792]: E0309 10:33:58.393594 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4ee8f8b-c971-47a8-ba9f-796809c0b326" containerName="oc" Mar 09 10:33:58 crc kubenswrapper[4792]: I0309 10:33:58.393648 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4ee8f8b-c971-47a8-ba9f-796809c0b326" containerName="oc" Mar 09 10:33:58 crc kubenswrapper[4792]: E0309 10:33:58.393701 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="743e780f-c75f-4407-a7ed-55fbdaeddb6a" containerName="extract-content" Mar 09 10:33:58 crc kubenswrapper[4792]: I0309 10:33:58.393751 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="743e780f-c75f-4407-a7ed-55fbdaeddb6a" containerName="extract-content" Mar 09 10:33:58 crc kubenswrapper[4792]: E0309 10:33:58.393819 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="743e780f-c75f-4407-a7ed-55fbdaeddb6a" 
containerName="registry-server" Mar 09 10:33:58 crc kubenswrapper[4792]: I0309 10:33:58.393897 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="743e780f-c75f-4407-a7ed-55fbdaeddb6a" containerName="registry-server" Mar 09 10:33:58 crc kubenswrapper[4792]: I0309 10:33:58.396858 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="743e780f-c75f-4407-a7ed-55fbdaeddb6a" containerName="registry-server" Mar 09 10:33:58 crc kubenswrapper[4792]: I0309 10:33:58.397045 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4ee8f8b-c971-47a8-ba9f-796809c0b326" containerName="oc" Mar 09 10:33:58 crc kubenswrapper[4792]: I0309 10:33:58.398557 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-s5xlr" Mar 09 10:33:58 crc kubenswrapper[4792]: I0309 10:33:58.411373 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-s5xlr"] Mar 09 10:33:58 crc kubenswrapper[4792]: I0309 10:33:58.511257 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/351772c5-f0cc-4b95-887b-136d52231179-catalog-content\") pod \"redhat-operators-s5xlr\" (UID: \"351772c5-f0cc-4b95-887b-136d52231179\") " pod="openshift-marketplace/redhat-operators-s5xlr" Mar 09 10:33:58 crc kubenswrapper[4792]: I0309 10:33:58.511583 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-csxbf\" (UniqueName: \"kubernetes.io/projected/351772c5-f0cc-4b95-887b-136d52231179-kube-api-access-csxbf\") pod \"redhat-operators-s5xlr\" (UID: \"351772c5-f0cc-4b95-887b-136d52231179\") " pod="openshift-marketplace/redhat-operators-s5xlr" Mar 09 10:33:58 crc kubenswrapper[4792]: I0309 10:33:58.511730 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/351772c5-f0cc-4b95-887b-136d52231179-utilities\") pod \"redhat-operators-s5xlr\" (UID: \"351772c5-f0cc-4b95-887b-136d52231179\") " pod="openshift-marketplace/redhat-operators-s5xlr" Mar 09 10:33:58 crc kubenswrapper[4792]: I0309 10:33:58.613916 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/351772c5-f0cc-4b95-887b-136d52231179-utilities\") pod \"redhat-operators-s5xlr\" (UID: \"351772c5-f0cc-4b95-887b-136d52231179\") " pod="openshift-marketplace/redhat-operators-s5xlr" Mar 09 10:33:58 crc kubenswrapper[4792]: I0309 10:33:58.614013 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/351772c5-f0cc-4b95-887b-136d52231179-catalog-content\") pod \"redhat-operators-s5xlr\" (UID: \"351772c5-f0cc-4b95-887b-136d52231179\") " pod="openshift-marketplace/redhat-operators-s5xlr" Mar 09 10:33:58 crc kubenswrapper[4792]: I0309 10:33:58.614208 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-csxbf\" (UniqueName: \"kubernetes.io/projected/351772c5-f0cc-4b95-887b-136d52231179-kube-api-access-csxbf\") pod \"redhat-operators-s5xlr\" (UID: \"351772c5-f0cc-4b95-887b-136d52231179\") " pod="openshift-marketplace/redhat-operators-s5xlr" Mar 09 10:33:58 crc kubenswrapper[4792]: I0309 10:33:58.614462 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/351772c5-f0cc-4b95-887b-136d52231179-utilities\") pod \"redhat-operators-s5xlr\" (UID: \"351772c5-f0cc-4b95-887b-136d52231179\") " pod="openshift-marketplace/redhat-operators-s5xlr" Mar 09 10:33:58 crc kubenswrapper[4792]: I0309 10:33:58.614694 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/351772c5-f0cc-4b95-887b-136d52231179-catalog-content\") pod \"redhat-operators-s5xlr\" (UID: \"351772c5-f0cc-4b95-887b-136d52231179\") " pod="openshift-marketplace/redhat-operators-s5xlr" Mar 09 10:33:58 crc kubenswrapper[4792]: I0309 10:33:58.651920 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-csxbf\" (UniqueName: \"kubernetes.io/projected/351772c5-f0cc-4b95-887b-136d52231179-kube-api-access-csxbf\") pod \"redhat-operators-s5xlr\" (UID: \"351772c5-f0cc-4b95-887b-136d52231179\") " pod="openshift-marketplace/redhat-operators-s5xlr" Mar 09 10:33:58 crc kubenswrapper[4792]: I0309 10:33:58.720327 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-s5xlr" Mar 09 10:33:59 crc kubenswrapper[4792]: I0309 10:33:59.336918 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-s5xlr"] Mar 09 10:33:59 crc kubenswrapper[4792]: W0309 10:33:59.345189 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod351772c5_f0cc_4b95_887b_136d52231179.slice/crio-7c558ce5e26121bb91a64754a018a0429c68fbb14b408fd7551a0c4fa3db1c52 WatchSource:0}: Error finding container 7c558ce5e26121bb91a64754a018a0429c68fbb14b408fd7551a0c4fa3db1c52: Status 404 returned error can't find the container with id 7c558ce5e26121bb91a64754a018a0429c68fbb14b408fd7551a0c4fa3db1c52 Mar 09 10:33:59 crc kubenswrapper[4792]: I0309 10:33:59.557466 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s5xlr" event={"ID":"351772c5-f0cc-4b95-887b-136d52231179","Type":"ContainerStarted","Data":"7c558ce5e26121bb91a64754a018a0429c68fbb14b408fd7551a0c4fa3db1c52"} Mar 09 10:34:00 crc kubenswrapper[4792]: I0309 10:34:00.159249 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550874-g2kqd"] 
Mar 09 10:34:00 crc kubenswrapper[4792]: I0309 10:34:00.160921 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550874-g2kqd" Mar 09 10:34:00 crc kubenswrapper[4792]: I0309 10:34:00.167741 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fwclj" Mar 09 10:34:00 crc kubenswrapper[4792]: I0309 10:34:00.168043 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 10:34:00 crc kubenswrapper[4792]: I0309 10:34:00.168211 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 10:34:00 crc kubenswrapper[4792]: I0309 10:34:00.189106 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550874-g2kqd"] Mar 09 10:34:00 crc kubenswrapper[4792]: I0309 10:34:00.250209 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2ssdd\" (UniqueName: \"kubernetes.io/projected/d0cdb8fb-2438-4a06-a1de-953aefb12fe8-kube-api-access-2ssdd\") pod \"auto-csr-approver-29550874-g2kqd\" (UID: \"d0cdb8fb-2438-4a06-a1de-953aefb12fe8\") " pod="openshift-infra/auto-csr-approver-29550874-g2kqd" Mar 09 10:34:00 crc kubenswrapper[4792]: I0309 10:34:00.352240 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2ssdd\" (UniqueName: \"kubernetes.io/projected/d0cdb8fb-2438-4a06-a1de-953aefb12fe8-kube-api-access-2ssdd\") pod \"auto-csr-approver-29550874-g2kqd\" (UID: \"d0cdb8fb-2438-4a06-a1de-953aefb12fe8\") " pod="openshift-infra/auto-csr-approver-29550874-g2kqd" Mar 09 10:34:00 crc kubenswrapper[4792]: I0309 10:34:00.378303 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2ssdd\" (UniqueName: 
\"kubernetes.io/projected/d0cdb8fb-2438-4a06-a1de-953aefb12fe8-kube-api-access-2ssdd\") pod \"auto-csr-approver-29550874-g2kqd\" (UID: \"d0cdb8fb-2438-4a06-a1de-953aefb12fe8\") " pod="openshift-infra/auto-csr-approver-29550874-g2kqd" Mar 09 10:34:00 crc kubenswrapper[4792]: I0309 10:34:00.481827 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550874-g2kqd" Mar 09 10:34:00 crc kubenswrapper[4792]: I0309 10:34:00.580856 4792 generic.go:334] "Generic (PLEG): container finished" podID="351772c5-f0cc-4b95-887b-136d52231179" containerID="d702e185ef8b3c66ae80e3fe43430954c92d42a120bc984b026dd77017c8289f" exitCode=0 Mar 09 10:34:00 crc kubenswrapper[4792]: I0309 10:34:00.580903 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s5xlr" event={"ID":"351772c5-f0cc-4b95-887b-136d52231179","Type":"ContainerDied","Data":"d702e185ef8b3c66ae80e3fe43430954c92d42a120bc984b026dd77017c8289f"} Mar 09 10:34:00 crc kubenswrapper[4792]: I0309 10:34:00.593317 4792 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 09 10:34:01 crc kubenswrapper[4792]: I0309 10:34:01.005906 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550874-g2kqd"] Mar 09 10:34:01 crc kubenswrapper[4792]: W0309 10:34:01.245856 4792 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd0cdb8fb_2438_4a06_a1de_953aefb12fe8.slice/crio-0138d69995fa33dca2fb2dfdc20fabbd71f5d8ddcb4b7c04dda9cf400f90651d WatchSource:0}: Error finding container 0138d69995fa33dca2fb2dfdc20fabbd71f5d8ddcb4b7c04dda9cf400f90651d: Status 404 returned error can't find the container with id 0138d69995fa33dca2fb2dfdc20fabbd71f5d8ddcb4b7c04dda9cf400f90651d Mar 09 10:34:01 crc kubenswrapper[4792]: I0309 10:34:01.590193 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-infra/auto-csr-approver-29550874-g2kqd" event={"ID":"d0cdb8fb-2438-4a06-a1de-953aefb12fe8","Type":"ContainerStarted","Data":"0138d69995fa33dca2fb2dfdc20fabbd71f5d8ddcb4b7c04dda9cf400f90651d"} Mar 09 10:34:01 crc kubenswrapper[4792]: I0309 10:34:01.592015 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s5xlr" event={"ID":"351772c5-f0cc-4b95-887b-136d52231179","Type":"ContainerStarted","Data":"9a2a3245c33f3f568bf087d377eb279f4f4fdabd51f7795acf939fd6d6fd5b61"} Mar 09 10:34:02 crc kubenswrapper[4792]: I0309 10:34:02.663994 4792 scope.go:117] "RemoveContainer" containerID="e37eff5330f6637c46a22bed33527979089264c39088746b21ae17216f78512e" Mar 09 10:34:02 crc kubenswrapper[4792]: E0309 10:34:02.664850 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-97tth_openshift-machine-config-operator(bd11045a-d746-4b42-872c-8b8d1dd2d515)\"" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" Mar 09 10:34:03 crc kubenswrapper[4792]: I0309 10:34:03.614246 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550874-g2kqd" event={"ID":"d0cdb8fb-2438-4a06-a1de-953aefb12fe8","Type":"ContainerStarted","Data":"d998a7e9fe8b0586025d22a231924b13dea2e365e2a321f746f2ea6af7e9e910"} Mar 09 10:34:03 crc kubenswrapper[4792]: I0309 10:34:03.634104 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29550874-g2kqd" podStartSLOduration=2.6425255500000002 podStartE2EDuration="3.634084825s" podCreationTimestamp="2026-03-09 10:34:00 +0000 UTC" firstStartedPulling="2026-03-09 10:34:01.2492704 +0000 UTC m=+5206.279471162" lastFinishedPulling="2026-03-09 10:34:02.240829665 +0000 UTC 
m=+5207.271030437" observedRunningTime="2026-03-09 10:34:03.631104732 +0000 UTC m=+5208.661305484" watchObservedRunningTime="2026-03-09 10:34:03.634084825 +0000 UTC m=+5208.664285577" Mar 09 10:34:04 crc kubenswrapper[4792]: I0309 10:34:04.623507 4792 generic.go:334] "Generic (PLEG): container finished" podID="d0cdb8fb-2438-4a06-a1de-953aefb12fe8" containerID="d998a7e9fe8b0586025d22a231924b13dea2e365e2a321f746f2ea6af7e9e910" exitCode=0 Mar 09 10:34:04 crc kubenswrapper[4792]: I0309 10:34:04.623554 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550874-g2kqd" event={"ID":"d0cdb8fb-2438-4a06-a1de-953aefb12fe8","Type":"ContainerDied","Data":"d998a7e9fe8b0586025d22a231924b13dea2e365e2a321f746f2ea6af7e9e910"} Mar 09 10:34:06 crc kubenswrapper[4792]: I0309 10:34:06.173231 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550874-g2kqd" Mar 09 10:34:06 crc kubenswrapper[4792]: I0309 10:34:06.301356 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2ssdd\" (UniqueName: \"kubernetes.io/projected/d0cdb8fb-2438-4a06-a1de-953aefb12fe8-kube-api-access-2ssdd\") pod \"d0cdb8fb-2438-4a06-a1de-953aefb12fe8\" (UID: \"d0cdb8fb-2438-4a06-a1de-953aefb12fe8\") " Mar 09 10:34:06 crc kubenswrapper[4792]: I0309 10:34:06.313578 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d0cdb8fb-2438-4a06-a1de-953aefb12fe8-kube-api-access-2ssdd" (OuterVolumeSpecName: "kube-api-access-2ssdd") pod "d0cdb8fb-2438-4a06-a1de-953aefb12fe8" (UID: "d0cdb8fb-2438-4a06-a1de-953aefb12fe8"). InnerVolumeSpecName "kube-api-access-2ssdd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 10:34:06 crc kubenswrapper[4792]: I0309 10:34:06.404890 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2ssdd\" (UniqueName: \"kubernetes.io/projected/d0cdb8fb-2438-4a06-a1de-953aefb12fe8-kube-api-access-2ssdd\") on node \"crc\" DevicePath \"\"" Mar 09 10:34:06 crc kubenswrapper[4792]: I0309 10:34:06.645610 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550874-g2kqd" event={"ID":"d0cdb8fb-2438-4a06-a1de-953aefb12fe8","Type":"ContainerDied","Data":"0138d69995fa33dca2fb2dfdc20fabbd71f5d8ddcb4b7c04dda9cf400f90651d"} Mar 09 10:34:06 crc kubenswrapper[4792]: I0309 10:34:06.645646 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0138d69995fa33dca2fb2dfdc20fabbd71f5d8ddcb4b7c04dda9cf400f90651d" Mar 09 10:34:06 crc kubenswrapper[4792]: I0309 10:34:06.645922 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550874-g2kqd" Mar 09 10:34:06 crc kubenswrapper[4792]: I0309 10:34:06.733721 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550868-wzzr4"] Mar 09 10:34:06 crc kubenswrapper[4792]: I0309 10:34:06.743554 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550868-wzzr4"] Mar 09 10:34:07 crc kubenswrapper[4792]: I0309 10:34:07.656561 4792 generic.go:334] "Generic (PLEG): container finished" podID="351772c5-f0cc-4b95-887b-136d52231179" containerID="9a2a3245c33f3f568bf087d377eb279f4f4fdabd51f7795acf939fd6d6fd5b61" exitCode=0 Mar 09 10:34:07 crc kubenswrapper[4792]: I0309 10:34:07.656636 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s5xlr" event={"ID":"351772c5-f0cc-4b95-887b-136d52231179","Type":"ContainerDied","Data":"9a2a3245c33f3f568bf087d377eb279f4f4fdabd51f7795acf939fd6d6fd5b61"} 
Mar 09 10:34:07 crc kubenswrapper[4792]: I0309 10:34:07.677376 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d3bb20bd-9a4f-4227-9feb-ed8d6954fbda" path="/var/lib/kubelet/pods/d3bb20bd-9a4f-4227-9feb-ed8d6954fbda/volumes" Mar 09 10:34:08 crc kubenswrapper[4792]: I0309 10:34:08.667549 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s5xlr" event={"ID":"351772c5-f0cc-4b95-887b-136d52231179","Type":"ContainerStarted","Data":"22ee8c1af09317099b33afb754395a319c6da225c0c04fc699e69be075a37276"} Mar 09 10:34:08 crc kubenswrapper[4792]: I0309 10:34:08.700993 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-s5xlr" podStartSLOduration=3.239030444 podStartE2EDuration="10.700967724s" podCreationTimestamp="2026-03-09 10:33:58 +0000 UTC" firstStartedPulling="2026-03-09 10:34:00.592985441 +0000 UTC m=+5205.623186193" lastFinishedPulling="2026-03-09 10:34:08.054922721 +0000 UTC m=+5213.085123473" observedRunningTime="2026-03-09 10:34:08.694215205 +0000 UTC m=+5213.724415977" watchObservedRunningTime="2026-03-09 10:34:08.700967724 +0000 UTC m=+5213.731168476" Mar 09 10:34:08 crc kubenswrapper[4792]: I0309 10:34:08.721092 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-s5xlr" Mar 09 10:34:08 crc kubenswrapper[4792]: I0309 10:34:08.721146 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-s5xlr" Mar 09 10:34:09 crc kubenswrapper[4792]: I0309 10:34:09.769494 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-s5xlr" podUID="351772c5-f0cc-4b95-887b-136d52231179" containerName="registry-server" probeResult="failure" output=< Mar 09 10:34:09 crc kubenswrapper[4792]: timeout: failed to connect service ":50051" within 1s Mar 09 10:34:09 crc kubenswrapper[4792]: > Mar 09 
10:34:13 crc kubenswrapper[4792]: I0309 10:34:13.664002 4792 scope.go:117] "RemoveContainer" containerID="e37eff5330f6637c46a22bed33527979089264c39088746b21ae17216f78512e" Mar 09 10:34:13 crc kubenswrapper[4792]: E0309 10:34:13.666417 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-97tth_openshift-machine-config-operator(bd11045a-d746-4b42-872c-8b8d1dd2d515)\"" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" Mar 09 10:34:19 crc kubenswrapper[4792]: I0309 10:34:19.779223 4792 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-s5xlr" podUID="351772c5-f0cc-4b95-887b-136d52231179" containerName="registry-server" probeResult="failure" output=< Mar 09 10:34:19 crc kubenswrapper[4792]: timeout: failed to connect service ":50051" within 1s Mar 09 10:34:19 crc kubenswrapper[4792]: > Mar 09 10:34:25 crc kubenswrapper[4792]: I0309 10:34:25.669880 4792 scope.go:117] "RemoveContainer" containerID="e37eff5330f6637c46a22bed33527979089264c39088746b21ae17216f78512e" Mar 09 10:34:25 crc kubenswrapper[4792]: E0309 10:34:25.671255 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-97tth_openshift-machine-config-operator(bd11045a-d746-4b42-872c-8b8d1dd2d515)\"" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" Mar 09 10:34:28 crc kubenswrapper[4792]: I0309 10:34:28.767296 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-s5xlr" Mar 09 10:34:28 crc kubenswrapper[4792]: I0309 
10:34:28.835231 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-s5xlr" Mar 09 10:34:29 crc kubenswrapper[4792]: I0309 10:34:29.598011 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-s5xlr"] Mar 09 10:34:29 crc kubenswrapper[4792]: I0309 10:34:29.914506 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-s5xlr" podUID="351772c5-f0cc-4b95-887b-136d52231179" containerName="registry-server" containerID="cri-o://22ee8c1af09317099b33afb754395a319c6da225c0c04fc699e69be075a37276" gracePeriod=2 Mar 09 10:34:30 crc kubenswrapper[4792]: I0309 10:34:30.423953 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-s5xlr" Mar 09 10:34:30 crc kubenswrapper[4792]: I0309 10:34:30.489697 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/351772c5-f0cc-4b95-887b-136d52231179-catalog-content\") pod \"351772c5-f0cc-4b95-887b-136d52231179\" (UID: \"351772c5-f0cc-4b95-887b-136d52231179\") " Mar 09 10:34:30 crc kubenswrapper[4792]: I0309 10:34:30.489791 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-csxbf\" (UniqueName: \"kubernetes.io/projected/351772c5-f0cc-4b95-887b-136d52231179-kube-api-access-csxbf\") pod \"351772c5-f0cc-4b95-887b-136d52231179\" (UID: \"351772c5-f0cc-4b95-887b-136d52231179\") " Mar 09 10:34:30 crc kubenswrapper[4792]: I0309 10:34:30.489856 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/351772c5-f0cc-4b95-887b-136d52231179-utilities\") pod \"351772c5-f0cc-4b95-887b-136d52231179\" (UID: \"351772c5-f0cc-4b95-887b-136d52231179\") " Mar 09 10:34:30 crc kubenswrapper[4792]: I0309 
10:34:30.490612 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/351772c5-f0cc-4b95-887b-136d52231179-utilities" (OuterVolumeSpecName: "utilities") pod "351772c5-f0cc-4b95-887b-136d52231179" (UID: "351772c5-f0cc-4b95-887b-136d52231179"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 10:34:30 crc kubenswrapper[4792]: I0309 10:34:30.495704 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/351772c5-f0cc-4b95-887b-136d52231179-kube-api-access-csxbf" (OuterVolumeSpecName: "kube-api-access-csxbf") pod "351772c5-f0cc-4b95-887b-136d52231179" (UID: "351772c5-f0cc-4b95-887b-136d52231179"). InnerVolumeSpecName "kube-api-access-csxbf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 10:34:30 crc kubenswrapper[4792]: I0309 10:34:30.592106 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-csxbf\" (UniqueName: \"kubernetes.io/projected/351772c5-f0cc-4b95-887b-136d52231179-kube-api-access-csxbf\") on node \"crc\" DevicePath \"\"" Mar 09 10:34:30 crc kubenswrapper[4792]: I0309 10:34:30.592145 4792 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/351772c5-f0cc-4b95-887b-136d52231179-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 10:34:30 crc kubenswrapper[4792]: I0309 10:34:30.633367 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/351772c5-f0cc-4b95-887b-136d52231179-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "351772c5-f0cc-4b95-887b-136d52231179" (UID: "351772c5-f0cc-4b95-887b-136d52231179"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 10:34:30 crc kubenswrapper[4792]: I0309 10:34:30.694258 4792 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/351772c5-f0cc-4b95-887b-136d52231179-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 10:34:30 crc kubenswrapper[4792]: I0309 10:34:30.924965 4792 generic.go:334] "Generic (PLEG): container finished" podID="351772c5-f0cc-4b95-887b-136d52231179" containerID="22ee8c1af09317099b33afb754395a319c6da225c0c04fc699e69be075a37276" exitCode=0 Mar 09 10:34:30 crc kubenswrapper[4792]: I0309 10:34:30.925036 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s5xlr" event={"ID":"351772c5-f0cc-4b95-887b-136d52231179","Type":"ContainerDied","Data":"22ee8c1af09317099b33afb754395a319c6da225c0c04fc699e69be075a37276"} Mar 09 10:34:30 crc kubenswrapper[4792]: I0309 10:34:30.925084 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s5xlr" event={"ID":"351772c5-f0cc-4b95-887b-136d52231179","Type":"ContainerDied","Data":"7c558ce5e26121bb91a64754a018a0429c68fbb14b408fd7551a0c4fa3db1c52"} Mar 09 10:34:30 crc kubenswrapper[4792]: I0309 10:34:30.925100 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-s5xlr" Mar 09 10:34:30 crc kubenswrapper[4792]: I0309 10:34:30.925102 4792 scope.go:117] "RemoveContainer" containerID="22ee8c1af09317099b33afb754395a319c6da225c0c04fc699e69be075a37276" Mar 09 10:34:30 crc kubenswrapper[4792]: I0309 10:34:30.969719 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-s5xlr"] Mar 09 10:34:30 crc kubenswrapper[4792]: I0309 10:34:30.977217 4792 scope.go:117] "RemoveContainer" containerID="9a2a3245c33f3f568bf087d377eb279f4f4fdabd51f7795acf939fd6d6fd5b61" Mar 09 10:34:30 crc kubenswrapper[4792]: I0309 10:34:30.984223 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-s5xlr"] Mar 09 10:34:31 crc kubenswrapper[4792]: I0309 10:34:31.268480 4792 scope.go:117] "RemoveContainer" containerID="d702e185ef8b3c66ae80e3fe43430954c92d42a120bc984b026dd77017c8289f" Mar 09 10:34:31 crc kubenswrapper[4792]: I0309 10:34:31.303927 4792 scope.go:117] "RemoveContainer" containerID="22ee8c1af09317099b33afb754395a319c6da225c0c04fc699e69be075a37276" Mar 09 10:34:31 crc kubenswrapper[4792]: E0309 10:34:31.304388 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"22ee8c1af09317099b33afb754395a319c6da225c0c04fc699e69be075a37276\": container with ID starting with 22ee8c1af09317099b33afb754395a319c6da225c0c04fc699e69be075a37276 not found: ID does not exist" containerID="22ee8c1af09317099b33afb754395a319c6da225c0c04fc699e69be075a37276" Mar 09 10:34:31 crc kubenswrapper[4792]: I0309 10:34:31.304420 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"22ee8c1af09317099b33afb754395a319c6da225c0c04fc699e69be075a37276"} err="failed to get container status \"22ee8c1af09317099b33afb754395a319c6da225c0c04fc699e69be075a37276\": rpc error: code = NotFound desc = could not find container 
\"22ee8c1af09317099b33afb754395a319c6da225c0c04fc699e69be075a37276\": container with ID starting with 22ee8c1af09317099b33afb754395a319c6da225c0c04fc699e69be075a37276 not found: ID does not exist" Mar 09 10:34:31 crc kubenswrapper[4792]: I0309 10:34:31.304440 4792 scope.go:117] "RemoveContainer" containerID="9a2a3245c33f3f568bf087d377eb279f4f4fdabd51f7795acf939fd6d6fd5b61" Mar 09 10:34:31 crc kubenswrapper[4792]: E0309 10:34:31.304927 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9a2a3245c33f3f568bf087d377eb279f4f4fdabd51f7795acf939fd6d6fd5b61\": container with ID starting with 9a2a3245c33f3f568bf087d377eb279f4f4fdabd51f7795acf939fd6d6fd5b61 not found: ID does not exist" containerID="9a2a3245c33f3f568bf087d377eb279f4f4fdabd51f7795acf939fd6d6fd5b61" Mar 09 10:34:31 crc kubenswrapper[4792]: I0309 10:34:31.304959 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a2a3245c33f3f568bf087d377eb279f4f4fdabd51f7795acf939fd6d6fd5b61"} err="failed to get container status \"9a2a3245c33f3f568bf087d377eb279f4f4fdabd51f7795acf939fd6d6fd5b61\": rpc error: code = NotFound desc = could not find container \"9a2a3245c33f3f568bf087d377eb279f4f4fdabd51f7795acf939fd6d6fd5b61\": container with ID starting with 9a2a3245c33f3f568bf087d377eb279f4f4fdabd51f7795acf939fd6d6fd5b61 not found: ID does not exist" Mar 09 10:34:31 crc kubenswrapper[4792]: I0309 10:34:31.304974 4792 scope.go:117] "RemoveContainer" containerID="d702e185ef8b3c66ae80e3fe43430954c92d42a120bc984b026dd77017c8289f" Mar 09 10:34:31 crc kubenswrapper[4792]: E0309 10:34:31.305361 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d702e185ef8b3c66ae80e3fe43430954c92d42a120bc984b026dd77017c8289f\": container with ID starting with d702e185ef8b3c66ae80e3fe43430954c92d42a120bc984b026dd77017c8289f not found: ID does not exist" 
containerID="d702e185ef8b3c66ae80e3fe43430954c92d42a120bc984b026dd77017c8289f" Mar 09 10:34:31 crc kubenswrapper[4792]: I0309 10:34:31.305404 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d702e185ef8b3c66ae80e3fe43430954c92d42a120bc984b026dd77017c8289f"} err="failed to get container status \"d702e185ef8b3c66ae80e3fe43430954c92d42a120bc984b026dd77017c8289f\": rpc error: code = NotFound desc = could not find container \"d702e185ef8b3c66ae80e3fe43430954c92d42a120bc984b026dd77017c8289f\": container with ID starting with d702e185ef8b3c66ae80e3fe43430954c92d42a120bc984b026dd77017c8289f not found: ID does not exist" Mar 09 10:34:31 crc kubenswrapper[4792]: I0309 10:34:31.674272 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="351772c5-f0cc-4b95-887b-136d52231179" path="/var/lib/kubelet/pods/351772c5-f0cc-4b95-887b-136d52231179/volumes" Mar 09 10:34:37 crc kubenswrapper[4792]: I0309 10:34:37.215290 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-fqgqh"] Mar 09 10:34:37 crc kubenswrapper[4792]: E0309 10:34:37.216312 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0cdb8fb-2438-4a06-a1de-953aefb12fe8" containerName="oc" Mar 09 10:34:37 crc kubenswrapper[4792]: I0309 10:34:37.216326 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0cdb8fb-2438-4a06-a1de-953aefb12fe8" containerName="oc" Mar 09 10:34:37 crc kubenswrapper[4792]: E0309 10:34:37.216339 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="351772c5-f0cc-4b95-887b-136d52231179" containerName="extract-utilities" Mar 09 10:34:37 crc kubenswrapper[4792]: I0309 10:34:37.216347 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="351772c5-f0cc-4b95-887b-136d52231179" containerName="extract-utilities" Mar 09 10:34:37 crc kubenswrapper[4792]: E0309 10:34:37.216358 4792 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="351772c5-f0cc-4b95-887b-136d52231179" containerName="registry-server" Mar 09 10:34:37 crc kubenswrapper[4792]: I0309 10:34:37.216365 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="351772c5-f0cc-4b95-887b-136d52231179" containerName="registry-server" Mar 09 10:34:37 crc kubenswrapper[4792]: E0309 10:34:37.216394 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="351772c5-f0cc-4b95-887b-136d52231179" containerName="extract-content" Mar 09 10:34:37 crc kubenswrapper[4792]: I0309 10:34:37.216399 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="351772c5-f0cc-4b95-887b-136d52231179" containerName="extract-content" Mar 09 10:34:37 crc kubenswrapper[4792]: I0309 10:34:37.216605 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0cdb8fb-2438-4a06-a1de-953aefb12fe8" containerName="oc" Mar 09 10:34:37 crc kubenswrapper[4792]: I0309 10:34:37.216622 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="351772c5-f0cc-4b95-887b-136d52231179" containerName="registry-server" Mar 09 10:34:37 crc kubenswrapper[4792]: I0309 10:34:37.250840 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-fqgqh" Mar 09 10:34:37 crc kubenswrapper[4792]: I0309 10:34:37.262758 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-fqgqh"] Mar 09 10:34:37 crc kubenswrapper[4792]: I0309 10:34:37.353741 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3502a15e-0592-4198-8259-324a606166d9-catalog-content\") pod \"certified-operators-fqgqh\" (UID: \"3502a15e-0592-4198-8259-324a606166d9\") " pod="openshift-marketplace/certified-operators-fqgqh" Mar 09 10:34:37 crc kubenswrapper[4792]: I0309 10:34:37.353812 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3502a15e-0592-4198-8259-324a606166d9-utilities\") pod \"certified-operators-fqgqh\" (UID: \"3502a15e-0592-4198-8259-324a606166d9\") " pod="openshift-marketplace/certified-operators-fqgqh" Mar 09 10:34:37 crc kubenswrapper[4792]: I0309 10:34:37.353833 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ct6cd\" (UniqueName: \"kubernetes.io/projected/3502a15e-0592-4198-8259-324a606166d9-kube-api-access-ct6cd\") pod \"certified-operators-fqgqh\" (UID: \"3502a15e-0592-4198-8259-324a606166d9\") " pod="openshift-marketplace/certified-operators-fqgqh" Mar 09 10:34:37 crc kubenswrapper[4792]: I0309 10:34:37.455366 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3502a15e-0592-4198-8259-324a606166d9-utilities\") pod \"certified-operators-fqgqh\" (UID: \"3502a15e-0592-4198-8259-324a606166d9\") " pod="openshift-marketplace/certified-operators-fqgqh" Mar 09 10:34:37 crc kubenswrapper[4792]: I0309 10:34:37.455416 4792 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-ct6cd\" (UniqueName: \"kubernetes.io/projected/3502a15e-0592-4198-8259-324a606166d9-kube-api-access-ct6cd\") pod \"certified-operators-fqgqh\" (UID: \"3502a15e-0592-4198-8259-324a606166d9\") " pod="openshift-marketplace/certified-operators-fqgqh" Mar 09 10:34:37 crc kubenswrapper[4792]: I0309 10:34:37.455578 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3502a15e-0592-4198-8259-324a606166d9-catalog-content\") pod \"certified-operators-fqgqh\" (UID: \"3502a15e-0592-4198-8259-324a606166d9\") " pod="openshift-marketplace/certified-operators-fqgqh" Mar 09 10:34:37 crc kubenswrapper[4792]: I0309 10:34:37.456108 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3502a15e-0592-4198-8259-324a606166d9-utilities\") pod \"certified-operators-fqgqh\" (UID: \"3502a15e-0592-4198-8259-324a606166d9\") " pod="openshift-marketplace/certified-operators-fqgqh" Mar 09 10:34:37 crc kubenswrapper[4792]: I0309 10:34:37.456152 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3502a15e-0592-4198-8259-324a606166d9-catalog-content\") pod \"certified-operators-fqgqh\" (UID: \"3502a15e-0592-4198-8259-324a606166d9\") " pod="openshift-marketplace/certified-operators-fqgqh" Mar 09 10:34:37 crc kubenswrapper[4792]: I0309 10:34:37.479915 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ct6cd\" (UniqueName: \"kubernetes.io/projected/3502a15e-0592-4198-8259-324a606166d9-kube-api-access-ct6cd\") pod \"certified-operators-fqgqh\" (UID: \"3502a15e-0592-4198-8259-324a606166d9\") " pod="openshift-marketplace/certified-operators-fqgqh" Mar 09 10:34:37 crc kubenswrapper[4792]: I0309 10:34:37.582485 4792 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-fqgqh" Mar 09 10:34:38 crc kubenswrapper[4792]: I0309 10:34:38.230251 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-fqgqh"] Mar 09 10:34:38 crc kubenswrapper[4792]: I0309 10:34:38.662147 4792 scope.go:117] "RemoveContainer" containerID="e37eff5330f6637c46a22bed33527979089264c39088746b21ae17216f78512e" Mar 09 10:34:38 crc kubenswrapper[4792]: E0309 10:34:38.662706 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-97tth_openshift-machine-config-operator(bd11045a-d746-4b42-872c-8b8d1dd2d515)\"" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" Mar 09 10:34:39 crc kubenswrapper[4792]: I0309 10:34:39.009418 4792 generic.go:334] "Generic (PLEG): container finished" podID="3502a15e-0592-4198-8259-324a606166d9" containerID="69b55bcd315c3f798e1d8476e12e30981667a75efebdd466440cfd9d5cabf29d" exitCode=0 Mar 09 10:34:39 crc kubenswrapper[4792]: I0309 10:34:39.009461 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fqgqh" event={"ID":"3502a15e-0592-4198-8259-324a606166d9","Type":"ContainerDied","Data":"69b55bcd315c3f798e1d8476e12e30981667a75efebdd466440cfd9d5cabf29d"} Mar 09 10:34:39 crc kubenswrapper[4792]: I0309 10:34:39.009488 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fqgqh" event={"ID":"3502a15e-0592-4198-8259-324a606166d9","Type":"ContainerStarted","Data":"fb8311d5962f8ea5a372b9816caa2565c71d980c921cbaac9f38a0f75d3028e8"} Mar 09 10:34:40 crc kubenswrapper[4792]: I0309 10:34:40.020885 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fqgqh" 
event={"ID":"3502a15e-0592-4198-8259-324a606166d9","Type":"ContainerStarted","Data":"5d294eddfa641a9e61a5dd7cbc1349f96eafa967fdba9e15f9d01f38da7f2d26"} Mar 09 10:34:42 crc kubenswrapper[4792]: I0309 10:34:42.046424 4792 generic.go:334] "Generic (PLEG): container finished" podID="3502a15e-0592-4198-8259-324a606166d9" containerID="5d294eddfa641a9e61a5dd7cbc1349f96eafa967fdba9e15f9d01f38da7f2d26" exitCode=0 Mar 09 10:34:42 crc kubenswrapper[4792]: I0309 10:34:42.046866 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fqgqh" event={"ID":"3502a15e-0592-4198-8259-324a606166d9","Type":"ContainerDied","Data":"5d294eddfa641a9e61a5dd7cbc1349f96eafa967fdba9e15f9d01f38da7f2d26"} Mar 09 10:34:43 crc kubenswrapper[4792]: I0309 10:34:43.058236 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fqgqh" event={"ID":"3502a15e-0592-4198-8259-324a606166d9","Type":"ContainerStarted","Data":"b0b7e2def3e46f11210c37fef9431d834b1854ce07a904c8c138f14184b09dcd"} Mar 09 10:34:43 crc kubenswrapper[4792]: I0309 10:34:43.088152 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-fqgqh" podStartSLOduration=2.639036204 podStartE2EDuration="6.088133553s" podCreationTimestamp="2026-03-09 10:34:37 +0000 UTC" firstStartedPulling="2026-03-09 10:34:39.011541996 +0000 UTC m=+5244.041742748" lastFinishedPulling="2026-03-09 10:34:42.460639345 +0000 UTC m=+5247.490840097" observedRunningTime="2026-03-09 10:34:43.077708281 +0000 UTC m=+5248.107909043" watchObservedRunningTime="2026-03-09 10:34:43.088133553 +0000 UTC m=+5248.118334305" Mar 09 10:34:46 crc kubenswrapper[4792]: I0309 10:34:46.077148 4792 scope.go:117] "RemoveContainer" containerID="03c3857788da04320940b5453806946c3cb75df463a6cad9aa4982c36525bed1" Mar 09 10:34:47 crc kubenswrapper[4792]: I0309 10:34:47.582935 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/certified-operators-fqgqh" Mar 09 10:34:47 crc kubenswrapper[4792]: I0309 10:34:47.583649 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-fqgqh" Mar 09 10:34:47 crc kubenswrapper[4792]: I0309 10:34:47.635937 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-fqgqh" Mar 09 10:34:48 crc kubenswrapper[4792]: I0309 10:34:48.146216 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-fqgqh" Mar 09 10:34:48 crc kubenswrapper[4792]: I0309 10:34:48.207602 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-fqgqh"] Mar 09 10:34:50 crc kubenswrapper[4792]: I0309 10:34:50.115959 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-fqgqh" podUID="3502a15e-0592-4198-8259-324a606166d9" containerName="registry-server" containerID="cri-o://b0b7e2def3e46f11210c37fef9431d834b1854ce07a904c8c138f14184b09dcd" gracePeriod=2 Mar 09 10:34:50 crc kubenswrapper[4792]: I0309 10:34:50.683553 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-fqgqh" Mar 09 10:34:50 crc kubenswrapper[4792]: I0309 10:34:50.765486 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3502a15e-0592-4198-8259-324a606166d9-utilities\") pod \"3502a15e-0592-4198-8259-324a606166d9\" (UID: \"3502a15e-0592-4198-8259-324a606166d9\") " Mar 09 10:34:50 crc kubenswrapper[4792]: I0309 10:34:50.765548 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3502a15e-0592-4198-8259-324a606166d9-catalog-content\") pod \"3502a15e-0592-4198-8259-324a606166d9\" (UID: \"3502a15e-0592-4198-8259-324a606166d9\") " Mar 09 10:34:50 crc kubenswrapper[4792]: I0309 10:34:50.765663 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ct6cd\" (UniqueName: \"kubernetes.io/projected/3502a15e-0592-4198-8259-324a606166d9-kube-api-access-ct6cd\") pod \"3502a15e-0592-4198-8259-324a606166d9\" (UID: \"3502a15e-0592-4198-8259-324a606166d9\") " Mar 09 10:34:50 crc kubenswrapper[4792]: I0309 10:34:50.773539 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3502a15e-0592-4198-8259-324a606166d9-kube-api-access-ct6cd" (OuterVolumeSpecName: "kube-api-access-ct6cd") pod "3502a15e-0592-4198-8259-324a606166d9" (UID: "3502a15e-0592-4198-8259-324a606166d9"). InnerVolumeSpecName "kube-api-access-ct6cd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 10:34:50 crc kubenswrapper[4792]: I0309 10:34:50.781459 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3502a15e-0592-4198-8259-324a606166d9-utilities" (OuterVolumeSpecName: "utilities") pod "3502a15e-0592-4198-8259-324a606166d9" (UID: "3502a15e-0592-4198-8259-324a606166d9"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 10:34:50 crc kubenswrapper[4792]: I0309 10:34:50.868635 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ct6cd\" (UniqueName: \"kubernetes.io/projected/3502a15e-0592-4198-8259-324a606166d9-kube-api-access-ct6cd\") on node \"crc\" DevicePath \"\"" Mar 09 10:34:50 crc kubenswrapper[4792]: I0309 10:34:50.868685 4792 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3502a15e-0592-4198-8259-324a606166d9-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 10:34:50 crc kubenswrapper[4792]: I0309 10:34:50.915486 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3502a15e-0592-4198-8259-324a606166d9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3502a15e-0592-4198-8259-324a606166d9" (UID: "3502a15e-0592-4198-8259-324a606166d9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 10:34:50 crc kubenswrapper[4792]: I0309 10:34:50.970956 4792 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3502a15e-0592-4198-8259-324a606166d9-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 10:34:51 crc kubenswrapper[4792]: I0309 10:34:51.128847 4792 generic.go:334] "Generic (PLEG): container finished" podID="3502a15e-0592-4198-8259-324a606166d9" containerID="b0b7e2def3e46f11210c37fef9431d834b1854ce07a904c8c138f14184b09dcd" exitCode=0 Mar 09 10:34:51 crc kubenswrapper[4792]: I0309 10:34:51.128891 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fqgqh" event={"ID":"3502a15e-0592-4198-8259-324a606166d9","Type":"ContainerDied","Data":"b0b7e2def3e46f11210c37fef9431d834b1854ce07a904c8c138f14184b09dcd"} Mar 09 10:34:51 crc kubenswrapper[4792]: I0309 10:34:51.128923 4792 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-fqgqh" event={"ID":"3502a15e-0592-4198-8259-324a606166d9","Type":"ContainerDied","Data":"fb8311d5962f8ea5a372b9816caa2565c71d980c921cbaac9f38a0f75d3028e8"} Mar 09 10:34:51 crc kubenswrapper[4792]: I0309 10:34:51.128945 4792 scope.go:117] "RemoveContainer" containerID="b0b7e2def3e46f11210c37fef9431d834b1854ce07a904c8c138f14184b09dcd" Mar 09 10:34:51 crc kubenswrapper[4792]: I0309 10:34:51.128975 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-fqgqh" Mar 09 10:34:51 crc kubenswrapper[4792]: I0309 10:34:51.166766 4792 scope.go:117] "RemoveContainer" containerID="5d294eddfa641a9e61a5dd7cbc1349f96eafa967fdba9e15f9d01f38da7f2d26" Mar 09 10:34:51 crc kubenswrapper[4792]: I0309 10:34:51.177499 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-fqgqh"] Mar 09 10:34:51 crc kubenswrapper[4792]: I0309 10:34:51.188060 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-fqgqh"] Mar 09 10:34:51 crc kubenswrapper[4792]: I0309 10:34:51.188554 4792 scope.go:117] "RemoveContainer" containerID="69b55bcd315c3f798e1d8476e12e30981667a75efebdd466440cfd9d5cabf29d" Mar 09 10:34:51 crc kubenswrapper[4792]: I0309 10:34:51.243682 4792 scope.go:117] "RemoveContainer" containerID="b0b7e2def3e46f11210c37fef9431d834b1854ce07a904c8c138f14184b09dcd" Mar 09 10:34:51 crc kubenswrapper[4792]: E0309 10:34:51.244144 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b0b7e2def3e46f11210c37fef9431d834b1854ce07a904c8c138f14184b09dcd\": container with ID starting with b0b7e2def3e46f11210c37fef9431d834b1854ce07a904c8c138f14184b09dcd not found: ID does not exist" containerID="b0b7e2def3e46f11210c37fef9431d834b1854ce07a904c8c138f14184b09dcd" Mar 09 10:34:51 crc kubenswrapper[4792]: I0309 
10:34:51.244261 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b0b7e2def3e46f11210c37fef9431d834b1854ce07a904c8c138f14184b09dcd"} err="failed to get container status \"b0b7e2def3e46f11210c37fef9431d834b1854ce07a904c8c138f14184b09dcd\": rpc error: code = NotFound desc = could not find container \"b0b7e2def3e46f11210c37fef9431d834b1854ce07a904c8c138f14184b09dcd\": container with ID starting with b0b7e2def3e46f11210c37fef9431d834b1854ce07a904c8c138f14184b09dcd not found: ID does not exist" Mar 09 10:34:51 crc kubenswrapper[4792]: I0309 10:34:51.244353 4792 scope.go:117] "RemoveContainer" containerID="5d294eddfa641a9e61a5dd7cbc1349f96eafa967fdba9e15f9d01f38da7f2d26" Mar 09 10:34:51 crc kubenswrapper[4792]: E0309 10:34:51.244765 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5d294eddfa641a9e61a5dd7cbc1349f96eafa967fdba9e15f9d01f38da7f2d26\": container with ID starting with 5d294eddfa641a9e61a5dd7cbc1349f96eafa967fdba9e15f9d01f38da7f2d26 not found: ID does not exist" containerID="5d294eddfa641a9e61a5dd7cbc1349f96eafa967fdba9e15f9d01f38da7f2d26" Mar 09 10:34:51 crc kubenswrapper[4792]: I0309 10:34:51.244863 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d294eddfa641a9e61a5dd7cbc1349f96eafa967fdba9e15f9d01f38da7f2d26"} err="failed to get container status \"5d294eddfa641a9e61a5dd7cbc1349f96eafa967fdba9e15f9d01f38da7f2d26\": rpc error: code = NotFound desc = could not find container \"5d294eddfa641a9e61a5dd7cbc1349f96eafa967fdba9e15f9d01f38da7f2d26\": container with ID starting with 5d294eddfa641a9e61a5dd7cbc1349f96eafa967fdba9e15f9d01f38da7f2d26 not found: ID does not exist" Mar 09 10:34:51 crc kubenswrapper[4792]: I0309 10:34:51.244971 4792 scope.go:117] "RemoveContainer" containerID="69b55bcd315c3f798e1d8476e12e30981667a75efebdd466440cfd9d5cabf29d" Mar 09 10:34:51 crc 
kubenswrapper[4792]: E0309 10:34:51.245314 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"69b55bcd315c3f798e1d8476e12e30981667a75efebdd466440cfd9d5cabf29d\": container with ID starting with 69b55bcd315c3f798e1d8476e12e30981667a75efebdd466440cfd9d5cabf29d not found: ID does not exist" containerID="69b55bcd315c3f798e1d8476e12e30981667a75efebdd466440cfd9d5cabf29d" Mar 09 10:34:51 crc kubenswrapper[4792]: I0309 10:34:51.245346 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"69b55bcd315c3f798e1d8476e12e30981667a75efebdd466440cfd9d5cabf29d"} err="failed to get container status \"69b55bcd315c3f798e1d8476e12e30981667a75efebdd466440cfd9d5cabf29d\": rpc error: code = NotFound desc = could not find container \"69b55bcd315c3f798e1d8476e12e30981667a75efebdd466440cfd9d5cabf29d\": container with ID starting with 69b55bcd315c3f798e1d8476e12e30981667a75efebdd466440cfd9d5cabf29d not found: ID does not exist" Mar 09 10:34:51 crc kubenswrapper[4792]: I0309 10:34:51.664589 4792 scope.go:117] "RemoveContainer" containerID="e37eff5330f6637c46a22bed33527979089264c39088746b21ae17216f78512e" Mar 09 10:34:51 crc kubenswrapper[4792]: E0309 10:34:51.664863 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-97tth_openshift-machine-config-operator(bd11045a-d746-4b42-872c-8b8d1dd2d515)\"" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" Mar 09 10:34:51 crc kubenswrapper[4792]: I0309 10:34:51.676742 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3502a15e-0592-4198-8259-324a606166d9" path="/var/lib/kubelet/pods/3502a15e-0592-4198-8259-324a606166d9/volumes" Mar 09 10:35:05 crc 
kubenswrapper[4792]: I0309 10:35:05.669891 4792 scope.go:117] "RemoveContainer" containerID="e37eff5330f6637c46a22bed33527979089264c39088746b21ae17216f78512e" Mar 09 10:35:05 crc kubenswrapper[4792]: E0309 10:35:05.670828 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-97tth_openshift-machine-config-operator(bd11045a-d746-4b42-872c-8b8d1dd2d515)\"" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" Mar 09 10:35:16 crc kubenswrapper[4792]: I0309 10:35:16.662765 4792 scope.go:117] "RemoveContainer" containerID="e37eff5330f6637c46a22bed33527979089264c39088746b21ae17216f78512e" Mar 09 10:35:16 crc kubenswrapper[4792]: E0309 10:35:16.663694 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-97tth_openshift-machine-config-operator(bd11045a-d746-4b42-872c-8b8d1dd2d515)\"" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" Mar 09 10:35:28 crc kubenswrapper[4792]: I0309 10:35:28.663091 4792 scope.go:117] "RemoveContainer" containerID="e37eff5330f6637c46a22bed33527979089264c39088746b21ae17216f78512e" Mar 09 10:35:28 crc kubenswrapper[4792]: E0309 10:35:28.663805 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-97tth_openshift-machine-config-operator(bd11045a-d746-4b42-872c-8b8d1dd2d515)\"" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" Mar 
09 10:35:41 crc kubenswrapper[4792]: I0309 10:35:41.662679 4792 scope.go:117] "RemoveContainer" containerID="e37eff5330f6637c46a22bed33527979089264c39088746b21ae17216f78512e"
Mar 09 10:35:41 crc kubenswrapper[4792]: E0309 10:35:41.664053 4792 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-97tth_openshift-machine-config-operator(bd11045a-d746-4b42-872c-8b8d1dd2d515)\"" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515"
Mar 09 10:35:53 crc kubenswrapper[4792]: I0309 10:35:53.663168 4792 scope.go:117] "RemoveContainer" containerID="e37eff5330f6637c46a22bed33527979089264c39088746b21ae17216f78512e"
Mar 09 10:35:53 crc kubenswrapper[4792]: I0309 10:35:53.678625 4792 generic.go:334] "Generic (PLEG): container finished" podID="c41c4e17-e600-497f-a883-b33a517f0b95" containerID="1e119ba893c108cc4b20ebb99f5dd976c7ef756cea0dbf47975c73c7e8024ca6" exitCode=0
Mar 09 10:35:53 crc kubenswrapper[4792]: I0309 10:35:53.678674 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8lmr2/must-gather-2ndc7" event={"ID":"c41c4e17-e600-497f-a883-b33a517f0b95","Type":"ContainerDied","Data":"1e119ba893c108cc4b20ebb99f5dd976c7ef756cea0dbf47975c73c7e8024ca6"}
Mar 09 10:35:53 crc kubenswrapper[4792]: I0309 10:35:53.679401 4792 scope.go:117] "RemoveContainer" containerID="1e119ba893c108cc4b20ebb99f5dd976c7ef756cea0dbf47975c73c7e8024ca6"
Mar 09 10:35:54 crc kubenswrapper[4792]: I0309 10:35:54.297463 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-8lmr2_must-gather-2ndc7_c41c4e17-e600-497f-a883-b33a517f0b95/gather/0.log"
Mar 09 10:35:54 crc kubenswrapper[4792]: I0309 10:35:54.694427 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-97tth" event={"ID":"bd11045a-d746-4b42-872c-8b8d1dd2d515","Type":"ContainerStarted","Data":"2c21502e8c989c9247d71918c93d06343710bcaa6779aed11b6fc4017e2ad0b4"}
Mar 09 10:36:00 crc kubenswrapper[4792]: I0309 10:36:00.166604 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550876-wd5b4"]
Mar 09 10:36:00 crc kubenswrapper[4792]: E0309 10:36:00.167600 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3502a15e-0592-4198-8259-324a606166d9" containerName="extract-utilities"
Mar 09 10:36:00 crc kubenswrapper[4792]: I0309 10:36:00.167619 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="3502a15e-0592-4198-8259-324a606166d9" containerName="extract-utilities"
Mar 09 10:36:00 crc kubenswrapper[4792]: E0309 10:36:00.167647 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3502a15e-0592-4198-8259-324a606166d9" containerName="extract-content"
Mar 09 10:36:00 crc kubenswrapper[4792]: I0309 10:36:00.167655 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="3502a15e-0592-4198-8259-324a606166d9" containerName="extract-content"
Mar 09 10:36:00 crc kubenswrapper[4792]: E0309 10:36:00.167682 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3502a15e-0592-4198-8259-324a606166d9" containerName="registry-server"
Mar 09 10:36:00 crc kubenswrapper[4792]: I0309 10:36:00.167692 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="3502a15e-0592-4198-8259-324a606166d9" containerName="registry-server"
Mar 09 10:36:00 crc kubenswrapper[4792]: I0309 10:36:00.167931 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="3502a15e-0592-4198-8259-324a606166d9" containerName="registry-server"
Mar 09 10:36:00 crc kubenswrapper[4792]: I0309 10:36:00.168798 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550876-wd5b4"
Mar 09 10:36:00 crc kubenswrapper[4792]: I0309 10:36:00.172019 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 09 10:36:00 crc kubenswrapper[4792]: I0309 10:36:00.172257 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 09 10:36:00 crc kubenswrapper[4792]: I0309 10:36:00.172628 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fwclj"
Mar 09 10:36:00 crc kubenswrapper[4792]: I0309 10:36:00.175045 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550876-wd5b4"]
Mar 09 10:36:00 crc kubenswrapper[4792]: I0309 10:36:00.290631 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2jjmp\" (UniqueName: \"kubernetes.io/projected/bf12becc-2d76-4c75-a36b-4195e21beda4-kube-api-access-2jjmp\") pod \"auto-csr-approver-29550876-wd5b4\" (UID: \"bf12becc-2d76-4c75-a36b-4195e21beda4\") " pod="openshift-infra/auto-csr-approver-29550876-wd5b4"
Mar 09 10:36:00 crc kubenswrapper[4792]: I0309 10:36:00.392466 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2jjmp\" (UniqueName: \"kubernetes.io/projected/bf12becc-2d76-4c75-a36b-4195e21beda4-kube-api-access-2jjmp\") pod \"auto-csr-approver-29550876-wd5b4\" (UID: \"bf12becc-2d76-4c75-a36b-4195e21beda4\") " pod="openshift-infra/auto-csr-approver-29550876-wd5b4"
Mar 09 10:36:00 crc kubenswrapper[4792]: I0309 10:36:00.426542 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2jjmp\" (UniqueName: \"kubernetes.io/projected/bf12becc-2d76-4c75-a36b-4195e21beda4-kube-api-access-2jjmp\") pod \"auto-csr-approver-29550876-wd5b4\" (UID: \"bf12becc-2d76-4c75-a36b-4195e21beda4\") " pod="openshift-infra/auto-csr-approver-29550876-wd5b4"
Mar 09 10:36:00 crc kubenswrapper[4792]: I0309 10:36:00.494765 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550876-wd5b4"
Mar 09 10:36:00 crc kubenswrapper[4792]: I0309 10:36:00.959881 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550876-wd5b4"]
Mar 09 10:36:01 crc kubenswrapper[4792]: I0309 10:36:01.770905 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550876-wd5b4" event={"ID":"bf12becc-2d76-4c75-a36b-4195e21beda4","Type":"ContainerStarted","Data":"2ec6b1d625d898118045a29981d14fdbfaded95131607671c3835b12385a1ee8"}
Mar 09 10:36:02 crc kubenswrapper[4792]: I0309 10:36:02.783732 4792 generic.go:334] "Generic (PLEG): container finished" podID="bf12becc-2d76-4c75-a36b-4195e21beda4" containerID="dd874fea26bb5057495cf14d72dc327e8f5dcf6eeacf5553be54d3c63c35ef21" exitCode=0
Mar 09 10:36:02 crc kubenswrapper[4792]: I0309 10:36:02.783852 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550876-wd5b4" event={"ID":"bf12becc-2d76-4c75-a36b-4195e21beda4","Type":"ContainerDied","Data":"dd874fea26bb5057495cf14d72dc327e8f5dcf6eeacf5553be54d3c63c35ef21"}
Mar 09 10:36:04 crc kubenswrapper[4792]: I0309 10:36:04.156885 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550876-wd5b4"
Mar 09 10:36:04 crc kubenswrapper[4792]: I0309 10:36:04.179728 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2jjmp\" (UniqueName: \"kubernetes.io/projected/bf12becc-2d76-4c75-a36b-4195e21beda4-kube-api-access-2jjmp\") pod \"bf12becc-2d76-4c75-a36b-4195e21beda4\" (UID: \"bf12becc-2d76-4c75-a36b-4195e21beda4\") "
Mar 09 10:36:04 crc kubenswrapper[4792]: I0309 10:36:04.190393 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf12becc-2d76-4c75-a36b-4195e21beda4-kube-api-access-2jjmp" (OuterVolumeSpecName: "kube-api-access-2jjmp") pod "bf12becc-2d76-4c75-a36b-4195e21beda4" (UID: "bf12becc-2d76-4c75-a36b-4195e21beda4"). InnerVolumeSpecName "kube-api-access-2jjmp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 10:36:04 crc kubenswrapper[4792]: I0309 10:36:04.282585 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2jjmp\" (UniqueName: \"kubernetes.io/projected/bf12becc-2d76-4c75-a36b-4195e21beda4-kube-api-access-2jjmp\") on node \"crc\" DevicePath \"\""
Mar 09 10:36:04 crc kubenswrapper[4792]: I0309 10:36:04.807462 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550876-wd5b4" event={"ID":"bf12becc-2d76-4c75-a36b-4195e21beda4","Type":"ContainerDied","Data":"2ec6b1d625d898118045a29981d14fdbfaded95131607671c3835b12385a1ee8"}
Mar 09 10:36:04 crc kubenswrapper[4792]: I0309 10:36:04.807971 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2ec6b1d625d898118045a29981d14fdbfaded95131607671c3835b12385a1ee8"
Mar 09 10:36:04 crc kubenswrapper[4792]: I0309 10:36:04.807533 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550876-wd5b4"
Mar 09 10:36:05 crc kubenswrapper[4792]: I0309 10:36:05.224890 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550870-45r5t"]
Mar 09 10:36:05 crc kubenswrapper[4792]: I0309 10:36:05.233835 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550870-45r5t"]
Mar 09 10:36:05 crc kubenswrapper[4792]: I0309 10:36:05.674563 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c9a92e2f-33fd-44a9-89f2-2717e41e9d2a" path="/var/lib/kubelet/pods/c9a92e2f-33fd-44a9-89f2-2717e41e9d2a/volumes"
Mar 09 10:36:09 crc kubenswrapper[4792]: I0309 10:36:09.588303 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-8lmr2/must-gather-2ndc7"]
Mar 09 10:36:09 crc kubenswrapper[4792]: I0309 10:36:09.589150 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-8lmr2/must-gather-2ndc7" podUID="c41c4e17-e600-497f-a883-b33a517f0b95" containerName="copy" containerID="cri-o://4fadced1f0075f8723094196c94d83ee39ace19176d43e5f3fb89dcbaa0284b7" gracePeriod=2
Mar 09 10:36:09 crc kubenswrapper[4792]: I0309 10:36:09.601582 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-8lmr2/must-gather-2ndc7"]
Mar 09 10:36:09 crc kubenswrapper[4792]: I0309 10:36:09.857585 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-8lmr2_must-gather-2ndc7_c41c4e17-e600-497f-a883-b33a517f0b95/copy/0.log"
Mar 09 10:36:09 crc kubenswrapper[4792]: I0309 10:36:09.858444 4792 generic.go:334] "Generic (PLEG): container finished" podID="c41c4e17-e600-497f-a883-b33a517f0b95" containerID="4fadced1f0075f8723094196c94d83ee39ace19176d43e5f3fb89dcbaa0284b7" exitCode=143
Mar 09 10:36:10 crc kubenswrapper[4792]: I0309 10:36:10.060374 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-8lmr2_must-gather-2ndc7_c41c4e17-e600-497f-a883-b33a517f0b95/copy/0.log"
Mar 09 10:36:10 crc kubenswrapper[4792]: I0309 10:36:10.060978 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-8lmr2/must-gather-2ndc7"
Mar 09 10:36:10 crc kubenswrapper[4792]: I0309 10:36:10.102653 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/c41c4e17-e600-497f-a883-b33a517f0b95-must-gather-output\") pod \"c41c4e17-e600-497f-a883-b33a517f0b95\" (UID: \"c41c4e17-e600-497f-a883-b33a517f0b95\") "
Mar 09 10:36:10 crc kubenswrapper[4792]: I0309 10:36:10.103382 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j6lbl\" (UniqueName: \"kubernetes.io/projected/c41c4e17-e600-497f-a883-b33a517f0b95-kube-api-access-j6lbl\") pod \"c41c4e17-e600-497f-a883-b33a517f0b95\" (UID: \"c41c4e17-e600-497f-a883-b33a517f0b95\") "
Mar 09 10:36:10 crc kubenswrapper[4792]: I0309 10:36:10.140592 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c41c4e17-e600-497f-a883-b33a517f0b95-kube-api-access-j6lbl" (OuterVolumeSpecName: "kube-api-access-j6lbl") pod "c41c4e17-e600-497f-a883-b33a517f0b95" (UID: "c41c4e17-e600-497f-a883-b33a517f0b95"). InnerVolumeSpecName "kube-api-access-j6lbl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 10:36:10 crc kubenswrapper[4792]: I0309 10:36:10.207042 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j6lbl\" (UniqueName: \"kubernetes.io/projected/c41c4e17-e600-497f-a883-b33a517f0b95-kube-api-access-j6lbl\") on node \"crc\" DevicePath \"\""
Mar 09 10:36:10 crc kubenswrapper[4792]: I0309 10:36:10.343937 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c41c4e17-e600-497f-a883-b33a517f0b95-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "c41c4e17-e600-497f-a883-b33a517f0b95" (UID: "c41c4e17-e600-497f-a883-b33a517f0b95"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 10:36:10 crc kubenswrapper[4792]: I0309 10:36:10.411100 4792 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/c41c4e17-e600-497f-a883-b33a517f0b95-must-gather-output\") on node \"crc\" DevicePath \"\""
Mar 09 10:36:10 crc kubenswrapper[4792]: I0309 10:36:10.868134 4792 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-8lmr2_must-gather-2ndc7_c41c4e17-e600-497f-a883-b33a517f0b95/copy/0.log"
Mar 09 10:36:10 crc kubenswrapper[4792]: I0309 10:36:10.869765 4792 scope.go:117] "RemoveContainer" containerID="4fadced1f0075f8723094196c94d83ee39ace19176d43e5f3fb89dcbaa0284b7"
Mar 09 10:36:10 crc kubenswrapper[4792]: I0309 10:36:10.869829 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-8lmr2/must-gather-2ndc7"
Mar 09 10:36:10 crc kubenswrapper[4792]: I0309 10:36:10.891839 4792 scope.go:117] "RemoveContainer" containerID="1e119ba893c108cc4b20ebb99f5dd976c7ef756cea0dbf47975c73c7e8024ca6"
Mar 09 10:36:11 crc kubenswrapper[4792]: I0309 10:36:11.672424 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c41c4e17-e600-497f-a883-b33a517f0b95" path="/var/lib/kubelet/pods/c41c4e17-e600-497f-a883-b33a517f0b95/volumes"
Mar 09 10:36:46 crc kubenswrapper[4792]: I0309 10:36:46.203364 4792 scope.go:117] "RemoveContainer" containerID="a33ead7ebad13d3916124551f99a1f03902f6afa464fbe1200389b288814f98c"
Mar 09 10:37:29 crc kubenswrapper[4792]: I0309 10:37:29.322183 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-7w48j"]
Mar 09 10:37:29 crc kubenswrapper[4792]: E0309 10:37:29.332623 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf12becc-2d76-4c75-a36b-4195e21beda4" containerName="oc"
Mar 09 10:37:29 crc kubenswrapper[4792]: I0309 10:37:29.332662 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf12becc-2d76-4c75-a36b-4195e21beda4" containerName="oc"
Mar 09 10:37:29 crc kubenswrapper[4792]: E0309 10:37:29.332678 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c41c4e17-e600-497f-a883-b33a517f0b95" containerName="gather"
Mar 09 10:37:29 crc kubenswrapper[4792]: I0309 10:37:29.332684 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="c41c4e17-e600-497f-a883-b33a517f0b95" containerName="gather"
Mar 09 10:37:29 crc kubenswrapper[4792]: E0309 10:37:29.332701 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c41c4e17-e600-497f-a883-b33a517f0b95" containerName="copy"
Mar 09 10:37:29 crc kubenswrapper[4792]: I0309 10:37:29.332710 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="c41c4e17-e600-497f-a883-b33a517f0b95" containerName="copy"
Mar 09 10:37:29 crc kubenswrapper[4792]: I0309 10:37:29.332913 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="c41c4e17-e600-497f-a883-b33a517f0b95" containerName="copy"
Mar 09 10:37:29 crc kubenswrapper[4792]: I0309 10:37:29.332944 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="c41c4e17-e600-497f-a883-b33a517f0b95" containerName="gather"
Mar 09 10:37:29 crc kubenswrapper[4792]: I0309 10:37:29.332960 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf12becc-2d76-4c75-a36b-4195e21beda4" containerName="oc"
Mar 09 10:37:29 crc kubenswrapper[4792]: I0309 10:37:29.334525 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7w48j"
Mar 09 10:37:29 crc kubenswrapper[4792]: I0309 10:37:29.339356 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7w48j"]
Mar 09 10:37:29 crc kubenswrapper[4792]: I0309 10:37:29.506552 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3fda88c3-e0ac-4865-985c-ca72764f2ffd-utilities\") pod \"community-operators-7w48j\" (UID: \"3fda88c3-e0ac-4865-985c-ca72764f2ffd\") " pod="openshift-marketplace/community-operators-7w48j"
Mar 09 10:37:29 crc kubenswrapper[4792]: I0309 10:37:29.506619 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hgqst\" (UniqueName: \"kubernetes.io/projected/3fda88c3-e0ac-4865-985c-ca72764f2ffd-kube-api-access-hgqst\") pod \"community-operators-7w48j\" (UID: \"3fda88c3-e0ac-4865-985c-ca72764f2ffd\") " pod="openshift-marketplace/community-operators-7w48j"
Mar 09 10:37:29 crc kubenswrapper[4792]: I0309 10:37:29.506719 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3fda88c3-e0ac-4865-985c-ca72764f2ffd-catalog-content\") pod \"community-operators-7w48j\" (UID: \"3fda88c3-e0ac-4865-985c-ca72764f2ffd\") " pod="openshift-marketplace/community-operators-7w48j"
Mar 09 10:37:29 crc kubenswrapper[4792]: I0309 10:37:29.608774 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3fda88c3-e0ac-4865-985c-ca72764f2ffd-utilities\") pod \"community-operators-7w48j\" (UID: \"3fda88c3-e0ac-4865-985c-ca72764f2ffd\") " pod="openshift-marketplace/community-operators-7w48j"
Mar 09 10:37:29 crc kubenswrapper[4792]: I0309 10:37:29.609162 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hgqst\" (UniqueName: \"kubernetes.io/projected/3fda88c3-e0ac-4865-985c-ca72764f2ffd-kube-api-access-hgqst\") pod \"community-operators-7w48j\" (UID: \"3fda88c3-e0ac-4865-985c-ca72764f2ffd\") " pod="openshift-marketplace/community-operators-7w48j"
Mar 09 10:37:29 crc kubenswrapper[4792]: I0309 10:37:29.609320 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3fda88c3-e0ac-4865-985c-ca72764f2ffd-utilities\") pod \"community-operators-7w48j\" (UID: \"3fda88c3-e0ac-4865-985c-ca72764f2ffd\") " pod="openshift-marketplace/community-operators-7w48j"
Mar 09 10:37:29 crc kubenswrapper[4792]: I0309 10:37:29.609469 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3fda88c3-e0ac-4865-985c-ca72764f2ffd-catalog-content\") pod \"community-operators-7w48j\" (UID: \"3fda88c3-e0ac-4865-985c-ca72764f2ffd\") " pod="openshift-marketplace/community-operators-7w48j"
Mar 09 10:37:29 crc kubenswrapper[4792]: I0309 10:37:29.609804 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3fda88c3-e0ac-4865-985c-ca72764f2ffd-catalog-content\") pod \"community-operators-7w48j\" (UID: \"3fda88c3-e0ac-4865-985c-ca72764f2ffd\") " pod="openshift-marketplace/community-operators-7w48j"
Mar 09 10:37:29 crc kubenswrapper[4792]: I0309 10:37:29.634887 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hgqst\" (UniqueName: \"kubernetes.io/projected/3fda88c3-e0ac-4865-985c-ca72764f2ffd-kube-api-access-hgqst\") pod \"community-operators-7w48j\" (UID: \"3fda88c3-e0ac-4865-985c-ca72764f2ffd\") " pod="openshift-marketplace/community-operators-7w48j"
Mar 09 10:37:29 crc kubenswrapper[4792]: I0309 10:37:29.659540 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7w48j"
Mar 09 10:37:30 crc kubenswrapper[4792]: I0309 10:37:30.255004 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7w48j"]
Mar 09 10:37:30 crc kubenswrapper[4792]: I0309 10:37:30.529185 4792 generic.go:334] "Generic (PLEG): container finished" podID="3fda88c3-e0ac-4865-985c-ca72764f2ffd" containerID="99d58a340b635cca2405b8523cd89103f26e30870a73391cb8597ce1ff714472" exitCode=0
Mar 09 10:37:30 crc kubenswrapper[4792]: I0309 10:37:30.529458 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7w48j" event={"ID":"3fda88c3-e0ac-4865-985c-ca72764f2ffd","Type":"ContainerDied","Data":"99d58a340b635cca2405b8523cd89103f26e30870a73391cb8597ce1ff714472"}
Mar 09 10:37:30 crc kubenswrapper[4792]: I0309 10:37:30.529494 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7w48j" event={"ID":"3fda88c3-e0ac-4865-985c-ca72764f2ffd","Type":"ContainerStarted","Data":"65c85af208740c890125c04fc204c76bb976ad3b20d55889f57a440363d882bf"}
Mar 09 10:37:31 crc kubenswrapper[4792]: I0309 10:37:31.558281 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7w48j" event={"ID":"3fda88c3-e0ac-4865-985c-ca72764f2ffd","Type":"ContainerStarted","Data":"a6ddf6abb67b833be83e0b9c8fe1d6a6e77d8f40b9bd74a0ca911cd0bc60e490"}
Mar 09 10:37:32 crc kubenswrapper[4792]: I0309 10:37:32.567633 4792 generic.go:334] "Generic (PLEG): container finished" podID="3fda88c3-e0ac-4865-985c-ca72764f2ffd" containerID="a6ddf6abb67b833be83e0b9c8fe1d6a6e77d8f40b9bd74a0ca911cd0bc60e490" exitCode=0
Mar 09 10:37:32 crc kubenswrapper[4792]: I0309 10:37:32.567695 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7w48j" event={"ID":"3fda88c3-e0ac-4865-985c-ca72764f2ffd","Type":"ContainerDied","Data":"a6ddf6abb67b833be83e0b9c8fe1d6a6e77d8f40b9bd74a0ca911cd0bc60e490"}
Mar 09 10:37:33 crc kubenswrapper[4792]: I0309 10:37:33.582792 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7w48j" event={"ID":"3fda88c3-e0ac-4865-985c-ca72764f2ffd","Type":"ContainerStarted","Data":"e04a7bdc804e46b240e05ebc85432e2124c91751ae6615f1978d582a4e5e2d88"}
Mar 09 10:37:33 crc kubenswrapper[4792]: I0309 10:37:33.609996 4792 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-7w48j" podStartSLOduration=2.172516629 podStartE2EDuration="4.609969667s" podCreationTimestamp="2026-03-09 10:37:29 +0000 UTC" firstStartedPulling="2026-03-09 10:37:30.532708498 +0000 UTC m=+5415.562909250" lastFinishedPulling="2026-03-09 10:37:32.970161536 +0000 UTC m=+5418.000362288" observedRunningTime="2026-03-09 10:37:33.601545139 +0000 UTC m=+5418.631745901" watchObservedRunningTime="2026-03-09 10:37:33.609969667 +0000 UTC m=+5418.640170429"
Mar 09 10:37:39 crc kubenswrapper[4792]: I0309 10:37:39.660765 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-7w48j"
Mar 09 10:37:39 crc kubenswrapper[4792]: I0309 10:37:39.661421 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-7w48j"
Mar 09 10:37:39 crc kubenswrapper[4792]: I0309 10:37:39.714838 4792 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-7w48j"
Mar 09 10:37:40 crc kubenswrapper[4792]: I0309 10:37:40.698317 4792 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-7w48j"
Mar 09 10:37:40 crc kubenswrapper[4792]: I0309 10:37:40.751217 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-7w48j"]
Mar 09 10:37:42 crc kubenswrapper[4792]: I0309 10:37:42.665002 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-7w48j" podUID="3fda88c3-e0ac-4865-985c-ca72764f2ffd" containerName="registry-server" containerID="cri-o://e04a7bdc804e46b240e05ebc85432e2124c91751ae6615f1978d582a4e5e2d88" gracePeriod=2
Mar 09 10:37:43 crc kubenswrapper[4792]: I0309 10:37:43.622902 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7w48j"
Mar 09 10:37:43 crc kubenswrapper[4792]: I0309 10:37:43.681212 4792 generic.go:334] "Generic (PLEG): container finished" podID="3fda88c3-e0ac-4865-985c-ca72764f2ffd" containerID="e04a7bdc804e46b240e05ebc85432e2124c91751ae6615f1978d582a4e5e2d88" exitCode=0
Mar 09 10:37:43 crc kubenswrapper[4792]: I0309 10:37:43.681269 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7w48j" event={"ID":"3fda88c3-e0ac-4865-985c-ca72764f2ffd","Type":"ContainerDied","Data":"e04a7bdc804e46b240e05ebc85432e2124c91751ae6615f1978d582a4e5e2d88"}
Mar 09 10:37:43 crc kubenswrapper[4792]: I0309 10:37:43.681292 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7w48j" event={"ID":"3fda88c3-e0ac-4865-985c-ca72764f2ffd","Type":"ContainerDied","Data":"65c85af208740c890125c04fc204c76bb976ad3b20d55889f57a440363d882bf"}
Mar 09 10:37:43 crc kubenswrapper[4792]: I0309 10:37:43.681309 4792 scope.go:117] "RemoveContainer" containerID="e04a7bdc804e46b240e05ebc85432e2124c91751ae6615f1978d582a4e5e2d88"
Mar 09 10:37:43 crc kubenswrapper[4792]: I0309 10:37:43.681430 4792 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7w48j"
Mar 09 10:37:43 crc kubenswrapper[4792]: I0309 10:37:43.699263 4792 scope.go:117] "RemoveContainer" containerID="a6ddf6abb67b833be83e0b9c8fe1d6a6e77d8f40b9bd74a0ca911cd0bc60e490"
Mar 09 10:37:43 crc kubenswrapper[4792]: I0309 10:37:43.723203 4792 scope.go:117] "RemoveContainer" containerID="99d58a340b635cca2405b8523cd89103f26e30870a73391cb8597ce1ff714472"
Mar 09 10:37:43 crc kubenswrapper[4792]: I0309 10:37:43.756785 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3fda88c3-e0ac-4865-985c-ca72764f2ffd-utilities\") pod \"3fda88c3-e0ac-4865-985c-ca72764f2ffd\" (UID: \"3fda88c3-e0ac-4865-985c-ca72764f2ffd\") "
Mar 09 10:37:43 crc kubenswrapper[4792]: I0309 10:37:43.756851 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hgqst\" (UniqueName: \"kubernetes.io/projected/3fda88c3-e0ac-4865-985c-ca72764f2ffd-kube-api-access-hgqst\") pod \"3fda88c3-e0ac-4865-985c-ca72764f2ffd\" (UID: \"3fda88c3-e0ac-4865-985c-ca72764f2ffd\") "
Mar 09 10:37:43 crc kubenswrapper[4792]: I0309 10:37:43.756971 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3fda88c3-e0ac-4865-985c-ca72764f2ffd-catalog-content\") pod \"3fda88c3-e0ac-4865-985c-ca72764f2ffd\" (UID: \"3fda88c3-e0ac-4865-985c-ca72764f2ffd\") "
Mar 09 10:37:43 crc kubenswrapper[4792]: I0309 10:37:43.758619 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3fda88c3-e0ac-4865-985c-ca72764f2ffd-utilities" (OuterVolumeSpecName: "utilities") pod "3fda88c3-e0ac-4865-985c-ca72764f2ffd" (UID: "3fda88c3-e0ac-4865-985c-ca72764f2ffd"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 10:37:43 crc kubenswrapper[4792]: I0309 10:37:43.768103 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3fda88c3-e0ac-4865-985c-ca72764f2ffd-kube-api-access-hgqst" (OuterVolumeSpecName: "kube-api-access-hgqst") pod "3fda88c3-e0ac-4865-985c-ca72764f2ffd" (UID: "3fda88c3-e0ac-4865-985c-ca72764f2ffd"). InnerVolumeSpecName "kube-api-access-hgqst". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 10:37:43 crc kubenswrapper[4792]: I0309 10:37:43.780289 4792 scope.go:117] "RemoveContainer" containerID="e04a7bdc804e46b240e05ebc85432e2124c91751ae6615f1978d582a4e5e2d88"
Mar 09 10:37:43 crc kubenswrapper[4792]: E0309 10:37:43.780736 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e04a7bdc804e46b240e05ebc85432e2124c91751ae6615f1978d582a4e5e2d88\": container with ID starting with e04a7bdc804e46b240e05ebc85432e2124c91751ae6615f1978d582a4e5e2d88 not found: ID does not exist" containerID="e04a7bdc804e46b240e05ebc85432e2124c91751ae6615f1978d582a4e5e2d88"
Mar 09 10:37:43 crc kubenswrapper[4792]: I0309 10:37:43.780763 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e04a7bdc804e46b240e05ebc85432e2124c91751ae6615f1978d582a4e5e2d88"} err="failed to get container status \"e04a7bdc804e46b240e05ebc85432e2124c91751ae6615f1978d582a4e5e2d88\": rpc error: code = NotFound desc = could not find container \"e04a7bdc804e46b240e05ebc85432e2124c91751ae6615f1978d582a4e5e2d88\": container with ID starting with e04a7bdc804e46b240e05ebc85432e2124c91751ae6615f1978d582a4e5e2d88 not found: ID does not exist"
Mar 09 10:37:43 crc kubenswrapper[4792]: I0309 10:37:43.780785 4792 scope.go:117] "RemoveContainer" containerID="a6ddf6abb67b833be83e0b9c8fe1d6a6e77d8f40b9bd74a0ca911cd0bc60e490"
Mar 09 10:37:43 crc kubenswrapper[4792]: E0309 10:37:43.781013 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a6ddf6abb67b833be83e0b9c8fe1d6a6e77d8f40b9bd74a0ca911cd0bc60e490\": container with ID starting with a6ddf6abb67b833be83e0b9c8fe1d6a6e77d8f40b9bd74a0ca911cd0bc60e490 not found: ID does not exist" containerID="a6ddf6abb67b833be83e0b9c8fe1d6a6e77d8f40b9bd74a0ca911cd0bc60e490"
Mar 09 10:37:43 crc kubenswrapper[4792]: I0309 10:37:43.781039 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a6ddf6abb67b833be83e0b9c8fe1d6a6e77d8f40b9bd74a0ca911cd0bc60e490"} err="failed to get container status \"a6ddf6abb67b833be83e0b9c8fe1d6a6e77d8f40b9bd74a0ca911cd0bc60e490\": rpc error: code = NotFound desc = could not find container \"a6ddf6abb67b833be83e0b9c8fe1d6a6e77d8f40b9bd74a0ca911cd0bc60e490\": container with ID starting with a6ddf6abb67b833be83e0b9c8fe1d6a6e77d8f40b9bd74a0ca911cd0bc60e490 not found: ID does not exist"
Mar 09 10:37:43 crc kubenswrapper[4792]: I0309 10:37:43.781056 4792 scope.go:117] "RemoveContainer" containerID="99d58a340b635cca2405b8523cd89103f26e30870a73391cb8597ce1ff714472"
Mar 09 10:37:43 crc kubenswrapper[4792]: E0309 10:37:43.782397 4792 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"99d58a340b635cca2405b8523cd89103f26e30870a73391cb8597ce1ff714472\": container with ID starting with 99d58a340b635cca2405b8523cd89103f26e30870a73391cb8597ce1ff714472 not found: ID does not exist" containerID="99d58a340b635cca2405b8523cd89103f26e30870a73391cb8597ce1ff714472"
Mar 09 10:37:43 crc kubenswrapper[4792]: I0309 10:37:43.782417 4792 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"99d58a340b635cca2405b8523cd89103f26e30870a73391cb8597ce1ff714472"} err="failed to get container status \"99d58a340b635cca2405b8523cd89103f26e30870a73391cb8597ce1ff714472\": rpc error: code = NotFound desc = could not find container \"99d58a340b635cca2405b8523cd89103f26e30870a73391cb8597ce1ff714472\": container with ID starting with 99d58a340b635cca2405b8523cd89103f26e30870a73391cb8597ce1ff714472 not found: ID does not exist"
Mar 09 10:37:43 crc kubenswrapper[4792]: I0309 10:37:43.818330 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3fda88c3-e0ac-4865-985c-ca72764f2ffd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3fda88c3-e0ac-4865-985c-ca72764f2ffd" (UID: "3fda88c3-e0ac-4865-985c-ca72764f2ffd"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 10:37:43 crc kubenswrapper[4792]: I0309 10:37:43.863252 4792 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3fda88c3-e0ac-4865-985c-ca72764f2ffd-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 09 10:37:43 crc kubenswrapper[4792]: I0309 10:37:43.863464 4792 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3fda88c3-e0ac-4865-985c-ca72764f2ffd-utilities\") on node \"crc\" DevicePath \"\""
Mar 09 10:37:43 crc kubenswrapper[4792]: I0309 10:37:43.863523 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hgqst\" (UniqueName: \"kubernetes.io/projected/3fda88c3-e0ac-4865-985c-ca72764f2ffd-kube-api-access-hgqst\") on node \"crc\" DevicePath \"\""
Mar 09 10:37:44 crc kubenswrapper[4792]: I0309 10:37:44.020274 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-7w48j"]
Mar 09 10:37:44 crc kubenswrapper[4792]: I0309 10:37:44.028598 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-7w48j"]
Mar 09 10:37:45 crc kubenswrapper[4792]: I0309 10:37:45.675534 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3fda88c3-e0ac-4865-985c-ca72764f2ffd" path="/var/lib/kubelet/pods/3fda88c3-e0ac-4865-985c-ca72764f2ffd/volumes"
Mar 09 10:38:00 crc kubenswrapper[4792]: I0309 10:38:00.152265 4792 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550878-kft55"]
Mar 09 10:38:00 crc kubenswrapper[4792]: E0309 10:38:00.153344 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3fda88c3-e0ac-4865-985c-ca72764f2ffd" containerName="extract-utilities"
Mar 09 10:38:00 crc kubenswrapper[4792]: I0309 10:38:00.153365 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="3fda88c3-e0ac-4865-985c-ca72764f2ffd" containerName="extract-utilities"
Mar 09 10:38:00 crc kubenswrapper[4792]: E0309 10:38:00.153395 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3fda88c3-e0ac-4865-985c-ca72764f2ffd" containerName="registry-server"
Mar 09 10:38:00 crc kubenswrapper[4792]: I0309 10:38:00.153403 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="3fda88c3-e0ac-4865-985c-ca72764f2ffd" containerName="registry-server"
Mar 09 10:38:00 crc kubenswrapper[4792]: E0309 10:38:00.153426 4792 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3fda88c3-e0ac-4865-985c-ca72764f2ffd" containerName="extract-content"
Mar 09 10:38:00 crc kubenswrapper[4792]: I0309 10:38:00.153434 4792 state_mem.go:107] "Deleted CPUSet assignment" podUID="3fda88c3-e0ac-4865-985c-ca72764f2ffd" containerName="extract-content"
Mar 09 10:38:00 crc kubenswrapper[4792]: I0309 10:38:00.153958 4792 memory_manager.go:354] "RemoveStaleState removing state" podUID="3fda88c3-e0ac-4865-985c-ca72764f2ffd" containerName="registry-server"
Mar 09 10:38:00 crc kubenswrapper[4792]: I0309 10:38:00.154637 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550878-kft55"
Mar 09 10:38:00 crc kubenswrapper[4792]: I0309 10:38:00.157701 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 09 10:38:00 crc kubenswrapper[4792]: I0309 10:38:00.157755 4792 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-fwclj"
Mar 09 10:38:00 crc kubenswrapper[4792]: I0309 10:38:00.157951 4792 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 09 10:38:00 crc kubenswrapper[4792]: I0309 10:38:00.162514 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550878-kft55"]
Mar 09 10:38:00 crc kubenswrapper[4792]: I0309 10:38:00.348967 4792 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2kvxp\" (UniqueName: \"kubernetes.io/projected/050845be-021a-44a2-92ea-3e7dcbb6b95f-kube-api-access-2kvxp\") pod \"auto-csr-approver-29550878-kft55\" (UID: \"050845be-021a-44a2-92ea-3e7dcbb6b95f\") " pod="openshift-infra/auto-csr-approver-29550878-kft55"
Mar 09 10:38:00 crc kubenswrapper[4792]: I0309 10:38:00.451006 4792 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2kvxp\" (UniqueName: \"kubernetes.io/projected/050845be-021a-44a2-92ea-3e7dcbb6b95f-kube-api-access-2kvxp\") pod \"auto-csr-approver-29550878-kft55\" (UID: \"050845be-021a-44a2-92ea-3e7dcbb6b95f\") " pod="openshift-infra/auto-csr-approver-29550878-kft55"
Mar 09 10:38:00 crc kubenswrapper[4792]: I0309 10:38:00.479490 4792 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2kvxp\" (UniqueName: \"kubernetes.io/projected/050845be-021a-44a2-92ea-3e7dcbb6b95f-kube-api-access-2kvxp\") pod \"auto-csr-approver-29550878-kft55\" (UID: \"050845be-021a-44a2-92ea-3e7dcbb6b95f\") " pod="openshift-infra/auto-csr-approver-29550878-kft55"
Mar 09 10:38:00 crc kubenswrapper[4792]: I0309 10:38:00.776781 4792 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550878-kft55"
Mar 09 10:38:01 crc kubenswrapper[4792]: I0309 10:38:01.295687 4792 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550878-kft55"]
Mar 09 10:38:01 crc kubenswrapper[4792]: I0309 10:38:01.844373 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550878-kft55" event={"ID":"050845be-021a-44a2-92ea-3e7dcbb6b95f","Type":"ContainerStarted","Data":"8793c89f4d3ef8aadbb614c18c853108168d14c59440116bc8e0b6164dfd45d6"}
Mar 09 10:38:02 crc kubenswrapper[4792]: I0309 10:38:02.855410 4792 generic.go:334] "Generic (PLEG): container finished" podID="050845be-021a-44a2-92ea-3e7dcbb6b95f" containerID="335d9c89311fadea98c88473367cd0c8fc38075e92a36227b31c54a9205fcfee" exitCode=0
Mar 09 10:38:02 crc kubenswrapper[4792]: I0309 10:38:02.855503 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550878-kft55" event={"ID":"050845be-021a-44a2-92ea-3e7dcbb6b95f","Type":"ContainerDied","Data":"335d9c89311fadea98c88473367cd0c8fc38075e92a36227b31c54a9205fcfee"}
Mar 09 10:38:04 crc kubenswrapper[4792]: I0309 10:38:04.204830 4792 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550878-kft55" Mar 09 10:38:04 crc kubenswrapper[4792]: I0309 10:38:04.366913 4792 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2kvxp\" (UniqueName: \"kubernetes.io/projected/050845be-021a-44a2-92ea-3e7dcbb6b95f-kube-api-access-2kvxp\") pod \"050845be-021a-44a2-92ea-3e7dcbb6b95f\" (UID: \"050845be-021a-44a2-92ea-3e7dcbb6b95f\") " Mar 09 10:38:04 crc kubenswrapper[4792]: I0309 10:38:04.373436 4792 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/050845be-021a-44a2-92ea-3e7dcbb6b95f-kube-api-access-2kvxp" (OuterVolumeSpecName: "kube-api-access-2kvxp") pod "050845be-021a-44a2-92ea-3e7dcbb6b95f" (UID: "050845be-021a-44a2-92ea-3e7dcbb6b95f"). InnerVolumeSpecName "kube-api-access-2kvxp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 10:38:04 crc kubenswrapper[4792]: I0309 10:38:04.468841 4792 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2kvxp\" (UniqueName: \"kubernetes.io/projected/050845be-021a-44a2-92ea-3e7dcbb6b95f-kube-api-access-2kvxp\") on node \"crc\" DevicePath \"\"" Mar 09 10:38:04 crc kubenswrapper[4792]: I0309 10:38:04.873824 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550878-kft55" event={"ID":"050845be-021a-44a2-92ea-3e7dcbb6b95f","Type":"ContainerDied","Data":"8793c89f4d3ef8aadbb614c18c853108168d14c59440116bc8e0b6164dfd45d6"} Mar 09 10:38:04 crc kubenswrapper[4792]: I0309 10:38:04.874171 4792 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8793c89f4d3ef8aadbb614c18c853108168d14c59440116bc8e0b6164dfd45d6" Mar 09 10:38:04 crc kubenswrapper[4792]: I0309 10:38:04.873866 4792 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550878-kft55" Mar 09 10:38:05 crc kubenswrapper[4792]: I0309 10:38:05.319346 4792 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550872-jsdc5"] Mar 09 10:38:05 crc kubenswrapper[4792]: I0309 10:38:05.336197 4792 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550872-jsdc5"] Mar 09 10:38:05 crc kubenswrapper[4792]: I0309 10:38:05.682143 4792 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a4ee8f8b-c971-47a8-ba9f-796809c0b326" path="/var/lib/kubelet/pods/a4ee8f8b-c971-47a8-ba9f-796809c0b326/volumes" Mar 09 10:38:13 crc kubenswrapper[4792]: I0309 10:38:13.213834 4792 patch_prober.go:28] interesting pod/machine-config-daemon-97tth container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 10:38:13 crc kubenswrapper[4792]: I0309 10:38:13.214391 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 10:38:43 crc kubenswrapper[4792]: I0309 10:38:43.214451 4792 patch_prober.go:28] interesting pod/machine-config-daemon-97tth container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 10:38:43 crc kubenswrapper[4792]: I0309 10:38:43.214975 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-97tth" 
podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 10:38:46 crc kubenswrapper[4792]: I0309 10:38:46.337295 4792 scope.go:117] "RemoveContainer" containerID="936ce09d6ba3bc159fa27d63b41d82b1eb122725dd3932855fd75aa26ca13c86" Mar 09 10:39:13 crc kubenswrapper[4792]: I0309 10:39:13.214355 4792 patch_prober.go:28] interesting pod/machine-config-daemon-97tth container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 10:39:13 crc kubenswrapper[4792]: I0309 10:39:13.214927 4792 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-97tth" podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 10:39:13 crc kubenswrapper[4792]: I0309 10:39:13.214968 4792 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-97tth" Mar 09 10:39:13 crc kubenswrapper[4792]: I0309 10:39:13.215820 4792 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2c21502e8c989c9247d71918c93d06343710bcaa6779aed11b6fc4017e2ad0b4"} pod="openshift-machine-config-operator/machine-config-daemon-97tth" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 09 10:39:13 crc kubenswrapper[4792]: I0309 10:39:13.215875 4792 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-97tth" 
podUID="bd11045a-d746-4b42-872c-8b8d1dd2d515" containerName="machine-config-daemon" containerID="cri-o://2c21502e8c989c9247d71918c93d06343710bcaa6779aed11b6fc4017e2ad0b4" gracePeriod=600 Mar 09 10:39:13 crc kubenswrapper[4792]: I0309 10:39:13.427784 4792 generic.go:334] "Generic (PLEG): container finished" podID="bd11045a-d746-4b42-872c-8b8d1dd2d515" containerID="2c21502e8c989c9247d71918c93d06343710bcaa6779aed11b6fc4017e2ad0b4" exitCode=0 Mar 09 10:39:13 crc kubenswrapper[4792]: I0309 10:39:13.428103 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-97tth" event={"ID":"bd11045a-d746-4b42-872c-8b8d1dd2d515","Type":"ContainerDied","Data":"2c21502e8c989c9247d71918c93d06343710bcaa6779aed11b6fc4017e2ad0b4"} Mar 09 10:39:13 crc kubenswrapper[4792]: I0309 10:39:13.428142 4792 scope.go:117] "RemoveContainer" containerID="e37eff5330f6637c46a22bed33527979089264c39088746b21ae17216f78512e" Mar 09 10:39:14 crc kubenswrapper[4792]: I0309 10:39:14.438717 4792 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-97tth" event={"ID":"bd11045a-d746-4b42-872c-8b8d1dd2d515","Type":"ContainerStarted","Data":"f0588409d4dce7365fa621cf259498387b7577a81f291a6423495aa75de526a6"}